ETL Developer / Data Engineer
Job Description:
Please respond to this job only if you are an Australian Citizen residing in Australia.
- Contract start 01 July 2023 for 12 months, with 2 x 12-month extension options.
- Australian Citizen, ability to obtain Baseline Clearance, Canberra, Offsite role.
Send your responses to jobs@softtestpays.com
Overview
The Data Engineering team, in the Centre of Data Excellence (CODE), is seeking an ETL Developer / Data Engineer to provide technical leadership, carry out development of, and provide production support for a series of data load and data replication workflows on our AWS-based Data Lake platform.
The responsibilities of the ETL Developer / Data Engineer will include, but are not limited to:
Liaise with internal and external clients to determine requirements and design solutions.
Design ETL routines for ingestion of data from existing source systems, including but not limited to Oracle databases, Postgres databases, and data delivered as CSV, JSON or XML via API or file transfer protocols such as SFTP.
Build and maintain data ingestion pipelines using technologies such as AWS DMS, AWS Glue, PySpark and Iceberg.
Define the database schema for intermediate data transformations.
Contribute to the development of data load/migration plans, including effort estimates.
Contribute to the specification of data transformation, both content and structure, for end-to-end load/migration from source to target systems.
Define and construct cleansing rules to be implemented as part of the data load/migration routines.
Implement (including design, build and test) the data extract, transform, load (ETL) routines to load data from one system to another and/or migrate data from current to future data stores.
Participate in discussions on data transformations with business and technical areas.
Provide technical leadership and mentoring to a small team, including active knowledge transfer.
Every application must address the selection criteria as part of the application submission.
Essential Criteria
1. The successful candidate enjoys working in a tight-knit agile team providing data capabilities and services to technical and business stakeholders. You communicate well (verbally and in writing), share knowledge and experiences, have a can-do and flexible approach, and follow processes and documentation requirements. (30%)
2. You must have a strong knowledge of and demonstrated high-level experience in data analysis, problem-solving and ETL development, including design, build, test and deployment. (30%)
3. The successful candidate will be required to demonstrate experience with the following technologies: (40%)
- AWS Glue
- AWS DMS
- Apache Parquet
- PySpark
- Logstash
- Elasticsearch
- Iceberg
Desirable Criteria
1. Experience in the following areas:
- Database design
- Oracle DBMSs, Elasticsearch DBMSs and Azure services
- Data load/migration projects, including data profiling and data quality analysis
2. Use of Informatica IICS, PowerCenter, Developer, Analyst or MDM tools.
Required Skills:
Data Quality, Informatica, Production Support, Database Design, Elasticsearch, Pipelines, Estimates, Analysis, Load, Apache, Mentoring, JSON, XML, Data Analysis, Writing, Databases, Oracle, Documentation, Leadership, Engineering, Design, Business