Data Engineers (Multiple)
Job Description:
Only Australian Citizens residing in Australia with the ability to obtain a Baseline Clearance should respond.
- Contract from 18 December 2023 to 30 June 2024, with 2 x 12-month extension options.
- Australian Citizen, Ability to obtain Baseline Clearance, Canberra, Offsite role.
Send your responses to jobs@softtestpays.com
Overview
The Department of Agriculture, Fisheries and Forestry (DAFF) is looking for Data Engineers with strong experience developing Extract, Transform, Load (ETL) and Extract, Load, Transform (ELT) processes across several data and analytics platforms to join its Enterprise Analytics section.
The role will be responsible for design, development and unit testing activities across several data movement and transformation processes within DAFF. These processes prepare data for use in decision-making across the department, utilising modern cloud technology (Azure) to enable operational analytics use cases.
The successful candidate will require experience with the following techniques and technologies:
- MS Azure Stack
- Data Integration
- Data Factory
- SQL Server Integration Services
- Databricks
- Data Store
- SQL Server
- Data Lake Storage
- Analytics
- Azure Databricks
- Azure Machine Learning
- ArcGIS Enterprise
- Development tools: DevOps, Visual Studio, VS Code
- Sourcing, collecting, ingesting and storing data from data technology solutions (Oracle, Ingres, Azure, SQL Server)
- Data Preparation
- Transformation of data into formats tailored to analytics use cases (see the brief sketch after this list)
- Parquet
- Delta
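To illustrate the kind of ETL/ELT work described above, the following is a minimal PySpark sketch suited to an Azure Databricks environment: extract from a relational source over JDBC, apply a light transformation, and load the result as Delta/Parquet to data lake storage. It is an illustration only; the connection details, table names and storage paths are hypothetical and do not refer to DAFF systems.

# Minimal ETL sketch: extract from SQL Server, transform with PySpark, load to Delta.
# All connection strings, table names and paths below are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Extract: read a source table over JDBC (SQL Server shown as an example source).
source_df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:sqlserver://example-host:1433;databaseName=ExampleDb")  # hypothetical
    .option("dbtable", "dbo.ExampleTable")                                       # hypothetical
    .option("user", "example_user")                                              # hypothetical
    .option("password", "example_password")                                      # hypothetical
    .load()
)

# Transform: standardise column names and record the load date for partitioning.
clean_df = (
    source_df
    .select([F.col(c).alias(c.lower()) for c in source_df.columns])
    .withColumn("load_date", F.current_date())
)

# Load: write Delta (or plain Parquet) to data lake storage, partitioned by load date
# so downstream analytics queries can prune partitions.
(
    clean_df.write.format("delta")   # swap to "parquet" where Delta is not available
    .mode("overwrite")
    .partitionBy("load_date")
    .save("abfss://lake@examplestorage.dfs.core.windows.net/curated/example_table")  # hypothetical path
)

In practice, Azure Data Factory would typically orchestrate this kind of notebook, or an equivalent Copy activity, as part of a scheduled pipeline.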
Every application must address the selection criteria as part of the submission.
Essential Criteria (Weighting)
1. Experience developing ETL/ELT processes for data movement and transformation. 25%
2. Experience preparing data optimised for query performance in cloud compute engines, e.g. distributed computing engines (Spark), graph databases, Azure SQL. 25%
3. Experience working with Engineering, Storage and Analytics services in cloud infrastructure. 25%
4. Experience working with multi-disciplinary teams using an Agile methodology. 25%
Desirable Criteria
1. Experience working with Azure Data Factory and Databricks
2. Experience or knowledge of working with Data Lakes and Lakehouses