Job Openings G11 - Senior Data Engineer

About the job G11 - Senior Data Engineer

As a Senior Data Engineer, you will:

  • Translate data requirements from business users into technical specifications.
  • Collaborate with partner agencies' IT teams on technology stack, infrastructure, and security alignment.
  • Build out data products as part of a data team:
  • Architect and build ingestion pipelines to collect, clean, merge, and harmonize data from different source systems.
  • Monitor databases and ETL systems day to day, including database capacity planning and maintenance, monitoring, and performance tuning; diagnose issues and deploy measures to prevent recurrence; and ensure maximum database uptime.
  • Construct, test, and update useful, reusable data models based on the data needs of end users.
  • Design and build secure mechanisms for end users and systems to access data in the data warehouse.
  • Research, propose, and develop new technologies and processes to improve agency data infrastructure.
  • Collaborate with data stewards to establish and enforce data governance policies, best practices, and procedures.
  • Maintain a data catalogue to document data assets, metadata, and lineage.
  • Implement data quality checks and validation processes to ensure data accuracy and consistency.
  • Implement and enforce data security best practices, including access control, encryption, and data masking, to safeguard sensitive data.

What we are looking for:

  • A Bachelor's degree, preferably in Computer Science, Software Engineering, Information Technology, or related disciplines.
  • Deep understanding of system design, data structures and algorithms, data modelling, data access, and data storage.
  • Demonstrated ability in using cloud technologies such as AWS, Azure, and Google Cloud.
  • Experience with Databricks.
  • Experience in designing, building, and maintaining batch and real-time data pipelines.
  • Experience with orchestration frameworks such as Airflow or Azure Data Factory.
  • Proficiency in Python, shell scripting, and SQL.

Preferred requirements:

  • Familiarity with building and using CI/CD pipelines.
  • Familiarity with DevOps tools such as Docker, Git, and Terraform.
  • Experience with implementing technical processes to enforce data security, data quality, and data governance.
  • Familiarity with systems and policies relating to data governance, data management, data infrastructure, and data security.
  • Experience in the climate and weather domains will be an advantage.