Canberra, ACT, Australia

Data Operations Engineer

Job Description:

Please respond to this job only if you are an Australian Citizen residing in Australia.

  • Contract start: 06 March 2023, for 24 months, with 12-month extension options.
  • Australian Citizen with the ability to obtain a Baseline Clearance; Canberra-based role.

Send your responses to jobs@softtestpays.com

Overview

The Data & Innovation section within the Clean Energy Regulator requires a temporary Data Operations Engineer contractor, in accordance with position requirements.

About the Branch

The role of the Policy, Markets and Data Branch is to provide insights, solutions and advice to decision makers, policy makers and the market to improve scheme operations and position the agency as a modern, efficient and effective regulator.

About the Section

The Data and Innovation section supports the agency to unlock the potential of our data to drive innovation and compliance. With a major uplift in data capability planned and delivered through our change program, there are vast opportunities to be a key contributor in driving, designing and delivering the policies, frameworks and infrastructure that suit the agency's data operation needs. Specific responsibilities include:

  • Support the Chief Data Officer and the agency in delivering strategies to improve data maturity.
  • Contribute to uplifting enterprise data capability through the significant change program occurring in the agency.
  • Support standards and modelling that improve data architecture.
  • Support innovative approaches that reduce regulatory burden, enable self-service, improve efficiency and encourage compliance with our schemes.
  • Provide targeted skills and resources to consult on, design, build and/or roll out strategic initiatives that improve internal processes, data collection, client compliance, and communication related to scheme operations.

Role Responsibilities and Duties

  • Inspire front-line teams to go beyond the standard call of duty to find creative solutions for our product issues.
  • Investigate and understand data anomalies.
  • Design web-based tools for business process management and automation.
  • Maintain the integrity of the production data used by CER employees: design, create and administer the steps required to resolve or escalate issues that may prevent successful completion of scheduled production processes and data transfers, and ensure changes to the production environment have been reviewed and authorised.
  • Support the data requirements of the different business areas and processes, including data management, governance, quality, architecture, enterprise modelling, security and integration.
  • Support daily operational tasks.
  • Analyse the impact of proposed solutions on business requirements, data integrity and quality.
  • Monitor and ensure operating integrity.
  • Provide recommendations on system and business process enhancements.
  • Provide operational support and recommendations during testing efforts.

Every application must address the selection criteria as part of its submission.

Essential Criteria

  • Ability to scope and define the data sets needed for specific use cases and to identify data gaps.
  • Ability to translate scientific insights into product decisions and work streams.
  • Flexibility to handle directional changes and shifting priorities to ensure project success.
  • Knowledge and application of DMBOK data management frameworks and principles.
  • Familiarity with star schemas and data vault modelling.
  • Experience using enterprise data modelling tools (LucidChart, EA Sparx, ERWin, ER Studio, etc).
  • Capability to create Conceptual, Logical and Physical data models for a range of technological solutions, databases and pipelines at an enterprise level in accordance with prescribed data modelling standards.
  • Familiarity with DataOps and/or DevOps repos and pipelines, using GitHub, CI/CD, and attribute-based access control and deployments for Power BI.
  • Demonstrated in-depth knowledge of database infrastructure and cloud data solutions, preferably the Azure suite (D365, SQL, Analysis Services, Power BI, Purview, Dataverse, Gen2 data lake, Synapse, Data Factory).
Required Skills:

Access Control, Data Integrity, Pipelines, Steps Analysis, Data Collection, Data Management, Business Process, DevOps, Energy, Power BI, Business Requirements, GitHub, Architecture, Automation, Infrastructure, Integration, Databases, Security, Testing, SQL, Design, Communication, Business Management