Data Science Developer

About the job

RQ#: RQ08371
Role: Data Science Developer - Senior

Organization: Labour and Transportation Cluster
Ministry: Ministry of Transportation

Assignment: A2250451
Assignment Type: 3 days onsite / 2 days remote
Clearance Type: CRJMC

Work Location: 87 Sir William Hearst Ave., Toronto, Ontario
Duration: 252 days
No. of Extensions: 1

Assignment Start Date: April 1st, 2025
Assignment End Date: March 31st, 2026

Job Description:

The Data Science Developer - Senior will play a crucial role in developing and maintaining data pipelines, analytics models, and data products for the Ministry of Transportation and the Ministry of Labour, Immigration, Training and Skills Development. The role involves creating, enhancing, maintaining, and supporting solutions that optimize how data is handled, transformed, and visualized.

Key Responsibilities:

  • Data Storage & Preparation:
    Create, enhance, and maintain data structures that are suitable for storage and consumption in analytics solutions. Experience with Azure Storage, Azure Data Lake, Azure Databricks Lakehouse, and Azure Synapse is essential.
  • Data Pipelines:
    Automate data pipelines for data ingestion, transformation, and modeling, using Python and Microsoft Azure technologies such as Databricks and Azure Data Factory.
  • Analytics & Reporting:
    Design, enhance, and maintain Power BI dashboards and reports to provide actionable insights. Ensure the proper integration of data analytics with reporting solutions.
  • Infrastructure & Technology:
    Improve performance and simplify architecture patterns by implementing new technologies that reduce cloud hosting costs and improve overall system performance.
  • Knowledge Transfer:
    Conduct knowledge transfer sessions, including documentation and walkthroughs for technical staff on analytics solutions architecture, design, and continuous improvement.

Skills and Experience Requirements:

  • Data Storage and Preparation (35%):
    Demonstrated experience with Azure Storage, Azure Data Lake, Azure Databricks Lakehouse, and Azure Synapse in real-world implementations.
  • Data Pipelines (35%):
    Proficiency in automating data pipelines using Python and Microsoft Azure Platform technologies such as Databricks and Azure Data Factory.
  • Data Analytics (15%):
    Experience in developing Power BI reports and dashboards, with a strong understanding of how to integrate analytics with business solutions.
  • Knowledge Transfer (15%):
    Experience in conducting knowledge transfer sessions and creating documentation for technical staff related to designing, architecting, and implementing end-to-end analytics solutions.

Mandatory Skills:

  • Azure Storage, Azure Data Lake, Azure Databricks Lakehouse, and Azure Synapse
  • Python, Databricks, Azure Data Factory
  • Power BI reports and dashboards

Nice to Have:

  • Experience with high-volume / big data projects