Job Opening: Senior Data Engineer


Position Purpose Summary:

DEPLOY's client is building a next-generation business management platform from the ground up, and we're looking for a hands-on Senior Data Engineer to deliver and scale our data strategy. As we grow rapidly through mergers and acquisitions, we face a rich and complex data environment, and this role will be central to cleaning, unifying, and powering data flows across the enterprise. You'll work primarily on building and maintaining production-ready pipelines, cleaning and standardizing datasets, and ensuring data quality, while also contributing to the long-term design of our data architecture.

We operate on an Azure-based infrastructure with a PostgreSQL backend and Python-scripted ETL pipelines. You'll collaborate with engineering, finance, and operations teams to make our data reliable, scalable, and analytics-ready.

KEY RESPONSIBILITIES / TASKS

  • Clean, standardize, and integrate large, messy datasets from diverse ERP, finance, and operational systems, applying AI-driven approaches for anomaly detection, data quality monitoring, and metadata enrichment.
  • Maintain and optimize relational database schemas (PostgreSQL), including materialized views for transactional and reporting systems.
  • Build and maintain ETL pipelines using Python, Airflow, dbt, and related tooling, leveraging AI-powered developer tools (e.g., Copilot, Claude, Cursor) to accelerate coding, testing, and documentation.
  • Establish and enforce data quality rules, validation logic, and monitoring standards.
  • Create database connections and integrations between PostgreSQL and Microsoft applications.
  • Document data models, pipelines, and business logic to ensure scalability and knowledge sharing.
  • Collaborate with business stakeholders to understand data requirements and deliver actionable solutions.
  • Contribute to data governance practices, including metadata management and access controls.
  • Design data models to support complex ERP and operational domains.
  • Support vendor management and coordinate contributions from internal and external implementation teams.
  • Participate in tool evaluations and contribute to the evolution of our modern data stack (warehouse, orchestration, catalog, etc.).


This is a rare opportunity to help build the foundational data layer of a rapidly scaling company. You'll play a critical role in shaping how data is ingested, standardized, and leveraged to drive insights across the enterprise, while applying the latest AI-powered tools to accelerate development and improve quality. As we continue to grow through acquisitions, your work will unify platforms, unlock intelligence, and enable business growth at scale.


This role is expected to influence architectural direction, tooling, and data model standards, and to work independently.

Financial Authority:

May participate in vendor/tool selection and cost-performance tradeoffs, but does not own a budget directly.

Work / Problem Complexity:

Will address highly complex, multi-source data integration challenges spanning ERP, inventory, finance, and operations. Requires balancing long-term scalability with short-term tactical needs.

Influencing / People Leadership:

Ability to collaborate with other developers, the product team, vendors, and stakeholders.

EDUCATION / EXPERIENCE

Certifications:

  • None required; cloud platform certifications (e.g., Azure Data Engineer Associate) are a plus.

Educational Requirements:

  • Preferred: Bachelor's degree in Computer Science, Data Engineering, or a related STEM field (not required)

Years of Experience:

  • 8+ years of experience in data engineering or database architecture roles
  • Nice to have: 2+ years in a lead or architect capacity

Knowledge / Skills / Abilities:

  • Strong experience with relational database design (PostgreSQL preferred)
  • Deep understanding of modern ELT/ETL best practices and tools (Python, Airflow, dbt, etc.)
  • Experience designing databases and integrations for ERP systems
  • Direct experience using modern AI development tools like Copilot, Claude, or Cursor to accelerate coding, testing, and documentation.
  • Preferred: Familiarity with applying AI/ML services in Azure (e.g., Cognitive Services, Azure Machine Learning) for advanced data engineering use cases.
  • Preferred: Experience in distribution or transactional accounting