Data Engineer III
Job Summary:
We are looking for a technically strong, solution-oriented Data Engineer III to design and maintain large-scale, cloud-based data pipelines and analytics infrastructure supporting business operations across a global shared-services or financial-services enterprise. This senior-level role drives best practices in data architecture, automation, and reliability while collaborating with data analysts, architects, and business users to deliver trusted, scalable data solutions.
The ideal candidate brings hands-on expertise in cloud platforms, big data tools, and enterprise data workflows, with a focus on performance, security, and continuous improvement.
Key Responsibilities:
- Lead the design, development, and maintenance of end-to-end data pipelines using modern ETL/ELT frameworks and tools.
- Architect and implement scalable data solutions in cloud environments (e.g., Azure, AWS, or GCP).
- Build and optimize data models, lakehouses, and warehouse structures to support analytics and reporting.
- Implement monitoring, alerting, and data quality checks to ensure pipeline reliability and data integrity.
- Collaborate with data architects, analysts, and DevOps teams to align on infrastructure, tools, and standards.
- Automate ingestion and transformation processes across diverse data sources (structured/unstructured).
- Participate in code reviews and mentor junior data engineers.
- Evaluate new technologies and tools to enhance data engineering capabilities and system performance.
- Support governance, security, and compliance standards in data handling and access.
Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
- 5+ years of experience in data engineering, with strong hands-on background in cloud-native platforms and tools.
- Expert-level SQL skills and proficiency in at least one modern programming language (e.g., Python, Scala, or Java).
- Proven experience with tools such as Databricks, Snowflake, Azure Data Factory, AWS Glue, Airflow, or equivalent.
- Deep understanding of data warehousing, lakehouse architectures, and dimensional modeling.
- Experience implementing CI/CD for data pipelines and infrastructure-as-code (e.g., Terraform, ARM templates).
- Strong communication skills and ability to collaborate with technical and non-technical stakeholders.
- Experience in regulated industries (e.g., finance, insurance, healthcare) is a plus.