Job Openings
Data Engineer
About the Job
Responsibilities
As a Data Engineer at our client, you will be responsible for:
- SQL Database Management: Design, implement, and maintain SQL databases to ensure efficient data storage and retrieval. Optimize and tune SQL queries for maximum performance.
- API, Elasticsearch, and Data Loading: Develop and manage data ingestion processes using APIs, Elasticsearch, and other data-loading techniques.
- Azure Data Factory Pipelines: Develop and manage data pipelines using Azure Data Factory.
- ETL Development and Maintenance: Design, develop, and maintain ETL processes to extract, transform, and load data from various sources into our data warehouse.
- Data Quality Checks: Implement checks to ensure data integrity throughout the ETL process.
- Cloud Data Movement: Ensure the reliability, scalability, and efficiency of data movement within the Azure cloud environment.
- Snowflake Migration: Help migrate our data sources into Snowflake.
- Proactive Problem Solving: Identify and address data-related issues early, ensuring data accuracy and consistency.
- Collaboration: Work with data scientists, analysts, and other teams to understand their data requirements and deliver effective solutions.
- Communication: Clearly explain complex technical concepts to non-technical stakeholders.
- Documentation: Maintain thorough documentation for all data engineering processes, ensuring knowledge transfer and best practices.
- Data Security and Compliance: Ensure all data engineering processes adhere to data security and compliance standards. Implement data encryption and access controls as needed.
- Version Control and CI/CD: Use version control systems for code management. Implement CI/CD pipelines for data engineering workflows.
Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience in SQL database design and optimization.
- Hands-on experience with Snowflake.
- Proficiency in creating and managing data pipelines using Azure Data Factory.
- Strong ETL development skills.
- Experience with data modelling.
- Excellent problem-solving and analytical skills.
- Proactive mindset with the ability to work independently and collaboratively.
- Strong communication and interpersonal skills.
Preferred Skills
- Familiarity with other cloud platforms (AWS, GCP).
- Experience with big data technologies.
- Knowledge of data warehousing concepts.
- Certifications in relevant technologies.