Senior Data Engineer - Enterprise Data Warehouse & Lakehouse Solutions - Hybrid
Job Description:
We currently have a vacancy for a Senior Data Engineer - Enterprise Data Warehouse & Lakehouse Solutions, fluent in English, to offer his/her services as an expert based in Belgium. The work will be carried out either at the company's premises or on-site at customer premises. In the context of the first assignment, the successful candidate will be integrated into the company's Development team, which will closely cooperate on site with a major client's IT team.
Your tasks
- Development and maintenance of Enterprise Data Warehouses (EDW) and complex Business Intelligence solutions (Data Lakes / Data Lakehouses);
- Design and development of data pipelines that provide scalable and reliable workflows for transforming large volumes of both structured and unstructured data;
- Data integration from various sources, including relational databases, APIs, data streaming services and cloud data platforms;
- Optimisation of queries and workflows for increased performance and enhanced efficiency;
- Writing modular, testable and production-grade code;
- Ensuring data quality through monitoring, validation and quality checks, maintaining accuracy and consistency across the data platform;
- Elaboration of test programs;
- Comprehensive documentation of processes to ensure seamless data pipeline management and troubleshooting;
- Assistance with deployment and configuration of the system.
Requirements
- University degree in IT combined with relevant IT professional experience of 13 years;
- At least 5 years of experience in relational database systems applied to data warehouse, data warehouse design & architecture;
- At least 5 years of experience with code-based data transformation tools such as dbt (data build tool) or Spark;
- At least 5 years of experience in SQL and data integration and ETL/ELT tools;
- Hands-on experience as a Data Engineer on a modern data platform, including data analytics techniques and tools;
- At least 3 years of experience with Python and orchestration tools such as Airflow or Dagster;
- At least 3 years of experience with data modelling tools, as well as online analytical processing (OLAP) and data mining tools;
- Experience with data platforms such as Fabric, Talend, Databricks and Snowflake;
- Experience with containerised application development and deployment tools such as Docker, Podman or Kubernetes;
- Excellent command of the English language.
Benefits
If you are seeking a career in an exciting and dynamic company, where you will offer your services as part of a team within a major European Institution, operating in an international, multilingual and multicultural environment with real chances to make a difference, please send us your detailed CV in English.
We offer competitive remuneration (either on a contract basis or a salary with a full benefits package), based on qualifications and experience. All applications will be treated as confidential.