About the job Data Engineer - Hybrid
We currently have a vacancy for a Data Engineer, fluent in English, to offer their services as an expert in Valletta, Malta. The work will be carried out either remotely or on site at the customer's premises. In the context of the first assignment, the successful candidate will be integrated into the company's development team, which will cooperate closely with a major client's IT team on site.
Your tasks
- Design, develop, document, and maintain ETL/ELT processes, data integration, cleaning, transformation, dissemination and automation processes;
- Design, develop, document and maintain data architecture, data modelling and metadata;
- Develop and support data warehouse/lakehouse architectures and data processing ensuring data quality, lineage, auditing, metadata, logging, linkage across datasets and impact assessments;
- Develop and maintain business intelligence models, interactive dashboards, reports and analytics using tools such as Databricks, Jupyter Notebooks, and Power BI;
- Design, develop, document, improve and maintain the Data Warehouse/Lakehouse ecosystem (e.g. the DataDevOps lifecycle, architecture);
- Contribute to the definition and documentation of data governance policies, procedures, standards, and metadata models.
Requirements
- University degree in IT or a relevant discipline, combined with a minimum of 6 years of relevant working experience in IT;
- Experience with development and data processing using e.g. Python, SQL, Power Query M and DAX;
- Experience with structured, semi-structured and unstructured data types and related file formats (e.g. JSON, Parquet, Delta);
- Experience with gathering business requirements and transforming them into data collection, integration and analysis processes;
- Experience in Microsoft On-Prem and Azure Data Platform tools (such as Azure Data Factory, Azure Functions, Azure Logic Apps, SQL Server, ADLS, Azure Databricks, Microsoft Fabric/Power BI, Azure DevOps, Azure AI Services, PowerShell);
- Experience in CI/CD lifecycle using Azure DevOps;
- Experience in Databricks ecosystem, Apache Spark and Python data processing libraries;
- Experience with Data Modelling principles and methods;
- Experience with Data Lakes and Data Lakehouse architecture, concepts and governance;
- Experience with Data Integration and data warehouse/lakehouse modelling techniques, concepts and methods (e.g. SCD, Functional Engineering, Data Vault, Data Streaming);
- Experience with data governance and data management standards, policies, processes, metadata and quality;
- Experience with WebAPIs and OpenAPI standard;
- Knowledge of DAMA Data Management best practices and standards;
- Knowledge of Data Governance and Discovery tools such as Azure Purview;
- Knowledge of Master data and reference data management concepts;
- Knowledge of Business glossaries, data dictionaries, and data catalogues;
- Knowledge of Moodle or another Learning Management System;
- Excellent command of the English language.
Benefits
If you are seeking a career in an exciting and dynamic company, where you will offer your services as part of the team of a major European Institution, operating in an international, multilingual and multicultural environment with real chances to make a difference, please send us your detailed CV in English.
We offer competitive remuneration (either on a contract basis or a salary with a full benefits package), based on qualifications and experience. All applications will be treated as confidential.