Job Description:
Job: Senior Data Engineer
Must have experience: Databricks, Snowflake
Location: Dublin, Ireland
Type: Contract
Term: 12 months+
Rate: 400 per day
We are looking for a Senior Data Engineer who will shape and influence our data architecture, working with innovative cloud technologies to drive impactful, data-centric projects. Ideally, you will have in-depth experience with Databricks, Snowflake, AWS, and MLOps to support and enhance the deployment and scalability of machine learning models. You'll play a pivotal role in ensuring data accessibility, optimising data sourcing pipelines, and enhancing the performance of large-scale data solutions.
Some responsibilities will include:
Design and Implement Cloud-Native Data Solutions: Develop scalable, resilient data platforms using cloud-native technologies, data mesh frameworks, and integration across diverse data sources
Build and Maintain MLOps Pipelines: Use tools like MLflow to create reliable, efficient pipelines for deploying machine learning models in production environments
Establish Data Governance and Quality Standards: Develop and uphold data governance practices, ensuring robust data quality and compliance using tools like Databricks Unity Catalog
Oversee Data Integration and Migration: Lead migration projects from legacy data systems to modern cloud platforms, focusing on optimising operational costs and efficiencies
Performance Optimisation and Tuning: Use tools such as Snowflake and Delta Lake to enhance data accessibility, reliability, and performance, delivering robust, high-quality data products
We are looking for people with experience in:
Data Engineering Expertise: Proven experience in architecting and delivering large-scale, cloud-native data solutions
Advanced Knowledge of Databricks and Snowflake: Hands-on experience with Databricks, Spark, PySpark, and Delta Lake, with strong skills in data warehousing and lakehouse solutions
MLOps Skills: Practical experience in MLOps, ideally with MLflow for model management and deployment
Cloud Proficiency: Strong knowledge of AWS, with additional experience in Azure advantageous for multi-cloud setups
Programming Proficiency: Advanced coding abilities in Python, SQL, and Scala
Tooling Competence: Familiarity with version control (GitHub), CI/CD tools (Azure DevOps, GitHub Actions), orchestration tools (Airflow, Jenkins), and analytics/dashboarding tools (Tableau, Alteryx)
Desirable Skills:
Experience with Synapse Analytics, Netezza, and legacy data systems
Knowledge of data governance best practices and tools
Excellent problem-solving skills, with the ability to work both autonomously and collaboratively in cross-functional teams
Apply now for immediate consideration.