About the job: Senior Data Engineer
Our client has over 275 employees worldwide. As part of our continued growth, we are looking for a Senior Data Engineer to join the Data Platform team. We need someone who is an enthusiastic problem solver, who likes to be challenged, and who has the drive to step out of their comfort zone. We need you to be resourceful and committed to quality, a good communicator, and a team player.
Your role in our client's team
The Data Platform team is responsible for collecting, analysing and sharing data across the business. You will work on the design, development, and maintenance of reliable, efficient, and scalable data systems and pipelines to support the company's growing business.
Key Responsibilities
- Analyse complex data challenges and recommend appropriate solutions that meet the identified requirements.
- Design, develop and maintain efficient, reliable and scalable data systems and pipelines to support the proposed solutions.
- Work closely with the data analytics team to understand requirements, ensure suitable data architectures and deliver end-to-end data initiatives.
- Align with the IT, platform, engineering and product teams to deliver technically robust, business-aligned solutions to production.
- Communicate with different teams and stakeholders to align technical and business requirements.
- Work well in a modern agile software engineering environment, with source code control, release cycles, extensive testing, and continuous deployment.
Skills, Knowledge and Expertise
The right candidate is important to us, and we are looking for someone who has the following skills/knowledge:
- BSc/MSc in Computer Science or related fields.
- Experience designing architecturally robust solutions for complex software problems, especially in data related areas.
- Experience with rigorous software development practices using state-of-the-art patterns and modern tooling.
- Experience communicating with different teams and stakeholders to align technical and business requirements.
- Experience with software prototyping, development and testing.
- Experience with ETL/ELT pipelines that include both batch and stream processing.
- Experience with programming in Python/Spark (e.g. PySpark, Spark SQL, Spark Streaming, etc).
- Experience with data manipulation libraries (e.g. Pandas, NumPy, etc).
- Experience with data querying and manipulation using SQL.
- Knowledge of Databricks and related technologies (e.g. Delta Lake, Delta Live Tables, etc).
- Knowledge of relational (e.g. SQL Server) and NoSQL (e.g. MongoDB) databases.
- Familiarity with message queues and streaming services (e.g. Azure Event Hub, Kafka, etc).
- Familiarity with Azure and other cloud-native technologies.
- Ownership, attention to detail and commitment to quality.
- Strong written and verbal English communication skills.
Benefits
We offer you a challenging job in a growing, truly global SaaS company with a competitive compensation structure. Ah, and you will be part of a fun and hardworking team:
- Remote / Hybrid working
- Healthcare
- Life insurance
- 6-week work-anywhere option per year
- Employee Assistance Programme
- Contributory retirement savings plan
- Opportunities for training & development
- Great team and culture
- Discounts portal
- Income protection insurance
- Online GP services