Senior Data Engineer - Kafka Streams (Remote)
This position is fully remote and open to anyone within the EU. Unfortunately, we can't offer relocation from outside the EU. It's also a direct employment opportunity with one of our client's EU entities, not a contract job.
Join our client, a dynamic EdTech company that empowers individuals to achieve their life goals through transformative educational services. With a high social impact, you'll be part of meaningful work that makes a difference. Boasting a team of 150+ tech experts, our client constantly innovates to improve their business and customer experience.
As a Data Engineer, you'll collaborate with the Data team to build cutting-edge digital platforms. Thrive in an international work environment that embraces change and be part of a team dedicated to reshaping the future of education by delivering real-time, actionable insights for users.
Responsibilities:
- Design and implement scalable data streaming pipelines using Apache Kafka.
- Work with modern data-stack tools such as dbt, Airbyte, and Snowflake.
- Ensure correctness and reliability of real-time insights.
- Mentor and guide team members on data engineering best practices.
- Collaborate with stakeholders to understand requirements and ensure delivery.
The work environment:
- A remote-first, cross-functional team across Europe.
- Agile practices, collaboration, knowledge-sharing.
- Emphasis on social aspects and having fun.
Compensation details:
- A salary range of 80,000 to 86,000 euros, based on your experience.
- This is a permanent position; B2B contracts are not possible.
- Additional benefits in accordance with the legislation of your country.
Ideal candidate:
- Hands-on experience with Kafka and other data processing technologies.
- Proficient in designing/implementing data pipelines.
- Can code in Python, Java, or Scala; proficient in SQL.
- Strong communication skills (written and verbal).
Bonus:
- Experience in collaborative data teams.
- Familiarity with modern data ingestion/transformation tools.
- Experience with cloud-based data processing (e.g., Snowflake).
- Knowledge of ML frameworks (e.g., PyTorch, TensorFlow) or willingness to learn.
- Awareness of data governance procedures.
If you're interested in the job but feel you might not fully meet the technical requirements, please apply anyway.