Senior Data Engineer (Big Data)
Job Description:
Contracting period: Long-term project (1 year)
Location: Hybrid (Poland)
About the Role
We are looking for an experienced Senior (Big) Data Engineer to join large-scale data platform initiatives for an international technology-driven organization. The company builds and operates high-volume, low-latency data platforms, processing large amounts of event data through modern batch and streaming architectures.
This role is suited to a senior-level data engineer who enjoys working with distributed systems, event-driven data processing, and cloud-native technologies. The position requires fluency in Polish and being based in Poland.
Your Profile
- 6–8 years of hands-on experience in Data Engineering roles
- Experience with at least one major cloud platform (GCP, AWS, or Azure); willingness to work in a GCP-based environment (prior GCP experience is a plus)
- Strong production experience with Apache Spark, using Python / PySpark
- Hands-on experience with streaming and event-driven architectures, using technologies such as:
  - Kafka
  - Google Pub/Sub
  - AWS Kinesis
  - Azure Event Hubs
- Strong SQL skills, including data transformations, analytical queries, and performance optimization
Nice-to-Have Skills
- Previous experience specifically in a Big Data Engineer role
- Background in JVM-based languages (Scala, Java, Kotlin)
- Familiarity with data lake or lakehouse architectures
- Experience implementing monitoring, observability, and data quality checks
- Exposure to high-throughput event processing systems
- Experience with CI/CD pipelines or Infrastructure-as-Code approaches
Your Responsibilities
- Design, build, and maintain scalable batch and streaming data pipelines
- Develop, optimize, and operate Apache Spark jobs using PySpark (see the illustrative sketch after this list)
- Work with event-driven and streaming platforms to process high-volume datasets
- Perform advanced data transformations and analytics using SQL
- Improve the performance, reliability, and observability of data pipelines
- Collaborate with analytics, platform, and product teams to deliver end-to-end data solutions
- Participate in technical and architectural decision-making
- Take end-to-end ownership of data solutions, from design through production
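For context only, the sketch below shows the general shape of the streaming work described above: a PySpark Structured Streaming job reading events from Kafka and producing windowed aggregates. It is a minimal, hypothetical example; the broker address, topic name, schema, and sink paths are placeholders, not details of the actual platform.

```python
# Minimal sketch, assuming Spark 3.x with the spark-sql-kafka connector available
# and a hypothetical Kafka topic "events" carrying JSON payloads.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("events-stream").getOrCreate()

# Hypothetical event schema; real payloads would differ.
schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_time", TimestampType()),
])

# Read the raw event stream from Kafka (placeholder broker and topic).
raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")
       .option("subscribe", "events")
       .load())

# Parse the JSON value and count events per type in 5-minute windows.
events = (raw
          .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
          .select("e.*"))

counts = (events
          .withWatermark("event_time", "10 minutes")
          .groupBy(F.window("event_time", "5 minutes"), "event_type")
          .count())

# Write the aggregates out; a real pipeline might target BigQuery or a lakehouse table.
query = (counts.writeStream
         .outputMode("append")
         .format("parquet")
         .option("path", "/tmp/event_counts")       # placeholder output path
         .option("checkpointLocation", "/tmp/chk")  # placeholder checkpoint location
         .start())
```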
What You Can Expect
- Work on large-scale, data-intensive systems with real-world impact
- A technically challenging environment focused on distributed data processing
- Collaboration with cross-functional teams in a modern data platform ecosystem
- Opportunities to influence architecture, tooling, and best practices
Required Skills:
Big Data