Software Engineer - Confluent

About the job

We are looking for a talented and motivated Software Engineer with strong experience in Confluent Kafka and Qlik to join our team. This role involves designing, developing, and maintaining real-time data streaming and data visualization solutions that drive business intelligence and operational insights. The ideal candidate will work closely with cross-functional teams to build scalable, efficient, and secure data pipelines and reporting systems.

Responsibilities:

  • Data Pipeline Development: Design, develop, and optimize real-time and batch data pipelines using Confluent Kafka and associated tools.

  • Streaming Data Processing: Implement event-driven architectures leveraging Kafka Streams, ksqlDB, and Kafka Connect.

  • Data Visualization & Reporting: Build dynamic dashboards and insightful visualizations using Qlik to support decision-making across the organization.

  • Integration & Transformation: Integrate structured and semi-structured data from multiple sources for analytics and reporting.

  • Security & Compliance: Apply data security best practices including encryption, RBAC, and compliance with relevant data governance standards.

  • Optimization & Monitoring: Monitor pipeline health, optimize performance, and troubleshoot issues in real-time data infrastructure.

  • Documentation & Collaboration: Maintain clear technical documentation and collaborate with Data Scientists, BI Analysts, and Cloud Engineers to ensure alignment of data architecture and analytics initiatives.

Qualifications:

Must have:

  • Strong understanding of real-time data streaming and event-driven architectures.

  • Hands-on experience with Confluent Kafka (Kafka Streams, Kafka Connect, ksqlDB).

  • Proficiency in SQL, Python, or Scala for data transformation and automation.

  • Familiarity with cloud platforms (AWS, Azure, or GCP) for pipeline deployment.

  • Experience with ETL/ELT processes and data pipeline automation.

  • Strong problem-solving and analytical skills.

  • Exposure to CI/CD pipelines, version control systems (e.g., Git), and DevOps practices.

Nice to have:

  • Experience developing interactive dashboards and analytics with Qlik (QlikView/Qlik Sense).

  • Experience with Snowflake for data warehousing and analytics.

  • Certifications in Confluent Kafka, Qlik, or Snowflake.

  • Familiarity with data orchestration tools like Apache Airflow or dbt.

  • Knowledge of Big Data technologies such as Spark, Flink, or Druid.

  • Understanding of REST APIs, GraphQL, or WebSockets for data integration.

  • Experience in regulated industries such as finance or healthcare.