Job Opening: Data Engineer - Confluent Kafka

About the job

We are searching for an accountable, multi-skilled data engineer to join our team working on our Confluent Kafka platform. As part of the Delivery team, the data engineer will be responsible for deploying, maintaining, and supporting the Confluent Kafka platform within the client organization, in line with the customer strategy. Throughout this work, you will collaborate with co-workers in an agile environment to ensure that your approach meets the needs of the customer.

Role Responsibilities

  • Develop and implement solutions using Confluent Kafka.
  • Administer and improve use of Confluent Kafka across the client organization, including Kafka Connect, ZooKeeper, Brokers, Schema Registry, Kafka REST Proxy, ksqlDB, and custom implementations.
  • Work with multiple teams to ensure best use of Confluent Kafka and secure event streaming.
  • Understand and apply event-driven architecture patterns and Confluent Kafka best practices and enable development teams to do the same.
  • Assist developers and the operations team in ensuring that the Confluent Kafka platform is configured, secured, and operates in line with customer expectations.
  • Learn continuously to become the Confluent Kafka subject matter expert within the organization.
  • Work with Kafka and Confluent APIs (e.g. metadata, metrics, admin) to provide proactive insights and automation.
  • Work with DevOps team to ensure that Kafka-related metrics are exported to the required platform.
  • Perform regular reviews of performance data to ensure efficiency and resiliency.
  • Contribute regularly to event-driven patterns, best practices, and guidance.
  • Review feature releases and change logs for Confluent Kafka and related components to ensure best use of these systems across the organization.
  • Develop an expert-level understanding of data integration, migration, and deployment using CI/CD tools as it relates to Confluent Kafka.
  • Acquire a deep understanding of source and sink connector technical details for a variety of platforms, including S3, Cassandra, Oracle, and others as required.
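
To give a concrete flavour of the sink-connector work described above, a minimal Kafka Connect S3 sink configuration might look like the following sketch. The connector name, topic, bucket, and region are hypothetical placeholders, not values from this role:

```json
{
  "name": "orders-s3-sink",
  "config": {
    "connector.class": "io.confluent.connect.s3.S3SinkConnector",
    "tasks.max": "2",
    "topics": "orders",
    "s3.bucket.name": "example-data-lake",
    "s3.region": "eu-west-1",
    "storage.class": "io.confluent.connect.s3.storage.S3Storage",
    "format.class": "io.confluent.connect.s3.format.json.JsonFormat",
    "flush.size": "1000"
  }
}
```

A configuration like this would typically be submitted to the Kafka Connect REST interface and then monitored via the connector status endpoints.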

Qualifications

  • 4 years' work experience with Confluent Kafka (Brokers, ZooKeeper, Kafka Connect, Schema Registry, Kafka REST Proxy).
  • 4 years' experience working in an event-streaming environment.
  • Solid experience creating Kafka topics, installing and configuring relevant Kafka connectors, and administering a Confluent Kafka environment.
  • Solid experience working with the Linux operating system.
  • Experience with Kerberos authentication and authorization (Krb5, JAAS).
  • Experience with SSL/TLS communication (keystores and truststores).
  • Working experience as part of an agile team.
  • Experience using Ansible.
  • Flink streaming experience (Optional).
  • Experience with RabbitMQ (Optional).
  • Experience with JulieOps (Optional).
  • Experience with Apache NiFi (Optional).
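
As an illustration of the Kerberos and SSL requirements above, a Kafka client secured with GSSAPI over TLS is typically configured with properties along these lines. The hostnames, file paths, principal, and passwords below are placeholders, not real values:

```properties
# Authenticate via Kerberos (GSSAPI) over an encrypted TLS channel
security.protocol=SASL_SSL
sasl.mechanism=GSSAPI
sasl.kerberos.service.name=kafka
sasl.jaas.config=com.sun.security.auth.module.Krb5LoginModule required \
    useKeyTab=true \
    keyTab="/etc/security/keytabs/app.keytab" \
    principal="app@EXAMPLE.COM";
# Trust the broker's certificate and present a client certificate if required
ssl.truststore.location=/etc/kafka/ssl/client.truststore.jks
ssl.truststore.password=changeit
ssl.keystore.location=/etc/kafka/ssl/client.keystore.jks
ssl.keystore.password=changeit
```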

Salary: N2,500,000 - N3,000,000