Data Platform Engineer - 12 Month Contract
Key Responsibilities
Platform Engineering & Development
- Design, implement, and maintain Big Data platforms (e.g., Hadoop, Spark, Kafka) used across the CIB environment.
- Build robust batch and real-time data ingestion pipelines using tools such as Apache NiFi, Airflow, Spark, and Kafka Streams.
- Maintain and enhance enterprise data lakes and warehouse environments using technologies such as Hive, Delta Lake, and Azure Synapse.
Cloud & Hybrid Integration
- Architect and deploy data platform solutions on Microsoft Azure (Databricks, Azure Data Lake Storage, Synapse Analytics).
- Build hybrid cloud systems that integrate on-premises and cloud-based data infrastructure.
- Ensure optimal performance, scalability, and cost-efficiency across cloud workloads.
Data Governance, Compliance & Security
- Ensure platform compliance with data governance and privacy regulations (POPIA, GDPR, BCBS 239).
- Implement robust security controls across the infrastructure, including encryption, access controls, and audit logging.
- Work closely with data stewards and governance teams to integrate metadata management and data cataloging tools.
Automation & DevOps
- Develop and maintain CI/CD pipelines for automated testing, deployment, and monitoring of data solutions.
- Automate infrastructure provisioning using tools such as Terraform and Azure DevOps.
- Perform routine system administration, performance tuning, and issue resolution across data platforms.
Monitoring & Support
- Implement monitoring solutions (e.g., Prometheus, Grafana, ELK Stack) to ensure system availability and reliability.
- Provide L2/L3 support for production data environments, managing incidents and service requests effectively.
Stakeholder Engagement
- Collaborate with cross-functional teams, including data scientists, analysts, developers, and compliance teams.
- Translate business and analytical requirements into scalable platform solutions.
- Participate in Agile sprints and architecture design reviews.
Qualifications & Experience
Minimum Requirements
- Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field.
- 5+ years of experience in Big Data engineering or platform operations.
- Experience with enterprise-grade platforms in banking or financial services.
Technical Skills
- Strong proficiency with the Hadoop ecosystem: HDFS, Hive, Spark, Kafka.
- Expertise in Azure cloud services: Azure Data Factory, Azure Databricks, Azure Data Lake, Synapse Analytics.
- Solid programming skills in Python, Scala, Java, and SQL.
- Familiarity with Terraform, Git, Jenkins, Docker, and Kubernetes.
- Experience with data governance tools such as Apache Atlas or Collibra is advantageous.
Soft Skills
- Excellent problem-solving and troubleshooting ability.
- Strong communication and collaboration skills.
- Ability to work in a fast-paced, high-stakes environment.
Preferred Certifications
- Microsoft Certified: Azure Data Engineer Associate
- Cloudera Data Platform Certified Developer
- Databricks Certified Data Engineer Associate
- TOGAF or similar architecture frameworks (advantageous)