Data Engineer
Job Description:
As a core member of the KBZ Bank Analytics Centre of Excellence (ACE) team, this role sits in the technical stream and is responsible for the design, development and delivery of data engineering solutions such as data lakes, data warehouses, data pipelines and the orchestration of operational processes. Data Engineers work hand-in-hand with the IT department, platform vendors, merchant systems and social media platforms.
Description
- Data Pipeline Development:
Collaborate cross-functionally to design, implement and maintain high-performance data pipelines for diverse data sources. Ensure seamless data movement, transformation and integration to support analytics and insights (see the pipeline sketch after this list).
- Data Models, Data Lakes, Data Warehouses and Data Marts:
Develop and optimize data models aligned with business needs. Design and maintain data lakes, data warehouses and purpose-built data marts for efficient querying, reporting and strategic decision-making across operational and analytical contexts, in both on-premises and cloud environments, with the objective of enabling business users to be self-sufficient in data extraction, scenario and hypothesis testing, and opportunity formulation (see the data mart sketch after this list).
- Performance Optimization:
Monitor, identify and proactively address bottlenecks in data pipelines and in the platform components of the various analytical containers in a cost-efficient manner. Tune queries and streamline data retrieval and transformation processes for sustained high performance (see the tuning sketch after this list).
- Data Quality, Security and Compliance:
Implement rigorous data validation and quality checks throughout the pipeline (see the validation sketch after this list). Collaborate internally to ensure data privacy, security, and adherence to industry regulations and standards. Govern access control of the data analytics platform, reports, dashboards and analytical outputs while conforming to information security guidelines.
- Collaboration and Documentation:
Engage closely with the power user group, the BI team and data science teams to understand their data requirements and support their analytics endeavours by meeting their data and platform needs, either through available data streams or by exploring external data streams (through system analysis). Document data flows, processes and guides to foster effective communication and knowledge sharing.
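For illustration only, a minimal sketch of the kind of extract-transform-load pipeline referenced under Data Pipeline Development, written in Python against a hypothetical CSV export and a SQLite staging database; the file, table and column names are placeholders, not systems used at the bank:

    import csv
    import sqlite3

    SOURCE_FILE = "merchant_transactions.csv"   # hypothetical source extract
    STAGING_DB = "staging.db"                   # hypothetical staging database

    def extract(path):
        # Stream raw rows from the source extract.
        with open(path, newline="") as f:
            yield from csv.DictReader(f)

    def transform(row):
        # Normalise types and standardise the currency code.
        return (row["txn_id"], row["merchant_id"],
                float(row["amount"]), row["currency"].upper())

    def load(rows, db_path):
        # Write transformed rows into an idempotent staging table.
        con = sqlite3.connect(db_path)
        con.execute(
            "CREATE TABLE IF NOT EXISTS stg_transactions "
            "(txn_id TEXT PRIMARY KEY, merchant_id TEXT, amount REAL, currency TEXT)"
        )
        con.executemany(
            "INSERT OR REPLACE INTO stg_transactions VALUES (?, ?, ?, ?)", rows
        )
        con.commit()
        con.close()

    if __name__ == "__main__":
        load((transform(r) for r in extract(SOURCE_FILE)), STAGING_DB)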
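A similarly hedged sketch of the data modelling responsibility: building a minimal star-schema data mart (one dimension, one fact table) from the hypothetical staging table above so that business users can query it directly. The table names and structure are assumptions for illustration:

    import sqlite3

    con = sqlite3.connect("staging.db")  # hypothetical database from the pipeline sketch

    # A minimal star schema: a merchant dimension plus a transaction fact table,
    # populated from the staging table so analysts can query it directly.
    con.executescript("""
        CREATE TABLE IF NOT EXISTS dim_merchant (
            merchant_key INTEGER PRIMARY KEY AUTOINCREMENT,
            merchant_id  TEXT UNIQUE
        );
        CREATE TABLE IF NOT EXISTS fact_transactions (
            txn_id       TEXT PRIMARY KEY,
            merchant_key INTEGER REFERENCES dim_merchant(merchant_key),
            amount       REAL,
            currency     TEXT
        );
    """)

    # Load the dimension first, then resolve keys while loading the fact table.
    con.execute(
        "INSERT OR IGNORE INTO dim_merchant (merchant_id) "
        "SELECT DISTINCT merchant_id FROM stg_transactions"
    )
    con.execute("""
        INSERT OR REPLACE INTO fact_transactions (txn_id, merchant_key, amount, currency)
        SELECT s.txn_id, d.merchant_key, s.amount, s.currency
        FROM stg_transactions s
        JOIN dim_merchant d ON d.merchant_id = s.merchant_id
    """)
    con.commit()
    con.close()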
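For the performance optimization responsibility, a small sketch of one routine tuning step: adding an index for a frequent lookup pattern and checking the query plan to confirm it is used. The table and query are the hypothetical ones from the sketches above:

    import sqlite3

    con = sqlite3.connect("staging.db")  # hypothetical database from the sketches above

    # Frequent lookup pattern: all transactions for one merchant. Without an
    # index this is a full table scan; with one it becomes an index seek.
    con.execute(
        "CREATE INDEX IF NOT EXISTS idx_stg_txn_merchant "
        "ON stg_transactions (merchant_id)"
    )

    # EXPLAIN QUERY PLAN shows whether the optimiser actually uses the index.
    plan = con.execute(
        "EXPLAIN QUERY PLAN "
        "SELECT SUM(amount) FROM stg_transactions WHERE merchant_id = ?",
        ("M-1001",),
    ).fetchall()
    for row in plan:
        print(row)
    con.close()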
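And for the data quality responsibility, a sketch of simple post-load validation checks run against the same hypothetical staging table; in practice such checks would gate promotion into the warehouse and feed monitoring and alerting rather than just printing:

    import sqlite3

    con = sqlite3.connect("staging.db")  # hypothetical database from the sketches above

    # Each rule counts offending rows; a clean load returns zero for all of them.
    checks = {
        "null_keys": "SELECT COUNT(*) FROM stg_transactions "
                     "WHERE txn_id IS NULL OR merchant_id IS NULL",
        "negative_amounts": "SELECT COUNT(*) FROM stg_transactions WHERE amount < 0",
        "bad_currency_codes": "SELECT COUNT(*) FROM stg_transactions "
                              "WHERE LENGTH(currency) != 3",
    }

    failures = {name: con.execute(sql).fetchone()[0] for name, sql in checks.items()}
    con.close()

    for name, count in failures.items():
        status = "OK" if count == 0 else f"FAILED ({count} rows)"
        print(f"{name}: {status}")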
Requirements
- Bachelor's degree in IT, Computer Science, Software Engineering, Business Analytics or equivalent
- 3+ years of demonstrable, hands-on experience in data engineering / analytics roles within fintech, banking or related domains
- Strong experience using ANSI SQL with relational databases
- Expert-level hands-on experience with ETL, data integration and orchestration tools
- Knowledge of data lake and data warehousing principles
- Experience with cloud platforms such as AWS, Azure or Google Cloud Platform
- Good understanding of data modelling, data/software engineering best practices, automation and DevOps
- Proficiency in programming languages such as Python, Java or Scala is preferable
Required Skills:
FinTech, Banking, SQL, Python, Java, Scala, Programming, Data Integration, Data Pipelines, Data Quality, Data Validation, Data Analytics, Data Science, Business Analytics, Databases, Optimization, Automation, DevOps, Access Control, Information Security, Regulations, Decision-Making, Documentation, Communication, Computer Science, Software Engineering