Job Opening: Data Engineer

About the job

Founded in 2012 by a Sri Lankan American entrepreneur from Silicon Valley, iLabs has several major partnerships with global tech companies and also operates its own high-growth US eCommerce marketplace (CloudofGoods.com). Today, iLabs specializes in Web, Mobile, and AI technologies, as well as technology outsourcing services. We also provide IT strategy consulting, IT operations, DevOps, SEO, and Digital Marketing/Social Media solutions to our global clients through iLabs, a Colombo Port City corporation.

Over the past decade, we have grown into one of the top IT companies in Sri Lanka. We are currently experiencing rapid growth, with hundreds of urgent job openings. We hire top talent in Sri Lanka at above-market remuneration. Come work for us!

We are looking for a Data Engineer to join our team and work on a variety of data engineering projects. Your responsibilities include, but are not limited to, choosing optimal solutions, then implementing, maintaining, and monitoring them. You will also be responsible for integrating these solutions with the organization's infrastructure or with third-party client infrastructures.

Responsibilities

  • Design and implement scalable, reliable distributed data processing frameworks and analytical infrastructure
  • Be part of a team to define, design, and implement data integration, management, storage, consumption, backup, and recovery solutions that ensure the high performance of the organization's enterprise data
  • Develop Structured Query Language (SQL), Data Definition Language (DDL), and Python or equivalent programming scripts to support data pipeline development, problem-solving, data validation, and performance tuning
  • Work with software engineers, DevOps engineers, ML engineers, and data scientists to achieve the organization's goals


Requirements

  • A minimum of 3–5 years' experience in a similar capacity, with a BS/MS degree in Computer Science, Engineering, or a related field
  • Strong experience with data engineering tools and platforms, including Delta Lake and Data Lakehouse architectures
  • Proficiency in data pipeline orchestration tools such as Apache Airflow
  • Strong programming skills in Python and/or Scala
  • Understanding of data warehousing solutions, relational database theory, and NoSQL databases
  • Good knowledge of new and emerging tools for extracting, ingesting, and processing large datasets (Kafka, Spark, Hadoop, Databricks, or equivalent)
  • Familiarity with Azure Data Factory and Azure Logic Apps is a big plus
  • Hands-on experience with Amazon Web Services (AWS) is a big plus, as is knowledge of other cloud platforms (Azure/GCP)
  • Understanding of Docker containerization, Kubernetes, data modeling, and design patterns is a big plus
  • Knowledge of web scraping technologies (Selenium, Beautiful Soup, etc.) is a big plus
  • Familiarity with Linux
  • Excellent interpersonal, communication, and organizational skills