About the job Data Engineer
About TEKEVER:
At TEKEVER, we are at the forefront of leveraging cutting-edge data and AI technologies to drive innovation and deliver exceptional value to our clients. Our team is passionate about creating intelligent solutions that transform countries, governments and businesses, and improve lives.
TEKEVER's mission is to provide our customers with actionable intelligence to make the best decisions faster - both in real-time and non-real-time - in the most challenging environments across the globe. We design and build state-of-the-art autonomous Unmanned Aerial Systems (UAS), spanning both hardware and software platforms, and are growing fast in multiple areas within both the Defense and Civil domains. Typical use cases for our portfolio of drones and AI platforms include, amongst others, military surveillance & intelligence gathering, oil pipeline inspections, maritime surveillance, wildfire monitoring, crowd control, change detection, automated area search, sense & avoid, precision landing, swarming - and many more.
As such, we are significantly expanding our Data & AI function and are seeking highly skilled and motivated Data Engineers.
Job Overview:
As a Data Engineer, you will play a critical role in designing, building and maintaining the data pipelines and systems that support our data-driven initiatives, as well as supporting the evolution of our Data & Analytics Platform. You will work closely with data scientists, analysts and other stakeholders to ensure that our data & AI infrastructure is robust, scalable and efficient. The ideal candidate will have a strong background in data engineering, with experience in data integration, ETL processes, database management and Data & Analytics Platform development.
What will be your responsibilities:
Data Pipeline Development: Design, develop and maintain scalable and efficient data pipelines to collect, process and store large volumes of data from various sources.
ETL Processes: Implement ETL (Extract, Transform, Load) processes to ensure data is accurately and efficiently transformed and loaded into data storage systems.
Database Management: Manage and optimize databases and data warehouses to ensure data integrity, performance and availability.
Data Integration: Integrate data from multiple sources, including APIs, databases and external data providers, to create unified datasets for analysis.
Data & Analytics Platform development & expansion: support the expansion of our Data & Analytics Platform.
Data Quality Assurance: Implement data validation and quality assurance processes to ensure the accuracy and consistency of data.
Collaboration: Work closely with data scientists, analysts and other stakeholders to understand data requirements and provide the necessary data infrastructure and support.
Performance Optimization: Monitor and optimize the performance of data pipelines and databases to ensure efficient data processing and retrieval.
Documentation: Maintain comprehensive documentation of data pipelines, ETL processes and database schemas.
Profile and requirements:
Education: Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a related field.
Experience: 3+ years of experience in data engineering or a similar role.
Technical Skills:
Proficiency in programming languages such as Python, Java, or Scala.
Experience with SQL and database management systems (e.g., MySQL, PostgreSQL, SQL Server).
Familiarity with big data technologies (e.g., Hadoop, Spark) and data warehousing solutions (e.g., Redshift, Snowflake).
Experience with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services, with a focus on Google Cloud. Google Cloud certification is preferred.
Knowledge of data integration tools and frameworks (e.g., Apache Nifi, Talend, Informatica).
Experience with data modeling and schema design.
Experience with Infrastructure as Code (e.g. Ansible, Terraform), data pipeline orchestration (e.g. Airflow), data visualization and dashboarding tools (e.g. Streamlit, Dash), data extraction and serving (e.g. PostGIS, Kafka, FastAPI), pandas, scikit-learn and Docker.
Basic understanding of DevOps best practices and tools: Git, CI/CD, telemetry and monitoring, etc.
Analytical Skills: Strong analytical and problem-solving skills with a focus on delivering scalable and efficient data solutions.
Communication: Excellent verbal and written communication skills, with the ability to effectively collaborate with technical and non-technical stakeholders.
Attention to Detail: High attention to detail and a commitment to ensuring data quality and accuracy.
Adaptability: Ability to work in a fast-paced, dynamic environment and manage multiple priorities simultaneously.
What we have to offer you:
An excellent work environment and an opportunity to create a real impact in the world;
A truly high-tech, state-of-the-art engineering company with flat structure and no politics;
Working with the very latest technologies in Data & AI, including Edge AI, Swarming - both within our software platforms and within our embedded on-board systems;
Flexible work arrangements;
Professional development opportunities;
Collaborative and inclusive work environment;
Salary commensurate with proven experience.
TEKEVER is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.
If the above excites you, apply now! Send your CV to jobs@tekever.com.