Job Description:
About ShipIn:
At ShipIn Systems, we are driving operations for the leaders in the maritime industry through our Visual Fleet Management Platform. With patented computer vision applications and real-time visual analytics, ShipIn's platform proactively alerts shipowners, managers, and seafarers to activity onboard, improving safety, driving more efficient operations, and modernizing the global supply chain.
Position Description:
We are looking for an experienced and motivated Data Engineer to join our Data Pipeline team. In this role, you will take ownership of the existing data pipelines in the on-vessel environments. You will work closely with data scientists, backend developers, and other stakeholders to ensure seamless integration of ML models into production.
Key Responsibilities:
- Design, implement, and optimize the product data pipelines in the on-vessel environments.
- Collaborate with data scientists to deploy and scale ML models, ensuring they operate efficiently in real-time environments.
- Develop high-performance data pipelines for both batch and stream processing (a minimal illustrative sketch follows this list).
- Write clean, efficient Python code to maintain the data pipelines.
- Collaborate with cross-functional teams to build the pipeline infrastructure required to support new product features, simplify ML model deployment, and improve overall system performance.
- Participate in design and code reviews, providing constructive feedback and maintaining high quality standards across the team.
- Stay current with the latest developments in data engineering and distributed systems, applying new insights to improve existing processes.
- Troubleshoot and resolve issues related to the data pipelines, ensuring minimal downtime and optimal performance.
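For illustration only, here is a minimal sketch of the kind of stream-processing work this role involves: a Python consumer that reads JSON events from Kafka and writes them to PostgreSQL. The topic name, table schema, and connection settings are hypothetical placeholders, and a production pipeline would add batching, error handling, retries, and schema management.

    # Illustrative sketch only: topic, table, and connection details are hypothetical.
    import json

    import psycopg2
    from kafka import KafkaConsumer

    # Consume JSON-encoded events from a hypothetical "vessel-events" topic.
    consumer = KafkaConsumer(
        "vessel-events",
        bootstrap_servers="localhost:9092",
        group_id="on-vessel-pipeline",
        auto_offset_reset="earliest",
        enable_auto_commit=False,  # commit offsets only after a successful write
        value_deserializer=lambda m: json.loads(m.decode("utf-8")),
    )

    # Write each event to a hypothetical PostgreSQL table.
    conn = psycopg2.connect(host="localhost", dbname="shipin", user="pipeline", password="secret")
    cur = conn.cursor()

    for message in consumer:
        event = message.value
        cur.execute(
            "INSERT INTO vessel_events (vessel_id, event_type, payload) VALUES (%s, %s, %s)",
            (event["vessel_id"], event["event_type"], json.dumps(event)),
        )
        conn.commit()      # persist the row first...
        consumer.commit()  # ...then mark the Kafka offset as processed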
Qualifications / Experience:
- 3+ years of experience working with data pipelines and machine learning models.
- Hands-on experience with PostgreSQL.
- Hands-on experience with Kafka.
- Proficiency in Python.
- Understanding of how to integrate machine learning models into large-scale production environments.
- Experience working with Docker and Kubernetes.
- Experience in Agile methodologies and CI/CD processes.
- Experience working in both cloud and on-premises environments.
- Team player with the ability to learn quickly.
- Excellent problem-solving skills and ability to troubleshoot complex issues in distributed systems.
- Excellent communication and collaboration skills for working with cross-functional teams.
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Advantage: experience with image/video processing.