Machine Learning Developer - II
Job Description:
We're looking for a skilled Machine Learning Developer to play a key role in driving data-driven decision-making across our digital platform. This is a unique opportunity to contribute to impactful machine learning initiatives as part of a collaborative, forward-thinking team.
If you're ready to make a lasting impact and thrive in a long-term contract role, we want to hear from you. Apply now to be part of our journey toward innovation and excellence!
Job Responsibilities
- Develop Scalable Data Pipelines: Collaborate with engineers to build efficient data pipelines and architectures, incorporating MLOps best practices for large language models (LLMs).
- Data Handling & Processing: Support data collection, analysis, content understanding, storage, and processing tasks.
- Model Development: Write code for training, testing, and deploying machine learning models.
- Model Monitoring: Monitor and troubleshoot machine learning models to maintain accuracy and performance.
- Requirements Analysis: Perform requirements analysis, collaborate with team members, and document solutions.
- Data Management: Work with large-scale data sets and manage data flow between systems.
- Batch Data Processing: Organize and process large batches of text and geometric data.
- Data Insights & Visualization: Communicate findings through quantitative analysis, visuals, and actionable insights.
Minimum Qualifications
- Educational Background: MS in Machine Learning, Artificial Intelligence, Mathematics, Statistics, Computer Science, or a related field.
- Experience: 5 years of experience in machine learning engineering or a related field.
- Deep Learning Expertise: Hands-on experience with training deep neural networks (e.g., CNNs, transformers) and proficiency with at least one deep learning framework such as PyTorch or TensorFlow.
- LLM Knowledge: Experience with large language models (LLMs), embedding models, vector databases, and Retrieval-Augmented Generation (RAG) systems.
- Data Modeling & Processing: Experience with data modeling, architecture, and processing, including handling 2D/3D geometric data.
- Cloud & Platform Proficiency: Experience with AWS cloud services, including SageMaker Studio, for scalable data processing and model development.
- Algorithm Understanding: Strong understanding of fundamental computer science algorithms and their scalability.
- Programming Skills: Proficiency in both procedural and data-analytics-oriented programming languages (e.g., Python).
- Solution-Oriented Mindset: Ability to convert theoretical concepts into practical, prototype-ready solutions.
APPLY NOW!
NearSource Technologies values diversity and is committed to equal opportunity. All qualified applicants will be considered regardless of race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or protected veteran status.
Required Skills:
Machine Learning, Deep Learning, Artificial Intelligence, Python, AWS, Data Pipelines, Data Modeling, Data Processing, Data Collection, Data Management, Visualization, Databases, Algorithms, Scalability, Requirements Analysis, Statistics, Mathematics, Computer Science, Programming, Architecture, Storage, Testing