CRI - Integrations Engineer (Data & Systems)
Job Title: Integrations Engineer (Data & Systems)
Contractor Fee: $3,000-$3,200/month (USD)
Work Arrangement: Remote
Engagement Type: Independent Contractor
Commitment: Full-time (Approx. 40 hours/week)
Company Overview:
Tidal is a Direct Placement Agency that helps job seekers find opportunities for real growth. We work with stable, responsible businesses that are experienced in remote hiring and excited to welcome international team members. Tidal is owned and operated by consumer brand owners and operators. We have offshore team-building experience and aim to help businesses leverage global talent.
About This Role:
We are hiring a highly skilled Integrations Engineer to internalize third-party system integrations and build scalable frameworks that power complex ecommerce and fulfillment operations. You will design and maintain secure, high-throughput data pipelines, standardize data from unstructured sources, and create infrastructure that supports financial and operational workflows. This role requires strong backend engineering experience, cloud data expertise, and the ability to work independently in a production environment where integrations are mission-critical.
Key Responsibilities:
Design and maintain scalable ELT and data pipeline frameworks
Build and manage integrations using REST APIs, GraphQL APIs, webhooks, SFTP, and file-based workflows
Transform and normalize data from multiple structured and unstructured sources into unified schemas
Develop integrations with warehouse management systems and accounting platforms
Extract and structure carrier invoice data from email attachments and PDF files
Implement secure, production-ready data handling and pipeline security best practices
Maintain a high-throughput data infrastructure with strong observability and logging
Leverage LLM-based solutions to parse and process unstructured documents
Build and maintain MCP servers to support customer-facing applications
Monitor, debug, and continuously optimize integration reliability and performance
Software/Platforms/Tools:
Google Cloud Platform (BigQuery, Cloud SQL, Cloud Storage)
Python
Logfire or similar observability/logging tools
REST APIs, GraphQL APIs, Webhooks
SFTP and CSV-based ingestion workflows
LLM tools for document parsing
Modern data security frameworks
Qualifications:
3–5 years of freelancing experience
Excellent English communication skills (C1 or C2 level)
3–6+ years of experience in backend or data engineering
Strong hands-on experience with GCP data infrastructure
Advanced proficiency in Python
Proven experience building and maintaining production data pipelines
Experience transforming unstructured data into operational schemas
Familiarity with secure data pipeline architecture and best practices
Ability and willingness to work with AI/LLM-based processing workflows
Experience in ecommerce, logistics, or fulfillment systems is a strong plus
Highly independent, systems-oriented, and solution-driven
Shift Schedule:
Monday to Friday, 9:00 AM – 5:00 PM US Central Time (CST)