Senior Data/Python Engineer (Snowflake/Airflow/AWS) - Remote Portugal

ABOUT THE OPPORTUNITY

Join a world-class technology consultancy as a Senior Data/Python Engineer, leading the design and implementation of sophisticated data pipelines and systems. You'll draw on strong Python development skills to guide teams while building scalable, high-performance data solutions on modern cloud-native technologies. The role offers the opportunity to work with cutting-edge data platforms including Snowflake, Airflow, and dbt, architecting both real-time and batch data pipelines that drive critical business insights.

PROJECT & CONTEXT

You'll architect and operate data-centric systems handling large-scale datasets on AWS, with a deep focus on Snowflake optimization: data modeling, performance tuning, and cost efficiency. The role involves designing, orchestrating, and monitoring complex workflows with Apache Airflow, and building data transformations with dbt, including modeling, testing, and documentation. You'll implement both real-time and batch data pipelines on AWS services such as S3 and core cloud infrastructure, all managed through Infrastructure as Code with Terraform. The team places a strong emphasis on engineering excellence: CI/CD pipelines built on GitHub Actions, comprehensive automated testing (unit, linting, integration, end-to-end), and observability using tools like New Relic. You'll work with structured, semi-structured, and unstructured data, applying a solid understanding of microservices design and architectural principles while collaborating through Git/GitHub workflows.
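
For a concrete flavor of the orchestration work described above, here is a minimal sketch (not this team's actual code) of an Airflow DAG that waits for a raw extract to land in S3 and then runs dbt. The DAG id, bucket, key, and dbt project path are hypothetical placeholders, and the sketch assumes Airflow 2.4+ with the Amazon provider package installed:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator
    from airflow.providers.amazon.aws.sensors.s3 import S3KeySensor

    # All names below (dag_id, bucket, key, paths) are hypothetical placeholders.
    with DAG(
        dag_id="daily_orders_pipeline",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        # Block until the day's raw extract lands in S3.
        wait_for_extract = S3KeySensor(
            task_id="wait_for_extract",
            bucket_name="example-raw-bucket",
            bucket_key="orders/{{ ds }}/extract.parquet",
        )

        # Build and test the dbt models against the warehouse (e.g. Snowflake).
        run_dbt = BashOperator(
            task_id="run_dbt",
            bash_command="cd /opt/dbt/project && dbt build --select orders+",
        )

        wait_for_extract >> run_dbt

In a setup like the one described, a DAG of this kind would live in Git, ship through a GitHub Actions pipeline, and run on Terraform-provisioned infrastructure.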

WHAT WE'RE LOOKING FOR (Required)

  • Strong Python development skills with ability to guide and mentor other engineers
  • Advanced SQL expertise, including experience working with large-scale datasets
  • Snowflake mastery: Deep experience with Snowflake including data modeling, optimization, and performance tuning
  • Engineering principles: Solid understanding of architectural principles, microservices design, and data-centric systems
  • Airflow experience: Hands-on with Apache Airflow for designing, orchestrating, and monitoring workflows
  • dbt expertise: Experience with dbt for data modeling, transformations, and testing
  • Pipeline development: Proven experience building and operating both real-time and batch data pipelines
  • AWS proficiency: Hands-on experience with AWS S3 and core cloud services
  • CI/CD expertise: Strong experience with GitHub Actions or similar CI/CD technologies
  • Version control: Proficient with Git/GitHub and collaborative development workflows
  • Infrastructure as Code: Experience using Terraform for infrastructure management
  • Automated testing: Strong practices across unit testing, linting, integration, and end-to-end tests (see the pytest sketch after this list)
  • Diverse data handling: Experience with structured, semi-structured, and unstructured data
  • Observability tools: Experience with monitoring platforms like New Relic or similar technologies
  • Language requirement: Fluent English (mandatory)
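
As a hedged illustration of the automated-testing bar described above, a unit test for a pipeline transformation might look like the following pytest sketch; normalize_order is a hypothetical helper invented for this example, not part of any stack named here:

    import pytest

    # Hypothetical transformation of the kind a pipeline might apply per record.
    def normalize_order(raw: dict) -> dict:
        """Lowercase the currency code and coerce the amount to float."""
        return {
            "order_id": raw["order_id"],
            "currency": raw["currency"].lower(),
            "amount": float(raw["amount"]),
        }

    def test_normalize_order_coerces_types():
        raw = {"order_id": "A-1", "currency": "EUR", "amount": "19.90"}
        assert normalize_order(raw) == {"order_id": "A-1", "currency": "eur", "amount": 19.9}

    def test_normalize_order_missing_field_raises():
        # Malformed input should fail loudly rather than pass through silently.
        with pytest.raises(KeyError):
            normalize_order({"currency": "EUR", "amount": "1.00"})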

NICE TO HAVE (Preferred)

  • Familiarity with Customer Data Platforms (CDP) and customer data ecosystems
  • Container orchestration experience with Kubernetes and Docker
  • AWS Lambda for serverless data processing
  • Exposure to Apache Kafka or similar event streaming platforms