Senior AWS Data Engineer
Job Description:
Syffer is an all-inclusive consulting company focused on talent, tech and innovation. We exist to elevate companies and humans all around the world, making change, from the inside to the outside.
We believe that technology + human kindness positively impacts every community around the world. Our approach is simple: we see a world without borders and believe in equal opportunities. We are guided by our core principles of spreading positivity and good energy, promoting equality, and caring for others.
Our hiring process is unique! People are selected for their value, education, talent and personality. We don't consider ethnicity, religion, national origin, age, gender, sexual orientation or identity.
It's time to burst the bubble, and we will do it together!
What You'll do:
- Design, implement, and operate scalable cloud-based data pipelines to ingest, process, and transform data coming from diverse sources such as APIs and secure file transfers;
- Build and maintain both batch and streaming data solutions with a strong focus on reliability, performance, and observability;
- Ensure data pipelines are monitored, documented, and supported to enable efficient usage by analytics and business teams;
- Structure and manage complex datasets while meeting business, technical, and performance requirements;
- Contribute to data governance initiatives by documenting datasets, maintaining metadata, and improving data quality standards;
- Develop and execute data validation, functional, and non-functional tests across the data platform;
- Design and maintain efficient data models and database objects, including tables, views, and transformation scripts;
- Analyze, troubleshoot, and optimize SQL queries to support high-performance analytics workloads;
- Participate in production support activities, incident resolution, and continuous monitoring;
- Collaborate with cross-functional teams to translate functional requirements into robust technical solutions;
- Apply architectural best practices and continuously improve the scalability, maintainability, and performance of data solutions;
- Hybrid work model.
Who You Are:
- Degree in Engineering, Computer Science, or a related technical field;
- Proven experience building and maintaining data pipelines in AWS environments (S3, ECS, Lambda, RDS, SNS/SQS, IAM, CloudWatch);
- Strong expertise in SQL, including complex analytical queries, and experience with analytical and non-relational databases;
- Solid programming skills in Python and experience with data-related libraries such as Pandas and Boto3;
- Good understanding of data platform architecture, data governance, and best practices;
- Hands-on experience with version control systems such as GitHub or GitLab;
- Experience with Snowflake is a strong advantage;
- Familiarity with dbt Core is a plus;
- Experience with orchestration tools (e.g. AWS Step Functions, Airflow, Prefect);
- Exposure to CI/CD practices and Infrastructure as Code tools (Terraform, CloudFormation) is appreciated;
- Knowledge of data visualization tools (e.g. Tableau) is a plus;
- AWS and/or Snowflake certifications are considered an advantage;
What you'll get:
- Salary according to the candidate's professional experience;
- Remote Work whenever possible;
- Work equipment suited to your role and responsibilities;
- Benefits plan;
- And others.
Work alongside expert teams on large-scale, long-term projects with our clients, all leaders in their industries.
Are you ready to step into a diverse and inclusive world with us?
Together we will promote uniqueness!