Lisbon, Portugal

Data Engineer

Job Description:

Syffer is an all-inclusive consulting company focused on talent, tech and innovation. We exist to elevate companies and humans around the world, making change from the inside out.

We believe that technology + human kindness positively impacts every community around the world. Our approach is simple: we see a world without borders and believe in equal opportunities. We are guided by our core principles of spreading positivity and good energy, promoting equality, and caring for others.

Our hiring process is unique! People are selected for their value, education, talent and personality. We don't consider ethnicity, religion, national origin, age, gender, sexual orientation or identity.

It's time to burst the bubble, and we will do it together!

What You'll do:

- Analyze user problems, ensure clear understanding of architecture, and maintain open communication with Data Architect, peers, and Project Manager;

- Design and implement data pipelines and infrastructure (e.g., with Terraform), follow data best practices, and manage interface contracts with version control and code reviews;

- Apply strong knowledge of data warehousing, ETL/ELT processes, data lakes, and modeling throughout development;

- Define, execute, and document functional and technical tests in collaboration with the Project Manager, sharing regular updates on results;

- Participate in Deployment Reviews, monitor post-deployment behavior, log errors, and ensure proper use of deployment and monitoring strategies.


What You Are:

- Proficiency with PySpark and Spark SQL for data processing;

- Experience with Databricks using Unity Catalog;

- Knowledge of Delta Live Tables (DLT) for automated ETL and workflow orchestration in Databricks;

- Familiarity with Azure Data Lake Storage;

- Experience with orchestration tools (e.g., Apache Airflow or similar) for building and scheduling ETL/ELT pipelines;

- Knowledge of data partitioning and data lifecycle management on cloud-based storage;

- Familiarity with implementing data security and data privacy practices in a cloud environment;

- At least one year of experience with Terraform and knowledge of GitOps good practices.

Additional knowledge and experience that is a plus: Databricks Asset Bundles, Kubernetes, Apache Kafka, Vault.

What you'll get:

- Salary according to the candidate's professional experience;

- Remote work whenever possible;

- Health insurance from the start of employment;

- Work equipment suited to your role;

- And more.

Work with expert teams on large-scale, long-term projects, together with our clients, all leaders in their industries.

Are you ready to step into a diverse and inclusive world with us?

Together we will promote uniqueness!
