
About the job

Senior Data Engineer/Analyst - 3-Year Contract

Qualifications & Experience:

Must-Have:

  • Bachelor's or Master's degree in Computer Science, Data Science, Engineering, Mathematics, or a related field.

  • 5+ years of experience in data engineering, analytics, or BI development.

  • Strong proficiency in SQL and Python for data manipulation and transformation.

  • Experience with ETL/ELT processes, data modeling, and data warehousing concepts.

  • Expertise in cloud platforms (AWS, Azure, or GCP) and big data tools (Spark, Snowflake, Databricks, Kafka).

  • Familiarity with data visualization tools (Power BI, Tableau, Looker).

Nice-to-Have:

  • Experience with AI/ML model deployment for predictive analytics.

  • Knowledge of DevOps for data (CI/CD, Infrastructure-as-Code).

  • Certifications such as AWS Certified Data Analytics, Azure Data Engineer Associate, or Google Cloud Professional Data Engineer.

Responsibilities:

Data Engineering & Architecture:

  • Design, develop, and maintain scalable and efficient ETL pipelines for data ingestion, transformation, and storage.

  • Build and optimize data warehouses, data lakes, and real-time streaming solutions to support business intelligence and analytics needs.

  • Ensure data quality, integrity, and security across all data processing workflows.

  • Collaborate with Data Scientists, Analysts, and Software Engineers to design data models that enable advanced analytics.

  • Implement data governance, cataloging, and lineage tracking to ensure transparency and compliance.

Data Analysis & Business Intelligence:

  • Conduct data exploration, statistical analysis, and trend identification to extract actionable insights.

  • Develop interactive dashboards and reports using BI tools like Power BI, Tableau, or Looker.

  • Work closely with business teams to understand KPIs and performance metrics, and translate data into clear, actionable recommendations.

  • Optimize query performance and database efficiency for large-scale data processing.

Cloud & Big Data Technologies:

  • Design and manage cloud-based data solutions (AWS, Azure, GCP) with services such as AWS Glue, Azure Data Factory, Google BigQuery, Snowflake, and Databricks.

  • Work with big data frameworks like Apache Spark, Hadoop, or Kafka for distributed data processing.

  • Develop automated data pipelines using orchestration tools like Airflow, Prefect, or Luigi.

Collaboration & Leadership:

  • Work cross-functionally with engineering, product, and business teams to define data requirements.

  • Mentor junior team members and provide guidance on best practices in data engineering and analytics.

  • Drive continuous improvement initiatives in data architecture, automation, and AI-driven analytics.