Job Openings
Azure Big Data Engineer | 4 to 8 Yrs | WFO | SOW 2024 2579
Job Title: Azure Big Data Engineer
Years of Experience: 4 to 8 Years
Relevant Experience: 5 Years
Headcount: 02
Job/Work type: WFO (5 days a week for the first 3 months, then hybrid at 3 days a week)
Location: Bangalore (preferred), Hyderabad
Time Zone: IST (10:00 am to 7:00 pm IST)
Project Name: -
Client: Bosch
End Client: -
Other details:
Interview details:
- L1 round - Face-to-face at Bosch locations in Bangalore or Hyderabad (with the Bosch team)
- L2 round - Virtual, with the end client
Mandatory Skills:
Azure skills, including:
- Azure Databricks
- Azure Synapse
- Azure Functions
- Azure Data Lake
- Azure Integration Services
- Azure API Management
- CI/CD and PowerShell Scripts
- ADF
- ETL Pipelines
- Unity Catalog
- Azure SQL Database
- SQL Coding
- Storage Accounts, Azure Data Explorer, Spark Structured Streaming, Netezza SQL queries, data lineage, Spark jobs
Role & Responsibilities / Job Description:
- Overall 4 to 8 years of experience, with a minimum of 4 years of relevant professional work experience.
- Design, implement, and maintain data pipelines for data ingestion, processing, and transformation in Azure.
- Collaborate with data analysts to understand data requirements and build effective data workflows.
- Create and maintain data storage and processing solutions using Azure Databricks, Azure Functions, Azure Data Lake, Azure Synapse, Azure Integration Services, Azure API Management, ADF, ETL pipelines, Unity Catalog, Azure SQL Database, SQL, CI/CD, and PowerShell scripts.
- Use Azure Data Factory (ADF) pipelines to create and maintain ETL (Extract, Transform, Load) operations.
- Hands-on experience with Databricks for implementing transformations and Delta Lake.
- Hands-on experience with Synapse serverless and dedicated SQL pools.
- Use ADF pipelines to orchestrate the end-to-end data transformation, including the execution of Databricks notebooks.
- Experience working with the Medallion architecture.
- Experience working on CI/CD pipelines using Azure DevOps.
- Attach both ADF and Azure Databricks (ADB) to DevOps.
- Create and manage Azure infrastructure across the landscape using Bicep.
- Implement data validation and cleansing procedures to ensure the quality, integrity, and reliability of the data.
- Improve the scalability, efficiency, and cost-effectiveness of data pipelines.
- Monitor and resolve data pipeline problems to ensure data consistency and availability.
- Good to have: exposure to Power BI and Power Automate.
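The Medallion (bronze/silver/gold) layering named above can be sketched in minimal pure Python. This is an illustration of the pattern only, not the Databricks/PySpark implementation the role would use, and the record fields (`id`, `amount`) are hypothetical example data:

```python
# Pure-Python sketch of the Medallion bronze/silver/gold flow.
# In the actual stack this would be Databricks/PySpark over Delta Lake;
# the 'id'/'amount' fields are hypothetical illustration data.

def bronze_ingest(raw_rows):
    """Bronze: land raw records as-is, adding only a source tag."""
    return [dict(row, _source="landing") for row in raw_rows]

def silver_clean(bronze_rows):
    """Silver: reject malformed rows, cast types, deduplicate on 'id'."""
    seen, cleaned = set(), []
    for row in bronze_rows:
        if row.get("id") is None or row.get("amount") is None:
            continue  # reject malformed records
        if row["id"] in seen:
            continue  # deduplicate on the business key
        seen.add(row["id"])
        cleaned.append({"id": int(row["id"]), "amount": float(row["amount"])})
    return cleaned

def gold_aggregate(silver_rows):
    """Gold: business-level aggregate consumed by reporting."""
    return {"total_amount": sum(r["amount"] for r in silver_rows),
            "row_count": len(silver_rows)}
```

In production each layer would be a Delta table, with the silver and gold steps typically expressed as Databricks notebook transformations orchestrated by ADF.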
Task description:
- We are looking for a Data Engineer to join our growing team of analytics experts.
- The hire will be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams.
- The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up.
- Good hands-on experience implementing incremental/delta loads.
- Develop and support project development activities and automation solutions for customers.
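The incremental/delta load mentioned above follows an upsert ("merge") pattern, which Delta Lake implements at scale via MERGE INTO. A minimal pure-Python sketch of the core logic, with a hypothetical table shape and key column:

```python
# Pure-Python sketch of an incremental (upsert/"merge") load: update
# rows whose key already exists in the target, insert the rest.
# Column names and the 'id' key are hypothetical illustration data.

def incremental_load(target, batch, key="id"):
    """Merge an incoming batch into the target table (list of dicts)."""
    merged = {row[key]: row for row in target}  # index existing rows by key
    for row in batch:
        # Upsert: overwrite matched fields, keep unmatched existing fields.
        merged[row[key]] = {**merged.get(row[key], {}), **row}
    return list(merged.values())
```

In Delta Lake the same pattern avoids full-table rewrites and keeps the load idempotent across re-runs of the same batch.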
Deliverables Expected
The following shall be the deliverables as per project documented quality process:
- Updated requirements document / Detailed Bug Analysis report
- Concept document (wherever applicable)
- Design document (for all features)
- Test specification document
- Source code
- Review report
- Test report
- Filled review checklists
- Traceability matrix
- User manuals and related artifacts
- Filled pre-delivery checklist (PDC)
- Release Note & release mails