Azure Big Data Engineer (Development & Support) | 4 to 6 Yrs | WFO | SOW 2756
About the job
Job Title: Azure Big Data Engineer (Development & Support)
Years of Experience: 4 to 6 Years
Relevant Experience: 5+ Years
Headcount: 01
Job/Work Type: WFO (5 days a week for the first 3 months; hybrid, 3 days a week, thereafter)
Location: Pune
Time Zone: IST
Work Days: Monday to Friday
Project Name: Finolex
Client: Bosch
End Client: Finolex
Role & Responsibilities / Job Description:
Must-have skills:
- Databricks, Logic Apps, Azure Data Factory (ADF), ADLS, Azure SQL, and Databricks SQL.
- Good understanding and knowledge of Azure DevOps.
Requirements:
- Bachelor's degree in Information Technology or related field.
- 5+ years of hands-on experience in developing and supporting Azure Services.
- End-to-end project development or implementation experience, from initiation to delivery, is required.
- Proficient in Azure: Databricks, Logic Apps, Azure Data Factory (ADF), ADLS, Azure SQL, and Databricks SQL.
- Hands-on experience with Python and a solid understanding of the Spark architecture (PySpark); see the illustrative sketch after this list.
- Good understanding of SQL and NoSQL databases, preferably SAP HANA, MySQL, MS SQL, etc.
- Good understanding and knowledge of Azure DevOps.
- Good understanding of the different environments (development, testing, and production setups).
- Good understanding of the project life cycle and agile methodology.
- Knowledge of best practices in the data engineering discipline.
- Excellent communication and collaboration skills.
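The posting itself contains no code, but as a rough illustration of the hands-on PySpark work referenced above, the following minimal sketch reads raw files from an ADLS Gen2 location, applies a simple cleanup, and writes a Delta table. The storage account, container, schema, and column names are hypothetical placeholders, not details from this role.

```python
# Minimal, illustrative PySpark sketch; paths, table names, and columns are
# hypothetical, not taken from the posting.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("adls_ingest_example").getOrCreate()

# Hypothetical ADLS Gen2 path; a real project would parameterize this,
# for example via ADF pipeline parameters or Databricks widgets.
raw_path = "abfss://raw@examplestorage.dfs.core.windows.net/sales/2024/"

orders = (
    spark.read
    .option("header", "true")
    .csv(raw_path)
)

# Basic cleanup: de-duplicate on the key, cast the amount, drop invalid rows.
clean = (
    orders.dropDuplicates(["order_id"])
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("amount").isNotNull())
)

# Persist as a Delta table for downstream layers (assumes the target schema
# already exists in the workspace catalog).
clean.write.format("delta").mode("overwrite").saveAsTable("bronze.orders")
```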
Key Responsibilities:
- Monitoring: Keep a close eye on the platforms, apps, data flow, and Azure infrastructure.
- First Level Triage: Utilize Standard Operating Procedures (SOP) to guide early issue analysis and resolution.
- Communication: Effectively communicate with end users, escalate problems as needed, and help set up alerts and notifications for important events or breaches.
- Managing Incidents: Respond to regular issues and service requests in accordance with established SLAs.
- Cooperation: Work with the L2 and L3 teams, and with OEMs such as Microsoft and Databricks, to resolve issues.
- Development: Build new pipelines from source to target (up to the semantic layer) based on business requirements; a hedged sketch follows this list.
- Improvement: Streamline the existing ETL/ELT pipelines to cut cost and run time, and tune the semantic layer to maximize underlying performance and reduce data latency.
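To make the "source to target (semantic layer)" and improvement items above more concrete, here is a hedged sketch, again with purely hypothetical table, schema, and column names, of aggregating a cleansed Delta table into a smaller, partitioned semantic-layer table so that downstream BI queries scan less data.

```python
# Hedged illustration of a semantic-layer build; table, column, and schema
# names are assumptions for the example, not taken from the posting.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("semantic_layer_example").getOrCreate()

# Cleansed source produced by an upstream ingestion job (hypothetical name).
orders = spark.read.table("bronze.orders")

# Daily revenue per region: a typical semantic-layer aggregate that keeps
# dashboards from scanning row-level data, which helps cost and latency.
daily_revenue = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("region", "order_date")
    .agg(
        F.sum("amount").alias("revenue"),
        F.countDistinct("order_id").alias("order_count"),
    )
)

# Partitioning by date lets downstream queries prune whole partitions.
(
    daily_revenue.write.format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .saveAsTable("gold.daily_revenue")
)
```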
Education and Work Experience Requirements:
- Bachelor's degree in information technology or a related field.
- 5+ years of hands-on experience in developing and supporting Power BI reports and dashboards.
- Proficient in Power BI DAX, Role-Based Access Control (RBAC), dashboard development, and testing and production setups.
- Good understanding of Project Life Cycle and different development environments.
- Knowledge of BI best practices and data visualization principles.
- Excellent communication and collaboration skills.