Job Opening: Databricks + PySpark + SQL | 4 to 8 Years | IST Time | Bangalore | Onsite


Job description:

Role: Databricks + PySpark + SQL

Years of Experience: 4-8 years

Responsibilities:

  1. Collaborate with cross-functional teams to understand data requirements and design efficient data processing solutions.
  2. Develop and maintain ETL processes using Databricks and PySpark for large-scale data processing.
  3. Tune and optimize existing data pipelines for performance and scalability.
  4. Create and implement data quality and validation processes to ensure data accuracy.
  5. Work with stakeholders to understand business needs and translate them into actionable data solutions.
  6. Collaborate with the data science team to support machine learning model deployment and integration into production systems.
  7. Troubleshoot and resolve data-related issues promptly.

Requirements:

  1. Bachelor's degree in Computer Science, Engineering, or a related field.
  2. Proven experience working with Databricks, PySpark, and SQL in a professional setting.
  3. Strong proficiency in designing and optimizing ETL processes for large-scale data sets.
  4. Experience with data modeling, data warehousing, and database design principles.
  5. Familiarity with cloud platforms such as AWS, Azure, or GCP.