Job Opening: GCP Big Data Developer - Remote

About the Job

Primary / Must-Have Skills:

Hands-on experience with Python/PySpark and GCP (must have)

Exposure to the Hadoop ecosystem (HDFS, Hive, and related big data tools)

Strong SQL skills (SQL is the foundation for all HQL work)

Working experience on a public cloud (GCP preferred)

Job Description

Total IT experience of 5-12 years

Knowledge of cloud platforms, particularly GCP

Analyze and organize raw data

Build data systems and pipelines

Evaluate business needs and objectives

Interpret trends and patterns

Conduct complex data analysis and report on results

Prepare data for prescriptive and predictive modeling

Build algorithms and prototypes

Combine raw information from different sources

Explore ways to enhance data quality and reliability

Identify opportunities for data acquisition

Develop analytical tools and programs

Collaborate with data scientists and architects on several projects

Technical expertise with data models, data mining, and segmentation techniques

Knowledge of programming languages (e.g., Java and Python)

Hands-on experience with SQL database design

Strong numerical and analytical skills
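As a sketch of the SQL database design and analysis skills listed above, the snippet below creates a small schema and runs an aggregation query using Python's standard-library `sqlite3`. The table and column names are invented for illustration; the same `GROUP BY` pattern applies in Hive's HQL.

```python
import sqlite3

# In-memory database for demonstration; schema and data are made up.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE orders (
        order_id INTEGER PRIMARY KEY,
        region   TEXT NOT NULL,
        amount   REAL NOT NULL
    )
""")
conn.executemany(
    "INSERT INTO orders (region, amount) VALUES (?, ?)",
    [("EU", 120.0), ("US", 310.0), ("EU", 95.0)],
)

# Aggregate revenue per region; HQL uses essentially the same syntax.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
).fetchall()
conn.close()
# rows == [("EU", 215.0), ("US", 310.0)]
```

Designing keys and constraints up front, as in the `CREATE TABLE` above, is the kind of hands-on SQL design experience the role calls for.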

Soft Skills:

Good communication skills

Flexible and willing to learn new technologies