About the job: GCP Data Engineer - Senior Consultant
3-5 years - GCP Data Engineer - Headcount: 20
6-8 years - GCP Lead Data Engineer - Headcount: 8
8-12 years - GCP Architect - Headcount: 4
FAQ:
1. What percentage of your time do you spend writing Python scripts for data cleanup or other work?
2. BigQuery - which tools have you used with it?
3. Have you worked on batch processing of data, streaming, or both? How much experience do you have with batch processing, and how much with streaming?
4. Have you worked on workflow creation using Airflow? Yes/No (see the DAG sketch after this list)
5. How do you rate your SQL skills? Have you worked on SQL query tuning? Yes/No
6. Which domain(s) have you worked in? Note down the domain.
7. Have you created end-to-end ETL data pipelines? How many years of experience?
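
For illustration, the Airflow question above refers to writing DAGs along the lines of the minimal sketch below; the DAG id, schedule, and task body are hypothetical examples, not part of the role:

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_and_clean():
    # Placeholder for a Python data-cleanup step (hypothetical task body).
    print("extracting and cleaning source data")

with DAG(
    dag_id="daily_etl_example",        # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(
        task_id="extract_and_clean",
        python_callable=extract_and_clean,
    )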
Job Description
Minimum 12 months of core experience in Airflow (writing Directed Acyclic Graphs, or DAGs) - Good To Have
Minimum 12 months of core experience in BigQuery, Composer, and development of pipelines (batch or streaming; see the sketch after this list) - MUST
Minimum 6 months of experience in Python - MUST
Very strong in SQL (at least 4 years of experience writing SQL queries) - MUST
Strong communication skills - MUST
Knowledge in Pub/Sub would be an advantage - Good To Have
Knowledge of the telecom domain would be an advantage - Good To Have
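
As a rough illustration of the BigQuery batch pipeline requirement above, a minimal batch query step using the google-cloud-bigquery Python client might look like the sketch below; the project, dataset, and query are assumptions made for the example:

from google.cloud import bigquery

client = bigquery.Client(project="my-project")   # hypothetical project ID

query = """
    SELECT customer_id, COUNT(*) AS order_count
    FROM `my-project.sales.orders`               -- hypothetical table
    GROUP BY customer_id
"""
# Run with batch priority so the job queues rather than consuming
# interactive slots.
job_config = bigquery.QueryJobConfig(priority=bigquery.QueryPriority.BATCH)
job = client.query(query, job_config=job_config)
for row in job.result():    # result() blocks until the job finishes
    print(row.customer_id, row.order_count)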
- Have experience with Data Fusion or equivalent, BigQuery or equivalent, SQL Server, and Java/Python scripting that works well with GCP products and their respective practices. Python experience is a plus.
- Develop data ETL pipelines that meet both functional and non-functional requirements, including performance, scalability, availability, reliability, and security.
- Have experience writing code in Java to work on data extracts that require cleanup.
- Have a working knowledge of XML, JSON, and other data streaming formats and related technologies in a Java/Python environment (a Python cleanup sketch follows this list).
- Have strong written and verbal communication skills.
- Be able to multi-task across the various streams of the entire data process.
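
As a small illustration of the JSON cleanup work mentioned above, in Python; the field handling, file name, and cleanup rules are assumptions for the example:

import json

def clean_record(raw: dict) -> dict:
    # Normalize one record: trim string fields and drop empty values.
    return {
        key: value.strip() if isinstance(value, str) else value
        for key, value in raw.items()
        if value not in (None, "")
    }

with open("extract.json") as f:    # hypothetical extract file
    records = [clean_record(r) for r in json.load(f)]

print(f"cleaned {len(records)} records")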