Job Opening: Hadoop Developer

About the job

Job description for a Hadoop Developer role based in Sydney:

Implementation experience on a big data platform, preferably Cloudera Hadoop
Minimum 2 years of development experience with Hadoop ecosystem tools and utilities: MapReduce, PySpark, Kafka, Sqoop, Impala, Hive, etc.
Minimum 2 years of development experience with traditional Enterprise Data Warehouse (EDW) and ETL design and development using Informatica or Talend
Ability to work independently and contribute to overall architecture and design
Experience writing shell scripts on the Linux platform
Knowledge of API management concepts and design
Experience with CI/CD deployment automation on big data platforms (Preferred)
Experience with metadata management tools such as Informatica Metadata Manager or Collibra (Preferred).

Let me know if you have any questions.