Hadoop / Big Data developer - REMOTE
• Minimum of 2 years of IT development experience on a Big Data platform is a must.
• Proficiency in at least one of the following languages: Java, Python, Scala.
• Experience with HBase, Hive, MapReduce, Sqoop, ETL, Kafka, MongoDB, MySQL, visualization technologies, etc.
• Flair for data, schemas, and data modeling, and for bringing efficiency to the big-data life cycle.
• Understanding of automated QA needs related to big data.
• Proficiency with agile or lean development practices.
• Experience with ETL & data cleansing/preparation in a Hadoop environment
• Experience with Hadoop tools such as Spark, Pig, Hive, Impala, etc.
• Familiarity with Hadoop distributions such as Cloudera and Hortonworks.
• Strong object-oriented design and analysis skills
• Excellent technical and organizational skills
• Excellent written and verbal communication skills
• Hadoop Cluster setup & administration experience
• Gathering requirements, analyzing the entire system, and providing estimates for development and testing effort.
• Coordinating with the team to assign tasks and monitoring deliverables to meet project timelines.
• Writing UNIX shell scripts to load data from different interfaces into Hadoop.
• Writing scripts to import, export, and update data in an RDBMS.
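To give candidates a feel for the data cleansing/preparation work listed above, here is a minimal Python sketch of a cleansing step before data lands in Hadoop. The column names and rules (trim whitespace, normalize blanks to None, drop rows missing the key) are illustrative assumptions, not part of this role's actual pipeline.

```python
import csv
import io

def clean_rows(raw_rows, key_column):
    """Hypothetical cleansing pass: trim strings, blank -> None, drop keyless rows."""
    cleaned = []
    for row in raw_rows:
        # Normalize every string field; empty after stripping becomes None.
        row = {k: (v.strip() or None) if isinstance(v, str) else v
               for k, v in row.items()}
        if row.get(key_column) is None:
            continue  # unusable without a key; in practice route to a reject file
        cleaned.append(row)
    return cleaned

# Illustrative extract with messy whitespace and a keyless row.
raw = io.StringIO("id,name,city\n1, Alice ,Austin\n,Bob,\n2,Carol, Boston \n")
rows = clean_rows(list(csv.DictReader(raw)), key_column="id")
# rows -> [{'id': '1', 'name': 'Alice', 'city': 'Austin'},
#          {'id': '2', 'name': 'Carol', 'city': 'Boston'}]
```

In a real Hadoop pipeline the same logic would typically run in Spark or a Hive transform rather than plain Python, but the cleansing rules carry over unchanged.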
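The "load data into Hadoop" responsibility above is often a thin script around the `hdfs dfs` CLI. The sketch below builds the commands for staging a local extract into a date-partitioned landing directory; the paths and layout are assumptions, and the commands are returned rather than executed so the sketch runs without a cluster (on one, pass each list to `subprocess.run`).

```python
import datetime
import shlex

def hdfs_load_commands(local_file, landing_dir, load_date=None):
    """Build the hdfs CLI calls to stage a file into a date-partitioned dir."""
    load_date = load_date or datetime.date.today().isoformat()
    # Partition the landing directory by load date so reruns do not collide.
    target = f"{landing_dir}/load_date={load_date}"
    return [
        ["hdfs", "dfs", "-mkdir", "-p", target],
        ["hdfs", "dfs", "-put", "-f", local_file, f"{target}/"],
    ]

for cmd in hdfs_load_commands("/data/exports/orders.csv", "/landing/sales",
                              load_date="2024-01-15"):
    print(shlex.join(cmd))
# prints:
# hdfs dfs -mkdir -p /landing/sales/load_date=2024-01-15
# hdfs dfs -put -f /data/exports/orders.csv /landing/sales/load_date=2024-01-15/
```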
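The RDBMS import/export responsibility above is commonly handled with Sqoop. Below is a hedged sketch that assembles a `sqoop import` invocation; the JDBC URL, table, and target directory are placeholders, and the command is built as a list so it could be handed to `subprocess.run` on a cluster where Sqoop is installed.

```python
def sqoop_import_command(jdbc_url, table, target_dir, num_mappers=4):
    """Assemble a sqoop import invocation (standard Sqoop flags)."""
    return [
        "sqoop", "import",
        "--connect", jdbc_url,        # JDBC connection string (placeholder)
        "--table", table,             # source RDBMS table
        "--target-dir", target_dir,   # HDFS destination directory
        "--num-mappers", str(num_mappers),  # parallelism of the import
    ]

cmd = sqoop_import_command("jdbc:mysql://db.example.com/sales", "orders",
                           "/landing/sales/orders")
print(" ".join(cmd))
```

Exports back to the RDBMS follow the same shape with `sqoop export` and an `--export-dir` flag in place of `--target-dir`.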