Senior Hadoop Developer

ITC Infotech - Dearborn, MI


Job Summary

Essential Job Functions:

1. Design and develop data ingestion pipelines.

2. Perform data migration and conversion activities.

3. Develop and integrate software applications using suitable development methodologies and standards, applying standard architectural patterns and taking into account critical performance characteristics and security measures.

4. Collaborate with Business Analysts, Architects and Senior Developers to establish the physical application framework (e.g. libraries, modules, execution environments).

5. Perform end-to-end automation of ETL processes for the various datasets being ingested into the big data platform.
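
For illustration only: the ingestion and ETL-automation work described above typically maps to Spark jobs written in Scala on a Cloudera (CDH) cluster. The sketch below assumes a hypothetical CSV landing zone on HDFS and a hypothetical Hive target table named analytics.orders; every path, table, and column name is made up for the example.

import org.apache.spark.sql.{SaveMode, SparkSession}

object IngestOrdersJob {
  def main(args: Array[String]): Unit = {
    // Hive support lets the job write managed tables that Hive/Impala can query.
    val spark = SparkSession.builder()
      .appName("ingest-orders")
      .enableHiveSupport()
      .getOrCreate()

    // Read a raw landing-zone extract from HDFS (path and schema are illustrative).
    val raw = spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("hdfs:///data/landing/orders/")

    // Light conversion step: drop records missing the key and de-duplicate.
    val cleaned = raw
      .filter(raw("order_id").isNotNull)
      .dropDuplicates("order_id")

    // Persist to a partitioned Hive table for downstream Hive/Impala queries.
    cleaned.write
      .mode(SaveMode.Overwrite)
      .partitionBy("order_date")
      .saveAsTable("analytics.orders")

    spark.stop()
  }
}

In practice a job like this would be packaged with sbt or Maven, submitted via spark-submit, and scheduled end to end with a tool such as Oozie, which matches the Hadoop, Spark, Scala, and Oozie skills listed under Required below.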

Required:

1. Java
2. J2EE, web applications, Tomcat (or an equivalent app server), RESTful services, JSON
3. Spring, Spring Boot, Struts, design patterns
4. Hadoop (preferably Cloudera CDH), HDFS, Hive, Impala, Spark, Oozie, HBase
5. Scala
6. SQL
7. Linux

Good to Have:

8. Google Analytics, Adobe Analytics
9. Python, Perl
10. Flume, Solr
11. Strong database design skills
12. ETL tools
13. NoSQL databases (MongoDB, Couchbase, Cassandra)
14. JavaScript UI frameworks (Angular, Node.js, Bootstrap)

Required Experience, Skills and Qualifications

Minimum Qualifications and Job Requirements:

Must have a Bachelor’s degree in Computer Science or a related IT discipline.

  • Must have 9+ years of IT development experience.
  • Must have strong, hands-on J2EE development experience.
  • Must have in-depth knowledge of Scala and Spark programming.
  • Must have 3+ years of relevant professional experience working with Hadoop (HBase, Hive, MapReduce, Sqoop, Flume) and Java, JavaScript, .NET, SQL, Perl, Python, or an equivalent scripting language.
  • Must have experience with ETL tools
  • Must have experience integrating web services
  • Knowledge of standard software development methodologies such as Agile and Waterfall
  • Strong communication skills.
  • Must be willing to flex work hours as needed to support application launches and manage production outages.

Specific Knowledge, Skills and Abilities:

  • Ability to multitask across numerous projects and responsibilities
  • Experience working with JIRA and wiki tools
  • Must have experience working in a fast-paced dynamic environment.
  • Must have strong analytical and problem solving skills.
  • Must have excellent verbal and written communication skills
  • Must be able and willing to participate as an individual contributor as needed.
  • Must be able to deliver projects on time and meet deadlines.

Thanks and Regards,

Dilipkumar TM

ITC Infotech (USA) Inc.,

Phone: 646-569-9010 (Ext. 1994)

Job Type: Contract

Education:

  • Bachelor's