Design technical architecture and develop Big Data workflows using Hadoop MapReduce, Pig, Hive, Sqoop, and Flume. Work with business users, the Oracle database team, and system administrators to design data conversion to HBase. Extract data from MySQL databases into HDFS, Hive, and HBase using Sqoop. Orchestrate hundreds of Hive queries using Oozie workflows and Hadoop data serialization formats such as Avro. Create and test several Java classes in JUnit 4 to validate MapReduce programs, Hive scripts, and Pig Latin scripts. Create Pig Latin scripts to clean up both structured and unstructured data for further analysis in Hive. Will work in Manchester, CT and/or at various client sites throughout the U.S. Must be willing to travel and/or relocate.
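As a minimal sketch of the Sqoop-based extraction described above: the command below imports a MySQL table into HDFS and Hive. The hostname (`dbhost`), database (`sales`), table (`orders`), user (`etl`), and target path are hypothetical placeholders, not details from this posting; the command assumes a configured Hadoop cluster with Sqoop and the MySQL JDBC driver on the classpath, so it is illustrative rather than directly runnable here.

```shell
# Hypothetical Sqoop import: MySQL table -> HDFS files + Hive table.
# All connection details below are placeholders for illustration only.
sqoop import \
  --connect jdbc:mysql://dbhost/sales \
  --username etl -P \
  --table orders \
  --target-dir /data/raw/orders \
  --num-mappers 4 \
  --hive-import \
  --hive-table orders
```

A similar invocation with `--hbase-table`, `--column-family`, and `--hbase-row-key` in place of the Hive options would load the same source table into HBase instead.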