This position is for an experienced architect and developer to become the chief Hadoop architect in our Big Data Global Benchmark Center and part of our R&D Advanced Projects (AP) team.
The candidate will be responsible for implementing Big Data Proof of Concept and Benchmark projects and for serving as a consultant to field projects. The focus will be on the use of Hadoop, with integration and application within and around the Teradata Unified Data Architecture.
The candidate will be exposed to a broad variety of emerging technologies in both consulting and solution development roles. The candidate is expected to be a key contributor, researching new areas of Big Data usage and developing new use cases with Teradata's emerging technologies. The candidate must have experience in a solution architecture and development role with large-scale database technologies, including but not limited to Teradata, Oracle, and Netezza, as well as with at least one major Hadoop distribution such as Cloudera or Hortonworks.
• Consult on installing, tuning, and configuring Hadoop products in the Big Data GBC Lab
• Contribute to detailed project plans and lead technical project scoping and planning
• Design and develop automated test cases that verify solution feasibility and interoperability, including performance assessments
• Research new technologies and startups in the Big Data space
• Consult on all GBC-hosted Hadoop POCs
• Design and develop POC solutions
• Perform Big Data benchmarking and performance tuning
• Develop benchmark and verification criteria and perform statistical analysis
• Mentor and lead junior Hadoop engineers in the GBC
• Apply judgment based on analysis of quantitative data and information; execute and document Verifiable Objective Evidence (VOE) to support all results and analysis
• Design and implement new use cases and test cases with emerging technologies
• Work across all areas, from data conditioning to advanced application development
• Contribute to the development of new algorithms and techniques
• Apply knowledge of emerging technologies to define new solutions
• Consult with and advise Teradata Big Data Professional Services associates and the COE
A U.S. Master's or Bachelor's degree in Computer Science or a related discipline, plus the following experience:
• A minimum of 5 years of extensive Java and JavaScript development
• A minimum of 3 years of extensive C and C++ development
• A minimum of 3 years' experience with Perl and/or Python
• A minimum of 5 years' experience with SQL and two major RDBMSs
• A minimum of 3 years' experience with a major Hadoop distribution and Hive
• A minimum of 2 years' experience with Pig, HBase, Flume, or Sqoop
• A minimum of 2 years' experience with MapReduce solution design and development
• A minimum of 5 years' experience with data-related performance analysis and tuning
• A minimum of 5 years' development experience on Linux
• Experience with Cloudera 4.5 and 5.0, or Hortonworks 1.3.2 and 2.0
• Experience as an Enterprise Architect
• Knowledge of system-wide bottleneck analysis, including network analysis and performance tuning
• Experience with two different ETL solutions on Hadoop
• Experience with Apache Ambari and/or Cloudera Manager
• Experience with MicroStrategy and at least one other Business Intelligence tool
• Experience with Tableau
• Experience as a technical project lead