Big Data Hadoop Engineer CR194
Location: Pleasanton, CA
Posted On: 01/12/2021
Requirement Code: 42990
Title: Big Data Hadoop Engineer
Location: Pleasanton, CA
Duration: 1-year contract, extendable / full time on Saicon payroll
TECHNICAL KNOWLEDGE AND SKILLS:
- 4+ years of hands-on development, deployment, and production support experience in a Big Data environment.
- 4-5 years of programming experience in Java, Scala, and Python.
- Proficient in SQL, relational database design, and methods for data retrieval.
- Knowledge of NoSQL systems such as HBase or Cassandra.
- Hands-on experience with Cloudera Distribution 6.x.
- Hands-on experience creating and indexing Solr collections in a SolrCloud environment.
- Hands-on experience building data pipelines using Hadoop components: Sqoop, Hive, Solr, MapReduce, Impala, Spark, and Spark SQL.
- Must have experience developing HiveQL and UDFs for analyzing semi-structured and structured datasets.
- Must have experience with the Spring Framework, web services, REST APIs, and microservices.
- Hands-on experience ingesting and processing various file formats such as Avro, Parquet, SequenceFile, and plain text.
- Must have working experience with data warehousing and Business Intelligence systems.
- Expertise in Unix/Linux environments, including writing scripts and scheduling/executing jobs.
- Successful track record of building automation scripts/code using Java, Bash, Python, etc., and experience with the production support issue-resolution process.
- Experience building ML models using MLlib or other ML tools.
- Hands-on experience with real-time analytics tools such as Spark, Kafka, or Storm.
- Experience with graph databases such as Neo4j, TigerGraph, or OrientDB.
- Familiarity with Agile development methodologies.