Big Data / Data Engineer

Duration: Contract or Full-time

Job Location: Bay Area, CA

Job Description:

  • Design, implement, and deploy high-performance distributed applications at scale on Hadoop
  • Hands-on experience with Big Data systems, building ETL pipelines, data processing, and analytics tools
  • Define common business and development processes, platform and tools usage for data acquisition, storage, transformation, and analysis
  • Understanding of data structures and common methods in data transformation
  • Distributed computing experience using tools such as Hadoop and Spark is a must
  • Strong proficiency in query languages such as SQL, Hive, and Spark SQL
  • Mentor junior data engineers
  • Perform profiling and troubleshooting of existing solutions
  • Create technical documentation

Skills Required:

  • 8+ years of overall experience and 3-4 years with the Hadoop ecosystem and Big Data technologies
  • Ability to adapt quickly to new big-data frameworks and tools
  • Hands-on experience with the Hadoop ecosystem (HDFS, MapReduce, HBase, Hive, Impala, Spark, Kafka, Kudu, Solr)
  • Experience building stream-processing systems using solutions such as Spark Streaming
  • Knowledge of design strategies for developing scalable, resilient, always-on data lakes
  • Experience building a data management platform or framework is a plus
  • Knowledge of Agile (Scrum) development methodology is a plus
  • Strong development/automation skills
  • Must be comfortable reading and writing Scala, Python, or Java code
  • Excellent interpersonal and teamwork skills
  • Can-do attitude on problem solving, quality and ability to execute

DevOps Engineer

Duration: Contract or Full-time

Job Location: Bay Area, CA

Skills Required:

  • Experience with AWS/Azure
  • Experience in automation and scripting using Chef
  • Experience with Terraform is a must
  • Experience working with distributed systems
  • Experience with Linux/Unix operating systems
  • Experience in one or more scripting languages (e.g., Bash, Perl, Python)
  • Experience with Jenkins (including plugins) or other build-automation tools
  • Experience with Artifactory, Nexus, or another artifact storage tool
  • Experience with Maven and/or other build tools
  • Applied experience with modern application technologies and design patterns, including cloud infrastructure, distributed computing, horizontal scaling, and database technologies
  • Proven success working with and optimizing applications for large scale enterprise performance
  • Experience tuning the performance of Java applications in virtual environments
  • Embraces technology trends that drive intelligent automation