Software Engineer - Hadoop Platform Team

Location: San Francisco, United States
Posted: 20 March 2021
About the Role

The Hadoop Platform team aims to build the next generation of intelligent data services that will profoundly impact how we march toward the vision of Uber's data-driven, real-time marketplace. Our team builds libraries and distributed services around the Hadoop stack (Hive, Presto, HDFS, Kafka, etc.) to ease user interaction with the Big Data world. We share them with the open-source community (e.g., Hudi and Marmaray).

The Data Ingestion team owns the platform that moves hundreds of TBs of data a day from thousands of data sources into Uber's data lake. The hundreds of PBs of data ingested by the platform serve as the source of truth for business analytics and insights at Uber, powering various data-driven decisions. The platform consists of Marmaray, built from the ground up at Uber, along with various services and tooling to operate the Data Ingestion platform at scale and deliver new data to customers at lower latencies.

We are currently looking for a strong engineer to join the Data Ingestion team to:

- Design and implement distributed solutions to make new data available faster for business analytical needs.
- Drive reliability and operational scalability improvements through design and automation, scaling with the ever-increasing data volume flowing through the systems and the growing number of data sources, along with observability and monitoring.
- Drive efficiency efforts for one of the largest consumers of Uber's Data Infrastructure.

What the Candidate Will Need / Bonus Points

Basic Qualifications

- Bachelor's degree with 2+ years of experience, or Master's degree with 1+ years of experience.
- Strong problem-solving and coding skills in at least one of Java, C++, or Go.
- Experience developing, debugging, and shipping software products on large codebases.

Preferred Qualifications

- Experience designing and executing large-scale distributed applications.
- Dedication to moving fast in the short term while simultaneously building for the long term.
- Understanding of big data infrastructure tools and software such as Spark, HDFS, and YARN.
