Senior Software Engineer, Data Engineering

  • Location: Menlo Park
  • Date Posted: Aug. 10, 2021
  • Function: Software Engineering
  • Sector: Fintech

Join a leading fintech company that’s democratizing finance for all.

Robinhood was founded on a simple idea: that our financial markets should be accessible to all. With customers at the heart of our decisions, Robinhood is lowering barriers, removing fees, and providing greater access to financial information. Together, we are building products and services that help create a financial system everyone can participate in.

Just as we focus on our customers, we also strive to create an inclusive environment where our employees can thrive and do impactful work. We are proud of the world class products and company culture we continue to build and have been recognized as:

  • A Great Place to Work
  • A CNBC Disruptor 50 in 2019 and 2020
  • A LinkedIn Top Startup in 2017, 2018, 2019, and 2020

Robinhood is backed by leading investors including DST Global, Index Ventures, NEA, Ribbit Capital, Thrive Capital, and Sequoia. Check out life at Robinhood on The Muse!

About the role

Robinhood is a metrics-driven company, and data is foundational to all key decisions, from growth strategy to product optimization to our day-to-day operations. We are looking for Senior Data Engineers to build and maintain the foundational datasets that allow us to reliably and efficiently power decision making at Robinhood. These datasets include application events, database snapshots, and the derived datasets that describe and track Robinhood’s key metrics across all products. You’ll partner closely with engineers, data analysts, and data scientists across the company to power analytics, experimentation, and machine learning use cases. Robinhood is a fast-growing company, and this is a unique opportunity to help lay the foundation for reliable, impactful, data-driven decisions across the company for years to come.

Your day-to-day will involve:

  • Helping define, build, and own key datasets, and the quality and evolution of those datasets as use cases grow
  • Building scalable data pipelines (using Spark and Airflow) to move data from different applications into our data warehouse
  • Partnering with upstream engineering teams to enhance data logging patterns
  • Partnering with data consumers across Robinhood to understand consumption patterns and design intuitive data models
  • Ideating on and contributing to shared data engineering tooling and standards
  • Evangelizing data engineering best practices across the organization
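To give candidates a flavor of the pipeline work described above, here is a deliberately minimal, hypothetical sketch of an extract-transform-load step using only the Python standard library. In practice this role would build such pipelines with Spark and Airflow at much larger scale; every name and dataset below is invented purely for illustration.

```python
import csv
import io


def extract(raw_csv: str) -> list:
    """Parse raw CSV text (e.g. exported application events) into rows."""
    return list(csv.DictReader(io.StringIO(raw_csv)))


def transform(rows: list) -> dict:
    """Aggregate event counts per user -- a toy 'derived dataset'."""
    counts = {}
    for row in rows:
        counts[row["user_id"]] = counts.get(row["user_id"], 0) + 1
    return counts


def load(counts: dict) -> list:
    """Emit sorted records, as if writing them to a warehouse table."""
    return sorted(counts.items())


# Hypothetical event export: one header row plus three events.
raw = "user_id,event\nu1,open\nu2,trade\nu1,trade\n"
result = load(transform(extract(raw)))
print(result)  # [('u1', 2), ('u2', 1)]
```

The same extract/transform/load shape recurs in production pipelines; frameworks like Spark and Airflow replace the in-memory steps with distributed computation and scheduled, dependency-aware orchestration.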

Some things we consider critical for this role:

  • Background in CS or another relevant field of study
  • Strong product mindset and 5+ years of experience building high-quality data solutions
  • Strong analytical and problem-solving skills
  • Expertise building data pipelines using open source frameworks (Hadoop, Spark, etc.)
  • Expertise in one or more programming languages (ideally Python)
  • Strong SQL (Presto, Spark SQL, etc.) skills
  • Familiarity with data visualization tools (Looker, Tableau, etc.)
  • Strong communication skills

We’re looking for more growth-minded and collaborative people to be a part of our journey in democratizing finance for all. If you’re ready to give 100% in helping us achieve our mission, we’d love to have you apply even if you feel unsure about whether you meet every single requirement in this posting. At Robinhood, we’re looking for people invigorated by our mission, values, and drive to change the world, not just those who simply check off all the boxes.

Robinhood promotes diversity and provides equal opportunity for all applicants and employees. We are dedicated to building a company that represents a variety of backgrounds, perspectives, and skills. We believe that the more inclusive we are, the better our work (and work environment) will be for everyone. Additionally, Robinhood provides reasonable accommodations for candidates on request and respects applicants' privacy rights. To review Robinhood's Privacy Policy, please click here.