  • Location: New York
  • Last Published: Nov. 8, 2024
  • Sector: AI/ML
  • Function: Data Science

Thread AI

Thread AI is building an AI-native workflow orchestration engine and is looking for dedicated individuals to join its growing team. Our goal is to make infrastructure simple for enterprises and public sector agencies seeking to get the most from AI.

Headquartered in New York, our growing team is a group of AI, product, and engineering experts with a track record of creating and executing against complex workflows and infrastructure.

Engineering Culture

We're a small and committed technical team that oversees engineering, research, design, product, and operations. We believe that a small dedicated team with a flat structure and collaborative culture can move faster and build better products than large hierarchical organizations.

About the Role

We are looking for a skilled Applied Data Scientist with a strong foundation in data engineering principles to play a key role in developing and implementing data-driven solutions. A successful candidate will be responsible for designing, building, and maintaining robust data pipelines, optimizing data workflows, and ensuring the reliability and scalability of our data infrastructure. This role will require a combination of technical expertise in data engineering, proficiency in programming languages and tools, and the ability to collaborate effectively with cross-functional teams.

What We're Looking For

  • 5+ years of experience in building, evaluating, and deploying machine learning and artificial intelligence models into production environments
  • Proven experience in data engineering, ETL development, and building data pipelines in production environments
  • Proficiency in programming languages such as Python and experience with data processing frameworks like Apache Spark or Flink
  • Strong understanding of database systems, data warehousing concepts, and cloud platforms such as AWS, GCP, or Azure
  • Skilled in SQL, with extensive experience extracting large datasets and designing ETL workflows
  • Familiarity with containerization technologies (e.g., Docker, Kubernetes) and workflow orchestration tools (e.g., Airflow, Luigi)
  • Comfortable working with a wide variety of data including unstructured data and high-volume real-time sensor data
  • Excellent problem-solving skills, attention to detail, and ability to work effectively in a fast-paced, collaborative environment
  • Strong communication and interpersonal skills, with the ability to explain complex technical concepts to non-technical stakeholders

We’re committed to creating a space where our employees can bring their full selves to work and have an equal opportunity to succeed. So regardless of race, gender identity or expression, sexual orientation, religion, origin, ability, or age, we encourage you to apply!

Don’t meet every single requirement? If you’re excited about this role and the company but your past experience doesn’t align with every qualification, we encourage you to apply anyway - we'd love to get to know you and see if there's a place for you at Thread!

Compensation

$160,000 - $200,000 + Equity