
On is hiring a

Senior Data Streaming Platform Engineer

Remote

Job Description

In short

We are seeking a highly skilled and motivated Senior Data Streaming Platform Engineer to join the Data Streaming Platform team. This is a hybrid role that combines the disciplines of platform, software, and data engineering to build, scale, and maintain our high-performance, real-time data streaming platform. The ideal candidate has a passion for architecting robust, scalable systems that enable data-driven products and services at massive scale.

Your Mission

  • Design, build, and maintain the core infrastructure for our real-time data streaming platform, ensuring high availability, reliability, and low latency.

  • Implement and optimize data pipelines and stream processing applications using technologies like Apache Kafka, Apache Flink, and Spark Streaming.

  • Collaborate with software and data engineering teams to define event schemas, ensure data quality, and support the integration of new services into the streaming ecosystem.

  • Develop and maintain automation and tooling for platform provisioning, configuration management, and CI/CD pipelines.

  • Champion the development of self-service tools and workflows that empower engineers to manage their own streaming data needs, reducing friction and accelerating development.

  • Monitor platform performance, troubleshoot issues, and implement observability solutions (metrics, logging, tracing) to ensure the platform's health and stability.

  • Stay up-to-date with the latest advancements in streaming and distributed systems technologies and propose innovative solutions to technical challenges.


Your story

This is a hybrid role, and we understand that candidates may not have experience with every single technology listed. We encourage you to apply if you have a strong foundation in a majority of these areas.

  • Streaming Platforms & Architecture: Strong production experience with Apache Kafka and its ecosystem (e.g., Confluent Cloud, Kafka Streams, Kafka Connect). Solid understanding of distributed systems and event-driven architectures, and of how they drive modern microservices and data pipelines.

  • Real-Time Data Pipelines: Experience building and optimizing real-time data pipelines for ML, analytics, and reporting, using technologies such as Apache Flink and Spark Structured Streaming and integrating with low-latency OLAP systems like Apache Pinot.

  • Platform Infrastructure & Observability: Hands-on experience with major Cloud Platforms (AWS, GCP, or Azure), Kubernetes and Docker, coupled with proficiency in Infrastructure as Code (Terraform). Experience integrating and managing CI/CD pipelines (GitHub Actions) and implementing comprehensive Observability solutions (New Relic, Prometheus, Grafana) for production environments.

  • Programming Languages: Proficiency in at least one of the following: Python, TypeScript, Java, Scala, or Go.

  • Data Technologies: Familiarity with data platform concepts, including data lakes and data warehouses.

