We are seeking a highly skilled and motivated Streaming Platform Engineer to join the Data Streaming Platform team. This is a unique hybrid role that combines the disciplines of platform, software, and data engineering to build, scale, and maintain our high-performance, real-time data streaming platform. The ideal candidate has a passion for architecting robust, scalable systems that enable data-driven products and services at massive scale.
Your Mission
- Design, build, and maintain the core infrastructure for our real-time data streaming platform, ensuring high availability, reliability, and low latency.
- Implement and optimize data pipelines and stream processing applications using technologies like Apache Kafka, Apache Flink, and Spark Streaming.
- Collaborate with software and data engineering teams to define event schemas, ensure data quality, and support the integration of new services into the streaming ecosystem.
- Develop and maintain automation and tooling for platform provisioning, configuration management, and CI/CD pipelines.
- Champion the development of self-service tools and workflows that empower engineers to manage their own streaming data needs, reducing friction and accelerating development.
- Monitor platform performance, troubleshoot issues, and implement observability solutions (metrics, logging, tracing) to ensure the platform's health and stability.
- Stay up-to-date with the latest advancements in streaming and distributed systems technologies and propose innovative solutions to technical challenges.
Your Story
- Streaming Platforms & Architecture: Strong production experience with Apache Kafka and its ecosystem (e.g., Confluent Cloud, Kafka Streams, Kafka Connect). Solid understanding of distributed systems and event-driven architectures, and of how they underpin modern microservices and data pipelines.
- Real-Time Data Pipelines: Experience building and optimizing real-time data pipelines for ML, analytics, and reporting, using technologies such as Apache Flink and Spark Structured Streaming, and integrating with low-latency OLAP systems like Apache Pinot.
- Platform Infrastructure & Observability: Hands-on experience with a major cloud platform (AWS, GCP, or Azure), Kubernetes, and Docker, coupled with proficiency in Infrastructure as Code (Terraform). Experience integrating and managing CI/CD pipelines (GitHub Actions) and implementing comprehensive observability solutions (New Relic, Prometheus, Grafana) for production environments.
- Programming Languages: Proficiency in at least one of the following: Python, TypeScript, Java, Scala, or Go.
- Data Technologies: Familiarity with data platform concepts, including data lakes and data warehouses.
Meet The Team
You will be part of a talented and diverse team of data engineers, data scientists, and product managers focused on revolutionizing the use of stream processing across the organization. We are building innovative data solutions to optimize internal processes, enhance customer experiences, and drive business growth.
What We Offer
On is a place centered on growth and progress. We offer an environment designed to give people the tools to develop holistically: to stay active, to learn, to explore, and to innovate. Our distinctive approach combines a supportive, team-oriented atmosphere with access to personal self-care for both physical and mental well-being, so each person is led by purpose. On is an Equal Opportunity Employer. We are committed to creating a work environment that is fair and inclusive, where all decisions related to recruitment, advancement, and retention are free of discrimination.