This position is posted by Jobgether on behalf of a partner company. We are currently looking for a Technical Product Owner in Europe.
This role offers the opportunity to lead the vision, strategy, and delivery of a high-impact Kafka-based data engineering platform. As a Technical Product Owner, you will act as the bridge between business stakeholders and engineering teams, translating complex requirements into scalable, compliant, and reliable event-driven solutions. You will define the roadmap, prioritize initiatives, and ensure delivery of measurable business value while collaborating with cross-functional teams in a fast-paced, fully remote environment.
Accountabilities
In this position, your main responsibilities will include:
- Defining and owning the product vision and roadmap for Kafka/data engineering capabilities, translating business needs into prioritized backlogs with clear acceptance criteria and KPIs.
- Collaborating with Data, Analytics, Marketing, and Product teams to define event models, SLAs, and integration requirements, and to ensure stakeholder alignment on priorities and trade‑offs.
- Shaping the architecture and evolution of Kafka-based pipelines (topics, partitions, retention, compaction, Connect/Debezium, Streams/ksqlDB), partnering with engineers to ensure scalable, secure, and cost‑efficient solutions (a brief configuration sketch follows this list).
- Driving schema governance (Avro/Protobuf), data quality enforcement, and compliance with GDPR/CCPA, PII handling, and other regulatory requirements (see the schema‑governance sketch after this list).
- Managing backlog grooming, sprint planning, and delivery tracking to meet throughput, latency, and consumer lag targets.
- Continuously evaluating improvements for reliability, latency reduction, and cost‑efficiency, while exploring new tools and best practices in data streaming and event‑driven systems.
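To ground the pipeline and delivery-tracking points above, here is a minimal Python sketch using the open-source confluent-kafka client. It is an illustration only: the broker address, topic name, consumer group, and all partition/retention values are hypothetical placeholders, not the partner company's actual configuration.

    # A minimal sketch (hypothetical names and values) of two routine tasks on
    # a Kafka platform team: provisioning a topic with explicit retention and
    # compaction settings, and spot-checking consumer lag against an SLA.
    # Requires: pip install confluent-kafka
    from confluent_kafka import Consumer, TopicPartition
    from confluent_kafka.admin import AdminClient, NewTopic

    BOOTSTRAP = "localhost:9092"  # placeholder broker address

    admin = AdminClient({"bootstrap.servers": BOOTSTRAP})

    # Provision an event topic: the partition count sets the parallelism
    # ceiling; retention and compaction control storage cost and replayability.
    topic = NewTopic(
        "orders.events.v1",  # hypothetical topic name
        num_partitions=12,
        replication_factor=3,
        config={
            "cleanup.policy": "compact,delete",  # keep latest value per key
            "retention.ms": str(7 * 24 * 60 * 60 * 1000),  # 7-day horizon
            "min.compaction.lag.ms": "3600000",  # leave the last hour intact
        },
    )
    for future in admin.create_topics([topic]).values():
        future.result()  # raises if creation failed

    # Spot-check consumer lag per partition: lag = end offset - committed.
    consumer = Consumer({
        "bootstrap.servers": BOOTSTRAP,
        "group.id": "orders-enricher",  # hypothetical consumer group
        "enable.auto.commit": False,
    })
    for partition in range(12):
        tp = TopicPartition("orders.events.v1", partition)
        _, end_offset = consumer.get_watermark_offsets(tp, timeout=10)
        committed = consumer.committed([tp], timeout=10)[0].offset
        # A negative committed offset means the group has no commits yet,
        # so the whole partition counts as lag.
        lag = end_offset - committed if committed >= 0 else end_offset
        print(f"partition {partition}: lag={lag}")
    consumer.close()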
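The schema-governance accountability is typically enforced through a schema registry. The sketch below, again with a hypothetical subject name and record definition, registers an Avro schema with Confluent Schema Registry and pins the subject to BACKWARD compatibility so that breaking producer changes are rejected at registration time.

    # Hypothetical sketch of schema governance: register an Avro schema for a
    # topic's value subject and set a compatibility policy on that subject.
    from confluent_kafka.schema_registry import Schema, SchemaRegistryClient

    client = SchemaRegistryClient({"url": "http://localhost:8081"})  # placeholder URL

    avro_schema = Schema(
        """
        {
          "type": "record",
          "name": "OrderEvent",
          "fields": [
            {"name": "order_id", "type": "string"},
            {"name": "amount_cents", "type": "long"},
            {"name": "currency", "type": "string", "default": "EUR"}
          ]
        }
        """,
        schema_type="AVRO",
    )

    subject = "orders.events.v1-value"  # hypothetical subject name
    schema_id = client.register_schema(subject, avro_schema)
    # BACKWARD compatibility: deploy upgraded consumers before producers.
    client.set_compatibility(subject, "BACKWARD")
    print(f"registered schema id {schema_id} for {subject}")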
Requirements
To be successful in this role, you should bring:
- 4+ years of experience as a Product Owner or Technical Product Owner in data engineering, streaming, or related domains.
- Proven ability to define and communicate a product vision and roadmap, aligning with business goals and stakeholders.
- Hands‑on experience in backlog management, user story creation, prioritization techniques (MoSCoW, WSJF, RICE), and defining acceptance criteria (a short scoring sketch follows this list).
- Strong experience with Agile/Scrum/Kanban frameworks, backlog grooming, and sprint ceremonies.
- Excellent stakeholder management skills, translating business outcomes into technical user stories, defining KPIs/SLAs, and balancing trade‑offs.
- Technical understanding of event‑driven architectures and Kafka ecosystem (Confluent, Kafka Connect, Schema Registry, Streams/ksqlDB).
- Knowledge of data governance, compliance, and security practices (GDPR/CCPA, PII, encryption, RBAC/ACLs); an illustrative ACL sketch follows this list.
- Familiarity with cloud platforms (AWS/Azure/GCP), containerization (Docker, Kubernetes), and Infrastructure as Code (Terraform/Helm).
- Strong communication skills (English C1 or higher) and the ability to present technical concepts to business audiences.
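Since the posting names specific prioritization techniques, their arithmetic is simple enough to sketch. The formulas below are the standard WSJF and RICE definitions; the backlog items and scores are invented examples, not a prescribed scale.

    # Illustrative-only scoring for two prioritization techniques named above.
    # All backlog items and scores are invented examples.

    def wsjf(business_value: float, time_criticality: float,
             risk_reduction: float, job_size: float) -> float:
        """Weighted Shortest Job First: cost of delay divided by job size."""
        cost_of_delay = business_value + time_criticality + risk_reduction
        return cost_of_delay / job_size

    def rice(reach: float, impact: float, confidence: float, effort: float) -> float:
        """RICE: (reach * impact * confidence) / effort."""
        return (reach * impact * confidence) / effort

    backlog = {
        "Debezium CDC connector for billing DB": wsjf(8, 7, 5, 5),
        "Topic-level PII encryption": wsjf(6, 9, 8, 8),
        "Consumer lag alerting": wsjf(5, 6, 7, 2),
    }
    for item, score in sorted(backlog.items(), key=lambda kv: kv[1], reverse=True):
        print(f"{score:5.2f}  {item}")
    print(f"RICE example: {rice(reach=4000, impact=2, confidence=0.8, effort=3):.0f}")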
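On the RBAC/ACLs point, Kafka's native authorization model is ACL-based. The sketch below grants a single service account read access to one topic, using the same confluent-kafka admin client; the principal, host, and topic name are invented examples.

    # Hypothetical sketch of least-privilege topic access with Kafka ACLs:
    # allow one named service account to read a single topic, nothing more.
    from confluent_kafka.admin import (
        AclBinding, AclOperation, AclPermissionType, AdminClient,
        ResourcePatternType, ResourceType,
    )

    admin = AdminClient({"bootstrap.servers": "localhost:9092"})  # placeholder

    read_acl = AclBinding(
        ResourceType.TOPIC,
        "orders.events.v1",      # hypothetical topic
        ResourcePatternType.LITERAL,
        "User:orders-enricher",  # hypothetical service principal
        "*",                     # any host
        AclOperation.READ,
        AclPermissionType.ALLOW,
    )
    for future in admin.create_acls([read_acl]).values():
        future.result()  # raises if the broker rejected the ACL
    # Note: a real consumer would also need READ on its consumer group.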
Preferred Qualifications
- Experience with internal developer platforms or streaming data service ownership.
- Familiarity with observability and reliability tools (Prometheus, Grafana, Datadog) and incident response processes.
- Exposure to Customer Data Platforms (e.g., mParticle) or tag management systems (e.g., Tealium).
- Background in software development or data engineering (Python, Kotlin, Spark/Flink) is a strong plus.
Benefits
- 100% remote work across Europe with flexible hours.
- Fast‑paced and innovative environment supporting continuous learning and growth.
- Career progression paths with opportunities for promotion and advancement.
- Competitive compensation package aligned with experience.
- International exposure, working with teams across Europe and globally.
- Collaborative and supportive work culture, with organized team events.
- Access to modern tools, equipment, and resources to support productivity.
Thank you for your interest!