
Real-time Data Streaming Engineer (Kafka, Flink, Spark Streaming)

ProTalent Recruiters

Alcoy

Remote

EUR 60,000 - 80,000

Full-time

Today

Vacancy description

A data recruitment agency is seeking a Real-Time Data Streaming Engineer with expertise in Kafka, Flink, and Spark Streaming. The role requires 5+ years of data engineering experience with a focus on real-time streaming. Responsibilities include designing streaming pipelines and ensuring data quality. This is a remote position within the EU offering a full-time contract.

Qualifications

  • 5+ years of experience in data engineering, with 3+ years focused on real-time streaming.
  • Strong expertise in Apache Kafka.
  • Hands-on experience with Apache Flink or Spark Streaming.

Responsibilities

  • Design and maintain real-time data streaming pipelines.
  • Architect event-driven solutions for real-time analytics.
  • Optimize performance and scalability of streaming jobs.

Skills

Real-time data processing
Apache Kafka
Apache Flink
Spark Streaming
Java
Scala
Python
SQL
Distributed computing
Cloud solutions

Tools

Docker
Kubernetes
Prometheus
Grafana
ELK

Job description

We are seeking a highly skilled Real-Time Data Streaming Engineer with strong expertise in designing and building low-latency, event-driven data pipelines. You will be responsible for architecting, implementing, and optimizing streaming platforms using Apache Kafka, Apache Flink, and Spark Streaming. The ideal candidate has hands-on experience with large-scale, high-throughput systems, ensuring data reliability, scalability, and performance for enterprise-grade solutions.
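For orientation only, below is a minimal sketch of the kind of Kafka-to-Flink pipeline this role describes, written against Flink's DataStream API and Kafka connector. The broker address, topic name, consumer group, and the trivial map step are hypothetical placeholders for illustration, not details taken from this vacancy.

    import org.apache.flink.api.common.eventtime.WatermarkStrategy;
    import org.apache.flink.api.common.serialization.SimpleStringSchema;
    import org.apache.flink.api.common.typeinfo.Types;
    import org.apache.flink.connector.kafka.source.KafkaSource;
    import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

    public class EventsPipelineSketch {
        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

            // Placeholder broker address, topic, and consumer group used purely for illustration.
            KafkaSource<String> source = KafkaSource.<String>builder()
                    .setBootstrapServers("kafka:9092")
                    .setTopics("events")
                    .setGroupId("events-pipeline-sketch")
                    .setStartingOffsets(OffsetsInitializer.latest())
                    .setValueOnlyDeserializer(new SimpleStringSchema())
                    .build();

            env.fromSource(source, WatermarkStrategy.noWatermarks(), "kafka-events")
               .map(value -> value.toUpperCase())   // stand-in for real transformation/enrichment logic
               .returns(Types.STRING)               // explicit type hint for the lambda
               .print();                            // stand-in for a real sink, e.g. another Kafka topic

            env.execute("events-pipeline-sketch");
        }
    }

In a production pipeline the print sink would typically be replaced by a KafkaSink or a warehouse/lake writer, and watermarks would be configured for event-time processing.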

Details:
Location: Remote in EU
Employment Type: Full-Time, B2B Contract
Start Date: ASAP
Language Requirements: Fluent English

Key Responsibilities
  • Design, develop, and maintain real-time data streaming pipelines using Kafka, Flink, and Spark Streaming.
  • Architect event-driven and microservices-based solutions for real-time analytics and processing.
  • Implement data ingestion, transformation, and enrichment workflows across distributed systems (a minimal sketch follows this list).
  • Optimize performance and scalability of streaming jobs for high-throughput, low-latency environments.
  • Ensure data quality, governance, and fault tolerance within the streaming infrastructure.
  • Integrate streaming solutions with data warehouses, data lakes, and cloud platforms (AWS, Azure, GCP).
  • Collaborate with data engineers, data scientists, and application teams to deliver business-critical real-time insights.
  • Monitor, troubleshoot, and improve the reliability and resilience of streaming systems.
  • Participate in system design, code reviews, and best practices development.
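As referenced in the ingestion and enrichment item above, here is a minimal Kafka Streams sketch of such a workflow. The application id, broker address, and topic names are hypothetical, and the trim step merely stands in for real enrichment logic.

    import java.util.Properties;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;

    public class EnrichmentTopologySketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "event-enrichment-sketch");  // hypothetical app id
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka:9092");            // placeholder broker
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();
            KStream<String, String> raw = builder.stream("raw-events");                 // placeholder input topic
            raw.filter((key, value) -> value != null && !value.isEmpty())               // drop empty records
               .mapValues(value -> value.trim())                                        // stand-in for enrichment
               .to("enriched-events");                                                  // placeholder output topic

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            streams.start();
            Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        }
    }

A real deployment would add schema-registry-backed serdes, error handling, and state stores where the enrichment requires lookups.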

Requirements
  • 5+ years of experience in data engineering, with at least 3 years focused on real-time streaming.
  • Strong expertise in Apache Kafka (producers, consumers, Connect, Streams, schema registry).
  • Hands-on experience with Apache Flink or Spark Streaming for real-time data processing.
  • Solid understanding of event-driven architectures, pub/sub systems, and distributed computing.
  • Strong programming skills in Java, Scala, or Python.
  • Proficiency in SQL and experience with databases (relational and NoSQL: PostgreSQL, Cassandra, MongoDB, etc.).
  • Familiarity with cloud-native streaming solutions (AWS Kinesis, Azure Event Hubs, GCP Pub/Sub).
  • Knowledge of CI/CD, containerization (Docker, Kubernetes), and monitoring tools (Prometheus, Grafana, ELK).
  • Strong problem-solving and debugging skills, with experience in large-scale production environments.

Nice to Have
  • Knowledge of data lakehouse architectures (Delta Lake, Iceberg, Hudi).
  • Experience with machine learning pipelines on streaming data.
  • Familiarity with message brokers (RabbitMQ, Pulsar, ActiveMQ).
  • Background in industries like fintech, telecom, IoT, or e-commerce where real-time data is critical.
  • Contributions to open-source streaming projects.