Specialist Software Engineer (Kafka)

buscojobs España

Cádiz

Hybrid

EUR 35,000 - 55,000

Full-time

Posted 3 days ago

Vacancy description

A forward-thinking technology consultancy is looking for a Software Engineer to build modern data pipelines. The successful candidate will work with Kafka and Snowflake to create real-time data flows, driving powerful analytics. The role offers competitive compensation, permanent employment, health benefits, and a flexible working environment.

Benefits

Competitive compensation
Health insurance
Mental health support
Ongoing training funding
Relaxed dress code
Diverse career paths
Flexible hybrid work environment

Qualifications

  • Experience in building data pipelines using Kafka technologies.
  • Familiarity with structured data formats like JSON and Avro.
  • Excellent communication skills in English.

Responsibilities

  • Build and maintain data pipelines using Kafka Connect and Streams.
  • Implement real-time transformations and integrate Kafka with databases.
  • Collaborate to turn data requirements into effective solutions.

Skills

C#
Python
Kafka
Data Pipelines
Real-time Data Transformation
JSON
Avro
English Communication

Job description

Marionete is an independently minded, entrepreneurial technology consultancy helping clients exploit tomorrow’s technology to find unexpected solutions to today’s business problems.

We’re looking for a Software Engineer with a passion for, and a strong interest in working with, modern data pipelines. This role is hands-on, focused on building stream transformations and data integrations, not on maintaining infrastructure.

You’ll help deliver business-critical data flows using Kafka Connect, Kafka Streams, and Snowflake — enabling real-time analytics and platform integration across enterprise systems.

What You’ll Do:

  • Build and maintain data pipelines using Kafka Connect, Kafka Streams, and KSQL.
  • Implement real-time transformations, enrichments, and routing logic for streaming data.
  • Integrate Kafka with databases and cloud data platforms like Snowflake.
  • Collaborate with leads and stakeholders to turn data requirements into running code.
  • Ensure reliable data delivery across producers, Kafka topics, and downstream consumers.
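The transformation, enrichment, and routing work described above can be sketched as plain functions, independent of any Kafka client library. A minimal illustration in Python — all record fields, amounts, and topic names here are hypothetical, not part of the role description:

```python
import json

def enrich(record: dict) -> dict:
    """Hypothetical enrichment: derive a euro amount from a cents field."""
    enriched = dict(record)
    enriched["amount_eur"] = round(record["amount_cents"] / 100, 2)
    return enriched

def route(record: dict) -> str:
    """Hypothetical routing logic: choose a downstream topic per record."""
    return "payments.large" if record["amount_eur"] >= 1000 else "payments.small"

def process(raw: bytes) -> tuple:
    """Deserialize one message, enrich it, and return (topic, payload)."""
    record = enrich(json.loads(raw))
    return route(record), json.dumps(record).encode("utf-8")

if __name__ == "__main__":
    topic, payload = process(b'{"id": 1, "amount_cents": 250000}')
    print(topic)  # this record is routed to payments.large
```

In a real pipeline the same logic would sit inside a Kafka Streams topology or a consumer/producer loop; keeping it as pure functions makes it unit-testable without a broker.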

Must-Have Requirements:

  • Experience as a software engineer, working with C# or Python.
  • Experience producing to and consuming from Kafka topics.
  • Comfortable using Kafka Connect.
  • Exposure to Kafka Streams and KSQL for real-time data transformation.
  • Familiarity with working on data pipelines, databases, and structured data formats (e.g., JSON, Avro).
  • Communicates well in English and works effectively in team-based environments.
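On the structured-formats point: JSON is self-describing text, while Avro is a compact binary format that requires a writer schema (via third-party libraries such as fastavro, or Confluent's serializers backed by a Schema Registry). A stdlib-only sketch of the JSON side, with a hypothetical record:

```python
import json

# A record as it might flow through a pipeline; field names are hypothetical.
record = {"user_id": 42, "event": "login", "ts": "2024-01-01T00:00:00Z"}

# JSON round-trip: the payload carries its own field names, so no external
# schema is needed to read it back.
payload = json.dumps(record).encode("utf-8")
assert json.loads(payload) == record

# Avro would serialize the same record to fewer bytes, but only against a
# schema that either travels with the data or lives in a Schema Registry.
print(len(payload), "bytes as JSON")
```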

Nice-to-Haves:

  • Experience integrating with Snowflake or other cloud data warehouses.
  • Familiarity with data schemas and Schema Registry.
  • Exposure to CI/CD practices or working with DevOps teams.
  • General knowledge of cloud platforms (Azure, AWS, or GCP).
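For the Snowflake integration mentioned above, Kafka Connect sink connectors are configured declaratively. A sketch of such a configuration, expressed as a Python dict — the property names follow the Snowflake sink connector's published configuration, but every value here is a placeholder, and the exact required properties (e.g. key-pair authentication) depend on the connector version:

```python
import json

# Hypothetical Kafka Connect sink configuration for Snowflake; all values
# (names, account, topic, database) are placeholders, not real settings.
connector_config = {
    "name": "snowflake-sink",
    "config": {
        "connector.class": "com.snowflake.kafka.connector.SnowflakeSinkConnector",
        "tasks.max": "2",
        "topics": "payments.enriched",
        "snowflake.url.name": "myaccount.snowflakecomputing.com",
        "snowflake.user.name": "PIPELINE_USER",
        "snowflake.database.name": "ANALYTICS",
        "snowflake.schema.name": "RAW",
        "key.converter": "org.apache.kafka.connect.storage.StringConverter",
        "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    },
}

# Kafka Connect accepts this as JSON, typically POSTed to its REST API.
print(json.dumps(connector_config, indent=2))
```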

Benefits include competitive compensation, a permanent position, health insurance, mental health support, ongoing training funding, a relaxed dress code, diverse career paths, and a flexible hybrid work environment.

Interview process: 3 interviews.

Application: Marionete provides equal employment opportunities and complies with GDPR. By applying, you consent to data privacy policies.
