Data Engineer

Explore Group

León

Remote

EUR 45,000 - 70,000

Full-time

3 days ago

Vacancy overview

A leading technology consultancy is seeking a skilled Data Engineer with strong expertise in Apache Kafka and Snowflake. This permanent role is remote-first, allowing you to work on exciting projects with autonomy while fostering long-term career growth in a collaborative environment.

Qualifications

  • 5+ years of experience as a Data Engineer.
  • Expertise in Kafka Streams, Kafka Connect, or Confluent Kafka.
  • Hands-on experience with Snowflake (schema design, optimisation).

Responsibilities

  • Design and manage real-time data pipelines using Apache Kafka; a short producer sketch follows this list.
  • Develop scalable ETL/ELT processes for ingesting data into Snowflake.
  • Collaborate with cross-functional teams, including analysts and data scientists.
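
The Kafka responsibility above amounts to publishing events onto a topic that downstream jobs consume. A minimal, illustrative sketch using the confluent-kafka Python client follows; the broker address, the "orders" topic and the event fields are placeholders, not details taken from the role.

    # Illustrative only: produce JSON events to a hypothetical "orders" topic.
    import json
    from confluent_kafka import Producer

    producer = Producer({"bootstrap.servers": "localhost:9092"})  # placeholder broker

    def delivery_report(err, msg):
        # Runs once per message to confirm delivery or surface the error.
        if err is not None:
            print(f"Delivery failed: {err}")
        else:
            print(f"Delivered to {msg.topic()} [partition {msg.partition()}]")

    event = {"order_id": 42, "amount": 19.99, "currency": "EUR"}
    producer.produce(
        "orders",
        key=str(event["order_id"]),
        value=json.dumps(event),
        callback=delivery_report,
    )
    producer.flush()  # block until queued messages are delivered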

Conocimientos

Apache Kafka
Snowflake
SQL
Python
Java
Scala
.NET
Git

Tools

GitLab CI/CD
Jenkins
Azure DevOps

Job description

A leading technology consultancy is hiring a Data Engineer with strong expertise in Apache Kafka and Snowflake. You'll join a fast-growing, remote-first team delivering cutting-edge data solutions to global clients in finance, retail, and other industries.

This is a permanent role open to candidates based in Portugal or Spain, offering exciting projects, autonomy, and long-term career growth.

What You’ll Be Doing

  • Design and manage real-time data pipelines using Apache Kafka
  • Develop scalable ETL/ELT processes for ingesting data into Snowflake (a consumer-to-Snowflake sketch follows this list)
  • Collaborate with cross-functional teams, including analysts, data scientists, and DevOps
  • Apply modern engineering best practices, including CI/CD and version control
  • Provide guidance and mentorship to junior engineers
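
To make the ETL/ELT point concrete, here is a minimal sketch of the consuming side: read events from Kafka and micro-batch them into a Snowflake staging table with the confluent-kafka client and the snowflake-connector-python driver. The connection details, the "orders" topic and the ORDERS_STAGE table are assumptions for illustration, and error handling is trimmed.

    # Illustrative only: Kafka -> Snowflake micro-batch loader.
    import json
    import snowflake.connector
    from confluent_kafka import Consumer

    consumer = Consumer({
        "bootstrap.servers": "localhost:9092",   # placeholder broker
        "group.id": "snowflake-loader",
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe(["orders"])               # hypothetical topic

    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="***",  # placeholders
        warehouse="LOAD_WH", database="ANALYTICS", schema="RAW",
    )
    cursor = conn.cursor()

    batch = []
    try:
        while True:
            msg = consumer.poll(1.0)
            if msg is None or msg.error():
                continue
            event = json.loads(msg.value())
            batch.append({"order_id": event["order_id"], "payload": json.dumps(event)})
            if len(batch) >= 500:                # flush in micro-batches
                cursor.executemany(
                    "INSERT INTO ORDERS_STAGE (ORDER_ID, PAYLOAD) "
                    "VALUES (%(order_id)s, %(payload)s)",
                    batch,
                )
                consumer.commit()                # commit offsets only after the load
                batch.clear()
    finally:
        consumer.close()
        conn.close()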

What We’re Looking For

  • 5+ years of experience as a Data Engineer
  • Expertise in Kafka Streams, Kafka Connect, or Confluent Kafka
  • Hands-on experience with Snowflake (schema design, optimisation); see the sketch after this list
  • Strong SQL skills and/or programming knowledge in Python, Java, Scala, or .NET
  • Experience with Git-based CI/CD tools (e.g. GitLab CI/CD, Jenkins, Azure DevOps)
  • Clear and confident communicator in English
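
On the Snowflake side, "schema design and optimisation" typically means choosing sensible types and clustering keys and checking how well they prune. A small sketch through snowflake-connector-python is below; the ANALYTICS.CORE.ORDERS table and the choice of ORDER_DATE as the clustering key are illustrative assumptions, not details from the posting.

    # Illustrative only: create a clustered table and inspect its clustering.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="***",  # placeholders
        warehouse="TRANSFORM_WH", database="ANALYTICS", schema="CORE",
    )
    cursor = conn.cursor()

    # Clustering on the usual filter column helps prune micro-partitions
    # for date-bounded queries.
    cursor.execute("""
        CREATE TABLE IF NOT EXISTS ORDERS (
            ORDER_ID   NUMBER,
            ORDER_DATE DATE,
            AMOUNT     NUMBER(12, 2),
            CURRENCY   STRING
        )
        CLUSTER BY (ORDER_DATE)
    """)

    # Report how well the table is clustered on that key.
    cursor.execute(
        "SELECT SYSTEM$CLUSTERING_INFORMATION('ORDERS', '(ORDER_DATE)')"
    )
    print(cursor.fetchone()[0])

    cursor.close()
    conn.close()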

Nice to Have

  • Familiarity with Azure services like Data Factory, Data Lake, or Synapse
  • Exposure to Apache Spark, Hadoop, or other distributed data frameworks
  • Understanding of data governance, security, and compliance (e.g. GDPR)