Mid-Senior Data Engineer

Jordan Martorell S.L.

Barcelona

Hybrid

EUR 30,000 - 50,000

Full-time

Today

Vacancy description

A global leader in digital content distribution is seeking a Data Engineer in Barcelona to design, scale, and optimize their data platform. The ideal candidate will build ETL pipelines, manage AWS services, and collaborate with teams to develop robust data solutions. This role offers a hybrid working model and competitive salary with benefits such as health insurance and a training budget.

Benefits

Health insurance
Meal plan
Training budget
Team events
Modern office with views
Free access to digital content

Skills

ETL pipelines
AWS
SQL
Docker
Airflow

Tools

Databricks
Git

Job description
Overview

Our partner is a global leader in digital content distribution, delivering magazines, newspapers, and other media to millions of readers worldwide. Their platform powers publishers, retailers, and service providers by managing large-scale data flows and unlocking new digital revenue streams. They are seeking a Data Engineer in Barcelona to help design, scale, and optimize their data platform, ensuring data is reliable, secure, and actionable for analytics and business decisions.

Responsibilities
  • Build and maintain ETL pipelines in Databricks/PySpark (an illustrative sketch follows this list).
  • Manage and integrate data on AWS (S3, EC2, Lambda, Athena, IAM).
  • Containerize and deploy workflows with Docker/Fargate.
  • Orchestrate pipelines using Apache Airflow.
  • Collaborate with teams to deliver robust, business-driven data solutions.
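
For context on the kind of work these bullets describe, below is a minimal PySpark sketch of an extract-transform-load step; the bucket paths, column names, and aggregation are purely hypothetical placeholders, not details of the actual platform.

    # Illustrative only: a small PySpark ETL step (extract -> transform -> load).
    # The S3 paths and column names are hypothetical placeholders.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("example_daily_counts").getOrCreate()

    # Extract: read raw event data from S3 (hypothetical path).
    raw = spark.read.json("s3://example-bucket/raw/events/")

    # Transform: derive an event date and aggregate events per publication.
    daily = (
        raw.withColumn("event_date", F.to_date("event_ts"))
           .groupBy("event_date", "publication_id")
           .count()
    )

    # Load: write the curated result back to S3 as Parquet.
    daily.write.mode("overwrite").parquet("s3://example-bucket/curated/daily_counts/")

In a stack like the one described above, a job of this shape would typically be orchestrated by an Apache Airflow DAG and packaged in a Docker image for deployment on Fargate.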
Requirements
  • Experience with Databricks or similar platforms.
  • Solid knowledge of AWS services.
  • Proficiency in SQL, Docker, and Airflow.
  • Familiarity with Git.
Nice-to-have
  • Looker or BI tools.
  • MLOps (MLflow, SageMaker).

What’s on offer
  • Competitive salary & benefits (health insurance, meal plan, training budget).
  • Hybrid model – 1 day/week in the Barcelona office.
  • International, collaborative team.
  • Extra perks: team events, modern office with views, free access to digital content.