
Data Engineer

Hays

Salamanca

On-site

EUR 40,000 - 80,000

Full-time

12 days ago


Vacancy description

Join a forward-thinking international consulting firm as a Data Engineer, where you will develop and manage data pipelines using cutting-edge technologies. This role offers an engaging environment with a talented team passionate about Business Intelligence and Advanced Analytics. Enjoy the flexibility of a hybrid work model while contributing to innovative projects in cloud data platforms. With a commitment to best practices and continuous learning, this position is perfect for those eager to make an impact in the world of data. Don't miss this opportunity to advance your career in a dynamic and supportive setting!

Benefits

25 days of holidays
Attractive salary
Excellent work environment
Hybrid work model

Qualifications

  • 4+ years of experience as a Data Engineer.
  • Proven experience with Python or Scala and orchestration tools.

Responsibilities

  • Develop and manage data pipelines using CI/CD practices.
  • Support orchestration of data pipelines with tools like Airflow.

Skills

Python
Scala
Data Warehousing
Data Lake
Data Lakehouse
SQL
CI/CD
DevOps

Tools

Airflow
Oozie
Azure Data Factory
GitHub Actions
Azure DevOps
AWS CI/CD Pipelines
GCP

Job description

From Hays we are hiring a Data Engineer for a specialist international consulting firm delivering services and solutions in Business Intelligence, Advanced Analytics, Big Data & Cloud, and Web & Mobile Applications.

Responsibilities

  • Assist in developing, deploying, and managing data pipelines using CI/CD practices.
  • Support the orchestration of data pipelines with tools like Airflow, Oozie, or Azure Data Factory.
  • Support developing Spark jobs for data processing using Python or Scala.
  • Gather and analyse pipeline requirements with guidance.
  • Learn and apply best practices for building efficient and scalable data pipelines.

Requirements

  • Minimum 4 years of experience as a Data Engineer.
  • Proven experience with Python or Scala.
  • Knowledge of Data Warehousing, Data Lake, and Lakehouse paradigms.
  • Experience with orchestration tools like Airflow or Oozie.
  • Ability to design and implement DevOps strategies for CI/CD with tools such as GitHub Actions, Azure DevOps, AWS CI/CD Pipelines, etc.
  • Knowledge of SQL.
  • Ability to design and implement data ingestion and transformation pipelines.
  • Expertise in building solutions on cloud data platforms (AWS, Azure, or GCP).

Our offer to you!

  • Be part of an international and talented team, enjoying an excellent work environment where people are passionate about what they do.
  • Attractive salary according to experience and skills.
  • Hybrid model in Barcelona and Reus (2 days on-site); fully remote work in Madrid, Valencia, Alicante, and Sevilla.
  • 25 days of holidays.

If you are interested and want to know more about the opportunity, apply for the job and upload your updated CV in English!
