
Cloud Data Engineer (Based in Spain)

buscojobs España

Madrid

Remote

EUR 45,000 - 65,000

Full-time

6 days ago

Job Summary

A leading recruitment company is seeking Cloud Data Engineers in Madrid. The role involves maintaining customer data platforms using GCP/Azure and developing scalable data pipelines utilizing Python and other technologies. Candidates should have extensive experience in data engineering and strong programming skills. This position offers remote work options, but candidates must be based in Spain.

Benefits

Remote mode
Integration in an international project

Requirements

  • 4+ years of overall industry experience, specifically in data engineering.
  • Experience with GCP/Azure and programming in Python and Java.
  • Strong verbal and written skills in English.

Responsibilities

  • Maintain and extend customer data platform using GCP or Azure.
  • Develop and maintain scalable data pipelines leveraging PySpark and Databricks.
  • Build and deploy CI/CD pipelines in GCP/Azure DevOps.

Skills

Data Engineering
Python
Java
GCP
Azure
SQL
Terraform
PySpark
Docker
Kafka

Job Description

At HAYS, a British multinational recruitment and human resources company operating in 33 countries worldwide and listed on the London Stock Exchange, we are currently looking for Cloud Data Engineers to collaborate with international clients headquartered in Málaga and Barcelona, Spain.

What are the requirements for this position?

  • 4+ years of overall industry experience, specifically in data engineering.
  • Experience with Google Cloud Platform (GCP) or Azure.
  • Programming experience in Python and Java, including design, implementation, and unit testing (Pytest, JUnit).
  • Experience using version control (Git), Kafka or Pub/Sub, and Docker.
  • Experience using Terraform.
  • Experience using Apache Beam with Dataflow (a minimal sketch follows this list).
  • Experience with PySpark and Databricks in an Azure environment.
  • Knowledge of clean code principles.
  • Strong verbal and written skills in English.
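
For illustration only, here is a minimal sketch of an Apache Beam pipeline in Python, the kind that runs on Dataflow. The input file, output prefix, and word-count logic are hypothetical examples, not details of this role; locally it runs on the DirectRunner, and on GCP you would switch to the DataflowRunner.

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    def run():
        # DirectRunner executes locally; use runner="DataflowRunner" on GCP.
        options = PipelineOptions(runner="DirectRunner")
        with beam.Pipeline(options=options) as p:
            (
                p
                | "Read" >> beam.io.ReadFromText("events.txt")  # hypothetical input
                | "Split" >> beam.FlatMap(lambda line: line.split())
                | "PairWithOne" >> beam.Map(lambda word: (word, 1))
                | "CountPerKey" >> beam.CombinePerKey(sum)
                | "Format" >> beam.MapTuple(lambda word, n: f"{word},{n}")
                | "Write" >> beam.io.WriteToText("word_counts")  # hypothetical output
            )

    if __name__ == "__main__":
        run()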

Nice to have:

  • Experience working with Google BigQuery or Snowflake (see the sketch after this list).
  • GCP, Azure, or Databricks certified engineer preferred.
  • Knowledge of Cloud Build.
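
As a minimal illustration of BigQuery work in Python, here is a sketch using the google-cloud-bigquery client. The project ID, dataset, and query are hypothetical placeholders, not details from this posting.

    from google.cloud import bigquery

    # Hypothetical project ID; credentials come from the environment
    # (e.g. GOOGLE_APPLICATION_CREDENTIALS).
    client = bigquery.Client(project="my-project")

    query = """
        SELECT customer_id, COUNT(*) AS event_count
        FROM `my-project.analytics.events`  -- hypothetical table
        GROUP BY customer_id
    """
    for row in client.query(query).result():
        print(row.customer_id, row.event_count)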

What are the main tasks?

  • As part of the team, you will be responsible for maintaining and extending the customer data platform, which is built on different GCP services (e.g. Dataflow, Cloud Functions) or Azure services.
  • Develop and maintain scalable data pipelines using GCP or Microsoft Azure services, leveraging PySpark, Python, and Databricks (a minimal sketch follows this list).
  • Platform development is based on Python and Terraform.
  • Furthermore, you will work with SQL-related technologies such as Google BigQuery or Snowflake, and with dimensional data models, to support advanced analytics and reporting capabilities.
  • Design, programming, unit testing (Pytest), quality assurance, and documentation.
  • Design and implement data models using industry-standard tools such as ER/Studio, ERwin, Oracle Data Modeler, or Toad Data Modeler.
  • Build and deploy CI/CD pipelines in GCP or Azure DevOps, automating data workflows and ensuring seamless integration and delivery of data solutions.
  • Work closely with an agile team whose members are located in other countries (the team language is English).
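
For illustration only, a minimal sketch of the kind of PySpark pipeline described above. The paths, column names, and aggregation are hypothetical; on Databricks the SparkSession is already provided as `spark`, so the builder line is only needed for local runs.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # On Databricks, `spark` already exists; this creates one for local runs.
    spark = SparkSession.builder.appName("customer-events").getOrCreate()

    # Hypothetical raw input with an `event_timestamp` column.
    events = spark.read.json("/data/raw/customer_events/")

    # Aggregate events per customer per day.
    daily_counts = (
        events
        .withColumn("event_date", F.to_date("event_timestamp"))
        .groupBy("customer_id", "event_date")
        .agg(F.count("*").alias("event_count"))
    )

    # Write a partitioned, curated dataset (hypothetical path).
    daily_counts.write.mode("overwrite").partitionBy("event_date").parquet(
        "/data/curated/daily_event_counts/"
    )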

What can we offer you?

  • Remote work; however, you must be based in Spain.
  • Integration into an international project at an established company with long-term prospects and continuity.

We are looking for profiles like yours: people who are passionate about technology and want to take on a new challenge. If that sounds like you, apply to this offer with your CV so we can tell you more!
