
Cloud Data Engineer (based in Spain)

Hays

Logroño

Remote

EUR 40,000 - 60,000

Full-time

4 days ago


Job overview

Hays is seeking a Cloud Data Engineer to join their international teams in Spain. You will develop and maintain scalable data pipelines using GCP or Azure services, supporting advanced analytics and reporting capabilities. The position offers remote work, a strong company framework, and a chance to face new technological challenges.

Benefits

Remote work option
Integration into international projects

Qualifications

  • 4+ years of experience in data engineering.
  • Experience with Python, Java, and Google Cloud or Azure.
  • Strong verbal and written skills in English.

Responsibilities

  • Maintain and extend customer data platform using GCP or Azure.
  • Develop scalable data pipelines leveraging Python and Databricks.
  • Design and implement data models using ER/Studio, ERwin, or similar tools.

Skills

Data Engineering
Python
Java
Google Cloud Platform
Azure
Terraform
Apache Beam
PySpark
Databricks
SQL

Job description

At HAYS, a British multinational recruitment and human resources company operating in 33 countries worldwide and listed on the London Stock Exchange, we are currently looking for Cloud Data Engineers to collaborate with international clients headquartered in Málaga and Barcelona, Spain.

What are the requirements for this position?

4+ years of overall industry experience specifically in data engineering

Experience with Google Cloud Platform (GCP) or Azure.

Programming experience in Python and Java, including design, implementation, and unit testing (Pytest, JUnit).

Experience with version control (Git), Kafka or Pub/Sub, and Docker.

Experience using Terraform.

Experience using Apache Beam with Dataflow.

Experience with PySpark and Databricks in an Azure environment.

Knowledge of clean code principles.

Strong verbal and written skills in English.

Nice to have:

Experience working with Google BigQuery or Snowflake.

GCP, Azure, or Databricks certification preferred.

Knowledge of Cloud Build.

What are the main tasks?

As part of the team, you will be responsible for maintaining and extending the customer data platform, which is based on various GCP services (e.g. Dataflow, Cloud Functions) or Azure services.

Develop and maintain scalable data pipelines using GCP or Microsoft Azure services, leveraging PySpark, Python, and Databricks.

The platform development is based on Python and Terraform.

Furthermore, you will work with SQL-related technologies such as Google BigQuery or Snowflake, and with dimensional data models, to support advanced analytics and reporting capabilities.

Design, programming, unit testing (Pytest), quality assurance, and documentation.

Design and implement data models using industry-standard tools such as ER/Studio, ERwin, Oracle Data Modeler, or Toad Data Modeler.

Build and deploy CI/CD pipelines in GCP or Azure DevOps, automating data workflows and ensuring seamless integration and delivery of data solutions.

Work closely in an agile team with members located in other countries (the team language is English).

What can we offer you?

Remote work; however, you must be based in Spain.

Integration into an international project at a well-established company with long-term prospects and continuity.

We are looking for profiles like yours: passionate about technology and eager to take on a new challenge. If that sounds like you, apply with your CV so we can tell you more!
