Cloud Data Engineer (based in Spain)

Hays

Donostia/San Sebastián

Remote

EUR 55,000 - 75,000

Full-time

Posted 2 days ago

Vacancy description

A leading company is seeking a Cloud Data Engineer to work with international clients, primarily leveraging GCP and Azure technologies. Responsibilities include developing scalable data pipelines and maintaining customer data platforms. The role offers remote work potential, provided candidates are based in Spain, and promises an opportunity to integrate into an international project.

Benefits

  • Remote work option
  • Integration in an international project

Requirements

  • 4+ years of overall industry experience in data engineering.
  • Strong verbal and written skills in English.
  • Knowledge of clean code principles.

Responsibilities

  • Maintain and extend customer data platform based on GCP or Azure services.
  • Develop and maintain scalable data pipelines using GCP or Azure services.
  • Design and implement data models using industry-standard tools.

Skills

Python
Java
GCP
Azure
Data Engineering
Docker
Terraform
PySpark
Version control (Git)
Kafka

Job description

At HAYS, a British multinational company offering recruitment and human resources services in 33 countries worldwide and listed on the London Stock Exchange, we are currently looking for Cloud Data Engineers to collaborate with international clients headquartered in Málaga and Barcelona, Spain.

What are the requirements for this position?

  • 4+ years of overall industry experience, specifically in data engineering.
  • Experience with Google Cloud Platform (GCP) and/or Azure.
  • Python and Java programming experience, including design, programming, and unit testing (pytest, JUnit).
  • Experience using version control (Git), Kafka or Pub/Sub, and Docker.
  • Experience using Terraform.
  • Experience using Apache Beam with Dataflow (see the minimal pipeline sketch after this list).
  • Experience with PySpark and Databricks, working in an Azure environment.
  • Knowledge of clean code principles.
  • Strong verbal and written skills in English.
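
To illustrate the Apache Beam requirement, here is a minimal pipeline sketch in Python; it is not taken from the client's codebase, and the bucket paths are hypothetical placeholders. DirectRunner executes locally; swapping in DataflowRunner (with project, region, and temp_location options) would run the same pipeline on GCP Dataflow.

```python
# Minimal Apache Beam pipeline sketch (Python SDK); paths are hypothetical.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # DirectRunner runs locally; use DataflowRunner (plus project, region,
    # and temp_location options) to execute on GCP Dataflow instead.
    options = PipelineOptions(runner="DirectRunner")
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "Read" >> beam.io.ReadFromText("gs://example-bucket/events/*.jsonl")
            | "StripWhitespace" >> beam.Map(str.strip)
            | "DropEmptyLines" >> beam.Filter(bool)
            | "Write" >> beam.io.WriteToText("gs://example-bucket/output/events")
        )


if __name__ == "__main__":
    run()
```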

Nice to have:

  • Experience working with Google BigQuery or Snowflake (a short BigQuery example follows this list).
  • Preferably a GCP, Azure, or Databricks certified engineer.
  • Knowledge of Cloud Build.
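
For the BigQuery item above, a minimal query sketch using the official google-cloud-bigquery Python client; the project, dataset, and table names are hypothetical.

```python
# Minimal BigQuery query sketch; project/dataset/table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()  # uses Application Default Credentials
query = """
    SELECT customer_id, COUNT(*) AS event_count
    FROM `example-project.analytics.events`
    GROUP BY customer_id
    ORDER BY event_count DESC
    LIMIT 10
"""
for row in client.query(query).result():
    print(row.customer_id, row.event_count)
```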

What are the main tasks?

  • As part of the team, you will be responsible for maintaining and extending the customer data platform, built on various GCP services (e.g. Dataflow, Cloud Functions) or Azure services.
  • Develop and maintain scalable data pipelines using GCP or Microsoft Azure services, leveraging PySpark, Python, and Databricks.
  • Platform development is based on Python and Terraform.
  • You will also work with SQL-related technologies such as Google BigQuery or Snowflake, and with dimensional data models, to support advanced analytics and reporting capabilities.
  • Design, programming, unit testing (pytest), quality assurance, and documentation (see the PySpark/pytest sketch after this list).
  • Design and implement data models using industry-standard tools such as ER/Studio, ERwin, Oracle Data Modeler, or Toad Data Modeler.
  • Build and deploy CI/CD pipelines in GCP or Azure DevOps, automating data workflows and ensuring seamless integration and delivery of data solutions.
  • Work closely together in an agile team with members located in other countries (the team language is English).
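
To make the unit-testing expectation concrete, here is a sketch of a small PySpark transformation paired with a pytest test. The column names and the deduplication rule are hypothetical, not the client's actual pipeline logic.

```python
# Sketch: a testable PySpark transform plus a pytest unit test.
# Column names and the dedupe rule are hypothetical.
from pyspark.sql import DataFrame, SparkSession
from pyspark.sql import functions as F


def latest_event_per_id(df: DataFrame) -> DataFrame:
    """Keep only the most recent record for each event_id."""
    latest = df.groupBy("event_id").agg(F.max("updated_at").alias("updated_at"))
    return df.join(latest, on=["event_id", "updated_at"], how="inner")


def test_latest_event_per_id():
    spark = SparkSession.builder.master("local[1]").getOrCreate()
    df = spark.createDataFrame(
        [("e1", 1), ("e1", 2), ("e2", 1)],
        ["event_id", "updated_at"],
    )
    rows = latest_event_per_id(df).collect()
    assert {(r.event_id, r.updated_at) for r in rows} == {("e1", 2), ("e2", 1)}
```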

What can we offer you?

  • Remote work; however, you must be based in Spain.
  • Integration into an international project at a well-established company, with future prospects and continuity.

We are looking for profiles like yours: passionate about technology and eager to take on a new challenge. If that sounds like you, apply to the offer with your CV so we can tell you more!
