
Cloud Data Engineer (Based In Spain)

buscojobs España

Tarragona

Remote

EUR 45,000 - 70,000

Full-time

Posted 4 days ago

Vacancy description

A leading multinational company is seeking Cloud Data Engineers to work with international clients in Spain. Applicants should have substantial experience in data engineering with proficiency in GCP or Azure, Python, and Java. Remote work options and opportunities for career growth are offered as part of this role.

Benefits

Remote work option
Integration into an international project
Opportunities for career growth

Requirements

  • 4+ years of industry experience in data engineering.
  • Experience with Google Cloud Platform (GCP) and/or Azure.
  • Strong verbal and written skills in English.

Responsibilities

  • Maintain and extend customer data platforms based on GCP or Azure services.
  • Develop and maintain scalable data pipelines using GCP or Azure.
  • Build and deploy CI/CD pipelines in GCP or Azure DevOps.

Skills

Data engineering
Python
Java
SQL
GCP
Azure
Docker
Terraform
Apache Beam
PySpark

Tools

Git
Kafka
PubSub
Databricks
Google BigQuery
Snowflake

Job description

At HAYS, a British multinational recruitment and human resources company operating in 33 countries worldwide and listed on the London Stock Exchange, we are looking for Cloud Data Engineers to collaborate with international clients headquartered in Málaga and Barcelona, Spain.

What are the requirements for this position?
  • 4+ years of industry experience in data engineering.
  • Experience with Google Cloud Platform (GCP) and/or Azure.
  • Proficiency in Python and Java programming, including design, programming, and unit testing (Pytest, JUnit).
  • Experience using version control (Git), Kafka or PubSub, and Docker.
  • Experience with Terraform.
  • Experience using Apache Beam with Dataflow.
  • Experience with PySpark and Databricks in an Azure environment.
  • Knowledge of Clean Code principles.
  • Strong verbal and written skills in English.
Nice to have:
  • Experience with Google BigQuery or Snowflake.
  • Engineer-level certification in GCP, Azure, or Databricks.
  • Knowledge of Cloud Build.
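To illustrate the "design, programming, and unit testing (Pytest)" requirement above, here is a minimal sketch of the kind of transformation-plus-test pairing involved; the record schema and field names (`user_id`, `timestamp`, `amount`) are invented for the example and are not from the job posting.

```python
from datetime import datetime, timezone

def normalize_event(raw: dict) -> dict:
    """Normalize one raw event record (invented schema for illustration):
    trim the user id, convert the ISO timestamp to UTC epoch seconds,
    and default a missing amount to 0.0."""
    ts = datetime.fromisoformat(raw["timestamp"]).astimezone(timezone.utc)
    return {
        "user_id": str(raw["user_id"]).strip(),
        "epoch_s": int(ts.timestamp()),
        "amount": float(raw.get("amount", 0.0)),
    }

def test_normalize_event():
    # Pytest discovers test_* functions automatically; plain asserts act as the spec.
    out = normalize_event(
        {"user_id": " 42 ", "timestamp": "2024-05-01T12:00:00+02:00", "amount": "3.5"}
    )
    assert out["user_id"] == "42"
    assert out["epoch_s"] == 1714557600  # 2024-05-01 10:00:00 UTC
    assert out["amount"] == 3.5
```

Keeping transforms as small pure functions like this is what makes the Pytest requirement practical: each pipeline step can be verified without a cluster or cloud connection.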
What are the main tasks?
  • Maintain and extend customer data platforms based on GCP (e.g., Dataflow, Cloud Functions) or Azure services.
  • Develop and maintain scalable data pipelines using GCP or Azure, leveraging PySpark, Python, and Databricks.
  • Work with Python and Terraform for platform development.
  • Utilize SQL technologies like Google BigQuery or Snowflake and dimensional data models for analytics and reporting.
  • Design, program, perform unit testing (Pytest), ensure quality assurance, and document solutions.
  • Design and implement data models using tools such as ER/Studio, ERwin, Oracle Data Modeling, or Toad Data Modeling.
  • Build and deploy CI/CD pipelines in GCP or Azure DevOps, automating data workflows.
  • Collaborate in an agile team with members from different countries; the working language is English.
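One of the tasks above mentions SQL technologies and dimensional data models for analytics. As a hedged sketch of what a dimensional (star-schema) query looks like, the example below uses Python's stdlib sqlite3 purely as a stand-in for Google BigQuery or Snowflake; the table and column names are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Tiny star schema: one fact table joined to a date dimension (illustrative names).
cur.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
CREATE TABLE fact_sales (date_key INTEGER REFERENCES dim_date(date_key), amount REAL);
INSERT INTO dim_date VALUES (20240501, 2024, 5), (20240601, 2024, 6);
INSERT INTO fact_sales VALUES (20240501, 10.0), (20240501, 5.0), (20240601, 7.5);
""")

# A typical reporting query: monthly revenue, grouped via the date dimension.
rows = cur.execute("""
SELECT d.year, d.month, SUM(f.amount) AS revenue
FROM fact_sales f JOIN dim_date d USING (date_key)
GROUP BY d.year, d.month
ORDER BY d.year, d.month
""").fetchall()
```

The same fact/dimension split and GROUP BY pattern carries over directly to BigQuery or Snowflake; only the connection and dialect details change.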
What can we offer you?
  • Remote work option, based in Spain.
  • Integration into an international project with a stable company offering future prospects and continuity.
  • An opportunity for passionate technologists eager for new challenges. Apply with your CV to learn more!