Data Cloud Architect (Azure)

buscojobs España

Lyon

Remote

EUR 50,000 - 80,000

Full time

2 days ago

Job Summary

A leading international company is seeking Cloud Data Engineers to join its team for exciting projects involving GCP and Azure. Candidates should have substantial data engineering experience and expertise in Python, Terraform, and related cloud technologies; the role offers the chance to work in a dynamic, agile environment.

Benefits

Remote work flexibility
Integration into an international team
Career development opportunities

Qualifications

  • 4+ years of data engineering experience required.
  • Strong programming skills in Python and Java.
  • Experience with GCP/Azure services essential.

Responsibilities

  • Maintain and extend customer data platforms using GCP/Azure services.
  • Develop scalable data pipelines leveraging PySpark and Databricks.
  • Build CI/CD pipelines in GCP/Azure DevOps for automated data workflows.

Skills

Data Engineering
Python
Java
GCP
Azure
Terraform
Apache Beam
PySpark
Databricks
SQL

Education

Degree in Computer Science or related field
GCP/Azure/Databricks Certification

Tools

Docker
Git

Job Description

At HAYS, a British multinational company offering recruitment and human resources services in 33 countries worldwide and listed on the London Stock Exchange, we are currently looking for Cloud Data Engineers to collaborate with international clients headquartered in Málaga and Barcelona, Spain.

What are the requirements for this position?

4+ years of overall industry experience, specifically in data engineering.

Experience with Google Cloud Platform (GCP) or Azure.

Python and Java programming experience, including design, implementation, and unit testing (Pytest, JUnit).

Experience using version control (Git), Kafka or Pub/Sub, and Docker.

Experience using Terraform.

Experience using Apache Beam with Dataflow.

Experience with PySpark and Databricks in an Azure environment.

Knowledge of clean code principles.

Strong verbal and written skills in English.

Nice to have:

Experience working with Google BigQuery or Snowflake.

Preferably a GCP, Azure, or Databricks certified engineer.

Knowledge of Cloud Build.

What are the main tasks?

As part of the team, you will be responsible for maintaining and extending the customer data platform, built on various GCP services (e.g. Dataflow, Cloud Functions) or Azure services.
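
For a concrete picture, here is a minimal sketch of the kind of Beam pipeline that typically runs on Dataflow. It is only an illustration under assumed inputs: the bucket paths, the CSV layout, and the parse_event helper are hypothetical, not part of this offer.

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    def parse_event(line):
        # Hypothetical parser: turn "user_id,amount" CSV lines into key/value pairs.
        user_id, amount = line.split(",")
        return (user_id, float(amount))

    def run():
        # Runs on DirectRunner by default; passing --runner=DataflowRunner
        # (plus project, region, temp_location) runs the same code on Dataflow.
        with beam.Pipeline(options=PipelineOptions()) as p:
            (
                p
                | "Read" >> beam.io.ReadFromText("gs://example-bucket/events.csv")
                | "Parse" >> beam.Map(parse_event)
                | "SumPerUser" >> beam.CombinePerKey(sum)
                | "Format" >> beam.MapTuple(lambda user, total: f"{user},{total}")
                | "Write" >> beam.io.WriteToText("gs://example-bucket/out/totals")
            )

    if __name__ == "__main__":
        run()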

Develop and maintain scalable data pipelines using GCP or Microsoft Azure services, leveraging PySpark, Python, and Databricks.
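
A minimal PySpark sketch of such a pipeline, assuming hypothetical Azure storage paths, invented column names, and a Delta output; on Databricks the spark session is provided by the runtime.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders-pipeline").getOrCreate()

    # Hypothetical raw zone: order events land here as Parquet.
    orders = spark.read.parquet("abfss://raw@example.dfs.core.windows.net/orders/")

    # Aggregate completed orders into daily revenue.
    daily_revenue = (
        orders
        .where(F.col("status") == "COMPLETED")
        .groupBy(F.to_date("order_ts").alias("order_date"))
        .agg(F.sum("amount").alias("revenue"))
    )

    # Hypothetical curated zone, written as a Delta table.
    (daily_revenue.write
        .mode("overwrite")
        .format("delta")
        .save("abfss://curated@example.dfs.core.windows.net/daily_revenue/"))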

The platform development is based on Python and Terraform.

Furthermore, you will work with SQL-related technologies such as Google BigQuery or Snowflake, and with dimensional data models, to support advanced analytics and reporting capabilities.
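
As a small illustration of the BigQuery side, a hedged sketch using the google-cloud-bigquery Python client against a hypothetical star schema (the analytics dataset and its fact/dimension tables are invented for the example); Snowflake would use its own connector instead.

    from google.cloud import bigquery

    client = bigquery.Client()  # project and credentials come from the environment

    # Hypothetical dimensional model: one fact table joined to a dimension.
    sql = """
        SELECT d.region, SUM(f.amount) AS revenue
        FROM analytics.fact_orders AS f
        JOIN analytics.dim_customer AS d USING (customer_id)
        GROUP BY d.region
        ORDER BY revenue DESC
    """

    for row in client.query(sql).result():
        print(row.region, row.revenue)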

Design, programming, unit testing (Pytest), quality assurance, and documentation.
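
For instance, a minimal Pytest sketch; the normalize_amount function under test is a hypothetical helper, invented here only to show the testing style.

    import pytest

    def normalize_amount(raw):
        # Hypothetical helper: parse a string like "1.234,56 EUR" into a float.
        return float(raw.replace(" EUR", "").replace(".", "").replace(",", "."))

    def test_normalize_amount_happy_path():
        assert normalize_amount("1.234,56 EUR") == 1234.56

    def test_normalize_amount_rejects_garbage():
        with pytest.raises(ValueError):
            normalize_amount("not a number EUR")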

Design and implement data models using industry-standard tools such as ER/Studio, ERwin, Oracle Data Modeler, or Toad Data Modeler.

Build and deploy CI/CD pipelines in GCP or Azure DevOps, automating data workflows and ensuring seamless integration and delivery of data solutions.

Work closely with an agile team whose members are located in other countries (the team language is English).

What can we offer you?

Remote work; however, you must be based in Spain.

Integration into a long-term international project at a well-established company with strong future prospects.

We are looking for profiles like yours: people who are passionate about technology and want to take on a new challenge. If this sounds like you, apply to this offer with your CV so we can tell you more!
