Cloud Data Engineer (based in Spain), Bilbao
Client:
Hays
Location:
Bilbao, Spain
Job Category:
Other
EU work permit required:
Yes
Job Reference:
3411760099675340832466
Posted:
23.07.2025
Expiry Date:
06.09.2025
Job Description:
HAYS is a British multinational company that offers recruitment and human resources services in 33 countries worldwide and is listed on the London Stock Exchange. We are currently looking for two Cloud Data Engineers to work remotely with international clients headquartered in Málaga and Barcelona, Spain.
What are the requirements for these two positions?
For the Azure position:
- 4+ years of experience in data engineering.
- Experience with Azure.
- Experience with PySpark and Databricks in an Azure environment.
- Python programming experience.
- Strong verbal and written English skills.
Nice to have:
- Experience working with Snowflake.
- Azure/Databricks certification.
For the GCP position:
- 4+ years of experience with Google Cloud Platform (GCP).
- Python and Java programming experience, including design, programming, and unit testing (Pytest, JUnit).
- Experience using Terraform.
- Experience with version control (Git), Kafka or Pub/Sub, and Docker.
- Experience using Apache Beam with Dataflow.
- Knowledge of clean code principles.
- Strong verbal and written English skills.
Nice to have:
- Experience working with Snowflake.
- GCP certification.
What are the main tasks?
For the GCP position:
- As part of the team, you will maintain and extend the client's customer data platform, which is built on various GCP services (e.g. Dataflow, Cloud Functions).
- Develop and maintain scalable data pipelines on GCP; platform development is based on Python and Terraform.
- Work with SQL-related technologies such as Google BigQuery or Snowflake, and with dimensional data models, to support advanced analytics and reporting capabilities.
- Handle design, programming, unit testing (Pytest), quality assurance, and documentation.
- Work closely in an agile team with members located in other countries (team language is English).
For the Azure position:
- Develop and maintain scalable data pipelines using Microsoft Azure services, leveraging PySpark, Python, and Databricks.
- Build and deploy CI/CD pipelines in Azure DevOps, automating data workflows and ensuring seamless integration and delivery of data solutions.
- Design and implement data models using industry-standard tools such as ER/Studio, ERwin, Oracle Data Modeling, or Toad Data Modeling.
- Work closely in an agile team with members located in other countries (team language is English).
What can we offer you?
- Fully remote work; however, you must be based in Spain.
- Integration into an international project at a solid, well-established company with long-term prospects.
We are looking for profiles like yours: passionate about technology and ready to take on a new challenge. If that sounds like you, apply to the offer with your CV so we can tell you more!