
Data Engineer GCP (f/m) - Deutsche Telekom

T-Systems Iberia

Madrid

Hybrid

EUR 50,000 - 70,000

Full-time

Today

Job description

A leading global IT services company is looking for skilled Data Engineers in Madrid, Spain. The role involves managing Google Cloud infrastructure, data processing, and implementing AI solutions. Candidates should have strong expertise in GCP, Python programming, and relevant certifications. The company offers a hybrid work model, flexible schedules, and continuous training opportunities.

Benefits

International positive dynamic work environment
Flexible schedule
Continuous training
Flexible Compensation Plan
Life and accident insurance
More than 26 working days of vacation per year

Qualifications

  • Certified GCP Cloud Architect or Data Engineer required.
  • Strong proficiency in data processing tools and orchestration.
  • Experience with security policies and IAM management.

Responsibilities

  • Deploy and manage infrastructure on Google Cloud Platform (GCP).
  • Utilize Hadoop and PySpark for data processing and transformation.
  • Develop Python applications for GCP services.
  • Integrate data sources from various APIs and servers.
  • Implement AI solutions using Google's Vertex AI.
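As a rough illustration of the data-transformation work described above, the sketch below shows the kind of record cleanup a PySpark job might apply before loading data downstream. It is written as a plain-Python function so it is self-contained; the field names and normalization rules are invented for the example, not taken from the posting.

```python
from datetime import datetime, timezone


def clean_record(raw: dict) -> dict:
    """Normalize one ingested record before loading it downstream.

    In a real PySpark job this logic would typically run as a
    DataFrame transformation; the field names here are hypothetical.
    """
    return {
        # Strip stray whitespace from identifiers coming off FTP/API feeds.
        "customer_id": str(raw["customer_id"]).strip(),
        # Coerce amounts to float and round to cents.
        "amount_eur": round(float(raw.get("amount_eur", 0.0)), 2),
        # Stamp each record with its ingestion time in UTC.
        "ingested_at": datetime.now(timezone.utc).isoformat(),
    }


record = {"customer_id": " 42 ", "amount_eur": "19.999"}
cleaned = clean_record(record)
```

In PySpark the same logic would usually be expressed as column expressions on a DataFrame rather than a per-record Python function, but the normalization steps are the same.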

Skills

Strong proficiency in Google Cloud Platform (GCP)
Expertise in Terraform for infrastructure management
Skilled in Python for application implementation
Experience with GitLab CI/CD for automation
Deep knowledge of network architectures and security implementations
Proficiency in employing data processing tools like Hive and PySpark
Familiarity with managing and integrating diverse data sources
Certified GCP Cloud Architect or Data Engineer
Job description

Project Description:

We are looking for highly skilled Data Engineers to join our team in DBIZ. The Technik Value Stream teams are responsible for data ingest on ODE, development of the relevant data products on ODE, and operations of the data products on ODE.

Activity description and concrete tasks:
  • Infrastructure Deployment & Management: Efficiently deploy and manage infrastructure on Google Cloud Platform (GCP), including network architectures (Shared VPC, Hub-and-Spoke), security implementations (IAM, Secret Manager, firewalls, Identity-Aware Proxy), DNS configuration, VPN, and Load Balancing.
  • Data Processing & Transformation: Use a Hadoop cluster with Hive for querying data and PySpark for data transformations. Implement job orchestration using Airflow.
  • Core GCP Services Management: Work extensively with services such as Google Kubernetes Engine (GKE), Cloud Run, BigQuery, Compute Engine, and Composer, all managed through Terraform.
  • Application Implementation: Develop and implement Python applications for various GCP services.
  • CI/CD Pipelines: Integrate and manage GitLab Magenta CI/CD pipelines to automate cloud deployment, testing, and configuration of diverse data pipelines.
  • Security & Compliance: Implement comprehensive security measures, manage IAM policies and secrets using Secret Manager, and enforce identity-aware policies.
  • Data Integration: Handle integration of data sources from CDI Datendrehscheibe (FTP servers), TARDIS APIs, and Google Cloud Storage (GCS).
  • Multi-environment Deployment: Create and deploy workloads across Development (DEV), Testing (TEST), and Production (PROD) environments.
  • AI Solutions: Implement AI solutions using Google's Vertex AI for building and deploying machine learning models.
  • Certification Desired: Must be a certified GCP Cloud Architect or Data Engineer.
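The job orchestration mentioned above (Airflow) comes down to running tasks in dependency order. As a minimal, Airflow-free sketch of that idea using only the Python standard library (the task names here are invented for illustration):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: ingest from GCS, transform with PySpark,
# load into BigQuery. Each task maps to the set of upstream tasks
# it depends on, much as an Airflow DAG declares dependencies.
pipeline = {
    "transform_pyspark": {"ingest_gcs"},
    "load_bigquery": {"transform_pyspark"},
}

# static_order() yields the tasks in a valid execution order.
execution_order = list(TopologicalSorter(pipeline).static_order())
```

In Airflow itself the same dependencies would be declared between operators (e.g. with the `>>` operator inside a DAG definition), and the scheduler, rather than a manual sort, decides when each task runs.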
Qualifications

Skills Required:
  • Strong proficiency in Google Cloud Platform (GCP)
  • Expertise in Terraform for infrastructure management
  • Skilled in Python for application implementation
  • Experience with GitLab CI/CD for automation
  • Deep knowledge of network architectures, security implementations, and management of core GCP services
  • Proficiency in data processing tools such as Hive and PySpark, and data orchestration tools such as Airflow
  • Familiarity with managing and integrating diverse data sources
  • Certified GCP Cloud Architect or Data Engineer
Additional Information:
What do we offer you?
  • International positive dynamic and motivated work environment.
  • Hybrid work model (telework / face-to-face).
  • Flexible schedule.
  • Continuous training.
  • Flexible Compensation Plan.
  • Life and accident insurance.
  • More than 26 working days of vacation per year.

And many more advantages of being part of T-Systems!

If you are looking for a new challenge, do not hesitate to send us your CV! Please send your CV in English. Join our team!

T-Systems Iberia will only process the CVs of candidates who meet the requirements specified for each offer.

Remote Work: No

Employment Type: Full-time

Key Skills

Apache Hive, S3, Hadoop, Redshift, Spark, AWS, Apache Pig, NoSQL, Big Data, Data Warehouse, Kafka, Scala

Experience: years

Vacancy: 1
