
Data Engineer GCP

T-Systems Iberia

Madrid

Hybrid

EUR 40,000 - 60,000

Full-time

Posted 30+ days ago

Vacancy description

A leading company in cloud solutions seeks highly skilled Data Engineers to join its dynamic team in Madrid. The role involves deploying infrastructure on Google Cloud Platform, using advanced data processing tools, and implementing CI/CD pipelines. Ideal candidates hold GCP certifications and have strong proficiency in Python; the company offers a rewarding environment with continuous training and a hybrid work model.

Benefits

Hybrid work model
Continuous training and certification preparation
Flexible compensation plan
Life and accident insurance
More than 26 working days of vacation
Free specialist services
100% salary in case of medical leave

Qualifications

  • Certified GCP Cloud Architect or Data Engineer required.
  • Experience with Google Cloud Platform and Python applications.
  • Familiarity with data integration and processing tools.

Responsibilities

  • Deploy and manage infrastructure on Google Cloud Platform.
  • Utilize Hadoop and PySpark for data processing.
  • Implement CI/CD pipelines and manage security measures.

Skills

Strong proficiency in Google Cloud Platform (GCP)
Expertise in Terraform
Skilled in Python
Experience with GitLab CI/CD
Deep knowledge of network architectures
Proficiency with data processing tools
Familiarity with data source integration
GCP Cloud Architect certification
Data Engineer certification

Job description

We are looking for highly skilled Data Engineers to join our team in DBIZ. The Technik Value Stream teams are responsible for data ingest on ODE, development of the relevant data products on ODE, and operation of the data products on ODE.

Activity description and concrete tasks:
  1. Infrastructure Deployment & Management: Efficiently deploy and manage infrastructure on Google Cloud Platform (GCP), including network architectures (Shared VPC, Hub-and-Spoke), security implementations (IAM, Secret Manager, firewalls, Identity-Aware Proxy), DNS configuration, VPN, and Load Balancing.
  2. Data Processing & Transformation: Use a Hadoop cluster with Hive for querying data and PySpark for data transformations. Implement job orchestration using Airflow (see the PySpark and Airflow sketches after this list).
  3. Core GCP Services Management: Work extensively with services like Google Kubernetes Engine (GKE), Cloud Run, BigQuery, Compute Engine, and Composer, all managed through Terraform.
  4. Application Implementation: Develop and implement Python applications for various GCP services.
  5. CI/CD Pipelines: Integrate and manage GitLab CI/CD pipelines for automating cloud deployment, testing, and configuration of data pipelines.
  6. Security & Compliance: Implement security measures, manage IAM policies and secrets via Secret Manager, and enforce identity-aware policies (see the Secret Manager sketch below).
  7. Data Integration: Handle integration of data sources from CDI Datendrehscheibe (FTP servers), TARDIS APIs, and Google Cloud Storage (GCS); a GCS sketch follows this list.
  8. Multi-environment Deployment: Create and deploy workloads across Development (DEV), Testing (TEST), and Production (PROD) environments.
  9. AI Solutions: Implement AI solutions using Google's Vertex AI for building and deploying machine learning models (see the Vertex AI sketch below).
  10. Certification: Must be a certified GCP Cloud Architect or Data Engineer.
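
To ground item 2, here is a minimal PySpark sketch that queries a Hive table and writes a transformed data product back. The table and column names (telemetry_raw, vehicle_id, signal_value) are hypothetical placeholders, not names taken from this posting.

```python
# Minimal sketch, assuming a Spark installation with Hive support enabled.
# All table and column names below are invented for illustration.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("ode-telemetry-transform")  # illustrative job name
    .enableHiveSupport()                 # required to query Hive tables
    .getOrCreate()
)

# Query raw data from Hive.
raw = spark.sql("SELECT vehicle_id, signal_value, event_ts FROM telemetry_raw")

# Example transformation: daily average signal per vehicle.
daily_avg = (
    raw.withColumn("event_date", F.to_date("event_ts"))
       .groupBy("vehicle_id", "event_date")
       .agg(F.avg("signal_value").alias("avg_signal"))
)

# Persist the derived data product back to Hive.
daily_avg.write.mode("overwrite").saveAsTable("telemetry_daily_avg")
```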
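
The orchestration half of item 2 could look like the following Airflow sketch, which schedules the job above daily. The DAG id, task id, and spark-submit path are assumptions; on GCP this would typically run on Cloud Composer, possibly with a dedicated Spark operator instead of a plain BashOperator.

```python
# Hedged Airflow 2.x sketch; ids and paths are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="ode_telemetry_daily",  # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    transform = BashOperator(
        task_id="pyspark_transform",
        # Submits the PySpark job from the previous sketch.
        bash_command="spark-submit /jobs/telemetry_transform.py",
    )
```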
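
For the secrets handling in item 6, a sketch using the google-cloud-secret-manager Python client; the project and secret ids are placeholders, and credentials are assumed to come from the ambient service account.

```python
# Minimal sketch; "my-gcp-project" and "db-password" are invented ids.
from google.cloud import secretmanager

def read_secret(project_id: str, secret_id: str, version: str = "latest") -> str:
    """Fetch and decode one secret version from Secret Manager."""
    client = secretmanager.SecretManagerServiceClient()
    name = f"projects/{project_id}/secrets/{secret_id}/versions/{version}"
    response = client.access_secret_version(request={"name": name})
    return response.payload.data.decode("utf-8")

# Usage with placeholder ids:
# db_password = read_secret("my-gcp-project", "db-password")
```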
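
For the GCS source in item 7, a short sketch that pulls a landed file into a pipeline with the google-cloud-storage client. The bucket and object names are invented; the CDI (FTP) and TARDIS (API) sources would need their own connectors.

```python
# Minimal sketch; bucket and object paths are hypothetical.
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("ode-landing-zone")       # placeholder bucket
blob = bucket.blob("cdi/2024-01-01/export.csv")  # placeholder object

# Download the landed file for downstream processing.
blob.download_to_filename("/tmp/export.csv")
```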
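
For item 9, a minimal Vertex AI sketch using the google-cloud-aiplatform SDK to train and deploy an AutoML tabular model. The project, region, GCS path, target column, and display names are all assumptions, not values from the posting.

```python
# Hedged sketch; every id, path, and name below is a placeholder.
from google.cloud import aiplatform

aiplatform.init(project="my-gcp-project", location="europe-west1")

# Register training data that already lives in GCS.
dataset = aiplatform.TabularDataset.create(
    display_name="telemetry",
    gcs_source="gs://ode-landing-zone/training/telemetry.csv",
)

# Train a regression model with AutoML.
job = aiplatform.AutoMLTabularTrainingJob(
    display_name="telemetry-automl",
    optimization_prediction_type="regression",
)
model = job.run(dataset=dataset, target_column="avg_signal")

# Deploy to an endpoint for online predictions.
endpoint = model.deploy(machine_type="n1-standard-4")
```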
Qualifications

Skills Required:
  • Strong proficiency in Google Cloud Platform (GCP)
  • Expertise in Terraform for infrastructure management
  • Skilled in Python for application implementation
  • Experience with GitLab CI/CD for automation
  • Deep knowledge of network architectures, security implementations, and management of core GCP services
  • Proficiency with data processing tools like Hive, PySpark, and orchestration tools like Airflow
  • Familiarity with managing and integrating diverse data sources
  • Certified GCP Cloud Architect or Data Engineer
Additional Information:

What do we offer you?

  • International, positive, dynamic, and motivated work environment.
  • Hybrid work model (telecommuting / on-site).
  • Continuous training: Certification preparation, access to Coursera, weekly English and German classes, etc.
  • Flexible compensation plan: medical insurance, restaurant tickets, daycare, transportation allowances, etc.
  • Life and accident insurance.
  • More than 26 working days of vacation per year.
  • Social fund.
  • Free services from specialists (doctors, physiotherapists, nutritionists, psychologists, lawyers, etc.).
  • 100% salary in case of medical leave.

And many more advantages of being part of T-Systems!

If you are looking for a new challenge, do not hesitate to send us your CV! Please send your CV in English. Join our team!

T-Systems Iberia will only process CVs of candidates who meet the requirements specified for each offer.

Employment Type: Full-time

Vacancy: 1
