We are looking for highly skilled Data Engineers to join our team in DBIZ. The Technik Value Stream teams are responsible for data ingest on ODE, development of the relevant data products on ODE, and operations of those data products on ODE.
Activity description and concrete tasks:
- Infrastructure Deployment & Management: Efficiently deploy and manage infrastructure on Google Cloud Platform (GCP), including network architectures (Shared VPC, hub-and-spoke), security implementations (IAM, Secret Manager, firewalls, Identity-Aware Proxy), DNS configuration, VPN, and load balancing.
- Data Processing & Transformation: Use a Hadoop cluster with Hive for querying data and PySpark for data transformations. Implement job orchestration using Airflow.
- Core GCP Services Management: Work extensively with services such as Google Kubernetes Engine (GKE), Cloud Run, BigQuery, Compute Engine, and Composer, all managed through Terraform.
- Application Implementation: Develop and implement Python applications for various GCP services.
- CI/CD Pipelines: Integrate and manage GitLab CI/CD pipelines to automate cloud deployment, testing, and configuration of data pipelines.
- Security & Compliance: Implement security measures, manage IAM policies and secrets using Secret Manager, and enforce identity-aware policies.
- Data Integration: Integrate data sources from the CDI Datendrehscheibe (FTP servers), TARDIS APIs, and Google Cloud Storage (GCS).
- Multi-environment Deployment: Create and deploy workloads across Development (DEV), Testing (TEST), and Production (PROD) environments.
- AI Solutions: Implement AI solutions using Google's Vertex AI for building and deploying machine learning models.
- Certification: Must be a certified GCP Cloud Architect or Data Engineer.
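To give a feel for the day-to-day work, the extract → transform → load shape of the pipelines described above can be sketched in plain Python. All names and sample records below are hypothetical; in the role itself the transform step would run on PySpark at scale, the load target would be BigQuery, and an Airflow DAG would order the steps.

```python
# Minimal, illustrative ETL sketch (hypothetical data; not a real pipeline).

def extract() -> list[dict]:
    # Stand-in for pulling records from an FTP drop, an API, or a GCS bucket.
    return [
        {"id": 1, "value": "42"},
        {"id": 2, "value": "7"},
        {"id": 2, "value": "7"},  # duplicate to be filtered out
    ]

def transform(records: list[dict]) -> list[dict]:
    # Deduplicate and cast types - the kind of work PySpark would do at scale.
    seen: set = set()
    out: list[dict] = []
    for r in records:
        key = (r["id"], r["value"])
        if key not in seen:
            seen.add(key)
            out.append({"id": r["id"], "value": int(r["value"])})
    return out

def load(records: list[dict]) -> int:
    # Stand-in for writing rows to a warehouse table; returns the row count.
    return len(records)

if __name__ == "__main__":
    print(load(transform(extract())))  # prints 2 (one duplicate removed)
```

In production, each of these functions would become a task in an Airflow DAG, giving retries, scheduling, and dependency ordering for free.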
Qualifications: Skills Required:
- Strong proficiency in Google Cloud Platform (GCP)
- Expertise in Terraform for infrastructure management
- Skilled in Python for application implementation
- Experience with GitLab CI/CD for automation
- Deep knowledge of network architectures, security implementations, and management of core GCP services
- Proficiency with data processing tools like Hive, PySpark, and orchestration tools like Airflow
- Familiarity with managing and integrating diverse data sources
- Certified GCP Cloud Architect or Data Engineer
Additional Information:
What we offer you:
- International, positive, dynamic, and motivated work environment.
- Hybrid work model (telecommuting / on-site).
- Continuous training: Certification preparation, access to Coursera, weekly English and German classes, etc.
- Flexible compensation plan: medical insurance, restaurant tickets, daycare, transportation allowances, etc.
- Life and accident insurance.
- More than 26 working days of vacation per year.
- Social fund.
- Free services from specialists (doctors, physiotherapists, nutritionists, psychologists, lawyers, etc.).
- 100% salary in case of medical leave.
And many more advantages of being part of T-Systems!
If you are looking for a new challenge, do not hesitate to send us your CV (in English). Join our team!
T-Systems Iberia will only process CVs of candidates who meet the requirements specified for each offer.
Remote Work:
Employment Type: Full-time
Vacancy: 1