
Data Engineer

Go Global Travel

Valladolid

On-site

EUR 35,000 - 50,000

Full-time

Posted 17 days ago

Job Summary

Go Global Travel is seeking an experienced Data Engineer to join its R&D team. The role focuses on designing data pipelines and ensuring data integrity in a multicultural, dynamic environment, and offers flexible work arrangements and opportunities for professional growth.

Benefits

Competitive salary & bonuses
Flexible working hours
Remote work options
Learning opportunities and certifications

Requirements

  • 3+ years in Data Engineering or similar roles.
  • Strong skills in Python and SQL.
  • Experience with cloud platforms (AWS, GCP).

Responsibilities

  • Design and build scalable ETL/ELT pipelines.
  • Develop and maintain data models, warehouses, and data lakes.
  • Optimize pipeline performance using Airflow, Spark, and Kubernetes.

Skills

Python
SQL
Data Governance
Data Quality
Data Security

Tools

Airflow
Spark
Docker
AWS
GCP

Job Description

Who We Are: Go Global Travel Group is a fast-growing B2B travel and technology company, operating across six continents with 30 global offices. We simplify the booking process for our partners with a cutting-edge platform and a strong commitment to service, innovation, and global collaboration. Our multicultural team values diverse backgrounds, unique experiences, and mutual support.

About the Role: We are looking for an experienced and detail-oriented Data Engineer to join our R&D team. You'll play a key role in designing and optimizing data pipelines, ensuring data integrity, and enabling data-driven decisions across the organization.

Location: Spain, Italy, Bulgaria, or Romania (on-site or hybrid); Portugal (remote only)

Department: Data Engineering

What You'll Do:

  1. Design and build scalable ETL/ELT pipelines (a minimal sketch follows this list)
  2. Structure and transform raw data using in-house tools
  3. Develop and maintain data models, warehouses, and data lakes
  4. Collaborate with engineers, analysts, and business stakeholders
  5. Implement data quality and governance frameworks
  6. Optimize pipeline performance (Airflow, Spark, Kubernetes)
  7. Ensure data security, documentation, and compliance
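
To make the day-to-day concrete, here is a minimal sketch of the kind of daily pipeline this role would own, written against Airflow's TaskFlow API. The DAG name, task logic, and data shapes are hypothetical, invented purely for illustration.

    from datetime import datetime

    from airflow.decorators import dag, task

    @dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
    def daily_bookings_etl():
        # Hypothetical daily ETL: extract raw bookings, clean them, load them.

        @task
        def extract() -> list[dict]:
            # Stand-in for pulling from an API, a queue, or object storage.
            return [{"booking_id": 1, "amount_eur": 120.0},
                    {"booking_id": 2, "amount_eur": 0.0}]

        @task
        def transform(rows: list[dict]) -> list[dict]:
            # Example cleaning rule: drop zero-amount bookings before loading.
            return [r for r in rows if r["amount_eur"] > 0]

        @task
        def load(rows: list[dict]) -> None:
            # Stand-in for a write to a warehouse such as Snowflake or BigQuery.
            print(f"loading {len(rows)} cleaned bookings")

        load(transform(extract()))

    daily_bookings_etl()

In a real pipeline the extract and load tasks would talk to actual sources and a warehouse, with Spark picking up transformations too heavy for a single worker.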

What You Bring:

  • 3+ years in Data Engineering or similar roles
  • Strong Python and SQL skills
  • Experience with cloud platforms (AWS, GCP)
  • Proficiency with modern data stacks (Airflow, dbt, Snowflake, Redshift, BigQuery)
  • Knowledge of big data tools (Spark, Kafka) and DevOps practices (Docker, CI/CD, Terraform); see the Spark sketch after this list
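
As a rough illustration of the Spark side of that stack, the snippet below runs a simple deduplication and daily-revenue aggregation over a hypothetical bookings dataset; the bucket paths, column names, and table layout are all assumptions made up for the example.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("bookings_daily_revenue").getOrCreate()

    # Hypothetical raw layer; bucket, path, and columns are invented.
    raw = spark.read.parquet("s3://example-bucket/raw/bookings/")

    # Basic data-quality pass: drop null ids, then duplicate bookings.
    clean = (
        raw.filter(F.col("booking_id").isNotNull())
           .dropDuplicates(["booking_id"])
    )

    # Aggregate the cleaned records into a daily revenue mart.
    daily_revenue = (
        clean.groupBy(F.to_date("created_at").alias("booking_date"))
             .agg(F.sum("amount_eur").alias("revenue_eur"))
    )

    daily_revenue.write.mode("overwrite").parquet("s3://example-bucket/marts/daily_revenue/")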

Nice to Have:

  • Experience with real-time data and ML infrastructure
  • Familiarity with GDPR/CCPA and access control standards
  • Exposure to BI tools like Looker or Power BI

What We Offer: Competitive salary & bonuses, flexible working hours and remote options, a dynamic and inclusive workplace, and learning opportunities and certifications.
