Lead GCP Data Engineer / Data Architect
B2B: 26 750 – 40 000 PLN net + VAT | Salary: 20 000 – 31 5[...]

PGS Software

Spain

Hybrid

EUR 40,000 – 60,000

Full-time

Today

Job description

A leading tech company in Spain is seeking an experienced Data Engineer to design and optimize data ingestion pipelines and support product teams with complex data workflows. The ideal candidate has over 6 years of experience in data engineering, strong expertise in ETL/ELT, and a solid foundation in GCP cloud infrastructure. This role offers opportunities for professional growth and access to development budgets.

Benefits

Development budgets
Access to training platforms
MultiSport card

Requirements

  • 6+ years of hands-on experience in data engineering and large-scale distributed systems.
  • Proven expertise in building and maintaining complex ETL/ELT pipelines.
  • Strong GCP cloud infrastructure experience.

Responsibilities

  • Design and optimize data ingestion pipelines.
  • Lead initiatives for platform scalability and reliability.
  • Provide support for product teams with complex pipelines.

Skills

Data ingestion pipeline design
Data engineering in Python
ETL/ELT
Data platform support
Collaboration with data scientists

Tools

GCP
Airflow
GKE
CI/CD tools
BigQuery

Full job description

GCP · Airflow · GKE · Python/Scala · ETL/ELT

Who We Are

While Xebia is a global tech company, our journey in CEE started with two Polish companies – PGS Software, known for world-class cloud and software solutions, and GetInData, a pioneer in Big Data. Today, we’re a team of 1,000+ experts delivering top-notch work across cloud, data, and software. And we’re just getting started.

What We Do

We work on projects that matter – and that make a difference. From fintech and e-commerce to aviation, logistics, media, and fashion, we help our clients build scalable platforms, data-driven solutions, and next-gen apps using ML, LLMs, and Generative AI. Our clients include Spotify, Disney, ING, UPS, Tesco, Truecaller, AllSaints, Volotea, Schmitz Cargobull, Allegro, and InPost.

We value smart tech, real ownership, and continuous growth. We use modern, open-source stacks, and we’re proud to be trusted partners of Databricks, dbt, Snowflake, Azure, GCP, and AWS. Fun fact: we were the first AWS Premier Partner in Poland!

Beyond Projects

What makes Xebia special? Our community. We run events like the Data&AI Warsaw Summit, organize meetups (Software Talks, Data Tech Talks), and have a culture that actively supports your growth via Guilds, Labs, and personal development budgets for both tech and soft skills. It’s not just a job. It’s a place to grow.

What sets us apart?

Our mindset. Our vibe. Our people. And while that’s hard to capture in text – come visit us and see for yourself.

You will be:
  • designing, building, and optimizing the data ingestion pipeline to reliably deliver billions of events daily within defined SLAs (a minimal orchestration sketch follows this list),
  • leading initiatives to improve scalability, performance and reliability,
  • providing support for all product teams in building and optimizing their complex pipelines,
  • identifying and addressing pain points in the existing data platform; proposing and implementing high-leverage improvements,
  • developing new tools and frameworks to streamline the data platform workflows,
  • driving adoption of best practices in data and software engineering (testing, CI/CD, version control, monitoring),
  • working in close collaboration with data scientists and data analysts to help support their work in production,
  • supporting production ML workflows and real-time streaming use cases,
  • mentoring other engineers and contributing to a culture of technical excellence and knowledge sharing.
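As a rough illustration of the orchestration work described in the first bullet, a minimal daily ingestion DAG might look like the sketch below. This is only a sketch: the bucket, dataset, and table names are hypothetical, and it assumes Airflow 2.x with the Google provider package installed.

    # Hypothetical sketch only: a daily DAG loading event files from GCS into a
    # date-partitioned BigQuery table. Bucket, dataset, and table names are
    # placeholders; Airflow 2.x with the Google provider package is assumed.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

    with DAG(
        dag_id="events_ingestion_daily",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",  # Airflow >= 2.4; use schedule_interval on older versions
        catchup=False,
        tags=["ingestion", "gcp"],
    ) as dag:
        # Load the day's newly landed event files into the partition for that date.
        load_events = GCSToBigQueryOperator(
            task_id="load_events_to_bq",
            bucket="example-events-bucket",
            source_objects=["events/{{ ds }}/*.json"],
            destination_project_dataset_table="example-project.analytics.events${{ ds_nodash }}",
            source_format="NEWLINE_DELIMITED_JSON",
            write_disposition="WRITE_TRUNCATE",
            time_partitioning={"type": "DAY"},
        )

A production pipeline delivering billions of events daily within SLA would add backfills, data-quality checks, monitoring, and alerting on top of a skeleton like this.
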
Your profile:
  • 6+ years of hands-on experience in data engineering and large-scale distributed systems,
  • proven expertise in building and maintaining complex ETL/ELT pipelines,
  • deep knowledge of orchestration frameworks (Airflow) and workflow optimization,
  • strong GCP cloud infrastructure experience,
  • GKE experience,
  • expert-level programming in Python or Scala,
  • solid understanding of Spark internals,
  • experience with CI/CD tools (e.g., Jenkins, GitHub Actions) and infrastructure as code,
  • familiarity with managing self-hosted tools like Spark or Airflow on Kubernetes,
  • experience managing a data warehouse in BigQuery,
  • strong communication skills and a proactive, problem-solving mindset,
  • very good command of English (min. C1).

Work from the European Union region and a work permit are required.

Nice to have:
  • working experience with messaging systems like Kafka or Redpanda,
  • experience with real-time data streaming platforms (e.g., Flink, Spark Structured Streaming); a minimal streaming sketch follows this list,
  • familiarity with ML platforms or MLOps workflows,
  • familiarity with Kubeflow, Valido, Looker, and Looker Studio.
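For the streaming items above, a minimal PySpark Structured Streaming job could look like the sketch below. The broker address, topic, and output paths are hypothetical, and the spark-sql-kafka connector is assumed to be on the classpath.

    # Hypothetical sketch only: consume events from a Kafka topic with Spark
    # Structured Streaming and persist them as Parquet. Broker, topic, and paths
    # are placeholders; the spark-sql-kafka connector must be on the classpath.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col

    spark = SparkSession.builder.appName("events-stream-sketch").getOrCreate()

    # Read the raw event stream and cast the Kafka key/value bytes to strings.
    events = (
        spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")
        .option("subscribe", "events")
        .load()
        .select(col("key").cast("string"), col("value").cast("string"), "timestamp")
    )

    # Write to Parquet with a checkpoint so the query can restart safely.
    query = (
        events.writeStream
        .format("parquet")
        .option("path", "/tmp/events/")
        .option("checkpointLocation", "/tmp/events_ckpt/")
        .trigger(processingTime="1 minute")
        .start()
    )
    query.awaitTermination()
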
Recruitment Process

CV review – HR call – Technical Interview (with live coding) – Client Interview (with live coding) – Hiring Manager Interview – Decision

Development:
  • development budgets of up to 6,800 PLN,
  • we fund certifications, e.g. AWS, Azure, ISTQB, PSM,
  • access to Udemy and O'Reilly (formerly Safari Books Online),
  • events and technology conferences,
  • internal training,
  • Xebia Upskill.
We take care of your health:
  • MultiSport card - we subsidise a MultiSport card.