Senior GCP Data Engineer

Xebia

Wrocław

On-site

PLN 120,000 - 180,000

Full time

Today

Job summary

A global tech company is seeking an experienced Data Engineer to design and maintain scalable data solutions. Candidates should have over 5 years of data engineering experience, strong proficiency in GCP services, and expertise in Python and data processing frameworks. This role offers the opportunity to work on transformative data projects in a vibrant team environment.

Benefits

Personal development budgets
Supportive company culture
Opportunities for community events

Qualifications

  • 5+ years of experience in a data engineering role.
  • Hands-on experience in building data processing pipelines.
  • Proficiency with GCP services for large-scale data processing.
  • Extensive experience with Apache Airflow.
  • Strong Python proficiency and expertise in modern data libraries.
  • Experience with ETL tools and processes.
  • Deep understanding of relational and NoSQL databases.

Responsibilities

  • Developing and maintaining data pipelines for seamless data flow.
  • Collaborating with teams to integrate data engineering best practices.
  • Ensuring data integrity and availability across all data systems.
  • Integrating data from various sources into the data lake.
  • Implementing ETL processes for data analytics.

Skills

Data engineering experience
Building data processing pipelines
GCP services proficiency
Apache Airflow
Python proficiency
ETL tools experience
Relational databases knowledge
Data visualization tools
Machine learning understanding

Education

Bachelor's or Master's degree in Computer Science

Tools

Databricks
Snowflake
Spark
SQL
Tableau
Looker

Job description

While Xebia is a global tech company, our journey in CEE started with two Polish companies – PGS Software, known for world-class cloud and software solutions, and GetInData, a pioneer in Big Data. Today, we’re a team of 1,000+ experts delivering top-notch work across cloud, data, and software. And we’re just getting started.

What We Do

We work on projects that matter – and that make a difference. From fintech and e-commerce to aviation, logistics, media, and fashion, we help our clients build scalable platforms, data-driven solutions, and next-gen apps using ML, LLMs, and Generative AI. Our clients include Spotify, Disney, ING, UPS, Tesco, Truecaller, AllSaints, Volotea, Schmitz Cargobull, Allegro, and InPost.

We value smart tech, real ownership, and continuous growth. We use modern, open-source stacks, and we’re proud to be trusted partners of Databricks, dbt, Snowflake, Azure, GCP, and AWS. Fun fact: we were the first AWS Premier Partner in Poland!

Beyond Projects

What makes Xebia special? Our community. We run events like the Data&AI Warsaw Summit, organize meetups (Software Talks, Data Tech Talks), and have a culture that actively supports your growth via Guilds, Labs, and personal development budgets – for both tech and soft skills. It’s not just a job. It’s a place to grow.

What sets us apart?

Our mindset. Our vibe. Our people. And while that’s hard to capture in text – come visit us and see for yourself.

About the role:

As a Data Engineer at Xebia, you will work closely with engineering, product, and data teams to deliver scalable and robust data solutions to our clients. Your key responsibilities will include designing, building, and maintaining data platforms and pipelines, as well as mentoring new engineers.

You will be:
  • developing and maintaining data pipelines to ensure seamless data flow from the Loyalty system to the data lake and data warehouse (a minimal, illustrative sketch of such a pipeline follows this list),
  • collaborating with data engineers to ensure data engineering best practices are integrated into the development process,
  • ensuring data integrity, consistency, and availability across all data systems,
  • integrating data from various sources, including transactional databases, third-party APIs, and external data sources, into the data lake,
  • implementing ETL processes to transform and load data into the data warehouse for analytics and reporting,
  • working closely with cross-functional teams including Engineering, Business Analytics, Data Science and Product Management to understand data requirements and deliver solutions,
  • optimizing data storage and retrieval to improve performance and scalability,
  • monitoring and troubleshooting data pipelines to ensure high reliability and efficiency,
  • implementing and enforcing data governance policies to ensure data security, privacy, and compliance,
  • developing documentation and standards for data processes and procedures.
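
To give a concrete feel for the orchestration work described above, here is a minimal sketch of such a pipeline expressed as an Apache Airflow DAG. Everything in it is a hypothetical placeholder (DAG name, task names, stubbed extract/load helpers), not part of any actual Xebia or client codebase:

```python
# Illustrative sketch only: hypothetical DAG name, task names, and stubbed helpers.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_loyalty_events(**context):
    """Placeholder: pull one day's worth of events from the (hypothetical) Loyalty system API."""
    ...


def load_to_warehouse(**context):
    """Placeholder: load the extracted batch into the data lake / data warehouse."""
    ...


with DAG(
    dag_id="loyalty_to_warehouse",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract = PythonOperator(task_id="extract_loyalty_events",
                             python_callable=extract_loyalty_events)
    load = PythonOperator(task_id="load_to_warehouse",
                          python_callable=load_to_warehouse)

    extract >> load  # simple linear flow: extract first, then load
```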

Job requirements

Your profile:
  • 5+ years in a data engineering role, with hands‑on experience in building data processing pipelines,
  • experience in leading the design and implementation of data pipelines and data products,
  • proficiency with GCP services for large-scale data processing and optimization,
  • extensive experience with Apache Airflow, including DAG creation, triggers, and workflow optimization,
  • knowledge of data partitioning, batch configuration, and performance tuning for terabyte-scale processing,
  • strong Python proficiency, with expertise in modern data libraries and frameworks (e.g., Databricks, Snowflake, Spark, SQL; see the brief illustrative sketch at the end of this section),
  • hands‑on experience with ETL tools and processes,
  • practical experience with dbt for data transformation,
  • deep understanding of relational and NoSQL databases, data modelling, and data warehousing concepts,
  • excellent command of oral and written English,
  • Bachelor's or Master's degree in Computer Science, Information Systems, or a related field,
  • work from the European Union region and a work permit are required.

Nice to have:
  • experience with e-commerce systems and their data integration,
  • knowledge of data visualization tools (e.g., Tableau, Looker),
  • understanding of machine learning and data analytics,
  • certification in cloud platforms (AWS Certified Data Analytics, Google Professional Data Engineer, etc.).
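
Again purely as an illustration of the Python and Spark work referenced in the profile (bucket paths, column names, and the aggregation itself are hypothetical), a simple batch transformation step could look roughly like this:

```python
# Illustrative sketch only: hypothetical bucket paths and column names.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("loyalty_daily_points").getOrCreate()

# Read one day's partition of raw loyalty events from the data lake.
events = spark.read.parquet("gs://example-data-lake/loyalty/events/dt=2024-01-01/")

# Aggregate points per customer for downstream analytics and reporting.
daily_points = (
    events.groupBy("customer_id")
          .agg(F.sum("points").alias("points_earned"),
               F.count("*").alias("event_count"))
)

# Write the result to a staging location for loading into the warehouse.
daily_points.write.mode("overwrite").parquet("gs://example-staging/loyalty_daily_points/")
```
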
Recruitment Process:

CV review – HR call – Interview (with live coding) – Client Interview (with live coding) – Hiring Manager Interview – Decision
