
Senior Data Engineer with Flink

Xebia

Wrocław, Rzeszów

Hybrid

PLN 120,000 - 180,000

Full time

Today

Job summary

A global tech company is seeking a Data Streaming Engineer located in Wrocław, Poland. In this role, you will develop and enhance the Data platform, build real-time data streaming pipelines with Apache Flink, and work closely with clients to optimize their data resources. The ideal candidate has a strong programming background and experience with cloud environments. The position also requires a very good command of English and the right to work from the EU.

Qualifications

  • Proficiency in a programming language like Python, Java, or Scala.
  • Solid experience in building real-time data streaming pipelines with Apache Flink.
  • Familiarity with the Kafka messaging system.
  • Experience in working with cloud environments (AWS / Azure / GCP).
  • Fundamental knowledge of data engineering.

Responsibilities

  • Developing and committing new functionalities and open-source tools.
  • Taking part in R&D, maintenance, and monitoring of the platform’s components.
  • Implementing policies aligned to the strategic plans of the company.
  • Creating and propagating standards of work in projects.
  • Contributing to the organization’s knowledge database.

Skills

Proficiency in Python, Java, or Scala
Building real-time data streaming pipelines
Familiarity with Kafka
Experience with cloud environments (AWS/Azure/GCP)
Knowledge of data engineering
Very good command of English (min. B2)

Tools

Apache Flink
Cloud environments
Kafka

Job description

While Xebia is a global tech company, in Poland we grew out of two teams – PGS Software, known for world-class cloud and software solutions, and GetInData, a pioneer in Big Data. Today, we’re a team of 1,000+ experts delivering top-notch work across cloud, data, and software. And we’re just getting started.

What We Do

We work on projects that matter – and that make a difference. From fintech and e-commerce to aviation, logistics, media, and fashion, we help our clients build scalable platforms, data-driven solutions, and next-gen apps using ML, LLMs, and Generative AI. Our clients include Spotify, Disney, ING, UPS, Tesco, Truecaller, AllSaints, Volotea, Schmitz Cargobull, Allegro, and InPost.

We value smart tech, real ownership, and continuous growth. We use modern, open-source stacks, and we’re proud to be trusted partners of Databricks, dbt, Snowflake, Azure, GCP, and AWS. Fun fact: we were the first AWS Premier Partner in Poland!

Beyond Projects

What makes Xebia special? Our community. We run events like the Data&AI Warsaw Summit, organize meetups (Software Talks, Data Tech Talks), and have a culture that actively supports your growth via Guilds, Labs, and personal development budgets — for both tech and soft skills. It’s not just a job. It’s a place to grow.

What sets us apart?

Our mindset. Our vibe. Our people. And while that’s hard to capture in text – come visit us and see for yourself.

About the role:

As a Data Streaming Engineer, you will play a key role in delivering, developing, implementing, enhancing, and maintaining the Data platform. You will also guide customers and spread best practices across the company, ensuring data resources are used efficiently and well optimized. In addition, you will actively contribute to the continuous improvement and innovation of data streaming processes, fostering a culture of excellence and adaptability within the organization.

You will be:
  • developing and committing new functionalities and open-source tools,
  • taking part in R&D, maintenance, and monitoring of the platform’s components,
  • implementing and executing policies aligned with the company’s strategic plans concerning the technologies used, work organization, etc.,
  • creating and propagating standards of work in projects,
  • undertaking work that requires the application of fundamental principles in a wide and often unpredictable range of contexts,
  • contributing to the organization’s knowledge database.

Job requirements

Your profile:
  • proficiency in a programming language like Python, Java, or Scala,
  • solid experience in building real-time data streaming pipelines with Apache Flink (a minimal illustrative sketch follows this list),
  • familiarity with the Kafka messaging system,
  • experience in working with cloud environments (AWS / Azure / GCP),
  • fundamental knowledge of data engineering, building data platforms, and solutions,
  • understanding of the complexity and relationships between businesses, suppliers, partners, competitors, and clients,
  • ability to actively participate in or lead discussions with clients to identify and assess concrete and ambitious avenues for improvement,
  • very good command of English (min. B2),
  • work from the EU and a valid permit to work in the EU are required.
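
To give a flavour of the kind of pipeline this role centres on, here is a minimal, hypothetical sketch of a Flink DataStream job consuming events from Kafka. It is not taken from any Xebia project: the broker address, topic name, group id, and class name are placeholders, and it assumes the standard flink-streaming-java and flink-connector-kafka dependencies.

    import org.apache.flink.api.common.eventtime.WatermarkStrategy;
    import org.apache.flink.api.common.functions.MapFunction;
    import org.apache.flink.api.common.serialization.SimpleStringSchema;
    import org.apache.flink.connector.kafka.source.KafkaSource;
    import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

    public class KafkaToFlinkSketch {
        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

            // Consume raw string events from a Kafka topic.
            // Broker address, topic, and group id are placeholders, not project values.
            KafkaSource<String> source = KafkaSource.<String>builder()
                    .setBootstrapServers("localhost:9092")
                    .setTopics("events")
                    .setGroupId("flink-sketch")
                    .setStartingOffsets(OffsetsInitializer.latest())
                    .setValueOnlyDeserializer(new SimpleStringSchema())
                    .build();

            DataStream<String> events =
                    env.fromSource(source, WatermarkStrategy.noWatermarks(), "kafka-events");

            // Trivial transformation for illustration only; a real pipeline would parse,
            // window, and aggregate these events before writing to a sink.
            events.map(new MapFunction<String, String>() {
                @Override
                public String map(String value) {
                    return value.trim().toUpperCase();
                }
            }).print();

            env.execute("kafka-to-flink-sketch");
        }
    }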

CV review – HR call – Technical Interview (with Live-coding) – Client Interview (with Live-coding) – Hiring Manager Interview – Decision
