
A global technology company is seeking a Senior Azure Data Platform Engineer to construct a centralized data platform on Azure. The role requires strong skills in cloud technologies, SQL, and Python, with the opportunity to work in a hybrid model. Applicants should have a Bachelor's degree in a relevant field and be proficient in Microsoft Azure, Databricks, and Unity Catalog. Excellent communication skills in English are crucial.
Who We Are
While Xebia is a global tech company, our journey in CEE started with two Polish companies – PGS Software, known for world‑class cloud and software solutions, and GetInData, a pioneer in Big Data. Today, we’re a team of 1,000+ experts delivering top‑notch work across cloud, data, and software. And we’re just getting started.
What We Do
We work on projects that matter – and that make a difference. From fintech and e‑commerce to aviation, logistics, media, and fashion, we help our clients build scalable platforms, data‑driven solutions, and next‑gen apps using ML, LLMs, and Generative AI. Our clients include Spotify, Disney, ING, UPS, Tesco, Truecaller, AllSaints, Volotea, Schmitz Cargobull, Allegro, and InPost.
Beyond Projects
What makes Xebia special? Our community. We run events like the Data&AI Warsaw Summit, organize meetups (Software Talks, Data Tech Talks), and have a culture that actively supports your growth via Guilds, Labs, and personal development budgets — for both tech and soft skills. It’s not just a job. It’s a place to grow.
What sets us apart?
Our mindset. Our vibe. Our people. And while that’s hard to capture in text – come visit us and see for yourself.
The Data & AI Team is currently operating in an on‑premise environment, where data is dispersed across multiple source systems. Our objective is to establish a centralized data platform on Azure that will consolidate this data and enable a broad range of use cases, including operational reporting. The target architecture has already been defined and will primarily leverage Databricks and Unity Catalog within a medallion architecture, along with Azure Data Factory and Kafka for data ingestion.
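For context, a single bronze‑to‑silver step in such a medallion setup on Databricks might look like the sketch below. The catalog, schema, table, and column names are purely illustrative assumptions and not the actual platform design.

```python
# Minimal sketch (hypothetical names) of promoting raw ingested data from the
# bronze layer to a curated silver table, using Unity Catalog three-level names.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: raw records as landed by Azure Data Factory / Kafka ingestion
# (assumed table name "main.bronze.orders_raw").
bronze = spark.table("main.bronze.orders_raw")

# Silver: deduplicate, enforce types, and keep only valid rows.
silver = (
    bronze
    .dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .filter(F.col("order_id").isNotNull())
)

# Write the curated Delta table back to Unity Catalog.
silver.write.format("delta").mode("overwrite").saveAsTable("main.silver.orders")
```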
Working from the European Union region and holding a valid work permit are required.
CV review – HR call – Interview (with live coding) – Client interview (with live coding) – Hiring Manager interview – Decision