
A global jewelry manufacturer in Wrocław is seeking a Data Engineer to design and implement scalable data products on an Azure PaaS data platform. The ideal candidate has 5+ years of experience, expertise in Azure Synapse Analytics and SQL Server, and strong analytical skills, with a focus on delivering high-quality data solutions that enhance decision-making across the organization.
Our client is a global jewelry manufacturer undergoing a major transformation, moving from IaaS-based solutions to a modern Azure PaaS data platform. As part of this journey, you will design and implement scalable, reusable, and high-quality data products using technologies such as Data Factory, Data Lake, Synapse, and Databricks. These solutions will enable advanced analytics, reporting, and data-driven decision-making across the organization. By collaborating with product owners, architects, and business stakeholders, you will play a key role in maximizing the value of data and driving measurable commercial impact worldwide.
Design, build, and maintain scalable, efficient, and reusable data pipelines and products on the Azure PaaS data platform.
Collaborate with product owners, architects, and business stakeholders to translate requirements into technical designs and data models.
Enable advanced analytics, reporting, and other data-driven use cases that support commercial initiatives and operational efficiencies.
Ingest, transform, and optimize large, complex data sets while ensuring data quality, reliability, and performance.
Apply DevOps practices, CI/CD pipelines, and coding best practices to ensure robust, production-ready solutions.
Monitor and own the stability of delivered data products, ensuring continuous improvements and measurable business benefits.
Promote a "build-once, consume-many" approach to maximize reuse and value creation across business verticals.
Contribute to a culture of innovation by following best practices while exploring new ways to push the boundaries of data engineering.