Company description
Where creativity meets innovation
Founded in 1926, Publicis Groupe stands as the world’s most valuable communications holding company, renowned for its creative capabilities that drive innovation across the industry. With a legacy of creative thinking, cutting-edge technology, and digital expertise, we empower clients worldwide to navigate their digital transformation journeys seamlessly. With three Solution Hubs (Publicis Communications, Publicis Media, and Epsilon), we’re dedicated to crafting compelling narratives and impactful strategies that resonate on a global scale, ensuring our clients stay ahead in a rapidly evolving digital landscape.
Publicis Groupe Middle East (PGME) is headquartered in the UAE, with a presence in KSA, Jordan, Turkey, Egypt, Kuwait, Iraq, and Lebanon. Our vibrant culture is built on teamwork, flexibility, and continuous learning. We’re a dynamic bunch who love to create and grow together. The evolution never stops.
Thanks to our collaborative approach and incredible talent, we won Network Agency of the Year at Dubai Lynx in 2023 and 2024, among many other exciting awards, making us the most awarded agency network in the region.
Fast Company also recognizes us as the ‘Most Innovative Company’ for our Power of One model.
We’re all about pushing boundaries, embracing new challenges, and having a lot of fun along the way.
Join us and experience our culture first‑hand.
Overview
The Data Engineer Manager plays a critical role in leading end‑to‑end data engineering initiatives across cloud and enterprise data platforms for a banking client. This role ensures that business and analytical needs are translated into scalable data pipelines, robust data models, and high‑quality data products that support reporting, marketing performance, and advanced analytics use cases.
This role oversees data ingestion, transformation, and governance processes, ensuring data reliability, security, and accessibility for cross‑functional teams. The Data Engineer Manager collaborates closely with IT, business analytics, and marketing stakeholders to automate data flows, enable single‑source‑of‑truth environments, and drive continuous improvement in data quality and operational efficiency.
Responsibilities
Technical & Product
- Design, develop, and maintain scalable data pipelines, data models, and ETL/ELT processes using Azure Data Factory, Azure Databricks, and Azure Data Lake.
- Ensure timely ingestion, transformation, and availability of high‑quality data across Azure environments to support reporting and analytics needs.
- Collaborate with IT/engineering teams to onboard new data sources, integrate APIs, and enhance Azure‑based data architecture.
- Perform data validation and apply data governance practices to ensure accuracy, reliability, and consistency across enterprise datasets.
- Drive automation using Azure tools to reduce manual processes and improve data delivery speed and efficiency.
- Maintain secure and compliant Azure data environments aligned with enterprise security and regulatory standards.
- Optimize data workflows, Databricks notebooks, and system performance for efficiency, scalability, and cost‑effectiveness.
Business Acumen & Operational Efficiency
- Understand business objectives and translate them into efficient data engineering and BI solutions.
- Evaluate operational processes and identify opportunities to automate, streamline, and reduce manual effort through Azure‑based workflows.
- Apply structured thinking to assess business requirements, prioritize tasks, and recommend scalable data solutions.
- Partner with stakeholders to understand pain points and propose data‑driven improvements that enhance operational performance.
- Ensure data solutions are cost‑efficient, reliable, and aligned with business value and ROI goals.
- Continuously monitor system performance and suggest optimizations to improve delivery speed and overall efficiency.
Innovative & Analytical Mindset
- Use analytical thinking to evaluate complex data challenges and propose creative, modern solutions.
- Explore emerging Azure, Databricks, and BI technologies to enhance engineering practices and improve data capabilities.
- Bring an innovation‑first mindset to redesign workflows, improve architecture, and adopt best‑in‑class tools.
- Break down technical problems into structured, actionable steps that enable clear decision‑making.
- Apply critical thinking to validate assumptions, analyze root causes, and design forward‑looking solutions.
- Challenge traditional approaches constructively and introduce innovative alternatives that strengthen the data platform.
Qualifications
Education and Experience Required
- Bachelor’s degree in Computer Science, Information Systems, Data Engineering, or a related field.
- 5+ years of experience in Data Engineering and BI development with strong hands‑on technical exposure.
- Advanced experience with Microsoft Azure Modern Data Platform, including Azure Data Factory, Azure Databricks, Azure Synapse, and Azure Data Lake.
- Strong expertise in Databricks (PySpark, Delta Lake) for data processing and transformation.
- Advanced SQL skills and strong understanding of data warehousing and ETL/ELT concepts.
- Solid experience in Power BI, including data modeling, DAX, and building optimized dashboards.
- Working knowledge of Azure DevOps, CI/CD pipelines, and governance best practices for BI and data engineering projects.
- Ability to design, build, and optimize scalable data pipelines and analytics workflows on Microsoft Azure.
- Strong understanding of BI and analytics principles with the ability to translate business requirements into technical solutions.
- Excellent communication and stakeholder management skills.
Additional information
R-3493 P-3952