
A global consulting firm with offices in Manchester and Birmingham seeks a Manager in Data Engineering to lead teams and design advanced data solutions. The ideal candidate has 4+ years of experience in data engineering, strong Python skills, and familiarity with cloud platforms. Responsibilities include leading data engineering projects, providing technical direction, and ensuring quality in data solutions. The role offers competitive benefits, including flexible working and private medical cover.
Internal Firm Services
Technology
IFS - Information Technology (IT)
Manager
You’ll be joining our Data & AI capability as a Manager in the Data Engineering team, leading one or more teams to design and deliver advanced data solutions that address complex challenges for PwC and its clients. Operating at the forefront of data engineering, you’ll support projects across various industries, such as healthcare and financial services, shaping and scaling the data platforms that underpin analytics, AI and machine learning.
You’ll combine hands-on technical delivery with team leadership, setting the direction for robust, modern data engineering in a cross-functional environment and collaborating across business and technology to deliver tangible value from data.
We’re looking for a motivated self-starter, comfortable with ambiguity and experienced in managing cross-functional delivery, with 4+ years of data engineering experience, to join us in either our Manchester or Birmingham office.
Leading and developing teams of data engineers, creating a collaborative, high-performing environment focused on building reliable and scalable data solutions.
Providing technical direction for the design, build and support of data pipelines, data platforms and analytics infrastructure, ensuring alignment with organisational goals and industry best practices.
Contributing hands-on to solution design, development and troubleshooting, including code reviews and the resolution of complex technical issues.
Building data engineering capability by driving adoption of modern techniques, tools and patterns, and supporting the professional growth of your teams and the wider Data & AI capability.
Engaging stakeholders across the business, technology partners and clients to understand requirements, set priorities and deliver impactful data solutions.
Ensuring quality by overseeing the development, deployment and validation of data solutions, maintaining high standards of accuracy, reliability and performance.
Proven experience leading or managing data engineering teams or workstreams in complex environments.
Strong object-oriented Python skills for developing, testing and packaging code, including experience with tools such as Git and Cliff, and familiarity with frameworks such as PyTorch and TensorFlow where relevant to data and AI workloads.
Experience with Apache Spark for large‑scale data processing.
Effective use of coding tools such as Cursor, GitHub Copilot and similar to accelerate high‑quality delivery.
Experience developing APIs using FastAPI or similar technologies to expose data and analytics services.
Strong understanding of business intelligence needs and optimising data transformations for AI and BI applications.
Solid understanding of best practices in data engineering architecture, including data modelling, orchestration, testing and observability.
Familiarity with SDLC methodologies such as SAFe Agile, and experience applying them to data engineering projects.
Experience using repositories and DevOps tooling including GitHub and Azure DevOps.
Hands‑on experience with major data engineering tools and platforms such as Databricks, Microsoft Fabric, Azure Data Factory and Palantir.
Experience with at least one major cloud platform (Azure, AWS or GCP), ideally more than one, for data engineering workloads.
No matter where you may be in your career or personal life, our benefits are designed to add value and support, recognising and rewarding you fairly for your contributions. We offer a range of benefits, including empowered flexibility and a working week split between office, home and client site; private medical cover and 24/7 access to a qualified virtual GP; six volunteering days a year; and much more.
Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Coaching and Feedback, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, and 32 more
Apache Hive, S3, Hadoop, Redshift, Spark, AWS, Apache Pig, NoSQL, Big Data, Data Warehouse, Kafka, Scala
Full-Time