
Data Engineering Consultant - Pyspark

Talan

City Of London

Hybrid

GBP 60,000 - 80,000

Full time


Job summary

A premier consultancy firm in London seeks an experienced Data Engineer to optimise data management solutions using PySpark, Python, and SQL while collaborating closely with clients. The role involves supporting cloud platforms and ensuring high-quality project delivery. The company offers competitive compensation, extensive training opportunities, and a flexible working environment.

Benefits

25 days holiday plus bank holidays
Private medical insurance
Life cover
Cycle to work scheme
Employee assistance programme
Online learning via Udemy

Qualifications

  • Proven experience with PySpark and Apache Spark.
  • Experience with cloud platforms such as AWS, Azure, or GCP.
  • Strong programming skills in Python and SQL.
  • Knowledge of CI/CD practices and GitHub.
  • Experience in data processing frameworks.

Responsibilities

  • Support development of Azure Databricks Lakehouse platform.
  • Build and maintain data processing frameworks using Python.
  • Collaborate with Cloud and Data Architecture teams.
  • Take ownership of client requirements and deliver solutions.

Skills

Team player
Client-facing skills
Communication skills
Negotiating skills
Creative thinking
Leadership

Education

Degree in Computer Science or related field

Tools

PySpark
Apache Spark
Python
SQL
GitHub
Azure Databricks
Tableau
Power BI

Job description

Overview

Talan Data x AI is a leading Data Management and Analytics consultancy. We work closely with top software vendors and industry experts across a range of sectors to unlock value and insight from our clients' data. At Talan Data x AI, innovation is at the heart of our client offerings, and we help companies improve their efficiency with modern processes and technologies such as Machine Learning (ML) and Artificial Intelligence (AI).

Our consultants are at the heart of everything we do, and we have been recertified as a 2025 Great Place to Work. This achievement not only highlights Talan Data x AI’s positive organisational culture but also strengthens its reputation as an employer of choice within the industry. We invest heavily in the training and development of our teams and hold regular socials in each region to encourage engagement and network building.

Skills and attributes
  • An excellent team player who can also work independently.
  • Excellent client-facing skills, with experience on client projects.
  • A proactive self-starter.
  • Excellent verbal and written communication and presentation skills.
  • Ability to build internal and external relationships.
  • Effective negotiating and influencing skills.
  • Ability to think creatively and propose innovative solutions.
  • Leadership skills.
To qualify for this role
  • Proven experience and knowledge of PySpark and Apache Spark, including the fundamentals of how it works (illustrated in the first sketch below).
  • Experience with cloud platforms (e.g., AWS, Azure, GCP) and data lake architectures.
  • Strong programming skills in languages such as Python and SQL, including the ability to write complex SQL queries.
  • Experience using GitHub and CI/CD practices.
  • Experience of DevOps and infrastructure deployments (Azure and Databricks).
  • A proactive awareness of industry standards, regulations, and developments.
  • Multi-skilled experience in one or more of the following disciplines: Data Management, Data Engineering, Data Warehousing, Data Modelling, Data Quality, Data Integration, Data Analytics, Data Visualisation, Data Science, and Business Intelligence.
  • Project experience with one or more of the following technologies (Tableau, Power BI, cloud platforms such as Azure, AWS, GCP, and Snowflake) and their integration with Databricks is advantageous.
Responsibilities
  • Support development of the Azure Databricks Lakehouse platform, shaping frameworks and solutions that other engineering teams will adopt in future data projects.
  • Build, optimise, and maintain data processing frameworks using Python, ensuring performance, scalability, and maintainability.
  • Support DBT integration and best practices for transformation pipelines within Databricks.
  • Apply software engineering principles, including:
    • Full development lifecycle management
    • Source control, automated testing, and CI/CD (see the second sketch below)
    • Design patterns and reusable solutions
    • Coding standards and patterns
  • Collaborate with technical solution authorities, ensuring alignment with governance, design decisions, and platform standards.
  • Collaborate closely with the Cloud Architecture and Data Architecture teams to deliver approved solutions.
  • Take ownership of requirements, communicate effectively across teams, and deliver high-quality solutions.
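
For illustration only, here is a minimal PySpark sketch of the kind of transformation work described above: deduplicating raw records and producing an aggregated table. The path, table, and column names are hypothetical placeholders invented for this example, not part of any actual Talan codebase.

# Minimal sketch, assuming PySpark is installed; all names are hypothetical.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Read raw events from a hypothetical landing path.
raw = spark.read.parquet("/mnt/raw/events")

# Keep only the latest record per event_id (a common deduplication pattern).
latest_first = Window.partitionBy("event_id").orderBy(F.col("ingested_at").desc())
deduped = (
    raw.withColumn("rn", F.row_number().over(latest_first))
       .filter(F.col("rn") == 1)
       .drop("rn")
)

# Aggregate total spend and event counts per customer per calendar month.
summary = (
    deduped
    .groupBy("customer_id", F.date_trunc("month", F.col("event_ts")).alias("month"))
    .agg(F.sum("amount").alias("total_spend"), F.count("*").alias("event_count"))
)

# Persist as a managed table for downstream consumers.
summary.write.mode("overwrite").saveAsTable("analytics.customer_monthly_spend")

The same logic could equally be expressed as a SQL query via spark.sql; the DataFrame form is shown because it composes well into the kind of reusable Python frameworks the role describes.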
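A second sketch shows how such a transformation might be covered by an automated test in a CI pipeline, using pytest with a local Spark session. The add_total function and its columns are invented for the example.

# Minimal test sketch, assuming pytest and PySpark are available.
import pytest
from pyspark.sql import SparkSession, functions as F

@pytest.fixture(scope="session")
def spark():
    # A small local session is enough for unit-testing transformations.
    return SparkSession.builder.master("local[1]").appName("tests").getOrCreate()

def add_total(df):
    # Hypothetical transformation under test: total = net + tax.
    return df.withColumn("total", F.col("net") + F.col("tax"))

def test_add_total(spark):
    df = spark.createDataFrame([(10.0, 2.0)], ["net", "tax"])
    row = add_total(df).collect()[0]
    assert row["total"] == pytest.approx(12.0)

Tests like this would typically run on every push via GitHub Actions or a similar CI service, which is what the CI/CD items above refer to.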
You must be
  • Willing to work on client sites, potentially for extended periods.
  • Willing to travel for work and happy to stay away from home for extended periods.
  • Eligible to work in the UK without restriction.
What we offer
  • BDP Plus – A reward programme whereby you accrue points to trade against a 3-month paid sabbatical or cash equivalent.
  • 25 days holiday + bank holidays.
  • 5 days holiday buy/sell option.
  • Private medical insurance.
  • Life cover.
  • Cycle to work scheme.
  • Eligibility for company pension scheme (5% employer contribution, salary sacrifice option).
  • Employee assistance programme.
  • Bespoke online learning via Udemy for Business.