Data Engineering Consultant - Pyspark

Techwolf

City Of London

On-site

GBP 50,000 - 70,000

Full time

2 days ago

Job summary

A leading data consultancy in the UK is looking for a skilled data engineer to build and maintain data frameworks. The role requires proficiency in PySpark, cloud technologies, and strong programming skills in Python and SQL. Additional skills include excellent communication and problem-solving abilities. This position offers a range of employee benefits and opportunities for professional development.

Benefits

25 days holiday + bank holidays
Private medical insurance
Life cover
Cycle to work scheme
Employee assistance programme
Online learning via Udemy

Qualifications

  • Proven experience with PySpark and Apache Spark.
  • Experience with cloud platforms and data lake architectures.
  • Strong programming skills in Python and SQL.
  • Experience with DevOps and infrastructure deployments.
  • Ability to write complex SQL queries.
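
The "complex SQL queries" qualification usually implies constructs such as CTEs and window functions. As a purely illustrative sketch (not part of the posting), here is that kind of query run against an in-memory SQLite database using Python's standard library; the `orders` table and its columns are invented for the example.

```python
import sqlite3

# Illustrative only: a "complex" query in the sense job specs usually mean,
# combining a CTE with a window function. Runs entirely in memory.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (region TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('UK', 100.0), ('UK', 250.0), ('EU', 80.0), ('EU', 40.0);
""")

query = """
WITH regional AS (
    SELECT region, SUM(amount) AS total
    FROM orders
    GROUP BY region
)
SELECT region,
       total,
       RANK() OVER (ORDER BY total DESC) AS rank_by_total
FROM regional
ORDER BY rank_by_total;
"""
rows = list(conn.execute(query))
for row in rows:
    print(row)   # ('UK', 350.0, 1) then ('EU', 120.0, 2)
conn.close()
```

The same pattern (aggregate in a CTE, rank with a window function) translates directly to Spark SQL on Databricks.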

Responsibilities

  • Build and maintain data processing frameworks using Python.
  • Support development of the Azure Databricks platform.
  • Collaborate with technical teams to ensure solution alignment.
  • Take ownership of requirements and communicate effectively.
  • Apply software engineering principles in all tasks.

Skills

Team player
Client facing skills
Proactive
Communication skills
Negotiation skills
Creative thinking
Leadership skills

Tools

PySpark
Apache Spark
AWS
Azure
GCP
GitHub
CI/CD
DBT
Tableau
Power BI
Snowflake

Job description
Company Description

Talan Data x AI is a leading Data Management and Analytics consultancy, working closely with major software vendors and top industry experts across a range of sectors to unlock value and insight from their data. At Talan Data x AI, innovation is at the heart of our client offerings, and we help companies improve their efficiency with modern processes and technologies such as Machine Learning (ML) and Artificial Intelligence (AI).

Our consultants are at the heart of everything we do, and we have been recertified as a Great Place to Work for 2025. This achievement not only highlights our positive organisational culture but also strengthens our reputation as an employer of choice within the industry. We invest heavily in the training and development of our teams and hold regular socials in each region to encourage engagement and network building.

Job Description

Your skills and attributes for success:

  • An excellent team player and able to work independently.
  • Excellent client facing skills with experience on client projects.
  • A self-starter who is proactive in nature.
  • Excellent verbal and written communication and presentation skills.
  • Ability to build internal and external relationships.
  • Effective negotiating and influencing skills.
  • Ability to think creatively and propose innovative solutions.
  • Leadership skills.

To qualify for this role, you must have:

  • Proven experience with PySpark and Apache Spark, including a solid understanding of how Spark works.
  • Experience with cloud platforms (e.g., AWS, Azure, GCP) and data lake architectures.
  • Strong programming skills in languages such as Python and SQL, including the ability to write complex SQL queries.
  • Experience with GitHub and CI/CD practices.
  • Experience of DevOps and infrastructure deployments (Azure and Databricks).
  • A proactive awareness of industry standards, regulations, and developments.
  • Multi-skilled experience in one or more of the following disciplines: Data Management, Data Engineering, Data Warehousing, Data Modelling, Data Quality, Data Integration, Data Analytics, Data Visualisation, Data Science and Business Intelligence.
  • Project experience with one or more of the following technologies (Tableau, Power BI, Azure, AWS, GCP, Snowflake) and their integration with Databricks is advantageous.

Your responsibilities will include:

  • Supporting development of the Azure Databricks Lakehouse platform, shaping frameworks and solutions that other engineering teams will adopt in future data projects.
  • Building, optimising, and maintaining data processing frameworks using Python, ensuring performance, scalability, and maintainability.
  • Supporting DBT integration and best practices for transformation pipelines within Databricks.
  • Applying software engineering principles, including:
    • Full development lifecycle management
    • Source control, automated testing, CI/CD
    • Design patterns and reusable solutions
    • Coding standards
  • Collaborating with technical solution authorities to ensure alignment with governance, design decisions, and platform standards.
  • Collaborating closely with the Cloud Architecture and Data Architecture teams to deliver approved solutions.
  • Taking ownership of requirements, communicating effectively across teams, and delivering high-quality solutions.
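
The "automated testing" practice mentioned above can be illustrated with a minimal sketch (not part of the posting). The function, its cleaning rules, and the test data below are all invented for the example; in a real pipeline, unit tests in this style would run in CI on every commit.

```python
def normalise_records(records):
    """Hypothetical cleaning step: drop rows with no amount and
    standardise region codes to upper case."""
    return [
        {"region": r["region"].strip().upper(), "amount": float(r["amount"])}
        for r in records
        if r.get("amount") is not None
    ]

def test_normalise_records():
    raw = [
        {"region": " uk ", "amount": "10.5"},
        {"region": "eu", "amount": None},   # missing amount: dropped
    ]
    assert normalise_records(raw) == [{"region": "UK", "amount": 10.5}]

test_normalise_records()
print("all tests passed")
```

Keeping transformation steps as small pure functions like this is what makes them testable at all, which is the practical substance of the "coding standards and reusable solutions" bullets.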

You must be:

  • Willing to work on client sites, potentially for extended periods.
  • Willing to travel for work purposes and be happy to stay away from home for extended periods.
  • Eligible to work in the UK without restriction.

#TalanUK #LI-HB1

Additional Information

What we offer:

  • BDP Plus – A reward programme whereby you accrue points to trade against a 3-month paid sabbatical or cash equivalent.
  • 25 days holiday + bank holidays.
  • 5 days holiday buy/sell option.
  • Private medical insurance.
  • Life cover.
  • Cycle to work scheme.
  • Eligibility for company pension scheme (5% employer contribution, salary sacrifice option).
  • Employee assistance programme.
  • Bespoke online learning via Udemy for Business.