Data Engineering Consultant

Talan

City Of London

On-site

GBP 50,000 - 80,000

Full time


Job summary

A leading Data Management consultancy in London is seeking a skilled Data Engineer with expertise in PySpark and AWS. This role involves optimizing data frameworks, collaborating with architecture teams, and engaging with clients directly. The ideal candidate has strong programming skills and a proactive approach to industry standards. The position offers 25 days of holiday, private medical insurance, and opportunities for professional development.

Benefits

25 days holiday + bank holidays
Private medical insurance
Life cover
Cycle to work scheme
Company pension scheme

Qualifications

  • Proven experience with PySpark and Apache Spark.
  • Core experience with AWS and Azure.
  • Strong programming skills in Python and SQL.
  • Experience with cloud platforms and data lake architectures.
  • Experience in building and optimizing data frameworks.

Responsibilities

  • Support development of Azure Databricks Lakehouse platform.
  • Build, optimize, and maintain data processing frameworks.
  • Collaborate with architecture teams to deliver approved solutions.
  • Manage stakeholder requirements and deliver quality solutions.
  • Maintain a proactive awareness of industry standards and regulations.

Skills

Team player
Client-facing skills
Proactive self-starter
Communication skills
Relationship building
Negotiation skills
Creative thinking
Leadership skills

Tools

PySpark
Apache Spark
AWS
Azure
Python
SQL
GitHub
Terraform
Databricks

Job description

Overview

Talan Data x AI is a leading Data Management and Analytics consultancy, working closely with leading software vendors and top industry experts across a range of sectors, unlocking value and insight from their data. At Talan Data x AI, innovation is at the heart of our client offerings, and we help companies to further improve their efficiency with modern processes and technologies, such as Machine Learning (ML) and Artificial Intelligence (AI).

Our consultants are at the heart of everything we do, and we have been recertified as a 2025 Great Place to Work. This achievement not only highlights Talan Data x AI’s positive organisational culture but also strengthens its reputation as an employer of choice within the industry. We invest heavily in the training and development of our teams and hold regular socials in each region to encourage engagement and network building.

Skills and attributes for success
  • An excellent team player and able to work independently.
  • Excellent client-facing skills, with experience on client projects.
  • A self-starter who is proactive in nature.
  • Excellent verbal and written communication and presentation skills.
  • Ability to build internal and external relationships.
  • Effective negotiating and influencing skills.
  • Ability to think creatively and propose innovative solutions.
  • Leadership skills.
To qualify for this role, you must have
  • Proven experience and knowledge of PySpark and Apache Spark, including the fundamentals of how Spark works.
  • Core experience with AWS, alongside a substantial and mature Azure platform offering.
  • Experience with other cloud platforms (e.g. Azure, GCP) and with data lake architectures.
  • Strong programming skills in languages such as Python and SQL, including the ability to write complex SQL queries.
  • Experience using GitHub and CI/CD practices.
  • Experience of DevOps and infrastructure deployments (Azure and Databricks).
  • A proactive awareness of industry standards, regulations, and developments.
  • Multi-skilled experience in one or more of the following disciplines: Data Management, Data Engineering, Data Warehousing, Data Modelling, Data Quality, Data Integration, Data Analytics, Data Visualisation, Data Science, and Business Intelligence.
  • Proficiency in Infrastructure as Code tools, especially Terraform for cloud resource provisioning (AWS, Azure, GCP).
  • Project experience using one or more of the following technologies is advantageous: Tableau, Power BI, Azure, AWS, GCP, Snowflake, and their integration with Databricks.
Your responsibilities will include
  • Supporting development of the Azure Databricks Lakehouse platform, shaping frameworks and solutions that other engineering teams will adopt in future data projects.
  • Building, optimising, and maintaining data processing frameworks using Python, ensuring performance, scalability, and maintainability.
  • Supporting DBT integration and best practices for transformation pipelines within Databricks.
  • Applying software engineering principles, including:
    • Source control, automated testing, and CI/CD
    • Design patterns and reusable solutions
    • Coding standards and patterns
  • Collaborating with technical solution authorities to ensure alignment with governance, design decisions, and platform standards.
  • Collaborating closely with the Cloud Architecture and Data Architecture teams to deliver approved solutions.
  • Managing stakeholders: taking ownership of requirements, communicating effectively across teams, and delivering high-quality solutions.
Qualifications

You must be:

  • Willing to work on client sites, potentially for extended periods.
  • Willing to travel for work purposes and be happy to stay away from home for extended periods.
  • Eligible to work in the UK without restriction.
Additional Information

What we offer:

  • 25 days holiday + bank holidays.
  • 5 days holiday buy/sell option.
  • Private medical insurance.
  • Life cover.
  • Cycle to work scheme.
  • Eligibility for company pension scheme (5% employer contribution, salary sacrifice option).
  • Employee assistance programme.
  • Bespoke online learning via Udemy for Business.