
Data Engineer

Venn Group

Bristol

Hybrid

GBP 60,000 - 80,000

Full time

12 days ago


Job summary

A recruitment agency is seeking a highly skilled Data Engineer to enhance and optimise data pipelines in a hybrid setting. Key responsibilities include developing Databricks notebooks and ETL pipelines using advanced Python and SQL skills. The ideal candidate will collaborate closely with internal teams and focus on ensuring the reliability and performance of data operations. The role is full time with an initial three-month duration, offering a competitive daily rate.

Qualifications

  • Strong hands-on experience with Databricks, including notebooks, Delta Lake, and Genie bots.
  • Proven background in designing and maintaining ETL pipelines, including debugging and optimisation.
  • Advanced Python skills for transformation, automation, and pipeline development.
  • Solid SQL proficiency for querying, validation, and performance tuning.
  • Experience using GitHub for version control, collaboration, and code management.
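
The SQL bullet above covers querying and validation. As a purely illustrative sketch of the kind of data-quality check that implies (the table, columns, and rows are invented, and on Databricks this would run as Spark SQL against Delta tables rather than SQLite), a typical constraint check might look like:

```python
import sqlite3

# Tiny in-memory table standing in for a Lakehouse table (illustrative only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [(1, 2.5), (2, -1.0), (None, 3.0)],
)

# A typical validation query: count rows violating basic constraints.
bad = conn.execute(
    "SELECT COUNT(*) FROM orders WHERE id IS NULL OR amount <= 0"
).fetchone()[0]
```

In a real pipeline the same query shape would feed an alert or block a downstream write rather than just return a count.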

Responsibilities

  • Develop, optimise, and debug Databricks notebooks, Delta Lake workflows, and Genie-bot automations.
  • Design, maintain, and enhance ETL pipelines, ensuring reliability and performance.
  • Validate, transform, and orchestrate data using Python and SQL within the Lakehouse environment.
  • Manage data ingestion and scheduling processes via Azure Data Factory.
  • Reverse-engineer, document, and communicate existing data flows and pipeline structures for handover to internal teams.
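
The validate-and-transform responsibilities above can be sketched as follows. This is a hedged, minimal stand-in: on Databricks the same shape would typically use PySpark DataFrames (`filter`/`withColumn`) writing to Delta tables, and every record field here is hypothetical.

```python
# Minimal stand-in for the "validate, transform" step of an ETL pipeline.
# Plain Python is used so the sketch is self-contained; field names are invented.

def validate(record):
    """Reject records with a missing key or a non-positive amount."""
    return record.get("id") is not None and record.get("amount", 0) > 0

def transform(record):
    """Normalise the amount to pence and tag the record with a source label."""
    return {
        "id": record["id"],
        "amount_pence": int(round(record["amount"] * 100)),
        "source": "ingest",
    }

def run_pipeline(records):
    """Filter out invalid rows, then apply the transformation to the rest."""
    return [transform(r) for r in records if validate(r)]
```

The split into small, separately testable `validate`/`transform` functions is also what makes the "debugging and optimisation" and "reverse-engineering" duties tractable on an existing codebase.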

Skills

Databricks
ETL pipelines
Python
SQL
GitHub

Tools

Azure Data Factory

Job description

The West of England Combined Authority are seeking a highly skilled Data Engineer to support the optimisation, documentation, and enhancement of our existing data pipelines within the Databricks Lakehouse environment. This role will involve hands‑on development, reverse‑engineering of current solutions, and close collaboration with internal teams to ensure efficient BAU operations and future scalability.

Location: West of England Combined Authority

Set-up: Hybrid – 2-3 days on-site

Rate: £249/day via umbrella, inside IR35

Duration: Initial 3 months, subject to extension

Hours: Full-time
