Data Engineer

Hanson Wade

City Of London

Hybrid

GBP 60,000 - 65,000

Full time


Job summary

Hanson Wade, a London-based conferences and life-science information company, seeks an experienced Data Engineer to join its Data Engineering team. The role involves developing and maintaining the group's Data Platform, with a focus on continuous delivery and collaboration across business units. Ideal candidates will have strong skills in Databricks, Python, and SQL. The position offers a salary of £60,000 - £65,000, hybrid working options, and a comprehensive benefits package.

Benefits

Private health and life insurance
Hybrid working arrangement
Annual leave up to 30 days
Access to Wader Hub benefits platform
Volunteer day
Professional education sponsorship
Individual career coaching

Qualifications

  • Excellent knowledge of the Databricks Platform with commercial experience.
  • Expertise in writing efficient Python code.
  • Good experience writing batch and streaming code in Apache Spark.

Responsibilities

  • Build and maintain Databricks infrastructure.
  • Create and maintain efficient Data Pipelines.
  • Collaborate with stakeholders to improve deliverability.

Skills

Databricks Platform knowledge
Python coding
SQL Server expertise
Apache Spark (PySpark)
Git version control
Data Quality testing
Knowledge of security best practices

Tools

GitHub
Terraform
Linux (Ubuntu)

Job description

We are looking for an experienced Data Engineer to join us and act as an individual contributor within the Data Engineering team at Hanson Wade. This is a technical, hands-on role that will require the development and maintenance of Hanson Wade Group's Data Platform system.

The Data Engineering team forms part of the Technology department and supports the Hanson Wade Group businesses: our Conferences business serves the BioPharma industry through life-science events, and our Beacon business serves the same industry by providing drug-development information. We have been standardizing all our data flows in the Data Platform while taking on new projects, so this role will be a mix of brownfield and greenfield work.

Responsibilities
  • Work as part of the Data Engineering team to build and maintain our existing Databricks infrastructure.
  • Create and maintain efficient Data Pipelines integrating various sources and destinations.
  • Use infrastructure as code (IaC) as a basis for a strong focus on continuous delivery.
  • Establish procedures and tooling for monitoring, alerting, and responding to incidents promptly.
  • Collaborate with stakeholders from across the business to improve our deliverability.
Required Skills
  • Excellent knowledge of the Databricks Platform, backed up by demonstrable commercial experience.
  • Expertise in writing efficient Python code.
  • Expertise working with SQL Server and PostgreSQL, and writing advanced SQL code.
  • Expertise with Python data libraries such as pandas/Polars, Requests, SQLAlchemy, and Beautiful Soup.
  • Good experience writing batch and streaming code in Apache Spark (PySpark).
  • Experience with code version control using Git (we use GitHub).
  • Good expertise in continuous integration (CI) and continuous deployment (CD) pipelines using Bash scripts, Python scripts, and GitHub Actions.
  • Good expertise in data quality testing, both manual and automated.
  • Good knowledge of working with Linux VMs (Ubuntu) and WSL.
  • Knowledge of security best practices, tools, and documentation standards.
Desirable Skills
  • Experience with the open-source ETL framework Airbyte and writing custom connectors (this is part of our core tech stack, and we can train on this).
  • Deploying infrastructure components using Terraform modules (for Databricks, Azure services, and Airbyte).
  • Experience with agile methodologies (e.g., Kanban, Scrum).
  • Knowledge of Microsoft Azure cloud services such as Azure Functions, Storage Accounts, and Key Vaults.
  • Some knowledge of Terraform for IaC and experience working with Azure Kubernetes Service.
Why Choose Us?
  • Private health and life insurance
  • Hybrid working arrangement - 1-2 days a week in the London office
  • 1 extra day of annual leave each year, up to 30 days of annual leave (not including public holidays)
  • Access to our Wader Hub benefits platform, which includes retail, gym, hospitality, and wellness discounts
  • Volunteer day (we offer all our employees the chance to take a day on us to get out there and do some good!)
  • Opportunities for professional education sponsorship
  • Access to individual career coaching to develop your career from day one

Salary: £60,000 - £65,000 + bonus
