Senior Data Engineer - 12 Month FTC

BMS Group

London

On-site

GBP 70,000 - 90,000

Full time

8 days ago

Job summary

A prominent data-driven organization in London is seeking a Senior Data Engineer to own and manage data engineering pipelines on a Lakehouse platform in Azure. In this pivotal role, you will lead a small team, ensure the delivery of efficient data solutions, and continuously engage with stakeholders to identify business needs. The ideal candidate will have extensive experience in SQL, Python, and managing engineering teams, with a background in the London Insurance Market preferred.

Qualifications

  • Experience as a principal/lead Data Engineer.
  • Experience managing a team of engineers.
  • Experience with large data sets and data engineering pipelines.

Responsibilities

  • Own and manage all data engineering pipelines.
  • Engage with stakeholders to understand business needs.
  • Continuously monitor and optimize pipelines.

Skills

SQL
Python
PySpark
Team management
Data modelling
Agile methodology

Tools

Azure Databricks
Azure DevOps Pipelines

Job description

Position Title: Senior Data Engineer – 12 Month FTC

Reports to: Head of Data Platforms

Location: London

About the Role:

We are seeking a highly motivated and skilled Senior Data Engineer to join our growing Data Platforms team at BMS Group. You will play a pivotal part in supporting BMS's ambition to fully realise the benefits of a Lakehouse platform deployed within Azure. Working under the Head of Data Platforms, you will help manage and implement the data engineering pipelines within Azure that ingest, enrich, and curate structured, semi-structured, and unstructured data from our global business. You will also help lead a small team of engineers, whom you will be responsible for developing, to support the business objective of delivering a scalable and efficient platform.

Key Responsibilities:

  • Responsible for owning and managing all data engineering pipelines and layers of the Lakehouse platform.
  • Responsible for managing the prioritisation of the data engineering backlog.
  • Responsible for managing the peer review and testing process of any pipeline releases.
  • Continuously engage with the Head of Data Platforms, the Group Head of Data Strategy and Governance, and Architecture to understand the evolving needs of the business so that the Lakehouse platform can support them.
  • Own, with the support of the Head of Data Platforms, the definition of standards and best practices for BMS's data engineering pipelines, covering both code and documentation.
  • Continuously monitor and analyse pipelines to identify opportunities for optimisation and efficiencies.
  • Work within plan-driven (Waterfall) or iterative (Agile) delivery methodologies based on project requirements.
  • Continuously learn and develop your skills to stay ahead of the curve in the evolving data landscape.
  • Develop and implement generalised data engineering pipelines, based on patterns, to create efficient, scalable, and manageable pipelines.
  • Collaborate with other central IT functions to ensure that the necessary access to systems and technology is granted based on the evolving needs of the team.
  • Build a comprehensive understanding of both technical and business domains.
  • Collaborate with cross-functional teams to understand and address data engineering and data needs.

Knowledge and Skills:

  • Experience working as a principal/lead Data Engineer.
  • Experience working with large data sets and proficiency in SQL, Python and PySpark.
  • Experience managing a team of engineers with varying levels of experience within data engineering.
  • Experience deploying pipelines within Azure Databricks in line with the medallion architecture framework.
  • Experience using SQL, Python and PySpark to build data engineering pipelines.
  • Understanding of how to define best practices in relation to documentation standards as well as code standards.
  • Understanding of data modelling approaches and standards.
  • Understanding of semantic modelling techniques and how data is consumed from a Lakehouse to support them.
  • Experience building Azure DevOps Pipelines.
  • Excellent communication and problem-solving skills.
  • Experience working within an agile environment.
  • Ability to assist with the upskilling and continued development of junior members of the team.

Desired skills and experience:

  • Experience in the London Insurance Market or the wider Financial Services sector.
  • Experience building and deploying machine learning pipelines into data engineering pipelines.
  • Experience using metadata-driven data engineering approaches for ingestion and transformations within data pipelines.
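For candidates unfamiliar with the term, the metadata-driven approach in the last bullet can be sketched in plain Python: each dataset's ingestion and transformation logic is described declaratively in a metadata entry, and one generic runner executes it. All names here (`METADATA`, `run_pipeline`, the `policies` dataset) are illustrative assumptions, not details from this role; in practice the same pattern would typically drive PySpark jobs in Azure Databricks rather than in-memory lists.

```python
# Hypothetical sketch of a metadata-driven pipeline: dataset-specific
# behaviour lives in a declarative metadata entry; the runner is generic.
METADATA = {
    "policies": {
        # "ingest" source rows (stand-in for files/tables in a real lake)
        "source": [{"id": "P1", "premium": "100"}, {"id": "P2", "premium": "250"}],
        "casts": {"premium": int},                  # column -> target type
        "filters": [lambda row: row["premium"] > 0],  # curation predicates
    },
}

def run_pipeline(name: str, metadata: dict) -> list[dict]:
    """Ingest, enrich, and curate one dataset purely from its metadata entry."""
    spec = metadata[name]
    rows = [dict(r) for r in spec["source"]]        # ingest (defensive copy)
    for row in rows:                                # enrich: apply type casts
        for col, cast in spec.get("casts", {}).items():
            row[col] = cast(row[col])
    for predicate in spec.get("filters", []):       # curate: filter rows
        rows = [r for r in rows if predicate(r)]
    return rows

curated = run_pipeline("policies", METADATA)
```

Adding a new dataset then means adding a metadata entry, not writing a new pipeline, which is what makes the approach attractive for the generalised, pattern-based pipelines described above.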

Success Metrics:

  • Manage and maintain high-quality and efficient data engineering pipelines to meet the technical requirements of the business.
  • Share knowledge and expertise to develop and upskill your team to become better and more effective data engineers.
  • Contribute to our team culture by creating a positively challenging environment.