Data Engineer

NatWest Group

City of London

On-site

GBP 70,000 - 90,000

Full time

4 days ago

Job summary

A leading UK banking institution seeks a Data Engineer to drive innovative data-driven solutions. You will simplify operations through advanced data pipelines and ETL design, enhancing customer-centric experiences. Key requirements include expertise in Snowflake, proficiency in Airflow and AWS services, and excellent stakeholder communication. The role supports strategic initiatives while keeping customer and bank data safe and secure. Ideal candidates will have a strong background in Python and the ability to lead technical initiatives and mentor junior colleagues.

Qualifications

  • Strong experience of Snowflake for data warehousing, including writing efficient SQL.
  • Proficiency in Airflow for orchestration and workflow management.
  • Hands-on experience with AWS services, particularly S3 and Lambda.
  • Excellent communication skills and stakeholder management.
  • Expert knowledge of ETL/ELT processes and data warehousing.
  • Experience with Kafka integration in streaming pipelines.
  • Proficiency in Python for data engineering tasks.
  • Ability to lead initiatives and mentor junior colleagues.

Responsibilities

  • Build advanced automation of data engineering pipelines.
  • Embed new data techniques through training and oversight.
  • Deliver a clear understanding of data platform costs to meet cost-saving and income targets.
  • Source new data using appropriate tooling.
  • Develop solutions for streaming data ingestion and transformations.

Skills

Snowflake for data warehousing
Airflow orchestration
AWS services (S3 and Lambda)
SQL proficiency
Python for data engineering
Kafka concepts (producers, consumers, topics)
ETL/ELT processes
Data modelling capabilities
Stakeholder management
Version control systems (Git)

Job description

  • You'll be the voice of our customers, using data to tell their stories and put them at the heart of all decision‑making
  • We'll look to you to drive the build of effortless, digital‑first customer experiences
  • If you're ready for a new challenge and want to make a far‑reaching impact through your work, this could be the opportunity you're looking for

What you’ll do

As a Data Engineer, you’ll be looking to simplify our organisation by developing innovative data-driven solutions through data pipelines, modelling and ETL design, aspiring to be commercially successful while keeping our customers’ and the bank’s data safe and secure. You’ll drive customer value by understanding complex business problems and requirements, and by applying the most appropriate and reusable tools to gather and build data solutions. You’ll support our strategic direction by engaging with the data engineering community to deliver opportunities, along with carrying out complex data engineering tasks to build a scalable data architecture.


Responsibilities

  • Building advanced automation of data engineering pipelines through removal of manual stages
  • Embedding new data techniques into our business through role modelling, training, and experiment design oversight
  • Delivering a clear understanding of data platform costs to meet your department’s cost saving and income targets
  • Sourcing new data using the most appropriate tooling for the situation
  • Developing solutions for streaming data ingestion and transformations in line with our streaming strategy
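
For context on the streaming work described in the last point above, here is a minimal, purely illustrative sketch of a Kafka consumer in Python (using the kafka-python client). The topic name, broker address and field names are hypothetical placeholders, not details of the bank’s actual pipelines.

```python
import json

from kafka import KafkaConsumer

# Hypothetical topic and broker; placeholders for illustration only.
consumer = KafkaConsumer(
    "payments-events",
    bootstrap_servers=["localhost:9092"],
    group_id="streaming-ingestion-demo",
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # Apply a light transformation before loading downstream
    # (e.g. into a Snowflake staging table in a real pipeline).
    event["amount"] = round(float(event.get("amount", 0)), 2)
    print(event)
```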

Requirements

  • Strong experience of Snowflake for data warehousing along with writing efficient SQL and managing schemas
  • Proficiency in Airflow for orchestration and workflow management, as well as hands-on experience with AWS services, particularly S3 and Lambda (see the sketch after this list)
  • Excellent communication skills with the ability to proactively engage and manage a wide range of stakeholders
  • Expert-level knowledge of ETL/ELT processes, along with in-depth knowledge of data warehousing and data modelling
  • Experience with Kafka concepts such as producers, consumers and topics, with the ability to integrate them into streaming pipelines
  • Proficiency in Python for data engineering, along with version control systems such as Git
  • Ability to lead technical initiatives along with experience of mentoring junior colleagues
  • Knowledge of Snowflake performance tuning would be hugely beneficial
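
To illustrate how several of the requirements above fit together in practice, here is a minimal, hypothetical sketch of an Airflow DAG that loads a staged S3 extract into Snowflake. The DAG id, bucket, table name and schedule are placeholders, and a production pipeline would use the Amazon and Snowflake provider operators rather than a print stub.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def load_s3_extract_to_snowflake(**context):
    # Stub: a real task would use the Amazon/Snowflake provider
    # hooks or operators to copy the staged file into a target table.
    print(
        "COPY s3://example-bucket/daily_extract.csv "
        "INTO ANALYTICS.STAGING.DAILY_EXTRACT"
    )


with DAG(
    dag_id="s3_to_snowflake_daily",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(
        task_id="load_s3_extract",
        python_callable=load_s3_extract_to_snowflake,
    )
```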