
Data Engineer

0000050079 London branch of Royal Bank of Canada

London

On-site

GBP 50,000 - 75,000

Full time

3 days ago

Job summary

A leading company in the financial sector is seeking a Data Engineer for its London office. The role involves maintaining and evolving the Data Lakehouse platform, working closely with business and technology teams across Wealth Management Europe. Applicants should have at least two years of experience in data management and related technologies and be ready to contribute to innovative data initiatives.

Benefits

Leadership support and development opportunities
Work with top professionals in the field
Make a meaningful impact
Join a dynamic, collaborative team

Qualifications

  • At least two years of experience in data management disciplines.
  • Experience with cross-functional teams and data initiatives.
  • Strong knowledge of data management architectures and SQL.

Responsibilities

  • Develop and maintain Data Lakehouse infrastructure using Microsoft Azure.
  • Manage data pipelines from data sources to consumption.
  • Utilize SQL and PySpark for reporting and analytics support.

Skills

Data Management
SQL
Big Data Technologies
Data Integration

Tools

Azure Data Factory
Databricks
GitHub
Terraform

Job description

Company:

0000050079 London branch of Royal Bank of Canada

Location:

London, United Kingdom

Job Category:

Other

EU work permit required:

Yes

Job Reference:

f35b5d9038bc

Posted:

04.07.2025

Expiry Date:

18.08.2025

Job Description:

We have an exciting opportunity for a Data Engineer to join our London/Newcastle offices. The successful candidate will work with business and technology teams across Wealth Management Europe (WME) to maintain and evolve the Data Lakehouse platform, focusing on data ingestion, modeling, and platform performance improvements.

Responsibilities:
  • Develop and maintain Data Lakehouse infrastructure using Microsoft Azure, including Databricks and Data Factory.
  • Manage data pipelines from data sources to consumption, optimizing for development and production environments.
  • Utilize SQL and PySpark for reporting and analytics support.
  • Create and maintain Dev, UAT, and Production environments.
  • Automate data preparation and integration tasks using modern tools and techniques.
  • Use version control tools like GitHub and perform schema comparisons.
  • Ensure development follows DevOps best practices and documentation standards.
  • Identify and implement process improvements for data automation and scalability.
  • Engage with teams and stakeholders to foster relationships and thought leadership.
  • Follow Agile methodologies and collaborate in sprints and meetings.
  • Stay informed on new data initiatives and propose innovative solutions.
Requirements:
  • At least two years of experience in data management disciplines, including data integration, modeling, and quality.
  • Experience working with cross-functional teams and supporting data initiatives.
  • Strong knowledge of data management architectures, SQL, and big data technologies.
  • Experience with Azure Data Factory, Databricks, and related tools.
  • Familiarity with DevOps/DataOps principles.
  • Basic understanding of data governance, security, and prototyping.
Nice-to-have:
  • Knowledge of Terraform.
  • Experience with advanced analytics tools and programming languages such as Python, Java, Scala, or R.
Benefits:
  • Leadership support and development opportunities.
  • Work with top professionals in the field.
  • Make a meaningful impact.
  • Join a dynamic, collaborative team.

Note: RBC Group does not accept agency résumés. Please do not forward résumés to employees or other locations. Contact Recruitment for details.
