
Security Services Data Modelling and Engineering Senior Associate

JPMorgan Chase & Co.

City of Westminster

On-site

GBP 50,000 - 70,000

Full time


Job summary

A global financial services leader is seeking a candidate to design and optimize data pipelines on Databricks using Python and Spark. The ideal applicant will possess proven experience in building data solutions and be proficient in ETL concepts. Responsibilities include collaborating with data architects, implementing data quality checks, and maintaining project tracking in Jira. Strong communication and documentation skills are also essential. This role offers the opportunity to contribute to innovative projects in a leading financial institution.

Skills

Python
Databricks
Data modelling
ETL concepts
Spark
Jira

Job description

Responsibilities
  • Design, build, and optimize data pipelines and transformation workflows on Databricks, leveraging Python and Spark.
  • Collaborate with Data Architects and Business Analysts to develop robust data models and clearly document data flows and ETL logic.
  • Implement and execute data quality checks and validation modules using Python.
  • Maintain transparency and accountability by tracking work and progress in Jira.
  • Ensure datasets and pipelines are accurately registered in relevant catalogues and consoles, meeting governance and privacy standards.
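
The posting does not include any code, but the data quality checks and validation modules it mentions might look something like the following minimal sketch. It is illustrative only: the record layout, column names, and rules are assumptions, and in the actual role the checks would more likely run against Spark or pandas DataFrames on Databricks rather than plain Python lists.

```python
# Hypothetical sketch of simple data quality checks over a batch of records.
# Record shape and field names ("trade_id", "amount") are invented for
# illustration; they do not come from the job posting.

def check_not_null(records, column):
    """Return indices of rows where `column` is absent or None."""
    return [i for i, row in enumerate(records) if row.get(column) is None]

def check_unique(records, column):
    """Return the values of `column` that appear more than once."""
    seen, dupes = set(), set()
    for row in records:
        value = row.get(column)
        if value in seen:
            dupes.add(value)
        seen.add(value)
    return sorted(dupes)

records = [
    {"trade_id": "T1", "amount": 100.0},
    {"trade_id": "T2", "amount": None},
    {"trade_id": "T1", "amount": 250.0},
]

null_rows = check_not_null(records, "amount")   # rows failing the not-null rule
dupe_ids = check_unique(records, "trade_id")    # values failing the uniqueness rule
```

In practice such checks would be wrapped into a validation module whose failures are logged and tracked, which is where the Jira-based transparency mentioned above would come in.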

Qualifications
  • Proven experience developing data pipelines and solutions on Databricks.
  • Strong proficiency in Python, including libraries for data transformation (e.g., pandas).
  • Solid understanding of ETL concepts, data modelling, and pipeline design.
  • Experience with Spark and cloud data platforms.
  • Ability to document data flows and transformation logic to a high standard.
  • Familiarity with project management tools such as Jira.
  • Collaborative mindset and strong communication skills.
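
As a rough picture of the ETL pattern these qualifications refer to, here is a deliberately simplified extract-transform-load sketch in plain Python. Everything in it is assumed for illustration: the source rows, the field names, and the in-memory "warehouse" stand in for what would, in this role, be Spark reads and writes against catalogued Databricks tables.

```python
# Illustrative ETL sketch: extract raw rows, transform them (filter plus a
# derived field), and load the result into a target store. All names and
# data are hypothetical; the posting specifies none of this.

def extract():
    # Stand-in for reading from a source system or landing zone.
    return [
        {"account": "A", "balance_pence": 12500},
        {"account": "B", "balance_pence": -300},
        {"account": "C", "balance_pence": 980000},
    ]

def transform(rows):
    # Keep non-negative balances and derive a pounds figure from pence.
    return [
        {"account": r["account"], "balance_gbp": r["balance_pence"] / 100}
        for r in rows
        if r["balance_pence"] >= 0
    ]

def load(rows, target):
    # Stand-in for writing to a governed, catalogue-registered table.
    target.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
```

The same extract/transform/load separation carries over directly to Spark, where `extract` and `load` become DataFrame reads and writes and `transform` becomes a chain of DataFrame operations.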

Preferred qualifications, capabilities, and skills
  • Experience in financial services or large enterprise data environments.
  • Knowledge of data governance, privacy, and compliance requirements.
  • Exposure to business analysis and requirements gathering.

J.P. Morgan is a global leader in financial services, providing strategic advice and products to the world's most prominent corporations, governments, wealthy individuals and institutional investors. Our "first-class business in a first-class way" approach to serving clients drives everything we do. We strive to build trusted, long-term partnerships to help our clients achieve their business objectives. J.P. Morgan's Commercial & Investment Bank is a global leader across banking, markets, securities services and payments. Corporations, governments and institutions throughout the world entrust us with their business in more than 100 countries. The Commercial & Investment Bank provides strategic advice, raises capital, manages risk and extends liquidity in markets around the world.
