Senior Data Engineer

nCino

Greater London

On-site

GBP 60,000 - 80,000

Full time

15 days ago

Job summary

A leading cloud banking firm in Greater London seeks an experienced Data Engineer to design and implement data pipelines, support critical cloud transitions, and collaborate with product teams. The role requires expertise in AWS, programming skills in Python or Scala, and experience in data infrastructure projects. Candidates should have a strong understanding of data modeling and be able to work effectively in mission-critical environments. This is a full-time position with a focus on innovative data solutions.

Qualifications

  • 6 years as a data engineer or equivalent, with a proven track record on complex ETL and data infrastructure projects.
  • Expert knowledge in programming languages such as Python, Scala, or Java.
  • Deep understanding of data modelling, distributed computing, and data pipeline architecture.

Responsibilities

  • Design and implement scalable, reliable data pipelines.
  • Support migration of critical data infrastructure from GCP to AWS.
  • Partner with product teams to translate requirements into data solutions.

Skills

Python
Scala
AWS
Data Modelling
ETL
Apache Spark
PostgreSQL

Tools

Databricks
AWS Lambda
Docker
Kubernetes

Job description

Overview

nCino offers exciting career opportunities for individuals who want to join the worldwide leader in cloud banking.

Leads planning, designing, development, and testing of simple software systems or applications for software enhancements and new products, including cloud-based or internet-related tools. Guides the team to support clients' project objectives. Troubleshoots client issues as they arise.

Data Infrastructure & Optimisation (Core Focus)
  • Design and implement scalable, reliable data pipelines that support product features and platform initiatives
  • Own performance optimisation of data storage, retrieval, and processing systems
  • Design data models and pipeline architectures that support both current and future requirements
  • Build and maintain robust monitoring and alerting strategies to ensure system reliability (see the sketch after this list)
  • Collaborate with the Lead Engineer on data architecture decisions and technical standards
  • Establish best practices and standards for data engineering within the organisation
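
A minimal, non-authoritative sketch of the monitoring and alerting point above, using boto3 against a hypothetical custom CloudWatch metric; the namespace, alarm name, and SNS topic ARN are illustrative assumptions, not taken from the posting.

    import boto3

    # Sketch only: namespace, metric, alarm name and SNS topic ARN are hypothetical.
    cloudwatch = boto3.client("cloudwatch", region_name="eu-west-2")

    cloudwatch.put_metric_alarm(
        AlarmName="data-pipeline-failed-runs",
        Namespace="DataPlatform/Pipelines",
        MetricName="FailedRuns",
        Statistic="Sum",
        Period=300,                       # evaluate in 5-minute windows
        EvaluationPeriods=1,
        Threshold=1,
        ComparisonOperator="GreaterThanOrEqualToThreshold",
        AlarmActions=["arn:aws:sns:eu-west-2:123456789012:pipeline-alerts"],
        TreatMissingData="notBreaching",
    )
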
AWS Platform Transition (Immediate Priority)
  • Support migration of critical data infrastructure from GCP to AWS, prioritising data integrity and continuity
  • Optimise data storage strategies through the transition (PostgreSQL from GCP to AWS, Elasticsearch to OpenSearch)
  • Build data ingestion layers using Lambda and EC2 where appropriate (see the sketch after this list)
  • Document new operational standards and patterns for the AWS environment
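
A rough sketch of the Lambda-based ingestion layer referenced above, assuming incoming records are landed in S3; the bucket name, key layout, and event shape are assumptions for illustration only.

    import json
    from datetime import datetime, timezone

    import boto3

    # Sketch only: bucket name, key layout and event shape are hypothetical.
    s3 = boto3.client("s3")
    LANDING_BUCKET = "example-data-landing"


    def handler(event, context):
        """Lambda entry point: write each incoming record to an S3 landing zone."""
        records = event.get("records", [])                    # assumed event shape
        prefix = datetime.now(timezone.utc).strftime("ingest/%Y/%m/%d/%H%M%S")
        for i, record in enumerate(records):
            s3.put_object(
                Bucket=LANDING_BUCKET,
                Key=f"{prefix}/record-{i}.json",
                Body=json.dumps(record).encode("utf-8"),
            )
        return {"written": len(records)}
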
Cross-Functional Collaboration
  • Partner with product teams to understand requirements and translate them into robust data solutions
  • Work alongside the Senior Software Engineer and Lead Engineer on API data contracts and integration points
  • Design and implement data pipelines using AWS-native services and Databricks as the primary compute layer (see the sketch after this list)
  • Support incoming product initiatives with data infrastructure recommendations
  • Bring discipline to technical decision-making and trade-off analysis
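
A minimal PySpark sketch of the kind of Databricks-centred pipeline described above, assuming hypothetical S3 source and Delta target paths; the column names and transformations are illustrative only.

    from pyspark.sql import SparkSession, functions as F

    # Sketch only: paths, columns and the dedup key are hypothetical.
    spark = SparkSession.builder.appName("example-product-events").getOrCreate()

    raw = spark.read.format("json").load("s3://example-raw-bucket/product-events/")

    cleaned = (
        raw.dropDuplicates(["event_id"])                      # assumed unique key
        .withColumn("event_date", F.to_date("event_ts"))      # assumed timestamp column
        .filter(F.col("event_type").isNotNull())
    )

    (
        cleaned.write.format("delta")
        .mode("append")
        .partitionBy("event_date")
        .save("s3://example-curated-bucket/product-events/")
    )
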
Execution Focus
  • Maintain clarity on priorities and escalate blockers early
  • Deliver on commitments with minimal scope creep
  • Balance migration work with product development pipeline needs
  • Communicate risks, dependencies, and timeline concerns proactively
Required Experience
  • 6 years as a data engineer or equivalent, with a proven track record on complex ETL and data infrastructure projects
  • Expert knowledge in one or more programming languages (Python, Scala, Java, or similar)
  • Deep understanding of data modelling, distributed computing, and data pipeline architecture
  • Strong experience with data processing frameworks and cloud data platforms
  • Experience optimising relational and NoSQL databases for performance and scalability
  • Solid understanding of data structures, algorithms, and system design principles
  • AWS cloud platform expertise
  • Familiarity with Unix systems, command-line tools, and version control (git)
  • Ability to work in a prioritised, focused manner on mission-critical work
  • Experience in regulated or high-reliability environments (FinTech, healthcare, etc.) is a strong plus
Desired Qualifications
  • Experience with Databricks or Apache Spark for distributed data processing
  • AWS cloud platform expertise in building scalable applications and leveraging IaC
  • Knowledge of API design and data contract principles
  • Experience with modern data orchestration tools (Airflow, Dagster, etc.; see the sketch after this list)
  • Understanding of data governance and compliance requirements in regulated industries
  • Exposure to agile development methodologies and collaborative team environments
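
For the orchestration point above, a minimal Airflow (2.4+) sketch of a daily DAG with a single placeholder task; the DAG id, schedule, and task body are hypothetical.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    # Sketch only: DAG id, schedule and task body are hypothetical.

    def ingest_daily_batch():
        """Placeholder for a daily ingest step (e.g. triggering a Databricks job)."""
        print("Ingesting daily batch...")


    with DAG(
        dag_id="example_daily_ingest",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        PythonOperator(
            task_id="ingest_daily_batch",
            python_callable=ingest_daily_batch,
        )
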
Technical Stack

Current & Migration Focus

  • AWS (core platform, expanding from GCP)
  • Databricks (primary data processing layer)
  • PostgreSQL (relational database)
  • OpenSearch (search and analytics, migrated from Elasticsearch)
  • Lambda & EC2 (data ingestion and orchestration)
  • Kubernetes & Docker (containerisation)

Languages & Frameworks

  • Data Pipelines: Databricks (Python & Scala / PySpark), TypeScript with the AWS SDK
  • API development and data integration patterns: Kotlin, PHP & TypeScript

nCino provides equal employment opportunities to all employees and applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, protected veteran status, disability, genetics, or other protected status. In addition to federal law requirements, nCino complies with applicable state and local laws governing nondiscrimination in employment in every location in which the company has facilities. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation, and training.

nCino is committed to the full inclusion of all qualified individuals. As part of this commitment, nCino will ensure that persons with disabilities are provided reasonable accommodations. If reasonable accommodation is needed to participate in the job application or interview process, to perform essential job functions, and/or to receive other benefits and privileges of employment, please contact us at .

Our commitment to inclusion and equality includes a strong belief that the diversity of our team is instrumental to our success. We strive to create workplaces where employees are empowered to bring their authentic selves to work.

Required Experience :

Senior IC

Key Skills

Apache Hive, S3, Hadoop, Redshift, Spark, AWS, Apache Pig, NoSQL, Big Data, Data Warehouse, Kafka, Scala

Employment Type : Full-Time

Experience : years

Vacancy : 1
