Databricks Data Engineer

Square One Resources

Glasgow

Hybrid

GBP 100,000 - 125,000

Full time


Job summary

A leading financial services company in Glasgow is looking for a skilled Databricks Data Engineer for a contract position until the end of 2026. This role involves designing data pipelines and developing serverless applications using AWS. The ideal candidate should have a strong background in Python and AWS services. The position requires office attendance two days a week.

Qualifications

  • Strong coding background in Python and PySpark.
  • Hands-on experience with various AWS services including S3, Lambda, Glue, and others.
  • Proficiency in CloudFormation for infrastructure automation.

Responsibilities

  • Design, develop, and maintain robust data pipelines and ETL workflows using AWS.
  • Implement scalable data processing solutions using PySpark and AWS Glue.
  • Build and manage infrastructure as code using CloudFormation.

Skills

Python
PySpark
AWS S3
AWS Lambda
AWS Glue
AWS Step Functions
AWS Athena
AWS SageMaker
AWS IAM
GitLab

Tools

CloudFormation
ECS

Job description

Job Title: Databricks Data Engineer
Location: Glasgow - 2 days per week in the office
Salary/Rate: Up to £400 per day inside IR35
Start Date: 24/11/2025
Job Type: Contract

Company Introduction
We have an exciting opportunity available with one of our sector-leading financial services clients! They are currently looking for a skilled Databricks Data Engineer to join their team on a contract running until the end of 2026.

Job Responsibilities/Objectives

We are seeking a skilled and hands‑on AWS Data Engineer with strong coding expertise and deep experience in building scalable data solutions using AWS services. The ideal candidate will have a solid background in data engineering, Python development, and cloud-native architecture.

  • Design, develop, and maintain robust data pipelines and ETL workflows using AWS services.
  • Implement scalable data processing solutions using PySpark and AWS Glue.
  • Build and manage infrastructure as code using CloudFormation.
  • Develop and deploy serverless applications using AWS Lambda, Step Functions, and S3.
  • Perform data querying and analysis using Athena.
  • Collaborate with Data Scientists to operationalize models using SageMaker.
  • Ensure secure and compliant data handling using IAM, KMS, and VPC configurations.
  • Containerize applications using ECS for scalable deployment.
  • Write clean, testable code in Python, with a strong emphasis on unit testing.
  • Use GitLab for version control, CI/CD, and collaboration.
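The pipeline and ETL responsibilities above follow the classic extract-transform-load shape. Since Glue and PySpark jobs need live AWS infrastructure, the sketch below illustrates that shape with only the Python standard library; the field names (`account_id`, `amount`) and the filtering rule are illustrative assumptions, not the client's schema.

```python
import csv
import io
import json

def extract(csv_text):
    """Extract: parse raw CSV rows (in a Glue job, this would be a read from S3)."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Transform: cast types and drop malformed records (illustrative rule)."""
    out = []
    for row in rows:
        try:
            amount = float(row["amount"])
        except (KeyError, ValueError):
            continue  # skip bad rows rather than failing the whole job
        out.append({"account_id": row["account_id"], "amount": round(amount, 2)})
    return out

def load(records):
    """Load: serialise to JSON lines (in Glue, a write to the target table)."""
    return "\n".join(json.dumps(r) for r in records)

raw = "account_id,amount\nA1,10.5\nA2,not-a-number\nA3,3.14159\n"
print(load(transform(extract(raw))))
```

In a real Glue/PySpark job the same three stages would be Spark DataFrame reads, transformations, and writes, orchestrated by Step Functions as described above.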

Required Skills/Experience

The ideal candidate will have the following:

  • Strong coding background in Python and PySpark.
  • Hands‑on experience with the following AWS services: S3, Lambda, Glue, Step Functions, Athena, SageMaker, VPC, ECS, IAM, KMS.
  • Proficiency in CloudFormation for infrastructure automation.
  • Experience with unit testing frameworks and best practices.
  • Familiarity with GitLab for source control and CI/CD.
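On the unit-testing requirement: Python's built-in `unittest` framework is a common baseline. The function under test below (a masking helper of the kind used for compliant data handling) is a hypothetical example, not the client's code.

```python
import unittest

def mask_account_id(account_id):
    """Hypothetical transform: keep the last 4 characters, mask the rest."""
    if len(account_id) <= 4:
        return account_id
    return "*" * (len(account_id) - 4) + account_id[-4:]

class TestMaskAccountId(unittest.TestCase):
    def test_masks_long_ids(self):
        self.assertEqual(mask_account_id("GB29NWBK6016"), "********6016")

    def test_short_ids_unchanged(self):
        self.assertEqual(mask_account_id("1234"), "1234")
```

Tests like these would run with `python -m unittest` in a GitLab CI/CD pipeline stage.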

If you are interested in this opportunity, please apply now with your updated CV in Microsoft Word/PDF format.

Disclaimer

Notwithstanding any guidelines given as to the level of experience sought, we will consider candidates from outside this range if they can demonstrate the necessary competencies.
Square One is acting as both an employment agency and an employment business, and is an equal opportunities recruitment business. Square One embraces diversity and will treat everyone equally. Please see our website for our full diversity statement.
