
Senior Data Engineer

AXIO Group

Greater London

Hybrid

GBP 60,000 - 80,000

Full time


Job summary

A leading data platform provider in the UK seeks an experienced Data Engineer to optimize ETL and streaming data pipelines using Azure Databricks. In this role, you will design and implement data solutions and collaborate with cross-functional teams to ensure data quality and performance. Candidates should have proficiency in Python, SQL, and data orchestration tools such as Apache Airflow. This position offers flexible working arrangements and a comprehensive benefits package including health insurance and annual leave incentives.

Benefits

Free daily lunch
Flexible working arrangements
Remote work options for 1 month annually
Health insurance
Company bonus scheme
Generous annual leave policy
Team-building activities

Qualifications

  • Proven track record in data engineering, focusing on ETL development and streaming architectures.
  • Experience with Azure Databricks, Apache Spark (Structured Streaming), and Delta Lake.
  • Proficiency in Python and SQL for transforming large datasets.
  • Experience in data orchestration and workflow automation with Apache Airflow.
  • Experience in a cloud data environment, preferably Azure.
  • Familiarity with streaming or messaging technologies.
  • Strong understanding of data quality and validation practices.
  • Ability to deliver production-grade solutions.
  • Experience with CI/CD practices using Azure DevOps or similar.
  • Excellent analytical and communication skills.
  • Strong understanding of modern data engineering patterns.

Responsibilities

  • Design and optimize scalable ETL and streaming pipelines in Azure Databricks.
  • Implement data ingestion and processing pipelines to consolidate heterogeneous data sources.
  • Monitor data quality using automated practices.
  • Develop orchestration workflows in Apache Airflow.
  • Build reusable frameworks for error handling and auditing.
  • Collaborate across teams to define SLAs and data contracts.
  • Optimize Spark and Delta Lake performance.
  • Implement CI/CD and automation for data workflows.
  • Mentor engineers and contribute to design discussions.

Skills

Data engineering
ETL development
Streaming data architectures
Python (PySpark)
SQL
Cloud data environment
Apache Airflow
Data quality practices
CI/CD
Collaboration skills

Tools

Azure Databricks
Apache Spark
Delta Lake
Kafka
Terraform

Job description

About the Company

OAG is a leading data platform for the global travel industry, offering an industry‑first single source for supply, demand, and pricing data. We empower the industry with high‑quality, relevant datasets covering the whole journey, from planning to customer experience. OAG is headquartered in the UK, with operations in the USA, Denmark, France, Germany, Singapore, Japan, China, and Lithuania.

About the Project / Role

You will join a diverse team of six multi‑skilled software and data engineers tasked with modernising OAG's Flight Status data ingestion, processing, and aggregation mechanisms. The initiative will deliver a platform built on industry standards and best practices, a crucial enabler of OAG's core business. Our focus is on data pipelines. The role is open to candidates located anywhere in Lithuania.

You Will
  • Design, build, and optimize scalable ETL and Structured Streaming pipelines in Azure Databricks for real‑time and batch ingestion of Flight Status data (a simplified sketch follows this list).
  • Design and implement data ingestion and processing pipelines that consolidate heterogeneous data sources, including APIs, event streams, and file‑based feeds, into the OAG lakehouse (Azure Databricks + Delta Lake), ensuring data consistency, reliability, and scalability.
  • Implement and monitor data quality using automated validation, alerting, and observability practices.
  • Develop and maintain orchestration workflows in Apache Airflow, coordinating ingestion and transformation processes across multiple data flows.
  • Build reusable frameworks for schema evolution, error handling, deduplication, and auditing.
  • Collaborate with data platform, analytics, and product teams to define SLAs, data contracts, and performance targets.
  • Optimize Spark and Delta Lake performance for scalability, latency, and cost efficiency.
  • Implement CI/CD pipelines and automation for data workflows using Azure DevOps or equivalent tools.
  • Mentor engineers, review code, contribute to platform design discussions and planning, and help grow data engineering competencies in the team and across OAG.
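
As a simplified illustration of the streaming work described in the first bullet above, the sketch below reads a hypothetical flight-status feed from Kafka and appends it to a Delta table with watermark-based deduplication. The broker address, topic, event schema, and storage paths are placeholders, not OAG's actual configuration.

```python
# Minimal PySpark Structured Streaming sketch: Kafka -> Delta Lake.
# Broker, topic, schema, and paths below are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("flight-status-ingest").getOrCreate()

# Hypothetical shape of a flight-status event.
event_schema = StructType([
    StructField("flight_id", StringType()),
    StructField("status", StringType()),
    StructField("updated_at", TimestampType()),
])

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "flight-status")              # placeholder topic
    .load()
)

events = (
    raw.select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
    .withWatermark("updated_at", "10 minutes")
    .dropDuplicates(["flight_id", "updated_at"])  # basic dedup within the watermark
)

(
    events.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/flight_status")  # placeholder
    .outputMode("append")
    .start("/mnt/delta/flight_status")  # placeholder table location
)
```

In a Databricks setting, a job like this would typically run as a continuous or triggered streaming task, with the checkpoint location providing the exactly-once recovery guarantees the role's reliability goals imply.
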
Essential
  • Proven track record in data engineering, with a strong focus on ETL development and streaming data architectures.
  • Experience with Azure Databricks, Apache Spark (Structured Streaming), and Delta Lake.
  • Proficiency in Python (PySpark) and SQL, with experience transforming large‑scale, complex datasets.
  • Hands‑on experience in data orchestration and workflow automation (e.g., Apache Airflow or similar; a minimal DAG sketch follows this list).
  • Experience working in a cloud data environment (preferably Azure) across storage, compute, and pipeline services.
  • Familiarity with streaming or messaging technologies (e.g., Kafka, Event Hubs).
  • Strong understanding of data quality, validation, and observability practices.
  • Ability to deliver production‑grade solutions with a results‑oriented and ownership‑driven mindset.
  • Experience implementing CI/CD and version‑control practices using Azure DevOps, GitHub Actions, or similar tools.
  • Excellent analytical, communication, and collaboration skills.
  • Strong understanding of modern data engineering patterns and ability to design scalable, modular, and reliable data systems.
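
A minimal Airflow DAG of the kind referenced in the orchestration bullet above might chain an ingestion task to a transformation task, as sketched below; the DAG id, schedule, and task bodies are illustrative assumptions only.

```python
# Minimal Airflow DAG sketch: ingest, then transform, once per hour.
# DAG id, schedule, and task bodies are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest():
    # Placeholder: pull from APIs, event streams, and file-based feeds.
    print("ingesting flight-status feeds")

def transform():
    # Placeholder: trigger the downstream Databricks transformation job.
    print("running transformations")

with DAG(
    dag_id="flight_status_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    ingest_task >> transform_task
```
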
Desirable
  • Knowledge of data governance and lineage tools (e.g., Unity Catalog).
  • Experience with data‑quality frameworks (e.g., Great Expectations, Soda); a hand‑rolled stand‑in is sketched after this list.
  • Understanding of real‑time ingestion design patterns and schema evolution techniques.
  • Exposure to containerized environments and infrastructure‑as‑code practices (e.g., Terraform).
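
As a rough stand-in for a dedicated data-quality framework such as Great Expectations or Soda, a hand-rolled PySpark quality gate could look like the sketch below; the column names and the set of allowed status values are assumptions made for illustration.

```python
# Hand-rolled PySpark data-quality gate, standing in for a framework
# such as Great Expectations or Soda. Columns and rules are assumptions.
from pyspark.sql import DataFrame
from pyspark.sql import functions as F

ALLOWED_STATUSES = ["SCHEDULED", "DEPARTED", "LANDED", "CANCELLED"]  # hypothetical

def validate_flight_status(df: DataFrame) -> None:
    """Raise if basic expectations on a flight-status batch are violated."""
    null_ids = df.filter(F.col("flight_id").isNull()).count()
    if null_ids:
        raise ValueError(f"{null_ids} rows are missing flight_id")

    bad_status = df.filter(~F.col("status").isin(ALLOWED_STATUSES)).count()
    if bad_status:
        raise ValueError(f"{bad_status} rows carry an unexpected status value")
```

In practice such checks would run inside the pipeline, or as a dedicated Airflow task, so that a failed expectation halts the flow and triggers alerting rather than silently propagating bad data.
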
We Will Offer You
  • A dynamic work environment at OAG that fosters innovation in a progressive, non‑hierarchical culture, where passionate tech enthusiasts continuously develop new products and solutions for the aviation industry.
  • Company‑provided free lunch every day.
  • A modern office in a convenient location, with flexible working arrangements that allow remote work or office attendance.
  • Work remotely from anywhere in the world for 1 month each year.
  • An attractive compensation and benefits package, including private health insurance, a company bonus scheme, and voluntary participation in a company‑supported retirement scheme.
  • A generous annual leave policy, growing with each year of service, and a day off during your birthday month.
  • Participation in team‑building activities, team workshops, and group learning sessions.
Salary and Benefits

The exact salary offered to you will be based on your qualifications, competencies, professional experience, and the requirements of the corresponding job function (salary starts from EUR gross per month).

Equal Opportunity Statement

OAG is an Equal Opportunity Employer. We ensure all applicants are considered for employment without discrimination.
