Senior Scala Engineer

Beyond

Greater London

On-site

GBP 70,000 - 90,000

Full time

Job summary

A design studio in Greater London is seeking a Senior Scala Engineer to drive the modernization of their core data processing platforms. The ideal candidate will enhance high-traffic data pipelines using GCP Dataflow and collaborate with engineers on innovative solutions. Applicants should have strong proficiency in Scala, solid understanding of data engineering principles, and at least six years of relevant experience. Join a dynamic team committed to designing exceptional customer experiences.

Benefits

Promotes diversity and inclusion
Collaborative work environment

Qualifications

  • 6+ years of software engineering experience in distributed systems.
  • Proven ability to deliver complex features from design to production.
  • Familiarity with CI/CD best practices and infrastructure-as-code tools.

Responsibilities

  • Implement and enhance high-throughput data pipelines using Scala and GCP.
  • Contribute to architectural design for modern data processing.
  • Collaborate with Staff Engineers on data export systems.

Skills

Scala expertise
GCP Dataflow
Data engineering principles
Communication skills

Education

Degree in Computer Science or a related field

Tools

GCP services (BigQuery, Pub/Sub, Cloud Functions)
CI/CD tools (Terraform, Docker, Kubernetes)

Job description

Beyond is Qodea’s Customer Experience Design Studio.

We design the ‘surfaces’ where customers and technology meet.

Our teams shape the intelligence behind those experiences, turning data, design, and emerging technologies into products that are intuitive, adaptive, and human.

We are multi-disciplinary designers, product strategists, writers, architects, engineers, data scientists, and ML researchers, united by a single goal: to design a better future for our clients and their customers.

We believe we are on the cusp of a new golden era of design, one where design will be more important than ever. An era of exploration and discovery.

We’re building a studio where designers immerse themselves in AI design paradigms, experimenting with adaptive patterns, conversational interfaces, and agentic workflows: the foundation for tomorrow’s customer experience.

We look for people who embody:

Innovation to solve the hardest problems.

Accountability for every result.

Integrity always.

About The Role

We're looking for an experienced and hands-on Senior Scala Engineer to help drive the modernization and evolution of our core data processing platforms. This role is key to enhancing and scaling high-traffic, mission-critical data pipelines on the Google Cloud Platform (GCP).

You will be a vital contributor to our Scala-based Dataflow processing, focusing on implementation, performance, and reliability at scale. This is a hands‑on technical role for someone who thrives on solving complex distributed systems problems and delivering high‑quality, efficient code within a collaborative team.

What You’ll Do
  • Implement and enhance sophisticated, high‑throughput batch and streaming data pipelines using Scala and GCP Dataflow (Apache Beam).
  • Contribute to the architectural design and technical roadmap for modernizing our data ingestion and processing pipelines.
  • Develop and implement performance optimizations, such as migrating services to use Cloud Functions for exporting data from BigQuery to Pub/Sub more efficiently.
  • Collaborate with Staff Engineers to expand the functionality of our data export systems for external partners. This includes pulling data from BigQuery, creating formatted feed files, and exporting them to various external stores (e.g., FTP, GCS, S3).
  • Support and contribute to the expansion of existing data integrations, including adding new fields to schemas and ensuring that data is propagated correctly through the entire pipeline.
  • Actively contribute to the refactoring and modernization of legacy Scala codebases, applying best practices in functional programming, testing, and observability.
  • Collaborate closely with teams using Node.js APIs (for data ingestion) and Python (for data transformations), maintaining clear data contracts and robust integration patterns.
  • Act as a strong technical communicator, contributing to technical approaches, breaking down complex problems, and engaging in rapid, iterative development cycles by asking questions quickly rather than working in isolation.
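The export flow described above (pull records from BigQuery, add schema fields, format a feed file, push it to an external store) can be sketched in plain, dependency-free Scala. This is an illustration of the pipeline shape only, not the production design: in practice these stages would be Apache Beam transforms running on GCP Dataflow, with a BigQuery source and a GCS/SFTP sink, and all names here (RawEvent, FeedRow, FeedExportSketch) are hypothetical.

```scala
// Minimal sketch of a read -> transform -> format pipeline.
// Plain List stands in for a Beam PCollection; each function
// stands in for one pipeline stage.

final case class RawEvent(id: String, payload: String)
final case class FeedRow(id: String, payload: String, exportedAt: Long)

object FeedExportSketch {
  // Stand-in for a BigQuery read (a BigQueryIO source in Beam).
  def readEvents(): List[RawEvent] =
    List(RawEvent("a", "hello"), RawEvent("b", "world"))

  // Pure transformation step, analogous to a ParDo: propagate existing
  // fields and add a new schema field (exportedAt) to every record.
  def toFeedRows(events: List[RawEvent], now: Long): List[FeedRow] =
    events.map(e => FeedRow(e.id, e.payload, now))

  // Stand-in for a sink: render rows as a pipe-delimited feed file body.
  // A real exporter would write this file to GCS, S3, or SFTP.
  def formatFeed(rows: List[FeedRow]): String =
    rows.map(r => s"${r.id}|${r.payload}|${r.exportedAt}").mkString("\n")

  def main(args: Array[String]): Unit =
    println(formatFeed(toFeedRows(readEvents(), now = 0L)))
}
```

Keeping each stage a pure function over immutable data, as here, is what makes the real Beam equivalents easy to unit-test and safe to parallelize.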

Requirements
  • Degree in Computer Science or a related technical discipline.
  • 6+ years of software engineering experience, with a strong background in building and operating large‑scale, high‑traffic distributed systems in production.
  • Deep expertise in Scala and its functional programming paradigms.
  • Demonstrable, hands‑on experience with GCP Dataflow (Apache Beam). Experience with other major streaming/batch frameworks (e.g., Apache Spark, Akka Streams) is also highly valuable.
  • Strong proficiency in the GCP ecosystem, including critical services like BigQuery, Pub/Sub, and Cloud Functions.
  • Solid understanding of data engineering principles, including data modeling, schema design, and data lifecycle management.
  • Experience building and maintaining data export feeds to external systems (e.g., SFTP, GCS, S3).
  • Proven ability to take ownership of complex technical features and deliver them from design to production.
  • Excellent communication skills, with experience working in polyglot environments (interfacing with Node.js and Python, etc.) and a proactive, inquisitive approach to problem‑solving.
  • Availability for key synchronous meetings and stand‑ups between 6:00 PM and 7:00 PM GMT.

Preferred Qualifications
  • Familiarity with CI/CD best practices and infrastructure‑as‑code tools (e.g., Terraform, Docker, Kubernetes).
  • Working knowledge of Node.js or Python for data‑related tasks.
  • Experience with other data stores (e.g., NoSQL, time‑series databases) or data orchestration tools (e.g., Airflow).

Diversity and Inclusion

At Beyond, we champion diversity and inclusion. We believe that a career in IT should be open to everyone, regardless of race, ethnicity, gender, age, sexual orientation, disability, or neurotype. We value the unique talents and perspectives that each individual brings to our team, and we strive to create a fair and accessible hiring process for all.
