Senior Backend Engineer (Remote)

Translucent

Remote

GBP 80,000 - 100,000

Full time

Job summary

A financial technology startup is seeking an experienced Backend Engineer to build and optimize backend services, working with data pipelines and AI workflows. The role requires strong skills in SQL, programming in Python or Go, and an understanding of cloud environments and ETL processes. As part of a fast-moving team, you will influence product direction and engineering culture at the company. This role is fully remote with a competitive salary and equity stake.

Benefits

Competitive salary
Equity stake
Work remotely

Qualifications

  • 5+ years of backend engineering experience in data-intensive systems.
  • Strong skills in SQL and database design with optimization experience.
  • Programming skills in Python, Go, or similar backend language.

Responsibilities

  • Design and optimize backend services for APIs and integrations.
  • Own database performance and scaling strategies.
  • Build high-throughput ingestion pipelines with Kafka.

Skills

Backend engineering experience
SQL and database design
Python
Cloud platforms
Data pipelines and ETL processes
Exposure to AI data frameworks
Pragmatic communication

Tools

PostgreSQL
Kafka
Docker
Kubernetes

Job description

Location: Remote.

Compensation: Competitive salary + meaningful equity stake.

Visa sponsorship: Not available.

About Translucent

Translucent is building the future of accounting with AI. We are reimagining how companies handle their finances by making AI a trusted teammate for finance teams: safe, transparent, and always with humans in control.

We are not starting from zero. Our platform already processes Xero and QuickBooks data at the transaction level for thousands of companies across multiple entities. Customers rely daily on our consolidation, intercompany, and search tools, and their feedback drives our roadmap.

We are a small, fast-moving team with high standards and big ambitions. If you enjoy working close to customers, shipping quickly, and building something truly different, you’ll fit right in.

Role overview

We are looking for an experienced Backend Engineer who loves working with data, building robust services, and optimising distributed systems. You will work across ingestion pipelines, API layers, and query engines, shaping how our platform stores, processes, and serves financial data.

This is a hands‑on role where you will make architectural decisions, solve scaling challenges, and design backend systems that form the foundation for AI‑powered financial workflows.

What you will do

  • Design, implement, and optimise backend services that power our APIs, integrations, and developer tools.
  • Own database performance and reliability: schema design, indexing, query optimisation, and scaling strategies for PostgreSQL and ClickHouse.
  • Build and maintain high‑throughput ingestion pipelines with Kafka, transforming raw accounting data into well‑structured, queryable formats.
  • Develop data access APIs that make financial data securely and efficiently available to both internal services and external developers.
  • Improve observability across the stack: logging, metrics, tracing, and alerting for ingestion pipelines and backend services.
  • Collaborate with engineers, product, and domain experts to turn customer workflows into reliable backend systems.
  • Contribute to architectural decisions across event‑driven workflows, sync/transform services, indexing strategies, and search infrastructure.
  • Support AI‑powered workflows by ensuring data pipelines are structured, reliable, and optimised for retrieval‑augmented generation (RAG), schema validation, and downstream agent evaluation.
  • Help design guardrails and evaluation hooks for systems where agents or customers write data back, ensuring correctness, performance, and safety.
  • Explore new data architectures such as graph databases to model complex financial relationships and support emerging AI workflows.
  • Work in lockstep with AI engineers to evolve Translucent into an AI‑first company.

What you will bring

  • 5+ years of backend engineering experience, ideally in data‑intensive or high‑scale systems.
  • Strong skills in SQL and database design (PostgreSQL preferred), with proven experience in optimisation and scaling.
  • Solid programming skills in Python, Go, Kotlin, or a similar backend language.
  • Experience designing and maintaining production APIs (REST/gRPC/tRPC).
  • Comfort with cloud platforms (AWS, GCP, or similar) and containerised environments (Docker, Kubernetes).
  • Understanding of data pipelines and ETL processes, working with both structured and semi‑structured data.
  • Exposure to AI‑powered data frameworks and infra such as vector databases (Weaviate, Qdrant, Pinecone), graph databases (Neo4j, Memgraph), or model serving/routing systems (vLLM, LiteLLM).
  • Familiarity with agent orchestration frameworks (LangGraph, Mastra) or AI observability tools (W&B Traces, Arize, OpenLLMetry).
  • Experience building data pipelines optimised for AI workflows, e.g. embedding search, feature stores, or schema‑validated outputs.
  • Clear, pragmatic communication and a collaborative approach.
  • An interest in joining a small, fast‑moving startup where you can influence product direction and engineering culture.

Nice to have

  • Experience with event‑driven architectures and streaming systems (Kafka, Pub/Sub, etc.).
  • Familiarity with ClickHouse, ElasticSearch, or vector databases for high‑performance queries and indexing.
  • Background in accounting, finance, or enterprise SaaS.
  • Exposure to AI / ML‑powered applications, LLM frameworks, or retrieval‑augmented generation (RAG).