Overview
Are you excited by the idea of helping scale a fast-growing tech business across London, Mumbai and Austin? Do you want to build a career in data engineering while working with cutting-edge GenAI tools? Are you passionate about building scalable data infrastructure that powers analytics, machine learning, and customer insights?
The Role
As a Core Data Engineer at Fospha, you’ll be at the heart of our data ecosystem. You’ll design, build, and optimise pipelines that move, transform, and scale data across multiple systems. Working at the intersection of analytics, engineering, and machine learning, you’ll ensure our data infrastructure grows as fast as our ambitions. You’ll define data quality standards, shape the data roadmap, and support high-quality data access for our Data Science and Analytics teams. This is a high-impact role where your work will directly empower teams to move faster and deliver smarter insights.
Key Responsibilities
- Design, build, and optimise data pipelines using dbt, Python, and SQL
- Implement and maintain scalable ELT/ETL frameworks that power analytics and ML systems
- Collaborate with Data Science and Platform teams to ensure robust and reliable model deployment pipelines
- Own the reliability, scalability, and observability of data workflows in production
- Contribute to data architecture decisions and documentation, ensuring data integrity and consistency across sources
- Design and maintain data models used by ML Engineers, Data Analysts, and Data Scientists
- Drive automation, versioning, and quality validation in data delivery
- Conduct exploratory data analysis to uncover trends and inform strategic decision-making
- Identify opportunities for process improvement and promote a culture of continuous data excellence
- Embrace the opportunities GenAI brings and help integrate our GenAI hub into new ways of working
What are we looking for?
We hire for potential – you should apply if you:
- Have proven experience building data pipelines in dbt, Python, and SQL
- Demonstrate a strong grasp of data modelling, warehousing, and orchestration tools
- Understand data architecture and ELT flows
- Are familiar with MLOps principles and how they integrate with engineering systems
- Have experience with cloud-native data stacks (preferably AWS)
- Take a pragmatic approach to balancing perfection with delivery
- Understand agile methodologies and best practices
- Know how to apply data quality frameworks and version control in data delivery
Our Values and Principles
- Seek inclusion & diversity: We create an environment where everyone feels welcome, and people are encouraged to speak and be heard
- Work Hard, Work Well, Work Together: We take responsibility for making things happen, independently and together; we help colleagues in need and close loops, making sure our work is complete and has lasting impact
- Grow: We are proactive, curious and unafraid of failure
- Customer at the heart: We care about the customer, feel their pain and love building product that solves their biggest problems
- Candour with caring: We deliver candid feedback with kindness and receive it with gratitude
What we can offer you
- Competitive salary
- Be part of a leading global venture builder, Blenheim Chalcot, and learn from the incredible talent in BC
- Be exposed to the right mix of challenges and learning and development opportunities
- Flexible benefits including Private Medical and Dental, Gym Subsidies, Life Assurance, Pension Scheme, etc.
- 25 days of paid holiday + your birthday off!
- Free snacks in the office!
- Quarterly team socials