A growing investment firm in London is seeking a Data Engineer to enhance its analytics capabilities. The role involves designing and implementing data pipelines, primarily with Azure and Databricks, ensuring scalability and readiness for AI integration. Candidates should have strong Python skills and substantial experience with ETL processes, cloud migrations, and Terraform. The position requires cross-functional collaboration and offers an exciting opportunity to work in a rapidly evolving environment focused on data automation and analytics.
The company is a rapidly growing investment firm headquartered in London that has doubled its team over the past 2.5 years and continues to attract strong investor backing. With 230+ employees and ongoing expansion, it is now focused on becoming more AI- and data-enabled, modernising its infrastructure to support more advanced analytics and automation.
You'll join a skilled and collaborative analytics team of eight, working closely with senior leadership. The team has already laid the foundations for a modern data platform using Azure and Databricks and is now focused on building out scalable ETL processes, integrating AI tools, and delivering bespoke analytics solutions across the organisation.
As a Data Engineer, you'll play a pivotal role in designing and implementing robust data pipelines, supporting the migration from legacy Azure systems to Databricks, and working closely with stakeholders to deliver tailored data solutions. This role combines hands-on development with collaborative architecture design, and offers the opportunity to contribute to AI readiness within a fast-paced business.
Develop and maintain ETL pipelines, including manual and semi-manual data loads
Connect and integrate diverse data sources across cloud platforms
Collaborate with analytics and design teams to create bespoke, scalable data solutions
Support data migration efforts from Azure to Databricks
Use Terraform to manage and deploy cloud infrastructure
Build robust data workflows in Python (e.g., pandas, PySpark)
Ensure the platform is scalable, efficient, and ready for future AI use cases
Strong experience with Azure and Databricks environments
Advanced Python skills for data engineering (pandas, PySpark)
Proficiency in designing and maintaining ETL pipelines
Experience with Terraform for infrastructure automation
Track record of working on cloud migration projects, especially Azure to Databricks
Comfortable working onsite in London 2 days/week and engaging cross-functionally
Strong communication and problem-solving abilities
Experience with Qlik or other data visualisation tools
Exposure to AI product integration or readiness projects
If you can’t see what you’re looking for right now, send us your CV anyway – we’re always getting fresh new roles through the door.