This role offers a great opportunity for an Azure Databricks Engineer to join a renewable energy firm based in London. You'll play a hands-on role in developing and optimising modern data lakehouse solutions on Azure, while supporting critical analytics and data delivery systems. The environment encourages technical ownership, collaboration, and the chance to tackle complex cloud-native engineering challenges.
THE COMPANY
This is a leading organisation within the renewable energy sector, dedicated to sustainable innovation and data-driven operations. The business is undergoing rapid digital transformation, investing in cloud-based technologies to optimise performance, forecasting, and environmental impact. With operations across multiple regions, their data initiatives play a key role in supporting clean energy production, distribution, and strategy.
THE ROLE
You'll join a collaborative engineering team focused on building scalable, secure, and efficient data platforms on Microsoft Azure. Your work will directly support migration initiatives, analytics enablement, and platform reliability. You'll be responsible for data pipeline development, resource deployment, and ongoing optimisation of cloud-native systems.
Your responsibilities will include:
Designing and implementing scalable data lakehouse architectures using Databricks on Azure.
Building efficient ETL/ELT pipelines for structured and unstructured data.
Working with stakeholders to ensure high-quality, accessible data delivery.
Optimising SQL workloads and data flows for analytics performance.
Automating infrastructure deployment using Terraform and maintaining CI/CD practices.
Supporting secure and performant data access via cloud-based networking.
KEY SKILLS AND REQUIREMENTS
Strong experience with Azure Databricks in production environments.
Experience with Azure Data Factory, Azure Functions, and Synapse Analytics.
Proficient in Python and advanced SQL, including query tuning and optimisation.
Hands-on experience with big data tools such as Spark, Hadoop, and Kafka.
Familiarity with CI/CD pipelines, version control, and deployment automation.
Experience using Infrastructure as Code tools like Terraform.
Solid understanding of Azure-based networking and cloud security principles.
HOW TO APPLY
Please register your interest by sending your CV via the apply link on this page.