A leading entertainment company in London is seeking a Data Engineer to enhance its analytics platform. In this role, you'll develop data pipelines, work with Azure technologies, and collaborate with business stakeholders to deliver effective data solutions. The ideal candidate has strong experience with ETL/ELT processes and cloud-based data platforms. This opportunity offers competitive pay and flexible working arrangements.
As a Data Engineer, you will join the Red Engine Business Insights team, helping to build out the existing data and analytics platform. The role sits within a small team, so you will work closely with senior engineers to implement bespoke features and enhancements to the data platform using the latest technology. You will assist the senior and lead engineers in driving the platform's development, which includes meeting with key business stakeholders to gather requirements and, under the guidance of the lead engineer, translating them into technical data solutions within the Data & Analytics platform.
Key responsibilities will include:
Work with the Lead Engineer to design and implement data engineering solutions using T-SQL, Python, PySpark, and DBT in the Azure cloud environment.
Collaborate with Data Analysts to incorporate business logic into the analytics platform to support reports and dashboards.
Maintain and leverage CI/CD deployment pipelines for application code promotion across environments.
Update and maintain technical documentation in Azure DevOps.
To be successful in this role, you’ll need:
Experience developing ETL/ELT pipelines for data movement, transformation, and visualization from structured and unstructured sources.
Experience with the Azure Platform, including:
Data Ingestion: Azure Data Factory (ADF), Databricks, Logic Apps, and Function Apps.
Data Storage: ADLS, SQL Server, and Unity Catalog (Medallion Architecture).
Strong understanding of the Databricks Platform, including managing, developing, and deploying workflows, jobs, and notebooks.
Experience modeling data in a Data Warehouse using Inmon or Kimball approaches.
Database development experience in SQL Server, including stored procedures and T-SQL or similar tools.
Experience working within an Agile development framework.
Experience with Data Build Tool (DBT), including building data models, contracts, tests, validation, and transformations.
Knowledge of modern distributed file formats such as Parquet, Delta, Iceberg, and Hudi.
Experience building data ingestion pipelines from REST API sources.
Ability to produce clear technical documentation for both technical and non-technical audiences.
Experience with Infrastructure as Code (IaC) solutions like Terraform or Pulumi.
Familiarity with modern CI/CD DevOps frameworks.
What you'll get