Job Search and Career Advice Platform


DevOps Engineer - AWS Cloud & Data Engineer

Michael Bailey Associates

Greater London

On-site

GBP 60,000 - 90,000

Full time

30+ days ago


Job summary

A financial services firm in Greater London is looking for a versatile DevOps Engineer to develop and maintain data solutions. The role involves building and managing infrastructure on AWS and Azure, automating workflows, and ensuring compliance with data governance. The ideal candidate has strong experience with cloud technologies, CI/CD tools, and scripting. Join a collaborative team focused on delivering high-quality data for reporting and analysis.

Qualifications

  • Strong experience with AWS and Azure.
  • Proficiency in Infrastructure as Code tools.
  • Strong scripting skills in Python and PowerShell.

Responsibilities

  • Design and implement end-to-end data solutions.
  • Develop and maintain CI/CD pipelines for deployments.
  • Collaborate closely with stakeholders for data solutions.

Skills

  • AWS (e.g., S3, Glue, IAM, Lambda, VPC)
  • Azure (e.g., Data Factory, Synapse, Key Vault, VNets, NSGs)
  • Infrastructure as Code (Terraform, Bicep, ARM templates, CloudFormation)
  • CI/CD tools (Azure DevOps)
  • Scripting skills (e.g., Python, PowerShell, Bash)
  • Data transformation tools (e.g., dbt, PySpark, SQL)
  • MSSQL and Databricks
  • Data modelling and data warehouse architecture

Job description

About the Role

We are looking for a versatile DevOps Engineer to join our Area Risk Reporting team within Tribe External Reporting.

You will be responsible for building and maintaining our information factories — end-to-end data solutions that source, transform, and deliver high-quality, report-ready data to the business. This role blends DevOps, data engineering, and cloud infrastructure, with a strong focus on both Azure and AWS.

You’ll work across the full lifecycle of data delivery — from infrastructure provisioning and data ingestion to transformation and reporting readiness.

About the Department

Tribe External Reporting is responsible for ensuring our compliance with financial regulations and standards. The department manages the accurate and timely submission of regulatory reports to financial authorities. By applying advanced data analytics and reporting tools, it safeguards efficiency, transparency, and full adherence to regulatory requirements.

Key Responsibilities

Build Information Factories

  • Design and implement end-to-end data solutions that source, transform, and deliver data for reporting
  • Collaborate with stakeholders to understand data requirements and translate them into scalable technical solutions
  • Ensure data pipelines are reliable, secure, and optimized for performance

Infrastructure & Cloud Engineering

  • Build and manage infrastructure in AWS and Azure using Infrastructure as Code (IaC) tools like Terraform, CloudFormation, ARM and Bicep
  • Configure networking, security, and access controls across cloud environments
  • Ensure infrastructure supports data workloads and reporting performance

DevOps & Automation

  • Develop and maintain CI/CD pipelines for infrastructure and data deployments
  • Automate testing, monitoring, and deployment workflows
  • Implement logging, alerting, and observability for data and infrastructure components

Collaboration & Governance

  • Work closely with developers, architects, and business stakeholders to align infrastructure and data solutions with reporting needs
  • Ensure compliance with data governance, security, and regulatory requirements
  • Document architecture, processes, and best practices for internal knowledge sharing

What We’re Looking For

Skills & Knowledge

  • Strong experience with AWS (e.g., S3, Glue, IAM, Lambda, VPC)
  • Strong experience with Azure (e.g., Data Factory, Synapse, Key Vault, VNets, NSGs)
  • Proficiency in Infrastructure as Code (Terraform, Bicep, ARM templates, CloudFormation)
  • Experience with CI/CD tools (Azure DevOps)
  • Strong scripting skills (e.g., Python, PowerShell, Bash)
  • Familiarity with data transformation tools (e.g., dbt, PySpark, SQL)
  • Experience with MSSQL and Databricks
  • Understanding of data modelling and data warehouse architecture

Bonus Points

  • Experience in risk or financial reporting environments
  • Certifications in AWS and/or Azure (e.g., AWS Data Engineer, Azure DevOps Engineer)
  • Exposure to multi-cloud or cloud migration projects

Contact: Mees Vogelenzang 0031-207975015
