
DevOps Engineer (AWS Cloud & Data Engineer)

Michael Bailey Associates

City of London

On-site

GBP 70,000 - 90,000

Full time

Today


Job summary

A financial services company in London is seeking a DevOps Engineer to join their Area Risk Reporting team. The role involves building end-to-end data solutions using AWS and Azure, ensuring compliance with data governance, and collaborating closely with various stakeholders. Ideal candidates should have strong experience in cloud infrastructure and data management, with skills in Infrastructure as Code and CI/CD tools.

Qualifications

  • Strong experience with AWS and Azure cloud platforms.
  • Proficiency in Infrastructure as Code with tools like Terraform.
  • Strong scripting skills with languages such as Python or PowerShell.

Responsibilities

  • Design and implement end-to-end data solutions for reporting.
  • Ensure data pipelines are reliable, secure, and optimized.
  • Develop and maintain CI/CD pipelines for deployments.

Skills

AWS (e.g., S3, Glue, IAM, Lambda, VPC)
Azure (e.g., Data Factory, Synapse, Key Vault)
Infrastructure as Code (Terraform, Bicep, ARM templates)
CI/CD tools (Azure DevOps)
Scripting (Python, PowerShell, Bash)
Data transformation tools (dbt, PySpark, SQL)
MSSQL and Databricks
Data modelling and data warehouse architecture

Job description
About the Role

We are looking for a versatile DevOps Engineer to join our Area Risk Reporting team within Tribe External Reporting.

You will be responsible for building and maintaining our information factories — end-to-end data solutions that source, transform, and deliver high-quality, report‑ready data to the business. This role blends DevOps, data engineering, and cloud infrastructure, with a strong focus on both Azure and AWS.

You’ll work across the full lifecycle of data delivery — from infrastructure provisioning and data ingestion to transformation and reporting readiness.

About the Department

Tribe External Reporting is responsible for ensuring our compliance with financial regulations and standards. The department manages the accurate and timely submission of regulatory reports to financial authorities. By applying advanced data analytics and reporting tools, it safeguards efficiency, transparency, and full adherence to regulatory requirements.

Key Responsibilities
Build Information Factories
  • Design and implement end-to-end data solutions that source, transform, and deliver data for reporting.
  • Collaborate with stakeholders to understand data requirements and translate them into scalable technical solutions.
Infrastructure & Cloud Engineering
  • Ensure data pipelines are reliable, secure, and optimized for performance.
  • Build and manage infrastructure in AWS and Azure using Infrastructure as Code (IaC) tools like Terraform, CloudFormation, ARM, and Bicep.
  • Configure networking, security, and access controls across cloud environments.
  • Ensure infrastructure supports data workloads and reporting performance.
DevOps & Automation
  • Develop and maintain CI/CD pipelines for infrastructure and data deployments.
  • Automate testing, monitoring, and deployment workflows.
  • Implement logging, alerting, and observability for data and infrastructure components.
Collaboration & Governance
  • Work closely with developers, architects, and business stakeholders to align infrastructure and data solutions with reporting needs.
  • Ensure compliance with data governance, security, and regulatory requirements.
  • Document architecture, processes, and best practices for internal knowledge sharing.
What We’re Looking For
Skills & Knowledge
  • Strong experience with AWS (e.g., S3, Glue, IAM, Lambda, VPC).
  • Strong experience with Azure (e.g., Data Factory, Synapse, Key Vault, VNets, NSGs).
  • Proficiency in Infrastructure as Code (Terraform, Bicep, ARM templates, CloudFormation).
  • Experience with CI/CD tools (Azure DevOps).
  • Strong scripting skills (e.g., Python, PowerShell, Bash).
  • Familiarity with data transformation tools (e.g., dbt, PySpark, SQL).
  • Experience with MSSQL and Databricks.
  • Understanding of data modelling and data warehouse architecture.
Bonus Points
  • Experience in risk or financial reporting environments.
  • Certifications in AWS and/or Azure (e.g., AWS Data Engineer, Azure DevOps Engineer).
  • Exposure to multi‑cloud or cloud migration projects.

Contact: Mees Vogelenzang 0031-207975015
