AWS Data Architect - Market Data

ZipRecruiter

London

On-site

GBP 80,000 - 120,000

Full time

25 days ago

Job summary

A leading tech company seeks an AWS Data Architect to design a next-gen data platform. You'll shape architectural decisions and build scalable data pipelines while collaborating with cross-functional teams. This role offers a unique opportunity to influence critical market data workflows and implement best practices in data governance.

Qualifications

  • Hands-on expertise with AWS services like Glue, Lake Formation, and Athena.
  • Strong coding skills in Python and SQL for building, testing, and optimizing data pipelines.
  • Proven experience designing secure, scalable, and reliable data architectures.

Responsibilities

  • Design and implement end-to-end data architecture on AWS.
  • Develop scalable and secure ETL/ELT pipelines using Python, PySpark, and SQL.
  • Drive decisions on data modeling and integration strategies.

Skills

AWS data services
Python
SQL
ETL/ELT pipelines
Data governance

Job description

AWS Data Architect - Market Data

We're seeking a hands-on AWS Data Architect to play a lead role in a high-impact initiative building a next-gen data platform from the ground up. This is a rare greenfield opportunity to architect and engineer cutting-edge solutions that will power critical market data workflows across the business.

As a senior technical leader, you'll not only set the architectural direction but also roll up your sleeves to build and optimize scalable data pipelines using the latest cloud-native tools and frameworks. You'll be instrumental in shaping the technical foundation of our platform, from core design principles to implementation best practices.

What You'll Do:

  • Design and implement end-to-end data architecture on AWS using tools such as Glue, Lake Formation, and Athena
  • Develop scalable and secure ETL/ELT pipelines using Python, PySpark, and SQL
  • Drive decisions on data modeling, lakehouse architecture, and integration strategies with Databricks and Snowflake
  • Collaborate cross-functionally to embed data governance, quality, and lineage into platform design
  • Lead technical evaluations of new tools and approaches to evolve the platform's capabilities
  • Serve as a trusted advisor to engineering and business stakeholders on data strategy and architecture

What You Bring:

  • Deep, hands-on expertise with AWS data services (Glue, Lake Formation, PySpark, Athena, etc.)
  • Strong coding skills in Python and SQL for building, testing, and optimizing data pipelines
  • Proven experience designing secure, scalable, and reliable data architectures in cloud environments
  • Solid grasp of data governance, quality frameworks, and security best practices
  • A builder's mindset, comfortable leading architectural decisions and also delivering code in production

Bonus Points For:

  • Experience with modern platforms like Databricks, Snowflake, or other lakehouse solutions
  • Familiarity with analytics, ML workflows, or financial market data

Why Join Us?

This is your chance to shape a foundational data platform from day one. If you're looking to make a real architectural impact while staying deeply technical, we want to hear from you.