Sr Data Analyst

beatmysalary

Milton Keynes

Hybrid

GBP 50,000 - 70,000

Full time

Job summary

A UK-based data solutions company is looking for a professional to lead end-to-end data workflows, including SQL view development and ETL processes using AWS Glue. The ideal candidate has expertise in Google Cloud Platform and PostgreSQL, with a focus on performance optimization and data product governance. This role offers a hybrid work model based in Milton Keynes.

Qualifications

  • Experience in developing SQL views and managing data workflows.
  • Proficiency in utilizing AWS Glue for ETL processes.
  • Strong understanding of Google Cloud and PostgreSQL.

Responsibilities

  • Develop and schedule SQL views aligning with business needs.
  • Execute cross-platform ETL with AWS Glue for data extraction.
  • Monitor, troubleshoot, and resolve incidents in ETL workflows.
  • Design and govern reusable data products ensuring compliance with FAIR principles.
  • Translate business requirements into detailed technical designs.
  • Optimize performance across the data stack and ETL pipelines.
  • Lead migration strategies from legacy systems to BigQuery.

Skills

SQL
AWS Glue
Airflow
PostgreSQL
Google Cloud Platform
Data Governance
ETL

Job description

Location: Milton Keynes (hybrid)

Type of employment: Contract or Permanent

Lead end-to-end data workflows, from requirements through delivery, including data product creation and secure data transfer from Google Cloud Platform to PostgreSQL.

Responsibilities:
  1. Develop & Schedule SQL Views via DAGs Design and implement SQL views aligned with business needs, prioritizing clarity, reusability, and efficiency. Build and manage workflow orchestrations (e.g., Airflow DAGs) to automate those views, ensuring reliable execution on daily, weekly, or customized schedules.
  2. Execute Cross-Platform ETL with AWS Glue: Develop, deploy, and maintain AWS Glue jobs to extract data from GCP (such as BigQuery or GCS) and load it into PostgreSQL. Set up secure connectivity, schedule jobs via cron or trigger mechanisms, and ensure data pipelines are reliable and idempotent (see the second sketch after this list).
  3. Monitor, Troubleshoot & Resolve Incidents: Continuously oversee ETL workflows in Airflow and AWS Glue, proactively responding to alerts and errors. Conduct root cause analysis for pipeline failures, whether due to schema mismatches or performance bottlenecks, and apply robust fixes. Document resolutions to strengthen system resilience (see the third sketch after this list).
  4. Design, Build, & Govern Data Products Architect, construct, and maintain reusable data products, embedding clean datasets, metadata, governance policies, and clearly defined data contracts. Ensure compliance with FAIR principles—data being Findable, Accessible, Interoperable, and Reusable—and enforce robust access controls in collaboration with governance stakeholders.
  5. Translate Requirements into Technical Designs: Gather and analyze requirements via stakeholder engagement, user stories, or use cases. Convert these into detailed design artifacts, including architecture diagrams, data models, and specifications for development.
  6. Optimize Performance Across the Stack: Continuously refine ETL pipelines, SQL logic, and data workflows to boost efficiency and scalability. Techniques may include indexing, partitioning, caching, or employing materialized views to improve query speed (see the fourth sketch after this list).
  7. Lead Migration from hh360 to BigQuery: Architect and drive a seamless migration strategy to move data and pipelines from the legacy hh360 system into Google BigQuery. Employ iterative migration patterns for safe data transfers, rigorous validation, and phased deprecation of legacy infrastructure (see the fifth sketch after this list).
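
First sketch (item 1). A minimal Apache Airflow sketch of the view-scheduling pattern described above: a DAG that re-creates a reporting view each morning. The DAG id, connection id, view name, and schedule are hypothetical placeholders, not details taken from the role.

  # Minimal Airflow DAG sketch: refresh a reporting view daily.
  # All identifiers (dag_id, connection id, view name) are hypothetical.
  from datetime import datetime

  from airflow import DAG
  from airflow.providers.postgres.operators.postgres import PostgresOperator

  with DAG(
      dag_id="refresh_reporting_views",          # hypothetical DAG name
      start_date=datetime(2024, 1, 1),
      schedule_interval="0 6 * * *",             # daily at 06:00; adjust per business needs
      catchup=False,
      tags=["reporting", "sql-views"],
  ) as dag:
      # Re-create the view so downstream consumers always see the current logic.
      refresh_sales_view = PostgresOperator(
          task_id="refresh_sales_view",
          postgres_conn_id="analytics_postgres",  # hypothetical Airflow connection
          sql="""
              CREATE OR REPLACE VIEW reporting.v_daily_sales AS
              SELECT order_date, SUM(amount) AS total_sales
              FROM sales.orders
              GROUP BY order_date;
          """,
      )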
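
Second sketch (item 2). A minimal AWS Glue (PySpark) job sketch for the cross-platform load. The posting does not specify which GCP connector is used, so this sketch assumes the extract has already been staged as Parquet in S3; the S3 path, Glue catalog connection, and target table are hypothetical.

  # Minimal AWS Glue (PySpark) job sketch: load staged GCP exports into PostgreSQL.
  # The S3 staging path, catalog connection name, and target table are assumptions.
  import sys

  from awsglue.context import GlueContext
  from awsglue.job import Job
  from awsglue.utils import getResolvedOptions
  from pyspark.context import SparkContext

  args = getResolvedOptions(sys.argv, ["JOB_NAME"])
  glue_context = GlueContext(SparkContext.getOrCreate())
  job = Job(glue_context)
  job.init(args["JOB_NAME"], args)

  # Read the staged extract (e.g., a BigQuery export landed in S3 as Parquet).
  source = glue_context.create_dynamic_frame.from_options(
      connection_type="s3",
      connection_options={"paths": ["s3://example-staging/bq_exports/orders/"]},
      format="parquet",
  )

  # Write into PostgreSQL via a Glue catalog connection (holds the JDBC URL and credentials).
  glue_context.write_dynamic_frame.from_jdbc_conf(
      frame=source,
      catalog_connection="analytics_postgres",   # hypothetical Glue connection
      connection_options={"dbtable": "staging.orders", "database": "analytics"},
  )

  job.commit()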
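
Third sketch (item 3). One common way to surface ETL incidents early is a per-task failure callback in Airflow. The sketch below logs a summary that could be forwarded to the team's alerting channel; that channel (email, Slack, PagerDuty, etc.) is not specified in the posting, so the forwarding step is left abstract.

  # Sketch of a failure-alerting hook for Airflow tasks.
  import logging

  def notify_on_failure(context):
      """Airflow on_failure_callback: summarise the failed task for responders."""
      ti = context["task_instance"]
      message = (
          f"ETL failure: dag={ti.dag_id} task={ti.task_id} "
          f"run={context['run_id']} try={ti.try_number}"
      )
      logging.error(message)
      # Forward `message` to the team's alerting system here (hypothetical step).

  # Pass these as default_args when constructing the DAG so every task is covered.
  default_args = {
      "on_failure_callback": notify_on_failure,
      "retries": 2,   # retry transient failures before paging anyone
  }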
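
Fourth sketch (item 6). Two of the PostgreSQL techniques listed above, indexing and materialized views, shown as a small psycopg2 script. The connection string and object names are hypothetical.

  # Sketch of two PostgreSQL optimisation techniques: an index on a frequent filter
  # column and a materialized view for an expensive aggregation.
  import psycopg2

  ddl_statements = [
      # Speed up date-range filters on the orders table.
      "CREATE INDEX IF NOT EXISTS idx_orders_order_date ON sales.orders (order_date);",
      # Precompute a heavy aggregation so dashboards read a small result set.
      """
      CREATE MATERIALIZED VIEW IF NOT EXISTS reporting.mv_monthly_sales AS
      SELECT date_trunc('month', order_date) AS month, SUM(amount) AS total_sales
      FROM sales.orders
      GROUP BY 1;
      """,
      # The refresh can then be scheduled from Airflow alongside the other views.
      "REFRESH MATERIALIZED VIEW reporting.mv_monthly_sales;",
  ]

  with psycopg2.connect("dbname=analytics user=etl_user host=localhost") as conn:
      with conn.cursor() as cur:
          for statement in ddl_statements:
              cur.execute(statement)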
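
Fifth sketch (item 7). A sketch of the validation step implied by an iterative migration: reconciling row counts between a legacy table and its migrated BigQuery copy before the legacy copy is deprecated. The posting does not describe how hh360 is accessed, so a PostgreSQL interface and all project and table names are assumptions.

  # Per-table row-count reconciliation between a legacy source and BigQuery.
  import psycopg2
  from google.cloud import bigquery

  def legacy_row_count(table: str) -> int:
      # Assumes the legacy hh360 data is reachable via PostgreSQL (hypothetical).
      with psycopg2.connect("dbname=hh360 user=etl_user host=localhost") as conn:
          with conn.cursor() as cur:
              cur.execute(f"SELECT COUNT(*) FROM {table};")
              return cur.fetchone()[0]

  def bigquery_row_count(table: str) -> int:
      client = bigquery.Client(project="example-analytics")   # hypothetical project
      query = f"SELECT COUNT(*) AS n FROM `example-analytics.migrated.{table}`"
      return list(client.query(query).result())[0]["n"]

  for table in ["orders", "customers"]:                        # hypothetical tables
      source, target = legacy_row_count(table), bigquery_row_count(table)
      status = "OK" if source == target else "MISMATCH"
      print(f"{table}: legacy={source} bigquery={target} -> {status}")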