Sr Architect

Pig Improvement Company

United Kingdom

Remote

GBP 60,000 - 80,000

Full time

Job summary

An established industry player is seeking an experienced Data & Integration Architect to revolutionize their data strategy. In this pivotal role, you will bridge legacy systems with modern lakehouse architectures, driving innovation and ensuring alignment with business objectives. You'll work with cutting-edge tools like Apache Spark, Databricks, and Kafka, while overseeing SQL Server and ETL processes. This is an exciting opportunity for someone passionate about solving complex data challenges and making a tangible impact in a dynamic environment. If you're ready to lead R&D initiatives and build scalable data solutions, we want to hear from you!

Qualifications

  • Degree in Computer Science or related field required.
  • Experience with data governance and security best practices essential.
  • Certifications in Azure Cloud or Databricks are a plus.

Responsibilities

  • Develop and maintain the data architecture roadmap.
  • Oversee data flow, pipeline design, and transformations.
  • Lead the transition to modern data architectures.

Skills

Strategic Thinking
Execution Focus
Change Leadership
Collaboration
Decision-Making
Data Governance
Data Engineering
Project Management

Education

Degree in Computer Science
TOGAF or equivalent certification
Relevant Azure Cloud or Databricks certifications

Tools

Apache Spark
Databricks
Kafka
Airflow
Azure
SQL Server
Dynamics 365
Power Platform
Microsoft Dataverse
Python

Job description

Role Overview

We’re looking for an experienced Data & Integration Architect to shape the backbone of our data strategy—bridging legacy systems with modern lakehouse architectures.

In this high-impact role, you'll define the path forward for data, integration, and governance, ensuring our technology aligns with business objectives.

You'll work with cutting-edge tools like Apache Spark, Databricks, Kafka, Airflow, and Azure, while overseeing SQL Server, ETL, data pipelines, and streaming platforms. You'll also drive automation and governance across Dynamics 365, Power Platform, and Microsoft Dataverse.

This is your opportunity to lead innovation, spearhead R&D initiatives, and build scalable, secure, and high-performing data solutions. If you're passionate about solving complex data challenges and making a real business impact, we’d love to hear from you!

You Will

Data Strategy & Architecture

  • Develop and maintain the data architecture roadmap, balancing legacy and modern data solutions.
  • Evaluate emerging technologies (e.g., Apache Spark, Kafka) to future-proof our data landscape.
  • Define and enforce data integration standards, ensuring consistency across systems.

Solution Design & Implementation

  • Oversee data flow, pipeline design, transformations, and warehouse architectures.
  • Lead the technical implementation of SQL Server, Azure SQL, Databricks, and Airflow pipelines.
  • Champion data mesh principles and federated analytics using Starburst (Trino).
  • Enable real-time data streaming and analytics through Kafka.

Integration & Automation

  • Design and implement data flows across Dynamics 365, Power Platform, and Dataverse.
  • Utilize Python for complex transformations and integrations.
  • Ensure seamless, secure, and scalable data exchange across platforms.

Data Governance & Security

  • Establish frameworks for data lineage, quality, and security.
  • Define KPIs for data reliability, availability, and performance.
  • Conduct code reviews, data modelling sessions, and performance tuning.

Collaboration & Leadership

  • Work with product managers, stakeholders, and technical teams to align data initiatives with business goals.
  • Communicate strategies and trade-offs to both technical and non-technical audiences.
  • Mentor and guide data engineers, ETL developers, and solution architects.

Modernization & Continuous Improvement

  • Lead the transition from legacy ETL and data warehouses to modern architectures.
  • Drive proofs of concept (PoCs) for new technologies in data engineering, streaming, and analytics.
  • Identify opportunities for automation and cost optimization in data operations.

Core Competencies

  • Strategic Thinking: Balances innovation with practical risk management.
  • Execution Focus: Delivers high-quality outcomes with persistence and efficiency.
  • Change Leadership: Effectively drives adoption of new data technologies.
  • Collaboration: Builds trust and alignment across teams.
  • Decision-Making: Uses data-driven insights to inform technical and business decisions.

Requirements

Education & Certifications

  • Degree in Computer Science, Data Science, Engineering, or a related field.
  • Preferred: TOGAF or equivalent enterprise architecture certification.
  • Relevant Azure Cloud or Databricks certifications are a plus.

Technical Expertise

  • ETL & Data Warehousing: Experience with legacy ETL tools and modern data transformation strategies.
  • Microsoft Stack: Proficiency in SQL Server, Azure SQL, and Azure-based data processing.
  • Apache Spark & Databricks: Strong background in large-scale data processing and analytics.
  • Kafka & Streaming: Experience with real-time data ingestion and event-driven architectures.
  • Python & Data Engineering: Hands-on experience building data pipelines and integrations.
  • Data Governance & Security: Understanding of data privacy regulations (e.g., GDPR) and best practices.
  • DevOps & Agile: Familiarity with CI/CD, Infrastructure as Code, and Agile methodologies.

Other Requirements

  • Occasional travel as required, particularly to our office in Stapeley.
  • Demonstrated experience in project management, including resource planning and risk management, within Agile frameworks.