Senior Data Engineer

Parexel International

Remote

GBP 60,000 - 80,000

Full time

Job summary

A leading global biopharmaceutical services company is seeking a Senior Data Engineer in the United Kingdom to develop and optimize data pipelines and platforms using Azure, Databricks, and Snowflake. This pivotal role focuses on transforming data into actionable insights, building a scalable data ecosystem that supports business-critical functions. The ideal candidate will possess extensive experience in data engineering, specifically with Azure tools, and be skilled in guiding junior engineers and ensuring data quality and security compliance.

Qualifications

  • 6+ years of data engineering experience, with at least 4 years hands-on in Azure, Databricks, and Snowflake.
  • Experience with Reltio and Power BI integration is highly desirable.
  • Bachelor's or master's degree in Computer Science, Information Systems, Engineering, or a related field.

Responsibilities

  • Architect and implement end-to-end data pipelines using Azure Data Factory, Databricks, and Snowflake.
  • Design, build, modify, and support data pipelines leveraging Databricks and Power BI.
  • Collaborate with Team Leads to define business requirements and finalize work plans.
  • Run unit and integration tests on created code.
  • Establish data governance and compliance controls.
  • Mentor junior engineers and drive CI/CD practices.

Skills

Azure Data Factory
Databricks
Snowflake
SQL
Python
Power BI
Cloud Architecture
Data Governance

Education

Bachelor’s or master’s degree in Computer Science

Tools

Power BI
Reltio

Job description

Parexel is seeking a highly experienced Senior Data Engineer to architect, develop, and optimize enterprise-grade data pipelines and platforms using Azure, Databricks, Snowflake, and Power BI. This role is pivotal in transforming raw data into actionable insights and building a resilient, scalable data ecosystem that supports business‑critical functions across clinical and operational domains.

Key Responsibilities
  • Architect and implement end‑to‑end data pipelines using Azure Data Factory, Databricks, and Snowflake for large‑scale data ingestion, transformation, and storage.

  • Using Microsoft Azure data PaaS services, design, build, modify, and support data pipelines leveraging Databricks and Power BI in a medallion architecture setting (see the sketch after this list).

  • If necessary, create prototypes to validate proposed ideas and solicit input from stakeholders.

  • Apply test‑driven development and continuous integration processes throughout delivery.

  • Analysis and Design – Convert high‑level design to low‑level design and implement it.

  • Collaborate with Team Leads to define/clarify business requirements, estimate development costs, and finalize work plans.

  • Testing – Create and run unit and integration tests on all created code throughout the development lifecycle.

  • Benchmark application code proactively to prevent performance and scalability concerns.

  • Collaborate with the Quality Assurance Team on issue reporting, resolution, and change management.

  • Support and Troubleshooting – Assist the Operations Team with any environmental issues that arise during application deployment in the Development, QA, Staging, and Production environments.

  • Familiarity with Power BI and Reltio is advantageous but not required.

  • Collaborate with BI teams to ensure data models are optimized for reporting in Power BI, with a focus on performance and usability.

  • Establish data governance, quality, and security controls, ensuring compliance with GDPR, HIPAA, and global clinical data regulations.

  • Mentor and guide junior engineers, fostering technical excellence and knowledge sharing.

  • Drive automation and CI/CD practices within data engineering pipelines, integrating with version control and deployment workflows.

  • Work closely with Data Architects, Business Analysts, and Product Owners to translate business needs into technical solutions.
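
A minimal sketch of what one bronze-to-silver step in such a medallion layout could look like on Databricks with PySpark and Delta tables. The landing path, the table names (events_bronze, events_silver), and the event_id business key are hypothetical illustrations, not details from this posting:

    # Bronze -> silver step in a medallion layout (illustrative names only).
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

    # Bronze: land the raw files as-is, adding only ingestion metadata.
    bronze = (
        spark.read.json("/mnt/raw/events/")  # hypothetical landing path
        .withColumn("_ingested_at", F.current_timestamp())
    )
    bronze.write.format("delta").mode("append").saveAsTable("events_bronze")

    # Silver: cleaned, deduplicated records ready for downstream modelling.
    silver = (
        spark.read.table("events_bronze")
        .filter(F.col("event_id").isNotNull())  # hypothetical business key
        .dropDuplicates(["event_id"])
    )
    silver.write.format("delta").mode("overwrite").saveAsTable("events_silver")

In a stack like the one described, each layer would typically run as a scheduled Databricks job orchestrated from Azure Data Factory, with curated gold tables exposed to Snowflake and Power BI for reporting.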

Required Qualifications
  • Experience: 6+ years of data engineering experience, with at least 4 years hands‑on in Azure, Databricks, and Snowflake; experience with Reltio and Power BI integration is highly desirable.

  • Education: Bachelor’s or master’s degree in Computer Science, Information Systems, Engineering, or a related field.

Skills
  • Expert‑level knowledge of Azure Data Factory, Databricks, and Snowflake.

  • Understanding of quality processes and estimation methods.

  • Understanding of design concepts and architectural basics.

  • Fundamental grasp of the project domain.

  • Ability to translate functional and nonfunctional needs into system requirements.

  • Ability to develop and code complex applications.

  • Ability to create test cases and scenarios based on specifications (see the test sketch after this list).

  • Solid knowledge of SDLC and agile techniques.

  • Knowledge of current technology and trends.

  • Logical thinking and problem‑solving abilities, as well as the capacity to collaborate.

  • Primary skills: cloud platforms, Azure, Databricks, ADF (Azure Data Factory), ADO (Azure DevOps).

  • Advantageous: SQL, Python, Power BI.

  • General knowledge: Power Apps, Java/Spark, Reltio.

  • 3-5 years of experience in software development, with a minimum of 2 years in cloud computing.

  • Proven experience in building BI‑ready datasets and performance tuning in Power BI.

  • Proficient in SQL, Python, and cloud‑native architecture.

  • Strong grasp of data security, privacy compliance, and best practices in a regulated environment.
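
A small illustration of the kind of unit test the posting's testing bullets call for, written with pytest against a local SparkSession. The clean_events() transformation and its column names are hypothetical, chosen only to show the pattern:

    # Unit-testing a pipeline transformation with pytest (illustrative only).
    import pytest
    from pyspark.sql import SparkSession, functions as F

    def clean_events(df):
        """Drop null keys and deduplicate, a typical silver-layer step."""
        return df.filter(F.col("event_id").isNotNull()).dropDuplicates(["event_id"])

    @pytest.fixture(scope="session")
    def spark():
        # local[1] keeps the test runnable on a laptop or CI agent.
        return SparkSession.builder.master("local[1]").appName("tests").getOrCreate()

    def test_clean_events_removes_nulls_and_duplicates(spark):
        df = spark.createDataFrame(
            [("a", 1), ("a", 1), (None, 2)], ["event_id", "value"]
        )
        result = clean_events(df)
        assert result.count() == 1
        assert result.first()["event_id"] == "a"

Tests like this run locally or on a build agent, which is where the CI/CD practices mentioned in the responsibilities would pick them up.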

#LI-REMOTE
