
Data Engineer - Business Intelligence Team

Morgan McKinley

London

On-site

GBP 40,000 - 80,000

Full time

30+ days ago


Job summary

An exciting healthcare company is looking for a talented Data Engineer to join their Business Intelligence Team. In this pivotal role, you will design and optimize data infrastructure and pipelines, ensuring data reliability and availability for analytics. Your contributions will help drive insights across the organization, supporting their ambitious growth plans. This innovative firm is dedicated to excellence and is expanding its operations, offering a dynamic work environment where your skills will be valued and your work will have real impact. If you're passionate about data engineering and want to make a difference in healthcare, this is the perfect opportunity for you.

Qualifications

  • 3+ years in data engineering with strong SQL and Python skills.
  • Experience in designing and maintaining data pipelines and ETL processes.

Responsibilities

  • Design and optimize scalable data pipelines for analytics.
  • Collaborate with stakeholders to translate data requirements into solutions.

Skills

SQL
Data Engineering
Python
Data Warehousing
ETL/ELT Processes
Data Quality Management

Education

Bachelor's degree in Computer Science
Bachelor's degree in Engineering
Bachelor's degree in Information Systems

Tools

dbt
Airflow
PowerBI
Docker

Job description

About the job

About the Client:

My client is an exciting, agile, and forward-thinking healthcare company with ambitious growth plans. Committed to excellence, their aim is to be 'Beyond Better' in everything they do, as they build an outstanding healthcare organization. Their innovative model is based on partnering with doctors to build and operate highly specialized healthcare centers, focusing on single medical specialties such as Endoscopy, Orthopaedics, or Cardiology. With their HQ in London and a growing number of centers in Central London, they are also expanding to new locations in Oxford and Cambridge, with many more exciting projects planned across the UK and worldwide.

About the Role:

My client is seeking a talented Data Engineer to join their Business Intelligence Team. In this crucial role, you will be responsible for designing, building, managing, and optimizing the data infrastructure and pipelines that drive analytics and insights across their business and digital platforms. Your work will be pivotal in ensuring that their data is reliable, accurate, and readily available for analysis.

Your Impact:

  1. Design, build, and optimize robust and scalable data pipelines to ingest, process, and store data across the business.
  2. Develop, manage, and maintain ETL/ELT processes, ensuring data quality, integrity, and timely delivery to their data warehouse.
  3. Implement data modeling solutions within the data warehouse to support efficient querying and analytics for performance monitoring and executive reporting needs.
  4. Collaborate with BI Analysts and commercial stakeholders to understand data requirements and translate them into effective technical solutions.
  5. Monitor, troubleshoot, and optimize data pipeline performance, reliability, and cost-effectiveness.
  6. Implement data quality checks and validation processes throughout the data lifecycle.
  7. Contribute to the architecture and strategy of their data platform and tooling.

Required Skills:

  1. Strong SQL proficiency with experience writing complex, optimized queries for data transformation and modeling.
  2. Proven experience (3+ years) in data engineering, specifically designing, building, and maintaining data pipelines and ETL/ELT processes.
  3. Proficiency in Python for data engineering tasks (e.g., scripting, automation, data manipulation libraries like Pandas, interacting with APIs).
  4. Solid understanding of data warehousing concepts (e.g., dimensional modeling, star/snowflake schemas) and database design principles.
  5. Hands-on experience with dbt (data build tool) or a similar data transformation framework.

Nice to Have:

  1. Bachelor's degree in Computer Science, Engineering, Information Systems, or a related technical field.
  2. Experience with workflow orchestration tools (e.g., Airflow, Prefect, Dagster).
  3. Experience ingesting data from diverse sources, including marketing analytics platforms (Google Analytics, HubSpot), APIs, databases, and call tracking systems.
  4. Familiarity with data visualization tools (like PowerBI) from a data provisioning perspective.
  5. Healthcare or medical industry experience.
  6. Knowledge of containerization (Docker) and CI/CD practices for data pipelines.
  7. Experience with data governance and data quality management frameworks.