Senior Data Engineer

International Rescue Committee

London

Hybrid

GBP 40,000 - 70,000

Full time

Job summary

A humanitarian organization is seeking a skilled Data Engineer to enhance data capabilities. The role involves building, maintaining, and optimizing data systems, collaborating with data scientists, and applying engineering practices to ensure data quality. Ideal candidates will possess a strong background in data analytics, experience with tools such as Airflow and dbt, and the ability to communicate complex concepts effectively.

Qualifications

  • 4+ years in data and analytics.
  • Proficient in manipulating large-scale data with Python and SQL.
  • Experience with cloud data warehouses.

Responsibilities

  • Support the workflow of the ER data model.
  • Collaborate with analysts and stakeholders.
  • Automate ML pipelines from data prep to deployment.

Skills

Data pipeline development
Machine learning operations (MLOps)
Python
SQL
Attention to detail
Excellent communication

Tools

Airflow
dbt
MLflow
Snowflake
Databricks
Power BI

Job description

The International Rescue Committee (IRC) responds to the world's worst humanitarian crises, helping to restore health, safety, education, economic wellbeing, and power to people devastated by conflict and disaster. Founded in 1933 at the call of Albert Einstein, the IRC is one of the world's largest international humanitarian non-governmental organizations (INGO), operating in more than 40 countries and 29 U.S. cities, helping people to survive, reclaim control of their future, and strengthen their communities. A force for humanity, IRC employees deliver lasting impact by restoring safety, dignity, and hope to millions. If you're a solutions-driven, passionate change-maker, come join us in positively impacting the lives of millions of people worldwide for a better future.

BACKGROUND

Over the past 90 years, the IRC has developed unparalleled expertise in responding to emergencies and helping uprooted communities rebuild. Founded in 1933 at the request of Albert Einstein, the IRC offers lifesaving care and life-changing assistance to refugees fleeing war or disaster. The IRC is active in more than 40 countries, providing emergency relief, relocating refugees, and rebuilding lives after disasters.

The IRC is committed to a culture of bold leadership, innovation, creative partnerships, and accountability to those we serve. We are tireless advocates for the most vulnerable.

IRC UK

Part of the IRC global network with headquarters in New York, our UK team raises awareness, delivers policy and practice change, and increases funding to help restore health, safety, education, economic wellbeing, and power to people affected by conflict and disaster. Since 2021, IRC UK has also provided integration services directly to refugees in England. The IRC also has European offices in Berlin, Bonn, Brussels, Geneva, and Stockholm.

The Purpose of the Role

The External Relations (ER) department, created in February 2020, comprises three functions: Private Fundraising, Communications, and Policy & Advocacy. We are currently executing a 5-year global strategy to enhance IRC’s capabilities in private income, advocacy, and brand awareness. The department’s main goal is to ensure the organization has the resources to serve 18 million people worldwide, influence humanitarian policies, and strengthen IRC’s reputation.

We seek a skilled Data Engineer to join our analytics team, which includes data scientists and analysts. You will leverage your expertise in analytics engineering, MLOps, infrastructure design, and deployment to build, maintain, and optimize data systems and tools supporting our data pipelines, ML workflows, and business intelligence reporting.

You will play an active role in scaling IRC’s internal data capabilities as data volumes and complexity grow.

Major Responsibilities:

  • Support the workflow of the ER data model: data pipeline development, ELT performance, data loading, and model maintenance through monitoring, testing, and automation.
  • Collaborate with analysts, data scientists, and stakeholders to develop integrated, production-quality, reusable data models in SQL using dbt, ensuring data quality.
  • Work with data scientists to automate ML pipelines from data prep to deployment, including MLflow workflows for tracking, versioning, and deployment.
  • Apply software engineering practices to ensure data quality and standardization, including testing and documentation, and contribute to code reviews.
  • Identify and implement process improvements, redesign infrastructure for scalability, and automate manual processes.
  • Mentor analysts and improve analytics engineering practices across ER analytics.
  • Develop and maintain conceptual data models, documentation, diagrams, prototypes, and change notices.
  • Collaborate with engineering, analysts, and business users to implement new data pipelines and infrastructure improvements.
  • Partner with leadership to evaluate data stack improvements.
  • Support other analytics tasks as needed.
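To give candidates a concrete feel for the data-quality work described above: in dbt, models ship with schema tests such as `not_null` and `unique` that run automatically against warehouse tables. The sketch below mimics those two checks in plain Python against a local SQLite database; the `donors` table and its columns are hypothetical examples, not IRC's actual schema, and in production these checks would be dbt tests against Snowflake or Databricks.

```python
import sqlite3

# Hypothetical warehouse table standing in for a real dbt model.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE donors (donor_id INTEGER, email TEXT)")
conn.executemany(
    "INSERT INTO donors VALUES (?, ?)",
    [(1, "a@example.org"), (2, "b@example.org"), (3, "c@example.org")],
)

def not_null(conn, table, column):
    """Analogue of dbt's not_null test: the column must contain no NULLs."""
    (bad,) = conn.execute(
        f"SELECT COUNT(*) FROM {table} WHERE {column} IS NULL"
    ).fetchone()
    return bad == 0

def unique(conn, table, column):
    """Analogue of dbt's unique test: the column must contain no duplicates."""
    (bad,) = conn.execute(
        f"SELECT COUNT(*) FROM (SELECT {column} FROM {table} "
        f"GROUP BY {column} HAVING COUNT(*) > 1)"
    ).fetchone()
    return bad == 0

# Both checks pass on the clean table.
assert not_null(conn, "donors", "donor_id")
assert unique(conn, "donors", "donor_id")
```

In dbt itself, the same checks are declared in a model's YAML rather than written by hand, and a scheduler such as Airflow runs them on every pipeline refresh.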

Person Specification

Skills and Competencies:

  • Curiosity to explore complex problems and deliver structured solutions.
  • 4+ years in data and analytics.
  • 2+ years manipulating large-scale data with Python and SQL.
  • Experience with data pipeline tools (e.g., Airflow, dbt), dependency management, schema design, and dimensional modeling.
  • Experience with ML model management tools like MLflow.
  • 2+ years working with cloud data warehouses (Snowflake, Databricks, BigQuery, Redshift, Azure Synapse).
  • Knowledge of the modern data stack and data ops best practices.
  • Strong attention to detail and ability to work independently.
  • Excellent communication skills to translate technical concepts for non-technical stakeholders.
  • Desire to work in a multicultural environment.
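Dimensional modeling, mentioned in the bullets above, typically means organising warehouse tables into fact tables (one row per event) and dimension tables (one row per entity), joined on surrogate keys. A minimal illustrative sketch using SQLite follows; the table and column names are hypothetical examples, not IRC's schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Dimension table: one row per donor, holding descriptive attributes.
conn.execute(
    "CREATE TABLE dim_donor (donor_key INTEGER PRIMARY KEY, country TEXT)"
)
# Fact table: one row per donation event, referencing the dimension key.
conn.execute(
    "CREATE TABLE fct_donation "
    "(donation_id INTEGER, donor_key INTEGER, amount REAL)"
)
conn.executemany("INSERT INTO dim_donor VALUES (?, ?)", [(1, "UK"), (2, "DE")])
conn.executemany(
    "INSERT INTO fct_donation VALUES (?, ?, ?)",
    [(10, 1, 50.0), (11, 1, 25.0), (12, 2, 100.0)],
)

# Typical analytical query: aggregate the facts, sliced by a dimension attribute.
rows = conn.execute(
    "SELECT d.country, SUM(f.amount) FROM fct_donation f "
    "JOIN dim_donor d ON f.donor_key = d.donor_key "
    "GROUP BY d.country ORDER BY d.country"
).fetchall()
print(rows)  # [('DE', 100.0), ('UK', 75.0)]
```

The same star-schema shape is what dbt marts usually materialise, with BI tools such as Power BI querying the fact/dimension joins directly.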

Nice-to-Haves:

  • Familiarity with Salesforce or similar CRM.
  • Experience with dbt in high-growth organisations, including deploying utility packages (e.g., dbt-utils), tests, snapshots, and incremental models.
  • Experience with Snowflake and Databricks.
  • Experience with Power BI, Power Query, and DAX/MDX.
  • Knowledge of infrastructure-as-code (Terraform, CloudFormation) and CI/CD pipelines for ML/AI.
  • Experience with Apache Spark or Kafka is a plus.

Working Environment:

Standard office environment, with potential remote work. UK candidates must have the right to work in the UK.

Application Deadline: 2nd September 2025

IRC UK is an equal opportunities employer committed to diversity and inclusion. We welcome applications from all backgrounds, including refugees and underrepresented groups. Reasonable adjustments are available for applicants with disabilities.
