Python Developer

London | Hybrid | GBP 60,000 - 80,000 | Full time | Posted 16 days ago

Job summary

A leading company is seeking a skilled Python Developer specializing in data engineering and orchestration workflows within the financial services industry. The role involves building robust data pipelines, ensuring efficient orchestration across systems, and integrating with modern data platforms. The position is remote in the UK with one day a week in London, and suits a technically curious candidate who thrives in a collaborative environment.

Qualifications

  • Strong experience in Python for backend processing.
  • Hands-on experience with orchestration tools like Airflow and ADF.
  • Familiarity with Snowflake and Databricks.

Responsibilities

  • Develop and maintain data processing solutions in Python.
  • Implement orchestration workflows using tools like Apache Airflow.
  • Collaborate with teams to optimize data integration.

Skills

Python
Data Processing
Orchestration Tools
Kafka
Event Hub
Analytical Skills
Problem Solving

Tools

Apache Airflow
Azure Data Factory (ADF)
Control-M

Job description

Python Developer - Data Engineering & Orchestration

Remote in the UK (1 day a week in London)

Outside IR35

Rate negotiable

Must come from a financial services background

We are seeking a skilled Python Developer to join our data engineering team, focusing on data processing, automation, and orchestration workflows within a reference data platform.

This is a technically demanding role that involves building and maintaining robust data pipelines, ensuring seamless orchestration across systems, and integrating with modern data platforms.

Key Responsibilities:

  • Develop and maintain Python-based data processing solutions.
  • Design and implement orchestration workflows using tools such as Apache Airflow, Azure Data Factory (ADF), and Control-M (see the sketch after this list).
  • Collaborate with cross-functional teams to optimize data integration and transformation processes.
  • Work with data platforms including Databricks, Snowflake, and Exadata to manage and manipulate large-scale datasets.
  • Integrate and manage event-driven architectures using Kafka and Event Hub.
  • Contribute to automation and efficiency improvements across the reference data platform.
  • Participate in code reviews, testing, and documentation.


Key Skills & Experience:

  • Very strong Python skills, particularly for backend and data processing tasks.
  • Hands-on experience with orchestration tools such as Airflow, ADF, or Control-M.
  • Familiarity with data platforms such as Databricks, Snowflake, and Exadata.
  • Experience with event streaming technologies like Kafka and Event Hub (see the sketch after this list).
  • Bonus: Exposure to Streamlit or Power Apps for lightweight UI development.
  • Bonus: Knowledge of banking and securities reference data.
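
For illustration, a minimal sketch of the event-streaming experience listed above, assuming the kafka-python client; the topic, broker address, and consumer group are hypothetical placeholders.

    # Minimal Kafka consumer loop for an event-driven reference data feed.
    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        "reference-data-updates",            # hypothetical topic
        bootstrap_servers="localhost:9092",  # hypothetical broker
        group_id="ref-data-processors",      # hypothetical consumer group
        auto_offset_reset="earliest",
    )

    for message in consumer:
        # A real handler would validate each event and route it into the platform.
        print(message.topic, message.offset, message.value)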

Ideal Candidate Profile:

  • Technically curious with a passion for data-driven solutions.
  • Strong problem-solving and analytical skills.
  • Able to work both independently and as part of a collaborative team.
  • Experience in high-performance, regulated industries (e.g., financial services) is a plus.