Data Engineer Contractor (Python Pandas)

Data Intellect

Greater London

On-site

GBP 40,000 - 70,000

Full time

21 days ago

Job summary

A consultancy firm specializing in data solutions is seeking a talented individual to develop and maintain core data pipelines. This role involves collaborating with teams to deliver impactful data solutions, with a strong emphasis on Python and data manipulation skills. The ideal candidate will have experience in the Capital Markets sector and be comfortable working with diverse teams to manage multiple priorities. This is a contract position located in Greater London.

Qualifications

  • Solid experience writing clean, efficient, and scalable Python code.
  • Strong understanding of data frames and data cleaning using Pandas.
  • Experience working across all phases of the SDLC.
  • Familiarity with version control systems and CI/CD pipelines.
  • Experience in Capital Markets / Financial Services.

Responsibilities

  • Develop and maintain core data pipelines supporting FX BAU.
  • Collaborate with cross-functional teams for high-impact data solutions.
  • Engage with Front Office stakeholders in FICC asset classes.

Skills

Python
Data Manipulation with Pandas
Software Development Life Cycle (SDLC)
Version control systems (e.g. Git)
Unit testing and debugging
Capital Markets / Financial Services

Tools

Apache Hive
S3
Hadoop
Redshift
Spark
AWS
Apache Pig
NoSQL
Big Data
Data Warehouse
Kafka
Scala

Job description
Key Responsibilities
  • Develop and maintain core data pipelines supporting FX BAU and small enhancement projects
  • Collaborate with cross-functional teams to deliver high-impact data solutions
  • Contribute to both strategic initiatives and day‑to‑day operations
  • Work independently and as part of a team to manage multiple priorities and context‑switch effectively
  • Engage with Front Office stakeholders and datasets, particularly within FICC asset classes
Qualifications
Essential skills:
  • Proficiency in Python: solid experience writing clean, efficient and scalable Python code.
  • Data Manipulation with Pandas: strong understanding of data frames, data cleaning, transformation and analysis using the Pandas library (an illustrative sketch follows this list).
  • Software Development Life Cycle (SDLC): demonstrated experience working across all phases of the SDLC including requirements gathering, design, development, testing, deployment and maintenance.
  • Familiarity with version control systems (e.g. Git) and CI/CD pipelines.
  • Experience with unit testing and debugging Python applications.
  • Experience in Capital Markets / Financial Services.
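
For illustration only (hypothetical, not part of the role specification): the kind of Pandas data cleaning, transformation and unit testing described above might look roughly like the sketch below. The column names, sample data and helper function are invented for the example.

import pandas as pd

def clean_fx_trades(df: pd.DataFrame) -> pd.DataFrame:
    """Clean and aggregate a hypothetical FX trade DataFrame."""
    df = df.copy()
    # Normalise column names, then drop rows missing key fields
    df.columns = [c.strip().lower() for c in df.columns]
    df = df.dropna(subset=["trade_date", "currency_pair", "notional"])
    # Enforce date and numeric types
    df["trade_date"] = pd.to_datetime(df["trade_date"], errors="coerce")
    df["notional"] = pd.to_numeric(df["notional"], errors="coerce")
    # Aggregate total notional per currency pair per trade date
    return df.groupby(["trade_date", "currency_pair"], as_index=False)["notional"].sum()

def test_clean_fx_trades_drops_incomplete_rows():
    # Minimal pytest-style unit test: the row with a missing trade date is dropped
    raw = pd.DataFrame({
        "Trade_Date": ["2024-01-02", None],
        "Currency_Pair": ["EURUSD", "GBPUSD"],
        "Notional": [1_000_000, 2_000_000],
    })
    assert len(clean_fx_trades(raw)) == 1
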
Core Attributes
  • Self‑starter with excellent organisational skills.
  • Comfortable working independently and collaboratively.
  • Proven ability to manage multiple initiatives simultaneously.
  • Experience in fast‑paced environments with shifting priorities.
  • Strong communication skills and stakeholder engagement.
Preferred Experience
  • Front Office experience, particularly in FICC asset classes.
  • Exposure to FX datasets and trading environments.
  • Familiarity with agile methodologies and project delivery.

A little background on DI

Simply put, we turn big data problems into smart data solutions.

At our core, Data Intellect is a data and technology consultancy firm. Our key area of expertise is financial and capital markets technology solutions. However, the utility of these solutions allows us to apply fintech data expertise to other industries, such as smart energy and healthcare.

This proprietary offering is complemented by a wealth of experience in data engineering, electronic trading systems, data capture applications, regulatory and compliance systems, and middle and back‑office enterprise web solutions.

Fair employment and equal opportunities

Data Intellect is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. Accommodations are available on request throughout the assessment and selection process.

Welcome to Data Intellect.

#ChallengeAccepted

Remote Work: No

Employment Type: Contract

Experience: years

Vacancy: 1
