
Core Data Engineer - London

Flow Traders

London

On-site

GBP 50,000 - 90,000

Full time

30+ days ago

Job summary

An established industry player is seeking a Core Data Engineer to enhance its core data products. This role offers a unique opportunity to shape the data landscape and drive strategic growth initiatives. You will be responsible for the timely, high-quality delivery of data products, managing their end-to-end lifecycle, and refining data models. The ideal candidate will have excellent programming skills, particularly in Python, and experience with big data solutions. Join a dynamic team where your contributions will have a significant impact on the organization and help drive its future success.

Qualifications

  • Excellent programming skills, preferably in Python.
  • Experience with Linux, SQL, and relational databases.

Responsibilities

  • Deliver high-quality core data products across the organization.
  • Develop and manage the end-to-end lifecycle of data products.

Skills

Programming skills
Python
Linux
SQL
Version control systems
Working with large datasets
Interpersonal skills
Project management

Tools

Airflow
Kafka
GCP
Pandas
Spark
Apache Beam
Apache NiFi

Job description

Flow Traders is looking for a Core Data Engineer to own the end-to-end lifecycle of our most fundamental data products. As the architect and guardian of our most valuable data, you will have a unique opportunity to make your mark on Flow Traders. High-quality data products offer substantial benefits that impact every division in our firm, playing a vital role in delivering on our strategic growth initiatives. You will be accountable for delivering these data products at an exceptional level of quality, ensuring that they are not only timely and pristine, but powerful enough to drive Flow Traders' future growth.

What you will do

  1. Be accountable for timely and high-quality delivery of core data products to the rest of the organization
  2. Develop and own the end-to-end lifecycle of these data products
  3. Define and manage project plans to assume ownership of these data products
  4. Create and refine a common data model (esp. identifiers, normalized schemas, use of master data) across these data products
  5. Investigate currently available sources of data and ensure that Flow extracts maximum information and value from them
  6. Explore, evaluate, and onboard additional sources of available data (both independently and at the request of stakeholders)
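Point 4 above mentions a common data model built around identifiers, normalized schemas, and master data. As a purely illustrative sketch (not Flow Traders' actual approach; all source names, identifiers, and records below are hypothetical), this kind of normalization can be as simple as mapping each vendor-specific key onto one canonical master identifier:

```python
# Illustrative sketch only: mapping vendor-specific instrument identifiers
# onto a single master identifier, the kind of normalization a common data
# model requires. All names and records here are hypothetical.

# Hypothetical master data: one canonical ID per instrument.
MASTER_IDS = {
    ("vendor_a", "AAPL.OQ"): "INSTR-0001",
    ("vendor_b", "US0378331005"): "INSTR-0001",  # same instrument, different vendor key
    ("vendor_a", "VOD.L"): "INSTR-0002",
}

def normalize(records):
    """Rewrite each record's vendor-specific key to the master identifier."""
    out = []
    for rec in records:
        key = (rec["source"], rec["symbol"])
        master = MASTER_IDS.get(key)
        if master is None:
            continue  # in practice, unmapped records would go to a review queue
        out.append({"instrument_id": master, "price": rec["price"]})
    return out

raw = [
    {"source": "vendor_a", "symbol": "AAPL.OQ", "price": 189.5},
    {"source": "vendor_b", "symbol": "US0378331005", "price": 189.6},
]
print(normalize(raw))
```

In a real data product the master-data table would live in a database and be versioned, but the core idea is the same: every downstream consumer sees one identifier per instrument, regardless of source.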

What you need to succeed

  1. Excellent programming skills (any language, Python preferred)
  2. Working proficiency with Linux, SQL and relational databases, and version control systems
  3. Prior experience working in a trading environment
  4. Demonstrated ability to work with large datasets and familiarity with big data solutions
  5. Experience with Airflow, Kafka, and GCP (esp. GKE, Dataproc, and BigQuery) highly desired
  6. Experience with Pandas, Spark, Apache Beam, Apache NiFi a plus
  7. Innate drive to generalize, systematize, and automate
  8. Strong interpersonal skills; works well with both technical and non-technical stakeholders
  9. Ability to balance individual stakeholder demands to preserve the quality and wide applicability of data
  10. Organized and able to define, manage, and execute projects (often several in parallel) while effectively triaging operational concerns
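The Python, SQL, and relational-database proficiency listed in points 1-2 above might be exercised in code like the following sketch. It uses an in-memory SQLite database for self-containment; the schema and figures are hypothetical, and a real data product would of course read from a production database:

```python
# Illustrative sketch only: a small SQL-plus-Python aggregation of the kind
# implied by points 1-2 above. Schema and data are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (instrument_id TEXT, qty INTEGER, price REAL)")
conn.executemany(
    "INSERT INTO trades VALUES (?, ?, ?)",
    [
        ("INSTR-0001", 100, 189.5),
        ("INSTR-0001", 50, 189.6),
        ("INSTR-0002", 200, 1.02),
    ],
)

# Aggregate notional value per instrument, ordered for stable output.
rows = conn.execute(
    """
    SELECT instrument_id, SUM(qty * price) AS notional
    FROM trades
    GROUP BY instrument_id
    ORDER BY instrument_id
    """
).fetchall()
print(rows)
```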

Please note that we do not offer sponsorship for our London office.