
Data Scientist

83zero Limited

London

Remote

GBP 50,000 - 80,000

Full time

Job summary

A leading FinTech consultancy seeks a skilled Data Scientist to join its team on a fully remote basis. The role involves working on high-impact projects in cloud environments, using advanced analytics and data engineering techniques to derive insights from complex data sets. Candidates should have extensive experience with PySpark and cloud technologies and will contribute to innovative data solutions in the fast-evolving FinTech sector.

Qualifications

  • 5+ years' experience in PySpark
  • Strong coding skills in Python and SQL
  • 2+ years working in a cloud environment

Responsibilities

  • Support critical systems and client needs with data solutions.
  • Apply advanced analytics and data engineering techniques.
  • Transform datasets into actionable insights.

Skills

PySpark
Data Solutions
Cloud Technologies
Python
SQL

Tools

Apache Spark
Airflow
Google DataProc
AWS Glue
Apache Kafka
Snowflake

Job description

Data Scientist required for a fast-growing FinTech organisation.

Location: Fully Remote (UK-based)

Sector: FinTech

Duration: 6 Months+

The Opportunity:

Are you a skilled Data Scientist with a passion for cloud technologies and big data? Join a high-performing consultancy delivering cutting-edge data solutions to clients in the FinTech sector.

Your Role:

As a Data Scientist, you'll work on high-impact projects supporting critical systems and helping clients meet evolving regulatory demands. You'll apply advanced analytics and data engineering techniques to transform complex datasets into actionable insights, using the latest tools across cloud and big data ecosystems.

What You'll Bring:

  • 5+ years' experience in PySpark
  • 4+ years building scalable data solutions
  • 2+ years working in a cloud environment

Technical Expertise:

  • ETL/ELT tools: experience with at least two of Apache Spark, Airflow, Python Pandas, Google DataProc, Google Composer, AWS EMR, AWS Glue
  • Streaming technologies: one or more of Apache Kafka, AWS Kinesis, GCP Pub/Sub
  • Data warehousing/lakehouse: one or more of Snowflake, Starburst, Databricks, AWS Redshift/Athena/Glue, GCP BigQuery
  • Serverless frameworks: AWS Lambda/Step Functions, GCP Cloud Functions, or Azure Functions
  • Strong coding skills in Python and SQL
  • Bonus points for experience with Apache Beam

If this role is of interest to you, please send your CV to Richard Burton.
