
Data Engineer - Python, SQL, Databricks

Access Computer Consulting

Glasgow

Hybrid

GBP 100,000 - 125,000

Full time

Today

Job summary

A consulting firm is seeking a Data Engineer in Glasgow to work 3 days on-site and 2 days remote. The role involves developing tailored data solutions and requires extensive experience with SQL, Databricks, and Python, among other tools. Candidates should have a strong background in data engineering projects and the ability to collaborate across varying technical stacks. Apply ASAP to find out more.

Qualifications

  • Several years of experience supporting Software Engineering, Data Engineering, or Data Analytics projects.
  • Experience designing and implementing tailored data solutions.
  • Hands-on experience with data pipelines for structured, semi-structured, and unstructured data.

Responsibilities

  • Develop data solutions in a hybrid environment (on-premises and cloud).
  • Collaborate across diverse technical stacks.
  • Write ad-hoc and complex SQL queries for data analysis.

Skills

SQL / PLSQL
Databricks
Python
Pandas
NumPy
PySpark

Tools

Cloudera
Snowflake
Azure
AWS

Job description
Overview

I am recruiting for a Data Engineer to work in Glasgow 3 days a week on-site, 2 days remote.

The role falls inside IR35, so you will have to work through an umbrella company.

Requirements
  • You will have several years of experience supporting Software Engineering, Data Engineering, or Data Analytics projects.
  • You will have experience designing and implementing tailored data solutions to meet customer needs and use cases, spanning streaming, data lakes, analytics, and beyond within a dynamically evolving technical stack.
  • Experience in data development and solutions in highly complex data environments with large data volumes.
  • SQL / PLSQL experience with the ability to write ad-hoc and complex queries to perform data analysis.
  • Databricks experience is essential.
  • Experience developing data pipelines and data warehousing solutions using Python and libraries such as Pandas, NumPy, and PySpark.
  • You will be able to develop solutions in a hybrid data environment (on-premises and cloud).
  • You must be able to collaborate seamlessly across diverse technical stacks, including Cloudera, Databricks, Snowflake, Azure, and AWS.
  • Hands-on experience developing data pipelines for structured, semi-structured, and unstructured data, and experience integrating with their supporting stores (e.g. RDBMS, NoSQL databases, document databases, log files).

Please apply ASAP to find out more.
