Data Engineer

Harvey Nash Group

Glasgow

Hybrid

GBP 60,000 - 80,000

Full time

Yesterday

Job summary

A leading technology firm is looking for a contract Data Engineer in Glasgow. This hybrid role involves designing and optimizing data pipelines for Data Scientists and Traders, focusing on Python development on Databricks and Streamlit. The ideal candidate will have extensive data engineering experience and programming skills in Python and SQL, along with strong stakeholder management and communication skills. This position offers a day rate of up to £675 for a duration of 6 months.

Qualifications

  • Extensive experience programming in Python/PySpark and SQL.
  • Experience in data engineering for rapid processing of system data.
  • Understanding of data streaming concepts and ELT design patterns.

Responsibilities

  • Design, build, manage, and optimize data pipelines.
  • Deliver commercial value through collaboration with business stakeholders.
  • Promote understanding of data and analytics across stakeholders.

Skills

Python/PySpark
SQL
Data Engineering
Stakeholder Management
Communication Skills

Tools

Databricks
Azure
ADF
Streamlit

Job description

Data Engineer – Inside IR35 – Glasgow – Private Sector – Hybrid
Day Rate – up to £675
Duration – 6 months

Harvey Nash's client is looking to bring in a contract Data Engineer. You will be responsible for designing, building, managing, and optimizing data pipelines and the data model used by Data Scientists and Traders. You will work closely with Data Management teams on governance and security, with business stakeholders on projects, and with IT teams to deliver these data pipelines and models effectively into production. This role will focus on Python development on the Databricks and Streamlit platforms to deliver operational tools and process efficiencies.
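
As an illustration only, and assuming a Databricks-style PySpark environment, the kind of pipeline work described above might resemble the minimal sketch below; the table and column names (raw_trades, trade_ts, trader_id, notional, curated.trader_daily_summary) are hypothetical, not taken from the client's systems.

from pyspark.sql import SparkSession, functions as F

# Illustrative sketch only; all table and column names below are hypothetical.
spark = SparkSession.builder.appName("trader-analytics-pipeline").getOrCreate()

# Read raw system data (hypothetical source table).
raw = spark.read.table("raw_trades")

# Prepare and transform: derive a trade date and aggregate per trader and day.
daily = (
    raw.withColumn("trade_date", F.to_date("trade_ts"))
       .groupBy("trader_id", "trade_date")
       .agg(
           F.sum("notional").alias("total_notional"),
           F.count("*").alias("trade_count"),
       )
)

# Publish a curated table for Data Scientists and Traders to query or surface in Streamlit.
daily.write.mode("overwrite").saveAsTable("curated.trader_daily_summary")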

Other Responsibilities
  • Architect, create, improve, and operationalise integrated and reusable data pipelines, tools, and processes, measuring the performance, efficiency, and robustness of data provision and of best-practice process approaches.
  • Deliver commercial value by working with key business stakeholders, IT experts, and subject-matter experts to produce high-quality analytics and data science solutions.
  • Ensure that solutions and developments fit with the long-term Data Strategy for the analytics platform.
  • Explore the opportunities presented by the latest advances in technology and tools, and help ensure that an appropriate operational model exists to support solutions in production.
  • Promote a better understanding of data and analytics across business stakeholders.

Stakeholder Management & Communication
  • Possess outstanding communication skills and encourage collaboration across EM. Building strong relationships throughout the business is crucial to preserving the trust and respect necessary for the team to continue offering support and guidance.
  • Demonstrate strong communication and stakeholder management skills, with the ability to lead cross-functional projects and drive adoption of data quality practices.
  • Promote and champion the exceptional work being accomplished.

Skill/Experience Required
  • Extensive programming and query language experience, including Python/PySpark and SQL.
  • Data engineering experience in data preparation, transformation, and conversion to allow rapid processing of system data across a range of formats and structures.
  • Experience in Databricks, Azure, ADF, Streamlit or alternatives.
  • Excellent technical computing, analysis, design and development skills to a proven professional level.
  • Good understanding of data streaming concepts; design and analysis experience is desirable but not essential.
  • A good understanding of data modelling, ELT design patterns, data governance, and security best practices.
  • A problem‑solving mindset, curiosity, and adaptability - able to operate as a generalist across multiple data domains.
  • A strong background in delivering using cloud‑based Microsoft Azure Data and Analytics capabilities, Microsoft DevOps and Agile delivery.
  • Experience with designing, building, and operating analytics solutions using Azure cloud technologies.