Data Engineer

Access Computer Consulting

Glasgow

Hybrid

GBP 50,000 - 70,000

Full time

Today

Job summary

A leading technology consulting firm is seeking a Data Engineer based in Glasgow for a hybrid work model. The ideal candidate will have extensive experience in developing data pipelines and data warehousing solutions using Python, alongside strong expertise in cloud services, particularly Databricks. This role involves collaboration with cross-functional teams to design scalable ETL processes. Apply ASAP if interested.

Qualifications

  • Several years of experience developing data pipelines and data warehousing solutions using Python.
  • Hands-on experience with cloud services, especially Databricks.
  • Expertise in ETL processes.

Responsibilities

  • Collaborate with cross-functional teams to understand data requirements.
  • Design efficient, scalable and reliable ETL processes using Python and Databricks.
  • Develop and deploy ETL jobs that extract data from various sources.

Skills

Data pipeline development
Data warehousing solutions
Python
ETL processes
Cloud services (Databricks)
Snowflake or similar

Job description

I am recruiting for a Data Engineer to be based in Glasgow three days a week, with two days remote.

The role falls inside IR35 so you will need to work through an umbrella company for the duration of the contract.

You must have several years of experience developing data pipelines and data warehousing solutions using Python and libraries such as Pandas, NumPy, PySpark, etc.

You will also have several years' hands-on experience with cloud services, especially Databricks, for building and managing scalable data pipelines.

ETL process expertise is essential.

Proficiency in working with Snowflake or similar cloud-based data warehousing solutions is also essential.

Experience in data development and solutions in highly complex data environments with large data volumes is also required.

You will be responsible for collaborating with cross-functional teams to understand data requirements, and for designing efficient, scalable, and reliable ETL processes using Python and Databricks.

You will also develop and deploy ETL jobs that extract data from various sources and transform it to meet business needs.

Please apply ASAP if this is of interest.
