Data Engineer

iOpt

Glasgow

Hybrid

GBP 35,000 - 55,000

Full time

4 days ago

Job summary

iOpt, a Glasgow-based technology company focused on improving housing conditions, is seeking a skilled Data Engineer to join its Data & Analytics team. The role involves developing scalable data pipelines and ensuring data quality and availability for analytics and machine learning. The ideal candidate has strong experience with AWS and SQL, and a collaborative approach to problem-solving. The position offers a hybrid working model with generous benefits, including medical cover and 28 days of holiday.

Benefits

Generous Holiday Allowance
Vitality Medical Cover
Flexible Working

Qualifications

  • Proven experience as a Data Engineer or similar role.
  • Ability to work collaboratively in a cross-functional team.

Responsibilities

  • Design, build, and maintain reliable data pipelines and ETL processes.
  • Collaborate with Data Scientists and Analysts to understand data requirements.
  • Implement and manage data warehousing solutions.
  • Develop APIs and data services.
  • Monitor and troubleshoot data systems.

Skills

Problem-solving skills
Attention to detail
Collaboration in cross-functional teams

Tools

AWS
Databricks
PostgreSQL
Apache Superset
Python
SQL

Job description

About Us

At iOpt, we are dedicated to improving housing conditions, reducing energy costs, and supporting tenants struggling with fuel poverty. Using cutting-edge sensor technology, we empower asset managers with data visibility to make informed decisions and protect property portfolios. With a commitment to innovation and sustainability, iOpt partners with housing providers, landlords, and property managers to create smarter, more sustainable living and working spaces.

Overview

We are looking for a skilled Data Engineer to join our Data & Analytics team. You will work alongside our Data Scientist and Data Analyst, supporting the development of scalable data pipelines, ensuring data quality and availability, and enabling efficient access to data for analytics and machine learning solutions.

Responsibilities
  • Design, build, and maintain reliable data pipelines and ETL processes.
  • Collaborate with Data Scientists and Analysts to understand data requirements and ensure accessibility and quality.
  • Implement and manage data warehousing solutions (e.g., Databricks, Redshift).
  • Develop APIs and data services to expose and consume data securely.
  • Ensure data governance, security, and compliance best practices are followed.
  • Optimize workflows for performance and cost (e.g., tuning queries, scaling clusters, caching).
  • Monitor and troubleshoot data systems to maintain high uptime and data accuracy.
  • Automate data quality validation, monitoring, and alerting using modern observability tools.
  • Support the development and maintenance of dashboards by ensuring underlying data models and tables are accurate, performant, and well-documented.
  • Build and support machine learning pipelines and model deployment.

About You
  • Proven experience as a Data Engineer or similar role.
  • Strong problem-solving skills and attention to detail.
  • Ability to work collaboratively in a cross-functional team.

Technical Skills
  • Cloud Platform:
    • AWS – EC2, IAM, CloudWatch, VPCs, etc.
  • Object Storage:
    • AWS S3 – lifecycle policies, data partitioning, fine-grained access controls
  • Big-Data Processing:
    • Databricks (Apache Spark) – building scalable ETL pipelines, Spark SQL, Delta Lake
  • Relational Database:
    • PostgreSQL – schema design, performance tuning, replication/backups
  • BI & Visualization:
    • Apache Superset – designing, deploying and maintaining interactive dashboards and data exploration tools
  • Programming & Query Languages:
    • Python (pandas, PySpark)
    • SQL (advanced window functions, CTEs, optimisation)

Desirable / Nice-to-Have
  • Knowledge of, and experience implementing, dbt.
  • Exposure to geospatial data or IoT datasets.
  • Background in DevOps, CI/CD, or infrastructure as code.
  • Experience with dashboarding tools (e.g., Power BI, Looker, Tableau).

Perks and Benefits
  • Generous Holiday Allowance: 28 days of holiday plus bank holidays, with the flexibility to use them whenever suits you best.
  • Vitality Medical Cover: Comprehensive health coverage to support your well-being and provide peace of mind.
  • Flexible Working: Hybrid model with the freedom to balance work between home and office, accommodating your lifestyle.

What We Value
  • Purpose-Driven Impact: A passion for using data and technology to create meaningful change, supporting people in social housing to live in healthier, more comfortable homes.
  • Innovation and Curiosity: A drive to solve complex challenges through innovative thinking and continuous learning.
  • Collaboration and Inclusivity: A commitment to fostering a supportive and inclusive work culture where everyone thrives.
  • Tech for Good Ethos: A shared belief in the power of technology to improve lives and contribute to a better society.