Data Engineer (Databricks Contractor)

iO Associates

City of Edinburgh

Hybrid

GBP 100,000 - 125,000

Full time

7 days ago

Job summary

A tech consulting firm is looking for an experienced Databricks Engineer to design and deliver strategic data solutions across enterprise platforms. This role involves evaluating Databricks, migrating data flows, and implementing data governance practices. Candidates should have strong skills in Databricks, Apache Spark, and Agile methodologies. The position is hybrid, with 3 days a week onsite in Edinburgh, Newcastle, or London.

Qualifications

  • Strong hands-on experience with Databricks and Apache Spark.
  • Proficiency in Python, SQL, and PySpark.
  • Solid understanding of data governance and enterprise architecture.

Responsibilities

  • Lead Databricks platform evaluation and performance benchmarking.
  • Collaborate with Databricks Professional Services for architectural assurance.
  • Design scalable data models and integration patterns.
  • Support enterprise integration with Salesforce data.
  • Migrate Azure Data Factory flows to Databricks.
  • Implement Unity Catalog for automated data lineage.
  • Deliver backlog items through Agile sprint planning.

Skills

Databricks
Fabric
Apache Spark
Delta Lake
Python
SQL
PySpark
Azure Data Factory
Event Hub
Unity Catalog
Data governance
Enterprise architecture
Effective communication

Job description

We're seeking an experienced Databricks Engineer to help design and deliver strategic data solutions across enterprise platforms. This role will support both proof-of-concept work and the development of a Strategic Operational Data Store (ODS), with a focus on performance, scalability, and governance. The role is onsite 3 days a week in Newcastle, London, or Edinburgh, and falls outside IR35.

Key Responsibilities
  • Lead Databricks platform evaluation, including performance benchmarking and stress testing
  • Collaborate with Databricks Professional Services for architectural assurance
  • Design and document scalable data models and integration patterns
  • Support enterprise integration, including Salesforce data and unified client ID strategies
  • Migrate Azure Data Factory flows to Databricks for improved traceability
  • Implement Unity Catalog for automated data lineage
  • Deliver backlog items through Agile sprint planning
Skills & Experience
  • Strong hands-on experience with Databricks, Fabric, Apache Spark, Delta Lake
  • Proficient in Python, SQL, and PySpark
  • Familiar with Azure Data Factory, Event Hub, Unity Catalog
  • Solid understanding of data governance and enterprise architecture
  • Effective communicator with experience engaging stakeholders
Desirable
  • Experience with Salesforce data integration
  • Prior involvement in platform PoCs or benchmarking
  • Knowledge of Strategic ODS and enterprise integration strategies