Data Developer

SPX Flow

England

On-site

GBP 50,000 - 70,000

Full time

Today

Job summary

An innovative technology company in the UK is looking for a candidate to create and maintain semantic data models and ontologies aligned with W3C standards. The role involves developing SHACL-based validation frameworks and building DCAT3-compliant metadata services. Ideal candidates will have advanced knowledge of RDF and strong experience with SPARQL and linked data design. The position demands a robust understanding of semantic web technologies and offers a dynamic work environment.

Job description

Responsibilities

  • Design and maintain semantic data models and ontologies aligned with W3C standards.
  • Develop SHACL-based validation frameworks and automated testing approaches.
  • Create DCAT3-compliant metadata services and build/extend SKOS vocabularies.
  • Implement and optimise Triple Pattern Fragments for high-performance data access.
  • Translate domain requirements into robust RDF models and modelling strategies.
  • Build tools and pipelines for data harmonisation, transformation, and validation.
  • Design versioning approaches for evolving vocabularies and datasets.
  • Develop hypermedia APIs following Hydra principles.
  • Work with geospatial datasets using WKT and relevant geographic vocabularies.
  • Write, optimise, and maintain SPARQL queries and SQL-to-RDF transformation logic.
  • Produce clear technical documentation, diagrams, and architectural guides.

Deliverables

  • SHACL validation schemas with accompanying automated tests
  • Semantic data models, ontologies, and vocabulary management structures
  • DCAT3 metadata service components and SKOS vocabularies
  • Triple Pattern Fragment endpoints and Hydra-driven APIs
  • Tools for geospatial processing and data harmonisation
  • Optimised SPARQL queries and transformation scripts
  • Architecture documentation and engineering guidance

Qualifications

  • Advanced knowledge of RDF, RDFS, OWL, SHACL, and broader semantic web technologies
  • Strong experience with SKOS, DCAT3, SOSA/SSN, and Hydra
  • Proficiency in SPARQL, SQL, data validation frameworks, and linked data design
  • Ability to convert domain requirements into scalable semantic models
  • Experience working across distributed or federated linked data architectures

Desirable Skills

  • Triple Pattern Fragments, JSON-LD APIs, performance tuning for linked data endpoints
  • Familiarity with OGC standards and geospatial/environmental data formats
  • Experience with Python (e.g., FastAPI), Docker, Git, CI/CD workflows