Data Architect

Vallum Associates

City of London

On-site

GBP 45,000 - 60,000

Full time

Today

Job summary

A data solutions firm in the City of London is seeking a Data Engineer to design, develop, and maintain scalable data pipelines using SPARQL and Python. You will build graph-based applications and collaborate with data scientists to meet their data requirements while ensuring data quality. The ideal candidate holds a Bachelor's degree and has experience with graph databases and data pipeline tools.

Qualifications

  • Proven experience with SPARQL and Python programming.
  • Strong understanding of graph database technologies such as RDF, Neo4j, and GraphDB.
  • Knowledge of data pipeline tools and frameworks.

Responsibilities

  • Design, develop, and maintain scalable data pipelines using SPARQL and Python.
  • Create and optimize complex SPARQL queries for graph databases.
  • Collaborate with data scientists to meet data requirements.

Skills

SPARQL
Python
Data modeling
Analytical skills

Education

Bachelor's degree in Computer Science, Data Science, or related field

Tools

Apache Airflow
Luigi
Neo4j
GraphDB

Job description

Responsibilities

  • Design, develop, and maintain scalable data pipelines using SPARQL and Python to extract, transform, and load data into graph databases.
  • Create and optimize complex SPARQL queries to retrieve and analyze data from graph databases.
  • Develop graph-based applications and models to solve real-world problems and extract valuable insights from data.
  • Collaborate with data scientists and analysts to understand their data requirements and translate them into effective data pipelines and models.
  • Ensure data quality and integrity throughout the data pipeline process.
  • Stay up-to-date with the latest advancements in graph databases, data modeling, and programming.
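To illustrate the kind of work the first two responsibilities describe, here is a minimal, self-contained Python sketch: a toy in-memory triple store (triples as tuples, as in RDF) and a single pattern match of the sort a SPARQL `SELECT` would express. All names and data are hypothetical, and real pipelines would query an actual graph database rather than a Python set.

```python
# Illustrative sketch only: a toy triple store and the SPARQL pattern
# it corresponds to. All identifiers below are hypothetical examples.

# RDF-style (subject, predicate, object) triples.
triples = {
    ("ex:alice", "ex:worksOn", "ex:pipelineA"),
    ("ex:bob",   "ex:worksOn", "ex:pipelineA"),
    ("ex:alice", "ex:role",    "ex:engineer"),
}

def match(pattern, store):
    """Return triples matching a single (s, p, o) pattern.
    None acts as a wildcard, like a SPARQL variable such as ?s."""
    s, p, o = pattern
    return [
        t for t in store
        if (s is None or t[0] == s)
        and (p is None or t[1] == p)
        and (o is None or t[2] == o)
    ]

# Roughly equivalent SPARQL: SELECT ?s WHERE { ?s ex:worksOn ex:pipelineA }
workers = sorted(t[0] for t in match((None, "ex:worksOn", "ex:pipelineA"), triples))
print(workers)  # ['ex:alice', 'ex:bob']
```

In practice the same pattern would be sent as a SPARQL query string to a triple store's endpoint; the wildcard-matching logic here only mirrors what the query engine does.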

Qualifications

  • Bachelor's degree in Computer Science, Data Science, or a related field.
  • Proven experience with SPARQL and Python programming.
  • Strong understanding of graph database technologies (e.g., RDF, Neo4j, GraphDB).
  • Experience with data modeling and schema design.
  • Knowledge of data pipeline tools and frameworks (e.g., Apache Airflow, Luigi).
  • Excellent problem-solving and analytical skills.
  • Ability to work independently and as part of a team.
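The pipeline tools named above (Apache Airflow, Luigi) orchestrate tasks as a dependency graph. As a rough sketch of that idea, and not of either tool's actual API, Python's standard-library `graphlib` can order hypothetical extract/transform/load stages:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline stages and their upstream dependencies, in the
# spirit of orchestrators like Apache Airflow or Luigi (sketch only).
dag = {
    "load":      {"transform"},  # load runs after transform
    "transform": {"extract"},    # transform runs after extract
    "extract":   set(),          # extract has no dependencies
}

# A valid execution order that respects every dependency.
order = list(TopologicalSorter(dag).static_order())
print(order)  # ['extract', 'transform', 'load']
```

Real orchestrators add scheduling, retries, and monitoring on top of this core idea of topologically ordering a task graph.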