Data Integration Developer

JD GROUP

Bury

On-site

GBP 45,000 - 60,000

Full time

18 days ago

Job summary

A leading data solutions company in the United Kingdom is seeking a Data Integration Developer to work with a variety of tools in a cloud-based environment. Candidates should have strong skills in SQL and Python, as well as experience in cloud data solutions. This role involves analyzing large data sets, automating pipelines, and potentially specializing in areas such as infrastructure or machine learning. Great opportunity for career growth in a dynamic environment.

Qualifications

  • 1-2 years of experience in cloud data solutions.
  • Excellent level of experience in SQL and Python.
  • Experience with PySpark or equivalent is a plus.

Responsibilities

  • Automate and maintain pipelines within a cloud environment.
  • Analyze large data sets using tools like Python and SQL.
  • Create and maintain data processes for analysis.

Skills

SQL
Python
Cloud data solution experience
PySpark coding
Problem-solving skills
Attention to detail
Stakeholder management
Communication skills

Tools

GCP

Job description
Data Integration Developer

Role overview: Working with a wide array of tools, you will manage the full streaming, enrichment, and curation of data into a cloud-based data environment, verifying data and ensuring that links to other key data sets are maintained to enable simple, effective data analysis for our insight team and data scientists.

The role will also allow you to explore whether you would like to specialise in a specific field such as infrastructure, curation, or ML/AI to further progress your career.

Responsibilities

Assisting in the automation and maintenance of pipelines within a cloud-based environment, the candidate will also be involved in sourcing data using a range of methods, while verifying that the data is acceptable for ingestion.

  • Analysis of large data sets using tools such as Python & SQL
  • Setting up new pipelines for the full stream/enrichment/curation process
  • Upkeep of source code locations
Role Objectives & KPIs
  • Analysis of large data sets using tools such as Python & SQL
  • Creation of the stream/enrichment/curation process utilising a wide variety of data sources
  • Upkeep of source code locations/GitHub repositories
  • Setup of tables/views/procedures
  • Data aggregation & manipulation
  • Building of large scale analytical data sets
  • Investigation of new/alternative technology
Competencies & Behaviours
  • Deciding and Initiating Action
  • Working with People
  • Relating and Networking
  • Applying Expertise and Technology
  • Learning and Researching
  • Planning and Organising
  • Achieving Personal Work Goals and Objectives
Skills & Experience
  • An excellent level of experience in tools such as SQL / Python
  • Ability to decide on the overall concept and vision
  • Creating road maps
  • 1-2 years of cloud data solution experience in GCP
  • Experience of PySpark coding or equivalent
  • Excellent problem-solving skills
  • Strong attention to detail
  • Strong stakeholder management
  • Strong communication skills