London
On-site
GBP 50,000 - 75,000
Full time
30+ days ago
Job summary
A leading company in data solutions is looking for a developer to enhance their data pipelines and contribute to the ongoing development of a Data Lake technology. This role requires hands-on experience with various data integration technologies and programming languages, along with a collaborative approach within an Agile framework. Ideal candidates will possess strong analytical skills and experience in cloud environments.
Qualifications
- Experience with data solution BAU processes (ETL, table refresh).
- Experience with data integration and Big Data technologies.
- Strong programming skills in Python or Scala.
Responsibilities
- Deliver hands-on development on Nexus Platform.
- Monitor and refresh daily data pipelines.
- Build new data pipelines and enhance existing BAU processes.
Skills
Data solution BAU processes
Data integration from multiple sources
Big Data technologies
Python
Scala
Analytical skills
CI/CD
Linux scripting
Tools
You will deliver
- Hands-on development on the Nexus Platform
- Monitoring daily BAU data pipelines and ensuring our data solution is refreshed every day
- Enhancing the daily BAU process, making it easier to monitor and less likely to fail, plus hands-on development on Data Lake builds, changes, and defect fixes
- Building new data pipelines using existing frameworks and patterns
- Working with the team in an Agile framework, using Agile tooling that controls our development and CI/CD release processes
- Contributing to the new Data Lake technology across the organisation to address a broad set of use cases across data science and data warehousing
Skills and Experience
Essential
- Experience with data solution BAU processes (ETL, table refresh, etc.)
- Experience with integration of data from multiple data sources
- Experience with Big Data integration technologies such as Spark, Scala, and Kafka
- Experience in a programming language such as Python or Scala
- Experience using AWS, DBT, and Snowflake
- Analytical and problem-solving skills applied to data solutions
- Experience of CI/CD
- Good grasp of multi-threading and concurrency concepts
- Familiarity with the fundamentals of Linux shell scripting
Desirable
- Experience of ETL technologies
- AWS exposure (Athena, Glue, EMR, Step Functions)
- Experience of Snowflake and DBT
- Previous proficiency with ETL tools (e.g. Talend, Informatica, Ab Initio)
- Previous exposure to Python
- Previous ownership of data solution BAU monitoring and enhancement
- Exposure to building applications for a cloud environment
We work with Textio to make our job design and hiring inclusive.
Permanent