ETL Data Engineer - Talend - 100% remote - Outside IR35

Berkeley Square IT

Coventry

Remote

GBP 40,000 - 80,000

Full time

28 days ago

Job summary

An innovative company is looking for a skilled ETL Data Engineer to join their remote team. This role focuses on leveraging Talend for ETL processes and ensuring data quality across various platforms. You'll work with cutting-edge technologies such as Snowflake and various databases, contributing to the design and implementation of data models that support business intelligence and analytics. If you are passionate about data engineering and want to work in a flexible environment that values your expertise, this opportunity is perfect for you. Join a forward-thinking team and make an impact in the world of data!

Qualifications

  • Experience with ETL tools such as Talend, together with data quality management.
  • Strong skills in SQL and programming languages such as Python and Java.

Responsibilities

  • Develop and manage ETL processes using Talend for data integration.
  • Ensure data quality and perform data modelling for data warehouses.

Skills

Talend
ETL processes
Data Quality
Snowflake
Database management
Data modelling
Job Scheduling
PL/SQL
SQL
Python
Java

Tools

AWS
Azure
GCP
Jira
Confluence
Gitlab
Jenkins
Ansible
Pentaho
Power BI

Job description

My client is seeking an experienced Data Engineer with expertise in Talend for ETL processes and data quality.

This role is 100% remote and sits outside IR35.

Must have technologies:

  • Experience in an ETL toolset (Talend, Pentaho, SAS DI, Informatica, etc.)
  • Snowflake
  • Experience in a Database (Oracle, RDS, Redshift, MySQL, Hadoop, Postgres, etc.)
  • Experience in data modelling (Data Warehouse, Marts)
  • Job Scheduling toolset (Job Scheduler, TWS, etc.)
  • Programming and scripting (PL/SQL, SQL, Unix shell, Java, Python, Hive, HiveQL, HDFS, Impala, etc.)

Good to have:

  • Data virtualisation tools (Denodo)
  • Reporting (Pentaho BA, Power BI, Business Objects)
  • Data Analytics toolset (SAS Viya)
  • Cloud (AWS, Azure, GCP)
  • ALM Tooling (Jira, Confluence, Bitbucket)
  • CI/CD toolsets (Gitlab, Jenkins, Ansible)

CVs to Nick ASAP for immediate review.

Required Skills:

Informatica, Hive, Data Quality, Business Objects, Hadoop, SAS, Ansible, Bitbucket, Gitlab, PL/SQL, Data Analytics, Confluence, Unix, Power BI, Jenkins, Oracle, Jira, MySQL, Job Scheduling, Java, Python, SQL.
