Job Search and Career Advice Platform


Remote Data Engineer – SC Cleared (AWS, ETL)

Fynity

Remote

GBP 59,000 - 70,000

Full time

Yesterday


Job summary

A dynamic digital transformation consultancy is looking for a Data Engineer to deliver innovative solutions for government clients. You will be involved in designing robust ETL pipelines using technologies like Apache Kafka, Spark, and AWS services. The role requires proven data engineering experience, including expertise in Python and CI/CD workflows. This position offers remote work flexibility with occasional visits to London. Join us to make a significant impact in a fast-paced environment.

Qualifications

  • Proven hands-on experience in data engineering projects.
  • Strong hands-on experience designing, implementing, and debugging ETL pipelines.
  • Expertise in Python, PySpark, and SQL.
  • Expertise with Spark and Airflow.
  • Experience designing data pipelines using cloud-native services on AWS.
  • Extensive knowledge of AWS services such as API Gateway, Lambda, Redshift, Glue, and CloudWatch.
  • Experience deploying AWS resources as infrastructure as code (IaC) with Terraform.
  • Hands-on experience setting up CI/CD workflows with GitHub Actions.
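The extract–transform–load pattern these qualifications centre on can be sketched in plain Python. This is a minimal illustration only, not the consultancy's stack: an in-memory CSV stands in for an S3 object and sqlite3 stands in for Redshift; in practice the role would use PySpark, Glue, or similar.

```python
import csv
import io
import sqlite3

# Raw input: an in-memory CSV stands in for a file landed in S3.
RAW = "user_id,amount\n1,19.99\n2,\n3,42.50\n"

def extract(raw: str) -> list[dict]:
    """Extract: parse raw records into dictionaries."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: drop malformed rows and cast fields to proper types."""
    out = []
    for r in rows:
        if r["amount"]:  # skip rows with a missing amount
            out.append((int(r["user_id"]), float(r["amount"])))
    return out

def load(rows: list[tuple], conn: sqlite3.Connection) -> int:
    """Load: write into a warehouse table (sqlite3 stands in for Redshift)."""
    conn.execute("CREATE TABLE IF NOT EXISTS payments (user_id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO payments VALUES (?, ?)", rows)
    return conn.execute("SELECT COUNT(*) FROM payments").fetchone()[0]

conn = sqlite3.connect(":memory:")
count = load(transform(extract(RAW)), conn)
print(count)  # 2 valid rows loaded; the row with a missing amount was dropped
```

In a production pipeline each stage would be a separate, independently retryable task (e.g. an Airflow DAG of operators) rather than three function calls in one process.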

Responsibilities

  • Design, implement, and debug ETL pipelines to process and manage complex datasets.
  • Leverage big data tools including Apache Kafka, Spark, and Airflow.
  • Collaborate with stakeholders to ensure data quality and alignment with business goals.
  • Utilize programming expertise in Python, Scala, and SQL for efficient data processing.
  • Build data pipelines using cloud-native services on AWS.
  • Monitor and optimise data solutions using AWS CloudWatch.
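The data-quality responsibility above often reduces to automated validation checks run inside the pipeline before data is published. A minimal sketch, where the column names and null-rate threshold are purely illustrative:

```python
def validate(rows, required=("user_id", "amount"), max_null_rate=0.1):
    """Return a list of data-quality issues; an empty list means the batch passes.

    Flags any required column whose null/empty rate exceeds the threshold.
    """
    issues = []
    for col in required:
        nulls = sum(1 for r in rows if r.get(col) in (None, ""))
        rate = nulls / len(rows) if rows else 1.0
        if rate > max_null_rate:
            issues.append(f"{col}: null rate {rate:.0%} exceeds {max_null_rate:.0%}")
    return issues

batch = [{"user_id": 1, "amount": 10.0}, {"user_id": 2, "amount": None}]
print(validate(batch))  # → ['amount: null rate 50% exceeds 10%']
```

A failing check would typically fail the pipeline task and emit a metric (e.g. to CloudWatch) so the on-call engineer is alerted before bad data reaches downstream consumers.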

Skills

Data engineering projects
ETL pipeline design and implementation
Python
PySpark
SQL
Apache Spark
Apache Airflow
AWS services
Infrastructure as Code (IaC)
CI/CD workflows

Tools

AWS Lambda
AWS Glue
AWS Redshift
AWS API Gateway
AWS CloudWatch
Terraform
GitHub Actions