AWS Data Engineer

JR United Kingdom

London

Hybrid

GBP 60,000 - 80,000

Full time

6 days ago

Job summary

An established industry player in consumer behaviour analytics is seeking a driven AWS Data Engineer to enhance its data infrastructure. This exciting role involves developing advanced data pipelines and integrating cutting-edge technologies like DataOps and Generative AI. You will collaborate with a talented team to optimize ETL processes, ensure data security, and explore innovative data warehousing solutions. If you have a passion for working with diverse data and the latest technologies, this opportunity offers a collaborative environment where you can make a significant impact.

Qualifications

  • Hands-on experience with AWS services and strong SQL skills.
  • Experience with data pipeline and workflow management tools.

Responsibilities

  • Develop and optimize ETL/ELT processes for data transformation.
  • Implement and maintain CI/CD pipelines to automate data workflows.

Skills

AWS Services
SQL
Python
Data Pipeline Management
Strong Communication Skills

Tools

Dagster
Snowflake
Pandas
CI/CD Pipelines

Job description

Salary: negotiable up to £80,000, dependent on experience

London: hybrid working, with 3 days per week in the office and 2 days home-based

Job Ref: J12931

A leader in consumer behaviour analytics seeks a driven AWS Data Engineer to guide its data infrastructure architecture, working alongside a small, talented team of engineers, analysts, and data scientists. In this role, you’ll enhance the data platform, develop advanced data pipelines, and integrate cutting-edge technologies like DataOps and Generative AI, including Large Language Models (LLMs).

This is an exciting opportunity for someone looking to challenge themselves in a collaborative environment, working with a breadth of diverse data and cutting-edge technologies. You’ll have proven experience developing AWS Cloud platforms end to end, orchestrating data using Dagster or a similar tool, and coding in Python and SQL.

Key Responsibilities
  • Develop and optimize ETL/ELT processes to support data transformation and integrity for analytics.
  • Explore and evaluate new data warehousing solutions, including Snowflake, to improve data accessibility and scalability.
  • Partner with product and engineering teams to define data architecture and best practices for reporting.
  • Ensure data security, compliance, and governance across data systems.
  • Implement and maintain CI/CD pipelines to automate data workflows and enhance system reliability.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability and performance.
Essential Skills and Experience
  • Hands-on experience with AWS services, including Lambda, Glue, Athena, RDS, and S3.
  • Strong SQL skills for data transformation, cleaning, and loading.
  • Strong coding experience with Python and Pandas.
  • Experience with any flavour of data pipeline and workflow management tools: Dagster, Celery, Airflow, etc.
  • Experience building processes that support data transformation, data structures, metadata, dependency, and workload management.
  • Experience supporting and working with cross-functional teams in a dynamic environment.
  • Strong communication skills to collaborate with remote teams (US, Canada).
Nice to Have
  • Familiarity with LLMs, including fine-tuning and retrieval-augmented generation (RAG).
  • Knowledge of statistics.
  • Knowledge of DataOps best practices, including CI/CD for data workflows.
Additional Requirements

Candidates must have an existing and ongoing right to live and work in the UK. Sponsorship is not available at any point.

If this sounds like the role for you, please apply today!
