
AWS Data Engineer (German Speaking)

Gazelle Global

Essen

On-site

EUR 60,000 - 95,000

Full-time

Posted yesterday

Summary

An innovative company is on the lookout for a Senior Data Engineer to join their dynamic team. This role focuses on building scalable data pipelines and delivering end-to-end ETL solutions using cutting-edge AWS technologies. The ideal candidate will leverage their expertise in Python and AWS services like Glue and Lambda to create efficient data processing workflows. Collaborating with cross-functional teams, you will troubleshoot and resolve data issues, ensuring optimal performance. If you're passionate about data engineering and want to make a significant impact in a forward-thinking environment, this opportunity is perfect for you.

Qualifications

  • 5-8 years of experience in Data Engineering focusing on AWS.
  • Strong background in building scalable data pipelines.

Responsibilities

  • Design and optimize data pipelines using AWS Glue and Lambda.
  • Lead project teams to deliver end-to-end ETL solutions.

Skills

Python
AWS Glue
AWS Lambda
ETL Solutions
Data Processing
FastAPI
Flask
Django
Data Migration
DevOps Practices

Education

Bachelor’s degree in Computer Science
Bachelor’s degree in Information Technology

Tools

AWS S3
AWS Kinesis
Python Notebooks

Job Description

We are seeking a highly skilled Senior Data Engineer with 5–8 years of experience to join our dynamic team. The ideal candidate will have a strong background in building scalable and efficient data pipelines using AWS Glue, Lambda, and Python, with a focus on delivering end-to-end ETL solutions.

Key Responsibilities:

  • Design, develop, and optimize robust and scalable data pipelines using AWS services like Glue and Lambda; see the sketch after this list.
  • Build reusable Python modules, APIs (FastAPI, Flask, or Django), and automation scripts for data processing workflows.
  • Lead project teams in designing and delivering end-to-end data and ETL solutions.
  • Troubleshoot and resolve data processing issues, ensuring optimal performance and reliability.
  • Collaborate with cross-functional teams including data scientists, DevOps, and business stakeholders.
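To give a concrete sense of the pipeline work described above, here is a minimal sketch of a Lambda handler that starts a Glue ETL job when a new object lands in S3. The job name, argument keys, and handling of the event shape are illustrative assumptions, not details taken from this posting.

```python
# Minimal sketch (illustrative only): a Lambda handler that starts a Glue
# ETL job when a new object arrives in S3. Job name and argument keys are
# hypothetical placeholders.
import boto3

glue = boto3.client("glue")

def handler(event, context):
    # S3 put events carry the bucket name and object key of the upload.
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]

    # Kick off the Glue job, passing the object location as job arguments
    # so the ETL script knows which file to process.
    response = glue.start_job_run(
        JobName="example-etl-job",
        Arguments={
            "--source_bucket": bucket,
            "--source_key": key,
        },
    )
    return {"job_run_id": response["JobRunId"]}
```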

Technical Experience:

  • Proven experience with Python for data processing, transformation, and modular development.
  • Expertise in working with Python Notebooks and classical Python-based application development.
  • Hands-on experience with AWS services including S3, Glue, Lambda, and Kinesis (KDS); see the sketch after this list.
  • Familiarity with data migration strategies and tools is a strong plus.
  • Experience with DevOps practices and infrastructure automation/oversight is beneficial.
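As a rough illustration of the Kinesis (KDS) experience mentioned above, here is a minimal boto3 producer sketch. The stream name, payload fields, and partition-key choice are placeholders, not requirements from this posting.

```python
# Minimal sketch (illustrative only): publishing a record to a Kinesis
# Data Stream with boto3. Stream name and payload fields are placeholders.
import json
import boto3

kinesis = boto3.client("kinesis")

def publish_event(payload: dict, stream_name: str = "example-stream") -> str:
    """Serialize the payload and put it onto the stream."""
    response = kinesis.put_record(
        StreamName=stream_name,
        Data=json.dumps(payload).encode("utf-8"),
        # Records sharing a partition key land on the same shard, which
        # preserves their relative ordering.
        PartitionKey=str(payload.get("id", "default")),
    )
    return response["SequenceNumber"]
```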

Qualifications:

  • Bachelor’s degree in Computer Science, Information Technology, or a related field.
  • 5–8 years of experience in Data Engineering with a strong focus on cloud-based solutions (preferably AWS).
  • In-depth understanding of IAM roles and security best practices in cloud environments.
  • Excellent analytical and problem-solving skills.
  • Strong communication and leadership abilities to drive collaboration with stakeholders.
  • Ability to handle multiple priorities and deliver high-quality results within deadlines.