AWS Data Engineer (German Speaking)

Gazelle Global

Leipzig

On-site

EUR 60,000 – 95,000

Full-time

Posted yesterday

Summary

An innovative company is on the lookout for a Senior Data Engineer to enhance their data processing capabilities. In this role, you will design and build scalable data pipelines using AWS services like Glue and Lambda, ensuring efficient ETL solutions. Collaborating with cross-functional teams, you will lead projects that leverage your expertise in Python and cloud technologies. If you are passionate about data engineering and thrive in a dynamic environment, this opportunity is perfect for you to make a significant impact.

Qualifications

  • 5–8 years of experience in Data Engineering with a focus on AWS.
  • Strong background in building scalable data pipelines and ETL solutions.

Responsibilities

  • Design and optimize scalable data pipelines using AWS services.
  • Lead project teams to deliver end-to-end data solutions.

Skills

Python
AWS Glue
AWS Lambda
ETL Solutions
Data Processing
FastAPI
Flask
Django
Data Migration
DevOps Practices

Education

Bachelor's degree in Computer Science
Bachelor's degree in Information Technology

Tools

AWS S3
AWS Kinesis
Python Notebooks

Job Description

We are seeking a highly skilled Senior Data Engineer with 5–8 years of experience to join our dynamic team. The ideal candidate will have a strong background in building scalable and efficient data pipelines using AWS Glue, Lambda, and Python, with a focus on delivering end-to-end ETL solutions.

Key Responsibilities:

  • Design, develop, and optimize robust and scalable data pipelines using AWS services like Glue and Lambda.
  • Build reusable Python modules, APIs (FastAPI, Flask, or Django), and automation scripts for data processing workflows.
  • Lead project teams in designing and delivering end-to-end data and ETL solutions.
  • Troubleshoot and resolve data processing issues, ensuring optimal performance and reliability.
  • Collaborate with cross-functional teams including data scientists, DevOps, and business stakeholders.

Technical Experience:

  • Proven experience with Python for data processing, transformation, and modular development.
  • Expertise in working with Python Notebooks and classical Python-based application development.
  • Hands-on experience with AWS services including S3, Glue, Lambda, and Kinesis (KDS).
  • Familiarity with data migration strategies and tools is a strong plus.
  • Experience with DevOps practices and infrastructure automation/oversight is beneficial.

Qualifications:

  • Bachelor’s degree in Computer Science, Information Technology, or a related field.
  • 5–8 years of experience in Data Engineering with a strong focus on cloud-based solutions (preferably AWS).
  • In-depth understanding of IAM roles and security best practices in cloud environments.
  • Excellent analytical and problem-solving skills.
  • Strong communication and leadership abilities to drive collaboration with stakeholders.
  • Ability to handle multiple priorities and deliver high-quality results within deadlines.