
Data Engineer (Freelance)

DevologyX

München

Hybrid

EUR 60,000 - 80,000

Full-time

Posted 4 days ago

Summary

A leading company in the FinTech space is seeking a Data Engineer to design, develop, and maintain data pipelines that empower analytics and decision-making. You will collaborate with AI engineers and data scientists, ensuring robust data infrastructure while optimizing data solutions using modern technologies.

Benefits

Work on high-impact financial data projects

Qualifications

  • Strong SQL and Python skills for data processing.
  • Experience with modern data lake and warehouse solutions.
  • Proficiency in cloud-based data engineering.

Responsibilities

  • Develop and maintain ETL / ELT pipelines using Apache Airflow.
  • Optimize data storage and processing with Snowflake and Databricks.
  • Work with Kafka for real-time data streaming.

Skills

SQL
Python
Data modeling
Data quality
Data governance

Tools

Apache Airflow
Snowflake
Databricks
Kafka
AWS
GCP
Azure

Job Description

12-month contract with possible extension; occasional onsite presence in Frankfurt required once per quarter (TBD)

  • As a Data Engineer in the FinTech space, you will design and maintain data pipelines that power analytics, machine learning, and real-time financial decision-making.
  • You will work with modern data engineering technologies to process, transform, and optimize financial datasets at scale.
  • Collaborating with AI engineers and data scientists, you will play a key role in building robust data infrastructure.

RESPONSIBILITIES

  • Develop and maintain ETL / ELT pipelines using Apache Airflow (a minimal DAG sketch follows this list).
  • Optimize data storage and processing with Snowflake and Databricks.
  • Work with Kafka or Pulsar for real-time data streaming.
  • Implement data quality and governance best practices.
  • Deploy scalable data solutions on AWS, GCP, or Azure.
  • Collaborate with analytics teams to support business intelligence initiatives.
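
To illustrate the Airflow-based pipeline work from the first bullet above, here is a minimal DAG sketch using the Airflow 2.x TaskFlow API. The DAG id, schedule, and the extract/transform/load placeholders are assumptions for illustration only and are not taken from the posting.

    # Minimal Airflow DAG sketch for a daily ETL/ELT pipeline.
    # All names (dag_id, record fields, the placeholder steps) are hypothetical.
    from datetime import datetime

    from airflow.decorators import dag, task


    @dag(
        dag_id="fintech_transactions_etl",  # hypothetical pipeline name
        schedule="@daily",
        start_date=datetime(2024, 1, 1),
        catchup=False,
    )
    def fintech_transactions_etl():
        @task
        def extract() -> list[dict]:
            # Placeholder: pull raw transaction records from a source system.
            return [{"id": 1, "amount": 42.0}]

        @task
        def transform(records: list[dict]) -> list[dict]:
            # Placeholder: apply cleaning and business rules.
            return [r for r in records if r["amount"] > 0]

        @task
        def load(records: list[dict]) -> None:
            # Placeholder: write the curated records to a warehouse table.
            print(f"Loading {len(records)} records")

        load(transform(extract()))


    fintech_transactions_etl()

In practice the load step would hand the curated records to a warehouse connection (for example a Snowflake or Databricks hook), in line with the storage bullet above.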

REQUIREMENTS

  • Strong SQL and Python skills for data processing.
  • Experience with modern data lake and warehouse solutions (Snowflake, BigQuery, Redshift).
  • Knowledge of real-time data processing (Kafka, Pulsar, Spark Streaming); a minimal consumer sketch follows this list.
  • Proficiency in cloud-based data engineering.
  • Understanding of data modeling and schema design.
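
To make the real-time processing requirement concrete, below is a minimal consumer sketch using the kafka-python client. The topic name, broker address, and consumer group are assumptions, not details from the posting.

    # Minimal Kafka consumer sketch using the kafka-python client.
    # Topic name, broker address, and group id are assumptions.
    import json

    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        "payments.transactions",             # hypothetical topic
        bootstrap_servers="localhost:9092",  # assumed broker address
        group_id="data-engineering",
        auto_offset_reset="earliest",
        value_deserializer=lambda m: json.loads(m.decode("utf-8")),
    )

    for message in consumer:
        record = message.value
        # Placeholder: validate and forward each event to the downstream pipeline.
        print(record.get("transaction_id"), record.get("amount"))

An equivalent setup with Pulsar or Spark Structured Streaming follows the same pattern: subscribe to a stream of events, deserialize, validate, and hand the records to the downstream pipeline.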

NICE TO HAVE

  • Familiarity with FinTech regulations and compliance.
  • Experience with DBT for data transformation workflows (a short orchestration sketch follows this list).
  • Exposure to MLOps and AI-driven analytics.
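
As a sketch of how DBT transformation runs are often orchestrated alongside the Airflow pipelines mentioned above, the snippet below wraps a dbt CLI invocation in a BashOperator. The DAG name, project path, and model selector are hypothetical.

    # Sketch: triggering a dbt run from an Airflow DAG via the CLI.
    # The project path and selector are assumptions, not from the posting.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="dbt_transformations",  # hypothetical DAG name
        schedule="@daily",
        start_date=datetime(2024, 1, 1),
        catchup=False,
    ) as dag:
        dbt_run = BashOperator(
            task_id="dbt_run_marts",
            bash_command="dbt run --project-dir /opt/dbt/fintech_project --select marts",
        )

Teams also run dbt through dedicated integrations rather than a shell call; the BashOperator version above is just the simplest sketch.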

BENEFITS

  • Work on high-impact financial data projects.