12-Month Contract, Possible Extension, Occasional Onsite in Frankfurt needed once per quarter (TBD)
- As a Data Engineer in the FinTech space, you will design and maintain data pipelines that power analytics, machine learning, and real-time financial decision-making.
- You will work with modern data engineering technologies to process, transform, and optimize financial datasets at scale.
- Collaborating with AI engineers and data scientists, you will play a key role in building robust data infrastructure.
RESPONSIBILITIES
- Develop and maintain ETL/ELT pipelines using Apache Airflow (see the example sketch after this list).
- Optimize data storage and processing with Snowflake and Databricks.
- Work with Kafka or Pulsar for real-time data streaming.
- Implement data quality and governance best practices.
- Deploy scalable data solutions on AWS, GCP, or Azure.
- Collaborate with analytics teams to support business intelligence initiatives.
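For context, here is a minimal sketch of the kind of Airflow pipeline this role maintains. It assumes Airflow 2.x with the TaskFlow API (2.4+ for the `schedule` argument); all DAG, task, and data names are hypothetical and not taken from the actual project.

```python
# Hypothetical illustration only: a minimal daily ETL DAG in the style described above.
from datetime import datetime

from airflow.decorators import dag, task


@dag(
    dag_id="example_daily_transactions_etl",  # made-up pipeline name
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
    tags=["fintech", "example"],
)
def daily_transactions_etl():
    @task
    def extract() -> list[dict]:
        # A real pipeline would pull from an API, object store, or CDC feed.
        return [{"tx_id": 1, "amount": "120.50"}, {"tx_id": 2, "amount": "75.00"}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Cast amounts to floats and tag each row with a load timestamp.
        return [
            {**row, "amount": float(row["amount"]), "loaded_at": datetime.utcnow().isoformat()}
            for row in rows
        ]

    @task
    def load(rows: list[dict]) -> None:
        # In practice this step would write to Snowflake or Databricks via a provider hook.
        print(f"Loading {len(rows)} rows into the warehouse")

    load(transform(extract()))


daily_transactions_etl()
```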
REQUIREMENTS
- Strong SQL and Python skills for data processing.
- Experience with modern data lake and warehouse solutions (Snowflake, BigQuery, Redshift).
- Knowledge of real-time data processing (Kafka, Pulsar, Spark Streaming).
- Proficiency in cloud-based data engineering.
- Understanding of data modeling and schema design.
NICE TO HAVE
- Familiarity with FinTech regulations and compliance.
- Experience with dbt for data transformation workflows.
- Exposure to MLOps and AI-driven analytics.
BENEFITS
- Work on high-impact financial data projects.