Join to apply for the (Senior) Data Engineer—Data Platform (m/f/d) role at Unzer
About Us
Unzer is a leading European fintech company with a mission to simplify international payments for e-commerce and retail businesses. Our brand was formed from 13 companies that now contribute to building a unique product covering the entire payment flow.
We are driven by the belief that customers should enjoy a seamless shopping experience, no matter where they shop. Our team of over 750 experts from 70 nationalities is dedicated to creating a state-of-the-art unified commerce platform. Our goal is to enable businesses to delight their customers with seamless payment experiences.
Our offices are located across Austria, Denmark, Germany, and Luxembourg, with a headquarters in Berlin.
About The Project
We’re looking for an experienced freelance Data Engineer to support Unzer’s Data Team in scaling and maintaining our self-serve data platform. This role involves building and optimizing data pipelines, enhancing infrastructure, and enabling data-driven product features. Residency within the EU is required.
Responsibilities include:
- Operate, monitor, and extend data platform infrastructure, pipelines, and services.
- Implement and maintain Airflow workflows for orchestration.
- Use DBT for data transformation, materialization, and data quality checks.
- Develop and optimize schemas on PostgreSQL/MySQL.
- Collaborate with analytics stakeholders to improve the semantic layer from an engineering perspective.
- Work with Redshift to design scalable data models and ingestion pipelines, optimizing performance for large analytics workloads.
- Utilize Kafka and other streaming technologies for real-time data processing and integration.
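The Airflow and DBT responsibilities above can be pictured as a minimal orchestration sketch. The DAG id, schedule, and dbt project path below are hypothetical, not Unzer's actual setup:

```python
# Minimal Airflow DAG sketch: materialize dbt models, then run data quality tests.
# DAG id, schedule, and project path are illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_daily_transform",  # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Run transformations first, then dbt's data quality checks.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt",  # hypothetical path
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt",
    )
    dbt_run >> dbt_test
```

Chaining `dbt run >> dbt test` keeps quality checks as a separate, retryable task after each materialization.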
Tech Stack:
- AWS (Redshift, Athena, S3, EMR, DMS)
- Airflow, DBT, Terraform
- PostgreSQL, MySQL
- AWS DMS, AWS Glue
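As an illustration of the DBT transformation and data quality work listed above, a minimal incremental model plus schema tests might look like this; the model, table, and column names are hypothetical:

```sql
-- models/marts/fct_payments.sql (hypothetical model)
-- Incremental materialization limits rebuild cost on large Redshift tables.
{{ config(materialized='incremental', unique_key='payment_id') }}

select
    payment_id,
    merchant_id,
    amount_cents,
    currency,
    created_at
from {{ ref('stg_payments') }}
{% if is_incremental() %}
where created_at > (select max(created_at) from {{ this }})
{% endif %}
```

```yaml
# models/marts/schema.yml -- dbt data quality checks (hypothetical)
models:
  - name: fct_payments
    columns:
      - name: payment_id
        tests:
          - unique
          - not_null
```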
Requirements for success:
- Strong SQL skills and experience with AWS data stack.
- Proven experience with ETL/ELT pipelines using Airflow and DBT.
- Hands-on experience with Terraform and infrastructure-as-code in cloud environments.
- Ability to manage the entire delivery process from requirements to production monitoring.
- Self-driven, communicative, and able to work independently with asynchronous teams.
Next Steps:
- If interested, apply with your CV in English. Even if you don't meet every requirement, we look forward to hearing from you.
- We will respond within 14 days of receiving your application.
Job Function
- Information Technology and Engineering