
Data Engineer

Twisto

Warszawa

Hybrid

PLN 120,000 - 160,000

Full time

6 days ago

Job summary

A modern financial services provider in Warsaw is seeking a Data Engineer to design and implement data ingestion and modeling workflows. The successful candidate will work on core system migration, develop reliable data pipelines, and collaborate with stakeholders to improve data collection and modeling. This role offers a hybrid or remote work model, with a chance to impact data usage significantly within the organization.

Benefits

Home office reimbursement
Up to 34 days of vacation
Health care
Flexible working hours
Team events

Qualifications

  • 3+ years in Data Engineering or a similar role in a modern data environment.
  • Experience with SQL (Snowflake SQL, PostgreSQL), complex queries, performance tuning.
  • Experience with dbt, Airbyte, and orchestration tools.

Responsibilities

  • Design and maintain scalable data pipelines for batch ingestion to Snowflake.
  • Develop and optimize data models in Snowflake using dbt.
  • Write documentation and implement tests to ensure data quality.

Skills

SQL
Python
dbt
NoSQL databases
Snowflake
Airbyte
Orchestration tools
Cloud platforms

Tools

Snowflake SQL
PostgreSQL
MongoDB
DynamoDB
Cassandra
Jenkins
Mage
Airflow

Job description

Company Description

Twisto, a Param Company, is a modern financial services provider in Central and Eastern Europe. With our app, people have full control over their payments - they can defer or split purchases, share expenses, pay invoices, or shop with virtual and physical cards. We deliver clarity, convenience, and fair exchange rates both at home and abroad. We are a company that keeps growing, innovating, and expanding our services to make everyday life easier for our customers.

About the Role

We are looking for a Data Engineer who will help design and implement our data ingestion and modelling workflows as our IT architecture evolves.

We are preparing a core system migration from a monolithic application to a microservices architecture. As part of this transition, our operational data will be spread across multiple databases (both SQL and NoSQL). Your focus will be on reliably extracting that data and shaping it into clean analytical models.

As a Data Engineer, you'll be responsible for designing, maintaining, and evolving our modern data stack - currently including Snowflake, Airbyte, dbt, and orchestration tools such as Jenkins and Mage. You’ll work closely with data analysts, IT developers, and business stakeholders to improve how we collect, model, and serve data across the organization.

This role is perfect for someone who likes to understand data at the source and build straightforward, reliable pipelines and models that others can work with easily.

Key Responsibilities

  • Designing and maintaining scalable, reliable data pipelines for batch ingestion to Snowflake using Airbyte and other ETL tools.
  • Developing, maintaining, and optimizing both new and legacy data models in Snowflake using dbt.
  • Writing clear documentation and implementing tests to ensure data quality.
  • Building and managing data orchestration workflows to ensure reliable automation and timely data availability.
  • Monitoring and managing data quality, integrity, and performance.
  • Assisting in administering and optimizing Snowflake, including IAM management, cost optimization, and ensuring data integrity and security.
  • Developing and actively contributing to the architecture of the future-state data platform.
  • Working with stakeholders to define data requirements and ensure models and pipelines meet business and compliance needs.
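As a rough illustration of the kind of data-quality testing mentioned above, a batch-level validation step run before rows are loaded into the warehouse might look like the following sketch. All names here (`check_batch`, the required fields) are hypothetical and not taken from Twisto's actual stack:

```python
# Hypothetical sketch: simple row-level quality checks applied to a batch
# of records before loading them into a warehouse table.

def check_batch(rows, required_fields=("id", "amount", "created_at")):
    """Return (valid_rows, errors) after basic data-quality checks."""
    valid, errors = [], []
    seen_ids = set()
    for i, row in enumerate(rows):
        # Reject rows with missing required fields.
        missing = [f for f in required_fields if row.get(f) is None]
        if missing:
            errors.append((i, f"missing fields: {missing}"))
            continue
        # Reject duplicate primary keys within the batch.
        if row["id"] in seen_ids:
            errors.append((i, f"duplicate id: {row['id']}"))
            continue
        seen_ids.add(row["id"])
        valid.append(row)
    return valid, errors

batch = [
    {"id": 1, "amount": 120.0, "created_at": "2024-01-01"},
    {"id": 1, "amount": 80.0, "created_at": "2024-01-02"},   # duplicate id
    {"id": 2, "amount": None, "created_at": "2024-01-03"},   # missing amount
    {"id": 3, "amount": 45.5, "created_at": "2024-01-04"},
]
valid, errors = check_batch(batch)
```

In practice such checks would typically live as dbt tests or orchestrated validation tasks rather than ad-hoc scripts; this only shows the shape of the problem.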

What Twisto Can Offer

  • A chance to help shape the future of data in a company where your work will have an immediate impact.
  • Real-world technical challenges - not everything is perfect, but you’ll have the opportunity to fix and improve things.
  • Your pipelines and models will directly support critical workflows such as regulatory reporting, financial reconciliation, and machine learning operations.
  • Be part of a small but impactful international team of supportive professionals.
  • Hybrid or remote work model (we have nice offices in Prague - Karlín and in the center of Warsaw).
  • Home office reimbursement.
  • Informal and pleasant atmosphere - we all know and respect each other, and we also have a pack of dogs.
  • Promoting a healthy lifestyle - we offer flexible working hours, health care, daily drinks and fruit in the office, team events, up to 34 days of vacation, and an additional 5 days of paid leave for parents after childbirth.

Must-Have Experience


3+ years in Data Engineering or a similar role in a modern data environment.

Experience in:

  • SQL (Snowflake SQL, PostgreSQL, complex queries, performance tuning).
  • Python for data transformation, automation, and integration workflows.
  • dbt (including testing, macros, packages).
  • NoSQL databases (MongoDB, DynamoDB, Cassandra, or similar).
  • Snowflake (or similar cloud data warehouse).
  • Airbyte or similar ingestion tools (Fivetran, Stitch).
  • Orchestration tools like Mage, Jenkins, Airflow, or similar.
  • Cloud platforms (e.g., GCP, AWS, or similar).
  • Familiarity with data governance, version control (git), CI/CD, and documentation.

Nice-to-Have

  • Experience in a regulated environment (e.g., fintech, banking).
  • Exposure to financial data (PnL, credit, provisioning, etc.).
  • Understanding of data security, IAM, and compliance requirements in cloud environments.