Data Engineer

Amelco

Warszawa

Hybrid

PLN 120,000 - 150,000

Full time

Today

Job summary

A leading technology company based in Poland is seeking a Data Engineer to design and maintain scalable data pipelines. The ideal candidate will have a minimum of 3 years' experience in data engineering, strong SQL skills, and familiarity with AWS and Kafka. This role offers a hybrid work model and opportunities to work with advanced data technologies.

Qualifications

  • At least 3 years of experience as a Data Engineer or in a similar role.
  • Strong proficiency in SQL and relational databases.
  • Solid understanding of data warehousing principles.

Responsibilities

  • Design, develop, and maintain ETL/ELT pipelines for large-scale data.
  • Implement data models in Snowflake and ensure data governance.
  • Optimize SQL and Python transformations for performance.

Skills

SQL
Data Engineering
Data Warehousing
AWS
Kafka
Python
Power BI
CI/CD
Data Governance
Dimensional Modelling

Tools

Snowflake
PostgreSQL
Redshift
Airflow
Terraform

Job description

Based in: Warsaw – Focus, av. Armii Ludowej 26 / hybrid

Role Overview

As a Data Engineer, you will design, build, and maintain scalable data pipelines and architectures that power our analytical and operational systems. You'll work closely with Data Analysts, BI Developers, and Software Engineers to ensure data quality, performance, and scalability across all environments, from real-time streaming to batch processing. You'll be part of a modern data ecosystem leveraging AWS, Snowflake, Spark, Airflow, and Kafka, supporting mission-critical reporting and predictive analytics across the business.

Key Responsibilities

  • Design, develop, and maintain ETL/ELT pipelines for ingesting and transforming large-scale data from multiple sources (betting, casino, player, finance, CRM, and external feeds).
  • Implement data models and schemas (Bronze/Silver/Gold layers) in Snowflake / Redshift / Postgres, ensuring data consistency and governance.
  • Work on real-time streaming and event-driven architectures using Kafka, Solace, or similar tools.
  • Optimize SQL and Python-based data transformations for performance and scalability.
  • Develop data validation, monitoring, and alerting frameworks.
  • Collaborate with BI developers to deliver Power BI and analytical datasets.
  • Partner with stakeholders across Trading, Finance, Compliance, and Risk to deliver high-impact data solutions.
  • Contribute to data quality standards, documentation, and CI/CD automation of pipelines (GitHub).
Skills & Experience

  • At least 3 years of experience as a Data Engineer or in a similar data-focused role.
  • Strong proficiency in SQL and experience with relational databases (PostgreSQL, Snowflake, Redshift, BigQuery, etc.).
  • Solid understanding of data warehousing principles and dimensional modelling.
  • Familiarity with cloud platforms (AWS, Azure, or GCP).
  • Strong problem-solving skills and the ability to work in fast-paced, data-intensive environments.
  • Experience with real-time data streaming (Kafka, Kinesis, or Pub/Sub).
  • Knowledge of sports betting or iGaming data models (bets, markets, transactions, GGR, etc.).
  • Exposure to data governance, data observability, or catalogue tools.
  • Experience with CI/CD for data (GitHub Actions, Terraform).
  • Familiarity with Power BI or other BI tools.

If this sounds interesting, please submit your CV in English using the Apply button.