Middle/Senior Data Engineer (Snowflake) @ Godel Technologies Europe

Białystok

Hybrid

PLN 120,000 - 180,000

Full time

Job summary

A technology company in Poland seeks a talented Data Engineer to build scalable data pipelines and collaborate with cross-functional teams. Candidates should have 3+ years of experience, strong skills in Snowflake, SQL, and Python, and the ability to work with large datasets in a hybrid work environment.

Benefits

Private healthcare
Sport subscription
Free coffee
Bike parking
Playroom
Free snacks
Internal training sessions
No dress code
Integration events

Qualifications

  • 3+ years in a Data Engineering role required.
  • Strong understanding of data modeling, data warehousing, and ETL/ELT processes.
  • Solid programming skills in Python and knowledge of SQL.

Responsibilities

  • Design and optimize scalable data pipelines and ETL/ELT processes.
  • Collaborate with Data Scientists and Software Engineers to meet data requirements.
  • Ensure data security, integrity, and availability.

Skills

Data modeling
Data warehousing
ETL/ELT processes
Python
Snowflake
SQL
Data pipeline construction
AWS tools
Agile methodologies
Analytical thinking
Communication skills

Tools

Snowflake
Fivetran
Matillion
dbt
AWS S3
AWS Lambda
AWS Kinesis
AWS Athena

Job description

Job Overview

At Godel Technologies, we are passionate about building innovative software solutions that empower businesses around the world. We are growing and looking for a talented Data Engineer with strong Snowflake expertise to join our team. If you are interested in working with modern data technologies, solving complex problems, and making an impact, we want to hear from you!

As a Data Engineer, you will be part of a collaborative and agile environment where your ideas matter. You will design, build, and maintain scalable and efficient data pipelines and solutions that support data-driven decision-making. You'll work closely with cross-functional teams, including Data Scientists, Architects, and Software Engineers, to create reliable, secure, and high-performing systems.

This is a hybrid position, so we expect candidates to visit an office in one of our five locations (Warsaw, Wrocław, Łódź, Gdańsk, Białystok) at least once a week.

Salary ranges
  • B2B: 80-145 PLN/hour
  • Employment contract: 10,000-22,000 PLN gross/month

Responsibilities
  • Design, develop, and optimize scalable data pipelines and ETL/ELT processes.
  • Work with structured and unstructured data from various sources.
  • Collaborate with Data Scientists, Software Engineers, and Business Analysts to meet data requirements.
  • Implement data quality, data validation, and monitoring practices.
  • Participate in architecture and design discussions to build cloud‑native data platforms.
  • Maintain and optimize data storage solutions (data lakes, data warehouses).
  • Ensure the security, integrity, and availability of data.
  • Support the deployment of machine learning models into production.
  • Continuously improve performance, reliability, and scalability of our data systems.
Qualifications
  • 3+ years in a Data Engineering role.
  • Strong understanding of data modeling, data warehousing, and ETL/ELT processes.
  • Solid programming skills in Python.
  • Knowledge of data warehousing tools such as Snowflake and Redshift.
  • Strong knowledge of SQL and database optimization.
  • Experience in building and maintaining data pipelines using tools such as Fivetran, Matillion, and dbt.
  • Experience with AWS tools (S3, Lambda, Kinesis, Batch, DynamoDB, Athena, Glue, etc.).
  • Ability to work with large volumes of data efficiently.
  • Understanding of best practices for data security and governance.
  • Familiarity with Agile methodologies and working in cross‑functional teams.
  • Strong analytical thinking, problem‑solving skills, and attention to detail.
  • Excellent communication and teamwork skills.
  • Strong verbal and written English communication skills.
Nice to have
  • Experience with data processing tools such as Spark (Databricks) and Kafka.
  • Knowledge of MS SQL (SSIS, SSAS), PostgreSQL, MySQL.
  • Willingness to learn Power BI for data visualization.
Benefits & Perks
  • Flat structure, small teams, integration events.
  • Private healthcare, sport subscription, free coffee, free snacks, bike parking, playroom.
  • In-house training sessions, meetups, and hack days; modern office; no dress code.