Job Overview
At Godel Technologies, we are passionate about building innovative software solutions that empower businesses around the world. We are growing and looking for a talented Data Engineer with strong Snowflake expertise to join our team. If you are interested in working with modern data technologies, solving complex problems, and making an impact, we want to hear from you!
As a Data Engineer, you will be part of a collaborative and agile environment where your ideas matter. You will design, build, and maintain scalable, efficient data pipelines and solutions that support data-driven decision-making. You'll work closely with cross-functional teams, including Data Scientists, Architects, and Software Engineers, to create reliable, secure, and high-performing systems.
This is a hybrid position, so we expect candidates to visit an office in one of our five locations (Warsaw, Wrocław, Łódź, Gdańsk, Białystok) at least once a week.
Salary ranges
- B2B: 80–145 PLN/hour
- Employment contract (EC): 10 000–22 000 PLN gross/month
Responsibilities
- Design, develop, and optimize scalable data pipelines and ETL/ELT processes.
- Work with structured and unstructured data from various sources.
- Collaborate with Data Scientists, Software Engineers, and Business Analysts to meet data requirements.
- Implement data quality, data validation, and monitoring practices.
- Participate in architecture and design discussions to build cloud‑native data platforms.
- Maintain and optimize data storage solutions (data lakes, data warehouses).
- Ensure the security, integrity, and availability of data.
- Support the deployment of machine learning models into production.
- Continuously improve performance, reliability, and scalability of our data systems.
Qualifications
- 3+ years of experience in a Data Engineering role.
- Strong understanding of data modeling, data warehousing, and ETL/ELT processes.
- Solid programming skills in Python.
- Knowledge of data warehousing tools: Snowflake, Redshift.
- Strong knowledge of SQL and database optimization.
- Experience in building and maintaining data pipelines using Fivetran, Matillion, and dbt.
- Experience with AWS services (S3, Lambda, Kinesis, Batch, DynamoDB, Athena, Glue, etc.).
- Ability to work with large volumes of data efficiently.
- Understanding of best practices for data security and governance.
- Familiarity with Agile methodologies and working in cross‑functional teams.
- Strong analytical thinking, problem‑solving skills, and attention to detail.
- Excellent communication and teamwork skills.
- Strong verbal and written English communication skills.
Nice to have
- Experience with data processing tools: Spark (Databricks), Kafka.
- Knowledge of MS SQL (SSIS, SSAS), PostgreSQL, MySQL.
- Willingness to learn Power BI for data visualization.
Benefits & Perks
- Flat structure, small teams, integration events
- Internal trainings, meetups, and in-house hack days
- Private healthcare, sport subscription
- Modern office, no dress code
- Free coffee and snacks, bike parking, playroom