Position: Data Engineer
Location: Krakow, Wroclaw, Warsaw
Type of work: Hybrid (2-3 days per week in the office, obligatory)
Contract: contract of employment only
Salary: base + financial bonus
Please note: for this role, we cannot consider candidates who require company support to live and work legally in Poland.
The Role
You’ll be a senior contributor in our Data Engineering team, working across various projects, building out key elements of our data platform using a range of modern tools including Snowflake, dbt, and Airflow. You’ll help us minimize our cloud costs, drive best practices across all our Data Disciplines, and scale and automate our data governance.
What you’ll be working on:
- Support the building of robust data models downstream of backend services (mostly in Snowflake) that support internal reporting, financial, and regulatory use cases.
- Focus on optimization of our Data Warehouse, spotting opportunities to reduce complexity and cost.
- Help define and manage best practices for our Data Warehouse. This may include payload design of source data, logical data modeling, implementation, metadata, and testing standards.
- Set standards and ways of working with data across VXBS, working collaboratively with others to make it happen.
- Take established best practices and standards defined by the team and apply them within other areas of the business.
- Investigate and effectively work with colleagues from other disciplines to monitor and improve data quality within the warehouse.
- Contribute to prioritization of data governance issues.
You should apply if:
- You have experience with and a passion for Data Modeling, ETL projects, and Big Data as a developer or engineer
- You have good experience in Python, Java, or similar languages
- You have proven experience with AWS
- SQL and data modeling are second nature to you
- You are comfortable with general Data Warehousing concepts
- You strive for improvement in your work and that of others, proactively identifying issues and opportunities
- You have experience building robust and reliable data sets requiring a high level of control
- You have experience working with IaC tools such as Terraform, AWS CloudFormation, or Ansible
Nice to have:
- Any experience working within a finance function or knowledge of accounting
- Experience working in a highly regulated environment (e.g., finance, gaming, food, healthcare)
- Knowledge of regulatory reporting and treasury operations in retail banking
- Previous experience with dbt, Databricks, or similar tooling
- Experience working with orchestration frameworks such as Airflow/Prefect
- Design and implementation knowledge of stream processing frameworks like Flink, Spark Streaming, etc.
- Familiarity with Agile ways of working (Kanban, Scrum)