Job Search and Career Advice Platform

Senior Data Engineer with Java

ING

Katowice

On-site

PLN 120,000 - 180,000

Full time

Yesterday

Job summary

A leading financial services provider in Poland is seeking a skilled Data Engineer to design and build data processing pipelines using Java. The ideal candidate will have hands-on experience in ETL development and be comfortable working in dynamic environments. Responsibilities include migrating existing pipelines and ensuring data quality through automated testing. This position is part of a collaborative team focused on integrating sustainable practices into business solutions.

Qualifications

  • Hands-on experience in building robust ETL pipelines.
  • Experience migrating ETL processes to Java‑based backends.
  • Excellent communication skills to articulate complex problems simply.

Responsibilities

  • Design and build new scalable data processing pipelines using Java.
  • Migrate existing data pipelines to a Java-based backend.
  • Ensure high data quality through automated testing strategies.

Skills

  • Proficiency in Java or JVM-based languages
  • ETL/data pipeline development
  • Automated testing and data quality frameworks
  • Understanding of software design patterns
  • Big data processing frameworks (e.g., Spark)
  • CI/CD pipelines (preferably Azure DevOps)
  • Workflow orchestration tools (e.g., Airflow)
  • Collaboration skills

Tools

  • Azure DevOps
  • Airflow
  • Containerization

Job description

ING Hubs Poland is hiring!

The expected salary for this position: 13,000 - 20,000 PLN.

The salary ranges stated in this announcement are indicative and may differ from the ranges defined in the remuneration regulations.

We are seeking a skilled and forward-thinking Data Engineer to join our team. This role is ideal for someone who thrives in dynamic environments, brings hands‑on experience in building robust ETL pipelines, and is eager to contribute fresh perspectives to improve our engineering practices.

Your primary focus will be on designing and building new data processing pipelines using Java, while also migrating existing data pipelines to a Java-based backend. You will leverage modern technologies and best practices to ensure scalability, performance, and maintainability, helping us deliver high‑quality data solutions that power critical business decisions.

We’re not just looking for someone to adapt to our current ways of working. We want someone who brings real‑world experience, innovative thinking, and the confidence to challenge and improve our current solutions and practices.

We are looking for you, if you have:
  • Proficiency in Java or any other JVM-based language.
  • Proven experience in ETL/data pipeline development beyond SQL, stored procedures, or backend-only work.
  • Experience with automated testing and data quality frameworks.
  • Solid understanding of software design patterns and object‑oriented programming.
  • Familiarity with big data processing frameworks (e.g., Spark).
  • Familiarity with CI/CD pipelines, preferably in Azure DevOps.
  • Comfortable working with workflow orchestration tools like Airflow.
  • Ability to articulate complex technical problems in a clear and simple manner.
  • Strong collaboration skills across functional and interdisciplinary teams.
  • Confidence to challenge existing practices and propose improvements.
  • Relevant experience and fresh perspectives from recent roles.
You’ll get extra points for:
  • Experience migrating ETL processes to a Java‑based backend.
  • Knowledge of containerization and how to leverage DTAP environments.
  • Familiarity with code style standards and best engineering practices.
  • Prior experience in cross‑functional teams delivering high‑impact solutions.
Your responsibilities:
  • Design and build new scalable data processing pipelines using Java.
  • Migrate existing data pipelines to a Java‑based backend.
  • Ensure high data quality through automated testing strategies (unit, integration, system).
  • Develop and maintain CI/CD pipelines (using Azure DevOps).
  • Apply best engineering practices including containerization, DTAP environments, and code style standards.
  • Use workflow orchestration tools like Stonebranch UAC to manage data pipelines.
  • Contribute to software design using object‑oriented principles and design patterns.
  • Collaborate cross‑functionally with interdisciplinary teams to deliver high‑impact solutions.
Information about the squad:

You will join the ESG Product Area within ING’s Wholesale Banking Sustainability Tribe. Our mission is to integrate environmental, social, and governance (ESG) principles into all aspects of our business and help clients transition toward sustainable practices. The team is dynamic, collaborative, and focused on delivering cutting‑edge solutions for measuring and managing emissions, client transition plans, ESG‑related data, and business intelligence. Together, we aim to maintain ING’s leadership in financing sustainable transitions.
