

Senior Data Engineer with Java

ING Group

Katowice

On-site

PLN 120,000 - 180,000

Full time

9 days ago


Job summary

A leading financial services provider is seeking a skilled Data Engineer to join their team in Katowice. This role focuses on designing and building scalable data processing pipelines in Java and migrating existing data pipelines to a Java-based backend. The ideal candidate will have extensive experience in ETL pipeline development, strong collaboration skills, and familiarity with modern data processing frameworks. You will be part of a dynamic team that integrates sustainable practices into business solutions.

Qualifications

  • Proficiency in Java or any JVM-based language.
  • Experience in ETL/data pipeline development beyond SQL.
  • Familiarity with automated testing and data quality frameworks.

Responsibilities

  • Design and build new scalable data processing pipelines using Java.
  • Migrate existing data pipelines to a Java-based backend.
  • Ensure high data quality through automated testing strategies.

Skills

Java
ETL/data pipeline development
Automated testing
Software design patterns
Big data processing frameworks
CI/CD pipelines
Airflow
Collaboration skills

Tools

Azure DevOps
Stonebranch UAC
Spark
Containerization

Job description

ING Hubs Poland is hiring!

The expected salary for this position: 13 000 - 20 000 PLN


The salary ranges given in this announcement are approximate and may differ from the ranges specified in the remuneration regulations.


We are seeking a skilled and forward-thinking Data Engineer to join our team. This role is ideal for someone who thrives in dynamic environments, brings hands‑on experience in building robust ETL pipelines, and is eager to contribute fresh perspectives to improve our engineering practices.


Your primary focus will be on designing and building new data processing pipelines using Java, while also migrating existing data pipelines to a Java‑based backend. You will leverage modern technologies and best practices to ensure scalability, performance, and maintainability, helping us deliver high‑quality data solutions that power critical business decisions.


We’re not just looking for someone to adapt to our current ways of working. We want someone who brings real‑world experience, innovative thinking, and the confidence to challenge and improve our current solutions and practices.


We are looking for you, if you have:



  • Proficiency in Java or any other JVM-based language.

  • Proven experience in ETL/data pipeline development beyond SQL, stored procedures, or backend-only work (see the pipeline sketch after this list).

  • Experience with automated testing and data quality frameworks.

  • Solid understanding of software design patterns and object‑oriented programming.

  • Familiarity with big data processing frameworks (e.g., Spark).

  • Familiarity with CI/CD pipelines, preferably in Azure DevOps.

  • Comfortable working with workflow orchestration tools like Airflow.

  • Ability to articulate complex technical problems in a clear and simple manner.

  • Strong collaboration skills across functional and interdisciplinary teams.

  • Confidence to challenge existing practices and propose improvements.

  • Relevant experience and fresh perspectives from recent roles.
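
To give a concrete sense of what "ETL/data pipeline development beyond SQL" looks like in practice, here is a minimal sketch of a batch ETL step written against Spark's Java API. The class name, input path, column names, and output location are placeholders rather than details of our actual pipelines.

  import org.apache.spark.sql.Dataset;
  import org.apache.spark.sql.Row;
  import org.apache.spark.sql.SaveMode;
  import org.apache.spark.sql.SparkSession;
  import static org.apache.spark.sql.functions.col;

  public class EmissionsEtlJob {
      public static void main(String[] args) {
          SparkSession spark = SparkSession.builder()
                  .appName("emissions-etl")
                  .getOrCreate();

          // Extract: read the raw CSV feed (placeholder path).
          Dataset<Row> raw = spark.read()
                  .option("header", "true")
                  .csv("/data/raw/emissions.csv");

          // Transform: drop rows without a client id and derive tonnes from kilograms.
          Dataset<Row> curated = raw
                  .filter(col("client_id").isNotNull())
                  .withColumn("co2_tonnes", col("co2_kg").cast("double").divide(1000));

          // Load: publish the curated dataset as Parquet (placeholder path).
          curated.write()
                  .mode(SaveMode.Overwrite)
                  .parquet("/data/curated/emissions");

          spark.stop();
      }
  }

A production pipeline would externalise the paths and schema into configuration and run under an orchestrator such as Airflow or Stonebranch UAC.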


You’ll get extra points for:



  • Experience migrating ETL processes to a Java‑based backend.

  • Knowledge of containerization and how to leverage DTAP environments.

  • Familiarity with code style standards and best engineering practices.

  • Prior experience in cross‑functional teams delivering high‑impact solutions.


Your responsibilities:



  • Design and build new scalable data processing pipelines using Java.

  • Migrate existing data pipelines to a Java‑based backend.

  • Ensure high data quality through automated testing strategies (unit, integration, system); see the test sketch after this list.

  • Develop and maintain CI/CD pipelines (using Azure DevOps).

  • Apply best engineering practices including containerization, DTAP environments, and code style standards.

  • Use workflow orchestration tools like Stonebranch UAC to manage data pipelines.

  • Contribute to software design using object‑oriented principles and design patterns.

  • Collaborate cross‑functionally with interdisciplinary teams to deliver high‑impact solutions.
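
As a minimal sketch of the automated data quality checks meant above, assuming JUnit 5 and Spark running in local mode, the test below asserts that a curated dataset contains no rows with a missing client identifier; the dataset path and column name are placeholders carried over from the pipeline sketch earlier in this posting.

  import org.apache.spark.sql.Dataset;
  import org.apache.spark.sql.Row;
  import org.apache.spark.sql.SparkSession;
  import org.junit.jupiter.api.Test;
  import static org.apache.spark.sql.functions.col;
  import static org.junit.jupiter.api.Assertions.assertEquals;

  class CuratedEmissionsQualityTest {

      @Test
      void curatedRowsAlwaysCarryAClientId() {
          // Run Spark locally inside the test; no cluster needed.
          SparkSession spark = SparkSession.builder()
                  .appName("data-quality-test")
                  .master("local[*]")
                  .getOrCreate();

          // Placeholder dataset location; in a real suite this would point at test fixtures.
          Dataset<Row> curated = spark.read().parquet("/data/curated/emissions");

          long missing = curated.filter(col("client_id").isNull()).count();
          assertEquals(0, missing, "curated emissions rows must carry a client_id");

          spark.stop();
      }
  }

Checks like this sit alongside unit and integration tests in the CI/CD pipeline (for example in Azure DevOps) so that data quality regressions fail the build before a release.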


Information about the squad:


You will join the ESG Product Area within ING’s Wholesale Banking Sustainability Tribe. Our mission is to integrate environmental, social, and governance (ESG) principles into all aspects of our business and help clients transition toward sustainable practices. The team is dynamic, collaborative, and focused on delivering cutting‑edge solutions for measuring and managing emissions, client transition plans, ESG‑related data, and business intelligence. Together, we aim to maintain ING’s leadership in financing sustainable transitions.
