
Senior Data Engineer @ Devire

Devire

Poland

Remote

PLN 120,000 - 180,000

Full time


Job summary

An IT outsourcing company is seeking an experienced Senior Data Engineer for a remote role focused on automating analytics for a UK-based insurance client. Responsibilities include designing and optimizing data pipelines on AWS and ensuring data quality and governance. The ideal candidate has strong skills in AWS, SQL, and Python. The role offers a competitive rate and flexible working hours.

Benefits

Medical healthcare
Sports membership
Flexible working hours

Qualifications

  • Proficiency in AWS (Lambda, Redshift, Glue, S3).
  • Advanced SQL and Python skills for building pipelines and automation.
  • Experience with Snowflake and API integration.
  • Knowledge of TypeScript and experience creating technical documentation.
  • Familiarity with automation tools (GitHub Actions).
  • Experience with data quality management (iCEDQ, Collibra).
  • Strong problem-solving skills with a focus on quality and detail.

Responsibilities

  • Design, build, and maintain scalable data pipelines and ETL/ELT processes.
  • Develop, optimise and automate data processing and quality checks.
  • Implement and monitor data quality and governance controls.
  • Use CI/CD practices and automation tools for reliability.
  • Collaborate with cross-functional teams for technical documentation.

Skills

AWS (Lambda, Redshift, Glue, S3)
Advanced SQL
Python
Snowflake
API integration
TypeScript
GitHub Actions
Data quality management (iCEDQ, Collibra)
CI/CD best practices
Strong communication skills

Job description
Senior Data Engineer @ Devire – Poland

Devire IT Outsourcing is a B2B cooperation model under which IT professionals are contracted to Devire's clients.

Our client is a UK-based insurance company seeking an experienced Senior Data Engineer for a long‑term assignment (minimum 12 months) within the investment research team. The main objective is to automate investment analytics, empowering analysts to focus on high‑value activities and improve investment decision quality.

Key Details

  • Salary: 155‑175 PLN/hour (net + VAT, B2B)
  • Location: Remote
  • Language: English used daily
  • Additional benefits: Medical healthcare, sports membership, etc.
  • Long‑term cooperation, flexible working hours

Responsibilities

  • Design, build and maintain scalable data pipelines and ETL/ELT processes on AWS (Lambda, Glue, S3, Redshift) and Snowflake.
  • Develop, optimise and automate data processing and quality checks using advanced SQL and Python, including integration with internal and external APIs.
  • Implement and monitor data quality and governance controls (iCEDQ, Collibra), ensuring secure handling and proper documentation of data assets.
  • Use CI/CD practices and automation tools (GitHub Actions) to reliably deploy and maintain data solutions in line with security and coding standards.
  • Collaborate with cross‑functional teams and stakeholders, providing clear communication, technical documentation (including TypeScript where needed), realistic estimates and regular status updates.

Qualifications

  • Proficiency in AWS (Lambda, Redshift, Glue, S3)
  • Advanced SQL and Python skills for building pipelines, data transformation, quality checks, automation, and integration
  • Experience with Snowflake and API integration
  • Knowledge of TypeScript and creating/maintaining technical documentation
  • Familiarity with automation tools (GitHub Actions)
  • Experience with data quality management (iCEDQ, Collibra)
  • Experience with CI/CD best practices and secure data management
  • Strong communication skills – clear, concise verbal and written communication, able to explain complex concepts to non‑technical stakeholders.
  • Systematic problem-solving, attention to quality and detail, and security awareness.