
Senior Data Engineer / Data Architect

Experis ManpowerGroup Sp. z o.o.

Wrocław

Remote

PLN 254,000 - 383,000

Full time

Today


Job summary

A leading data solutions provider is seeking a Senior Data Engineer / Data Architect to join their remote team. The role involves designing scalable data pipelines with Databricks, collaborating with cross-functional teams, and applying CI/CD principles. The ideal candidate will have at least 8 years of data engineering experience, a strong background in Big Data technologies, and solid Python and SQL skills. This is a unique opportunity to work on impactful projects in a dynamic environment.

Benefits

  • Medicover healthcare package
  • Multisport card
  • Access to an e-learning platform
  • Group life insurance

Qualifications

  • Minimum 8 years of experience in Data Engineering.
  • At least 4 years of hands-on experience with Databricks.
  • Strong analytical and problem-solving skills.

Responsibilities

  • Design and implement scalable data pipelines using Databricks and Python.
  • Collaborate with cross-functional teams using Agile methodologies.
  • Implement CI/CD and DevOps principles in data engineering workflows.

Skills

Databricks
Big Data technologies
Python
SQL
Agile methodologies
Data Lakes
CI/CD
Data Warehousing

Job description

Work Model: 100% Remote
Start Date: ASAP / within 1 month / flexible
Contract Type: B2B

We are looking for a highly skilled Senior Data Engineer / Data Architect to join our remote team. This is a great opportunity for someone with deep expertise in Databricks and Big Data technologies to work on impactful projects in a dynamic and collaborative environment.

Responsibilities:
  • Design and implement scalable data pipelines using Databricks and Python
  • Work with structured, semi-structured, and unstructured data
  • Develop and optimize data warehousing and ETL processes
  • Apply distributed data processing techniques and software engineering best practices
  • Collaborate with cross-functional teams using Agile methodologies
  • Implement CI/CD and DevOps principles in data engineering workflows

Requirements:
  • Minimum 8 years of experience in Data Engineering
  • At least 4 years of hands-on experience with Databricks, including Unity Catalog and data pipelines
  • Minimum 2 years of experience in Big Data environments
  • Strong proficiency in Python and SQL
  • Solid background in data warehousing, ETL, distributed systems, and data modeling
  • Experience with relational and non-relational databases
  • Familiarity with concepts such as Data Lakes, Data Warehouses, Data Marts, and Data Mesh
  • Experience with at least one public cloud platform (Azure, AWS, or GCP)
  • Strong analytical and problem-solving skills in Big Data contexts
  • Fluent English (verbal and written)
  • Experience working with Agile methodologies (Scrum, Kanban) and DevOps practices

What We Offer:
  • Medicover healthcare package
  • Multisport card
  • Access to an e-learning platform
  • Group life insurance