Senior Azure Data Engineer

Cyclad Sp. z o.o.

Remote

PLN 240,000 - 320,000

Full time

Job summary

A leading IT services company is seeking an experienced Azure Data Engineer with Databricks expertise to join its remote team. You will design and maintain data platforms using the Azure analytics stack and collaborate with various stakeholders to deliver secure data solutions. Strong experience with Azure Data Factory and a background in data engineering are essential. The role is a full-time B2B contract with benefits including private medical care and a Multisport card.

Benefits

Private medical care with dental coverage
Multisport card
Life insurance
International work environment

Qualifications

  • Minimum 3 years' experience with Azure Data Factory and Databricks.
  • At least 5 years' background in data engineering or backend development.
  • Strong SQL skills in both relational and non-relational formats.
  • Experience with data transformation tools and real-time analytics on Azure.
  • Strong communication skills in English.

Responsibilities

  • Design, build, and deploy scalable data pipelines using Azure Databricks.
  • Curate structured and unstructured data through efficient pipelines.
  • Create and maintain data pipeline architecture ensuring reliability.
  • Identify and implement process improvements for data delivery.
  • Collaborate with Product Owners and data leadership for timely delivery.

Skills

Azure Data Factory
Databricks
Python
SQL
Git
CI/CD
Spark
Airflow

Tools

Azure Event Hubs
CosmosDB
Spark Streaming

Job description

At Cyclad, we work with top international IT companies to boost cutting-edge technologies that shape the world of the future. We are seeking an experienced Azure Data Engineer with Databricks expertise to join a remote development team. The role focuses on designing, building, and maintaining data platforms using the Microsoft Azure analytics stack and modern Databricks features. The candidate will collaborate closely with architects, data scientists, and business stakeholders to deliver secure, production-ready data solutions.

Project information:
  • Type of project: IT services
  • Budget: 150-180 PLN net/h (B2B)
  • Only candidates with EU citizenship and residence in Poland will be considered
  • Start date: ASAP (depending on candidate's availability)
Project scope:
  • Design, build, and deploy scalable data pipelines using Azure Databricks and the Azure Analytics stack
  • Curate structured, semi-structured, and unstructured data by creating efficient, cost-effective, and scalable pipelines
  • Create and maintain robust data pipeline architecture ensuring data quality, reliability, and scalability
  • Assemble and manage large, complex datasets to meet functional and non-functional business requirements
  • Identify, design, and implement process improvements, including automation and optimization of data delivery
  • Work with real-time and streaming analytics solutions where applicable
  • Collaborate closely with Product Owners, Scrum Masters, architects, and data leadership to ensure timely, high-quality delivery
  • Align with Data Engineering chapter standards, processes, and best practices
  • Apply solid software engineering practices, including unit testing, CI/CD, and version control
  • Troubleshoot complex data-related issues and perform root cause analysis
  • Ensure data security, governance, and compliance with data management frameworks
Tech stack:
  • Data Processing & Analytics: Databricks, Spark, SQL, real-time analytics tools
  • Programming & Scripting: Python, SQL
  • Workflow & Orchestration: Airflow
Requirements:
  • Minimum 3 years' hands-on experience with Azure Data Factory and Databricks (modern features, including Unity Catalog and 2025 capabilities)
  • At least 5 years' experience in data engineering or backend/full-stack software development
  • Solid software engineering background: writing unit tests in Python, proficient with Git, and CI/CD pipelines
  • Strong SQL skills and experience structuring and modelling data in both relational and non-relational formats
  • Experience with data transformation tools, Spark, and real-time analytics solutions on Azure
  • Familiarity with modern cloud infrastructure and analytics tools, including Azure Event Hubs, CosmosDB, Spark Streaming, or Airflow, is a plus
  • Exposure to data catalogue tools (Collibra, Alation) and data management frameworks (e.g., DAMA) is a plus
  • Strong verbal and written communication skills in English
We offer:
  • Full-time engagement based on a B2B contract
  • Private medical care with dental coverage (70% of costs) and a rehabilitation package; family package option available
  • Multisport card (also for an accompanying person)
  • Life insurance
  • Flexibility and international environment