
[Future opening] Senior Data Engineer (Azure / Databricks)

GetInData sp. z o.o. sp. k.

Warszawa

Remote

Full time

8 days ago


Job summary

A leading data company is seeking a Data Engineer who will design and maintain data architecture to support substantial data volumes. With opportunities for professional growth, this role involves collaborating on international projects and working alongside top experts in Data and AI. Candidates should have proficiency in programming languages and experience with data platforms and tools.

Benefits

Possibility to work from the office in Warsaw
Opportunity to learn with Big Data experts
International projects
Workshops and training opportunities

Qualifications

  • Proficiency in programming languages such as Python, Scala, or Java.
  • Experience with Lakehouse platforms like Databricks.
  • Knowledge of orchestration tools and CI/CD.

Responsibilities

  • Developing and committing new functionalities and tools.
  • Ensuring compliance with security and data privacy standards.
  • Conducting training and knowledge-sharing sessions.

Skills

Python
Scala
Java
Databricks
dbt
Git
Airflow
Azure Data Factory
Docker
Terraform

Tools

DevOps practices

Job description

  • Remote
  • 160 - 200 PLN net + VAT/h (B2B)
About us

GetInData | Part of Xebia is a leading data company working for international Clients, delivering innovative projects related to Data, AI, Cloud, Analytics, ML/LLM, and GenAI. The company was founded in 2014 by data engineers and today brings together 120 Data & AI experts. Our Clients are both fast-growing scaleups and large corporations that are industry leaders. In 2022, we joined forces with Xebia Group to broaden our horizons and bring new international opportunities.

What about the projects we work with?

We run a variety of projects in which our sweepmasters can excel: Advanced Analytics, Data Platforms, Streaming Analytics Platforms, Machine Learning Models, Generative AI, and more. We like working with top technologies and open-source solutions for Data & AI and ML. Our portfolio includes Clients from many industries, e.g., media, e-commerce, retail, fintech, banking, and telcos, such as Truecaller, Spotify, ING, Acast, Volt, Play, and Allegro.

What else do we do besides working on projects?

We conduct many initiatives like Guilds and Labs and other knowledge-sharing initiatives. We build a community around Data & AI, thanks to our conference Big Data Technology Warsaw Summit, meetup Warsaw Data Tech Talks, Radio Data podcast, and DATA Pill newsletter.

Data & AI projects that we run and the company's philosophy of sharing knowledge and ideas in this field make GetInData | Part of Xebia not only a great place to work but also a place that provides you with a real opportunity to boost your career.

If you want to stay up to date with the latest news from us, please follow our LinkedIn profile.

About role

We are excited to announce that we are looking for a Data Engineer! This position is vital to our company, and we are seeking candidates with outstanding skills and experience. Although there isn't an immediate project available, we invite you to reach out and explore future opportunities with us.

A Data Engineer designs, builds, and maintains the data architecture, tools, and processes that enable an organization to collect, store, transform, and analyze substantial volumes of data. The role involves building data platforms on top of commonly provided infrastructure and establishing a streamlined path for the Analytics Engineers who rely on the system.

Responsibilities

  1. Developing and contributing new functionalities and open-source tools
  2. Implementing policies in line with the company's strategic plans regarding technologies, work organization, etc.
  3. Ensuring compliance with industry standards and regulations for security and data privacy in the data-processing layer
  4. Conducting training and knowledge-sharing sessions

Job requirements
  • Proficiency in a programming language such as Python, Scala, or Java
  • Knowledge of Lakehouse platforms, particularly Databricks
  • Experience working with dbt
  • Familiarity with version control systems, particularly Git
  • Experience as a programmer and knowledge of good software engineering principles, practices, and solutions
  • Knowledge of at least one orchestration and scheduling tool, for example Airflow, Azure Data Factory, Prefect, or Dagster
  • Familiarity with DevOps practices and tools, including Docker, Terraform, CI/CD, and Azure DevOps
  • Ability to actively participate in or lead discussions with clients to identify and assess concrete and ambitious avenues for improvement

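As an aside (not part of the posting itself): the orchestration requirement above boils down to running pipeline tasks in dependency order, which tools like Airflow, Prefect, and Dagster automate with scheduling, retries, and monitoring. A minimal sketch of that core idea using only the Python standard library — the task names here are hypothetical:

```python
from graphlib import TopologicalSorter

# Hypothetical mini-pipeline: each key lists the tasks it depends on,
# following the convention graphlib.TopologicalSorter expects.
tasks = {
    "transform": {"extract"},
    "quality_check": {"extract"},
    "load": {"transform", "quality_check"},
}

def run_order(graph):
    """Return one valid execution order, as a scheduler would compute it."""
    return list(TopologicalSorter(graph).static_order())

order = run_order(tasks)
# "extract" runs first, "load" runs last; the middle two may run in parallel.
```

Real orchestrators build the same dependency graph but add scheduling, parallel execution, and failure handling on top of it.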
We offer

Salary: 160 - 200 PLN net + VAT/h B2B (depending on knowledge and experience)

Possibility to work from the office located in the heart of Warsaw

Opportunity to learn and develop with the best Big Data experts

International projects

Possibility of conducting workshops and training
