Mid & Senior Data Engineer - InsurTech - Remote (Italy based)

JR Italy

Belluno

Remote

EUR 50,000 - 80,000

Full-time

10 days ago

Job description

A leading InsurTech company is seeking a Mid & Senior Data Engineer to join their growing Data team. This remote role focuses on bridging data science and engineering to tackle complex data challenges. The ideal candidate will have expertise in data processing, proficiency in Python, and experience with cloud platforms. Join a dynamic team dedicated to making insurance smarter and more accessible.

Skills

  • Deep expertise in batch and distributed data processing.
  • Experience building Data Lake and Big Data analytics platforms.
  • Proficiency in Python and software engineering best practices.

Responsibilities

  • Bridge data science and engineering to solve complex data challenges.
  • Collaborate with data scientists and ML engineers.
  • Drive innovation and shape future products and technology.

Knowledge

Data Processing
Python
Data Modeling
DevOps

Tools

Kafka
Spark
AWS
Databricks

Mid & Senior Data Engineer - InsurTech - Remote (Italy based), Belluno
Client: Futureheads Recruitment | B Corp
Location: Italy (Remote, based in Belluno)
Job Category: Other
EU work permit required: Yes
Job Reference: 6837761301586903040337151

Posted: 12.05.2025
Expiry Date: 26.06.2025

Job Description:

We have partnered with an exciting business that is growing its Data team, currently over 350 members!

They are a cutting-edge insurance company transforming the industry with a customer-first approach. Their focus is on simplicity, transparency, and innovation—covering seamless digital claims, personalized coverage, and data-driven pricing. They are expanding rapidly and seek passionate individuals to join their mission of making insurance smarter and more accessible. If you seek a dynamic, forward-thinking team, this is the place to be!

The tech stack includes:

  • Python
  • Kafka/Spark
  • Databricks
  • AWS

Your role will involve bridging data science and engineering, focusing on complex data challenges. You’ll collaborate with data scientists and ML engineers to develop practical solutions that meet real business needs, driving innovation and shaping future products and technology.

Key requirements:

  • Deep expertise in batch and distributed data processing, including near real-time streaming pipelines with Kafka, Flink, and Spark (a brief illustrative sketch follows these lists).
  • Experience building Data Lake and Big Data analytics platforms on cloud infrastructure.
  • Proficiency in Python, following software engineering best practices.
  • Experience with relational databases and data modeling, including RDBMS (e.g., Redshift, PostgreSQL) and NoSQL systems.
  • Understanding of DevOps, CI/CD pipelines, and Infrastructure as Code (IaC).

Nice-to-haves:

  • Experience with cloud platforms (AWS preferred; GCP and Azure are also considered).
  • Experience with Databricks is a strong plus.
  • Familiarity with streaming technologies, especially Kafka.
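
As a purely illustrative aside, the minimal Python sketch below shows the kind of near real-time pipeline the requirements above describe: Spark Structured Streaming reading JSON events from a Kafka topic and appending them to a lake table. It is a sketch under assumed conditions, not part of the posting; the broker, topic, schema fields, and storage paths are hypothetical placeholders.

    # Minimal sketch, assuming a Databricks/Spark runtime with the Kafka source
    # and Delta Lake available. Every identifier below is a placeholder.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F
    from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

    spark = SparkSession.builder.appName("claims-events-stream").getOrCreate()

    # Assumed shape of each JSON event on the topic.
    event_schema = StructType([
        StructField("claim_id", StringType()),
        StructField("amount", DoubleType()),
        StructField("created_at", TimestampType()),
    ])

    # Read the raw Kafka stream (the value column arrives as bytes).
    raw = (
        spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")   # placeholder broker
        .option("subscribe", "claims-events")                # placeholder topic
        .load()
    )

    # Decode the payload and parse the JSON into typed columns.
    events = (
        raw.selectExpr("CAST(value AS STRING) AS json")
        .select(F.from_json("json", event_schema).alias("event"))
        .select("event.*")
    )

    # Append the stream to a Delta table, with checkpointing for fault tolerance.
    query = (
        events.writeStream
        .format("delta")
        .option("checkpointLocation", "s3://example-bucket/checkpoints/claims")  # placeholder path
        .start("s3://example-bucket/tables/claims_events")                       # placeholder path
    )

    query.awaitTermination()

In practice the specifics (schema management, watermarking, Flink vs. Spark) depend entirely on the team's platform; this is only meant to make the listed stack concrete.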