Mid & Senior Data Engineer - InsurTech - Remote (Italy based)

JR Italy

Modena

Remote

EUR 60.000 - 80.000

Full time

11 days ago

Job description

A leading InsurTech company is seeking a Mid & Senior Data Engineer to join their innovative team. This remote position focuses on bridging data science and engineering, tackling complex data challenges, and driving impactful innovation in the insurance sector. The ideal candidate will have expertise in Python, data processing technologies, and cloud infrastructure, contributing to a mission of making insurance smarter and more accessible.

Skills

  • Deep expertise in batch and distributed data processing.
  • Proven experience building Data Lake and Big Data analytics platforms.

Responsibilities

  • Collaborate with data scientists and machine learning engineers.
  • Build practical solutions addressing real business needs.

Knowledge

Python
Data Modeling
DevOps

Tools

Kafka
Spark
Databricks
AWS

Client: Futureheads Recruitment | B Corp

Location: Modena
Job Category: Other

EU work permit required: Yes

Job Reference: 6837761301586903040337112

Job Views: 2

Posted: 12.05.2025

Expiry Date: 26.06.2025

Job Description:

Mid & Senior Data Engineer - InsurTech - Remote (Italy based)

We have partnered with an exciting business that is growing out its Data team, with an engineering department of over 350 people currently!

They’re a cutting-edge insurance company that’s transforming the industry with a customer-first approach. Everything they do is built around simplicity, transparency, and innovation—whether it’s seamless digital claims, personalised coverage, or fair, data-driven pricing. They’re growing fast and looking for passionate people to join their mission of making insurance smarter and more accessible. If you’re looking for a dynamic, forward-thinking team, this is the place to be!

The tech stack:

  • Python
  • Kafka/Spark
  • Databricks
  • AWS

You'll be helping bridge the gap between data science and engineering. Focusing on complex data challenges, you’ll collaborate closely with data scientists and machine learning engineers to build practical, technical solutions that address real business needs. Your work will drive impactful innovation and help shape the future of their products and technology.

Key requirements:

  • Deep expertise in batch and distributed data processing, as well as near real-time streaming pipelines using technologies like Kafka, Flink, and Spark.
  • Proven experience building Data Lake and Big Data analytics platforms on cloud infrastructure.
  • Proficient in Python with strong adherence to software engineering best practices.
  • Skilled in relational databases and data modeling, with hands-on experience in RDBMS (e.g., Redshift, PostgreSQL) and NoSQL systems.
  • Solid understanding of DevOps, CI/CD pipeline management, and Infrastructure as Code (IaC) using industry-standard practices.

Nice-to-haves include:

  • Experience with cloud platforms (they use AWS but are open to GCP and Azure).
  • Experience with Databricks is a strong plus.
  • Streaming technologies (Kafka is their go-to).