Mid & Senior Data Engineer - InsurTech - Remote (Italy based)

JR Italy

Firenze

Remote

EUR 50,000 - 80,000

Full-time

4 days ago

Job Description

A leading InsurTech company based in Italy is seeking a Mid & Senior Data Engineer to join their expanding Data team. This role focuses on bridging data science and engineering, tackling complex data challenges, and driving innovation in the insurance industry. The ideal candidate will possess expertise in data processing, cloud platforms, and software engineering best practices. Join a dynamic team dedicated to transforming the insurance landscape with cutting-edge technology and a customer-first approach.

Skills

  • Expertise in batch and distributed data processing.
  • Experience building Data Lake and Big Data analytics platforms.

Responsibilities

  • Collaborate with data scientists and machine learning engineers.
  • Develop practical solutions for complex data challenges.

Knowledge

Python
Data Processing
Data Modeling
DevOps
CI/CD

Tools

Kafka
Spark
Databricks
AWS
RDBMS
NoSQL

Mid & Senior Data Engineer - InsurTech - Remote (Italy based), Firenze
Client: Futureheads Recruitment | B Corp
Location: Firenze, Italy
Job Category: Other
EU work permit required: Yes
Job Reference: 683776130158690304033714

Posted: 12.05.2025
Expiry Date: 26.06.2025

Job Description:

Mid & Senior Data Engineer - InsurTech - Remote (Italy based)

We have partnered with an exciting business that is growing its Data team, currently over 350 members!

They are a cutting-edge insurance company transforming the industry with a customer-first approach. Their focus is on simplicity, transparency, and innovation: seamless digital claims, personalized coverage, and fair, data-driven pricing. They are expanding rapidly and are looking for passionate people to join their mission of making insurance smarter and more accessible. If you want a dynamic, forward-thinking team, this is the place for you!

The tech stack includes (a brief illustrative sketch follows this list):

  • Python
  • Kafka/Spark
  • Databricks
  • AWS
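
As a purely illustrative sketch of how these pieces typically fit together (this is not the company's code: the topic name, S3 paths, and event schema below are invented), a PySpark Structured Streaming job might read claim events from Kafka and land them in a Delta table on AWS S3:

from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

# On Databricks a SparkSession named `spark` already exists; this line is for standalone runs.
spark = SparkSession.builder.appName("claims-stream-demo").getOrCreate()

# Hypothetical schema for an insurance claim event.
claim_schema = StructType([
    StructField("claim_id", StringType()),
    StructField("policy_id", StringType()),
    StructField("amount", DoubleType()),
])

# Read JSON events from a (hypothetical) Kafka topic.
raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "claims-events")
    .load()
)

# Parse the Kafka value payload into typed columns.
claims = (
    raw.select(from_json(col("value").cast("string"), claim_schema).alias("c"))
    .select("c.*")
)

# Append to a Delta table on S3; bucket and checkpoint location are placeholders.
(
    claims.writeStream.format("delta")
    .option("checkpointLocation", "s3://example-bucket/checkpoints/claims")
    .outputMode("append")
    .start("s3://example-bucket/delta/claims")
)

The same pattern scales from a local test against a single broker to a scheduled Databricks job.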

Your role will involve bridging data science and engineering, focusing on complex data challenges. You will collaborate with data scientists and machine learning engineers to develop practical solutions that meet real business needs, driving impactful innovation and shaping future products and technology.

Key requirements:

  • Expertise in batch and distributed data processing, with experience in real-time streaming pipelines using Kafka, Flink, and Spark.
  • Experience building Data Lake and Big Data analytics platforms on cloud infrastructure.
  • Proficiency in Python, following best software engineering practices.
  • Knowledge of relational databases and data modeling, with hands-on experience with RDBMS (e.g., Redshift, PostgreSQL) and NoSQL systems (see the sketch after this list).
  • Understanding of DevOps, CI/CD pipelines, and Infrastructure as Code (IaC).
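
For the relational-modeling side, a minimal sketch (the table, columns, and connection string are assumptions, not taken from the posting) using psycopg2 against PostgreSQL could look like this:

import psycopg2

# Hypothetical policy table: a simple model with a natural key and an idempotent upsert.
DDL = """
CREATE TABLE IF NOT EXISTS policies (
    policy_id   TEXT PRIMARY KEY,
    product     TEXT NOT NULL,
    premium_eur NUMERIC(10, 2) NOT NULL,
    created_at  TIMESTAMPTZ DEFAULT now()
);
"""

UPSERT = """
INSERT INTO policies (policy_id, product, premium_eur)
VALUES (%s, %s, %s)
ON CONFLICT (policy_id) DO UPDATE SET premium_eur = EXCLUDED.premium_eur;
"""

# Connection parameters are placeholders; the `with` block commits the transaction on success.
with psycopg2.connect("dbname=insurtech user=demo password=demo host=localhost") as conn:
    with conn.cursor() as cur:
        cur.execute(DDL)
        cur.execute(UPSERT, ("POL-001", "motor", 420.00))

Note that Redshift does not support ON CONFLICT, so the equivalent there is a staging table plus MERGE.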

Nice to have:

  • Experience with cloud platforms, ideally AWS; GCP and Azure are also acceptable.
  • Experience with Databricks is a strong plus.
  • Familiarity with streaming technologies, especially Kafka.