
Mid & Senior Data Engineer - InsurTech - Remote (Italy based)

JR Italy

Pescara

Remote

EUR 50,000 - 80,000

Full-time

4 days ago

Job description

A growing InsurTech company is seeking a Mid & Senior Data Engineer to join their expanding Data team. This remote role focuses on transforming the insurance industry with innovative solutions. Candidates will work with cutting-edge technologies to solve complex data challenges and enhance customer experience.

Skills

  • Deep expertise in batch and distributed data processing.
  • Experience building Data Lakes and Big Data platforms.
  • Proficiency in Python and best software engineering practices.

Responsibilities

  • Bridge data science and engineering, working on complex data challenges.
  • Collaborate with data scientists and machine learning engineers.
  • Drive innovation and shape product and technology futures.

Knowledge

  • Python
  • Kafka
  • Spark
  • Data Processing
  • DevOps

Tools

  • AWS
  • Databricks
  • Redshift
  • PostgreSQL

Mid & Senior Data Engineer - InsurTech - Remote (Italy based), Pescara
Client: Futureheads Recruitment | B Corp
Location: Remote (Italy based)
Job Category: Other
EU work permit required: Yes
Job Reference: 6837761301586903040337143
Posted: 12.05.2025
Expiry Date: 26.06.2025

Job Description:

We have partnered with an exciting business, currently over 350 people strong, that is growing its Data team.

They are a cutting-edge insurance company transforming the industry with a customer-first approach, focusing on simplicity, transparency, and innovation—digital claims, personalized coverage, and data-driven pricing. They seek passionate individuals to join their mission of making insurance smarter and more accessible.

The tech stack:

  • Python
  • Kafka/Spark
  • Databricks
  • AWS

You will bridge data science and engineering, working on complex data challenges. Collaborating with data scientists and machine learning engineers, you will develop solutions that meet business needs, drive innovation, and shape the future of the product and its technology.

Key requirements:

  • Deep expertise in batch and distributed data processing, including real-time streaming with Kafka, Flink, and Spark.
  • Experience building Data Lakes and Big Data platforms on cloud infrastructure.
  • Proficiency in Python, following best software engineering practices.
  • Experience with relational databases (Redshift, PostgreSQL) and NoSQL systems.
  • Understanding of DevOps, CI/CD, and Infrastructure as Code (IaC).

Nice to haves:

  • Experience with AWS (others like GCP and Azure are a plus).
  • Experience with Databricks.
  • Knowledge of streaming technologies, especially Kafka.