
Mid & Senior Data Engineer - InsurTech - Remote (Italy based)

JR Italy

Terni

Remote

EUR 60,000 - 80,000

Full-time

4 days ago

Job Description

A dynamic InsurTech company is seeking a Mid & Senior Data Engineer to join their expanding Data team. This remote role focuses on transforming the insurance industry through innovative data solutions. You will work closely with data scientists and engineers to tackle complex challenges, utilizing a cutting-edge tech stack including Python, Kafka, and AWS. If you are passionate about making insurance smarter and more accessible, this is the opportunity for you.

Skills

  • Expertise in batch and distributed data processing.
  • Experience building Data Lake and Big Data analytics platforms.
  • Proficiency in Python with strong software engineering practices.

Responsibilities

  • Bridge the gap between data science and engineering.
  • Collaborate with data scientists and machine learning engineers.
  • Drive impactful innovation in products and technology.

Knowledge

Python
Data Processing
Data Modeling
DevOps
CI/CD

Tools

Kafka
Spark
AWS
Databricks
PostgreSQL

Mid & Senior Data Engineer - InsurTech - Remote (Italy based), Terni
Client: Futureheads Recruitment | B Corp
Location: Italy (Remote, based in Terni)
Job Category: Other
EU work permit required: Yes
Job Reference: 6837761301586903040337167
Posted: 12.05.2025
Expiry Date: 26.06.2025

Job Description:

Mid & Senior Data Engineer - InsurTech - Remote (Italy based)

We have partnered with an exciting business that is growing its Data team, which currently has over 350 members!

They are a cutting-edge insurance company transforming the industry with a customer-first approach. Their focus is on simplicity, transparency, and innovation—offering seamless digital claims, personalized coverage, and fair, data-driven pricing. They are expanding rapidly and seek passionate individuals to join their mission of making insurance smarter and more accessible. If you’re looking for a dynamic, forward-thinking team, this is the place to be!

The tech stack:

  • Python
  • Kafka/Spark
  • Databricks
  • AWS

In this role, you will bridge the gap between data science and engineering. You’ll work on complex data challenges, collaborating closely with data scientists and machine learning engineers to develop practical solutions that meet real business needs. Your work will drive impactful innovation and influence the future of our products and technology.

Key requirements:

  • Deep expertise in batch and distributed data processing, as well as near real-time streaming pipelines using technologies like Kafka, Flink, and Spark (a minimal illustrative sketch of this kind of pipeline follows this list).
  • Proven experience building Data Lake and Big Data analytics platforms on cloud infrastructure.
  • Proficiency in Python with strong software engineering practices.
  • Experience with relational databases and data modeling, including RDBMS (e.g., Redshift, PostgreSQL) and NoSQL systems.
  • Understanding of DevOps, CI/CD pipelines, and Infrastructure as Code (IaC) practices.
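
To give a flavor of the streaming work described above, here is a minimal, purely illustrative sketch of a near real-time pipeline built with PySpark Structured Streaming reading from Kafka. The topic name, broker address, and event schema are assumptions made for the example, not details from the client.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

# Illustrative only: app name, schema, topic, and broker below are assumptions.
spark = SparkSession.builder.appName("policy-events-stream").getOrCreate()

# Assumed shape of an insurance policy event.
event_schema = StructType([
    StructField("policy_id", StringType()),
    StructField("premium", DoubleType()),
    StructField("event_time", TimestampType()),
])

# Read raw events from a hypothetical Kafka topic.
raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "policy-events")               # placeholder topic
    .load()
)

# Kafka values arrive as bytes; parse the JSON payload into columns.
events = (
    raw.select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
       .select("e.*")
)

# Average premium per 5-minute window, tolerating 10 minutes of late data.
avg_premiums = (
    events
    .withWatermark("event_time", "10 minutes")
    .groupBy(F.window("event_time", "5 minutes"))
    .agg(F.avg("premium").alias("avg_premium"))
)

# Console sink keeps the sketch self-contained; a real pipeline would write to a lake sink.
query = (
    avg_premiums.writeStream
    .outputMode("update")
    .format("console")
    .start()
)
query.awaitTermination()

Running something like this against a real cluster would also require the Spark Kafka connector package on the classpath; the snippet is a sketch of the technique, not production code.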

Nice to haves:

  • Experience with cloud platforms (they use AWS but GCP and Azure are also considered).
  • Experience with Databricks is a strong plus.
  • Experience with streaming technologies, especially Kafka.