
Data Engineer

RED Global

Treviso

Remote

EUR 40,000 - 70,000

Full time

Yesterday
Be among the first applicants


Job description

A leading company is seeking a Data Engineer for remote work on banking and insurance projects. The role requires expertise in Python, SQL, and Azure Data Factory to design and implement advanced data solutions. Ideal candidates bring experience from complex data environments and a strong focus on quality and safety.

Skills

  • Experience in a complex data environment supporting operational systems.
  • Fluency in Python and SQL required, with knowledge of Data Analytics.
  • Experience with Azure DevOps and ETL tools is advantageous.

Responsibilities

  • Design and implement data solutions within the team on banking/insurance projects.
  • Support the full data lifecycle, ensuring quality and robustness.
  • Work independently in an agile setup, contributing to big data efforts.

Knowledge

  • Python
  • SQL
  • Data Analytics
  • Azure Data Factory

Full job description

On behalf of a Key Client Partner in Italy, I am currently searching for Data Engineers to join new banking / insurance projects.

Please find the details below:

  • Role: Data Engineer - Azure
  • Duration: 6 months + possible extension
  • Location: 100% remote
  • Capacity: Full time
  • Language: English B2 (Italian nice to have, but not a must)

Scope: You will be an integral part of the team. By working on our platform, you will gain deep insight into data solutions and business requirements. You will support the design and implementation of our future setup, and your expertise will help us make the right decisions, implement new solutions, and ensure smooth migrations on our journey.

Skills:

  • Senior Data Engineer with the experience to enrich the team
  • Experienced and motivated to work in a complex data environment, supporting the operational system and participating in the development of the new system
  • Excellent understanding of the data management context, the entire data lifecycle, and the processes and tools involved
  • High awareness of, and demand for, safety, quality and robustness
  • Experience with ETL and the respective tools
  • No aversion to SQL, Informatica, ControlM and the like
  • Performs well in an agile setup; used to working independently and in a team; eager to apply big data experience and learn new skills
  • Experience with a range of data engineering toolsets and scripting languages
  • Experience with Azure DevOps, Terraform, Data Factory, Synapse, Spark, PowerShell and Python is an advantage

Must have: Python, SQL, Data Analytics, Azure Data Factory

If you are interested and available, please apply or send an email for immediate consideration.
