
Data Engineer

TN Italy

Lodi

Remote

EUR 40,000 - 60,000

Full-time

6 days ago

Job summary

A leading company is seeking a Data Engineer to design and implement data pipelines, ensuring data quality and performance. The role involves collaboration with teams to deliver high-quality data assets and optimize existing pipelines using modern technologies. Join a team committed to innovation and excellence.

Skills

  • Strong software/data engineering skills and proficiency in at least one programming language.
  • Experience with CI/CD processes, data testing, and monitoring.

Responsibilities

  • Design and implement data pipelines in collaboration with data users and engineering teams.
  • Ensure reliability, data quality, and optimal performance of data assets.

Knowledge

Python
Data Engineering
Communication

Tools

PySpark
Databricks
Azure Data Factory

Job description


Herzumis is seeking a Data Engineer to join our team and support one of our clients.

Key Responsibilities:
  1. Design and implement data pipelines in collaboration with data users and engineering teams.
  2. Ensure reliability, data quality, and optimal performance of data assets.
  3. Translate complex business and analytics requirements into high-quality data assets.
  4. Deliver high-quality, maintainable, and performant code.
  5. Redesign and optimize existing data pipelines using the latest technologies and best practices where applicable.
  6. Collaborate with solution engineers, data scientists, and product owners to deliver end-to-end products.
  7. Support partners with proof of concept initiatives and technical data-related questions.
Required Skills:
  • Strong software/data engineering skills and proficiency in at least one programming language, preferably Python.
  • Good knowledge of distributed computing frameworks, with PySpark being a must-have; additional frameworks are a plus.
  • Familiarity with system design, data structures, algorithms, storage systems, and cloud infrastructure.
  • Understanding of data modeling and data architecture concepts.
  • Experience with CI/CD processes, data testing, and monitoring.
  • Knowledge of Delta Lake protocol and Lakehouse architectures.
  • Experience with Databricks and Azure data services such as Azure Data Factory, Synapse Analytics, or Fabric.
Additional Skills:
  • Ability to work effectively in teams with both technical and non-technical members.
  • Strong communication skills to explain complex technical concepts clearly to non-technical audiences.
  • Proficiency in English.
  • Experience with YAML is a plus.
Work Mode:

Fully remote

Join Us!

Become part of a team driven by innovation, talent, and a commitment to excellence. Your next career step starts here.

This announcement is addressed to all genders, ages, and nationalities, in accordance with applicable laws.
