Data Engineer

TN Italy

Como

Remote

EUR 40,000 - 60,000

Full time

4 days ago

Job description

A leading company is seeking a Data Engineer to join their team. This role involves designing data pipelines, ensuring data quality, and collaborating with various teams to deliver high-quality data solutions. The position is fully remote, allowing for flexibility and innovation in a supportive environment.

Skills

  • Proficiency in at least one programming language, preferably Python.
  • Experience with CI/CD, data testing, and monitoring.

Responsibilities

  • Design and implement data pipelines in collaboration with teams.
  • Ensure reliability, data quality, and optimal performance of data assets.

Knowledge

Software Engineering
Data Engineering
Communication
Problem Solving

Tools

Python
PySpark
Databricks
Azure Data Factory
Synapse Analytics

Social Network Platform - Data Engineer Role

Herzumis is seeking a Data Engineer to join our team and support one of our clients.

Key Responsibilities:
  1. Design and implement data pipelines in collaboration with data users and engineering teams.
  2. Ensure reliability, data quality, and optimal performance of data assets.
  3. Translate complex business and analytics requirements into high-quality data assets.
  4. Deliver high-quality, maintainable code with a focus on simplicity and performance.
  5. Redesign and implement existing data pipelines using the latest technologies and best practices.
  6. Collaborate with solution engineers, data scientists, and product owners to deliver end-to-end products.
  7. Support partners with proof of concept initiatives and address data-related technical questions.
Required Skills:
  • Strong software/data engineering skills, proficiency in at least one programming language (Python preferred).
  • Good knowledge of distributed computing frameworks, with PySpark being essential.
  • Familiarity with system design, data structures, algorithms, storage systems, and cloud infrastructure.
  • Understanding of data modeling and architecture concepts.
  • Experience with CI/CD, data testing, and monitoring.
  • Knowledge of Delta Lake protocol and Lakehouse architectures.
  • Experience with Databricks and Azure data services such as Azure Data Factory, Synapse Analytics, or Fabric.
Additional Skills:
  • Ability to work effectively with both technical and non-technical team members.
  • Strong communication skills to explain complex technical concepts clearly.
  • Proficiency in English.
  • YAML experience is a plus.
Work Mode:

Fully remote.

Join Us!

Become part of a team driven by innovation, talent, and a commitment to excellence. Your next career step starts here.

This announcement is addressed to all genders and nationalities, in accordance with applicable laws.
