Data Engineer

TN Italy

Lecco

Remote

EUR 40.000 - 70.000

Full time

3 days ago

Job Description

A leading company is seeking a Data Engineer to join their innovative team. This role involves designing and implementing data pipelines, ensuring data quality, and collaborating with various stakeholders to deliver high-quality data assets. The position is fully remote, allowing for flexibility while working on cutting-edge technologies. Ideal candidates will have strong software engineering skills, particularly in Python and PySpark, and the ability to communicate complex concepts clearly. Join us to take your career to the next level!

Skills

  • Proficiency in Python and PySpark required.
  • Experience with CI/CD processes and data testing.

Responsibilities

  • Design and implement data pipelines in collaboration with teams.
  • Ensure reliability, data quality, and optimal performance.
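The responsibilities above can be sketched in plain Python (a simplified, hypothetical stand-in for the PySpark/Databricks stack listed under Tools; all record and field names are invented for illustration):

```python
# Minimal sketch of a pipeline stage followed by a data-quality gate.
from dataclasses import dataclass

@dataclass
class Order:
    order_id: str
    amount_eur: float

def transform(raw_rows: list[dict]) -> list[Order]:
    """Parse raw records into typed rows, skipping malformed ones."""
    out = []
    for row in raw_rows:
        try:
            out.append(Order(str(row["order_id"]), float(row["amount_eur"])))
        except (KeyError, TypeError, ValueError):
            continue  # a real pipeline would quarantine and log these rows
    return out

def quality_check(rows: list[Order]) -> None:
    """Fail the run if basic invariants are violated."""
    assert rows, "empty output: upstream extract likely failed"
    assert all(r.amount_eur >= 0 for r in rows), "negative amounts found"
    assert len({r.order_id for r in rows}) == len(rows), "duplicate order_id"

raw = [
    {"order_id": 1, "amount_eur": "19.90"},
    {"order_id": 2, "amount_eur": 5.0},
    {"order_id": 3},  # malformed: missing amount, dropped by transform
]
orders = transform(raw)
quality_check(orders)
print(len(orders))  # 2 rows survive
```

In production the same shape appears as PySpark transformations plus automated expectations on the output dataset; the quality gate is what keeps bad data from propagating downstream.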

Knowledge

Software Engineering
Data Engineering
Communication
Problem Solving

Tools

Python
PySpark
Databricks
Azure Data Factory
Synapse Analytics
Fabric

Role: Data Engineer

Herzumis is seeking a Data Engineer to join our team and support one of our clients.

Key Responsibilities:
  1. Design and implement data pipelines in collaboration with data users and other engineering teams.
  2. Ensure reliability, data quality, and optimal performance of data assets.
  3. Translate complex business and analytics requirements into high-quality data assets.
  4. Deliver high-quality, maintainable code focusing on simplicity and performance.
  5. Redesign and implement existing data pipelines using the latest technologies and best practices where applicable.
  6. Collaborate with solution engineers, data scientists, and product owners to deliver end-to-end products.
  7. Support partners with proof-of-concept initiatives and technical questions related to data.
Required Skills:
  1. Strong software/data engineering skills and proficiency in at least one programming language, preferably Python.
  2. Good knowledge of distributed computing frameworks, with PySpark being a must-have; others are advantageous.
  3. Familiarity with system design, data structures, algorithms, storage systems, and cloud infrastructure.
  4. Understanding of data modeling and data architecture concepts.
  5. Experience with CI/CD processes, data testing, and monitoring.
  6. Knowledge of Delta Lake protocol and Lakehouse architectures.
  7. Experience with Databricks and Azure data services such as Azure Data Factory, Synapse Analytics, or Fabric.
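Skill 2 above names distributed computing with PySpark as a must-have. Conceptually, a distributed groupBy/agg reduces to the following single-process sketch (plain Python standing in for a cluster; column names are invented):

```python
# Conceptual sketch of the groupBy/agg pattern that PySpark distributes
# across a cluster, reduced here to single-process Python.
from collections import defaultdict

def group_sum(rows: list[dict], key: str, value: str) -> dict[str, float]:
    """Roughly equivalent in spirit to df.groupBy(key).agg(sum(value))."""
    totals: defaultdict[str, float] = defaultdict(float)
    for row in rows:        # in Spark this loop runs partition by partition
        totals[row[key]] += row[value]
    return dict(totals)     # Spark would shuffle and merge partial sums

sales = [
    {"region": "north", "amount": 10.0},
    {"region": "south", "amount": 4.5},
    {"region": "north", "amount": 2.5},
]
print(group_sum(sales, "region", "amount"))  # {'north': 12.5, 'south': 4.5}
```

The value Spark adds over this sketch is exactly what the role exercises: partitioning, shuffling, and merging partial aggregates across many machines while keeping the same logical API.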
Additional Skills:
  1. Ability to work effectively with both technical and non-technical team members.
  2. Excellent communication skills to explain complex technical concepts clearly to non-technical audiences.
  3. Proficiency in English.
  4. Experience with YAML is a plus.
Work Mode:

Full remote.

Join Us!

Become part of a team driven by innovation, talent, and a commitment to excellence. Your next career step starts here.

This announcement is addressed to all genders, ages, and nationalities, in accordance with applicable laws.
