Azure Data Engineer - Remote

buscojobs España

Basque Country

Remote

EUR 35,000 - 55,000

Full-time

Posted 2 days ago
Vacancy description

A leading company in Spain is seeking a Data Engineer - Azure ETL Specialist to design and maintain robust ETL pipelines using Azure technologies. The ideal candidate will have a strong background in data engineering and proficiency in SQL, Databricks, and Azure Data Factory, along with a Bachelor's degree in a related field. Benefits include training budgets, flexible hours, and private medical insurance.

Benefits

Budget of €1,200 for individual training
Flexible working hours
Private medical insurance paid in full
Discounts on major brands for employees

Qualifications

  • Proven experience in building ETL pipelines using Azure Data Factory, Databricks, SQL, and Azure Data Lake.
  • Strong proficiency in SQL for data manipulation and querying.
  • Deep knowledge of programming languages: Python, Java, or Scala.

Responsibilities

  • Design, develop, and maintain ETL pipelines using Azure Data Factory.
  • Utilize Databricks for advanced data transformations and analytics.
  • Collaborate with teams to understand data requirements.

Skills

ETL pipelines
Azure Data Factory
Databricks
SQL
Azure Data Lake
Python
Java
Scala
Apache Spark
Data governance

Education

Bachelor's degree in Computer Science, Engineering, or related field

Job description

Capitole keeps growing and we want to do it with you!

We are seeking a Data Engineer - Azure ETL Specialist (residents of Spain only).

We are looking for a skilled Data Engineer proficient in building robust ETL (Extract, Transform, Load) pipelines using Azure Data Factory, Databricks, SQL, and Azure Data Lake. The ideal candidate will have a strong background in data engineering with a focus on Azure technologies, particularly Azure Data Factory and Databricks.

Responsibilities:
  1. Design, develop, and maintain ETL pipelines using Azure Data Factory to extract, transform, and load data from various sources into Azure Data Lake and other target destinations.
  2. Utilize Databricks for advanced data transformations, processing, and analytics, ensuring optimal performance and scalability.
  3. Collaborate with cross-functional teams to understand data requirements and implement solutions that meet business needs.
  4. Develop and maintain data catalogs within Databricks, ensuring accurate documentation and metadata management.
  5. Optimize and troubleshoot existing ETL processes to improve efficiency, reliability, and performance.
  6. Stay up to date with the latest Azure technologies and best practices in data engineering.
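The posting does not include code, but the pipeline work described in points 1-3 can be sketched in miniature. Below is a hypothetical extract-transform-load flow using only Python's standard library, with an in-memory SQLite database standing in for the Azure Data Lake target; all table and column names are illustrative, not part of the role:

```python
import sqlite3

# Extract step: hypothetical rows pulled from a source system.
raw_orders = [
    ("A-1", "2024-05-01", "  widget ", 3, 9.99),
    ("A-2", "2024-05-02", "gadget", 1, 24.50),
    ("A-3", "2024-05-02", "widget", 2, 9.99),
]

def transform(rows):
    """Clean and enrich each row: trim product names, compute line totals."""
    for order_id, order_date, product, qty, unit_price in rows:
        yield (order_id, order_date, product.strip(), qty, round(qty * unit_price, 2))

# Load step: write the transformed rows into the target table.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (order_id TEXT, order_date TEXT,"
    " product TEXT, qty INTEGER, total REAL)"
)
conn.executemany("INSERT INTO orders VALUES (?, ?, ?, ?, ?)", transform(raw_orders))

# Downstream SQL query: aggregate revenue per product.
revenue = dict(conn.execute("SELECT product, SUM(total) FROM orders GROUP BY product"))
print(revenue)  # {'gadget': 24.5, 'widget': 49.95}
```

In Azure Data Factory the extract and load stages would be pipeline activities against real sources and Azure Data Lake, and the transform would typically run in Databricks, but the shape of the work is the same.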
Minimum Qualifications:
  1. Bachelor's degree in Computer Science, Engineering, or a related field.
  2. Proven experience in building ETL pipelines using Azure Data Factory, Databricks, SQL, and Azure Data Lake.
  3. Strong proficiency in SQL for data manipulation and querying.
  4. Deep knowledge of programming languages: Python, Java, or Scala.
  5. Strong knowledge of Apache Spark.
  6. Certification in Azure Data Engineering or related field.
  7. Knowledge of data governance principles and best practices.
  8. Familiarity with DevOps practices for CI/CD pipelines using GitHub workflows.
Benefits:
  • Budget of €1,200 for individual training (technological events, books, trainings, certifications, etc.).
  • Flexible working hours to help reconcile professional and family life.
  • Private medical insurance paid in full by Capitole.
  • Discounts on major brands for employees (Club Capitole).

The employee will adhere to Capitole's information security policies and will have access to confidential information relating to Capitole and the project on which they are working.