
Senior Data Engineer

TheWhiteam

Burgos

Remote

EUR 45,000 - 70,000

Full-time

2 days ago

Vacancy description

A leading tech company is seeking a Senior Data Engineer to build robust data infrastructure and optimize data pipelines using DBT, Snowflake, and AWS. The role requires 5+ years of experience in the field, close collaboration with data teams, and the maintenance of high-quality data processes, all in a fully remote position based in Spain.

Background

  • 5+ years of experience in data engineering.
  • Fluency in English in a professional context.

Responsibilities

  • Design and maintain data pipelines using DBT and Airflow.
  • Integrate data into Snowflake ensuring quality and consistency.
  • Manage AWS cloud infrastructure for data processing.

Skills

  • Data engineering
  • Data pipeline development
  • Proficiency in DBT
  • Snowflake
  • GitHub
  • Apache Airflow
  • Python
  • AWS
  • Analytical skills
  • Problem-solving skills

Education

Degree in Computer Science, Data Engineering, or related field

Job description

The role centers on the development and optimization of data infrastructure. It requires proficiency in DBT, Snowflake, and GitHub, together with experience in Apache Airflow, Python, and AWS. The position demands senior-level expertise, fluency in English, and the ability to work remotely from Spain.
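
For context, a minimal sketch of how this stack typically fits together: an Airflow DAG written in Python that builds and tests dbt models against Snowflake, with the project versioned in GitHub. Every name, path, and schedule below is hypothetical and not taken from the posting.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    # Hypothetical DAG: orchestrates a dbt project whose Snowflake
    # connection is defined in its profiles.yml. Paths and ids are
    # placeholders, not details from the posting.
    with DAG(
        dag_id="dbt_daily_run",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        # Build the dbt models (transformations) in Snowflake
        dbt_run = BashOperator(
            task_id="dbt_run",
            bash_command="cd /opt/dbt_project && dbt run --target prod",
        )

        # Validate the transformed tables with dbt tests
        dbt_test = BashOperator(
            task_id="dbt_test",
            bash_command="cd /opt/dbt_project && dbt test --target prod",
        )

        dbt_run >> dbt_test

Running the dbt tests immediately after the build means data-quality problems surface in the pipeline itself, before downstream consumers read the tables.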

Responsibilities:

  • Design, develop, and maintain robust data pipelines using DBT, Apache Airflow, and Python.
  • Integrate data from various sources into Snowflake, ensuring data quality and consistency.
  • Collaborate closely with Data Scientists and ML Engineers for seamless data processing.
  • Optimize data storage and retrieval for performance and scalability.
  • Use GitHub for version control and collaboration on data projects.
  • Manage and optimize AWS cloud infrastructure for data processing and storage.
  • Identify and resolve issues related to data pipelines and infrastructure.
  • Maintain comprehensive documentation of data processes, pipelines, and infrastructure.

Qualifications:

  • Degree in Computer Science, Data Engineering, or related field.
  • Minimum 5 years of experience in data engineering, focusing on DBT, Snowflake, and GitHub.
  • Proficiency in Python, Apache Airflow, and AWS.
  • Fluency in English, with excellent communication skills.
  • Strong analytical and problem-solving skills with attention to detail.

Must Have:

  • Experience developing and maintaining data transformation workflows using DBT.
  • Proficiency in Snowflake for data storage and integration.
  • Strong skills in version control and collaboration using GitHub.

100% remote work is allowed within Spain.

Languages Required: C1 English level.
