
AWS DATA ENGINEER – ENGLISH – REMOTE

IRIUM - Spain

Spain

Remote

EUR 50,000 - 75,000

Full-time

Posted 3 days ago

Vacancy Overview

A leading company in Spain is seeking a Senior Data Engineer to enhance its data infrastructure. The role involves developing and optimizing data pipelines and requires proficiency in modern tools such as DBT and Snowflake. Candidates must hold a degree in a related field, demonstrate senior-level experience, and have strong communication skills; the role can be performed remotely.

Requirements

  • Minimum 5 years of experience in data engineering, focusing on DBT, Snowflake, and GitHub.
  • Strong analytical and problem-solving skills, with attention to detail.

Responsibilities

  • Design, develop, and maintain data pipelines using DBT, Apache Airflow, and Python.
  • Integrate data into Snowflake, ensuring quality and consistency.
  • Collaborate with Data Scientists and ML Engineers for seamless data processing.

Skills

  • DBT
  • Snowflake
  • GitHub
  • Apache Airflow
  • Python
  • AWS
  • Problem-Solving

Education

Degree in Computer Science, Data Engineering, or a related field

Job Description

We are looking for a Senior Data Engineer to contribute to the development and optimization of data infrastructure.

The role requires proficiency in DBT, Snowflake, and GitHub, as well as experience with Apache Airflow, Python, and AWS.

This position demands senior-level expertise, fluency in English, and the ability to work remotely from Spain.

Responsibilities:
  1. Data Pipeline Development: Design, develop, and maintain robust data pipelines using DBT, Apache Airflow, and Python (a minimal orchestration sketch follows this list).
  2. Data Integration: Integrate data from various sources into Snowflake, ensuring data quality and consistency (see the loading sketch after this list).
  3. Collaboration: Work closely with Data Scientists and ML Engineers to ensure seamless data processing and integration.
  4. Optimization: Optimize data storage and retrieval processes to enhance performance and scalability.
  5. Version Control: Utilize GitHub for version control and collaboration on data engineering projects.
  6. Cloud Infrastructure: Manage and optimize AWS cloud infrastructure for data processing and storage.
  7. Troubleshooting: Identify and resolve issues related to data pipelines and infrastructure.
  8. Documentation: Maintain comprehensive documentation of data processes, pipelines, and infrastructure.
Qualifications:
  • Education: Degree in Computer Science, Data Engineering, or a related field.
  • Experience: Minimum 5 years of experience in data engineering, with a focus on DBT, Snowflake, and GitHub.
  • Technical Skills: Proficiency in Python, Apache Airflow, and AWS.
  • Communication: Fluency in English, with excellent communication and collaboration skills.
  • Problem-Solving: Strong analytical and problem-solving skills, with attention to detail.
