
AWS DATA ENGINEER – ENGLISH – REMOTE

IRIUM - Spain

Salamanca

Remote

EUR 50,000 - 75,000

Full-time

Posted 2 days ago

Vacancy description

A leading company in data solutions seeks a Senior Data Engineer to enhance its data infrastructure. This role involves developing data pipelines, integrating diverse data sources, and collaborating with data scientists. Ideal candidates will have at least 5 years of experience in relevant technologies and excellent problem-solving skills.

Requirements

  • Minimum 5 years of experience in data engineering, focusing on DBT, Snowflake, and GitHub.
  • Fluency in English, excellent communication, and collaboration skills.
  • Strong analytical and problem-solving skills, attention to detail.

Responsibilities

  • Design, develop, and maintain robust data pipelines using DBT, Apache Airflow, and Python.
  • Integrate data from various sources into Snowflake, ensuring data quality.
  • Optimize data storage and retrieval processes for performance.

Skills

DBT
Snowflake
GitHub
Apache Airflow
Python
AWS
Problem-Solving

Education

Degree in Computer Science, Data Engineering, or a related field

Job description

We are looking for a Senior Data Engineer to contribute to the development and optimization of data infrastructure.

The role requires proficiency in DBT, Snowflake, and GitHub, and experience with Apache Airflow, Python, and AWS.

This position demands senior-level expertise, fluency in English, and the ability to work remotely from Spain.

Responsibilities:

  • Data Pipeline Development: Design, develop, and maintain robust data pipelines using DBT, Apache Airflow, and Python.
  • Data Integration: Integrate data from various sources into Snowflake, ensuring data quality and consistency.
  • Collaboration: Work closely with Data Scientists and ML Engineers to ensure seamless data processing and integration.
  • Optimization: Optimize data storage and retrieval processes to enhance performance and scalability.
  • Version Control: Utilize GitHub for version control and collaboration on data engineering projects.
  • Cloud Infrastructure: Manage and optimize AWS cloud infrastructure for data processing and storage.
  • Troubleshooting: Identify and resolve issues related to data pipelines and infrastructure.
  • Documentation: Maintain comprehensive documentation of data processes, pipelines, and infrastructure.

Qualifications:

  • Education: Degree in Computer Science, Data Engineering, or a related field.
  • Experience: Minimum 5 years of experience in data engineering, with a focus on DBT, Snowflake, and GitHub.
  • Technical Skills: Proficiency in Python, Apache Airflow, and AWS.
  • Communication: Fluency in English, with excellent communication and collaboration skills.
  • Problem-Solving: Strong analytical and problem-solving skills, with attention to detail.
