This role focuses on the development and optimization of data infrastructure. It requires proficiency in DBT, Snowflake, and GitHub, as well as additional experience with Apache Airflow, Python, and AWS. The position demands senior-level expertise, fluency in English, and the ability to work remotely from Spain.
Responsibilities:
- Data Pipeline Development: Design, develop, and maintain robust data pipelines using DBT, Apache Airflow, and Python.
- Data Integration: Integrate data from various sources into Snowflake, ensuring data quality and consistency.
- Collaboration: Work closely with Data Scientists and ML Engineers to ensure seamless data processing and integration.
- Optimization: Optimize data storage and retrieval processes to enhance performance and scalability.
- Version Control: Utilize GitHub for version control and collaboration on data engineering projects.
- Cloud Infrastructure: Manage and optimize AWS cloud infrastructure for data processing and storage.
- Troubleshooting: Identify and resolve issues related to data pipelines and infrastructure.
- Documentation: Maintain comprehensive documentation of data processes, pipelines, and infrastructure.
Qualifications:
- Education: Degree in Computer Science, Data Engineering, or a related field.
- Experience: Minimum of 5 years of experience in data engineering, with a strong focus on DBT, Snowflake, and GitHub.
- Technical Skills: Proficiency in Python, Apache Airflow, and AWS.
- Communication: Fluency in English, with excellent communication and collaboration skills.
- Problem-Solving: Strong analytical and problem-solving skills, with attention to detail.
Must Have:
- DBT: Experience in developing and maintaining data transformation workflows using DBT.
- Snowflake: Proficiency in Snowflake for data storage and integration.
- GitHub: Strong skills in version control and collaboration using GitHub.
100% remote work is allowed (within Spain).