Snowflake and Data Engineer - Remote / Telecommute
Cynet Systems Inc
Dallas (TX)
Remote
USD 80,000 - 110,000
Full time
13 days ago
Job summary
An innovative company is seeking a skilled Data Engineer to develop and maintain data models and pipelines using Snowflake and dbt. The role involves collaborating with cross-functional teams to understand data requirements, implementing data quality checks, and automating and optimizing data workflows with Airflow. The ideal candidate will have a strong background in SQL, data warehousing, and cloud platforms such as AWS. Join a forward-thinking team that values problem-solving and effective communication in a dynamic environment where your contributions will drive impactful data solutions.
Qualifications
- Proven experience with Snowflake, dbt, and Airflow.
- Strong knowledge of SQL and data warehousing concepts.
- Experience in designing and optimizing ETL/ELT processes.
Responsibilities
- Develop and maintain data models in dbt with scalable design principles.
- Design and optimize scalable data pipelines using Snowflake.
- Implement data quality checks and automate workflows with Airflow.
Skills
Snowflake
dbt
Airflow
SQL
ETL/ELT Processes
Cloud Platforms (AWS)
Problem-Solving
Communication Skills
Job Description:
- Develop and maintain data models in dbt following modular and scalable design principles.
- Design, develop, and maintain scalable data pipelines using Snowflake and dbt.
- Build and optimize pipelines on Snowflake, ensuring performance, cost efficiency, and scalability.
- Collaborate with cross-functional teams to understand data requirements.
- Implement data quality checks and testing in dbt.
- Implement and optimize Airflow workflows for automation and scheduling of data processes.
- Maintain documentation for data models, transformations, and processes.
- Develop and maintain CI/CD pipelines for data pipeline deployment and version control.
- Troubleshoot and resolve issues related to data pipelines.
Qualifications:
- Proven experience with Snowflake, dbt, Airflow, and CI/CD pipelines.
- Strong knowledge of SQL and data warehousing concepts.
- Experience in designing and optimizing ETL/ELT processes.
- Familiarity with cloud platforms such as AWS.
- Strong problem-solving skills and ability to work in a fast-paced environment.
- Excellent communication and collaboration skills.