EPIC Cogito Data Engineer Snowflake & DBT (Remote)

TieTalent

Dallas, TX

Remote

USD 60,000 - 80,000

Full time

30+ days ago

Job summary

An innovative firm is looking for a skilled Data Engineer specializing in EPIC Cogito, Snowflake, and DBT. In this remote role, you will design, develop, and optimize data models and analytics solutions, working closely with cross-functional teams to ensure seamless data integration. You will maintain data models, troubleshoot issues, and improve the efficiency of data workflows. This is a strong opportunity for someone who enjoys turning data into actionable insights in a collaborative setting.

Qualifications

  • Strong hands-on experience with EPIC Cogito for data modeling and analytics.
  • Expertise in Snowflake and DBT for data transformation.

Responsibilities

  • Design and model tables in Snowflake using EPIC Cogito expertise.
  • Develop and optimize data processing and analytical scripts using DBT.

Skills

EPIC Cogito
Snowflake
DBT
SQL
Python
ETL tools
Cloud platforms (AWS, Azure, GCP)

Job description

EPIC Cogito Data Engineer Snowflake & DBT (Remote)

Job Title: EPIC Cogito Data Engineer Snowflake & DBT
Location: Dallas, TX 75201 (Remote)
Duration: 6+ Months
Rate: $60/Hour W2 or CTC

Job Description: We are seeking a skilled EPIC Cogito Data Engineer with expertise in Snowflake and DBT to design, develop, and optimize data models and analytics solutions. The role involves maintaining data models, writing data processing scripts, and collaborating with cross-functional teams to ensure seamless data integration and analytics workflows.

Key Responsibilities:

  1. Utilize EPIC Cogito expertise to design and model tables in Snowflake.
  2. Maintain data models and DDL scripts for any changes in the Snowflake environment.
  3. Develop, deploy, and optimize data processing and analytical scripts using DBT and Snowflake.
  4. Perform performance tuning and optimization of data processing and analytics workflows to enhance efficiency and scalability.
  5. Troubleshoot and resolve issues related to data processing, integration, and analytics solutions in collaboration with cross-functional teams.
  6. Communicate regularly with Engagement Managers (Directors), project teams, and technical stakeholders, escalating any concerns that need higher-level attention.

Key Skills & Qualifications:

  1. Strong hands-on experience with EPIC Cogito for data modeling and analytics.
  2. Expertise in Snowflake for data warehousing and DBT (Data Build Tool) for data transformation.
  3. Proficiency in SQL, Python, and ETL tools for data processing.
  4. Experience with performance tuning and optimization of SQL queries and data workflows.
  5. Ability to work with cross-functional teams to resolve data integration and processing issues.
  6. Familiarity with cloud platforms (AWS, Azure, or GCP) is a plus.
  7. Strong communication and stakeholder management skills.