Data Engineer (ADF, Orchestration, Data Flows, Snowflake) | REMOTE

Vinsys Information Technology Inc

Minneapolis (MN)

Remote

USD 90,000 - 120,000

Full time

30+ days ago

Job summary

A leading company is seeking a Data Engineer specializing in Azure Data Factory and Snowflake. The ideal candidate will have extensive experience in data engineering, particularly in building and managing data pipelines and orchestration. Responsibilities include designing ETL flows, optimizing data storage, and ensuring efficient data integration. This remote role offers an opportunity to work with healthcare data standards and collaborate with various departments.

Qualifications

  • 5+ years of experience in Data Warehousing and Data Engineering.
  • 2+ years creating pipelines with Azure Data Factory.

Responsibilities

  • Develop and manage effective working relationships with other departments.
  • Design, implement, and automate ETL flows.
  • Build ETL pipelines using Azure Data Factory and Snowflake.

Skills

Problem-Solving
Analytical Skills

Education

Bachelor's or higher degree in a related field

Tools

Azure Data Factory
Snowflake
SQL
GitHub

Job description

This role requires a high level of expertise in data engineering, particularly with Azure Data Factory, Snowflake, and related technologies. The ideal candidate has extensive experience building and managing data pipelines, orchestration, and data integration, along with a background in healthcare data standards such as HL7 and FHIR.

Responsibilities
  • Develop and manage effective working relationships with other departments, groups, and personnel.
  • Communicate effectively with ETL architects to understand requirements and transform data accordingly.
  • Design, implement, and automate ETL flows, ensuring robustness and efficiency.
  • Investigate data issues within ETL pipelines, notify stakeholders, and propose solutions.
  • Build ETL pipelines using Azure Data Factory and Snowflake tools.
  • Design idempotent ETL processes to handle failures gracefully.
  • Work with Snowflake Virtual Warehouses and automate data pipelines with Snowpipe.
  • Manage data changes and versioning using Snowflake Streams and Tasks.
  • Optimize data movement and storage for performance.
  • Build efficient orchestration solutions for scheduling and workflow execution.
  • Test ETL systems, troubleshoot issues, and document processes and deployments.
Minimum Requirements
  • 5+ years of experience in Data Warehousing and Data Engineering.
  • 2+ years creating pipelines with Azure Data Factory.
  • Experience with relational databases such as Snowflake, Oracle, and SQL Server.
  • Proficiency in SQL, including stored procedures.
  • Experience with source control systems such as GitHub or SVN.
  • Knowledge of HL7 and FHIR standards.
  • Strong problem-solving and analytical skills.
  • Bachelor's or higher degree in a related field.
Preferred Skills
  • 2+ years scripting with PowerShell or Python.
  • Data modeling experience in a data warehouse environment.
  • Experience with Informatica Cloud, API design, and healthcare applications.
  • Azure certifications related to data.
