
Data Engineer

RIT Solutions, Inc.

Minneapolis (MN)

Remote

USD 90,000 - 130,000

Full time

3 days ago

Job summary

A leading company is seeking a Data Engineer to join their Data Management team in Minneapolis. The ideal candidate will have extensive experience with Azure Data Factory and Snowflake, focusing on developing workflows and maintaining data quality. This remote position requires strong problem-solving skills and a deep understanding of data integration and management in the healthcare domain.

Qualifications

  • 5+ years of data engineering experience focused on data warehousing.
  • 2+ years creating pipelines in Azure Data Factory.
  • Experience with HL7 and FHIR standards.

Responsibilities

  • Develop and manage relationships with departments for data coordination.
  • Analyze project requirements and develop detailed specifications for ETL.
  • Investigate and resolve data quality issues in ETL pipelines.

Skills

Data Warehousing
ETL
Data Integration
Problem-solving
Analytical skills
Data modeling
Python scripting

Education

Bachelor's or advanced degree in Information Technology

Tools

Azure Data Factory
Snowflake
Informatica
SQL

Job description

Minneapolis, MN - Remote

  • 10+ years of IT experience
  • Candidates should ideally have a background in SSIS/SQL and be transitioning to Azure Data Factory (data integration) with Snowflake as the source and target endpoints
  • Azure comprises many components, such as Fabric, Databricks, ADF, and PySpark
  • Candidates must be able to use ADF for both orchestration and integration
  • Many candidates claim ADF experience but, in practice, use it only for orchestration, triggering PySpark jobs through ADF
  • Candidates must also be advanced at building integrations using data flow logic

100% Telecommute

Project:

  • As a member of the Data Management team, the Data Engineer supports the Alabama EDS by developing and maintaining workflows, identifying and resolving data quality issues, and optimizing processes to improve performance.
  • The Data Engineer will also support intrastate agencies by monitoring automated data extracts and working directly with state partners to create new extracts based on business specifications.

Responsibilities:

  • Develop and manage effective working relationships with other departments, groups, and personnel whose work must be coordinated or interfaced with
  • Communicate efficiently with the ETL architect, understanding the requirements and business processes well enough to transform data to meet the needs of end users
  • Assist with the overall architecture of the ETL design, and proactively provide input on designing, implementing, and automating ETL flows
  • Investigate and mine data to identify potential issues within ETL pipelines, notify end users, and propose adequate solutions
  • Develop ETL pipelines and data flows into and out of the data warehouse using a combination of Azure Data Factory and Snowflake toolsets
  • Design idempotent ETL processes, using ADF data flows and pipelines, so that interrupted, incomplete, or failed runs can be rerun without errors
  • Work in Snowflake virtual warehouses as needed and automate data pipelines with Snowpipe for repetitive ETL loads
  • Capture changes in data dimensions and maintain versions of them using Streams in Snowflake, scheduled with Tasks
  • Optimize every stage of data movement, not only at the source and in transit but also at rest in the database, for faster responses
  • Build a highly efficient orchestrator that can schedule jobs, execute workflows, perform data quality checks, and coordinate dependencies among tasks
  • Test ETL system code, data design, pipelines, and data flows; perform root-cause analysis on all processes; resolve production issues; and run routine tests on databases, data flows, and pipelines
  • Document implementations and test cases, and build the deployment documents needed for CI/CD
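The idempotent process design called out above can be illustrated outside of ADF or Snowflake. A minimal Python sketch of the rerun-safe, merge-by-key pattern (the table and keys here are hypothetical, not from the posting): an interrupted or failed batch can simply be re-executed, and the target ends in the same state with no duplicate rows.

```python
# Minimal sketch of an idempotent (rerun-safe) load: merge extracted rows
# into a target keyed store. Re-running the same batch yields the same
# final state, so a failed run can be re-executed without cleanup.
# The records below are hypothetical illustrations only.

def merge_batch(target: dict, batch: list, key: str = "id") -> dict:
    """Upsert each row by its business key; last write wins, no duplicates."""
    for row in batch:
        target[row[key]] = row  # insert or overwrite
    return target

warehouse = {}
extract = [
    {"id": 1, "member": "A", "plan": "Medicaid"},
    {"id": 2, "member": "B", "plan": "Medicare"},
]

merge_batch(warehouse, extract)
merge_batch(warehouse, extract)  # rerun after a simulated failure

assert len(warehouse) == 2  # same state as a single successful run
```

In ADF and Snowflake the same effect is typically achieved with a `MERGE` on the business key rather than a plain `INSERT`, which is what makes a pipeline safe to rerun.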

Ideal background: a Data Engineer with healthcare (Medicaid) experience and Microsoft Azure-based experience with Snowflake and Azure Data Factory

TOP REQUIREMENTS:

  • 5+ years of data engineering experience with a focus on data warehousing
  • 2+ years of experience creating pipelines in Azure Data Factory (ADF)
  • 3+ years of experience creating stored procedures with Oracle PL/SQL, SQL Server T-SQL, or Snowflake SQL

Required:

  • 5+ years of data engineering experience with a focus on data warehousing
  • 2+ years of experience creating pipelines in Azure Data Factory (ADF)
  • 5+ years developing ETL using Informatica PowerCenter, SSIS, Azure Data Factory, or similar tools
  • 5+ years of experience with relational databases, such as Oracle, Snowflake, and SQL Server
  • 3+ years of experience creating stored procedures with Oracle PL/SQL, SQL Server T-SQL, or Snowflake SQL
  • 2+ years of experience with GitHub, SVN, or similar source control systems
  • 2+ years of experience processing structured and unstructured data
  • Experience with HL7 and FHIR standards and processing files in these formats
  • 3+ years analyzing project requirements and developing detailed specifications for ETL requirements
  • Excellent problem-solving and analytical skills, with the ability to troubleshoot and optimize data pipelines
  • Ability to adapt to evolving technologies and changing business requirements
  • Bachelor's or advanced degree in a related field such as Information Technology/Computer Science, Mathematics/Statistics, Analytics, or Business
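The HL7 requirement above centers on pipe-delimited v2 message handling. A minimal sketch, assuming a standard HL7 v2 layout with `|`-delimited fields and `^`-delimited components; the sample message is fabricated for illustration:

```python
# Minimal sketch of HL7 v2 parsing: segments are separated by carriage
# returns, fields by pipes, components by carets. The sample message is
# fabricated for illustration only.

SAMPLE_HL7 = "\r".join([
    "MSH|^~\\&|SENDER|FAC|RECEIVER|FAC|202401011200||ADT^A01|MSG0001|P|2.5",
    "PID|1||12345^^^HOSP^MR||DOE^JANE||19800101|F",
])

def parse_segments(message: str) -> dict:
    """Split a v2 message into segments keyed by segment ID (MSH, PID, ...)."""
    segments = {}
    for line in message.split("\r"):
        fields = line.split("|")
        segments[fields[0]] = fields  # fields[n] corresponds to e.g. PID-n
    return segments

segs = parse_segments(SAMPLE_HL7)
patient_name = segs["PID"][5].split("^")  # PID-5 components: family^given
assert patient_name == ["DOE", "JANE"]
```

FHIR, by contrast, is JSON/XML-based and is usually consumed with standard JSON tooling or a dedicated library rather than delimiter splitting.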

Preferred:

  • 2+ years of batch or PowerShell scripting
  • 2+ years of experience with Python scripting
  • 3+ years of data modeling experience in a data warehouse environment
  • Experience or familiarity with Informatica Intelligent Cloud Services (specifically Data Integration)
  • Experience designing and building APIs in Snowflake and ADF (e.g., REST, RPC)
  • Experience with state Medicaid/Medicare/healthcare applications
  • Azure certifications related to data engineering or data analytics
