Job Details:
Data Integration Architect
My client has a full-time, fully remote opening for a Data Integration Architect. The role focuses on building robust data pipelines, managing enterprise data, and enabling secure, scalable data integration across platforms, particularly in the healthcare domain.
Key Responsibilities:
- Lead and manage Master Data Management (MDM) initiatives to ensure consistency, accuracy, and reliability of core business data.
- Design and develop data pipelines on modern data platforms such as Databricks and Snowflake.
- Create and maintain Entity Relationship Diagrams (ERDs) to support effective data modeling and architecture planning.
- Develop and maintain integration patterns using APIs, RESTful web services, and GraphQL to support various application and data workflows.
- Implement and support healthcare data exchange standards such as HL7 FHIR, with a deep understanding of healthcare interoperability.
- Ensure compliance with healthcare data protection regulations and data loss prevention policies.
- Work with Robotic Process Automation (RPA) tools to automate data workflows and manual processes.
- Design, build, and optimize ETL (Extract, Transform, Load) processes to support analytical and operational data needs.
- Develop automation scripts using Python and other scripting languages to streamline data operations and platform integrations.
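To give candidates a concrete feel for the FHIR-aware ETL and Python scripting work described above, here is a minimal illustrative sketch (a hypothetical example, not part of the client's actual stack): flattening a FHIR R4 Patient resource into a tabular record, the kind of extract/transform step a healthcare data pipeline might perform before loading into a platform like Databricks or Snowflake.

```python
# Hypothetical sketch: flatten selected fields of a FHIR R4 Patient
# resource (a nested JSON structure) into a flat record suitable for
# loading into a warehouse table. Field names follow the FHIR R4 spec;
# the surrounding pipeline is assumed, not prescribed by the role.

def flatten_patient(resource: dict) -> dict:
    """Flatten a FHIR Patient resource into a single flat row."""
    # Patient.name is a list of HumanName objects; take the first entry.
    name = (resource.get("name") or [{}])[0]
    return {
        "id": resource.get("id"),
        "family": name.get("family"),
        "given": " ".join(name.get("given", [])),
        "gender": resource.get("gender"),
        "birth_date": resource.get("birthDate"),
    }

# Sample input modeled on the FHIR specification's example Patient.
patient = {
    "resourceType": "Patient",
    "id": "example",
    "name": [{"family": "Chalmers", "given": ["Peter", "James"]}],
    "gender": "male",
    "birthDate": "1974-12-25",
}

row = flatten_patient(patient)
```

In a production pipeline, a step like this would typically sit behind a REST fetch from a FHIR server and feed a bulk load, with PHI handling governed by the compliance requirements noted above.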
Qualifications:
- Proven experience in master data management and enterprise data governance.
- Hands-on experience with data platforms such as Databricks and Snowflake.
- Strong understanding of data modeling techniques and tools (e.g., ERDs, normalization).
- Proficiency with REST APIs, GraphQL, and web services integration.
- Familiarity with HL7 FHIR and other healthcare-specific data standards.
- Experience with data privacy and security, particularly in a healthcare environment.
- Knowledge of data loss prevention (DLP) strategies and tooling.
- Practical knowledge of RPA platforms such as UiPath, Automation Anywhere, or Blue Prism.
- Strong command of ETL methodologies and tools.
- Proficiency in Python for scripting, automation, and data transformation tasks.
Preferred Qualifications:
- Experience working in a regulated industry (healthcare, pharma, etc.).
- Knowledge of cloud ecosystems (AWS, Azure, GCP) as they relate to data services.
- Familiarity with CI/CD practices for data pipeline deployment.