An established industry player is seeking a skilled Data Architect to lead the design of modern data solutions for a major data modernization initiative. This role involves architecting enterprise-grade data pipelines using Informatica and Azure, ensuring optimal performance and reliability. The ideal candidate will have extensive experience in data architecture, particularly in the Property & Casualty insurance domain, and will be adept at troubleshooting complex data workflows. Join a forward-thinking company where your expertise will drive significant advancements in data management and analytics.
Job Description
Position: Data Architect with Informatica, Power BI, and Azure
Location: Erie, PA (On-Site)
Duration: Long Term
Must have hands-on experience with Informatica, Power BI, and Azure.
About the Role:
We are seeking an experienced Data Architect to lead and design modern data solutions for a Property & Casualty (P&C) customer undergoing a major data modernization initiative involving Guidewire Claim Data Access (CDA). The ideal candidate will possess strong technical expertise, hands-on experience, and excellent communication skills to successfully deliver enterprise-grade data solutions in Azure/Informatica. This role requires a proactive problem solver who can troubleshoot and optimize complex data pipelines and workflows for maximum efficiency and reliability.
Key Responsibilities:
• Architect and implement enterprise metadata-driven data pipelines using ETL tools such as Azure Data Factory (ADF) and Informatica (an illustrative sketch follows this list).
• Design and develop an Operational Data Store (ODS) sourced from Azure Data Lake, ensuring robust, scalable, and high-performing architecture.
• Collaborate with stakeholders to integrate and optimize Guidewire Data (CDA) into the data lake architecture, enabling advanced analytics and reporting.
• Troubleshoot and resolve issues in data pipelines, workflows, and related processes to ensure reliability and data accuracy.
• Continuously monitor and optimize current workflows for performance, scalability, and cost-efficiency, adhering to best practices.
• Develop and maintain custom processes using Python, T-SQL, and Spark, tailored to business requirements.
• Leverage Azure Functions to design serverless compute solutions for event-driven and scheduled data workflows.
• Optimize data workflows and resource usage to ensure cost-efficiency in Azure Cloud environments.
• Provide leadership and guidance for implementing Hadoop-based big data solutions where applicable.
• Develop a comprehensive understanding of P&C domain data, ensuring alignment with business objectives and compliance requirements.
• Communicate technical solutions effectively with cross-functional teams, stakeholders, and non-technical audiences.
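For context on the metadata-driven pattern referenced above, the sketch below shows the general shape of such a pipeline in PySpark: a control table determines which data lake folders load into which ODS tables, so new entities are onboarded by adding a configuration row rather than new code. All names here (the control table, its columns, the example path) are hypothetical illustrations, not details of this customer's environment.

```python
# Minimal sketch of a metadata-driven load from a data lake into an ODS.
# All table names, column names, and paths are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("metadata_driven_ods_load").getOrCreate()

# Hypothetical control table: one row per source entity to ingest.
# Columns assumed: source_format, source_path, watermark_column,
# last_watermark, target_table.
for row in spark.table("ods_control.pipeline_config").collect():
    # e.g. row.source_path = "abfss://cda@<account>.dfs.core.windows.net/claim/"
    source_df = spark.read.format(row.source_format).load(row.source_path)

    # Optional incremental filter on a configured watermark column.
    if row.watermark_column:
        source_df = source_df.filter(
            source_df[row.watermark_column] > row.last_watermark
        )

    # Land the data in the ODS table named by the control row.
    source_df.write.mode("append").saveAsTable(row.target_table)
```

The same control-table idea maps directly onto ADF or Informatica: the orchestration tool reads the configuration and parameterizes a single generic pipeline, which is what keeps the framework maintainable as sources grow.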
Required Qualifications:
• 13+ years of experience in data architecture, data engineering, and/or ETL development roles, including at least 3 years in the P&C insurance domain.
• Proven experience with Azure Cloud Services, including Azure Data Lake, Azure Data Factory, and SQL Server.
• Expertise in Informatica for robust ETL workflows, data integration, and metadata-driven pipeline automation to streamline data processing.
• Demonstrated ability to build end-to-end metadata-driven frameworks and continuously optimize existing workflows for improved performance, scalability, and efficiency.
• Strong knowledge of Guidewire Claim Data Access (CDA) or similar insurance domain data.
• Expertise in troubleshooting and optimizing data pipelines and workflows for enhanced reliability and performance.
• Proficiency in scripting and programming with Python, T-SQL, and Spark for custom data workflows.
• Hands-on expertise in building and managing ODS systems from data lakes.
• Experience with Azure Functions for serverless architecture (see the sketch after this list).
• Familiarity with Hadoop ecosystems (preferred but not mandatory).
• Demonstrated ability to design solutions for Azure Cloud cost optimization.
• Excellent communication skills to engage with technical and business stakeholders effectively.
• Experience with metadata management and data cataloging for large-scale data ecosystems.
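As a point of reference for the Azure Functions qualification above, this is the general shape of a timer-triggered function in the Python v2 programming model that kicks off a scheduled data workflow; the schedule, function name, and body are placeholder assumptions, not a prescribed design.

```python
# Hypothetical timer-triggered Azure Function (Python v2 programming model)
# that launches a nightly data workflow. Schedule and logic are placeholders.
import logging

import azure.functions as func

app = func.FunctionApp()

# Azure timer schedules use six-field NCRONTAB: this one fires daily at 02:00 UTC.
@app.timer_trigger(schedule="0 0 2 * * *", arg_name="timer", run_on_startup=False)
def nightly_ods_refresh(timer: func.TimerRequest) -> None:
    if timer.past_due:
        logging.warning("Timer is running late; a prior run may have been missed.")
    # Placeholder for the real work, e.g. triggering an ADF pipeline run
    # through the Azure Data Factory REST API or SDK.
    logging.info("Nightly ODS refresh triggered.")
```

The same trigger model supports event-driven variants (blob or queue triggers) for workflows that should run when new CDA files land rather than on a clock.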
Preferred Skills:
• Familiarity with Guidewire systems and their integration patterns.
• Experience in implementing Data Governance frameworks.
• Certification in Azure (e.g., Azure Data Engineer Associate or Azure Solutions Architect).
• Experience with other data platforms/tools such as Hadoop and Databricks.
Regards,
Manoj
Derex Technologies INC
Contact: 973-834-5005 Ext. 206
Additional Information
All your information will be kept confidential according to EEO guidelines.