Data Architect with Informatica, Power BI, and Azure

Derex Technologies Inc

Erie (Erie County)

On-site

USD 90,000 - 130,000

Full time

10 days ago

Job summary

An established industry player is seeking a skilled Data Architect to lead the design of modern data solutions for a major data modernization initiative. This role involves architecting enterprise-grade data pipelines using Informatica and Azure, ensuring optimal performance and reliability. The ideal candidate will have extensive experience in data architecture, particularly in the Property & Casualty insurance domain, and will be adept at troubleshooting complex data workflows. Join a forward-thinking company where your expertise will drive significant advancements in data management and analytics.

Qualifications

  • 13+ years of experience in data architecture and ETL development.
  • Strong experience with Azure Cloud Services and Informatica.

Responsibilities

  • Architect and implement enterprise data pipelines using ETL tools.
  • Collaborate with stakeholders to optimize data lake architecture.

Skills

Informatica
Azure
Python
T-SQL
Spark
ETL Development
Data Architecture
Guidewire Claim Data Access

Education

Bachelor's Degree in Computer Science or related field

Tools

Azure Data Factory
Azure Data Lake
SQL Server
Hadoop
Databricks

Job description

Position: Data Architect with Informatica, Power BI, and Azure

Location: Erie, PA (On-Site)

Duration: Long Term

Must have hands-on experience with Informatica, Power BI, and Azure.

About the Role:

We are seeking an experienced Data Architect to lead and design modern data solutions for a Property & Casualty (P&C) customer undergoing a major data modernization initiative involving Guidewire Claim Data Access (CDA). The ideal candidate will possess strong technical expertise, hands-on experience, and excellent communication skills to successfully deliver enterprise-grade data solutions in Azure/Informatica. This role requires a proactive problem solver who can troubleshoot and optimize complex data pipelines and workflows for maximum efficiency and reliability.

Key Responsibilities:

  • Architect and implement enterprise metadata-driven data pipelines using ETL tools such as Azure Data Factory (ADF) and Informatica (see the sketch after this list).
  • Design and develop an Operational Data Store (ODS) sourced from Azure Data Lake, ensuring a robust, scalable, and high-performing architecture.
  • Collaborate with stakeholders to integrate and optimize Guidewire Data (CDA) into the data lake architecture, enabling advanced analytics and reporting.
  • Troubleshoot and resolve issues in data pipelines, workflows, and related processes to ensure reliability and data accuracy.
  • Continuously monitor and optimize current workflows for performance, scalability, and cost-efficiency, adhering to best practices.
  • Develop and maintain custom processes using Python, T-SQL, and Spark, tailored to business requirements.
  • Leverage Azure Functions to design serverless compute solutions for event-driven and scheduled data workflows.
  • Optimize data workflows and resource usage to ensure cost-efficiency in Azure Cloud environments.
  • Provide leadership and guidance for implementing Hadoop-based big data solutions where applicable.
  • Develop a comprehensive understanding of P&C domain data, ensuring alignment with business objectives and compliance requirements.
  • Communicate technical solutions effectively with cross-functional teams, stakeholders, and non-technical audiences.
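
As an illustration of the metadata-driven pattern above, here is a minimal PySpark sketch of a single pipeline step: a config record drives the load of a Guidewire CDA extract from the data lake into an ODS table. This is a hedged example, not the customer's actual design; the paths, table names, the pipeline_config structure, and the seqval ordering column are all hypothetical placeholders.

    # Minimal, illustrative metadata-driven load step (all names hypothetical).
    from pyspark.sql import SparkSession, Window, functions as F

    spark = SparkSession.builder.appName("cda-to-ods-load").getOrCreate()

    # Hypothetical metadata record driving one run: which CDA entity to read
    # from the lake and which ODS table it feeds.
    pipeline_config = {
        "source_path": "abfss://raw@datalake.dfs.core.windows.net/cda/claims/",
        "target_table": "ods.claims",
        "merge_keys": ["claim_id"],
        "order_column": "seqval",  # assumed CDA change-sequence column
    }

    # Read the raw extract (parquet format is an assumption).
    df = spark.read.parquet(pipeline_config["source_path"])

    # Keep only the latest record per business key, ordered by the
    # change-sequence column, so the ODS reflects current state.
    w = Window.partitionBy(*pipeline_config["merge_keys"]).orderBy(
        F.col(pipeline_config["order_column"]).desc()
    )
    latest = (
        df.withColumn("rn", F.row_number().over(w))
          .filter(F.col("rn") == 1)
          .drop("rn")
    )

    # Full overwrite for simplicity; a production pipeline would more likely
    # run an incremental MERGE into the ODS table.
    latest.write.mode("overwrite").saveAsTable(pipeline_config["target_table"])

In practice the config record would come from a metadata store (for example, a control table that ADF or Informatica iterates over), which is what makes the framework metadata-driven rather than hand-coded per table.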

Required Qualifications:

  • 13+ years of experience in data architecture, data engineering, and/or ETL development roles, with at least 3 years in the P&C insurance domain.
  • Proven experience with Azure Cloud Services, including Azure Data Lake, Azure Data Factory, and SQL Server.
  • Proven ability to use Informatica for robust ETL workflows, data integration, and metadata-driven pipeline automation to streamline data processing.
  • Experience building end-to-end metadata-driven frameworks and continuously optimizing existing workflows for improved performance, scalability, and efficiency.
  • Strong knowledge of Guidewire Claim Data Access (CDA) or similar insurance domain data.
  • Expertise in troubleshooting and optimizing data pipelines and workflows for enhanced reliability and performance.
  • Proficiency in scripting and programming with Python, T-SQL, and Spark for custom data workflows.
  • Hands-on expertise in building and managing ODS systems from data lakes.
  • Experience with Azure Functions for serverless architecture (a minimal sketch follows this list).
  • Familiarity with Hadoop ecosystems (preferred but not mandatory).
  • Demonstrated ability to design solutions for Azure Cloud cost optimization.
  • Excellent communication skills to engage effectively with technical and business stakeholders.
  • Experience with metadata management and data cataloging for large-scale data ecosystems.
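
To illustrate the serverless requirement above, below is a minimal sketch of a timer-triggered function using the Azure Functions Python v2 programming model. The schedule, function name, and body are hypothetical; a real workflow would call the actual load step (for example, starting an ADF pipeline run) where the placeholder log line sits.

    import logging
    import azure.functions as func

    app = func.FunctionApp()

    # Hypothetical scheduled trigger: the NCRONTAB expression fires daily at
    # 06:00 UTC and starts a data-load step.
    @app.timer_trigger(schedule="0 0 6 * * *", arg_name="timer",
                       run_on_startup=False)
    def daily_ods_load(timer: func.TimerRequest) -> None:
        if timer.past_due:
            logging.warning("Timer is past due; the load is running late.")
        # Placeholder for the real work (e.g., starting an ADF pipeline run).
        logging.info("Starting the daily ODS load step.")

An event-driven variant of the same pattern would swap the timer trigger for a blob trigger on the data lake container, so newly arriving CDA files start the load as they land.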

Preferred Skills:

  • Familiarity with Guidewire systems and their integration patterns.
  • Experience implementing Data Governance frameworks.
  • Certification in Azure (e.g., Azure Data Engineer Associate or Azure Solutions Architect).
  • Experience with other data platforms/tools such as Hadoop and Databricks.

Regards,

Manoj

Derex Technologies INC

Contact: 973-834-5005, Ext. 206

Additional Information

All your information will be kept confidential according to EEO guidelines.
