Sr Staff Data Engineer - GE07DE
We’re determined to make a difference and are proud to be an insurance company that goes well beyond coverages and policies. Working here means having every opportunity to achieve your goals – and to help others accomplish theirs, too. Join our team as we help shape the future.
The Enterprise Data Services department’s IT team supporting Global Specialty is seeking a hands-on Senior Staff Data Engineer to enhance and support its data assets on the Snowflake and SQL Server platforms. We are looking for a talented professional with a proven track record of engineering ELT development and integration using Snowflake. Our ideal candidate will leverage deep technical expertise and problem-solving skills to deliver, maintain, and enhance projects within the Data & Analytics value stream.
This role has a hybrid work schedule, with the expectation of working in an office location (Hartford, CT; Danbury, CT; Charlotte, NC; Chicago, IL; Columbus, OH) three days a week (Tuesday through Thursday).
The Senior Staff Data Engineer will be proficient in data platform architecture, design, data curation, and multi-dimensional modeling, with a strong understanding of data architecture, ETL principles, and data warehousing. Responsibilities will include technical delivery review and resolution of architecture issues on the AWS-hosted Snowflake platform.
Responsibilities:
- Demonstrate technical leadership and expertise in Snowflake’s cloud-native architecture.
- Create, troubleshoot, and enhance complex code in Snowflake Data Lakes/Data Hubs/Data Warehouses.
- Build data pipelines (ELT) with Snowflake cloud data platform using AWS compute (EC2) and storage layers (S3).
- Build Snowflake SQL data warehouses using virtual warehouses, following best practices.
- Work hands-on with ELT tools with Snowflake.
- Implement and leverage Snowflake features such as materialized views, data sharing, zero-copy cloning, and dynamic data masking.
- Understand and lead the implementation of delivery methodologies (SDLC).
- Work with SnowSQL, stored procedures, JavaScript UDFs, Snowpipe, and other Snowflake utilities.
- Manage data migration from RDBMS to Snowflake.
- Design and implement data security and access controls.
- Coordinate data loading/unloading activities to/from Snowflake.
- Work with Data Lakes loading structured, semi-structured (XML, JSON, Parquet), and unstructured data.
- Build data pipelines using cloud-native ELT tools and automate data ingestion, including CDC.
- Integrate data pipelines with source control and CI/CD pipelines, and support DevOps practices.
- Tune performance of jobs to optimize CPU and load times.
- Understand Snowflake licensing and data lifecycle management.
- Architect reusable ETL components, including audit and reconciliation processes.
- Research and evaluate alternative solutions for system efficiency and cost-effectiveness.
- Support production issues and clarify requirements promptly.
- Coordinate with architects, analysts, scrum masters, and developers to ensure clarity and successful implementation.
- Oversee quality and completeness of technical specifications, design, and code reviews, adhering to non-functional requirements.
- Deliver solutions in an agile environment (Scrum/Kanban).
- Participate actively in team activities including feature refinement, code review, and user story completion.
- Identify and communicate technical risks, issues, and solutions during projects.
- Collaborate with a high-performing team, including Release Train Engineers, Product Owners, and stakeholders.
- Work on innovative projects with a 'fail-fast' approach to maximize value.
- Continuously learn new skills and adapt to changing project priorities.
Qualifications & Key Skills:
- Candidates must be authorized to work in the US without sponsorship. The company will not support STEM OPT I-983 endorsement for this role.
- 5+ years in Snowflake data warehousing, pipelines, and automation.
- 7+ years hands-on experience in Data Warehouse and Data Integration (ELT/ETL).
- 7+ years of proficiency in ETL using the Microsoft BI stack and other tools.
- Strong background in enterprise data warehousing, ETL/ELT development, database replication, metadata management, and data quality.
- Experience with SDLC, T-SQL, Stored Procedures, SSIS.
- Knowledge of data warehousing applications, preferably in finance/insurance.
- Familiarity with version control, CI/CD, and DevOps tools like GitHub, Jenkins, Nexus, uDeploy.
- Knowledge of data profiling, modeling, and database design.
Preferred Skills:
- Experience with Data Visualization (preferably Tableau).
- Experience in the Insurance industry.
- Experience with Artificial Intelligence.
- Experience with Informatica Data Management Cloud.
- Knowledge of Data Governance (Catalog, Quality, Lineage).
Compensation
The annualized base pay range reflects market analysis. Actual pay may vary based on performance, proficiency, and competencies. The range is:
$135,040 - $202,560
Our benefits include bonuses, incentives, and recognition programs. We are an Equal Opportunity Employer.