Sr Staff Data Engineer - GE07DE
We’re determined to make a difference and are proud to be an insurance company that goes well beyond coverages and policies. Working here means having every opportunity to achieve your goals – and to help others accomplish theirs, too. Join our team as we help shape the future.
The Enterprise Data Services department’s IT team supporting Global Specialty is seeking a hands-on Senior Staff Data Engineer to enhance and support its data assets on the Snowflake and SQL Server platforms. We are looking for a talented professional with a proven track record in ELT development and integration using Snowflake. Our ideal candidate will leverage deep technical expertise and problem-solving skills to deliver, maintain, and enhance projects within the Data & Analytics value stream.
This role will have a hybrid work schedule, with an expectation of working in an office location (Hartford, CT; Danbury, CT; Charlotte, NC; Chicago, IL; Columbus, OH) 3 days a week (Tuesday through Thursday).
The Senior Staff Data Engineer will be proficient in data platform architecture, design, data curation, and multi-dimensional models, and will have a strong understanding of data architecture, ETL principles, and data warehousing. Responsibilities also include technical delivery review and resolution of architecture issues on the Snowflake platform running on AWS.
Responsibilities:
- Demonstrate technical leadership and expertise in Snowflake’s cloud-native architecture.
- Create, troubleshoot, and enhance complex code in Snowflake Data Lakes/Data Hubs/Data Warehouses.
- Build data pipelines (ELT) with Snowflake cloud data platform using AWS compute (EC2) and storage layers (S3).
- Build Snowflake SQL Data Warehouse using virtual warehouses based on best practices.
- Work hands-on with ELT tools with Snowflake.
- Implement and leverage features such as Materialized Views, Data Sharing, Zero-Copy Cloning, and Dynamic Data Masking.
- Understand delivery methodologies (SDLC) and lead teams in solution implementation according to architecture.
- Apply experience with SnowSQL, stored procedures, UDFs written in JavaScript, Snowpipe, and other Snowflake utilities.
- Manage data migration from RDBMS to Snowflake.
- Design data security and access controls.
- Coordinate data loading/unloading activities to/from Snowflake.
- Work with Data Lakes loading structured, semi-structured (XML, JSON, Parquet), and unstructured data.
- Build data pipelines using cloud-native ELT tools and automate data ingestion, including CDC.
- Integrate data pipelines with source control, CI/CD pipelines, and DevOps practices.
- Tune jobs to optimize compute usage and load times.
- Understand Snowflake licensing and data lifecycle management.
- Architect reusable ETL components, including audit and reconciliation processes.
- Research and evaluate alternative solutions for system design, recommending cost-effective options.
- Support production issues and clarify requirements quickly.
- Coordinate with architects, analysts, Scrum Masters, developers to clarify technical details and implement solutions.
- Ensure quality and completeness of technical specifications, design, and code reviews, adhering to non-functional requirements.
- Deliver solutions in an agile environment (Scrum/Kanban).
- Participate actively in team activities like feature refinement, code review, and user story completion.
- Identify and communicate technical risks, issues, and solutions during projects.
- Collaborate with high-performing teams, release train engineers, product owners, and stakeholders.
- Work on innovative projects with a 'fail-fast' approach to maximize business value.
- Show passion for learning new skills and adapt to changing project priorities.
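To illustrate two of the Snowflake features named in the responsibilities above (Dynamic Data Masking and zero-copy cloning), here is a minimal sketch of the DDL a data engineer in this role might generate. This example is not part of the posting's requirements, and every object name in it (`pii_mask`, `COMPLIANCE_ROLE`, `CLAIMS_DB`, and so on) is hypothetical.

```python
# Illustrative sketch only: composing Snowflake DDL statements as strings.
# All object and role names below are hypothetical examples.

def masking_policy_ddl(policy: str, roles: list[str]) -> str:
    """Dynamic Data Masking: reveal a string column only to the given roles."""
    role_list = ", ".join(f"'{r}'" for r in roles)
    return (
        f"CREATE MASKING POLICY {policy} AS (val STRING) RETURNS STRING ->\n"
        f"  CASE WHEN CURRENT_ROLE() IN ({role_list}) THEN val\n"
        f"       ELSE '***MASKED***' END;"
    )

def clone_ddl(source_db: str, target_db: str) -> str:
    """Zero-copy clone: a writable copy of a database with no storage duplication."""
    return f"CREATE DATABASE {target_db} CLONE {source_db};"

if __name__ == "__main__":
    # A policy that unmasks PII only for a compliance role, and a dev clone
    # of a production database for safe testing.
    print(masking_policy_ddl("pii_mask", ["COMPLIANCE_ROLE"]))
    print(clone_ddl("CLAIMS_DB", "CLAIMS_DB_DEV"))
```

In practice these statements would be executed against Snowflake through a connector or an ELT tool; generating them as reviewable strings is one common way to keep such security and environment-management DDL under version control, as the CI/CD responsibilities above call for.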
Qualifications & Key Skills:
- Candidates must be authorized to work in the US without sponsorship. The company will not support STEM OPT I-983 endorsement for this role.
- 5+ years of experience with Snowflake data warehousing, pipelines, and automation.
- 7+ years of hands-on experience in data warehousing and data integration (ELT/ETL).
- 7+ years of proficiency in ETL using Microsoft BI tools and similar technologies.
- Strong background in enterprise data warehouse, ETL/ELT development, database replication, metadata management, and data quality.
- Experience with SDLC phases, T-SQL, stored procedures, SSIS.
- Knowledge of data warehouse applications, preferably in financial/insurance domains.
- Familiarity with version control, CI/CD, and DevOps tools like GitHub, Jenkins, Nexus, uDeploy.
- Understanding of data profiling, modeling, and database design.
Preferred Skills:
- Experience with data visualization tools (preferably Tableau).
- Insurance industry experience.
- Knowledge of Artificial Intelligence.
- Experience with Informatica Data Management Cloud.
- Experience with Data Governance (Catalog, Quality, Lineage).
Compensation:
The annual base pay range is based on market analysis. Actual pay may vary based on performance, skills, and competencies. Other rewards include bonuses, incentives, and recognition. The range is:
$135,040 - $202,560
We are an Equal Opportunity Employer supporting diversity and inclusion.