Competition #2025-377-CS – Lead, Data Engineering
Employment Type: Permanent Full-Time, Non-Bargaining Unit
Work Hours: 35 hours/week (M–F, 9am–5pm)
Work Setting: Hybrid – opportunity to work remotely after orientation & training period
Salary: $90,000 – $103,000
Application Deadline: August 21, 2025, by 11:59 pm
Program Overview – Digital Transformation Office, Central Services
The Lead, Data Engineering is responsible for designing, building, and managing the data infrastructure components and systems within WoodGreen’s enterprise data environment, in coordination with Information Technology (IT) for shared Azure responsibilities. The incumbent will lead the end-to-end deployment and operation of all Azure-based data architecture, ensuring the platform is secure, scalable, governed, and optimized for analytics, reporting, and future innovations, including AI.
This role serves as both architect and hands-on engineering lead, accountable for the complete technical lifecycle of the Enterprise Data Warehouse (EDW), including design, implementation, optimization, and ongoing support. Infrastructure provisioning is managed in partnership with IT, while responsibility for data platform enablement, ingestion pipelines, and CI/CD automation rests entirely with this position. The incumbent ensures platform stability, drives engineering capability, and safeguards continuity of data delivery as the primary technical authority for data engineering within Enterprise Analytics.
What You Will Do
- Design and operate data platform infrastructure in Azure in collaboration with IT, ensuring deployments align with organizational standards for security, scalability, and governance.
- Provision and configure core platform services including Azure Data Factory, Microsoft Fabric, Azure SQL, Synapse, Data Lake Gen2, and Databricks.
- Build and maintain automated ingestion and transformation pipelines, embedding observability, failover handling, and performance optimization.
- Implement and document Infrastructure-as-Code (IaC) using tools such as Terraform, Bicep, or ARM templates.
- Develop and maintain automated deployment pipelines (CI/CD) using Azure DevOps for data engineering solutions and infrastructure as code.
- Enforce role-based access controls (RBAC), encryption standards, and identity boundary rules for data assets in accordance with governance guidance.
- Collaborate with the Lead, Data Governance to embed metadata capture, lineage tracing, and data validation into system pipelines.
- Partner with the Data & Insights function to deliver curated, high-performance datasets that power semantic models and enterprise dashboards in Power BI.
- Monitor platform health, resolve issues proactively, and implement improvements to enhance performance, optimize costs, and ensure high reliability.
- Lead platform hardening activities including penetration test response, network policy enforcement, and service principal rotation.
- Maintain infrastructure logs, system health reports, and support audit-readiness documentation for internal and external review.
- Act as an escalation point for pipeline failures, system outages, or data delivery delays, with a mandate to resolve or coordinate cross-team recovery.
- Provide mentorship and on-the-ground support for technical staff or external engineering partners.
- Represent data engineering at architectural planning meetings, tool evaluations, and roadmap discussions.
- Perform technical gap coverage across all Enterprise Analytics operations as directed by the Director, including ad hoc solution prototyping, tool configuration, and data remediation efforts.
What You Bring to the Team
- Bachelor’s degree in Computer Science, Data Engineering, Information Systems, or a related technical discipline.
- 7+ years of experience in enterprise data engineering, cloud infrastructure, or platform operations, with proven ability to lead end-to-end implementation.
- Expertise in deploying and managing Azure-based analytics platforms, including provisioning, networking, security, and integration of core services.
- Hands-on experience with Azure Data Factory, Microsoft Fabric, Azure SQL, Synapse, and Databricks.
- Proficiency in SQL and strong skills in Python or equivalent scripting languages for data engineering and automation tasks.
What Will Set You Apart
- Working knowledge of Infrastructure-as-Code (IaC) tools such as Terraform, Bicep, or ARM templates for scalable environment provisioning.
- Demonstrated experience designing and managing Continuous Integration/Continuous Delivery (CI/CD) pipelines using Azure DevOps or similar tools.
- Ability to collaborate effectively with governance teams on metadata alignment, RBAC, lineage, and compliance standards.
- Strong background in performance optimization, platform monitoring, cost management, and resiliency planning.
- Excellent problem-solving, analytical, and troubleshooting skills, with the ability to work independently in dynamic environments.
- Strong written and verbal communication skills, including technical documentation.
- Commitment to ethical data practices and alignment with WoodGreen’s mission-driven values.
WoodGreen is an equal opportunity employer. We are committed to providing an inclusive and barrier-free selection process and work environment. If contacted in relation to an employment opportunity, please advise our People & Culture representatives at careers@woodgreen.org of the accommodation measures required. Information received relating to accommodation will be addressed confidentially.
This public job posting uses AI-powered tools to screen, assess, or select applicants.