Overview
The Data Architect is the senior technical authority responsible for ensuring the architectural integrity, scalability, and long-term stability of the Snowflake Enterprise Data Warehouse (EDW).
This role bridges operational reliability and strategic modernization: balancing the immediate needs of production support with long-term initiatives focused on innovation, performance, and continuous improvement.
The Data Architect acts as a hands‑on leader, mentor, and escalation point for complex technical issues, setting standards and driving excellence across data engineering, BI, and quality assurance teams.
Key Responsibilities
- Design and document end-to-end data architecture and integration solutions that comply with enterprise standards.
- Contribute to the EDW technical roadmap, identifying opportunities for optimization, refactoring, and technology upgrades.
- Define and enforce engineering standards, CI/CD practices, and data quality frameworks.
- Serve as the final technical approver for major code changes, pull requests, and architectural decisions.
- Act as the top-level technical escalation point for critical P1/P2 incidents and recurring issues.
- Lead root cause analysis during major outages, performing deep investigations across the full data stack.
- Drive long-term problem management by designing sustainable, preventive technical solutions.
- Mentor and upskill Data Engineers and BI Engineers, promoting adherence to best practices.
- Ensure critical knowledge and documentation are maintained and shared across the team.
- Collaborate with client architects, system owners, and partner vendors to ensure data flow integrity and effective cross‑platform communication.
Requirements
- At least 7–8 years of experience in data‑centric roles, with proven technical leadership (e.g., Solution Architect, Technical Lead, Principal Engineer).
- Core data platforms: Snowflake and Azure Data Factory (ADF).
- Databases & ETL: SQL Server, Informatica, IBM DB2.
- Orchestration: Control‑M.
- BI & Visualization: SAP Business Objects and Power BI.
- Advanced SQL and Python programming.
- Experience working with XML and JSON data formats.
- Broad expertise across ETL/ELT, Data Visualization, and Quality Engineering.
- DevOps & ITSM (Preferred): Azure Pipelines (CI/CD), ServiceNow, Azure DevOps.
- Fluent, professional‑level English (written and verbal).
Key Competencies & Attributes
- Strategic and Pragmatic: Balances long‑term vision with practical problem‑solving.
- Influential Leader: Respected for deep technical expertise and mentoring ability.
- Resilient Under Pressure: Maintains focus and clarity during incidents and tight deadlines.
- Exceptional Communicator: Able to convey complex concepts to both executives and engineers.
- Collaborative and Diplomatic: Builds consensus in multi‑vendor environments to achieve shared goals.
Hiring Details
- Work setup: Hybrid, based in Madrid, Spain (on‑site presence required weekly).
- Eligibility: Open only to candidates legally authorized to work in Spain.
- Engagement: Direct employment.