A leading company is seeking a highly experienced Data Solution Architect to design and optimize large-scale data and cloud solutions. This role involves architecting scalable data platforms, ensuring compliance, and mentoring teams. On-site work in Coventry is required 2-3 days per week.
Data Solution Architect
Whitehall Resources require a Data Solution Architect to work with a key client on a 3-month initial contract.
*This role will involve on-site work in Coventry 2-3 days per week.
*Inside IR35.
The Role
We are seeking a highly experienced Data/Cloud Solution Architect to join our growing team. This role is responsible for designing, implementing, and optimizing large-scale data and cloud solutions across Azure, Databricks, and Microsoft Fabric. A proven track record of managing complex, large-scale data environments (1000TB+) is essential; prior experience in the utilities or telemetry/SCADA systems domain is a strong plus.
Your responsibilities:
• Architect and implement scalable, secure, and high-performance data platforms using Azure, Databricks, and Microsoft Fabric for real-time and batch processing.
• Lead integration across structured and unstructured data sources such as SCADA, SAP, APIs, telemetry, and time-series data using modern ETL/ELT patterns.
• Establish robust data governance, security, and compliance frameworks including data masking, encryption, lineage, access controls, and regulatory adherence (e.g., GDPR, ISO 27001).
• Design and optimize cloud-native storage and compute architectures using Delta Lake, ADLS Gen2, Synapse, Azure SQL, Cosmos DB, and NoSQL for petabyte-scale workloads.
• Implement event-driven and real-time streaming architectures using Kafka, Azure Event Hubs, IoT Hub, and Lambda architecture patterns.
• Drive DevOps and IaC practices through automated CI/CD pipelines using tools such as Terraform, Bicep, Azure DevOps, and GitHub Actions.
• Collaborate with cross-functional stakeholders including business leads, engineers, security, and vendors to align technology with strategic business outcomes.
• Implement monitoring, observability, and disaster recovery strategies ensuring high availability, system resilience, and proactive issue resolution.
• Lead AI/ML and analytics integrations with Databricks, Power BI, and MDM platforms to enable advanced reporting and insights.
• Mentor and enable internal teams through technical training, knowledge-sharing sessions, and architectural best practices to promote a data-driven culture.
Your Profile
Essential skills/knowledge/experience:
• Azure Data Factory, Synapse Analytics, Azure Databricks
• ADLS, Azure Blob, Azure SQL DB, Cosmos DB, Delta Lake, Oracle DB
• Azure Event Hubs, Kafka, Azure IoT Hub, Azure Data Explorer (ADX)
• Microsoft Fabric, including OneLake
• CI/CD using Azure DevOps, GitHub, ARM templates
• Experience with unstructured data
• Data Model, Data Mapping, ETL Mapping
• Data Governance (Microsoft Purview, Databricks Unity Catalog)
• Data Profiling, Data Quality, Security
• MDM (Profisee, Informatica)
• HADR (high availability and disaster recovery), AI/ML
• Compute and network strategies (Private Endpoints, VNet, ExpressRoute)
• Security frameworks, IAM, RBAC, firewall rules, Zero Trust architecture
• Threat modelling, risk assessments
• Monitoring, Logging
• Cost and performance management
Desirable skills/knowledge/experience:
• MDM (Informatica)
• Data Governance (Collibra)
• Utilities experience
• Experience with SCADA, eSCADA, telemetry, SAP PM, GIS
• Terraform
• Bicep
• Python, Kusto