Location: London, UK (Hybrid - 3 days a week in person)
Contract: 12 months+
Start: ASAP
Status: Inside IR35
Key Responsibilities:
- Evaluate the Cyber Analytics Platform architecture and design the data lake structure.
- Identify data sources for ingestion and define a prioritized backlog of processing requirements.
- Design scalable data pipelines and document ingestion patterns.
- Provide technical leadership and mentorship to the engineering team.
- Implement data governance, access controls, and security models.
- Work with DevOps engineers to establish robust CI/CD processes for data workflows.
- Build cloud-based solutions focusing on high-performance analytics and distributed data processing.
- Optimize data workflows and pipelines for large-scale data processing (terabytes/billions of events).
- Design cost-efficient, scalable solutions in the cloud.
- Ensure data quality, security, and compliance in all pipelines.
- Perform ongoing tuning and optimization for pipeline performance.
Required Skills & Qualifications:
- Expertise in data ingestion from sources like Azure Event Hubs, Kafka, REST APIs, and file storage.
- Proficiency with Azure services, including Data Lake, Event Hubs, and Storage Accounts.
- Advanced knowledge of Databricks, Spark, and distributed data processing frameworks.
- Strong experience in ETL/ELT optimization and data modeling.
- Familiarity with CI/CD for data workflows.
- Background in data governance, security, and compliance.
- Technical leadership experience in guiding engineering teams.
- Proven experience in designing and implementing cloud-based big data analytics platforms.
- Tools expertise: Azure Cloud Services, Databricks, DevOps (CI/CD); Terraform/Infrastructure as Code a plus.