A leading company in the technology sector is seeking a DV (MOD) Cleared Data Engineer in Bristol. This hands-on contract role involves designing robust data solutions using the Elastic Stack and Apache NiFi, with an emphasis on secure data management. Ideal candidates will have experience working in regulated sectors, a focus on high-performance data pipelines, and a record of collaboration with cybersecurity stakeholders.
DV (MOD) Cleared Data Engineer - Elastic Stack & Apache NiFi
Location: Bristol | Rate: £430.00 per day (Outside IR35) | Working Pattern: Hybrid (3–4 days on-site)
Are you a contract Data Engineer with a knack for designing secure, high-performance data solutions? We're on the lookout for a technical expert in the Elastic Stack and Apache NiFi to take the lead in building robust, real-time data pipelines in a security-focused environment.
This is a hands-on contract opportunity to make a real impact, ideal for professionals with a strong track record in regulated sectors.
What You'll Be Doing
Designing and deploying scalable, secure data pipelines using Elasticsearch, Logstash, Kibana, and Apache NiFi (a short ingest sketch follows this list)
Handling real-time data ingestion and transformation with an emphasis on integrity and availability
Collaborating with architects and cybersecurity stakeholders to align with governance and compliance needs
Monitoring and optimising high-throughput data flows across on-prem and cloud environments
Building insightful Kibana dashboards to support business intelligence and operational decision-making
Maintaining documentation of data flows, architecture, and security procedures to ensure audit-readiness
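To give a flavour of the hands-on work, here is a minimal sketch of a secure bulk-ingest step of the kind described above, using the official Python Elasticsearch client. The endpoint, index name, service account, and CA path are placeholders, not details of the real environment:

    # Minimal sketch only: endpoint, credentials, and index are placeholders.
    from elasticsearch import Elasticsearch, helpers

    es = Elasticsearch(
        "https://elastic.example.internal:9200",    # placeholder cluster endpoint
        ca_certs="/etc/ssl/certs/internal-ca.pem",  # verify TLS against an internal CA
        basic_auth=("ingest_svc", "REDACTED"),      # service account, not a personal login
    )

    def actions(records, index="ops-events"):
        # One bulk action per source record.
        for rec in records:
            yield {"_index": index, "_source": rec}

    sample = [{"host": "node-01", "status": "ok", "latency_ms": 12}]
    ok, errors = helpers.bulk(es, actions(sample), raise_on_error=False)
    print(f"indexed={ok} errors={len(errors)}")

Collecting bulk errors rather than raising on the first failure keeps a real-time pipeline flowing while problems are logged and retried, which matches the emphasis on integrity and availability above.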
Your Experience
Must-Have:
Minimum 3 years' experience as a Data Engineer in sensitive or regulated industries
Proficiency in the full Elastic Stack for data processing, analytics, and visualisation
Hands-on expertise with Apache NiFi in designing sophisticated data workflows
Solid scripting capabilities using Python, Bash, or similar
Familiarity with best practices in data protection (encryption, anonymisation, access control); a pseudonymisation sketch follows this list
Experience managing large-scale, real-time data pipelines
Working knowledge of cloud services (AWS, Azure, GCP), especially around secure deployment
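As a small illustration of the scripting and data-protection points above, the sketch below pseudonymises an identifier with a keyed HMAC before it moves downstream. Key handling is deliberately simplified; in practice the key would come from a managed secret store, and all names here are illustrative:

    # Sketch only: key sourcing is simplified and names are illustrative.
    import hashlib
    import hmac
    import os

    PSEUDO_KEY = os.environ["PSEUDO_KEY"].encode()  # assumed provisioned via a secret store

    def pseudonymise(value: str) -> str:
        # Replace an identifier with a keyed, non-reversible token.
        return hmac.new(PSEUDO_KEY, value.encode(), hashlib.sha256).hexdigest()

    record = {"user_id": "jane.doe", "event": "login"}
    record["user_id"] = pseudonymise(record["user_id"])
    print(record)

Unlike a plain hash, a keyed HMAC resists dictionary attacks on low-entropy identifiers while still mapping the same input to the same token across the pipeline.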
Nice-to-Have:
Background in government, defence, or highly regulated sectors
Exposure to big data tools like Kafka, Spark, or Hadoop
Understanding of containerisation and orchestration (e.g. Docker, Kubernetes)
Familiarity with infrastructure as code tools (e.g. Terraform, Ansible)
Experience building monitoring solutions with Prometheus, Grafana, or ELK (see the instrumentation sketch after this list)
Interest in or exposure to machine learning-driven data systems
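On the monitoring point above, a minimal sketch of pipeline instrumentation with the Prometheus Python client might look like the following; the metric names and scrape port are purely illustrative:

    # Sketch only: metric names and port are illustrative, not a real config.
    import time
    from prometheus_client import Counter, Histogram, start_http_server

    RECORDS = Counter("pipeline_records_total", "Records processed", ["outcome"])
    LATENCY = Histogram("pipeline_batch_seconds", "Batch processing time")

    @LATENCY.time()
    def process_batch(batch):
        for _ in batch:
            RECORDS.labels(outcome="ok").inc()

    if __name__ == "__main__":
        start_http_server(9108)  # expose /metrics for Prometheus to scrape
        while True:              # stand-in for the real processing loop
            process_batch([{"n": 1}])
            time.sleep(1.0)

Exposing counters and histograms per outcome is the usual starting point for the Grafana dashboards and alerting rules mentioned above.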