A leading company in Leeds is seeking an experienced Kafka Architect to design and implement scalable data streaming solutions using Apache Kafka on Cloudera. The ideal candidate will have expertise in distributed systems and real-time data processing, with responsibilities including optimizing Kafka clusters and collaborating with various teams to support data needs.
Job Title: Kafka Architect - Cloudera
Location: Leeds, West Yorkshire, United Kingdom
Duration: 6+ Months
Job Summary:
We are looking for an experienced Kafka Architect to design and implement scalable, high-throughput data streaming solutions using Apache Kafka on Cloudera (on-premises). The ideal candidate will have a strong background in distributed systems, data pipelines, and real-time data processing.
Key Responsibilities:
-Design and implement scalable Kafka-based architectures using open-source Kafka on Cloudera on-premises infrastructure.
-Lead the setup, configuration, and optimization of Kafka clusters.
-Define standards and best practices for Kafka producers, consumers, and streaming applications.
-Integrate Kafka with various data sources, storage systems, and enterprise applications.
-Monitor Kafka performance and ensure high availability, fault tolerance, and data security.
-Collaborate with DevOps, Data Engineering, and Application teams to support real-time data needs.
-Automate deployment and configuration using tools like Ansible, Terraform, or Cloudera Manager.
-Provide technical leadership and mentorship to junior team members.
Required Skills:
-Strong hands-on experience with Apache Kafka (including Kafka Connect, Kafka Streams).
-Experience with the Cloudera distribution of Kafka in on-premises environments.
-Solid knowledge of Kafka internals – topics, partitions, consumer groups, offset management, etc.
-Experience with data serialization formats such as Avro, JSON, and Protobuf.
-Proficient in Java, Scala, or Python for Kafka-based development.
-Familiarity with monitoring tools (Prometheus, Grafana, Confluent Control Center, etc.).
-Understanding of networking, security (SSL/SASL), and data governance.
-Experience with CI/CD pipelines and containerization (Docker, Kubernetes) is a plus.