Job Description
Job Title: Kafka Architect - Cloudera
Location: Leeds (2 days/week onsite)
Duration: 6+ months
Job Summary:
We are looking for an experienced Kafka Architect to design and implement scalable, high-throughput data streaming solutions using Apache Kafka on Cloudera (on-premises). The ideal candidate will have a strong background in distributed systems, data pipelines, and real-time data processing.
Key Responsibilities:
-Design and implement scalable Kafka-based architectures using open-source Kafka on Cloudera on-premises infrastructure.
-Lead the setup, configuration, and optimization of Kafka clusters.
-Define standards and best practices for Kafka producers, consumers, and streaming applications.
-Integrate Kafka with various data sources, storage systems, and enterprise applications.
-Monitor Kafka performance and ensure high availability, fault tolerance, and data security.
-Collaborate with DevOps, Data Engineering, and Application teams to support real-time data needs.
-Automate deployment and configuration using tools like Ansible, Terraform, or Cloudera Manager.
-Provide technical leadership and mentorship to junior team members.
Required Skills:
-Strong hands-on experience with Apache Kafka (including Kafka Connect, Kafka Streams).
-Experience with Cloudera distribution for Kafka on on-premises environments.
-Proficiency in designing high-volume, low-latency data pipelines.
-Solid knowledge of Kafka internals – topics, partitions, consumer groups, offset management, etc.
-Experience with data serialization formats like Avro, JSON, Protobuf.
-Proficient in Java, Scala, or Python for Kafka-based development.
-Familiarity with monitoring tools (Prometheus, Grafana, Confluent Control Center, etc.).
-Understanding of networking, security (SSL/SASL), and data governance.
-Experience with CI/CD pipelines and containerization (Docker, Kubernetes) is a plus.
Kind Regards
--
Priyanka Sharma
Senior Delivery Consultant
Office: 02033759240
Email: psharma@vallumassociates.com