Leeds, Yorkshire and the Humber, United Kingdom
Posted today
Job Description
Job Title: Kafka Architect - Cloudera
Location: Leeds (2 days/week onsite)
Duration: 6+ months
Job Summary:
We are looking for an experienced Kafka Architect to design and implement scalable, high-throughput data streaming solutions using Apache Kafka on Cloudera (on-premises). The ideal candidate will have a strong background in distributed systems, data pipelines, and real-time data processing.
Key Responsibilities:
- Design and implement scalable Kafka-based architectures using open-source Kafka on Cloudera on-premises infrastructure.
- Lead the setup, configuration, and optimization of Kafka clusters.
- Define standards and best practices for Kafka producers, consumers, and streaming applications.
- Integrate Kafka with various data sources, storage systems, and enterprise applications.
- Monitor Kafka performance and ensure high availability, fault tolerance, and data security.
- Collaborate with DevOps, Data Engineering, and Application teams to support real-time data needs.
- Automate deployment and configuration using tools like Ansible, Terraform, or Cloudera Manager.
- Provide technical leadership and mentorship to junior team members.
Required Skills:
- Strong hands-on experience with Apache Kafka (including Kafka Connect and Kafka Streams).
- Experience with the Cloudera distribution of Kafka in on-premises environments.
- Proficiency in designing high-volume, low-latency data pipelines.
- Solid knowledge of Kafka internals: topics, partitions, consumer groups, offset management, etc.
- Experience with data serialization formats such as Avro, JSON, and Protobuf.
- Proficiency in Java, Scala, or Python for Kafka-based development.
- Familiarity with monitoring tools (Prometheus, Grafana, Confluent Control Center, etc.).
- Understanding of networking, security (SSL/SASL), and data governance.
- Experience with CI/CD pipelines and containerization (Docker, Kubernetes) is a plus.
Kind Regards
--
Priyanka Sharma
Senior Delivery Consultant
Office: 02033759240
Email: