We are seeking a highly experienced Kafka Architect to lead the design, development, and optimization of large-scale, real-time data streaming platforms. The ideal candidate will have deep expertise across the Kafka ecosystem and a strong grasp of Confluent Cloud, along with the ability to architect scalable and secure solutions that align with business objectives.
Responsibilities:
Design and implement Kafka-based data pipelines and streaming applications.
Architect end-to-end solutions using the Kafka ecosystem (Kafka Connect, Kafka Streams, ksqlDB, Schema Registry, etc.).
Deploy, configure, and manage Confluent Cloud in a secure and scalable manner.
Optimize Kafka performance through broker and client tuning, partitioning strategy, and replication settings, without compromising security.
Work with DevOps teams to automate deployment and monitoring.
Provide thought leadership and best practices in event-driven architecture.
Collaborate with cross-functional teams including data engineers, solution architects, and product managers.
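Partitioning strategy, one of the core tuning levers above, comes down to how record keys map to partitions. A minimal sketch of that mapping, using CRC32 as a dependency-free stand-in for Kafka's actual default (murmur2); key names and partition counts are illustrative:

```python
import zlib

def assign_partition(key: bytes, num_partitions: int) -> int:
    """Map a record key to a partition, mimicking Kafka's default
    partitioner. Kafka really uses murmur2 hashing; CRC32 is a
    stand-in here to keep the sketch runnable with the stdlib."""
    return zlib.crc32(key) % num_partitions

# Records with the same key always land on the same partition,
# which is what preserves per-key ordering.
assert assign_partition(b"order-42", 6) == assign_partition(b"order-42", 6)

# Distinct keys spread across partitions; key choice therefore
# drives partition balance and achievable consumer parallelism.
used = {assign_partition(f"user-{i}".encode(), 6) for i in range(1000)}
print(sorted(used))
```

The same reasoning explains why adding partitions to an existing topic reshuffles key-to-partition assignments: `num_partitions` changes, so the modulo result changes for most keys.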
Qualifications:
10+ years of IT experience with a strong focus on data architecture and engineering.
5+ years of hands-on experience with Kafka, including Kafka Streams, Kafka Connect, and Schema Registry.
In-depth experience with Confluent Cloud — deployment, configuration, and integration.
Strong programming background in Java, Scala, or Python.
Proficiency with cloud platforms (AWS/GCP/Azure), particularly in deploying Kafka clusters.
Solid understanding of microservices and event-driven architecture.
Familiarity with security standards and mechanisms, including TLS/SSL encryption, SASL authentication, and RBAC authorization.
Experience with Docker for containerization and Kubernetes for orchestration.
Exposure to monitoring tools like Prometheus, Grafana, and Confluent Control Center.
Confluent Certified Developer or Administrator certification is a strong plus.
Strong problem-solving and communication skills.
Ability to work independently and lead technical discussions.
Stakeholder management and documentation skills.
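The security items above (TLS, SASL, RBAC) typically surface in day-to-day work as client configuration. A minimal sketch of a Kafka client properties file for a SASL_SSL-secured cluster such as Confluent Cloud; the bootstrap hostname and credentials are placeholders:

```properties
bootstrap.servers=<CLUSTER_ENDPOINT>:9092
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username='<API_KEY>' \
  password='<API_SECRET>';
```

RBAC itself is enforced server-side (role bindings on clusters, topics, and consumer groups), so the client config stays the same while access is granted or denied per principal.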