Job Summary
We are seeking a highly skilled Confluent Kafka Engineer to join our dynamic team. The ideal candidate will be responsible for the development, user acceptance testing (UAT), and production support of our Confluent Kafka‑based systems. This role requires deep expertise in Kafka and the surrounding Confluent ecosystem, including Confluent Control Center, ksqlDB, Kafka Streams, and Kafka Connect. The engineer will collaborate closely with cross‑functional teams to ensure the smooth operation of our data streaming services and provide support across various environments.
Key Responsibilities
Development
- Design, develop, and implement Kafka-based solutions to meet business requirements, utilizing Confluent Control Center, ksqlDB, Kafka Streams, and Kafka Connect.
- Write and maintain high‑quality code for Kafka producers, consumers, and stream processing applications.
- Develop and manage Kafka connectors for seamless integration with external systems, ensuring data consistency and reliability.
- Utilize Kafka Streams for real‑time processing of streaming data, transforming and enriching data as it flows through the pipeline (an illustrative sketch follows this list).
- Employ ksqlDB for stream processing tasks, including real‑time analytics and transformations.
- Collaborate with data engineers, software developers, and DevOps teams to integrate Kafka solutions with existing systems.
- Ensure all Kafka‑based solutions are scalable, secure, and optimized for performance.
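For a flavour of the development work involved, the following is a minimal Kafka Streams sketch. The topic names (`orders`, `orders-enriched`), broker address, and the uppercase "enrichment" step are hypothetical placeholders for illustration, not part of any actual requirement.

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class OrderEnrichmentApp {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "order-enrichment");  // consumer group / state store prefix
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // hypothetical broker address
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> orders = builder.stream("orders");      // hypothetical input topic
        orders
            .filter((key, value) -> value != null && !value.isEmpty()) // drop empty records
            .mapValues(value -> value.toUpperCase())                   // placeholder "enrichment" step
            .to("orders-enriched");                                    // hypothetical output topic

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close)); // clean shutdown on SIGTERM
        streams.start();
    }
}
```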
User Acceptance Testing (UAT)
- Develop and execute UAT plans to validate Kafka solutions, including components such as Kafka Streams, ksqlDB, and Kafka Connect, before deployment to production (an illustrative test sketch follows this list).
- Work closely with QA teams to identify and resolve defects during the UAT phase.
- Ensure all UAT activities comply with the organization’s standards and best practices.
- Provide detailed reports on UAT outcomes and work with developers to implement necessary fixes.
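As one example of what topology validation during UAT can look like, here is a minimal sketch using TopologyTestDriver from the standard kafka-streams-test-utils module, which exercises a topology in-process without a running cluster. The topology shape, topic names, and expected output mirror the hypothetical development sketch above.

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.TestInputTopic;
import org.apache.kafka.streams.TestOutputTopic;
import org.apache.kafka.streams.TopologyTestDriver;

public class OrderEnrichmentTopologyTest {
    public static void main(String[] args) {
        // Rebuild the topology under test (same shape as the development sketch above).
        StreamsBuilder builder = new StreamsBuilder();
        builder.<String, String>stream("orders")
               .mapValues(value -> value.toUpperCase())
               .to("orders-enriched");

        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "uat-topology-test");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "dummy:1234"); // never contacted by the test driver
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        try (TopologyTestDriver driver = new TopologyTestDriver(builder.build(), props)) {
            TestInputTopic<String, String> in =
                driver.createInputTopic("orders", new StringSerializer(), new StringSerializer());
            TestOutputTopic<String, String> out =
                driver.createOutputTopic("orders-enriched", new StringDeserializer(), new StringDeserializer());

            in.pipeInput("order-1", "widget");
            System.out.println("orders-enriched -> " + out.readValue()); // expect "WIDGET"
        }
    }
}
```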
Production Support
- Monitor and maintain Kafka clusters, including components like Confluent Control Center, to ensure high availability and reliability of data streaming services.
- Troubleshoot and resolve issues related to Kafka performance, latency, and data integrity, including issues specific to Kafka Streams, ksqlDB, and Kafka Connect.
- Perform routine maintenance tasks such as patching, upgrades, and backups for Kafka clusters and associated components.
- Implement monitoring solutions to proactively identify and mitigate potential production issues, leveraging Confluent Control Center for comprehensive cluster visibility (an illustrative lag‑check sketch follows this list).
- Provide 24/7 support for production systems, including participation in an on‑call rotation.
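As one concrete instance of this kind of proactive monitoring, the sketch below polls consumer‑group lag with the standard Kafka AdminClient. The broker address and group id are hypothetical placeholders; in practice a signal like this would typically feed Prometheus/Grafana dashboards or Control Center alerts rather than standard output.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Properties;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.ListOffsetsResult;
import org.apache.kafka.clients.admin.OffsetSpec;
import org.apache.kafka.clients.consumer.OffsetAndMetadata;
import org.apache.kafka.common.TopicPartition;

public class ConsumerLagCheck {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // hypothetical broker
        String group = "order-enrichment"; // hypothetical consumer group

        try (AdminClient admin = AdminClient.create(props)) {
            // Offsets the group has committed, per partition.
            Map<TopicPartition, OffsetAndMetadata> committed =
                admin.listConsumerGroupOffsets(group).partitionsToOffsetAndMetadata().get();

            // Latest (end) offsets for the same partitions.
            Map<TopicPartition, OffsetSpec> request = new HashMap<>();
            committed.keySet().forEach(tp -> request.put(tp, OffsetSpec.latest()));
            Map<TopicPartition, ListOffsetsResult.ListOffsetsResultInfo> latest =
                admin.listOffsets(request).all().get();

            // Lag = end offset minus committed offset; alert when it crosses a threshold.
            committed.forEach((tp, meta) -> {
                long lag = latest.get(tp).offset() - meta.offset();
                System.out.printf("%s lag=%d%n", tp, lag);
            });
        }
    }
}
```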
Qualifications
Education
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
Experience
- 3+ years of hands‑on experience with Apache Kafka and Confluent Kafka in a production environment, including experience with Confluent Control Center, ksqlDB, Kafka Streams, and Kafka Connect.
- Proven experience in Kafka development, including the producer and consumer APIs, stream processing, and connector development.
- Experience with Kafka cluster management, including setup, configuration, monitoring, and troubleshooting.
- Familiarity with distributed systems, microservices architecture, and event‑driven design patterns.
- Experience with cloud platforms (e.g., AWS, Azure) and container orchestration (e.g., Kubernetes) is a plus.
- Experience with CI/CD pipelines and automation tools (e.g., Jenkins, GitLab CI).
Technical Skills
- Proficiency in programming languages such as Java, Python, or Scala.
- Strong knowledge of Kafka internals, including brokers, ZooKeeper, topics, partitions, and offsets.
- Experience with Kafka Streams for building scalable, fault‑tolerant stream processing applications.
- Experience with ksqlDB for real‑time processing and analytics on Kafka topics.
- Strong understanding of Kafka Connect for integrating Kafka with external data sources and sinks (an illustrative sketch follows this list).
- Experience with monitoring tools (e.g., Prometheus, Grafana) and logging frameworks (e.g., Log4j, ELK Stack).
- Proficiency in using Confluent Control Center for monitoring, managing, and optimizing Kafka clusters.
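To make the Kafka Connect expectation concrete, the sketch below registers a source connector through the Connect REST API. The worker URL, connector name, file path, and topic are hypothetical placeholders, and the bundled FileStreamSource demo connector stands in for the purpose‑built connectors (JDBC, S3, etc.) a real deployment would use.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RegisterConnector {
    public static void main(String[] args) throws Exception {
        // Connector definition: a name plus class-specific config.
        // FileStreamSource ships with Kafka as a demo connector; paths and topics here are placeholders.
        String body = """
            {
              "name": "demo-file-source",
              "config": {
                "connector.class": "org.apache.kafka.connect.file.FileStreamSourceConnector",
                "tasks.max": "1",
                "file": "/tmp/demo-input.txt",
                "topic": "demo-file-lines"
              }
            }
            """;

        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create("http://localhost:8083/connectors")) // assumed Connect worker address
            .header("Content-Type", "application/json")
            .POST(HttpRequest.BodyPublishers.ofString(body))
            .build();

        HttpResponse<String> response =
            HttpClient.newHttpClient().send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body()); // 201 Created on success
    }
}
```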
Soft Skills
- Strong analytical and problem‑solving abilities.
- Excellent communication and teamwork skills.
- Ability to work independently and manage multiple tasks effectively.
- A proactive approach to learning new technologies and improving existing processes.