Remote DevOps Engineer (Kafka/Kong)

Pennant Solutions Group

Richmond (VA)

Remote

USD 90,000 - 150,000

Full time

30+ days ago

Job summary

An established industry player is seeking a skilled Senior Kafka/Kong Administrator with a strong DevOps background to join their remote IT operations team. In this pivotal role, you will oversee the deployment, configuration, and maintenance of Kafka clusters and Kong gateways, ensuring high availability and optimal performance of real-time platforms. Your expertise will support data engineering and application teams, driving seamless data integration. If you have a passion for technology and a proven track record in Kafka and Kong administration, this opportunity offers a chance to make a significant impact in a collaborative environment.

Qualifications

  • 5+ years of experience in administering Apache Kafka clusters.
  • Strong understanding of Kafka internals and ecosystem tools.
  • Bachelor's degree in Computer Science or equivalent experience.

Responsibilities

  • Deploy and configure Kafka clusters for optimal performance.
  • Implement monitoring solutions to identify performance bottlenecks.
  • Collaborate with teams to ensure smooth integration of Kafka.

Skills

Kafka Administration
Kong Administration
DevOps
Cloud Platforms (AWS, Azure, GCP)
Linux Systems
Containerization (Docker, Kubernetes)
Troubleshooting
API Design Standards
Security Measures
Documentation

Education

Bachelor's degree in Computer Science or related field

Tools

Kafka
Kong API Gateway
Kubernetes
Docker
Kafka Connect
Kafka Streams
Schema Registry

Job description

Senior Kafka Administrator/Kong Administrator with DevOps Experience

Remote, Contract to Hire

Our client is seeking a skilled and experienced Senior Kafka/Kong Administrator to join their IT operations team. In this role, you will be responsible for the deployment, configuration, maintenance, and monitoring of their Kafka clusters and Kong gateways, ensuring the high availability, reliability, and optimal performance of their real-time platforms. You will support the data engineering and application teams in delivering seamless data integration and processing.

Responsibilities: Kafka Administration
  1. Cluster Deployment and Configuration:
    • Deploy and configure Kafka clusters following best practices for scalability, security, and performance.
    • Collaborate with cross-functional teams to gather requirements and design Kafka infrastructure to meet data streaming needs.
    • Manage topics, partitions, replication factors, and broker configurations to ensure efficient data distribution and fault tolerance.
  2. Monitoring and Performance Optimization:
    • Implement monitoring and alerting solutions to proactively identify and address performance bottlenecks, resource constraints, and anomalies.
    • Conduct regular performance testing, load testing, and capacity planning to ensure clusters can handle anticipated workloads.
  3. High Availability and Disaster Recovery:
    • Design and implement high-availability strategies, including failover mechanisms and data replication across multiple data centers or cloud regions.
    • Develop and maintain disaster recovery plans and procedures to minimize data loss and downtime in the event of failures.
  4. Security and Compliance:
    • Implement and manage security measures such as encryption, authentication, and authorization to ensure data privacy and compliance with industry standards.
    • Stay updated on security vulnerabilities and patches, applying necessary updates to maintain a secure Kafka environment.
  5. Troubleshooting and Issue Resolution:
    • Diagnose and resolve Kafka-related issues, including performance degradation, data loss, and connectivity problems.
    • Collaborate with development and data engineering teams to troubleshoot consumer/producer application integration with Kafka.
  6. Documentation and Knowledge Sharing:
    • Maintain thorough documentation of Kafka configurations, deployment processes, and troubleshooting procedures.
    • Provide training and knowledge sharing sessions to junior team members and other stakeholders.
  7. Collaboration and Communication:
    • Collaborate with cross-functional teams, including data engineers, developers, and system administrators, to ensure smooth integration of Kafka into our data ecosystem.
    • Communicate effectively with stakeholders to provide updates on Kafka performance, maintenance, and improvements.
  8. Confluent Certified Administrator for Apache Kafka (CCAAK) certification preferred.
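Item 2 above calls for monitoring that surfaces performance bottlenecks, and consumer lag is the canonical Kafka health metric. The sketch below shows only the lag arithmetic (log-end offset minus committed offset); in a real deployment these offsets would come from Kafka's AdminClient or consumer-group APIs, and the topic name and numbers here are hypothetical:

```python
# Consumer lag per partition: log-end offset minus last committed offset.
# Minimal sketch of the arithmetic behind lag monitoring; offsets are
# hard-coded placeholders rather than values fetched from a live cluster.

def consumer_lag(end_offsets, committed_offsets):
    """Return {(topic, partition): lag} for every known partition.

    A partition with no committed offset is treated as fully unread.
    """
    return {
        tp: end - committed_offsets.get(tp, 0)
        for tp, end in end_offsets.items()
    }

# Hypothetical snapshot of a 3-partition topic named "orders":
end = {("orders", 0): 1000, ("orders", 1): 980, ("orders", 2): 1020}
committed = {("orders", 0): 1000, ("orders", 1): 950}

lag = consumer_lag(end, committed)
# Partition 0 is caught up, partition 1 lags by 30 messages,
# and partition 2 has never been read by this consumer group.
```

In practice the per-partition lag would be exported to an alerting system (e.g., as a Prometheus gauge) so sustained growth triggers a page.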
Responsibilities: Kong Administration
  1. Hands-on experience installing and configuring Kong API Gateway and integrating it with existing API frameworks.
  2. Configure data-plane security using RBAC, Roles, Workspaces, and Teams.
  3. Configure upstreams and load balancing.
  4. Create Services, Routes, and Consumers.
  5. Experienced in the Kubernetes platform.
  6. Working knowledge of Kong Ingress with Kubernetes/Docker.
  7. Excellent knowledge of API design standards, patterns, and best practices, especially OpenAPI 2.0, REST, JSON, and XML.
  8. Troubleshoot problems on Kong Gateway.
  9. Distinguish between downstream and upstream networking issues.
  10. Collect troubleshooting/debug information.
  11. Use log files and health and traffic metrics to diagnose issues.
  12. Deep understanding of JWTs and their usage.
  13. Kong Gateway Certified Associate (KGCA) certification preferred.
  14. 5+ years of experience with monitoring and troubleshooting; able to monitor and debug platform or application issues.
  15. Use Kong Vitals to monitor Kong Gateway’s health and performance.
  16. Pull metrics using Vitals API.
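The core Kong duties above (services, routes, consumers, upstreams with load balancing, and JWT) can be expressed in Kong's declarative, DB-less configuration format. The fragment below is an illustrative sketch only; every name, host, and secret is a hypothetical placeholder:

```yaml
# Illustrative Kong declarative config (DB-less / decK style).
# All entity names, hosts, and secrets are placeholders.
_format_version: "3.0"

services:
  - name: orders-service            # hypothetical service
    url: http://orders-upstream     # resolves to the upstream below
    routes:
      - name: orders-route
        paths:
          - /orders

upstreams:
  - name: orders-upstream           # load-balanced backend pool
    algorithm: round-robin
    targets:
      - target: orders-1.internal:8080
        weight: 100
      - target: orders-2.internal:8080
        weight: 100

consumers:
  - username: reporting-app         # hypothetical consumer
    jwt_secrets:
      - key: reporting-app-issuer   # must match the JWT's iss/key claim
        algorithm: HS256
        secret: "change-me"         # placeholder, never commit real secrets

plugins:
  - name: jwt                       # require a valid JWT on this route
    route: orders-route
```

A config like this is typically validated and applied with decK (`deck gateway sync`) or loaded at startup in DB-less mode, which keeps gateway state reviewable in version control.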
Qualifications:
  • Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent work experience).
  • 5+ years of hands-on experience in administering Apache Kafka clusters in production environments.
  • 5+ years of proficiency in Kafka/Kong architecture, installation, configuration, and tuning.
  • Strong understanding of Kafka internals, including topics, partitions, replication, and consumer/producer APIs.
  • 5+ years of experience with Kafka ecosystem tools such as Kafka Connect, Kafka Streams, and Schema Registry.
  • Knowledge of Linux systems and shell scripting.
  • Strong DevOps background.
  • 5+ years of experience with cloud platforms (e.g., AWS, Azure, GCP) and containerization (e.g., Docker, Kubernetes).
  • Excellent problem-solving skills and the ability to troubleshoot complex issues efficiently.
  • Strong communication skills and the ability to work collaboratively in a team environment.
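The qualifications mention Kafka Connect, which is typically operated by POSTing a JSON connector definition to the Connect REST API. As one hedged illustration, the fragment below sketches a JDBC sink connector (Confluent's `JdbcSinkConnector`); the connector name, topic, and connection details are placeholders:

```json
{
  "name": "orders-jdbc-sink",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "tasks.max": "2",
    "topics": "orders",
    "connection.url": "jdbc:postgresql://db.internal:5432/analytics",
    "connection.user": "connect",
    "connection.password": "change-me",
    "insert.mode": "upsert",
    "pk.mode": "record_key",
    "auto.create": "true"
  }
}
```

Registering it is a single REST call against a running Connect worker, e.g. `curl -X POST -H "Content-Type: application/json" --data @sink.json http://connect.internal:8083/connectors` (host is a placeholder).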