Job Description:
Responsibilities:
- Consult with inquiring teams on how to leverage Kafka within their pipelines.
- Architect, build, and support existing and new Kafka clusters via IaC.
- Partner with Client teams to route traffic through Kafka by utilizing open-source agents and collectors deployed via Chef.
- Remediate any health issues within Kafka.
- Automate the team's operational processes where possible.
- Create new and/or update monitoring dashboards and alerts as needed.
- Manage a continuous integration / continuous delivery (CI/CD) pipeline.
- Perform PoCs on new components to expand and enhance the team’s Kafka offerings.
Skills and Qualifications:
- 2+ years of experience with Kafka clustering & administration.
- 2+ years of experience building, deploying, and supporting multiple Kafka clusters using IaC (Infrastructure-as-Code) best practices.
- Experience developing automated processes within and around Kafka to help supplement the service.
- Experience working with multiple teams to build and architect data pipeline solutions where Kafka will be involved.
- Experience with Linux/Unix and system management.
- IaC (Infrastructure-as-Code) experience with virtual/physical servers using Chef, Ansible, Jenkins, Artifactory, etc.
- Understanding of Git workflows and continuous integration / continuous delivery (CI/CD) concepts.
- Strong verbal and written communication skills.
- Strong technical acumen.
- Strong analytical skills.
- Experience working in Agile/Lean methodologies.
Preferred Qualifications:
- Knowledge and experience with Client, Elastic, Kibana, and Grafana.
- Knowledge and experience with log collection agents such as OpenTelemetry, Fluent Bit, Fluentd, Beats, and Logstash.
- Knowledge and experience with Kubernetes / Docker.
- Knowledge and experience with Kafka Connect.
- Knowledge and experience with AWS or Azure.
- Knowledge and experience with Streaming Analytics.