GCP Architect with Kafka Administration experience

Highbrow LLC

Remote

EUR 103,000 - 130,000

Full-time

21 days ago

Summary

A leading technology company is seeking an experienced GCP Architect with strong Kafka administration skills. The role involves managing Kafka clusters, optimizing performance, and ensuring compliance with security standards. Ideal candidates will have over 15 years in infrastructure, proven GCP experience, and excellent communication skills. This is a remote position suitable for candidates across the USA.

Qualifications

  • 15+ years of experience in infrastructure, with 5+ years as a Kafka Administrator or in a similar role.
  • Proven experience in GCP as a cloud architect.
  • Strong understanding of distributed systems and data streaming.

Responsibilities

  • Install, configure, and maintain Kafka clusters across various environments.
  • Monitor Kafka cluster health and performance using appropriate tools.
  • Implement and manage security protocols for Kafka infrastructure.

Skills

GCP
Kafka
Linux/Unix
Terraform
GitLab CI/CD

Tools

Prometheus
Grafana
Ansible
Docker
Kubernetes

Job Description

Job Title: GCP Architect with Kafka Administration experience

Employment Type: W2

Duration: Long term

Visa Type: All visa types eligible to work on W2

Location: Remote (USA)

Key Skills: GCP and Kafka

Job Description
  • 15+ years of experience in infrastructure, with 5+ years as a Kafka Administrator or in a similar role managing Kafka clusters.
  • 5+ years as a GCP cloud architect.
  • Proven experience as a Kafka Administrator or in a similar role managing Kafka clusters (a minimal scripted example is sketched after this list).
  • Strong understanding of distributed systems, data streaming, and event-driven architectures.
  • Experience with Linux/Unix operating systems and shell scripting.
  • Proficiency in GCP and containerization technologies (Docker, Kubernetes) is a plus.
  • Experience with network security principles and practices.
  • Proficiency in Terraform for infrastructure as code.
  • Experience with GitLab CI/CD pipelines is a plus.
  • Strong customer communication skills to help expand and secure more work with clients.
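
To give a concrete flavor of the Kafka administration and security work listed above, here is a minimal sketch of a scripted admin task using the confluent-kafka Python client. The client library, broker address, certificate paths, and topic settings are illustrative assumptions, not requirements stated in the posting.

```python
# Illustrative only: broker address, certificate paths, and topic settings are
# placeholders, not values taken from the job posting.
from confluent_kafka.admin import AdminClient, NewTopic

# Admin client configured for SSL/TLS, in line with the security requirements above.
admin = AdminClient({
    "bootstrap.servers": "kafka-broker:9093",
    "security.protocol": "SSL",
    "ssl.ca.location": "/etc/kafka/certs/ca.pem",
    "ssl.certificate.location": "/etc/kafka/certs/client.pem",
    "ssl.key.location": "/etc/kafka/certs/client.key",
})

# Quick cluster snapshot: broker count and topic count from cluster metadata.
metadata = admin.list_topics(timeout=10)
print(f"{len(metadata.brokers)} brokers, {len(metadata.topics)} topics visible")

# Create a topic with explicit partitioning and replication, then wait for the result.
futures = admin.create_topics(
    [NewTopic("example-events", num_partitions=6, replication_factor=3)]
)
for topic, future in futures.items():
    try:
        future.result()  # raises if creation failed (e.g. the topic already exists)
        print(f"created topic {topic}")
    except Exception as exc:
        print(f"topic {topic} not created: {exc}")
```

The same task is often done with the stock kafka-topics.sh CLI or an infrastructure-as-code tool; a client library is used here only to keep the sketch self-contained.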
Job Responsibilities
  • Cluster Management:
  • Install, configure, and maintain Kafka clusters across various environments (development, testing, production).
  • Perform upgrades and patching of Kafka and related components (e.g., Zookeeper).
  • Ensure optimal performance and reliability of Kafka clusters.
  • Monitoring and Troubleshooting:
  • Monitor Kafka cluster health and performance using tools like Prometheus, Grafana, or proprietary monitoring solutions (a minimal monitoring check is sketched after this list).
  • Diagnose and resolve issues with Kafka brokers, topics, partitions, and consumers.
  • Implement proactive measures to prevent potential issues.
  • Security and Compliance:
  • Implement and manage security protocols for Kafka, including SSL/TLS encryption, Kerberos authentication, and access control policies.
  • Ensure compliance with organizational and industry standards for data security and privacy.
  • Apply network security principles to protect Kafka infrastructure.
  • Capacity Planning and Scalability:
  • Perform capacity planning to ensure the Kafka infrastructure can handle current and future workloads.
  • Optimize Kafka configurations for performance and scalability based on application requirements.
  • Backup and Recovery:
  • Develop and maintain disaster recovery plans for Kafka environments.
  • Implement and test backup and restore procedures to ensure data integrity and availability.
  • Collaboration and Support:
  • Work closely with development teams to understand Kafka usage patterns and provide guidance on best practices.
  • Provide support for Kafka-related issues, including on-call support as needed.
  • Document Kafka infrastructure, configurations, and operational procedures.
  • Automation and Scripting:
  • Develop automation scripts for routine tasks such as cluster provisioning, monitoring, and maintenance using tools like Ansible, Puppet, or custom scripts.
  • Implement CI/CD pipelines for Kafka-related deployments and updates.
  • Client Communication and Reporting:
  • Maintain regular communication with clients to provide updates and gather feedback.
  • Prepare and present weekly and monthly status reports to stakeholders.
  • Present proof-of-concept (POC) solutions and designs to clients and internal teams.
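
As one example of the proactive monitoring described under Monitoring and Troubleshooting, here is a minimal sketch of a health check against a Prometheus server. The Prometheus address and the under-replicated-partitions metric name are assumptions (the exact name depends on how the JMX exporter is configured), not details taken from the posting.

```python
# Illustrative only: the Prometheus URL and metric name are assumptions; the
# actual metric name depends on how the Kafka JMX exporter is configured.
import requests

PROMETHEUS_URL = "http://prometheus.internal:9090"  # placeholder address
QUERY = "sum(kafka_server_replicamanager_underreplicatedpartitions)"  # assumed metric name

def under_replicated_partitions() -> float:
    """Return the cluster-wide count of under-replicated partitions."""
    resp = requests.get(
        f"{PROMETHEUS_URL}/api/v1/query",
        params={"query": QUERY},
        timeout=5,
    )
    resp.raise_for_status()
    results = resp.json()["data"]["result"]
    # An empty result set means the metric is not being scraped at all.
    if not results:
        raise RuntimeError("metric not found; check the JMX exporter configuration")
    return float(results[0]["value"][1])

if __name__ == "__main__":
    count = under_replicated_partitions()
    # Any under-replicated partition is a signal to investigate broker or disk issues.
    print("OK" if count == 0 else f"ALERT: {count:.0f} under-replicated partitions")
```

In practice such a check would typically live in alerting rules rather than a standalone script, but the query illustrates the kind of signal a Kafka administrator watches day to day.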