Kafka Engineer

United Software Group Inc. - Canada

Canada

Remote

CAD 80,000 - 120,000

Full time

30+ days ago

Job summary

An innovative firm is seeking a Kafka Engineer to design and manage Kafka-based data pipelines and messaging solutions. This role involves configuring and maintaining Kafka clusters, ensuring high availability, and optimizing performance. You will collaborate with development teams, implement security measures, and automate operations using tools like Terraform or Ansible. The ideal candidate will have strong experience with Kafka in production, proficiency in Java, Scala, or Python, and a solid understanding of distributed systems. Join a forward-thinking team where your expertise will drive real-time data processing and operational efficiency.

Qualifications

  • 3-5 years of experience with Apache Kafka in a production environment.
  • Knowledge of Kafka security and cluster management.

Responsibilities

  • Design and manage Kafka-based data pipelines.
  • Optimize Kafka configurations for performance and scalability.

Skills

Apache Kafka
Java
Scala
Python
Kafka security (SSL, SASL, ACLs)
Distributed systems
Problem-solving
Communication

Tools

Kafka Streams
KSQL
Apache Flink
Terraform
Ansible
ZooKeeper/KRaft
Schema Registry
Kafka Connect
AWS
Azure
GCP
Docker
Kubernetes

Job description

Job Posting Title: Kafka Engineer

Location: Halifax, CA (Remote)

Responsibilities:
  1. Design, implement, and manage Kafka-based data pipelines and messaging solutions to support critical business operations and enable real-time data processing.
  2. Configure, deploy, and maintain Kafka clusters, ensuring high availability and scalability to maximize uptime and support business growth.
  3. Monitor Kafka performance and troubleshoot issues to minimize downtime and ensure uninterrupted data flow, enhancing decision-making and operational efficiency.
  4. Collaborate with development teams to integrate Kafka into applications and services.
  5. Develop and maintain Kafka connectors (e.g., for Postgres, Cosmos DB, MongoDB, and Azure Blob Storage), along with topics and schemas, to streamline data ingestion from databases, NoSQL data stores, and cloud storage, enabling faster data insights (a connector registration sketch follows this list).
  6. Implement security measures to protect Kafka clusters and data streams, safeguarding sensitive information and maintaining regulatory compliance.
  7. Optimize Kafka configurations for performance, reliability, and scalability.
  8. Automate Kafka cluster operations using infrastructure-as-code tools like Terraform or Ansible to increase operational efficiency and reduce manual overhead.
  9. Provide technical support and guidance on Kafka best practices to development and operations teams, enhancing their ability to deliver reliable, high-performance applications.
  10. Maintain documentation of Kafka environments, configurations, and processes to ensure knowledge transfer, compliance, and smooth team collaboration.
  11. Stay updated with the latest Kafka features, updates, and industry best practices to continuously improve data infrastructure and stay ahead of industry trends.
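For responsibility 5 above, a minimal sketch of how a source connector might be registered through the Kafka Connect REST API, written in Python. The Connect endpoint, connector class, database address, credentials, and connector name are illustrative assumptions; exact configuration keys depend on the connector plugin and version installed.

```python
# Sketch: register a Postgres source connector via the Kafka Connect REST API.
# Endpoint, connector class, and connection details below are placeholders.
import requests

CONNECT_URL = "http://localhost:8083"  # assumed Kafka Connect REST endpoint

connector = {
    "name": "postgres-orders-source",  # hypothetical connector name
    "config": {
        # Confluent JDBC source connector class (assumes the plugin is installed)
        "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
        "connection.url": "jdbc:postgresql://db.example.com:5432/orders",
        "connection.user": "kafka_connect",
        "connection.password": "********",
        "mode": "incrementing",              # poll new rows by an incrementing column
        "incrementing.column.name": "id",
        "topic.prefix": "pg.",               # resulting topics are named pg.<table>
        "tasks.max": "1",
    },
}

# POST /connectors creates the connector; Connect returns the created config.
resp = requests.post(f"{CONNECT_URL}/connectors", json=connector)
resp.raise_for_status()
print("Created connector:", resp.json()["name"])
```

The same request shape applies to the Cosmos DB, MongoDB, and Azure Blob Storage connectors mentioned above; only the connector class and its plugin-specific settings change.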
Required Skills:
  1. 3-5 years of experience working with Apache Kafka in a production environment.
  2. Strong knowledge of Kafka architecture, including brokers, topics, partitions, and replicas.
  3. Experience with Kafka security, including SSL, SASL, and ACLs (a minimal client configuration sketch follows this list).
  4. Proficiency in configuring, deploying, and managing Kafka clusters in cloud and on-premises environments.
  5. Experience with Kafka stream processing using tools like Kafka Streams, KSQL, or Apache Flink.
  6. Solid understanding of distributed systems, data streaming, and messaging patterns.
  7. Proficiency in Java, Scala, or Python for Kafka-related development tasks.
  8. Familiarity with DevOps practices, including CI/CD pipelines, monitoring, and logging.
  9. Experience with tools like ZooKeeper/KRaft, Schema Registry, and Kafka Connect.
  10. Strong problem-solving skills and the ability to troubleshoot complex issues in a distributed environment.
  11. Excellent communication and collaboration skills to work effectively with cross-functional teams and stakeholders.
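As a companion to skills 3 and 7 above, here is a minimal sketch of a Python producer authenticating to a SASL_SSL-protected cluster with the confluent-kafka client. The broker address, SASL mechanism, credentials, CA path, and topic name are assumptions for illustration and must match the actual broker configuration.

```python
# Sketch: a Python producer for a SASL_SSL-secured Kafka cluster using
# confluent-kafka. Broker, credentials, CA path, and topic are placeholders.
from confluent_kafka import Producer

conf = {
    "bootstrap.servers": "broker1.example.com:9093",  # assumed TLS listener
    "security.protocol": "SASL_SSL",                  # TLS transport + SASL auth
    "sasl.mechanisms": "SCRAM-SHA-512",               # must match broker setting
    "sasl.username": "pipeline-svc",
    "sasl.password": "********",
    "ssl.ca.location": "/etc/kafka/ca.pem",           # CA that signed the broker certs
}

producer = Producer(conf)

def delivery_report(err, msg):
    """Report whether each message was acknowledged by the brokers."""
    if err is not None:
        print(f"Delivery failed: {err}")
    else:
        print(f"Delivered to {msg.topic()}[{msg.partition()}] @ offset {msg.offset()}")

# The principal above also needs WRITE permission on the topic via an ACL,
# granted separately by a cluster administrator.
producer.produce("events.example", key="order-42",
                 value=b'{"status":"created"}', callback=delivery_report)
producer.poll(0)   # serve delivery callbacks
producer.flush()   # block until all outstanding messages are delivered
```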
Preferred Skills/Certifications:
  1. Kafka certification or related credentials, such as:
  • Confluent Certified Administrator for Apache Kafka (CCAAK)
  • Cloudera Certified Administrator for Apache Kafka (CCA-131)
  • AWS Certified Data Analytics – Specialty (with a focus on streaming data solutions)
  2. Experience with cloud platforms like AWS, Azure, or GCP.
  3. Knowledge of containerization technologies like Docker and Kubernetes.
  4. Familiarity with other messaging systems like RabbitMQ or Apache Pulsar.
  5. Experience with data serialization formats like Avro or JSON.