
Kafka Expert – Data Streaming Engineer

Myticas Consulting

Ottawa

On-site

CAD 100,000 - 130,000

Full time


Job summary

A technology consulting firm in Canada is seeking a skilled Kafka Expert to build and optimize real-time data streaming solutions. You will manage Apache Kafka clusters and integrate the surrounding tooling (Kafka Connect, Schema Registry, Kafka Streams) on Kubernetes. The ideal candidate has 5+ years of production experience with Kafka and strong Linux skills to ensure high availability and performance across distributed systems. The role also covers designing streaming pipelines and implementing observability practices.

Skills

Apache Kafka
Linux systems
Kubernetes
Scripting and automation
Apache Spark
Apache Flink
IaC, CI/CD pipelines, and GitOps

Tools

Kafka Connect
Kafka Streams
Prometheus
Grafana
Fluent Bit

Job description

Kafka Expert – Data Streaming Engineer

We are seeking a highly skilled Kafka Expert with deep expertise in Apache Kafka, Linux systems, and Kubernetes to join our data platform team. This is a hands-on engineering role focused on building and optimizing real-time data streaming solutions that are scalable, secure, and production-ready. You’ll work closely with infrastructure and development teams to design streaming architectures, enable robust event-driven use cases, and ensure high availability and performance across distributed systems.

Key Responsibilities
  • Design, deploy, and manage Apache Kafka clusters in development, testing, and production.

  • Optimize Kafka performance, reliability, and scalability for high-throughput pipelines.

  • Integrate Kafka with tools such as Kafka Connect, Schema Registry, MirrorMaker, and Kafka Streams.

  • Deploy and operate Kafka on Kubernetes using Helm, Operators, or custom manifests.

  • Manage and troubleshoot Linux systems (Red Hat, Debian, Ubuntu) supporting Kafka.

  • Build automation and Infrastructure-as-Code (IaC) using Ansible, Terraform, GitLab CI/CD, and GitOps practices.

  • Implement observability and alerting with Prometheus, Grafana, Fluent Bit, Spark UI, and Flink Dashboard.

  • Lead incident response and root cause analysis for Kafka and related systems.

  • Collaborate with teams to build end-to-end data pipelines for real-time integration, event-driven microservices, and logging/monitoring.

  • Design and operate Spark Structured Streaming and Flink DataStream pipelines, including:

      • Resource allocation, scaling, and job scheduling.

      • State management, checkpointing, and fault tolerance.

      • Performance tuning for low latency and high throughput.
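In practice, the observability and alerting responsibility above often starts with tracking consumer lag (log end offset minus committed offset, per partition). A minimal sketch of that computation in plain Python; in a real deployment the two offset maps would come from Kafka's AdminClient or the `kafka-consumer-groups` CLI, and the topic names and numbers below are purely illustrative:

```python
# Sketch: per-partition consumer lag = log end offset - committed offset.
# In production these maps would come from Kafka (AdminClient offset lookups
# or `kafka-consumer-groups --describe`); the values here are made up.

def compute_lag(end_offsets: dict, committed: dict) -> dict:
    """Return lag per (topic, partition); uncommitted partitions count from offset 0."""
    return {tp: end - committed.get(tp, 0) for tp, end in end_offsets.items()}

def over_threshold(lag: dict, threshold: int) -> list:
    """Partitions whose lag exceeds the alert threshold, worst first."""
    return sorted(
        (tp for tp, n in lag.items() if n > threshold),
        key=lambda tp: lag[tp],
        reverse=True,
    )

if __name__ == "__main__":
    end_offsets = {("orders", 0): 1_500, ("orders", 1): 900, ("orders", 2): 410}
    committed = {("orders", 0): 1_480, ("orders", 1): 120}  # partition 2: no commit yet
    lag = compute_lag(end_offsets, committed)
    print(lag)                       # {('orders', 0): 20, ('orders', 1): 780, ('orders', 2): 410}
    print(over_threshold(lag, 100))  # [('orders', 1), ('orders', 2)]
```

A Prometheus exporter such as kafka_exporter publishes essentially this metric, which Grafana alert rules can then threshold per consumer group.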

Required Qualifications
  • 5+ years administering and supporting Apache Kafka in production.

  • Strong Linux system administration skills (Red Hat, Debian, Ubuntu).

  • Solid experience with Kubernetes (OpenShift, Rancher, or upstream distributions).

  • Proficiency in scripting and automation (Bash, Python, Ansible).

  • Experience with IaC, CI/CD pipelines, and GitOps.

  • Strong background in Kafka security, monitoring, and schema management.

  • 4+ years building scalable pipelines with Apache Spark and/or Apache Flink, with expertise in performance tuning and distributed systems.
