
Senior Consultant Apache Kafka & Distributed Data Systems (m/f/d)

beON consult

Frankfurt

On-site

EUR 70,000 - 95,000

Full-time

4 days ago
Be among the first applicants


Summary

A leading consultancy firm is seeking a Senior Consultant in Apache Kafka & Distributed Data Systems. This role involves designing and implementing advanced event streaming platforms, providing comprehensive customer consulting, and leading projects utilizing the latest tools and technology in an agile environment. Candidates should possess strong technical skills in Java, Kafka, and cloud technologies, with a customer-oriented approach and the ability to communicate effectively in both German and English.

Qualifications

  • Sound experience and knowledge in Java.
  • Solid experience with Apache Kafka or similar large-scale distributed data systems.
  • Understanding of SDLC processes and Cloud Operations.

Responsibilities

  • Responsible for design and architecture of event streaming platforms.
  • Prepare, plan, and implement highly scalable Apache Kafka projects.
  • Lead design and implementation of streaming architectures using open source and cloud tools.

Skills

Java
Apache Kafka
Big Data Management
Stream Processing
Docker
Kubernetes
SQL Azure
AWS Development
CI/CD
Agile

Education

Completed studies with a technical background

Tools

Apache Spark
CockroachDB
HDFS
Hive
NewRelic
ELK
Jenkins
GIT

Job Description

  • As Senior Consultant Apache Kafka & Distributed Data Systems, you will be responsible for the design, architecture, administration, and deployment of customized, advanced event streaming platforms based on Apache Kafka, following current industry standards and using the latest tools and methods.
  • You work in close contact with your customer and are responsible for the preparation, planning, migration, control, monitoring, and implementation of highly scalable Apache Kafka event streaming platforms and Distributed Data Systems projects, as well as for comprehensive customer consulting on the current state of these technologies.
  • As a Senior Consultant for Big Data Management and Stream Processing, your goal is to lead the design and implementation of architectures for streaming platforms and stream processing use cases using open source and cloud tools.

Qualifications

  • Completed studies or comparable training with a technical background
  • Sound experience and knowledge in Java
  • Solid experience with Apache Kafka or similar large-scale enterprise distributed data systems (e.g. Apache Kafka, Spark, CockroachDB, HDFS, Hive)
  • Experience in software development and automation to run big data systems
  • Experience with developing and implementing complex solutions for Big Data and Data Analytics applications
  • Experience in system deployment and container technology: building, managing, deploying, and release-managing Docker containers and container images based on Docker, OpenShift, and/or Kubernetes
  • Experience in developing resilient scalable distributed systems and microservices architecture
  • Experience with various distributed technologies (e.g. Kafka, Spark, CockroachDB, HDFS, Hive)
  • Experience with stream processing frameworks (e.g. Kafka Streams, Spark Streaming, Flink, Storm)
  • Experience with Continuous Integration / Continuous Delivery (CI/CD) using Jenkins, Maven, Automake, Make, Grunt, Rake, Ant, GIT, Subversion, Artifactory, and Nexus
  • Understanding of SDLC processes (Agile, DevOps) as well as Cloud Operations and Support (ITIL Service Delivery)
  • Knowledge of authentication mechanisms such as OAuth; knowledge of Vert.x and Spring Boot
  • Experience in SQL Azure and in AWS development
  • Experience with DevOps transformation and cloud migration to one of AWS, Azure, Google Cloud Platform, and/or Hybrid/Private Cloud, as well as with cloud-native end-to-end solutions, especially their key building blocks, workload types, migration patterns, and tools
  • Experience with monitoring tools and logging systems such as NewRelic, ELK, Splunk, Prometheus, and Graylog
  • Ability to communicate technical ideas in a business-friendly language
  • Interest in modern organizational structure and an agile working environment (SCRUM)
  • Customer-oriented and enjoy working in an international environment in German and English