Senior Consultant Apache Kafka & Distributed Data Systems (m/f/d)
beON consult
Frankfurt
EUR 60,000 - 100,000
Job Description
As Senior Consultant Apache Kafka & Distributed Data Systems, you will be responsible for the design, architecture, administration, and deployment of customized, advanced event streaming platforms based on Apache Kafka, following current industry standards and using the latest tools and methods.
You will work closely with your customers and take responsibility for the preparation, planning, migration, control, monitoring, and implementation of highly scalable Apache Kafka event streaming platforms and distributed data systems projects, as well as for comprehensive customer consulting on the current state of these technologies.
As a Senior Consultant for big data management and stream processing, your goal is to lead the design and implementation of architectures for streaming platforms and stream processing use cases using open-source and cloud tools.
Qualifications
Completed studies or comparable training with a technical background
Sound experience with and knowledge of Java
Solid experience with Apache Kafka or similar large-scale enterprise distributed data systems (e.g. Spark, CockroachDB, HDFS, Hive)
Experience in software development and automation to run big data systems
Experience with developing and implementing complex solutions for Big Data and Data Analytics applications
Experience with system deployment and container technology: building, managing, deploying, and release-managing containers and container images with Docker, OpenShift, and/or Kubernetes
Experience in developing resilient, scalable distributed systems and microservices architectures
Experience with Continuous Integration / Continuous Delivery (CI/CD) using tools such as Jenkins, Maven, Automake, Make, Grunt, Rake, Ant, Git, Subversion, Artifactory, and Nexus
Understanding of SDLC processes (Agile, DevOps) as well as cloud operations and support (ITIL service delivery)
Knowledge of authentication mechanisms such as OAuth; knowledge of Vert.x and Spring Boot
Experience with Azure SQL and AWS development
Experience with DevOps transformation and cloud migration to AWS, Azure, Google Cloud Platform, and/or hybrid/private cloud, as well as with cloud-native end-to-end solutions, in particular their key building blocks, workload types, migration patterns, and tools
Experience with monitoring tools and logging systems such as New Relic, the ELK stack, Splunk, Prometheus, and Graylog
Ability to communicate technical ideas in a business-friendly language
Interest in modern organizational structures and an agile working environment (Scrum)
Customer orientation and enjoyment of working in an international environment in German and English