Senior Consultant Apache Kafka & Distributed Data Systems (m/f/d)

TN Germany

Berlin

On-site

EUR 60,000 - 100,000

Full-time

21 days ago

Summary

An established industry player is seeking a Senior Consultant specializing in Apache Kafka and Distributed Data Systems. In this pivotal role, you will design and deploy advanced event streaming platforms while collaborating closely with clients to ensure successful implementation and migration of scalable solutions. You will leverage your expertise in Big Data Management, cloud technologies, and microservices architecture to drive impactful projects. Join a dynamic team that values innovation and fosters an agile working environment, where your contributions will play a key role in shaping the future of data systems. If you thrive in an international setting and are passionate about technology, this opportunity is perfect for you.

Qualifications

  • Technical background or comparable training is required.
  • Experience in Java and Apache Kafka is essential.
  • Knowledge of cloud technologies and DevOps practices is beneficial.

Responsibilities

  • Design and implement advanced event streaming platforms using Apache Kafka.
  • Lead customer consulting on scalable data systems and migration projects.
  • Develop architectures for streaming platforms and data analytics applications.

Skills

Java
Apache Kafka
Big Data Management
Stream Processing
Microservices Architecture
Agile Methodologies
DevOps Transformation
Cloud Migration
SQL Azure
Communication Skills

Education

Completed studies in a technical field

Tools

Docker
Kubernetes
Jenkins
AWS
Azure
New Relic
ELK
Prometheus

Job Description

Senior Consultant Apache Kafka & Distributed Data Systems (m/f/d), Berlin

Client:

beON consult

Location:

Berlin, Germany

Job Category:

Consulting

EU work permit required:

Yes

Job Description:

Responsibilities

  • As Senior Consultant Apache Kafka & Distributed Data Systems, you will be responsible for the design, architecture, administration, and deployment of customized, advanced event streaming platforms based on Apache Kafka, built to current industry standards with the latest tools and methods.
  • You stay in close contact with your customers and are responsible for the preparation, planning, migration, control, monitoring, and implementation of highly scalable Apache Kafka event streaming platforms and Distributed Data Systems projects, as well as for comprehensive customer consulting on the current state of these technologies.
  • As a Senior Consultant for Big Data Management and Stream Processing, your goal is to lead the design and implementation of architectures for streaming platforms and stream processing use cases using open-source and cloud tools.

Qualifications

  • Completed studies or comparable training with a technical background
  • Sound experience and knowledge in Java
  • Solid experience with Apache Kafka or similar large-scale enterprise distributed data systems
  • Experience in software development and automation to run big data systems
  • Experience with developing and implementing complex solutions for Big Data and Data Analytics applications
  • Experience in system deployment and container technology with building, managing, deploying, and release managing Docker containers and container images based on Docker, OpenShift, and/or Kubernetes
  • Experience in developing resilient scalable distributed systems and microservices architecture
  • Experience with various distributed technologies (e.g. Kafka, Spark, CockroachDB, HDFS, Hive, etc.)
  • Experience with stream processing frameworks (e.g. Kafka Streams, Spark Streaming, Flink, Storm)
  • Experience with Continuous Integration / Continuous Delivery (CI/CD) using Jenkins, Maven, Automake, Make, Grunt, Rake, Ant, Git, Subversion, Artifactory, and Nexus
  • Understanding of SDLC processes (Agile, DevOps), Cloud Operations and Support (ITIL) Service Delivery
  • Knowledge of authentication mechanisms such as OAuth; knowledge of Vert.x and Spring Boot
  • Experience with SQL Azure and AWS development
  • Experience with DevOps transformation and cloud migration to one of AWS, Azure, Google Cloud Platform, and/or Hybrid/Private Cloud; as well as cloud-native end-to-end solutions, especially their key building blocks, workload types, migration patterns, and tools
  • Experience with monitoring tools and logging systems such as New Relic, ELK, Splunk, Prometheus, and Graylog
  • Ability to communicate technical ideas in a business-friendly language
  • Interest in modern organizational structures and an agile working environment (Scrum)
  • Customer-oriented and comfortable working in an international environment in German and English
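The stream processing experience asked for above centers on keyed, stateful aggregations over event streams. As a minimal illustration of that pattern, here is a plain-JDK sketch of a per-key count, the same shape as a `groupByKey().count()` topology in Kafka Streams; the class and method names are hypothetical, and this intentionally does not use the Kafka API, only the standard library:

```java
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

// Illustrative sketch only: a keyed count aggregation over an in-memory
// "stream" of events, mimicking the groupByKey().count() pattern of
// Kafka Streams without the Kafka client libraries.
public class KeyedCountSketch {

    // Count occurrences per key, as a stateful stream aggregation would;
    // TreeMap keeps keys sorted for deterministic output.
    public static Map<String, Long> countByKey(List<String> events) {
        Map<String, Long> counts = new TreeMap<>();
        for (String key : events) {
            counts.merge(key, 1L, Long::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        List<String> events = List.of("click", "view", "click", "click");
        System.out.println(countByKey(events)); // {click=3, view=1}
    }
}
```

In a real Kafka Streams application the same aggregation would run continuously over a topic, with the state held in a fault-tolerant state store rather than a local map.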