
(Sr.) Data Platform Engineer (m/f/d)

Smartbroker

Berlin

On-site

EUR 70.000 - 90.000

Full-time

Yesterday
Be among the first applicants


Summary

A leading data and fintech company based in Berlin is seeking an experienced Data Platform Engineer. The role involves improving a cloud-based data platform and developing data pipelines for analytics and business insights. Candidates should have a strong background in data engineering, experience with cloud platforms like AWS or GCP, and excellent programming skills in Python. The position offers opportunities to work with innovative data technologies and contribute to a data-driven culture.

Benefits

Flexible work environment
Health benefits
Training opportunities

Qualifications

  • Several years of experience in data engineering and strong know-how in building data-native platforms.
  • Significant hands-on experience designing data pipelines on cloud platforms.
  • Excellent SQL skills and strong programming skills in Python.

Tasks

  • Develop and improve our cloud-based data platform.
  • Build end-to-end data pipelines from raw data ingestion to consumable data.
  • Implement data quality monitoring to ensure accuracy of data pipelines.

Skills

Data engineering
Cloud-based data platforms
SQL
Python programming
Data modeling

Education

University degree in computer science or related field

Tools

AWS
GCP
Docker
Kubernetes
Kafka

Job Description
Your mission

Our rapidly growing Data, AI & MarTech Department is looking for an experienced Data Platform Engineer. The position is responsible for improving and continuously developing our cloud-based data platform – the heart of Smartbroker's technical data infrastructure for business analytics & insights. Join us and play a major role in promoting and enabling a truly data-driven culture across the organisation!
Job description:

  • Develop and improve our cloud-based data platform for data analytics and business insights using the most innovative data technologies
  • Build end-to-end data pipelines from raw data ingestion to consumable data: prepare and clean structured and unstructured data and develop high-quality data models for advanced analytics and AI use cases
  • Implement data quality monitoring to ensure accuracy and reliability of data pipelines
  • Architect, code, and deploy data infrastructure components
  • Collaborate closely with highly ambitious data engineers and analysts in our growing Data, AI & MarTech Department as well as product technology colleagues
  • Stay up to date with the latest market developments in cloud data architecture and share your knowledge
Your profile
  • University degree in computer science, mathematics, natural sciences, or a similar field
  • Several years of experience in data engineering and strong know-how in building data-native, robust, scalable, and maintainable data platforms
  • Significant hands-on experience designing and operating data pipelines on cloud-based data platforms (AWS, GCP) using data-native services (S3, Athena, BigQuery…)
  • Experience in data warehousing and containerization, e.g., Kubernetes, Docker…
  • Advanced knowledge about cloud networking & security (IAM, security groups…)
  • Proficient and experienced with Infrastructure as Code
  • Deep understanding of software engineering best practices: requirements specification, version control, CI/CD, testing, deployment, and monitoring of data pipelines and services
  • Excellent SQL skills and strong programming skills in Python, ideally including Airflow and PySpark
  • Strong knowledge of data streaming technologies like Kafka, Kinesis, Flink…
  • Excellent English communication skills, German is a plus
  • Interest in finance and fintech industry and a sense of humor
Benefits
Why us?