
Senior Engineer Data, AI & Analytics (m/w/d)

Purpose Green

Schönefeld

Hybrid

EUR 50,000 - 75,000

Full-time

Today

Summary

A dynamic ClimateTech start-up in Germany is seeking a senior-level engineer to oversee their data and AI systems, spanning ingestion and modeling to LLM-based applications. Responsibilities include deploying AI-powered systems, optimizing data workflows, and mentoring a growing Data & AI team. The ideal candidate should have production-level expertise in Python and SQL, strong cloud experience (AWS, GCP, Azure), and proficiency in modern BI tools. The role offers a hybrid model, 30 vacation days, and opportunities for educational leave.

Benefits

30 vacation days plus additional time off
Educational leave with budget
Membership at Urban Sports Club
BVG public transport ticket

Responsibilities

  • Architect and deploy AI-powered systems into production.
  • Build scalable ingestion and transformation pipelines for data.
  • Maintain and evolve a clean, documented data model.
  • Work closely with stakeholders to define data products.
  • Lay the foundation for a high-performing Data & AI team.

Skills

Strong communication and stakeholder management in both English and German
Production-level experience in Python
Solid SQL and data modeling expertise
Hands-on experience with a major cloud provider
Terraform/IaC experience
Experience using at least one modern BI tool
Deep AWS experience
Hands-on experience deploying AI/LLM-based systems
Experience using dbt Cloud
Familiarity with tracing and observability
Experience preparing datasets for LLMs
Exposure to reverse ETL tools

Tools

AWS
Terraform
dbt
SQL
Python
Power BI
FastAPI
Looker
GCP
Azure
Job Description

Salary: EUR 50,000 - 75,000 per year

Requirements
  • Strong communication and stakeholder management in both English and German (written + spoken) - min. C1
  • Production-level experience in Python for data and AI engineering (pipelines, APIs, orchestration)
  • Solid SQL and data modeling expertise (including incremental strategies)
  • Hands-on experience with a major cloud provider (AWS, GCP, or Azure)
  • Terraform/IaC experience for provisioning cloud infrastructure
  • Experience using at least one modern BI tool (e.g., Metabase, Power BI, Looker)
  • Deep AWS experience (S3, Lambda, Glue, Redshift, OpenSearch)
  • Hands-on experience deploying AI/LLM-based systems into production
  • Experience using dbt Cloud for transformation pipelines
  • Familiarity with tracing and observability (e.g., Langfuse, OpenTelemetry)
  • Experience preparing datasets and running supervised fine-tuning (SFT) of LLMs
  • Exposure to reverse ETL tools (e.g., Census, Hightouch) or building custom syncs to HubSpot, Slack, APIs
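The "incremental strategies" called for above usually mean appending only rows newer than a stored high-water mark instead of reloading a full table on every run. A minimal, framework-free Python sketch of the pattern (the row structure and field names are hypothetical; in practice this logic lives in a dbt incremental model or a pipeline job):

```python
from datetime import datetime, timezone

def incremental_load(source_rows, target_rows, watermark):
    """Append only source rows newer than the last-seen watermark.

    source_rows / target_rows: lists of dicts with an 'updated_at' datetime.
    Returns the updated target plus the new watermark for the next run.
    """
    new_rows = [r for r in source_rows if r["updated_at"] > watermark]
    merged = target_rows + new_rows
    # Advance the watermark only if something new arrived.
    new_watermark = max((r["updated_at"] for r in new_rows), default=watermark)
    return merged, new_watermark

# Toy run: only the second row is newer than the stored watermark.
wm = datetime(2024, 1, 1, tzinfo=timezone.utc)
source = [
    {"id": 1, "updated_at": datetime(2023, 12, 31, tzinfo=timezone.utc)},
    {"id": 2, "updated_at": datetime(2024, 1, 2, tzinfo=timezone.utc)},
]
target, wm = incremental_load(source, [], wm)
```

The same idea generalizes to SQL: filter the source on `updated_at > watermark` and merge the result into the target table.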
Responsibilities

AI & Application Engineering
  • Architect and deploy AI-powered systems into production (e.g., FastAPI apps with RAG architecture using OpenSearch and Langfuse)
  • Optimize agentic workflows, prompt & embedding pipelines, and retrieval quality through experimentation and tracing
  • Extend LLM capabilities with supervised fine-tuning (SFT) for in-domain data distributions where RAG alone underperforms

Data Engineering
  • Build scalable ingestion and transformation pipelines for both proprietary and external data (using Lambda, Glue, Terraform)
  • Own embedding pipelines for retrieval-augmented generation systems
  • Manage infrastructure-as-code for all core components (Redshift, S3, VPC, IAM, OpenSearch)

Analytics Engineering & BI
  • Maintain and evolve a clean, documented data model (dbt Cloud, leveraging the Fusion engine)
  • Develop and maintain BI dashboards in QuickSight and/or Metabase
  • Provide ad-hoc analytical support for product, sales, and ops teams
  • Build event-driven automation and reverse ETL pipelines to serve data or AI outputs back into operational systems (e.g., HubSpot)

Leadership & Collaboration
  • Work closely with stakeholders across engineering, product, and operations to define the right data products and abstractions
  • Lay the foundation for a high-performing Data & AI team: help hire, mentor, and establish best practices as we grow
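The RAG pattern named above follows one core step: retrieve the documents most similar to the query, then ground the LLM prompt in them. The sketch below illustrates that retrieval-and-grounding step in plain Python; the bag-of-words "embedding", the two-document corpus, and the prompt template are toy stand-ins for the real vector store and production prompts, not the actual stack:

```python
import math
from collections import Counter

def embed(text):
    """Toy 'embedding': a bag-of-words term-frequency vector.
    A real system would call an embedding model instead."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, corpus, k=1):
    """Rank documents by similarity to the query; return the top k."""
    q = embed(query)
    ranked = sorted(corpus, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(query, corpus):
    """Ground the LLM prompt in the retrieved context (the 'RAG' step)."""
    context = "\n".join(retrieve(query, corpus))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

corpus = [
    "Heat pumps cut building emissions significantly.",
    "Quarterly sales rose in the Berlin region.",
]
prompt = build_prompt("how do heat pumps affect emissions", corpus)
```

In production the `retrieve` step would be a k-NN query against a vector index (e.g., OpenSearch) and each call would be traced for retrieval-quality evaluation, but the data flow is the same.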

Welcome

We are looking for a senior-level engineer to take full ownership of our data and AI systems, from ingestion and modeling to embedding pipelines and LLM-based applications. You will operate across domains (data infrastructure, BI, and AI), working closely with stakeholders in product, operations, and engineering. This is a full-stack data role with a dual mandate: build and scale our AI-native data platform, and help shape and lead a growing Data & AI team as we scale. We are early-stage, so you will move between strategy and code, long-term design and fast iteration. You will lay the groundwork not just for the system, but for the team that will own it.

Why us

Become part of one of Europe’s fastest-growing ClimateTech start-ups. Supported by leading US and EU investors, we are shaping the future of the sustainable construction and real estate industry.

  • Modern headquarters in the west of Berlin: conveniently located and well connected
  • Hybrid work model with flexible working hours
  • 30 vacation days plus one additional day per year, as well as time off on December 24 and 31
  • A thoughtful and appreciative referral program that rewards referrals for our open positions
  • After the probation period: up to five days of educational leave per year with a budget of EUR 1,500
  • BVG public transport ticket
  • Membership at Urban Sports Club

Do not just work with us: help shape the change and drive decarbonization forward.
