
Senior Engineer Data, AI & Analytics (m/w/d)

Purpose Green

Schönefeld

Hybrid

EUR 50,000 – 75,000

Full-time

Posted today

Summary

A leading ClimateTech start-up in Germany is looking for a senior-level engineer to lead its data and AI systems. The role involves building and optimizing data infrastructure, deploying AI applications, and collaborating with cross-functional teams. The ideal candidate speaks both English and German, has strong Python skills, and experience in cloud environments. Benefits include a hybrid work model, 30 vacation days, and continuous education support. Join us to shape the future of sustainable construction and drive decarbonisation forward.

Job Description

Salary: EUR 50,000 – 75,000 per year

Requirements
  • Strong communication and stakeholder management in English and German (written and spoken), minimum C1
  • Production-level experience in Python for data and AI engineering (pipelines, APIs, orchestration)
  • Solid SQL and data modeling expertise (including incremental strategies)
  • Hands‑on experience with a major cloud provider (AWS, GCP, or Azure)
  • Terraform / IaC experience for provisioning cloud infrastructure
  • Experience using at least one modern BI tool (e.g., Metabase, Power BI, Looker)
  • Deep AWS experience (S3, Lambda, Glue, Redshift, OpenSearch)
  • Hands‑on experience deploying AI / LLM‑based systems into production
  • Experience using dbt Cloud for transformation pipelines
  • Familiarity with tracing and observability (e.g., Langfuse, OpenTelemetry)
  • Experience preparing datasets and running supervised fine‑tuning (SFT) of LLMs
  • Exposure to reverse ETL tools (e.g., Census, Hightouch) or building custom syncs to HubSpot, Slack, APIs
Responsibilities
  • AI & Application Engineering
    • Architect and deploy AI‑powered systems into production (e.g., FastAPI apps with RAG architecture using OpenSearch and Langfuse)
    • Optimize agentic workflows, prompt & embedding pipelines, and retrieval quality through experimentation and tracing
    • Extend LLM capabilities with supervised fine‑tuning (SFT) for in‑domain data distributions where RAG alone underperforms
  • Data Engineering
    • Build scalable ingestion and transformation pipelines for both proprietary and external data (using Lambda, Glue, Terraform)
    • Own embedding pipelines for retrieval‑augmented generation systems
    • Manage infrastructure‑as‑code for all core components (Redshift, S3, VPC, IAM, OpenSearch)
  • Analytics Engineering & BI
    • Maintain and evolve a clean, documented data model (dbt Cloud, leveraging Fusion)
    • Develop and maintain BI dashboards in QuickSight and/or Metabase
    • Provide ad‑hoc analytical support for product, sales, and ops teams
    • Build event‑driven automation and reverse ETL pipelines to serve data or AI outputs back into operational systems (e.g., HubSpot)
  • Leadership & Collaboration
    • Work closely with stakeholders across engineering, product, and operations to define the right data products and abstractions
    • Lay the foundation for a high‑performing Data & AI team. Help hire, mentor, and establish best practices as we grow
Welcome

We’re looking for a senior‑level engineer to take full ownership of our data and AI systems, from ingestion and modeling to embedding pipelines and LLM‑based applications. You’ll operate across domains (data infrastructure, BI, and AI), working closely with stakeholders in product, operations, and engineering. This is a full‑stack data role with a dual mandate:

  • Build and scale our AI‑native data platform
  • Help shape and lead a growing Data & AI team as we scale

In an early-stage company, you’ll move between strategy and code, between long-term design and fast iteration. You’ll lay the groundwork not just for the system, but for the team that will own it.

Technologies
  • AI
  • AWS
  • Azure
  • Cloud
  • ETL
  • FastAPI
  • GCP
  • IAM
  • LLM
  • Looker
  • Power BI
  • Python
  • SQL
  • Slack
  • Terraform
  • dbt
Benefits
  • Hybrid work model with flexible working hours.
  • 30 vacation days plus one additional day per year, as well as time off on December 24 and 31.
  • Referral rewards for open positions.
  • Up to five days of educational leave per year with a budget of €1,500 after the probation period.
  • BVG public transport ticket.
  • Membership at Urban Sports Club.
Why us

Become part of one of Europe’s fastest‑growing ClimateTech start‑ups.

Supported by leading US and EU investors, we are shaping the future of the sustainable construction and real estate industry.

Modern headquarters in the west of Berlin: conveniently located and well connected.

Don’t just work with us: help shape the change and drive decarbonisation forward.
