
Senior Engineer Data, AI & Analytics (m / w / d)

Purpose Green

Potsdam

Hybrid

EUR 50,000 - 75,000

Full-time


Summary

A leading ClimateTech start-up in Potsdam is seeking a senior-level engineer to take ownership of their data and AI systems. The role includes architecting and deploying AI-powered applications, building scalable data pipelines, and collaborating closely with stakeholders across engineering and product teams. Candidates should possess in-depth experience in Python and AWS, with strong communication skills in both English and German. This position offers a hybrid work model, 30 vacation days, educational benefits, and a vibrant work culture.

Benefits

Hybrid work model with flexible hours
30 vacation days plus one additional day per year
Budget for educational leave
Public transport ticket
Membership at Urban Sports Club

Qualifications

  • Min. C1 level in English and German required.
  • Extensive experience with Python for data engineering tasks.
  • Expertise in SQL and data modeling practices.

Tasks

  • Architect and deploy AI-powered systems into production.
  • Build scalable ingestion and transformation pipelines.
  • Maintain and evolve a clean, documented data model.

Skills

Strong communication and stakeholder management in English and German
Production-level experience in Python
Solid SQL and data modeling expertise
Hands-on experience with a major cloud provider
Terraform / IaC experience
Experience using modern BI tools
Deep AWS experience
Experience deploying AI / LLM-based systems
Experience using dbt Cloud

Tools

AWS
Terraform
dbt
SQL
Power BI
Job Description

Salary: EUR 50,000 - 75,000 per year

Requirements
  • Strong communication and stakeholder management in both English and German (written + spoken) - min. C1
  • Production-level experience in Python for data and AI engineering (pipelines, APIs, orchestration)
  • Solid SQL and data modeling expertise (including incremental strategies)
  • Hands-on experience with a major cloud provider (AWS, GCP, or Azure)
  • Terraform / IaC experience for provisioning cloud infrastructure
  • Experience using at least one modern BI tool (e.g., Metabase, Power BI, Looker)
  • Deep AWS experience (S3, Lambda, Glue, Redshift, OpenSearch)
  • Hands-on experience deploying AI / LLM-based systems into production
  • Experience using dbt Cloud for transformation pipelines
  • Familiarity with tracing and observability (e.g., Langfuse, OpenTelemetry)
  • Experience preparing datasets and running supervised fine-tuning (SFT) of LLMs (see the sketch after this list)
  • Exposure to reverse ETL tools (e.g., Census, Hightouch) or building custom syncs to HubSpot, Slack, APIs
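
To give a concrete flavor of the SFT requirement flagged above, here is a minimal sketch of dataset preparation for supervised fine-tuning. It is illustrative only: the chat-message JSONL schema is the convention used by most SFT tooling, and the file names, fields, and example content are assumptions, not details from this posting.

```python
# Illustrative sketch of SFT dataset preparation: converts raw Q/A pairs
# into the chat-style JSONL format commonly used for supervised
# fine-tuning of LLMs. File names and fields are assumptions.
import json
import random

SYSTEM_PROMPT = "You are a helpful assistant for building-decarbonization questions."

def to_chat_record(question: str, answer: str) -> dict:
    """Wrap one Q/A pair in the chat-message schema used by most SFT tooling."""
    return {
        "messages": [
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": question.strip()},
            {"role": "assistant", "content": answer.strip()},
        ]
    }

def prepare_sft_dataset(pairs: list[tuple[str, str]], val_fraction: float = 0.1) -> None:
    # Drop empty and duplicate pairs before splitting.
    seen: set[tuple[str, str]] = set()
    records = []
    for q, a in pairs:
        key = (q.strip(), a.strip())
        if not key[0] or not key[1] or key in seen:
            continue
        seen.add(key)
        records.append(to_chat_record(q, a))

    random.Random(42).shuffle(records)  # fixed seed for a reproducible split
    n_val = max(1, int(len(records) * val_fraction))
    for path, rows in {"train.jsonl": records[n_val:], "val.jsonl": records[:n_val]}.items():
        with open(path, "w", encoding="utf-8") as f:
            for row in rows:
                f.write(json.dumps(row, ensure_ascii=False) + "\n")
```
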
Responsibilities
  • AI & Application Engineering
    • Architect and deploy AI-powered systems into production (e.g., FastAPI apps with RAG architecture using OpenSearch and Langfuse; see the sketch after this list)
    • Optimize agentic workflows, prompt & embedding pipelines, and retrieval quality through experimentation and tracing
    • Extend LLM capabilities with supervised fine-tuning (SFT) for in-domain data distributions where RAG alone underperforms
  • Data Engineering
    • Build scalable ingestion and transformation pipelines for both proprietary and external data (using Lambda, Glue, Terraform)
    • Own embedding pipelines for retrieval-augmented generation systems
    • Manage infrastructure-as-code for all core components (Redshift, S3, VPC, IAM, OpenSearch)
  • Analytics Engineering & BI
    • Maintain and evolve a clean, documented data model (dbt Cloud, leveraging Fusion)
    • Develop and maintain BI dashboards in QuickSight and / or Metabase
    • Provide ad-hoc analytical support for product, sales, and ops teams
    • Build event-driven automation and reverse ETL pipelines to serve data or AI outputs back into operational systems (e.g., HubSpot)
  • Leadership & Collaboration
    • Work closely with stakeholders across engineering, product, and operations to define the right data products and abstractions
    • Lay the foundation for a high-performing Data & AI team. Help hire, mentor, and establish best practices as we grow
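
As a rough illustration of the AI engineering scope above, the sketch below shows a FastAPI endpoint that runs a RAG retrieval step against an OpenSearch k-NN index before prompting an LLM. The index name, field names, and the embed()/generate() helpers are hypothetical placeholders, not the company's actual implementation, and Langfuse tracing is omitted for brevity.

```python
# Minimal RAG-over-OpenSearch sketch (placeholders, not a real deployment).
from fastapi import FastAPI
from opensearchpy import OpenSearch
from pydantic import BaseModel

app = FastAPI()
client = OpenSearch(hosts=[{"host": "localhost", "port": 9200}])
INDEX = "documents"  # assumed index with a k-NN "embedding" field and a "text" field

class Question(BaseModel):
    text: str

def embed(text: str) -> list[float]:
    """Placeholder: call your embedding model here."""
    raise NotImplementedError

def generate(prompt: str) -> str:
    """Placeholder: call your LLM here (ideally traced, e.g. via Langfuse)."""
    raise NotImplementedError

@app.post("/ask")
def ask(question: Question) -> dict:
    # Retrieve the top-k most similar chunks via OpenSearch k-NN search.
    response = client.search(
        index=INDEX,
        body={
            "size": 3,
            "query": {"knn": {"embedding": {"vector": embed(question.text), "k": 3}}},
        },
    )
    context = "\n\n".join(hit["_source"]["text"] for hit in response["hits"]["hits"])
    # Ground the answer in the retrieved context.
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question.text}"
    return {"answer": generate(prompt)}
```
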
Technologies
  • AI
  • AWS
  • Azure
  • Cloud
  • ETL
  • FastAPI
  • GCP
  • IAM
  • LLM
  • Looker
  • Power BI
  • Python
  • SQL
  • Slack
  • Terraform
  • dbt
More

We’re looking for a senior‑level engineer to take full ownership of our data and AI systems, from ingestion and modeling to embedding pipelines and LLM‑based applications. You’ll operate across domains (data infrastructure, BI, and AI), working closely with stakeholders in product, operations, and engineering. This is a full‑stack data role with a dual mandate:

  • Build and scale our AI‑native data platform
  • Help shape and lead a growing Data & AI team as we scale

We’re early‑stage, so you’ll move between strategy and code, long‑term design and fast iteration. You’ll lay the groundwork not just for the system, but for the team that will own it.

Why us

Become part of one of Europe’s fastest‑growing ClimateTech start‑ups.

Supported by leading US and EU investors, we are shaping the future of the sustainable construction and real estate industry.

Modern headquarters in the west of Berlin: conveniently located and well connected.

Hybrid work model with flexible working hours.

30 vacation days plus one additional day per year, as well as time off on December 24 and 31.

We reward referrals for our open positions through a thoughtful and appreciative referral program.

After the probation period: up to five days of educational leave per year with a budget of EUR 1,500.

BVG public transport ticket.

Membership at Urban Sports Club.

Don’t just work with us, but help shape the change and drive decarbonization forward.

Last updated: week 49 of 2025
