
Data Engineer

Ledgy

Frankfurt

On-site

EUR 60.000 - 80.000

Full-time

2 days ago

Summary

A leading equity management firm in Frankfurt is looking for a Data Engineer to build robust data pipelines and optimize data architecture. You will work with tools like Fivetran, Airbyte, and Google Cloud Platform, while collaborating with teams to drive data-driven insights. Candidates should have 2-3 years of experience in data engineering, familiarity with DBT and SQL, and proficiency in Looker. The firm promotes a diverse workplace with team members from 26 different countries and emphasizes the importance of innovation in data practices.

Qualifications

  • 2-3+ years experience building production data pipelines and analytics infrastructure.
  • Experience implementing and managing ETL/ELT tools.
  • Ideally hands-on experience with GCP (BigQuery).
  • Proficiency in Looker, including LookML development.
  • Strong plus if you have experience using n8n or similar automation tools.

Responsibilities

  • Manage and optimize data infrastructure and ETL pipelines using Fivetran, Airbyte, and Google Cloud Platform.
  • Develop, test, and maintain DBT models that transform raw data into analytics-ready datasets.
  • Create and manage LookML models in Looker for self-service analytics.

Skills

DBT
SQL
Python (Pandas)
Looker (LookML)
ETL/ELT tools (Fivetran, Airbyte)
GCP (BigQuery)
SaaS data sources (HubSpot, Stripe, Vitally, Intercom)
Automation tools (n8n)
Problem-solving skills
Communication skills

Job Description

At Ledgy, we’re on a mission to make Europe a powerhouse of entrepreneurship by building a modern, tech-driven equity management and financial reporting platform for private and public companies. In 2025, we aim to be the leading provider for European IPOs and reporting for share-based payments. We are a value-based company with a core focus on being humble, transparent, ambitious, and impactful, all in order to deliver the best experience for our customers and end users.

We are proud to partner with some of the world’s leading investors. New Enterprise Associates led our $22m Series B round in 2022, with Philip Chopin joining Sequoia’s Luciana Lixandru on our board. We were founded in Switzerland in 2017 and today we operate globally from offices in Zurich and London. We encourage diversity: we are an international team coming from 26 different countries and speaking 25 different languages.

As a Data Engineer at Ledgy, your mission is to build robust data pipelines, design scalable data architecture, and collaborate with teams to deliver insights that drive business decisions. Reporting directly to the Head of Operations & AI, you’ll play a key role in driving our data engineering strategy.

At Ledgy, you will:
  • Manage and optimize data infrastructure and ETL pipelines using Fivetran, Airbyte, and Google Cloud Platform, ensuring reliable data flow from multiple sources into our analytics ecosystem
  • Develop, test, and maintain DBT models that transform raw data into analytics-ready datasets following best practices
  • Create and manage LookML models in Looker to enable self-service analytics for stakeholders across the company
  • Drive continuous improvement of our data engineering practices, tooling, and infrastructure as a key member of the Operations team
The job is a good fit if you have:
  • 2-3+ years experience building production data pipelines and analytics infrastructure, with DBT, SQL, and Python (Pandas, etc.)
  • Experience implementing and managing ETL/ELT tools such as Fivetran or Airbyte
  • Ideally hands-on experience with GCP (BigQuery)
  • Proficiency in Looker, including LookML development
  • Strong plus if you have experience using n8n or similar automation tools
  • Experience with SaaS data sources (HubSpot, Stripe, Vitally, Intercom)
  • Familiarity with AI-powered development tools (Cursor, DBT Copilot) and a strong interest in leveraging cutting-edge tools to improve workflow
  • Strong problem-solving skills and ability to debug complex data issues
  • Excellent communication skills with ability to explain technical concepts to non-technical stakeholders
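Working with the SaaS data sources listed above typically means flattening nested API payloads before they land in the warehouse. A minimal sketch with pandas, assuming a hypothetical CRM-style response (the record shape and field names are invented for illustration):

```python
import pandas as pd

# Hypothetical nested records, loosely shaped like a CRM API response;
# field names are invented for illustration.
records = [
    {"id": "a1", "properties": {"email": "x@example.com", "plan": "pro"}},
    {"id": "a2", "properties": {"email": "y@example.com", "plan": "free"}},
]

# Flatten nested objects into warehouse-friendly columns
# (properties.email becomes properties_email, etc.).
flat = pd.json_normalize(records, sep="_")
print(list(flat.columns))  # -> ['id', 'properties_email', 'properties_plan']
```

In practice a managed connector (Fivetran or Airbyte) handles this extraction and flattening; a sketch like this is mainly useful for one-off backfills or debugging what a connector produces.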