
Data Engineer

Ledgy

Germany

On-site

EUR 60,000 - 80,000

Full-time

Posted 12 days ago


Summary

A tech-driven equity management platform is seeking a Data Engineer responsible for building robust data pipelines and scalable architectures. The role involves managing ETL workflows using tools such as Fivetran and Google Cloud Platform. Ideal candidates have 2-3 years of experience with data pipelines, proficiency in DBT, SQL, and Python, and strong communication skills. The position is central to driving the company's data engineering strategy while supporting decision-making across diverse teams.

Qualifications

  • 2-3+ years of experience building production data pipelines and analytics infrastructure.
  • Experience implementing and managing ETL/ELT tools such as Fivetran or Airbyte.
  • Ideally hands-on experience with GCP (BigQuery).
  • Excellent communication skills with the ability to explain technical concepts to non-technical stakeholders.

Responsibilities

  • Manage and optimize data infrastructure and ETL pipelines using Fivetran, Airbyte, and GCP.
  • Develop, test, and maintain DBT models for analytics-ready datasets.
  • Create and manage LookML models in Looker for self-service analytics.
  • Drive continuous improvement of data engineering practices.

Skills

DBT
SQL
Python
Looker
Data manipulation
Problem-solving

Tools

Google Cloud Platform
Fivetran
Airbyte
LookML
n8n

Job Description

At Ledgy, we’re on a mission to make Europe a powerhouse of entrepreneurship by building a modern, tech-driven equity management and financial reporting platform for private and public companies. In 2025, we aim to be the leading provider for European IPOs and for share-based payment reporting. We are a values-based company with a core focus on being humble, transparent, ambitious and impactful, all in order to deliver the best experience for our customers and end users.

We are proud to partner with some of the world’s leading investors: New Enterprise Associates led our $22m Series B round in 2022, with Philip Chopin joining Sequoia’s Luciana Lixandru on our board. Founded in Switzerland in 2017, we now operate globally from offices in Zurich and London. We encourage diversity and are an international team drawn from 26 different countries and speaking 25 different languages.

As a Data Engineer at Ledgy, your mission is to build robust data pipelines, design scalable data architecture, and collaborate with teams to deliver insights that drive business decisions. Reporting directly to the Head of Operations & AI, you’ll play a key role in driving our data engineering strategy.

At Ledgy, you will:
  • Manage and optimize data infrastructure and ETL pipelines using Fivetran, Airbyte, and Google Cloud Platform, ensuring reliable data flow from multiple sources into our analytics ecosystem
  • Develop, test, and maintain DBT models that transform raw data into analytics-ready datasets following best practices
  • Create and manage LookML models in Looker to enable self-service analytics for stakeholders across the company
  • Drive continuous improvement of our data engineering practices, tooling, and infrastructure as a key member of the Operations team
The job is a good fit if you have:
  • 2-3+ years of experience building production data pipelines and analytics infrastructure, with DBT, SQL, and Python (Pandas, etc.)
  • Experience implementing and managing ETL/ELT tools such as Fivetran or Airbyte
  • Ideally hands-on experience with GCP (BigQuery)
  • Proficiency in Looker, including LookML development
  • Strong plus if you have experience using n8n or similar automation tools
  • Experience with SaaS data sources (HubSpot, Stripe, Vitally, Intercom)
  • Familiarity with AI-powered development tools (Cursor, DBT Copilot) and a strong interest in leveraging cutting-edge tools to improve workflow
  • Strong problem-solving skills and ability to debug complex data issues
  • Excellent communication skills with the ability to explain technical concepts to non-technical stakeholders
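
To make the “transform raw data into analytics-ready datasets” part of the role concrete, here is a minimal illustrative sketch in Python/Pandas against BigQuery. The dataset, table, and column names (raw_hubspot.deals, deal_id, closed_at, analytics.deals) are hypothetical, and configured GCP credentials are assumed; in the role itself this kind of modelling would typically be expressed as DBT models rather than an ad-hoc script.

    # Illustrative sketch only: hypothetical names, not Ledgy's actual schema or pipelines.
    import pandas as pd
    from google.cloud import bigquery  # assumes GCP credentials are already configured

    client = bigquery.Client()

    # Read a hypothetical raw table, as an EL tool such as Fivetran or Airbyte might land it.
    raw_deals = client.query("SELECT * FROM raw_hubspot.deals").to_dataframe()

    # Transform: normalise column names, parse timestamps, and de-duplicate.
    analytics_deals = (
        raw_deals
        .rename(columns=str.lower)
        .assign(closed_at=lambda df: pd.to_datetime(df["closed_at"], utc=True))
        .drop_duplicates(subset="deal_id")
    )

    # Load the analytics-ready table into a hypothetical reporting dataset.
    client.load_table_from_dataframe(analytics_deals, "analytics.deals").result()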