Data Engineer

Nansen

Oslo

Remote

NOK 800,000 - 1,000,000

Full time

17 days ago

Job summary

A leading blockchain analytics platform is seeking a Data Engineer to enhance their data infrastructure. You'll collaborate with a dynamic team to tackle large-scale data processing challenges, design and build data pipelines using tools like ClickHouse and dbt, and ensure data quality. This role offers a remote-first work environment with a focus on autonomy and high-impact work.

Benefits

Competitive compensation
Remote-first work environment
Autonomy and execution focus

Qualifications

  • Proven track record of building and scaling high-performance data systems.
  • Strong grasp of streaming data architectures.
  • Comfortable working full-stack with data.

Responsibilities

  • Collaborate with engineers and product managers on data modeling.
  • Design and scale data pipelines using ClickHouse and dbt.
  • Ensure high data quality and reliability across all systems.

Skills

SQL
Python
Data pipeline design
Streaming data architectures
AI tools (Cursor, MCPs, LLMs)

Tools

ClickHouse
dbt
BigQuery

Job description

Nansen is a leading blockchain analytics platform that empowers investors and professionals with real-time, actionable insights derived from on-chain data.

The Opportunity:

As we scale our data infrastructure to meet the growing complexity and volume of the crypto ecosystem, we're hiring a Data Engineer to build and optimise systems that power our customer-facing analytics product.

You'll work with a collaborative, pragmatic team of product and data engineers to solve hard technical problems with real-world impact, powering tools used daily by thousands of crypto investors, builders, and institutions.

What You'll Do:

  • Collaborate closely with crypto researchers, other engineers, and product managers to shape how data is modeled, surfaced, and productized.
  • Tackle large-scale data challenges, processing terabytes of streaming and batch data daily.
  • Design, build, and scale performant data pipelines and infrastructure, primarily using ClickHouse and dbt.
  • Ensure high standards for data quality, reliability, and observability across all systems.
  • Bring fresh thinking to the table, staying current with best practices and evolving your toolkit over time.
  • Use AI tools and agents such as Cursor, MCPs, and LLMs to accelerate development, automate repetitive work, and boost quality.

What We're Looking For:

  • A proven track record of building and scaling high-performance data systems in production.
  • Expertise in SQL and Python, with hands-on experience using dbt, ClickHouse, and BigQuery.
  • Strong grasp of streaming data architectures and experience handling large-scale data volumes.
  • Comfortable working full-stack with data, from ingestion and transformation to storage, modeling, and serving.
  • Experience using AI-powered tools in day-to-day development and a natural curiosity to push boundaries with them.
  • Excellent written and verbal communication skills. You're confident working in a remote-first, async team.
  • A pragmatic, execution-oriented mindset. You value fast feedback loops, clean architecture, and scalable design.
  • Experience with, or strong interest in, blockchain data structures, crypto, and Web3 technologies.

Why Join Nansen

  • Remote-first & async-native: Work from anywhere. No unnecessary meetings.
  • AI-native team: Work in a team of AI trailblazers.
  • High-impact work: You'll shape the data architecture of a fast-growing crypto platform.
  • Work on massive-scale data challenges in one of the fastest-growing industries.
  • Influence product and technical direction from day one.
  • Build systems that power the decision-making of crypto investors around the world.
  • Join a team that values autonomy, impact, and execution above all.
  • We offer a competitive compensation and benefits package, tailored to a remote-first environment.