Data Engineer

Alt

United States

Remote

USD 90,000 - 120,000

Full time

Job summary

A dynamic tech startup in the United States is seeking a Data Engineer to design and optimize the data pipelines critical to its pricing model. You will ensure the delivery of clean, usable data and implement monitoring systems for performance tracking. Ideal candidates have strong Python and SQL skills, 3–4 years of data engineering experience, and familiarity with cloud services. The role offers the chance to own critical systems end-to-end in a fast-paced environment.

Qualifications

  • 3–4 years of experience in data engineering.
  • Strong Python proficiency with 3+ years of hands-on experience.
  • Proven experience with large-scale data processing.
  • Hands-on experience with pipeline orchestration tools.
  • Track record of owning a data pipeline end-to-end.

Responsibilities

  • Design and optimize data pipelines for major marketplaces.
  • Build monitoring systems to track data metrics.
  • Continuously improve data infrastructure and technology.
  • Ensure pipelines deliver clean data that meets requirements.

Skills

Python proficiency
Data processing (Pandas, PySpark)
SQL skills
Airflow or DAG-based orchestration tools

Tools

Selenium
Airflow
AWS

Job description

Are you a data engineer who thrives on building robust pipelines and solving complex data challenges? In this role, you'll own Alt's critical data infrastructure that powers our pricing model and marketplace insights by ingesting transaction and listing data from dozens of external marketplaces. You'll be responsible for ensuring our data pipelines deliver fresh, accurate information that drives pricing decisions, market analytics, and business intelligence across the entire platform.

What You’ll Do Here

  • Design, optimize, and own data pipelines that scrape, process, and ingest transaction and listing data from major auction houses and marketplaces (see the illustrative sketch after this list).

  • Build comprehensive monitoring and alerting systems to track latency, uptime, and coverage metrics across all data sources.

  • Continuously improve our data infrastructure by modernizing storage and processing technologies, reducing manual interventions, and optimizing for cost, performance, and reliability.

  • Partner with internal teams to understand data usage patterns and ensure pipelines deliver clean, standardized data that meets product requirements.
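
For a rough sense of how this kind of work is often structured, here is a minimal, purely illustrative Airflow DAG for a scrape, process, and ingest flow. The DAG name, schedule, and task bodies are assumptions made for the sketch (Airflow 2.4+ style), not Alt's actual pipeline; latency and uptime alerting would hang off the same DAG's callbacks.

    # Minimal sketch of a scrape -> transform -> load DAG (hypothetical names and schedule).
    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def scrape_listings(**context):
        # Pull raw transaction/listing records from an external marketplace (placeholder).
        ...

    def transform_listings(**context):
        # Clean and standardize the raw records, e.g. with Pandas or Polars (placeholder).
        ...

    def load_listings(**context):
        # Write standardized records to the warehouse or object store (placeholder).
        ...

    with DAG(
        dag_id="marketplace_listings_ingest",  # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule="@hourly",                    # assumed freshness target
        catchup=False,
        default_args={
            "retries": 2,
            "retry_delay": timedelta(minutes=5),
            # on_failure_callback / SLA hooks for latency and uptime alerting would go here
        },
    ) as dag:
        scrape = PythonOperator(task_id="scrape", python_callable=scrape_listings)
        transform = PythonOperator(task_id="transform", python_callable=transform_listings)
        load = PythonOperator(task_id="load", python_callable=load_listings)

        scrape >> transform >> load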

This Is a Perfect Fit If You...

  • Care deeply about data quality and understand that extraction is only the beginning—the real value comes from delivering clean and usable data.

  • Thrive in a startup environment where you can make an immediate impact and own critical systems end-to-end.

  • Take an incremental approach to improvements, preferring evidence-based decisions over wholesale replacements.

  • Are intellectually curious about how data flows through systems and passionate about automation opportunities.

  • Want to work at the intersection of data engineering and product, understanding how your pipelines directly impact business outcomes.

What You Bring to the Table

Must-Haves:

  • 3–4 years of experience in data engineering or related fields

  • Strong Python proficiency with at least 3 years of hands-on experience

  • Proven experience with large-scale data processing using dataframe technologies (Pandas, Polars, PySpark, or similar)

  • Hands-on experience with pipeline orchestration tools (Airflow, Dagster, or similar DAG-based systems)

  • Track record of owning at least one data pipeline end-to-end within the past 2 years

  • Solid SQL skills for data analysis and transformation

  • Previous startup experience—you understand the pace and adaptability required in a fast-moving environment

  • A pragmatic mindset focused on delivering value incrementally rather than pursuing perfection

Nice-to-Haves:

  • Experience with web scraping technologies (Selenium, Puppeteer, Beautiful Soup)

  • Familiarity with data infrastructure and cloud services (AWS preferred)

  • Interest in or knowledge of trading cards, collectibles, or alternative asset markets

  • Experience with LLM-based automation tools for data extraction and processing
