
A startup specializing in data automation is seeking its first Data Automation Engineer to enhance its existing data operations infrastructure. This pivotal role involves automating data requests, developing scrapers, and ensuring system reliability in a fast-paced environment. Candidates should have 2–5 years of experience in backend or automation engineering and be proficient in Python and SQL. The position offers a competitive salary, equity, and unlimited PTO in a collaborative and innovative team setting.
Our world runs on public infrastructure, yet government data sits fragmented across thousands of portals, PDFs, and poorly designed databases. Finding relevant information—like which city just put out an RFP, or which agency is buying a new software system—often requires detective-level research. NationGraph’s mission is to end that detective work.
By automating data collection, normalizing records, building a knowledge graph from that data, and presenting it all in a single, intuitive interface, we do for public procurement what Bloomberg did for finance and CoStar did for commercial real estate.
Our team works hard to simplify the complicated process of doing business with the government by building great software to solve a real problem.
AI applied to government procurement is in its infancy; join us in building an industry-defining product.
Our team:
Has successfully built, scaled, and sold companies in the past.
Built software infrastructure processing billions of dollars in transactions.
Is backed by world-class VCs and operating partners who’ve invested in—and built—iconic companies.
We’re hiring our first Data Automation Engineer to upgrade our existing data operations infrastructure and build the backbone that gathers, syncs, and maintains structured data at scale. You’ll design and ship the automation layer that powers tens of thousands of real-world interactions (emails, webforms, portals, downloads) and ensures reliability across distributed systems.
Automate sending and receiving data requests end-to-end: filtering/noise control, thread linkage, auto-downloads, failure handling, basic replies.
Data Request Engine automation: automate email, email forms, and webforms; tackle portal submissions with browser automation.
Enrichment pipeline: turn client enrichment requests into data-ready sends; route and monitor responses automatically.
Targeted discovery/scraping: build URL/data scrapers using Ops’ search patterns to find exact endpoints at scale.
Engineer reliability: manage retries/backoff, queues, observability, and self-healing jobs.
Ship fast, iterate faster: get real systems live early; refine with traffic.
2–5 years backend/automation engineering.
Python expertise; API integrations; browser automation (Playwright/BrowserBase; BeautifulSoup okay).
SQL fluency (Postgres/Supabase/RDS) inside live automation loops.
Experience with async jobs/queues/schedulers and background workers.
Comfort stitching LLMs into workflows for routing/interpretation.
Clear communication, great docs, and ownership mindset.
Nice to have: early-stage startup experience; Airflow or similar; logging/metrics/monitoring patterns.
We offer a highly competitive salary + equity as a well-funded and fast-growing startup.
Unlimited PTO
High-quality health insurance, dental & vision coverage.
We believe that in-person work should be the default, with work-from-home days used as needed to support a healthy and balanced work environment.