SavvyDealer Inventory Tracker

Repo: savvydealer-adam/savvydealer-inventory-tracker · Path: C:/Users/adam/inventory-tracker
Owner: Adam · Status: Live · 70% done · Last commit: 2026-04-01
Deployed: https://inventory-tracker-952362582307.us-central1.run.app (backend) · https://comp.savvydealer.com (dashboard)

What it is

Multi-dealer competitor inventory tracker. Scrapes dealer sites daily, detects sold vehicles, flags price changes, and feeds the competitive dashboard.

Why it exists

Dealers need to see what competitors have in stock, how long units sit, and when prices drop — without watching every site manually.

How it works

Python/Flask on Cloud Run + Supabase Postgres + Playwright headless Chrome.

  • Sitemap-based scraping for complete VDP coverage (more reliable than scraping paginated search results)
  • Hash-based change detection: only re-scrape VDPs whose content hash changed (the smart scraper is ~80% faster)
  • Cloudflare bypass for protected dealer sites
  • Per-dealer scraper configs (DealerOn, Dealer Inspire, SavvyDealer platforms)
  • Cron-driven daily scans (external cron; the internal APScheduler was dropped after it caused OOM kills)
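The hash-based change detection above can be sketched as follows. This is a minimal illustration, not the actual scraper.py code: the function names and the in-memory hash store are assumptions (the real project persists hashes in Supabase Postgres).

```python
import hashlib


def content_hash(html: str) -> str:
    """Stable fingerprint of a VDP's rendered content."""
    return hashlib.sha256(html.encode("utf-8")).hexdigest()


def changed_vdps(pages: dict, known_hashes: dict) -> list:
    """Return the URLs whose content hash differs from the stored hash.

    Only these pages need a full re-scrape; unchanged VDPs are skipped,
    which is where the ~80% speedup comes from.
    """
    stale = []
    for url, html in pages.items():
        h = content_hash(html)
        if known_hashes.get(url) != h:
            stale.append(url)
            known_hashes[url] = h  # record the new hash for the next scan
    return stale
```

In practice the hash would be computed over a normalized slice of the page (price, status, photos) rather than the raw HTML, so that ad rotations and timestamps don't force needless re-scrapes.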

What's done

  • Core scrape + sold detection + price-change tracking
  • Incentive capture (rebates/discounts/conditional offers)
  • Multi-platform scrapers (DealerOn, Dealer Inspire, SavvyDealer-native)
  • Supabase schema + Cloud Run deploy
  • Dashboard integration at comp.savvydealer.com
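Sold detection and price-change tracking reduce to set and dict diffs between consecutive scans. A sketch under assumed shapes (VIN sets and VIN-to-price dicts; the real schema lives in Supabase):

```python
def diff_inventory(previous: set, current: set):
    """Compare VIN sets from consecutive scans.

    VINs that disappeared are presumed sold; VINs that appeared
    are fresh arrivals.
    """
    sold = sorted(previous - current)
    arrived = sorted(current - previous)
    return sold, arrived


def price_changes(prev_prices: dict, curr_prices: dict) -> dict:
    """Map VIN -> (old_price, new_price) for units whose price moved."""
    return {
        vin: (prev_prices[vin], price)
        for vin, price in curr_prices.items()
        if vin in prev_prices and prev_prices[vin] != price
    }
```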

What's next

  • Expand dealer roster beyond current 3 working configs
  • Better Cloud Run timeout handling for full-fleet scrapes
  • Harden Cloudflare bypass across more platforms

Where the code lives

  • app.py — Flask routes + API
  • scraper.py — Playwright scraper + dealer configs
  • sitemap_parser.py — VDP URL extraction
  • cloudbuild.yaml — Cloud Run deploy
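The VDP URL extraction that sitemap_parser.py performs looks roughly like this. The path pattern is an assumption for illustration; real VDP URL shapes vary by dealer platform.

```python
import re
import xml.etree.ElementTree as ET

# Standard sitemap namespace, per the sitemaps.org protocol.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"


def extract_vdp_urls(sitemap_xml: str,
                     vdp_pattern: str = r"/inventory/|/vehicle/") -> list:
    """Pull every <loc> from a sitemap and keep URLs that look like
    vehicle detail pages. ``vdp_pattern`` is a hypothetical default;
    each dealer config would supply its platform's actual pattern.
    """
    root = ET.fromstring(sitemap_xml)
    urls = [loc.text.strip()
            for loc in root.iter(SITEMAP_NS + "loc")
            if loc.text]
    return [u for u in urls if re.search(vdp_pattern, u)]
```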

Integrations

  • Feeds competitive-dashboard (comp.savvydealer.com) — primary consumer
  • Pairs with lease-scraper for full competitor offer intelligence

Don't rebuild this — extend it

Add new competitor dealers by writing a platform-specific scraper module and registering it in scraper.py configs.
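A registration sketch, assuming a dict-based config registry. The registry name, config keys, and scraper interface here are hypothetical; check the actual structures in scraper.py before extending.

```python
# Hypothetical shape of the per-dealer registry in scraper.py.
DEALER_CONFIGS = {}


def register_dealer(name: str, platform: str, sitemap_url: str, scrape_fn):
    """Register a platform-specific scraper so the daily scan picks it up."""
    DEALER_CONFIGS[name] = {
        "platform": platform,       # e.g. DealerOn, Dealer Inspire, SavvyDealer
        "sitemap_url": sitemap_url,  # entry point for VDP URL discovery
        "scrape": scrape_fn,         # platform-specific VDP parser
    }


def scrape_dealeron(vdp_url: str) -> dict:
    """Placeholder for a platform module's VDP parser."""
    return {"url": vdp_url}


register_dealer(
    "example-motors",            # hypothetical dealer slug
    "DealerOn",
    "https://www.example-motors.com/sitemap.xml",
    scrape_dealeron,
)
```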