Where the tool lives

Repository: github.com/CloudAxisAi/price-tracker

Install the dependencies with npm install, then run node tracker.js --url … --selector …. Each run appends one CSV row: timestamp, label, URL, selector, raw text, and parsed number. Three flags cover the common cases: --threshold alerts when the price falls below a fixed cap, --threshold-pct alerts when the price has fallen at least N% versus the first reading for that URL, and --watch re-fetches every unique URL+selector pair already in the CSV — which makes a single daily cron entry enough.
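The two numeric steps behind those flags — turning raw scraped text into a number, and the --threshold-pct percentage check — can be sketched in a few lines. The function names and exact parsing rules here are assumptions for illustration, not the tracker's actual implementation:

```javascript
// Turn raw scraped text like "$1,299.99" into a number by stripping
// thousands separators and grabbing the first numeric token.
function parsePrice(rawText) {
  const match = rawText.replace(/,/g, "").match(/\d+(\.\d+)?/);
  return match ? parseFloat(match[0]) : null;
}

// --threshold-pct style check: alert when the current price has
// fallen at least `pct` percent versus the first recorded reading.
function droppedAtLeast(firstPrice, currentPrice, pct) {
  const dropPct = ((firstPrice - currentPrice) / firstPrice) * 100;
  return dropPct >= pct;
}

console.log(parsePrice("$1,299.99"));       // 1299.99
console.log(droppedAtLeast(100, 84.99, 15)); // true (a 15.01% drop)
```

Computing the drop against the first reading (rather than the previous one) means a slow slide over many runs still triggers the alert once the cumulative fall crosses N%.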

Why selectors beat scraping the whole page

Prices often sit in a predictable node (.price, [itemprop=price], etc.). Pointing the selector at that subtree avoids noisy matches from navigation, reviews, or "you may also like" blocks that also contain digits.
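The difference is easy to demonstrate. In the sketch below, the `textOfClass` helper is a crude stand-in for a real selector query (cheerio's equivalent would be `$(".price").text()`), and the page markup is invented for illustration:

```javascript
// A digit-hungry regex over the whole page grabs the first number it
// sees — which is often a shipping banner or review count, not the price.
const html = `
  <nav>Free shipping over $50</nav>
  <div class="rating">4.5 stars (1,203 reviews)</div>
  <span class="price">$29.99</span>
`;

// Naive approach: first number anywhere in the page.
const naive = html.match(/\$?\d[\d,.]*/)[0];

// Scoped approach: only look at the text inside the .price element.
// (Stand-in for a proper selector library like cheerio.)
function textOfClass(doc, className) {
  const m = doc.match(new RegExp(`class="${className}"[^>]*>([^<]*)<`));
  return m ? m[1].trim() : null;
}
const scoped = textOfClass(html, "price");

console.log(naive);  // "$50"   — the shipping banner, not the price
console.log(scoped); // "$29.99"
```

In the real tool the scoping is whatever CSS selector you pass with --selector; the point is that the match space shrinks from the whole document to one subtree.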

When cheerio is not enough

This stack only sees the HTML the server returns before any client-side JavaScript runs. Many storefronts hydrate prices in the browser, so a plain fetch comes back empty or stale. That is exactly where a cloud browser (sessions, JavaScript execution, optional live view) matters; see cloud browser automation for the full picture and pricing for browser-minute caps.
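A cheap way to spot this failure mode: if the server HTML has the selected element but no digits inside it (or no element at all), the price is almost certainly filled in client-side and no amount of re-fetching will help. The markup and helper below are illustrative assumptions, not the tracker's code:

```javascript
// Returns true when the selected element's server-rendered text
// contains no digits — a strong hint the price is hydrated by JS
// and a real browser is needed instead of a plain fetch.
function needsRealBrowser(serverHtml, className) {
  const m = serverHtml.match(
    new RegExp(`class="${className}"[^>]*>([^<]*)<`)
  );
  const text = m ? m[1] : "";
  return !/\d/.test(text);
}

const staticPage = `<span class="price">$29.99</span>`;
const hydratedPage = `<span class="price"><!-- filled by JS --></span>`;

console.log(needsRealBrowser(staticPage, "price"));   // false
console.log(needsRealBrowser(hydratedPage, "price")); // true
```

Running a check like this once per new URL tells you up front whether the cheerio path will work or whether the page belongs in a browser-based workflow.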

Broader toolkit

For change detection without a numeric target, pair it with the website change monitor CLI (2026) and the website monitoring tools (2026) landscape post.


Ready to automate this? CloudyBot can handle tasks like this on a schedule — with a real browser, memory, and WhatsApp delivery.

Try CloudyBot free →

Free: 30 AI Tasks/month, no card required