A curated set of agents with a shared permission & risk profile.
Contents: pack.json plus a stub for each agent.

Crawls a starting URL, extracts all hyperlinks (<a href="...">), and sends HTTP HEAD/GET requests to verify each one is reachable (status 200). Detects dead links ("link rot").
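A minimal sketch of the two halves of that flow, assuming Node 18+ (global `fetch`). The function names are illustrative, and the regex-based extraction is a simplification; a real crawler would use a proper HTML parser.

```typescript
// Extract href values from <a> tags in raw HTML.
// Illustrative regex approach; production code should use a real parser.
function extractLinks(html: string): string[] {
  const re = /<a\s+[^>]*href="([^"]+)"/gi;
  const links: string[] = [];
  let m: RegExpExecArray | null;
  while ((m = re.exec(html)) !== null) links.push(m[1]);
  return links;
}

// Check one URL: try a cheap HEAD first, then fall back to GET for
// servers that reject HEAD. Only a 200 response counts as alive.
async function isAlive(url: string): Promise<boolean> {
  try {
    const head = await fetch(url, { method: "HEAD" });
    if (head.status === 200) return true;
    const get = await fetch(url, { method: "GET" });
    return get.status === 200;
  } catch {
    return false; // network error counts as a dead link
  }
}
```

The HEAD-then-GET fallback matters in practice: some servers return 405 or 403 for HEAD while serving the same URL fine over GET.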
Uses cheerio (server-side jQuery) to scrape static HTML pages. Faster and more lightweight than Playwright for sites that don't require JavaScript execution.
Pulls the top posts from a specific subreddit (title, score, link) and saves them as a structured JSON report. Great for trend monitoring.
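A sketch of one way to do this, using Reddit's public `top.json` listing endpoint and Node 18+ `fetch`. The `TopPost` shape, function names, and User-Agent string are assumptions of this sketch, not part of the pack.

```typescript
interface TopPost {
  title: string;
  score: number;
  link: string;
}

// Map Reddit's listing JSON (data.children[].data) to the report shape.
// Kept separate from the network call so it can be tested on fixtures.
function toReport(listing: any): TopPost[] {
  return listing.data.children.map((c: any) => ({
    title: c.data.title,
    score: c.data.score,
    link: `https://www.reddit.com${c.data.permalink}`,
  }));
}

async function fetchTopPosts(subreddit: string, limit = 10): Promise<TopPost[]> {
  // t=day scopes "top" to the last 24 hours; a descriptive User-Agent
  // is expected by Reddit and avoids some rate-limit rejections.
  const url = `https://www.reddit.com/r/${subreddit}/top.json?limit=${limit}&t=day`;
  const res = await fetch(url, { headers: { "User-Agent": "trend-monitor/0.1" } });
  if (!res.ok) throw new Error(`Reddit returned ${res.status}`);
  return toReport(await res.json());
}
```

Writing the result with `JSON.stringify(posts, null, 2)` to a dated file then gives the structured report the description mentions.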