Quickstart
This guide walks you through creating your first aggregator and adding a source.
Prerequisites
- A Fetchosaurus account (sign up here)
- A target website you want to collect data from
Create Your First Aggregator
1. Log in to the dashboard
   Go to app.fetchosaurus.com and sign in.
2. Create an aggregator
   Click New Aggregator and choose a built-in schema:
   - Job - For job listings
   - Event - For events and happenings
   - Listing - For products or real estate
   - Article - For news and blog posts
   - Recipe - For recipes
   Give your aggregator a name (e.g., “Remote DevOps Jobs”).
3. Add a source
   Click Add Source and paste the URL of a page containing the data you want to extract.
4. Describe what to extract
   Tell the AI what data you want, for example:
   “Extract all job listings. The title is in the h2 tag, the company name is below it, and the location is in the gray text.”
5. Preview and test
   Review the extracted data preview. If it looks correct, click Test Flight to run a full extraction.
6. Save and schedule
   Choose a schedule (hourly, daily, weekly, or manual) and save your source.
Fetch Your Data via API
Once your sources are running, fetch the collected items via the API:
curl -H "Authorization: Bearer YOUR_API_KEY" \
  "https://api.fetchosaurus.com/api/v1/aggregators/YOUR_AGGREGATOR_ID/items"

Response:
{ "items": [ { "id": "clx1abc123", "source_id": "clx2def456", "schema_version": 1, "created_at": "2026-01-20T10:30:00Z", "data": { "title": "Senior DevOps Engineer", "company": "Acme Corp", "location": "Remote", "url": "https://acme.com/jobs/123" } } ], "next_cursor": "48291", "has_more": true}What’s Next?
What’s Next?
- Understand the core concepts - Aggregators, sources, schemas, and items
- API Reference - Full API documentation
- Built-in schemas - See all available schema types