The Complete Package
We strive to help you at every step. Here are 12 of them.
99.7% Success Rate
Don’t bother with IP proxies and user agents. It’s our job to reach the webpage.
Browser Rendering
We render the pages as a real browser would, using the latest Chrome version.
Run Custom JS
Click, scroll, select, fill out forms... or even run custom JavaScript code in the page.
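As a sketch of what an interaction sequence might look like (the fill, click, and js step names here are hypothetical illustrations, not confirmed parts of the API):

# Hypothetical step names (fill, click, js), shown for illustration only.
curl 'https://api.mantabase.com/1/preview' \
  -H "Authorization: Bearer $MANTABASE_API_KEY" \
  -H 'Content-Type: application/json' \
  -d '{
    "steps": [
      "url(https://example.com/search)",
      "fill(input[name=\"q\"], web crawling)",
      "click(button[type=\"submit\"])",
      "js(window.scrollTo(0, document.body.scrollHeight))"
    ]
  }'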
Geoloc & Locale
Set the location of the crawler to access the site from different parts of the world.
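A location-pinned request might look like the following sketch; the country and locale fields are assumptions, not documented parameters:

# Hypothetical country/locale fields, for illustration only.
curl 'https://api.mantabase.com/1/preview' \
  -H "Authorization: Bearer $MANTABASE_API_KEY" \
  -H 'Content-Type: application/json' \
  -d '{
    "country": "de",
    "locale": "de-DE",
    "steps": ["url(https://example.com/)"]
  }'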
Navigate & Paginate
Infinite scroll? Load more button? Pagination? List of links? We've got it covered.
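For instance, a crawl that follows the "next" link on each page and then visits every item it finds, using the pagination_next and click_links steps from the full example at the bottom of this page (the URL and selectors are placeholders):

# Follow "next" pagination, then visit each item link found.
curl 'https://api.mantabase.com/1/preview' \
  -H "Authorization: Bearer $MANTABASE_API_KEY" \
  -H 'Content-Type: application/json' \
  -d '{
    "steps": [
      "url(https://example.com/listing)",
      "pagination_next(a.next)",
      "click_links(a.item)"
    ]
  }'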
Extract Structured Data
Collect precise objects and attributes, clean them, format them, package them.
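Extraction is driven by the object step shown in the full example below: each attribute maps a name to a CSS selector, optionally wrapped in a cleaner like to_number or followed by an @href attribute lookup. The product object here is a placeholder sketch:

# Hypothetical "product" object; the attribute selectors are placeholders.
curl 'https://api.mantabase.com/1/preview' \
  -H "Authorization: Bearer $MANTABASE_API_KEY" \
  -H 'Content-Type: application/json' \
  -d '{
    "steps": [
      "url(https://example.com/product/42)",
      {
        "object(product)": {
          "attributes": {
            "name": "h1.product-title",
            "price": "to_number(.price)",
            "url": "a.product-link @href"
          }
        }
      }
    ]
  }'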
Schedule Your Crawls
Set a schedule and your crawls run automatically. All on our servers.
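A scheduling request could look like this sketch; the /schedules endpoint and cron field are hypothetical, not documented:

# Hypothetical /schedules endpoint and cron field, for illustration only.
curl 'https://api.mantabase.com/1/schedules' \
  -H "Authorization: Bearer $MANTABASE_API_KEY" \
  -H 'Content-Type: application/json' \
  -d '{
    "cron": "0 6 * * *",
    "steps": ["url(https://example.com/listing)"]
  }'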
12 Months of Storage
The datasets crawled are stored on our servers and accessible when you need them.
Export Anywhere
Retrieve the datasets from our API, or export them where you need them.
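Fetching a stored dataset might look like this sketch; the /datasets path and the id placeholder are assumptions:

# Hypothetical /datasets endpoint; the dataset id is a placeholder.
curl 'https://api.mantabase.com/1/datasets/<dataset_id>' \
  -H "Authorization: Bearer $MANTABASE_API_KEY"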
Highly Scalable
Our infrastructure is designed to handle a large number of high-volume crawlers.
Monitoring & Retry
Along with strong retry and self-healing strategies, we monitor your crawls and alert you if anything goes wrong.
Pay For What You Use
No commitment, no layered pricing. You pay for what you crawl. And nothing more.
An API you’ll love to use!
Our API and documentation are built by developers for developers.
We focus on the details and try to get them right.
curl 'https://api.mantabase.com/1/preview' \
  -H "Authorization: Bearer $MANTABASE_API_KEY" \
  -H 'Content-Type: application/json' \
  -d '{
    "steps": [
      "url(https://news.ycombinator.com/)",
      "pagination_next(a[rel=\"next\"])",
      "click_links(span.age a)",
      {
        "object(hn_post)": {
          "attributes": {
            "title": ".titleline",
            "link_url": ".titleline a @href",
            "user": "a.hnuser",
            "user_link": "a.hnuser @href",
            "score": "to_number(.score)",
            "nb_comments": "to_number(.subline > a:last-child)",
            "time_ago": ".subline a:last-child"
          }
        }
      }
    ]
  }'