Scrapy Cloud is a battle-tested cloud platform for running web crawlers (a.k.a. spiders). Your spiders run in the cloud and scale on demand, from thousands to billions of pages. Think of it as a Heroku for web crawling.
Over 2,000 companies trust Scrapy Cloud to run their spiders
Make managing spiders a breeze
Run, monitor, and control your crawlers with Scrapy Cloud's easy-to-use web interface.
Increase the scale and firepower of your scraping operation with only a few clicks.
Seamlessly integrate Crawlera, Splash, Spidermon, and more into your web scraping stack to take the hassle out of scraping the web at scale.
Full suite of QA tools
Built-in spider monitoring, logging, and data QA tools, along with easy integration of Spidermon, our open-source spider monitoring framework.
Zero vendor lock-in
Develop your code using Scrapy, the most popular open-source web scraping framework, and retain the freedom to migrate it to any hosting solution.
Give it a try for free
Get access to the Scrapy Cloud free version today.
1 Scrapy Unit = 1 GB of RAM and 1 concurrent crawl
With Scrapy Cloud, you control how you allocate your resources. Just update your settings to allocate 1 Scrapy Cloud unit to each job.
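As one hedged sketch of how unit allocation looks in practice, jobs can be scheduled over Scrapy Cloud's HTTP API with an explicit `units` parameter; the endpoint path and parameter names below are assumptions based on the public API docs, and the project ID and API key are placeholders:

```python
from urllib.parse import urlencode

# Placeholders -- substitute your own project ID and API key.
PROJECT_ID = "12345"
API_KEY = "YOUR_API_KEY"

# Assumed endpoint for scheduling a job (see Scrapy Cloud API docs).
RUN_URL = "https://app.scrapinghub.com/api/run.json"


def run_job_payload(spider, units=1):
    """Build the form body for a job-scheduling request.

    1 Scrapy Cloud unit = 1 GB of RAM and 1 concurrent crawl slot,
    so units=2 would give the job 2 GB and two crawl slots.
    """
    return urlencode({
        "project": PROJECT_ID,
        "spider": spider,
        "units": units,
    })
```

Posting `run_job_payload("quotes")` to `RUN_URL` (authenticated with your API key) would schedule the `quotes` spider with a single unit.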
Provides a simple way to run your crawls and browse results
"Scrapy is really pleasant to work with. It hides most of the complexity of web crawling, letting you focus on the primary work of data extraction.
Scrapinghub provides a simple way to run your crawls and browse results, which is especially useful for larger projects with multiple developers."
Jacob Perkins StreamHacker.com
Does not force vendor lock-in
"I love that Scrapy Cloud does not force vendor lock-in, unlike other scraping and crawling services. Your investment in developing the right scraping logic is not stuck in some proprietary format or jailed behind some user-friendly interface. With Scrapy Cloud, scraping logic is standard Python code calling the open-source Scrapy library. You retain the freedom to run that Python code on your own computers or someone else's servers."
Castedo Ellerman Quantitative Analyst/Developer
Need data you can rely on?
Tell us about your project or start using our scraping tools today.