
Aduana

Frontera backend to guide a crawl using PageRank, HITS or other ranking algorithms based on the link structure of the web graph, even for large crawls (on the order of one billion pages).

Install / Use

/learn @scrapinghub/Aduana
About this skill

Quality Score

0/100

Supported Platforms

Universal

README

Description

A library to guide a web crawl using PageRank, HITS or other ranking algorithms based on the link structure of the web graph, even for large crawls (on the order of one billion pages).
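To make the idea concrete, here is a minimal, illustrative PageRank over a tiny in-memory link graph. This is only a sketch of the kind of link-based score used to prioritise a crawl frontier, not Aduana's actual implementation (which is written in C and built to scale to billions of pages); the `pagerank` function and the example graph are invented for illustration.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Compute PageRank for a graph given as {page: [outgoing pages]}."""
    pages = set(links) | {p for outs in links.values() for p in outs}
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Every page receives a baseline (1 - d) / n share.
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page in pages:
            outs = links.get(page, [])
            if outs:
                # A page passes its damped rank evenly to its out-links.
                share = damping * rank[page] / len(outs)
                for out in outs:
                    new_rank[out] += share
            else:
                # Dangling page: spread its rank uniformly over all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
        rank = new_rank
    return rank

graph = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
}
ranks = pagerank(graph)
```

In a crawl, scores like these are recomputed as new links are discovered, and pages with the highest scores are fetched first; here `"c"`, which has two in-links, ends up ranked above `"b"`, which has one.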

Warning: I test regularly only on Linux, my development platform. From time to time I also test on OS X and on Windows 8 using MinGW64.

Installation

pip install aduana
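Aduana is used as a Frontera backend inside a Scrapy project, so after installing it you point Frontera's settings at it. The sketch below shows roughly what such a settings module could look like; the backend path, setting names, and scorer name are assumptions for illustration and should be checked against the readthedocs documentation.

```python
# frontera_settings.py -- hypothetical sketch; verify every name against
# the Aduana docs before use.

# Assumed dotted path to Aduana's Frontera backend.
BACKEND = 'aduana.frontera.Backend'

# Assumed setting: where Aduana keeps its on-disk page/link database.
PAGE_DB_PATH = 'crawl-data'

# Assumed setting: which link-based scorer ranks the frontier
# (e.g. PageRank- or HITS-based).
SCORER = 'PageRankScorer'
```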

Documentation

Available at readthedocs.

I have started documenting plans/ideas at the wiki.

Example

Single spider example:

cd example
pip install -r requirements.txt
scrapy crawl example

To run the distributed crawler, see the docs.

View on GitHub
GitHub Stars: 55
Category: Data
Updated: 11 months ago
Forks: 9

Languages

C

Security Score

87/100

Audited on Apr 7, 2025

No findings