Commit graph

1 commit

Author: Peter Postma
Commit: 9e19303a65
Message: Import crawl-0.4.
Crawl is a small and efficient HTTP crawler.
The crawl utility starts a depth-first traversal of the web at the specified
URLs. It stores all JPEG images that match the configured constraints.
Crawl is fairly fast and allows for graceful termination: after crawl is
stopped, it can be restarted at exactly the spot where it left off. Crawl
keeps a persistent database that allows multiple crawls without revisiting
sites.
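The behaviour described above (a depth-first walk that stores matching JPEG
images and keeps a persistent visited-URL database so a restarted crawl
resumes without revisiting pages) can be sketched roughly as follows. This
is a hypothetical illustration, not crawl's actual code: the `LINKS` graph
stands in for fetching and parsing real pages, and Python's `dbm` module
stands in for the Berkeley DB database that crawl uses.

```python
# Hypothetical sketch, NOT crawl's implementation: depth-first traversal
# with a persistent "visited" database, so a rerun skips already-seen URLs.
import dbm
import os
import tempfile

# Stand-in link graph; a real crawler would fetch and parse each page.
LINKS = {
    "http://a/": ["http://a/pic.jpeg", "http://b/"],
    "http://b/": ["http://a/", "http://b/img.jpeg"],  # cycle back to a/
    "http://a/pic.jpeg": [],
    "http://b/img.jpeg": [],
}

def crawl(start, db_path, max_depth=3):
    """Depth-first walk; persists visited URLs so a restart resumes cleanly."""
    stored = []
    with dbm.open(db_path, "c") as visited:        # persistent database
        stack = [(start, 0)]
        while stack:
            url, depth = stack.pop()
            if url in visited or depth > max_depth:
                continue                           # already seen, or too deep
            visited[url] = b"1"
            if url.endswith(".jpeg"):              # "store" matching images
                stored.append(url)
            for link in LINKS.get(url, []):
                stack.append((link, depth + 1))
    return stored

db = os.path.join(tempfile.mkdtemp(), "visited.db")
first = crawl("http://a/", db)    # fresh crawl finds both images
second = crawl("http://a/", db)   # "restart": everything already visited
print(sorted(first), second)      # → ['http://a/pic.jpeg', 'http://b/img.jpeg'] []
```

Because the visited set lives in an on-disk database rather than in memory,
killing the process loses nothing: the next invocation simply skips every
URL already recorded, which is the property the commit message highlights.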

The patches patch-a[abc] are needed for correct detection of Berkeley DB
(bdb) on my Debian Linux system.
Date: 2004-06-12 11:02:52 +00:00