freebsd-ports/www/p5-WWW-RobotRules/pkg-descr

This module parses /robots.txt files, which are used to forbid conforming
robots from accessing parts of a web site. The parsed files are kept in
a WWW::RobotRules object, which provides methods to check whether access
to a given URL is prohibited.
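
A minimal usage sketch, assuming the standard WWW::RobotRules interface
(the agent name and URLs here are placeholders):

    use WWW::RobotRules;
    use LWP::Simple qw(get);

    # Identify the robot; rules are matched against this agent name.
    my $rules = WWW::RobotRules->new('ExampleBot/1.0');

    # Fetch and parse a site's robots.txt.
    my $robots_url = 'http://www.example.com/robots.txt';
    my $robots_txt = get($robots_url);
    $rules->parse($robots_url, $robots_txt) if defined $robots_txt;

    # Check whether a URL on that site may be fetched.
    my $url = 'http://www.example.com/private/page.html';
    print "allowed\n" if $rules->allowed($url);
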
WWW: http://search.cpan.org/dist/WWW-RobotRules/