CLASSES=none rename
PSTAMP=q20150722043316
LICFILE=gpl.txt
LICURL=http://www.gnu.org/copyleft/gpl.html
LICINFO=GNU General Public License, Version 2, June 1991
DESC=The WWW::RobotRules module parses /robots.txt files as specified in 'A Standard for Robot Exclusion', at http://www.robotstxt.org/wc/norobots.html. Webmasters can use the /robots.txt file to forbid conforming robots from accessing parts of their web site. The parsed files are kept in a WWW::RobotRules object, and this object provides methods to check if access to a given URL is prohibited. The same WWW::RobotRules object can be used for one or more parsed /robots.txt files on any number of hosts.
BASEDIR=/usr
VENDOR=LINOFEE, http://www.linofee.org
EMAIL=developers@linofee.org
CATEGORY=develop,utils,application
NAME=WWW::RobotRules - Perl database of robots.txt-derived permissions
SERIALNUM=001
VERSION=6.2
ARCH=i386
PKG=LNFwww-robotrules-512
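
The DESC above summarizes the module's interface: parse a fetched /robots.txt into a WWW::RobotRules object, then query it before requesting URLs. The following is a minimal Perl sketch of that documented flow (constructor taking a user-agent string, parse(), allowed()); the agent name and example.com URLs are placeholders, not part of this package, and fetching via LWP::Simple is one possible way to obtain the /robots.txt content.

#!/usr/bin/perl
use strict;
use warnings;
use WWW::RobotRules;
use LWP::Simple qw(get);

# Build a rule set keyed to our robot's user-agent string (placeholder name).
my $rules = WWW::RobotRules->new('ExampleBot/1.0');

# Fetch and parse a site's /robots.txt (example.com is a placeholder host).
my $robots_url = 'http://example.com/robots.txt';
my $robots_txt = get($robots_url);
$rules->parse($robots_url, $robots_txt) if defined $robots_txt;

# Ask the parsed rules whether a given URL may be fetched.
my $target = 'http://example.com/private/page.html';
if ($rules->allowed($target)) {
    print "Allowed to fetch $target\n";
}
else {
    print "robots.txt forbids fetching $target\n";
}

The same $rules object can be fed additional /robots.txt files from other hosts with further parse() calls, and allowed() will apply whichever host's rules match the URL being checked.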