This script is for Slackware 13.37 and may be outdated.

SlackBuilds Repository

13.37 > Perl > perl-www-robotrules (6.02)

This module parses /robots.txt files as specified in "A Standard for
Robot Exclusion" at <http://www.robotstxt.org/wc/norobots.html>.
Webmasters can use the /robots.txt file to forbid conforming robots
from accessing parts of their web site.
The parsed files are kept in a WWW::RobotRules object, and this
object provides methods to check if access to a given URL is
prohibited. The same WWW::RobotRules object can be used for one
or more parsed /robots.txt files on any number of hosts.
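The check described above can be sketched as follows. This is a minimal
example of the module's documented parse/allowed interface; the robot name
ExampleBot/1.0, the example.com host, and the inline robots.txt body are
illustrative assumptions, and a real crawler would fetch the file over HTTP
(e.g. with LWP) before parsing it.

```perl
use strict;
use warnings;
use WWW::RobotRules;

# Create a rules object identified by our robot's User-Agent name
# (ExampleBot/1.0 is a hypothetical name for this sketch)
my $rules = WWW::RobotRules->new('ExampleBot/1.0');

# An inline robots.txt body; normally this would be fetched from the host
my $robots_txt = <<'EOT';
User-agent: *
Disallow: /private/
EOT

# Associate the parsed rules with the host the file came from
$rules->parse('http://example.com/robots.txt', $robots_txt);

# Query whether specific URLs on that host may be visited
print $rules->allowed('http://example.com/index.html')
    ? "index: allowed\n" : "index: denied\n";
print $rules->allowed('http://example.com/private/secret.html')
    ? "secret: allowed\n" : "secret: denied\n";
```

The same $rules object could then be fed additional /robots.txt files from
other hosts; allowed() dispatches on the URL's host to the matching rule set.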

This requires perl-uri-escape.

Maintained by: LukenShiro
Approved by: dsomero
Keywords:

Homepage:
http://search.cpan.org/dist/WWW-RobotRules/

Source Downloads:
WWW-RobotRules-6.02.tar.gz (b7186e8b8b3701e70c22abf430742403)

Download SlackBuild:
perl-www-robotrules.tar.gz
perl-www-robotrules.tar.gz.asc

(the SlackBuild does not include the source)

Validated for Slackware 13.37

See our HOWTO for instructions on how to use the contents of this repository.

Access to the repository is available via ftp, git, cgit, http, and rsync.

© 2006-2017 SlackBuilds.org Project. All rights reserved.
Slackware® is a registered trademark of Patrick Volkerding
Linux® is a registered trademark of Linus Torvalds