SlackBuilds Repository

perl-www-robotrules 6.02 (Slackware 14.0, Perl)

This module parses /robots.txt files as specified in "A Standard for
Robot Exclusion" (<http://www.robotstxt.org/wc/norobots.html>).
Webmasters can use the /robots.txt file to forbid conforming robots
from accessing parts of their web site.
The parsed files are kept in a WWW::RobotRules object, and this
object provides methods to check if access to a given URL is
prohibited. The same WWW::RobotRules object can be used for one
or more parsed /robots.txt files on any number of hosts.
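A minimal usage sketch based on the module's documented interface;
the robot name and the example.com/example.org URLs are placeholders:

    use strict;
    use warnings;
    use WWW::RobotRules;
    use LWP::Simple qw(get);

    # Identify the robot; rules are matched against this agent name.
    my $rules = WWW::RobotRules->new('MyBot/1.0');

    # Fetch and parse the /robots.txt of each host we plan to visit.
    # The same object can hold rules for any number of hosts.
    for my $site ('http://www.example.com', 'http://www.example.org') {
        my $url = "$site/robots.txt";
        my $robots_txt = get($url);
        $rules->parse($url, $robots_txt) if defined $robots_txt;
    }

    # Check whether a given URL may be fetched under the parsed rules.
    my $page = 'http://www.example.com/private/index.html';
    print "fetch allowed\n" if $rules->allowed($page);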

Maintained by: LukenShiro

Homepage:
http://search.cpan.org/dist/WWW-RobotRules/

Source Downloads:
WWW-RobotRules-6.02.tar.gz (MD5: b7186e8b8b3701e70c22abf430742403)
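To verify the download against the MD5 sum listed above, one option
is Perl's core Digest::MD5 module (a sketch; any md5sum tool works
equally well):

    use strict;
    use warnings;
    use Digest::MD5;

    # Compute the MD5 digest of the downloaded tarball and compare
    # it to the published checksum.
    open my $fh, '<', 'WWW-RobotRules-6.02.tar.gz' or die "open: $!";
    binmode $fh;
    my $digest = Digest::MD5->new->addfile($fh)->hexdigest;
    print $digest eq 'b7186e8b8b3701e70c22abf430742403'
        ? "checksum OK\n"
        : "checksum MISMATCH\n";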

Download SlackBuild:
perl-www-robotrules.tar.gz
perl-www-robotrules.tar.gz.asc

(the SlackBuild does not include the source)

See our HOWTO for instructions on how to use the contents of this repository.

Access to the repository is available via ftp, git, cgit, http, and rsync.

