This module parses /robots.txt files as specified in "A Standard for
Robot Exclusion", at <http://www.robotstxt.org/wc/norobots.html>.
Webmasters can use the /robots.txt file to forbid conforming robots
from accessing parts of their web site.
The parsed files are kept in a WWW::RobotRules object, and this
object provides methods to check if access to a given URL is
prohibited. The same WWW::RobotRules object can be used for one
or more parsed /robots.txt files on any number of hosts.
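For illustration, a minimal usage sketch in Perl (the host, URLs, and
user-agent name below are hypothetical; WWW::RobotRules is typically
paired with LWP to fetch the robots.txt file itself):

    use WWW::RobotRules;
    use LWP::Simple qw(get);

    # Identify the robot by its user-agent name (hypothetical name).
    my $rules = WWW::RobotRules->new('MyBot/1.0');

    # Fetch and parse one site's /robots.txt (hypothetical host).
    my $robots_url = 'http://example.com/robots.txt';
    my $robots_txt = get($robots_url);
    $rules->parse($robots_url, $robots_txt) if defined $robots_txt;

    # The same object can hold rules from any number of hosts;
    # allowed() then checks whether a given URL may be fetched.
    if ($rules->allowed('http://example.com/some/page.html')) {
        my $content = get('http://example.com/some/page.html');
        # ... process $content ...
    }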
Maintained by: LukenShiro
ChangeLog: perl-www-robotrules
Homepage: https://metacpan.org/pod/WWW::RobotRules
Download SlackBuild:
perl-www-robotrules.tar.gz
perl-www-robotrules.tar.gz.asc
(the SlackBuild does not include the source)
Individual Files:
README
perl-www-robotrules.SlackBuild
perl-www-robotrules.info
slack-desc