Robots.txt, Robots Meta Tag, .htaccess mod_rewrite

 

There are three commonly supported methods for instructing (or, more accurately, requesting) internet indexing spiders/bots/robots what to scan and what to skip. The methods complement one another, but they are not equal in effect. A brief example of each follows the list.

  1. robots.txt
  2. Robots Meta Tag
  3. .htaccess and mod_rewrite
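
The first method, robots.txt, is a plain text file placed at the root of the site (for example at https://www.example.com/robots.txt; the domain, paths, and bot name below are placeholders). Compliant crawlers fetch and honor it before scanning, but it is purely advisory. A minimal sketch:

    # Allow all well-behaved crawlers, but ask them to skip /private/
    User-agent: *
    Disallow: /private/

    # Ask one specific crawler ("BadBot" is a placeholder name) to skip the whole site
    User-agent: BadBot
    Disallow: /

Misbehaving bots simply ignore this file, which is why the other two methods exist.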
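The second method, the robots meta tag, works per page rather than per site: it sits in the head of an individual HTML document and asks crawlers not to index that page and/or not to follow its links. A typical example:

    <head>
      <!-- Ask crawlers not to index this page and not to follow its links -->
      <meta name="robots" content="noindex, nofollow">
    </head>

Like robots.txt, this is a request that only cooperative crawlers respect.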
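The third method is different: .htaccess rules using Apache's mod_rewrite are enforced by the web server itself, so they apply even to bots that ignore robots.txt and meta tags. The sketch below assumes mod_rewrite is enabled and that .htaccess overrides are allowed on the server; "BadBot" is again a placeholder user-agent string:

    # Refuse (HTTP 403 Forbidden) any request whose User-Agent contains "BadBot"
    RewriteEngine On
    RewriteCond %{HTTP_USER_AGENT} BadBot [NC]
    RewriteRule ^ - [F]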

Summary:

To really protect your website and enforce rules for any specific user agent, you have to monitor your traffic on an ongoing basis: analyze website analytics and bandwidth reports, track visiting IP addresses and geographic locations, watch for known public or private proxy servers, and study the methods and tactics of every unwanted program and visitor so that you can implement new countermeasures as their techniques change.
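
As a sketch of what that ongoing enforcement can look like, the .htaccess rules below block requests that match either a list of unwanted user-agent strings or an IP range, both of which would come from your own traffic analysis (the bot names and the 203.0.113.x addresses are placeholder values taken from documentation ranges):

    RewriteEngine On
    # Match any of several placeholder user-agent strings, case-insensitively...
    RewriteCond %{HTTP_USER_AGENT} (BadBot|EvilScraper) [NC,OR]
    # ...or any visitor from the placeholder 203.0.113.0/24 range
    RewriteCond %{REMOTE_ADDR} ^203\.0\.113\.
    # Return 403 Forbidden for anything that matches
    RewriteRule ^ - [F]

Rules like these have to be revised whenever the unwanted traffic changes its user agent or moves to new addresses, which is exactly the maintenance burden described above.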
