What is a web robot?

A web robot is a program that automatically retrieves web pages by following the links on pages it has already retrieved. Search engines use web robots to find pages to include in their search databases. Other web robots, however, harvest any email addresses they find on your web pages and add them to databases that are then used by, or sold to, email spammers.
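As a rough illustration, a minimal crawler in Python might look like the sketch below; the starting URL, page limit, and class name are illustrative only and not part of any particular robot.

    # A minimal crawl loop: fetch a page, pull out its links, and queue
    # those links to be fetched in turn. Names and limits are illustrative.
    from urllib.request import urlopen
    from urllib.parse import urljoin
    from html.parser import HTMLParser

    class LinkExtractor(HTMLParser):
        """Collects the href values of <a> tags on a page."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    def crawl(start_url, max_pages=10):
        seen, queue = set(), [start_url]
        while queue and len(seen) < max_pages:
            url = queue.pop(0)
            if url in seen:
                continue
            seen.add(url)
            try:
                html = urlopen(url).read().decode("utf-8", errors="replace")
            except Exception:
                continue
            parser = LinkExtractor()
            parser.feed(html)
            # Follow the links found on the page just retrieved.
            queue.extend(urljoin(url, link) for link in parser.links)
        return seen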

Through a special file in the main (root) directory of your web site, called robots.txt, you can tell a web robot which pages you would like it to ignore. Most email harvesting robots will, of course, simply ignore the instructions in your robots.txt file.
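For example, a robots.txt file might look like the following; the directory names and the robot name are purely illustrative:

    User-agent: *
    Disallow: /private/
    Disallow: /cgi-bin/

    User-agent: ExampleBot
    Disallow: /

The first block asks all robots to stay out of the /private/ and /cgi-bin/ directories, while the second asks a robot identifying itself as ExampleBot not to retrieve any pages at all. Well-behaved robots, such as search engine crawlers, honor these rules; email harvesting robots generally do not.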

