A web robot is a program that automatically retrieves web pages by following the links on pages it has already retrieved. Search engines use web robots to discover pages for inclusion in their search indexes. Other web robots, however, harvest any email addresses found on your web pages and compile them into databases that are then used by, or sold to, email spammers.
Through a special file in the root directory of your site, called robots.txt, you can tell a web robot which pages you would like it to ignore. Most email-harvesting robots will, of course, simply ignore the instructions in your robots.txt file.
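As a sketch, a minimal robots.txt might look like the following. The bot name "EmailHarvester" and the directory "/private/" are illustrative placeholders, not real values; well-behaved crawlers match the User-agent line against their own name and honor the Disallow rules that follow it.

```
# Block a (hypothetical) email-harvesting bot from the whole site
User-agent: EmailHarvester
Disallow: /

# Ask all other robots to skip one directory (path is illustrative)
User-agent: *
Disallow: /private/
```

Note that robots.txt is purely advisory: compliant crawlers such as search engine robots respect it, but a malicious robot can simply fetch pages anyway, which is why it does not protect you from email harvesters.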