Suggestion: add a default file to the setup


bugfinder (Retired)
By default it seems that we don't get a robots.txt file. In many cases this may not be an issue, but if you then go around submitting your site to search engines it can become one, especially with one of the new crawlers going around that just seems to access everything constantly (or does it only seem that way?). It's the one with a DNS of cuill.com.

Anyway, my suggestion is to add a default robots.txt file with the following, to hopefully reduce the load on your servers. I've watched a shared hosting service get crippled by slightly bad coding on some forums and an overloaded database, so for those who don't know much about this, it may at least prevent a few suspensions caused by a crawler hammering the site.

User-agent: *
Crawl-delay: 10
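
If you want to go further and slow down that particular crawler, Cuil's bot reportedly identifies itself as Twiceler. Assuming that user-agent string is correct (I haven't verified it against my own logs), a version of the file like this would throttle it more aggressively while leaving other crawlers at the default:

User-agent: Twiceler
Crawl-delay: 20

User-agent: *
Crawl-delay: 10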
 
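As a quick sanity check that the file is actually being served and readable, here's a minimal sketch using Python's standard-library urllib.robotparser (Python 3.6+; the example.com URL is just a placeholder for your own domain):

from urllib.robotparser import RobotFileParser

# Point the parser at the site's robots.txt (placeholder domain).
parser = RobotFileParser("http://example.com/robots.txt")
parser.read()  # fetch and parse the file

# crawl_delay() returns the Crawl-delay value for a given user agent,
# or None if the directive is missing.
print(parser.crawl_delay("*"))  # -> 10 if the suggested file is in place

# can_fetch() reports whether a given user agent may crawl a URL.
print(parser.can_fetch("*", "http://example.com/forum/"))  # -> True unless disallowed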