DreamHost Blocks Googlebot?
The owner of the Romanian site Zoso.ro allegedly received an email recently from his web host DreamHost informing him that Googlebot had been blocked on his high-traffic sites because it was putting a heavy load on the web server.
Here’s the alleged email sent by DreamHost:
This email is to inform you that a few of your sites were getting hammered by Google bot. This was causing a heavy load on the webserver, and in turn affecting other customers on your shared server. In order to maintain stability on the webserver, I was forced to block Google bot via the .htaccess file.
<Limit GET HEAD POST>
deny from 66.249
allow from all
</Limit>
You also want to consider making your files be unsearchable by robots and crawlers, as that usually contributes to high number of hits. If they hit a dynamic file, like php, it can cause high memory usage and consequently high load…
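The email’s suggestion about making files unsearchable by robots and crawlers would typically be done with a robots.txt file at the site root. A minimal sketch of what that might look like (the paths below are hypothetical examples, not from the original email, and would need to be replaced with the site’s actual dynamic URLs):

```
# robots.txt — block all crawlers from hypothetical dynamic pages
# that are expensive to generate (paths are illustrative only)
User-agent: *
Disallow: /search.php
Disallow: /cgi-bin/
```

Note that robots.txt only asks well-behaved crawlers like Googlebot to stay away; unlike the .htaccess block above, it doesn’t actually prevent requests from reaching the server.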
I moved to DreamHost just over a month ago, and so far I haven’t encountered any Googlebot-related hosting problems, nor have I received any email like the one above. That’s probably because my blog doesn’t get much traffic. Googlebot is known to consume an enormous amount of bandwidth, which can cause websites to exceed their bandwidth limits and be taken down temporarily. So part of this story is plausible: some of the sites involved may well have put a heavy load on the shared server due to site traffic and Googlebot activity.
Still, it seems an unlikely move for an established web hosting company like DreamHost, and I haven’t been able to confirm whether the story is true; it may well be a hoax. If it were true, though, it would certainly hurt DreamHost’s image and the confidence of current and future customers.
Hopefully someone from DreamHost reads this and can explain, or confirm whether the story is true. I’ll update this post as soon as I can verify it.