Companies Blocking Resources That Cost You Money
Business

I recently learned something astounding while trying to figure out why a site was suddenly performing poorly; it seemed like search engines couldn't index it. At first I thought it was just a hiccup, but then it became apparent the same thing was happening on multiple sites. After doing some research I learned that one of the hosting companies places a default robots.txt rule that hinders search engine crawlers to the point where it acts as a wall, preventing them from listing the site at all.

The reasoning is that sites can get hammered with bot requests, so placing a large crawl delay can keep a site from being overwhelmed. However, the setting they use is so aggressive that it basically prevents people from finding the site. None of this is publicized up front; you have to go searching for the topic to discover it, and only then are you told that you can override it if you wish.
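As a rough illustration, here is a small Python sketch of how you could check whether a site's robots.txt is keeping common crawlers out or slowing them to a crawl. The domain, the user-agent names, and the 30-second threshold are placeholder assumptions for the example, not values from any particular host.

```python
# Minimal sketch: check a site's robots.txt for crawler-hostile defaults.
# The site URL, crawler names, and delay threshold are assumptions for
# illustration only.
from urllib import robotparser

SITE = "https://example.com"          # hypothetical site to check
CRAWLERS = ["Googlebot", "Bingbot"]   # common search engine user agents
MAX_REASONABLE_DELAY = 30             # seconds; anything higher is suspect

parser = robotparser.RobotFileParser()
parser.set_url(f"{SITE}/robots.txt")
parser.read()

for agent in CRAWLERS:
    allowed = parser.can_fetch(agent, f"{SITE}/")
    delay = parser.crawl_delay(agent)
    if not allowed:
        print(f"{agent}: blocked from the site entirely")
    elif delay and delay > MAX_REASONABLE_DELAY:
        print(f"{agent}: allowed, but a crawl-delay of {delay}s will throttle indexing")
    else:
        print(f"{agent}: looks fine (crawl-delay: {delay})")
```

If a check like this points to the host's default as the culprit, the fix is usually to upload your own robots.txt or adjust the host's setting so it overrides the restrictive default, which is exactly the override they quietly allow.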

I can only imagine how much business was lost because this company went to such an extreme to stop sites from getting too many requests. I suppose it's why you should check whether the companies you rely on are cutting corners to be more profitable, as it could literally cost you money in the long run.
