Matt Cutts, the head of Google’s webspam team, has put together another great video about one of the big SEO questions out there: cloaking.
This used to be a black hat SEO technique where, say, a porn site would serve Googlebot content about health advice while serving human visitors porn.
Naturally, this is something Google has tried to clamp down on in an effort to keep its index clean.
However, there are legitimate reasons why a website might serve different content to different users on page load. For example, a site might tailor content based on entrance details to improve conversion. The SDL Tridion CMS has a feature that serves content based on the user’s data, which means the content on a page can differ massively from user to user and from bot to bot. In this instance the site is doing a good thing, trying to help the customer and aid conversions, but at the same time it is technically showing Googlebot something different from what users see. Should it be punished for this?
Cloaking is a method used by black hat SEO practitioners to fool the search engines. In other words, the search engines are shown one version of a page while visitors to the site see a different one. The practice was more prevalent around ten years ago, when Google’s filters were not as adept at catching offenders. Typically, cloaking was used when a web page was not ranking all that high, so a separate page was developed purely for the bots to crawl.
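To make the mechanics concrete, here is a minimal sketch of how classic cloaking worked: the server inspected the User-Agent header of each request and switched pages accordingly. All the names and file paths below are hypothetical, and this is purely illustrative, not a recommendation.

```python
# Illustrative sketch of user-agent cloaking. The signature list and
# page names are hypothetical examples, not a real implementation.

BOT_SIGNATURES = ("googlebot", "bingbot", "slurp")

def select_page(user_agent: str) -> str:
    """Return which page variant a cloaking server would serve.

    Crawlers get a keyword-stuffed page built for ranking;
    human visitors get the real page.
    """
    ua = user_agent.lower()
    if any(sig in ua for sig in BOT_SIGNATURES):
        return "optimized-for-bots.html"
    return "real-page.html"
```

This is also why the technique is so easy to defeat: a search engine only has to fetch the page from an undisclosed IP address with an ordinary browser user agent and compare the result with what its crawler was shown.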
Fortunately, anyone who attempts cloaking today will usually get caught, and it is rarely worth the risk, considering that most pages can earn a ranking by targeting less competitive keywords instead. Google is well aware of the practice, so chancing a ban with this approach would be unconscionable, if not foolish.