Cloaking is when a website returns different content to search engines and other crawlers than it does to human visitors. The reasons why one might cloak a website are many, but the most common include:
- To improve search engine ranking
- To hide certain content from search engines
- To protect pages from competing search engine optimization (SEO) efforts
Cloaking can be done on a very small scale, such as adding text and keywords that improve SEO but aren't displayed to human visitors. But it can also mean serving an entirely different web page - perhaps one that is neither user friendly nor related to what the user was looking for.
How do you cloak your website?
Well, there are two methods here - IP address and User Agent - which can also be combined. When a crawler visits your website, its IP address can be recorded and used to determine which content should be delivered. This requires a dynamic website written in a scripting language such as ASP or PHP.
It also requires a list of the IP addresses belonging to the crawlers you want to deliver different content to.
Such lists can be found here..
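As a minimal sketch of the IP-based approach, the snippet below checks a visitor's IP against a list of crawler address ranges and picks which version of the page to serve. The IP ranges here are placeholders invented for illustration - a real implementation would use the ranges published by each search engine:

```python
import ipaddress

# Hypothetical crawler IP ranges (CIDR notation) - placeholders only.
# A real list must come from the search engines' own documentation.
CRAWLER_NETWORKS = [
    ipaddress.ip_network("66.249.64.0/19"),   # example "Googlebot" range
    ipaddress.ip_network("157.55.39.0/24"),   # example "Bingbot" range
]

def is_crawler_ip(remote_addr: str) -> bool:
    """Return True if the visitor's IP falls inside a known crawler range."""
    try:
        ip = ipaddress.ip_address(remote_addr)
    except ValueError:
        return False  # malformed address: treat as a normal visitor
    return any(ip in net for net in CRAWLER_NETWORKS)

def choose_content(remote_addr: str) -> str:
    """Serve a keyword-rich page to crawlers, the normal page to humans."""
    if is_crawler_ip(remote_addr):
        return "<html><!-- keyword-rich version for crawlers --></html>"
    return "<html><!-- regular version for human visitors --></html>"
```

In a real web application the `remote_addr` value would come from the request object of whatever framework you use.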
Most serious crawlers also identify themselves via the HTTP User-Agent header when visiting. It usually contains keywords such as 'Crawler', 'Bot', 'Robot' or 'Spider', along with information about which search engine they belong to.
However, a crawler may disguise this information, so it can be risky to rely on it alone.
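The User-Agent check can be sketched as a simple keyword match on the header value. This is only a heuristic, for the reason just stated - a crawler can send any User-Agent string it likes:

```python
import re

# Keywords that commonly appear in crawler User-Agent strings.
BOT_PATTERN = re.compile(r"crawler|bot|robot|spider", re.IGNORECASE)

def is_crawler_user_agent(user_agent: str) -> bool:
    """Heuristic: does the User-Agent header look like a crawler?

    Unreliable on its own - a crawler that hides its identity will
    send a browser-like User-Agent and pass this check as a human.
    """
    return bool(BOT_PATTERN.search(user_agent or ""))
```

Combining this check with the IP-based one catches honest crawlers on either signal, but still cannot detect a crawler that disguises both.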
So why should you not cloak your website?
Firstly, it's unethical. Search engines rely on the information their crawlers gather being accurate, so that they can deliver the best possible search results to users - including you.
But what discourages most people is that it is considered 'Black hat' - an unethical SEO practice that is frowned upon and can even get your website banned from search engines. It might work at first, but the risk is high that you will eventually be detected and removed from all search listings. You do not want that.