Contents
- What is cloaking?
- How does cloaking work?
- Search engines' attitude towards cloaking
- Cloaking in advertising networks
- Types of cloaking
- How is cloaking set up?
- Services for creating cloaking
What is cloaking?
Cloaking is a technique for substituting the content a visitor sees on a website. The term comes from the English verb "to cloak," meaning to hide or to mask. With cloaking, ordinary users and search bots see different information at the same URL. The technique is used to route traffic to different pages depending on the visitor's characteristics.
For example, if an advertiser plans to promote adult products through social networks and moderators keep rejecting the ads, they may resort to cloaking. Moderators who click the link see one, compliant page, while targeted users land on a page with the prohibited content.
How does cloaking work?
Implementing cloaking begins with preparing several versions of the content. Specifically, for the promoted page the webmaster must create:
- White page — a page whose content fully complies with the requirements of search engines or advertising networks. It features quality text and keywords for SEO promotion. This is the page search bots and moderators see.
- Landing page of the offer — the page regular users see. It most often promotes prohibited products or fraudulent schemes.
After the pages are ready, traffic segmentation is set up. The webmaster either specifies the display conditions manually in the website settings or configures filters in a connected service. Visitors identified as regular users are shown the landing page; bots are shown the white page.
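The segmentation step above can be illustrated with a minimal sketch. The page names and bot signatures here are assumptions for the example, not any real service's API; real setups use far larger signature databases:

```python
# Minimal sketch of cloaking-style traffic segmentation by User-Agent.
# BOT_SIGNATURES and the file names are illustrative assumptions.
BOT_SIGNATURES = ("googlebot", "yandexbot", "bingbot")

def select_page(user_agent: str) -> str:
    """Return which version of the page a visitor would be shown."""
    ua = user_agent.lower()
    if any(sig in ua for sig in BOT_SIGNATURES):
        return "white_page.html"    # compliant content shown to crawlers/moderators
    return "landing_page.html"      # offer page shown to ordinary visitors
```

In practice this decision runs server-side before the response is rendered, so the two audiences never see each other's version of the page.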
Search engines' attitude towards cloaking
Search engines view cloaking as a method of manipulating search algorithms. Yandex's rules define cloaking as "a method of deceiving search engines in which users see one version of a page's content while robots see another." Some webmasters hope cloaking will improve their site's position in search results, but in fact they only expose the resource to the risk of having its pages excluded from the index.
Google also qualifies cloaking as spam, as it implies content masking. "Sites that violate our rules may drop lower in the search results or even disappear from them," warn company representatives. For using cloaking, search engines may lower the resource's rankings, and upon repeated violations, exclude the site from search results.
Cloaking in advertising networks
Cloaking lets advertisers formally satisfy an ad network's content requirements while actually promoting products from prohibited verticals: moderators see a white page that fully meets the established norms. Legitimate advertising networks, like search engines, oppose such illegal promotion methods.
Cloaking is prohibited in traffic arbitrage within CPA networks. If detected, a user's account is blocked without the possibility of recovery, regardless of how long ago the violation occurred.
Types of cloaking
There are two main types of cloaking:
- Gray Hat Cloaking — this method is often used to increase traffic or promote a page in search results, for example, by showing search bots an SEO page with excessive optimization. Although gray cloaking may involve demonstrating quality content to users, it still violates search engine rules.
- Black Hat Cloaking — this option is completely illegal. While search engines see relatively acceptable SEO content, users often encounter advertisements for prohibited substances or fraudulent offers.
How is cloaking set up?
Setting up cloaking involves separating regular users from search bots. This can be done in various ways, the most popular of which are:
- User-Agent — a string the client sends with each request; the web server uses it to identify the visitor's browser, operating system, and device. In cloaking, the User-Agent is checked to determine who is visiting the site.
- IP address — a unique identifier of a device on the network. IP checks are considered effective in cloaking because an address is harder to falsify than a User-Agent string. Search bots are identified by their IP addresses, and the appropriate version of the page is served accordingly.
- Combined cloaking — combines checks of User-Agent and IP addresses, making it a more labor-intensive but effective method.
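The combined check can be sketched as follows. It mirrors the bot-verification procedure search engines publicly document (reverse DNS lookup plus forward confirmation); the bot names and domain suffixes are illustrative examples, not an exhaustive list:

```python
import socket

# Sketch of a combined User-Agent + IP check. BOT_NAMES and BOT_DOMAINS
# are example values, not a complete database.
BOT_NAMES = ("googlebot", "yandexbot")
BOT_DOMAINS = (".googlebot.com", ".google.com", ".yandex.ru", ".yandex.net", ".yandex.com")

def is_search_bot(user_agent: str, ip: str) -> bool:
    # Step 1: cheap User-Agent check (easy to forge, so not sufficient alone).
    if not any(name in user_agent.lower() for name in BOT_NAMES):
        return False
    try:
        # Step 2: reverse-DNS lookup; a genuine crawler's hostname
        # belongs to the search engine's domain.
        host = socket.gethostbyaddr(ip)[0]
        if not host.endswith(BOT_DOMAINS):
            return False
        # Step 3: forward-confirm the hostname resolves back to the same IP,
        # which defeats spoofed PTR records.
        return ip in socket.gethostbyname_ex(host)[2]
    except OSError:
        return False
```

The DNS round trip is what makes the combined method more labor-intensive than a plain User-Agent check, which is why results are usually cached per IP.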
Services for creating cloaking
Cloaking services allow arbitrage work without trackers. Traffic segmentation is performed by JS scripts installed on the landing page. The services maintain their own databases of "undesirable" addresses, redirecting bots to the white page and regular users to the landing page.
Such services work as follows: after registering, the webmaster downloads a generated script and installs it on the site; the script redirects bot traffic to the optimized white page. Unlike trackers, these services collect and process their bot-address databases themselves.
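The redirect logic such a script implements can be reduced to a database lookup. This is a hypothetical sketch in Python for readability; real services ship an obfuscated client-side JS snippet, and the IP set below is illustrative, not a real database:

```python
# Hypothetical sketch of service-side redirect logic driven by a collected
# database of bot addresses. KNOWN_BOT_IPS is a stand-in for the large,
# frequently updated databases such services maintain.
KNOWN_BOT_IPS = {"66.249.66.1", "77.88.5.1"}  # example crawler addresses

def destination(visitor_ip: str) -> str:
    """Decide where a visitor is redirected based on the database lookup."""
    if visitor_ip in KNOWN_BOT_IPS:
        return "/white-page"    # bots and moderators get the compliant page
    return "/landing-page"      # everyone else gets the offer page
```

The key difference from a tracker is exactly this lookup table: the service builds and refreshes the address database itself rather than relying on rules the webmaster writes.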