One of the success factors of a web project is how high its pages appear in search engine results. Google constantly improves its search algorithms, trying to offer internet users the highest-quality results possible. One stage of this evolution, which appeared in early 2011, was named "Panda", and that is what I am going to talk about.
Causes: SEO before and after Panda
“Panda" would not have been implemented if there had not been specific problems. It appeared quite high - in 2010 SEO specialists and web developers started blaming Google louder that promotional and junky pages stuffed with keywords started appearing in the first search results page again and again. In other words, the user types in a Google search for the desired keyword, sometimes it was possible to immediately look at the bottom of the page first, or even go to the second page. With that Internet users did not want to put up, and the search algorithm started to improve.
What about the SEO principles that had worked for many years before? They were very simple: fill your site with plenty of links, lay down the right keywords, and you could expect a place at the top of Google's results page. Over time the algorithm became more complex and the number of ranking factors grew, but the essential thing did not change: the software analyzed the subject of a site's content, not its quality. Panda introduced additional factors that pushed "poor quality" sites lower in the results, namely those that copy or duplicate content, or that serve only as collections of links, and so on. At the same time, high-quality websites got better positions, which was a victory both for the owners of those websites and for internet users.
How does it work?
First, let me say that Panda is not a new algorithm; it is simply a set of additional important factors for evaluating content and web design quality. Another very important fact is that Panda assigns a rating not just to individual pages but to the site as a whole: it analyzes each individual page, and if many of them are found to contain "poor quality" content, the whole site gets a lower rating. A similar principle operates in the good old PageRank, although Google's algorithm now pays much less attention to the latter.
So, Google analyzes whether the content is clear and high-quality. The question is how a computer program can do this, if people should be the ones to judge? The answer is simple: people do decide. After all, when your page appears in the search results and a user clicks on your link, it is possible to analyze how much time that user spends on your site before returning to the search results, if they return at all. Google can also study user clicks on the results page: how many links they try, in what order, and how many of the available options they consider. In a word, it is free to conduct a complete behavioral analysis, and that is what Google does. It then assigns a low Panda rating to sites that consumers leave quickly, without staying long.
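The behavioral signal described above can be sketched in code. The function, field names, and thresholds below are purely illustrative assumptions, not Google's actual implementation; the idea is simply to classify each click-through by how long the visitor stayed before bouncing back to the results page.

```python
# Hypothetical sketch of a dwell-time signal: classify each search
# click-through by how long the visitor stayed on the site and
# whether they returned to the results page. Thresholds are
# illustrative assumptions, not real ranking parameters.

def classify_dwell(seconds_on_site, returned_to_results):
    """Label a click-through by visitor behavior."""
    if not returned_to_results:
        return "satisfied"      # never came back to the results page
    if seconds_on_site < 10:
        return "pogo-stick"     # bounced back almost immediately
    if seconds_on_site < 60:
        return "short visit"
    return "long visit"

# (seconds on site, returned to results page?)
sessions = [(5, True), (45, True), (300, False), (8, True)]
labels = [classify_dwell(t, back) for t, back in sessions]
print(labels)  # ['pogo-stick', 'short visit', 'satisfied', 'pogo-stick']
```

A site whose sessions are dominated by the "pogo-stick" label would, by this logic, be a candidate for a lower quality rating.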
By the way, you can very easily see how your pages are doing in this regard. There is an interesting indicator called the "bounce rate", which describes the percentage of visitors who arrive at your site, view only a single page, and leave without following any links. If the percentage is high, it is cause for concern.
And then, of course, there is content copied or duplicated from another site. This factor affected results in the past as well, but with the appearance of Panda it became one of the main indicators for which your site may be punished.