Panda Mania: A Look Back on This Search Engine Algorithm Update History
Since 2011, Google has been rolling out regular updates to its search engine algorithm under the Google Panda name. The update was designed to target low-quality sites with thin content, and Google has been steadily refining it so that it catches spam without dinging legitimate sites. The first rollout affected nearly 12% of search queries, a huge number in terms of search engine updates.
Here is a brief history of the algorithm updates and what you need to do to stay ahead of them.
2011: Google Panda’s first updates
When Google Panda first hit, it affected thin-content sites without much unique material, sites carrying lots of ads relative to their content, and sites with other clearly defined quality issues. Most webmasters who lost rankings had been using shady practices or neglecting quality. A big difference from previous updates was that Panda penalized the entire site, not just the offending page. In April 2011, an update extended Panda to all English-language queries worldwide, which understandably had a huge effect on the web. Later, in August, it was extended to most non-English queries as well (Chinese, Japanese, and Korean queries weren’t yet affected).
2011: Non-Panda updates affected queries
Panda wasn’t the only project to affect queries. 2011 also saw algorithm tweaks outside the Panda umbrella change search rankings, including minor updates nicknamed “flux” and the now-infamous encryption of search queries, which began returning “(not provided)” as the keyword in analytics software for some organic traffic. One crucial change was the November freshness update, which favored recent news and freshly updated content across all sites.
2012: Further Panda updates became frequent
Many Panda updates were rolled into “30-packs” and similar batches of algorithm tweaks. These included quality detection for landing pages, penalties for sites with too much ad space “above the fold” (the first screen a visitor sees on arriving at a site), and more. Anchor text began to be weighted differently, presumably to curb spammy backlinking practices, and Penguin arrived in 2012. Penguin targeted sites using keyword stuffing and similar webspam tactics, while Panda continued to receive data refreshes and tweaks.
2013: Panda becomes fully integrated into the algorithm
Now that a few years have passed, the signs are that Google will fully integrate Panda into its core algorithm. Penguin isn’t quite there yet, but Panda seems well tested and is refreshed roughly every ten days. In 2013, penalties appear to have softened as most webmasters have begun to comply with the new quality rules.
Stay on top by staying ahead of the news
Web hosting comparison sites can provide up-to-date information on web hosting, usability, and other factors. Make sure you’re reading industry news sites and paying attention to what Google says on its official blog. Staying ahead means paying attention to every warning and looking critically at your site to make sure it meets quality requirements.
Google Panda has devastated some webmasters but made others very happy. Its original aim of targeting low-quality and thin content sites seems to be working for the most part, and webmasters are learning to stay ahead by keeping their content high-quality and fresh.