SEOs regularly discover and study various Google penalties and filtering mechanisms. Last year, for example, there was the penalty that dropped sites 30 positions in the rankings, followed later by the 950 penalty, and most recently, the sixth position penalty.
Neither the 950 penalty nor the sixth position penalty prompted me to write a post because I've become somewhat numb to these kinds of penalties. It's natural that Google and other search engines continuously adjust their algorithms and introduce new systems for evaluating website quality. Therefore, it's normal to occasionally discover these so-called penalties, and I didn't feel the need to mention them on my blog.
By the way, many so-called penalties aren’t necessarily punishments; they may simply reflect changes in search engine algorithms reducing the effectiveness of certain problematic SEO techniques, causing websites to return to their appropriate ranking positions.
This time, the sixth position penalty is interesting not because of the penalty itself, but because Matt Cutts recently confirmed its existence, which prompted some thoughts.
On December 26, 2007, Tedster, the moderator of the Google section on WebmasterWorld, summarized Google's ranking changes that December and noticed a pattern he called the "Google sixth position penalty" or "position 6 penalty": keywords that had previously ranked first were consistently dropped to sixth position. In other words, this penalty could also be described as a drop of five positions.
Many webmasters reported this phenomenon. Tedster summarized the following points:
The penalized sites were all older sites with long histories.
The affected keywords had long held the top position (first place), with a few holding second place, but all dropped to a fixed sixth position.
The penalized keywords were usually the site's most important, highest-converting ones. However, other keywords on the same pages kept their top rankings, so the penalty seemed to target specific keywords rather than entire domains or all keywords universally.
Analyzing the common features among the penalized pages suggested that the issue wasn't the on-page optimization itself. Attention instead shifted to site-wide or off-site factors such as backlinks. One suspicion was that the distribution of external links wasn't natural or uniform, or that the external link profile hadn't changed in a long time.
Some webmasters confirmed these findings while others disagreed, leading to no definitive conclusion.
After the WebmasterWorld thread was reported on SERoundtable, it garnered significant attention. However, Matt Cutts commented that, to his knowledge, he hadn't seen anything that would cause such a penalty.
On January 16, many webmasters reported that the penalty had disappeared from their sites and rankings had returned to normal. Around the same time, SEO Book also reported that its affected site had recovered its rankings. Aaron Wall initially attributed the recovery to corrections he had made to the site, but it now appears that wasn't the reason.
On January 29, Matt Cutts confirmed the existence of the "Google sixth position penalty" in a comment on Sphinn. He said:
"When Barry asked me about the sixth position penalty at the end of December, I didn’t see anything that would cause that effect. But about a week later, I noticed something that might lead to this phenomenon. We are in the process of making adjustments, and some data centers have already shown the updated algorithm, while others will follow in the coming weeks."
Clearly, Google was adjusting something, and this so-called penalty was merely a side effect of those adjustments. Often, the ranking changes caused by such penalty filters aren't intentional actions by Google; some algorithmic side effects may not even be fully understood by Matt Cutts himself.
What’s certain is that search engines are constantly tweaking their algorithms, and fluctuations in rankings are normal. When webmasters notice these changes, the first thing they should do is stay calm, avoid panicking, and refrain from rushing to modify their sites.
Sometimes Google experiments with algorithms too aggressively, only to dial them back after a while. Observing these overly aggressive adjustments can hint at what Google may focus on in the near future. For instance, there was once a brief period when directory-style sites surged in the rankings before Google quickly dialed the change back; that episode helped webmasters appreciate the value of linking out to other sites and becoming a hub.
Another example is the Florida update: the algorithm was eventually dialed back, but it highlighted the risks of over-optimization.
In short, it’s best to keep your website as natural as possible without overthinking things. Focus on creating good content and minimize the use of potentially controversial techniques. If something doesn’t have negative side effects today, it might tomorrow. Also, don’t worry too much about fluctuations in rankings.
This article was originally published on: Shanghai SEO http://www.seo-sh.cn/zhishi/google/254.html