May 17th, 2013 by Erin Meyer
In a May 13 blog post, Google’s Matt Cutts posted a video on what web developers should expect in the next few months in terms of updates to the way Google ranks sites. A lot of these changes will have an impact on webmasters and the SEO community.
Some of the changes are a continuation of updates (Penguin, Panda) and others focus on specific issues that Google is working to address.
The highlights of Cutts’ comments appear below.
“We’re relatively close to deploying the next generation of Penguin”
Cutts confirms that an update of Penguin is coming, and explains that Penguin’s focus is on trying to identify and address blackhat webspam. Penguin 2.0, as the Google team identifies it, is expected to go deeper and be more comprehensive than the previous iteration.
“We’ve also been looking at advertorials”
Google will be looking more closely at advertorials. For sites that take money and link out in ways that pass PageRank, Google will be strengthening enforcement against advertorials that violate its quality guidelines. There should be clear disclosure so that users realize when something is paid rather than organic.
Going to “areas that have traditionally been a little more spammy”
Responding to feedback from users with complaints about certain search results, Google will also be focusing on some types of queries that tend to produce a lot of spam. Without getting into much detail, Cutts explains they have two different changes that will attempt to manage those types of queries in different ways.
“We’re working on a completely different system that does more sophisticated link analysis”
Google will employ new strategies for dealing with all types of link spam. According to Cutts, Google is looking at ways to go “upstream” to deny value to link spammers. Google will be rolling out several new ideas for achieving this in the coming months, and also working on evaluating data for a more sophisticated link analysis system in the future.
“We will also continue to work on hacked sites”
Google intends to do a better job of detecting hacked sites. In the coming months, they plan to launch a site detection strategy that is more comprehensive. Along with this, Google will work to provide better support and communication with webmasters. The goal is to provide more comprehensive informational resources in Webmaster Tools, to help webmasters clean up hacked sites.
“We’re doing a better job of detecting when someone is more of an authority”
Cutts says that Google will help regular webmasters by doing a better job of detecting when a site is an authority on a specific subject. If the algorithms determine that a site’s content is authoritative on a topic, it should rank more highly for relevant queries.
“We’ve also been looking at Panda”
Google has also been working to refine Panda, looking for additional signals that will soften the impact on “border zone” websites that fall into a more gray area. Cutts says they will be looking for quality indicators that will, to some degree, help sites that have previously been affected by Panda.
Less likely to see a “cluster of several results all from one domain”
In response to complaints about too many results from the same domain on some queries, Google has worked to push these clusters of results further down. Cutts explains that they are also developing a change whereby, once a user has seen a particular cluster of results from one site, they will probably not see additional results from that site.
In summary, Cutts says that Google will roll out these enhancements and continue working to provide more information to webmasters. Throughout his message, he emphasizes that plans are always in flux and subject to change. He assures webmasters that, “as long as you’re working hard for users, we’re working hard to show your high quality content to users.”