Google doesn’t usually pre-announce big changes to its services, especially regarding its search algorithm, but Matt Cutts gave us a heads-up on some upcoming changes during SXSW.
There are a few things to note, and none of them are overly surprising:
- Google will continue to emphasize quality content, and will soon place even more weight on it
- Aggressively over-optimized pages will be affected
Read the following quote, which Matt Cutts gave during his SXSW interview:
What about the people optimizing really hard and doing a lot of SEO. We don’t normally pre-announce changes but there is something we are working in the last few months and hope to release it in the next months or few weeks. We are trying to level the playing field a bit. All those people doing, for lack of a better word, over optimization or overly SEO – versus those making great content and great site. We are trying to make GoogleBot smarter, make our relevance better, and we are also looking for those who abuse it, like too many keywords on a page, or exchange way too many links or go well beyond what you normally expect. We have several engineers on my team working on this right now.
(Quote and recording courtesy Search Engine Roundtable)
For a look at what some of the top thought leaders in the SEO industry are thinking about this, head over to the comment thread at Inbound.org on the submission of the article covering Matt’s SXSW interview.
A couple of highlights of the comments:
Tad Chef: We all need keyword density checkers again, to find out whether our keyword density is too high.
Dr. Peter J. Meyers: Seriously, can’t believe this fell under the radar. Little coverage of “Venice”, too. Are these monthly algo announcements by Google making us all lazy (except Barry)?
Ross Hudgens: Haven’t they always been hammering over-optimization? Maybe they’re turning it up a notch. But definitely will be checking the density and internal anchor text.
Willy Franzen: From the language he used, it sounds like they’re going to algorithmically penalize sites for over-optimization instead of taking away the benefits of over-optimization. That seems odd to me. Why not just make those practices less effective? If spammy stuff didn’t work, there wouldn’t be a need to “level the playing field.”
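Since a couple of the comments above half-jokingly suggest dusting off keyword density checkers, here is a minimal sketch of what such a check looks like. The function name and the example page are my own illustrations; Google has never published a density threshold, so treat the output as a diagnostic number, not a target.

```python
import re
from collections import Counter

def keyword_density(text: str, keyword: str) -> float:
    """Return a single keyword's share of total words, as a percentage.

    This is the classic (and crude) density metric the commenters mention;
    it ignores phrases, stemming, and markup.
    """
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    count = Counter(words)[keyword.lower()]
    return 100.0 * count / len(words)

# Hypothetical page copy, just to show the calculation.
page = "SEO tips: great content beats SEO tricks. Write content for people."
print(round(keyword_density(page, "content"), 1))  # prints 18.2
```

Running a metric like this across your pages makes it easy to spot outliers where a single phrase dominates the copy, which is exactly the kind of pattern the quote above suggests Google is targeting.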
I think the key takeaway here is that if you are using less-than-favorable methods to optimize your content and pages, then shake in your boots, because Google is about to slap you a little harder, post-Panda.
If, on the other hand, you are focused on truly quality content, however Google and Matt define it (it would be great to hear some insight into this, by the way), and you are still optimizing (it would be dumb not to do it at all) while keeping things relevant, logical, and within best practices, don’t stress too much. Just don’t let it fly under your radar. Stay on top of it.