Back in April, Google released a major update to its search algorithms under the project name “Penguin”. The update didn’t cause the same level of chaos and panic that followed the 2011 Panda update, but it still took a big bite out of the search rankings of a number of (previously) highly ranked websites.
The Panda update (February 2011) hammered some very profitable websites known as “content farms” because it devalued or penalized duplicate content and content the algorithm judged “low quality”. Now that the dust has settled from the Penguin rollout, it appears this update focused heavily on the quality of a site’s incoming links and on the “spamminess” of its content.
Websites that suffered the most from Penguin appear to have depended heavily on incoming links from paid directories, link farms and other link exchanges with questionable relevance to the site’s content. Sites were also marked down in the SERPs for what is euphemistically termed “over-optimization”, or keyword stuffing in plainer English.
Google’s developer notes and blog posts about Penguin generally point to the goal of awarding higher search rankings to websites with a more “natural” structure; that is, websites that don’t appear to have been artificially enhanced simply for search engine appeal.
Google’s search algorithms have always penalized incoming links from what the search engine considers “bad neighborhoods”, and Penguin appears to have considerably expanded the definition of “bad neighborhood”. The update also takes a closer look at the anchor text used for incoming links: anchor text that is too repetitive or unnaturally keyword-dense can now work against a good search ranking.
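If you want to audit your own backlink profile for this kind of risk, a rough first pass is simply to measure how lopsided the anchor-text distribution is. The Python sketch below is a minimal illustration, not Google’s method: it assumes you have already exported a list of anchor texts from whatever backlink tool you use, and the 30% flagging threshold is an arbitrary assumption, since Google publishes no such number.

```python
from collections import Counter

def anchor_text_report(anchors, threshold=0.30):
    # Count normalized anchor texts; most_common() surfaces the
    # heaviest hitters first.
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    for anchor, n in counts.most_common(5):
        share = n / total
        # The 30% cutoff is an illustrative assumption, not a
        # figure Google has ever published.
        flag = "  <- suspiciously repetitive" if share > threshold else ""
        print(f"{anchor!r}: {n} links ({share:.0%}){flag}")

# Hypothetical backlink export: one exact-match keyword dominates.
anchor_text_report([
    "cheap widgets", "cheap widgets", "cheap widgets",
    "Acme Widget Co", "click here", "cheap widgets",
])
```

A profile where one exact-match phrase accounts for most of the anchors looks bought rather than earned; a natural profile tends to be a messy mix of brand names, bare URLs and phrases like “click here”.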
The same approach applies when Penguin examines a website’s content. Keyword densities that are too high and keyword phrases that repeat too often can cause a site to plummet in the SERPs. The exact parameters the search algorithms use are unknown, of course, which causes a lot of angst about how much is “too much” when it comes to keywords. It’s not exactly trial and error, but helping a site recover from a Penguin “bite” takes some experimentation and patience.
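There’s no published formula, but you can at least measure where a page stands before you start experimenting. The sketch below shows one common way to compute keyword density (the share of a page’s words consumed by a given phrase); the tokenizing regex and the sample text are assumptions for demonstration, and any notion of a “safe” ceiling is guesswork.

```python
import re

def keyword_density(text, phrase):
    """Return the percentage of words in `text` taken up by
    occurrences of `phrase`. Google does not publish the
    thresholds Penguin uses; this only measures, it doesn't judge."""
    words = re.findall(r"[a-z']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    # Slide a window across the page and count exact phrase matches.
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    return 100 * hits * n / max(len(words), 1)

sample = ("Buy cheap widgets today. Our cheap widgets beat "
          "every other cheap widgets store on cheap widgets.")
print(f"{keyword_density(sample, 'cheap widgets'):.1f}% of words")
# Prints 50.0% -- the sort of stuffed copy Penguin exists to punish.
```

Run it over your own pages and compare against copy written purely for readers; the stuffed version usually announces itself.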
Google’s long-term goal is to provide users with search results focused on websites built to deliver information to people, rather than sites designed to appeal primarily to search engines. Coming to grips with the concept of a “natural” website structure can be difficult, but if you keep in mind that your site’s primary audience is people and build your content accordingly, you’ll be at least halfway to fending off the killer Penguin.