The Google Penguin Update has graduated. It’s no longer an “update.” Now it’s an official “ranking factor in the core algorithm.”
Penguin tobogganed onto the scene in April 2012, taking many webmasters and SEOs by surprise.
The Penguin Update was designed to attack sites with spammy links. You might get slapped if your site had backlinks that:
- Were from spammy sites (e.g. adult sites, pills, gambling)
- Over-used keywords as the anchor text (come on, how many people really linked to you of their own free will using the keywords “best inexpensive italian restaurant birmingham”?)
- Linked to you from every page of a site (e.g. footer links, sidebar links)
The old Penguin Update (1.0-3.x) had a site-wide effect. Even if only some of your pages were affected by the spammy link signs mentioned above, Penguin would swallow your site whole like a beak-smacking piece of krill.
Your site would plummet in the search engine results, and you would have to find all those spammy links, get them removed or disavowed, and wait patiently, hopefully, for the Penguin to appear again.
The old Penguin Update only swam up once every few months (or years) to see what had been done and dole out prizes (ranking increases to anyone who had fixed their link issues) and slaps (ranking decreases to anyone who had built a link profile that stank like rotten fish).
But Penguin has graduated. It’s no longer an independent update, run at Google’s whim. It’s now part of Google’s core algorithm, which means its “opinion” of your pages is refreshed every time Google recrawls them (just like changes to your title tags or your content). And that Penguin “opinion” will be one of the factors Google weighs when deciding how to rank your page for any given keyword search.
What does that mean for you?
- If you get slapped by the Penguin, you don’t need to bite your fingernails for months, waiting for a reprieve. As soon as you clean up your act, and Google recrawls your site, the Penguin shall smile upon you again.
- On the other hand, if you’re naughty, and you’re building 100 links a month with anchor text like “top-rated real estate law firm in Dallas,” the Penguin is going to find you pretty fast.
What else does the new and improved Penguin do?
Where the old Penguin demoted sites based on spammy links, the new Penguin doesn’t directly demote the site, but instead devalues those spammy links. The links may not count at all, or they may be relegated to a much more minor factor in the ranking decision for that page and/or keyword search.
For example, all your “top-rated real estate law firm in Dallas” links will suddenly not count for anything. Your ranking for that term/target page may drop, if you were relying on those links to rank, but it’s not because Google has signaled your site for doom. You just need to work harder on quality links and other ranking factors (like you would to rank any page).
What does that mean for you?
- You don’t have to be obsessed with the possibility of spammy links pointing to your site. If you didn’t want them, and you’re not relying on them, then just get on with your job of optimizing your site with quality content and links. Ignore them, because that’s likely what Google is going to do.
(One exception here is if you have an exceptionally large amount of spammy links pointing to your site. The risk here is more one of being slapped by a manual “Unnatural Links Penalty” than of Penguin problems, but either way: put in due diligence and clean up/disavow as necessary.)
- You should always keep an eye on your backlink profile to make sure nothing funny is happening. Little blips aren’t anything to be concerned about. If your eyebrows shoot up, however, so will Google’s, so take care of it.
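One lightweight way to keep that eye on your backlink profile is to summarize the anchor-text distribution from whatever CSV export your backlink tool provides. The sketch below assumes a hypothetical export with `source_url` and `anchor_text` columns (actual column names vary by tool); a profile dominated by one keyword-rich anchor is the kind of blip that should raise eyebrows.

```python
# Sketch: summarize anchor-text distribution from a backlink export.
# Assumes a CSV with "source_url" and "anchor_text" columns -- the exact
# column names depend on your backlink tool, so adjust as needed.
import csv
import io
from collections import Counter

def anchor_distribution(csv_text, top_n=5):
    """Return the top_n anchor texts with their count and percentage share."""
    reader = csv.DictReader(io.StringIO(csv_text))
    counts = Counter(row["anchor_text"].strip().lower() for row in reader)
    total = sum(counts.values())
    return [(anchor, count, round(count / total * 100, 1))
            for anchor, count in counts.most_common(top_n)]

# Hypothetical export: a suspiciously uniform, keyword-rich profile.
sample = """source_url,anchor_text
http://blog-a.example/post1,best inexpensive italian restaurant birmingham
http://blog-b.example/post2,best inexpensive italian restaurant birmingham
http://blog-c.example/post3,Luigi's Trattoria
http://blog-d.example/post4,best inexpensive italian restaurant birmingham
"""

for anchor, count, pct in anchor_distribution(sample):
    print(f"{pct:5.1f}%  ({count})  {anchor}")
```

If one exact-match anchor accounts for the lion’s share of your links (75% in this toy data), that’s worth investigating, whether or not Penguin ever notices.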
One more trick our Penguin has learned:
The old Penguin would demote the entire site, even if only some pages were affected by the spammy links. The new Penguin has learned to be more discerning. He can single out pages or categories of pages as targets for link devaluation, while leaving other pages untouched.
What does that mean for you?
- If you notice a drop in rankings, and you suspect it may be Penguin (spammy link)-related, see if you notice a pattern in the types of pages the drop is coming from. If you can identify a common thread among the pages, or among the suspected keywords leading to those pages, you’ll have a better idea of what pages/keywords/anchor text might be causing your issue.
- You can be more relaxed and assertive with your link building campaigns. This does NOT mean you can be happy-go-lucky with link spam again and all will be well. It DOES mean that you don’t need to panic if your quality link-building campaign is a huge success and you get many links (some with targeted anchor text) in a short period of time. If that trips a Penguin wire, the worst that will happen is that your hard-earned links won’t count.
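Looking for that common thread among dropped pages can be done by hand, but a quick script helps when the list is long. This is only an illustrative sketch: it assumes you’ve exported a list of (URL, ranking change) pairs from your rank-tracking tool, and it buckets the losers by their first URL path segment so a pattern (e.g. every drop sitting under one section of the site) stands out.

```python
# Sketch: group ranking-drop pages by their top-level URL path segment to
# spot a pattern. The (url, ranking_change) pairs are hypothetical data
# you would export from your own rank-tracking or analytics tool.
from collections import defaultdict
from urllib.parse import urlparse

def drops_by_section(pages):
    """Bucket pages with a negative ranking change by first path segment."""
    sections = defaultdict(list)
    for url, change in pages:
        if change < 0:
            path = urlparse(url).path.strip("/")
            section = path.split("/")[0] if path else "(root)"
            sections[section].append((url, change))
    return dict(sections)

pages = [
    ("https://example.com/blog/pasta-guide", +2),
    ("https://example.com/locations/dallas", -14),
    ("https://example.com/locations/austin", -11),
    ("https://example.com/menu", -1),
]

for section, hits in drops_by_section(pages).items():
    print(f"/{section}: {len(hits)} page(s) dropped")
```

Here the `/locations/` pages took the big hits, which would point you toward the links and anchor text aimed at those city pages.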
Why didn’t Google just do Penguin this way in the first place, you ask? Why update so infrequently, and demote the entire site?
Alex Miller of Posirank starts us out on the answer:
“Google is evidently VERY confident of their ability to detect high quality and low quality links.
“Well, if they weren’t, they absolutely would not release an update like this, because they know by doing so, SEOs will become a lot more aggressive in their link building ‘overnight.’
“SEOs will throw a lot more at a site just to see if it will boost. Because, as long as they aren’t super aggressive, there seems to be very low risk here.
“This of course could expose big holes in their algorithm, unless they were very confident of their new technology.
“This is most likely why they moved away from this model for so many years.
“And – it’s likely why it took TWO years for them to update Penguin and bake it into their algorithm. They had to be sure.”
Rand Fishkin of Moz completes the answer with his theory that it was those infrequent updates and site demotions that gave Google the information they needed to confidently identify spammy links:
“I believe that Google specifically went through this process in order to collect a tremendous amount of information on sketchy links and bad links through the disavow file process.
“Once they had a ginormous database of what sketchy and spammy bad links looked like, that they knew webmasters had manually reviewed and had submitted through the disavowal file and thought could harm their sites and were paid for or just links that were not editorially acquired, they could then machine learn against that giant database.
“Once they’ve acquired enough disavowals, great. Everything else is gravy. But they needed to get that huge sample set. They needed it not to just be things that they, Google, could identify, but things that all of us distributed across the hundreds of millions of websites on the planet could identify. Using those disavowal files, Google can now make Penguin more real-time.”
It’s happened. Google is confident. And if you’re a well-intentioned webmaster, you can also be more confident. Go west and build quality links, young man (or woman).
For your convenience, here’s a concise chart of the differences between old Penguin and new Penguin – and what impact it has on you.