

Website penalties gone: Google upgrades to Penguin 4.0

Google isn’t messing around when it comes to spam cluttering its search engine. Penguin 4.0 is a perfect example.

Google has just made a major announcement regarding its core algorithm. In a post on the Official Google Webmaster Central Blog, the search engine giant published information about the incorporation of Penguin 4.0 into its core algorithm.

“After a period of development and testing, we are now rolling out an update to the Penguin algorithm in all languages,” said Google’s Gary Illyes from the Google Search Ranking Team in a relatively short post.

According to Illyes, the new changes are among the most highly requested from webmasters: making Penguin real time and more granular.

What does this mean for webmasters and developers? How should the SEO world adapt to these changes? In order to get a better idea, let’s dissect the exact changes being made.

A Little Background

Penguin, first announced in April 2012, is a Google algorithm update that attempts to stop search engine spam. The goal is to prevent the use of spamdexing techniques – also known as black-hat SEO – that manipulate search engine results.

Some of the illegitimate techniques used include link spamming, keyword stuffing, and the use of hidden or invisible text on web pages. Penguin would devalue pages that used these techniques by lowering their position on the search engine results page.

Previously, these techniques were widely used to achieve a highly ranked website. After Penguin was released, webmasters were forced to audit every link and page on their website so they wouldn’t be affected by Penguin. If a website was penalized for containing some of these elements, it couldn’t recover until Penguin was refreshed.

These refreshes were inconsistent and irregular. A list of Penguin updates, published by Barry Schwartz at the Search Engine Roundtable, shows that the most recent Penguin update took two years to implement. “The last update in 2014 – Penguin 3.0 – may have only affected less than 1% of US/UK searches, but that ultimately translated to 12 billion queries,” asserts Christopher Ratcliff from Search Engine Watch.

The new updates announced by Google are meant to change the way Penguin reacts with the core algorithm and the way it influences and/or penalizes pages on the web.

Penguin is Now Real-Time

The first major change to the algorithm is that Penguin 4.0 is now in real time. “With this change, Penguin’s data is refreshed in real time, so changes will be visible much faster, typically taking effect shortly after we recrawl and reindex a page,” Illyes said in the post.

Previously, Penguin was refreshed on an unknown and inconsistent schedule. “Once a webmaster considerably improved their site and its presence on the internet, many of Google’s algorithms would take that into consideration very fast, but others, like Penguin, needed to be refreshed,” Illyes said.

For webmasters, this is extremely important. Now that this algorithm element is real-time, website data is refreshed almost immediately, every time a page is recrawled and reindexed by Google. This means that changes made to influence SEO will be reflected much more quickly in Google’s search results pages.

Penguin is Now More Granular

Previously, penalties handed out by Google could affect an entire web domain, meaning that one small problem could drag down an entire website’s SEO. This was a huge problem for webmasters and developers, who had to pay attention to every small detail included in the website.

“Penguin now devalues spam by adjusting ranking based on spam signals, rather than affecting ranking of the whole site,” said Illyes.

Essentially, the new changes seem to point to the fact that Penguin will stop penalizing entire websites for flawed content. Instead, websites will be impacted on a page-by-page basis, giving webmasters the opportunity to improve individual pages.

Recommendations Moving Forward

Penguin is now among the 200-plus signals that Google uses in its core algorithm to rank pages and determine how they display on the search engine results page. Despite the fact that Google is “…not going to comment on future refreshes” of Penguin, it’s safe to assume that it will be updated more frequently.

There is still a lot that is unclear about the Penguin update, including the use of disavow files. Gary Illyes still recommends using disavow files to recover from issues related to Penguin according to a Twitter post. When asked “Can you address whether disavow files are still a useful link pruning tool under Penguin 4.0?”, he responded, “we haven’t changed our recommendations for the disavow tool with this launch.”
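For context, a disavow file is a plain UTF-8 text file uploaded through Google Search Console’s disavow links tool, with one entry per line. A minimal sketch of the format follows; the domains and URLs shown are placeholders, not real spam sources:

```text
# Spammy pages identified during a link audit
# Lines starting with "#" are comments and are ignored
http://spam.example.com/bad-links.html
http://spam.example.com/paid-directory.html

# "domain:" disavows every link from the entire domain
domain:shadyseo.example.net
```

A single URL line disavows only links from that exact page, while the `domain:` prefix covers all links from the domain, which is usually the safer choice for sites that are clearly spam networks.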

Yet, in a Facebook exchange between Illyes and Barry Schwartz, Illyes did clarify that “manual actions are still there, so if we see that someone is systematically trying to spam, the manual actions team might take a harsher action against the site.”

“The web has significantly changed over the years, but as we said in our original post, webmasters should be free to focus on creating amazing, compelling websites,” Illyes concluded in the post.

Overall, these changes will undoubtedly affect the SEO realm as experts move quickly to adapt to the adjustments and implement them on their websites.


Founder and CEO of Online Performance, a company specializing in SEO services, and Founder and UK CEO of Exactive Marketing, an advertising agency focused on digital marketing. He is an expert in search engine optimization (SEO) with years of experience in web and mobile marketing.
