Search Engine Optimization August 19th, 2013
In February 2011, Google rolled out Panda, a new algorithm aimed at moving high-quality websites up in the search results. After this update, low-quality pages affected not only their own rank but also that of the entire section or website they were part of. Then another big change took place when Google introduced Penguin, a new algorithm update, on April 24th, 2012. While Panda’s task was to promote high-quality content in Google’s ranking, with more focus on UX, Penguin was targeted specifically at fighting spamdexing.
It’s worth mentioning that Google implements changes, refreshes and adjustments to its algorithms constantly, and obviously not every one is significant enough to get its own name. In April 2012 alone, when Penguin was first rolled out, over 50 other changes were made to Google’s algorithm.
Penguin 2.0 is a Google webspam algorithm update, announced by Google’s head of webspam, Matt Cutts, and rolled out on the 22nd of May 2013. In a video uploaded in early May 2013, Cutts announced the approaching launch of Penguin 2.0 and described it as follows: “We’re relatively close to deploying the next generation of Penguin. Internally we call it ‘Penguin 2.0,’ and again, Penguin is a webspam change that’s dedicated to try to find black hat webspam, and try to target and address that. So this one is a little more comprehensive than Penguin 1.0, and we expect it to go a little bit deeper, and have a little bit more of an impact than the original version of Penguin.”
To understand Penguin 2.0, we need to take a look at the spamming techniques it is meant to detect. Black hat webspam, also referred to as spamdexing or black hat SEO, means influencing how search engines index a website in a way that is often against the engines’ guidelines. In other words, these are all the attempts to “trick” a search engine into giving a website a higher rank. The black hat versus white hat terms derive from western films, where the color of the hat distinguished the protagonist from the villain. There are multiple black hat tactics and techniques; let’s look at the most common ones.
Keyword Stuffing. This means repeating a keyword so many times that it stops adding value to the content and makes the text difficult to read, or even preposterous. The intention is that Google will detect the keywords and hopefully find them relevant.
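To see why stuffing is mechanically detectable, consider keyword density, the share of a page’s words taken up by one keyword. The sketch below is purely illustrative (the sample texts and any threshold you’d apply are assumptions, not Google’s actual criteria):

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Fraction of words in `text` that match `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# A stuffed passage repeats the keyword far beyond natural usage.
stuffed = ("cheap shoes cheap shoes buy cheap shoes best cheap shoes "
           "cheap shoes online cheap shoes store")
natural = "We sell comfortable, affordable shoes for every occasion."

print(round(keyword_density(stuffed, "cheap"), 2))  # unnaturally high share
print(round(keyword_density(natural, "cheap"), 2))  # near zero
```

A density anywhere near the stuffed example’s reads as spam to both users and algorithms; natural writing rarely pushes a single keyword above a few percent.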
Hidden text or small text. This ranges from putting tiny, illegible phrases at the bottom of the page to making text or images the same color as the background.
Cloaking. This refers to showing the search engine different content than what is displayed to the actual user.
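A minimal sketch of how cloaking works, shown only to explain the technique Penguin targets: the server inspects the visitor’s user-agent string and serves crawlers a different page than humans. The crawler names are real; the pages and bot list are made up for illustration:

```python
# Substrings commonly found in search engine crawler user-agent strings.
BOT_SIGNATURES = ("googlebot", "bingbot")

def serve_page(user_agent: str) -> str:
    """Return different HTML depending on who appears to be asking."""
    if any(bot in user_agent.lower() for bot in BOT_SIGNATURES):
        # Keyword-stuffed page shown only to search engine crawlers.
        return "<html>cheap shoes cheap shoes best cheap shoes</html>"
    # Entirely different page shown to human visitors.
    return "<html>Welcome! Browse our latest shoe collection.</html>"

print(serve_page("Mozilla/5.0 (compatible; Googlebot/2.1)"))
print(serve_page("Mozilla/5.0 (Windows NT 10.0) Firefox/21.0"))
```

Because the crawler never sees what users see, cloaking directly violates Google’s guidelines, which is exactly the mismatch webspam algorithms look for.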
Doorway pages. These are web pages with senseless content, mostly stuffed with keywords, that offer no value to the viewer. Their goal is to achieve a high search rank and automatically redirect the user to the intended landing page.
Link farming. This means trying to improve a website’s Google rank by placing backlinks to it on low-quality pages that serve no purpose other than link building.
Comment spamming. This means automatically posting links to a website in blogs’ comment sections, in order to gain more inbound links.
Sybil Attack. This is the name for creating multiple low-quality content web pages that link to each other e.g. link farms or spam blogs. Sometimes, it may also refer to spamming forums, social media, and blogs by impersonating various users to leave links and ads in comments.
Other spamdexing tactics include, among others, mirror sites, page swapping, and hiding affiliate links. It’s worth mentioning that while Google Panda was targeted mostly at on-page spamdexing tactics, Penguin focuses on off-page black hat SEO, and spammy link building in particular.
Nowadays, aside from being unethical and risky, these tactics have become almost useless. After Penguin and Penguin 2.0, black hat tactics are mostly ineffective and may result in Google penalties, harming the website’s SEO. Spamdexing offers little, if any, advantage, and only in the short term. Improving your website’s Google ranking in an instant does seem tempting; however, it may not pay off in the long run. With black hat SEO you risk decreasing your Google position, or even getting your website de-indexed.
The essence of Penguin 2.0 is that it penalizes unwanted SEO tactics that go against Google’s guidelines. But what exactly does Penguin 2.0 penalize?
Backlinks from low-quality sites which contain many outbound links
High ratio of keyword-rich backlinks
High percentage of inbound links coming from comment sections
Backlinks from websites with irrelevant, unrelated content
No diversity within backlinks e.g. all of them coming from similar source types (directories, comments and such)
Any other violation of Google Search Quality Guidelines
Penguin 2.0 increased the significance of inbound links for SEO. Not only can they improve a website’s position in the SERP; in certain cases they can worsen it. Since you don’t have control over backlinks to your website, this creates an opportunity for abuse. Your competitors may attempt to negatively influence your ranking by employing third-party websites, such as link farms, to create multiple low-quality or spammy backlinks to your website. To counteract this practice, Google launched a tool to disavow links, which means that “you can ask Google not to take certain links into account when assessing your site.”

This does not, however, spare webmasters and SEO professionals from monitoring all the backlinks to their websites and making an effort to take the harmful ones down. Firstly, Google itself asks users to treat disavowing links as a “last resort,” after they’ve done everything to take the bad links down. Google also emphasizes that it won’t be able to “clean up” all the bad inbound links, and encourages webmasters to take matters into their own hands. Secondly, even once backlinks are disavowed, it may take weeks before the request is processed and takes effect. This may mean many lost opportunities: potential traffic, prospects and actual clients who never found your offer because of the negative SEO.
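The disavow tool itself takes a plain-text file: one URL per line, a `domain:` prefix to disavow an entire domain, and lines starting with `#` treated as comments. A minimal sketch of building such a file from an audit’s findings (the spammy domains and URLs here are made up):

```python
def build_disavow_file(bad_urls, bad_domains, comment=""):
    """Build the text of a Google disavow file: one URL or
    'domain:example.com' entry per line; '#' lines are comments."""
    lines = []
    if comment:
        lines.append("# " + comment)
    lines.extend("domain:" + d for d in sorted(bad_domains))
    lines.extend(sorted(bad_urls))
    return "\n".join(lines) + "\n"

# Hypothetical spammy backlinks flagged during a link audit.
disavow_text = build_disavow_file(
    bad_urls=["http://spam-directory.example/links.html"],
    bad_domains=["linkfarm.example"],
    comment="Link audit; removal requested, no response from webmaster",
)
print(disavow_text)
```

The resulting file is uploaded through the disavow links tool in Google Webmaster Tools; documenting your removal attempts in the comment lines matches Google’s “last resort” guidance quoted above.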
After the introduction of Penguin 2.0, it has become even more essential to monitor all inbound links to your website. Moreover, you should be able to distinguish the high-quality ones that bring you actual traffic and conversions from harmful spam. There are tools to help you with constant backlink monitoring and analysis, making it faster, easier and more insightful. With these tools you can measure and rate the quality of inbound links, the traffic they generate and how it translates into conversions on your site. You can also get e-mail notifications of all new referrals, so you can stay up to date and react immediately in case of negative SEO. This allows you to optimise your online marketing efforts and helps you avoid penalties from Penguin 2.0.
Google stated that 2.3% of US/English searches were affected by the Penguin 2.0 update to a degree that a regular viewer might notice. However, there are plenty of long-term effects that many websites have to face. A survey conducted in May 2013, one year after the first Penguin update, showed that the majority of respondents who claimed they were affected had not yet recovered.
Time will tell what the effect of the 2.0 update will be; however, some deficiencies in the new algorithm are already visible. For instance, www.econsultancy.com published an analysis comparing the ranking performance of many domains on May 21st and May 24th, 2013, in order to assess the immediate results of the update. It revealed that many of the websites that gained at least a 20% increase in volume directly after the update were spammy, low-quality ones. At the same time, some large, legitimate websites (e.g. www.icelolly.com) suffered direct damage.
It is still too soon to fully assess the effects of Penguin 2.0 and analyze examples of successful recoveries. However, Marie Haynes recently described an interesting case study of a website that used to hold the first position in Google organic search for several targeted keywords. The company was worried about losing its position due to the forthcoming Penguin 2.0 and turned to Marie for a link audit, with the goal of preventing potential damage from the update. As a result, all the backlinks from low-quality sources with keyword-rich anchor text were detected and disavowed. Interestingly, this did not trigger any loss in Google ranking. Nevertheless, on the day Penguin 2.0 was rolled out (the 22nd of May 2013), the website lost its organic rankings for its main keywords. The audit didn’t manage to prevent the damage.
It is difficult to avoid being affected by Penguin 2.0, and instant recovery is unlikely, because earning valuable links, as well as cleaning up the bad ones, takes time. Recovery is an ongoing process that involves long-term activities rather than one-time actions: creating in-depth content, building long-term relationships, and gaining trust and authority through real value and expertise. These are the elements of a long-term SEO strategy that cannot be developed and executed overnight.
It should come as no surprise that these types of changes mostly affect small and medium-sized businesses rather than corporations, because of SMBs’ limited resources. Nevertheless, there are certain opportunities that businesses can take advantage of in order to stay competitive in the face of Penguin. Here are 10 simple guidelines for effective SEO after the Penguin 2.0 update:
1. Prefer earned media over paid media. Penguin 2.0 makes content quality even more important. Providing meaningful and in-depth content is the key.
2. Be careful with advertorials. If you are paying for content placement, e.g. a sponsored blog post, you have to provide a disclaimer or use only nofollow links.
3. Leverage Google Author Rank. You can start building your authority by assigning your websites and your content (articles, publications, posts) to your Google+ account. By sharing it and connecting with other Google+ members you contribute to your Author Rank which will ultimately influence your website’s Google rank.
4. Attract natural links. The way to achieve this is by providing valuable content, attracting users and building relationships with other authorities and quality websites.
5. Avoid black hat tactics and clean up the results of past spamdexing, if you have a record of such.
6. Make sure to monitor inbound links to your website. Do it on a regular basis in order to detect and fight potential negative SEO.
7. Pay attention to your anchor text. Make sure it’s diverse and not overloaded with keywords. If you notice many identical anchor texts, don’t hesitate to remove or disavow the links.
8. If you haven’t done it yet, consider guest posting as a means of high-quality, organic link building. Apart from securing valuable inbound links, it allows you to build relations with other platforms and websites, influences your Author Rank, and provides additional reach and traffic. It’s an effective way of strengthening your SEO in the post-Penguin reality.
9. Maintain your activity in social media, including but not limited to Google+. Social signals are gaining importance compared to other SEO factors.
10. Disavow only if necessary. If you identify spammy or harmful backlinks that you are unable to take down, make sure to submit them as soon as possible using the Google disavow links tool.
Last but not least, just do your thing. Don’t put too much emphasis on the constant changes in algorithms. If you provide superb, relevant content using white hat tactics, your website will stay largely immune to the negative effects of algorithm changes.
What Penguin 2.0 Means for Your Website in Practice