A little while ago, the major search engines announced a solution to the internal duplication issue called rel=canonical. Google has now gone a step further in fixing internal duplication within a website to keep its index clean. As part of that effort, Google has announced one more solution for the internal duplication issue, the Parameter Handling tool in Google Webmaster Tools. Here is the official announcement on the Google Webmaster Central Blog.
If you are not familiar with the duplication issue, you can find the basics in my earlier post "Duplicate Content Problems & Solutions For SEO."
Google has enabled the Parameter Handling tool in Webmaster Tools accounts. The tool gives webmasters more control over how Googlebot indexes their pages. Some content management systems produce different URLs for the same content depending on how a user navigates around the website.
Here is an example of how to use the Parameter Handling tool.
Primary URL: http://www.example.com/product.php?item=swedish-fish
If a user reached the same page via category “gummy-candy”, the CMS may produce the URL as: http://www.example.com/product.php?item=swedish-fish&category=gummy-candy
If a user reached the same page through a shopping cart page, the CMS may produce a URL such as: http://www.example.com/product.php?item=swedish-fish&trackingid=1234&sessionid=5678
The issue in the past was that search engines would treat these as three different pages even though the content is the same, which could lead to a duplicate-content penalty. With the Parameter Handling setting in GWT (Google Webmaster Tools), you can now suggest to Google's crawler that it ignore parameters such as category, trackingid, and sessionid in your URLs.
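To make the effect concrete, here is a minimal Python sketch (purely illustrative; the real suggestions are configured in the Webmaster Tools interface, not in code) that strips the ignorable parameters from the example URLs above, so all three variants collapse back to the primary URL:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Parameters that only reflect navigation or tracking, not content
# (the same names used in the example above).
IGNORED_PARAMS = {"category", "trackingid", "sessionid"}

def normalize(url):
    """Drop ignored query parameters so duplicate URLs collapse to one."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in IGNORED_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

urls = [
    "http://www.example.com/product.php?item=swedish-fish",
    "http://www.example.com/product.php?item=swedish-fish&category=gummy-candy",
    "http://www.example.com/product.php?item=swedish-fish&trackingid=1234&sessionid=5678",
]

for u in urls:
    print(normalize(u))
# All three print the primary URL:
# http://www.example.com/product.php?item=swedish-fish
```

This is roughly what you are telling Googlebot when you list those parameters in the tool: treat every variant as the same page.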
Below is an image that illustrates how to enter the parameters you want excluded from Google's index.
These settings are not reflected in the index immediately; they are treated as suggestions and taken into account after Google's review. There is no harm in using both rel=canonical and the Parameter Handling tool to fix duplication within a website. Rel=canonical works for all major search engines, whereas the Parameter Handling tool applies only to Google. You can choose whichever option works best for you, although using both together will deal with duplication most effectively.
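For completeness, the rel=canonical approach means each parameterised variant carries a link element in its head pointing back at the primary URL. Here is a small sketch of that tag, generated with a hypothetical helper function just to show the markup:

```python
# Hypothetical helper that emits the rel=canonical tag each duplicate
# variant should carry in its <head>, pointing at the primary URL.
PRIMARY_URL = "http://www.example.com/product.php?item=swedish-fish"

def canonical_tag(primary_url):
    return '<link rel="canonical" href="%s" />' % primary_url

print(canonical_tag(PRIMARY_URL))
# <link rel="canonical" href="http://www.example.com/product.php?item=swedish-fish" />
```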
Yet another tip to keep your SEO in good order!
@Tasting:
Thanks for reading.
Google Webmaster Tools reports only duplicate titles/meta descriptions, not duplicate pages. Give each page a unique title and meta description, and the warnings will disappear after the next crawl.
I added the canonical tag and the parameters to exclude in Google Webmaster Tools more than a year ago, and my homepage still shows up as duplicate in Webmaster Tools. Any idea why?
Nice to know about this tool. Thanks, Elan, for this great post.
Making Google ignore up to 15 URL parameters and keep the standard URL is a nice feature for avoiding duplicate content. Very informative post!
A great tool which will definitely be a relief to programmers optimizing websites and trying to get accurate reports. Thanks, Elan!
Guess this is going to be a definitive solution for handling duplicate content issues. Nice post.
Indeed a very informative post, Elan.
Really, this is a great tool that will help with optimizing a site. Webmasters should welcome this new Google feature with open arms.