Google never compromises on content quality. Websites that do not adhere to the search engine giant's quality guidelines face the wrath of Google's pet monsters, Penguin and Panda. These algorithmic updates ensure that sites with poor-quality content are penalized with lower rankings. Panda mainly penalizes websites that lack original content, in other words, sites with duplicate content.
For years, duplicate content has been a hot topic in the SEO industry, and Matt Cutts recently cleared the air about the kind of duplicate content that Google will penalize. To Google, duplicate content "refers to substantive blocks of content within or across domains that either completely matches other content or is appreciably similar". There are two types of duplicate content: that which is not deceptive in origin, and that which is purely intentional and attempts to manipulate search rankings and attract more traffic by devious means.
Duplicate content of the non-malicious type includes discussion forums that can create both regular and stripped-down pages targeted at mobile devices, store items displayed or linked through multiple distinct URLs, and printer-only versions of web pages.
The good news is that, though he does not elaborate on the matter, Matt Cutts states that Google will ignore duplicate content unless it is spammy or stuffed with keywords.
On a Webmaster Tools help page, Google offers tips for dealing with duplicate content issues: use 301 redirects, keep your internal linking consistent, use top-level domains to handle country-specific content, syndicate your content on other sites carefully, minimize boilerplate repetition, avoid publishing stubs, understand how your content management system displays content, and minimize similar content.
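As an illustration of the first tip, a 301 (permanent) redirect can consolidate two versions of the same site onto a single canonical address. This is a minimal sketch for an Apache server using an .htaccess file; example.com is a placeholder domain, and other servers (such as nginx or IIS) use different syntax:

```apache
# Sketch: permanently redirect www.example.com to example.com
# so search engines index only one version of each page.
# Requires mod_rewrite to be enabled; example.com is a placeholder.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]
```

Where a redirect is not practical, such as a printer-only version of a page, a rel="canonical" link element in the page's head can instead tell Google which URL to treat as the original.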
Following these tips helps ensure that your visitors see the content you intend them to see. Google rarely imposes penalties for duplicate content issues, but it is clear that it aims to give users distinct information.
The best way to avoid duplicate and poor-quality content issues is to outsource your content development to a professional SEO company. This will ensure content that is informative, fresh, and search engine friendly.