The hoopla over duplicate content has been going on for quite some time now, and I see it as just another money-making scheme by online entrepreneurs chasing the Golden Goose. Almost every day, my inbox is inundated with yet another “article converter” that is guaranteed to send my private label rights articles to the top of the search engines with no fear of the Google Police knocking at my PR door, screaming “Duplicate Content!”
I ofttimes wonder how many of the so-called gurus take the time to really read the Google AdSense Program Policies. And I wonder, many times during my working day, just how many people open their wallets and let fly their hard-earned dollars to these people.
Here are Google’s exact words, and I quote: “Do not create multiple pages, subdomains, or domains with substantially duplicate content.” What does this really tell us? Does it tell us that the PLR sites that sell thousands of copies of the same articles to people who don’t have time – or are too lazy – to write their own content are breaking Google’s rules? Hardly. Google is telling us that we cannot create what used to be called “mirror sites” – a mirror site contains the same information found on another site; if abc.com is identical to def.com, it may be disqualified from the search engine listings – in an attempt to boost PageRank and increase AdSense income.
Many opinions abound on the forums and elsewhere on the web about duplicate content, and many netrepreneurs have taken advantage of the misinterpretation of Google’s policies to capitalize on it. Because Google has made this the era of content, everyone involved in the online communities is scrambling for the right answers. I see threads three to five pages long on the more popular forums, with people agonizing over their fear of duplicate content. What a field day for the gurus! I wonder how many thousands – perhaps millions – of dollars have been made by people playing on this fear factor?
Let’s examine the facts. If there really were a duplicate content filter, then many news websites that publish AP or Reuters stories would be banned from the search engines. Many catalogue sites would go under, because they sell the same products using the same promotional copy as other sites. Affiliate sites would be banned from the search engines, because their owners use the promotional materials provided by the merchants. And even the giant eBay would go under, because anyone who has spent time there sees a ton of identical listings using the same description, the same images, and the same user ID. I wonder how Copyscape.com would handle that?
And what about the sites that put articles and ezines into archives? That content ends up being displayed both on static pages and in the archives. Penalized for duplicate content, when the website owner simply wants his articles available to the general public? I doubt it…
Common sense is the order of the day. If you take the time to provide original, unique content, your site is well optimized for the search engines, and you have relevant backlinks, then your site will do well with no fear of penalty.
Don’t use article scrapers, which mirror the exact content of other sites and are nothing more than a rip-off. If you buy PLR articles, rewrite them in your own unique voice. If your budget allows, hire a ghostwriter to create articles pertinent to your particular niche. And most of all, just use plain common sense!
Visit http://www.for-the-record.biz for more relevant articles on duplicate content, SEO, RSS, and blogging. Updated weekly…