Duplicate content is becoming an increasing issue for popular search engines. Without doubt it's something any self-respecting webmaster should be aware of, and with Google starting to focus more closely on the issue, for that reason alone they need to take note.
From a search engine's perspective, duplicated pages are a bit of a headache: it's difficult to know which page to serve up in search results when so many of them are the same. In order to compete, those behind search engine algorithms are working harder and harder to make results more relevant to the user. Offering the user a batch of very similar results can hardly be considered the best service; duplication is therefore now being targeted.
Duplication often happens without webmasters realising it. Organisations with mirror sites should be among the first to give the issue some thought. Multiple domains pointing to the same page is a well-used trick that'll now potentially get you into trouble. Webmasters should also consider content that may be repeated in PDF files, printer-friendly pages and the like. In instances such as these, good use of robots.txt can let you determine which pages get priority in search results rather than leaving it to chance.
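As a rough sketch, a robots.txt file can tell crawlers to skip the duplicate versions so only the primary HTML pages compete in search results. The directory names here are hypothetical examples, not paths from any particular site:

```
# Block crawlers from duplicate versions of pages
# (paths are illustrative - match them to your own site structure)
User-agent: *
Disallow: /print/
Disallow: /pdf/
```

The file must sit at the root of the domain (e.g. example.com/robots.txt) for crawlers to find it.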
Many webmasters now carry out article syndication, whereby content is freely offered to other sites in return for a link back. While this can quickly create several fresh inbound links, improving 'link popularity', the process does multiply the instances of the same content. Anyone regularly carrying out article syndication will notice how Google initially lists possibly hundreds of pages and then whittles them down over time to just a few. Some content, of course, is simply stolen. To help a little in this area it may be worth copyrighting material, especially if the content is effective at generating traffic and enquiries. Another area in which you can fall foul is recycling content: if a website contains several pages which share very similar paragraphs, it may be best to trim them down or combine pages.
As many search engines become more sensitive to the issue, it seems like just a matter of time before duplication has a real impact. While focus is often on Google, other search engines are beginning to turn the screw. Yahoo, for instance, now tries to reduce the frequency with which duplicate content is crawled. Another good reason why you may find your site slipping in search results.
This article is free to republish provided the resource information remains intact.
Paul Coupe is lead designer / developer with Zoom Online.
Zoom Online - Providing total online solutions.