How Do Search Engines Handle Republished Content?

On the Google Webmaster Central blog, the Webmaster Trends Analyst states explicitly that there is no duplicate content penalty. If Google finds a number of URLs that have the same content, they group them into a cluster and give prominence to the one that best represents the set.

 

So What Was the Purpose of the Farmer Update?
If there is no duplicate content penalty (http://googlewebmastercentral.blogspot.com/2008/09/demystifying-duplicate-content-penalty.html), what happened with the Farmer update that targeted scraper sites? Google does penalize scraper sites that steal content from others and republish it on their own blogs, presenting it as their own.

Unfortunately, while there were and are sites like these that feature user-generated content with plenty of quality articles, there were also users who simply looked around the web for articles they liked and stole them. The latter are referred to as scraper sites, and they were the ones that were targeted.

It becomes easier to scrape content when sites don’t have an editorial team that checks all submissions to determine whether the content is owned by the writer who submits it. As I wrote in another article, Google has developed author rank as one step toward addressing this issue.

How Do Sites Prevent Content Theft?

Multi-author sites prevent unauthorized use of original content in several ways. For example, authors are required to link back to any of their work that they republish and to provide a link from the original to the newer edition. They are also required to use the same name under which they originally published the piece.
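
As a rough sketch (the author name and URLs below are placeholders, not taken from any particular site), the link-back on a republished copy might look something like this:

    <!-- Attribution on the republished copy; the name and URL are placeholders. -->
    <p>
      Written by Jane Doe. This article was originally published at
      <a href="https://example.com/original-article">example.com/original-article</a>.
    </p>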

What About the Panda Update?

The Panda update was also designed to target scraper sites. Some people discuss the Panda update using “duplicate content” as a broad term. However, this algorithm change was designed to push scraper sites down in the rankings and to address webmasters who placed duplicate pages on their own sites.

Why would webmasters place the exact same content on multiple pages of their site? Apparently, it has something to do with manipulating the search engines. This is the type of malicious motive that the Panda update is supposed to address.

How Do You Determine Your Author Rank?
As far as I know, there is currently no way to measure your author rank. However, there are ways to build it. I have shared these in my article “Google Author Rank”.

What About All the Sites That Seem to Use Republished Content and Are Not Sinking in the Search Results?
Several sites use republished content and are not penalized. Sites like Honestreviewz.com, for example, do not sink in the search results. Still, it varies from site to site. It is important to note that sites which use republished content try to add value to whatever they reuse from another site. They may offer services or additional information. They also have the authors’ permission to use the articles. They don’t just copy content feeds from other sites without, as Google put it, “providing some sort of unique organization or benefit to the user”.

What Are the Benefits of Syndicating Your Content?

For many years, companies have syndicated, or republished, their content on different sites. Many writers do this as well. It increases their influence and also provides the following benefits:

  1. Increases their overall exposure
  2. Raises awareness of their brand and builds credibility
  3. Provides new opportunities for them in the media and elsewhere
  4. Helps them earn more when interested customers or clients read their work
  5. Generates more leads for a company

This is why I republish my own content. I sometimes republish an article once; this gives me more exposure and allows me to earn more money. I discussed this in more detail in another article. I don’t usually republish an article more than twice, and I am careful about where I republish.

Limiting the number of times I republish allows me to better monitor the use of my work and its performance. I don’t want to put my work on a site that has a bad reputation or where there are low standards.

Writers are not the only people who do this. You can also syndicate videos, artwork (think of all the Sunday comics you know), and more. Modern syndication providers include a link back to the original work. Would you like to see an article you have written published in not one online magazine, but two or three? This is what you do when you choose to republish an article you own on two or three sites. You self-syndicate.

Is There a Way to Indicate Which Copy of an Article Is Preferred?
Sometimes business owners or other publishers want to tell the search engines that one version of an article is preferred. This is a common issue, since they often place several versions of the same content on the same site for their own purposes.

Some publishers choose to use a canonical link. The canonical link tells search engines which page to prioritize among the duplicates on the same site. The thing is, with a canonical link in place, only the preferred page from that set is indexed in the search results. If that’s not what you want, don’t use a canonical link.
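
As a minimal sketch (the URL below is a placeholder, not a real address), the canonical link is a tag placed in the head of each duplicate page, pointing at the version you want indexed:

    <!-- Placed in the head of every duplicate or republished copy of the page. -->
    <!-- https://example.com/preferred-version is a placeholder for the preferred URL. -->
    <link rel="canonical" href="https://example.com/preferred-version" />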

It costs money to produce an article. You take the time to write it and possibly do research to support what you write. Say that it costs $20 for you to produce an article. How long do you think it will take before you can make that $20 back in terms of views? Generating more traffic by republishing or syndicating your work once or twice on quality sites gives you more opportunities to earn, so you can break even and make a profit from the hours you put in.

