Google Says Websites Shouldn’t Mark Republished Content for Index

Google Noindex Advice

Google (NASDAQ:GOOGL) recently suggested another step toward eliminating duplicate content across the web. And website owners, including small business owners, may want to pay attention, considering the search engine has a history of eventually penalizing sites that don't take heed.

Specifically, websites currently republishing content sourced from original authors are now encouraged to “noindex” that content. This Google noindex advice is something most content creators are unlikely to follow.

In the fight to rank on the first page of Google — specifically the top five search results — websites, including major news outlets, often republish popular articles. Applying noindex to all of these syndicated articles would solve one of Google's biggest headaches: duplicate content. But so far, the cost of doing so seems too high.

Overload of Duplicate Content

Currently, just about every major news source — The New York Times, The Wall Street Journal, The Washington Post, MSNBC, Fox News and others — is simply re-posting content without applying noindex. The content generally comes from syndicated news services like the Associated Press or Reuters.

Search any headline in Google and you will undoubtedly get thousands of sources with identical content, writes SEO expert Barry Schwartz of Search Engine Roundtable. Ironically, the top search results are often not the original source. Nevertheless, the majority of websites will continue this practice in pursuit of the high traffic it brings to their sites.

In a recent Twitter exchange on a related topic, Google webmaster trends analyst John Mueller suggested sites should not be marking such content for indexing by the search engine.

Marking Content Noindex Means No Traffic from Google

Noindexing is pretty much the opposite of what most websites want to do.

The noindex directive is an HTML meta tag value (it can also be sent as an X-Robots-Tag HTTP header) that tells search engines not to include a page in their results. It is applied in a page's markup or in the server configuration, and is normally used to keep private data, admin pages or database-generated files out of search.
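As a minimal sketch of how the directive looks in practice (the comments describe the standard usage; the page itself is hypothetical), it goes inside a page's head section:

```html
<!-- Inside the <head> of the republished page: tells all crawlers not to index it -->
<meta name="robots" content="noindex">

<!-- Google-specific variant, if you only want to block Google's crawler -->
<meta name="googlebot" content="noindex">
```

The same instruction can be sent for non-HTML files (PDFs, for example) as an HTTP response header: `X-Robots-Tag: noindex`. Note that a noindexed page can still be crawled and its links followed unless a nofollow directive is added as well.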

This of course is bad news for websites that make their living — at least in part — from republishing content. And small business website owners who try to make up for a lack of original content on their sites with republished articles from other sources should be concerned too.

This is not the first time that Google has pushed to remove excessive duplicate content from the web. It also seems likely that the search engine will eventually respond with an algorithm change that downgrades sites with too much of this content. In the meantime, however, the noindex step seems to be something Google is simply requesting of webmasters.

Google Wants a World Without Duplicate Content

Whether it’s practical at the moment or not, it’s clear that Google’s ideal is a web where only one copy of each piece of content is indexed for ranking on the search engine.

Although it may take some time for that to happen, websites and content creators would be wise to evolve their business models now.

In the future of the web, those with original content will not only rule — they’ll likely be the only ones left.

Copy Machine Photo via Shutterstock

Michael Guta is the Assistant Editor at Small Business Trends and currently manages its East African editorial team. Michael brings with him many years of content experience in the digital ecosystem covering a wide range of industries. He holds a B.S. in Information Communication Technology, with an emphasis in Technology Management.