My most recent hub was flagged as duplicate. It is not a duplicate hub, but I have published it elsewhere on the net. Is this why it was flagged? If so, can I get it unflagged? How badly does this affect my hubscore?
If it is elsewhere on the net, then it is a duplicate... rewrite it so that it is no longer worded the same... very simple...
"Duplicate" means either you copied and pasted (above a certain percentage of an article) or the article has been published somewhere else.
Duplicates will affect your search engine rankings and even the promotions (ads) on the article.
The way to deal with it is to reword the article. Alternatively, delete the hub and rewrite it from scratch. I've heard that hubs flagged as duplicate take a long time to get the flag removed.
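As a side note, the "certain percentage" idea above can be illustrated with a toy similarity check. This is only a sketch of how duplicate detectors commonly work, comparing overlapping word windows ("shingles") between two texts, and is not HubPages' actual checker; the threshold and window size are assumptions:

```python
# Toy illustration (not HubPages' actual detector): duplicate checkers
# commonly compare word "shingles" (overlapping n-word windows) between
# two texts and flag them when the overlap exceeds some threshold.

def shingles(text, n=3):
    """Return the set of overlapping n-word windows in `text`."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a, b, n=3):
    """Jaccard similarity of the two texts' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

original = "the quick brown fox jumps over the lazy dog near the river bank"
reworded = "a fast brown fox leaps over a sleepy dog close to the river bank"
copied   = "the quick brown fox jumps over the lazy dog near the river bank"

print(similarity(original, copied))    # identical text scores 1.0
print(similarity(original, reworded))  # rewording drops the score sharply
```

This is why the advice to "change the words" works: rewording breaks most of the shared windows, so the measured overlap falls below the flagging threshold.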
"Duplicate" has two meanings:
1. Another person illegally copying your work.
2. The same content posted in two places on the internet.
You'll find Google isn't highly impressed by either.
Google filters duplicate content; it doesn’t penalize it. Duplicates are still indexed, maintain PageRank, and pass link juice. Google doesn’t have a problem with duplicate content.
@Gold Money, HubPages has strict guidelines about duplicate content; it's best to keep everything here unique and original. A few dupes of your own work amongst a larger body of hubs won’t cause much harm, but if every hub is duplicated your hubscore will tank.
I disagree with this. Google did announce that they do not penalise duplicate content. However, in the same video they go on to say that they penalize spam, and among their descriptions of spam they include content that is largely duplicated. If a page appears spammy and is largely filled with duplicate content (rather than, say, quoting a poem), then I do think there is a danger of being ranked lower.
I have always seen unique content do better than duplicate content, and have had a site de-indexed when all it did was draw duplicate content from other sources.
While a single article may not be de-indexed for duplicate content, and if it is still indexed it will still pass link juice, I feel that it is an unnecessary risk.
That is however only my opinion, and I know Peter is doing well from his own marketing tactics so I would not discount them without researching the topic and forming your own opinion!
Duplicate content is said to essentially compete with itself, which is one reason it makes no sense to post it. I have some duplicate stuff (because it's stuff I think is "worthwhile writing" and didn't care about traffic with it). Some of my highest-scoring Hubs are duplicates and get lots of comments, which doesn't hurt them. I'm not advocating duplicates, by any means; obviously there's a reason they're frowned on. Still, I haven't found it has affected my scores on here (to whatever extent scores on here matter). (I have some duplicates with scores that reflect it - just not all of them. Then again, those that appear affected aren't the same quality of information or writing as the higher-scoring ones.)
I'm under the impression that there's something to being as "consistently inconsistent" as I am, when it comes to the kinds of stuff I write and the sites where I post it. (Nobody that "willy-nilly" and "odd" could possibly be a spammer. ) The stuff I have (duplicate or not) that doesn't do well traffic-wise is stuff that, even if I don't think it's bad writing, wouldn't attract traffic anyway. It's not like I have a "zillion" of them or anything, but I have duplicates that do quite well traffic-wise. (Maybe they'd do better if they weren't duplicates, of course; but it hasn't prevented them from getting traffic.) I still pretty much think it's all about not coming across as a spammer, overall.
I understand and appreciate your views on this Oli, however they are, I think, based on an assumption that duplicate content is spam.
Firstly if you are going to put duplicate content on your own site you are in danger of those pages being filtered out of the results. The page that will eventually "stick" is based on the authority of the site hosting the article, how well linked the article is and the IP address of the hosting site. So posting to a site that has been established longer and has greater authority could see the page on your own site being filtered.
Also, the filter is not a constant, the geographic location of the searcher and the IP address of the hosting site can come into play so that localized results can be delivered more efficiently. A page that is filtered from your results might show up in the results of someone searching on the same term from another country.
Everything on your money site should be original and unique, with duplicate content used only to gain links through article syndication. My experience is that a well-written, well-syndicated article will return much more benefit than a single unique posting, and it's easier than having to continually rewrite.
Duplicate content will be spotted and filtered, so don’t expect to see search results littered with copies of your articles, it just won’t happen. If it did search results would become unusable. So when syndicating content it has to be understood that much of what you are syndicating will never be seen. The goal is to get those articles indexed along with the links they contain.
This is where article spinning comes in and is touted as a way to defeat the duplicate content filters. The majority of spun content is detectable, even the stuff produced by so called "spinning experts". By using article spinning software you are deliberately setting out to dupe both search engines and readers. Now this is spam and Google is getting to grips with it and removing it.
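The claim that spun content is detectable can be illustrated with a toy example. The sketch below assumes the common `{option|option}` spintax format and is not modelled on any real spinning tool: because spinning swaps individual words while the sentence skeleton survives, two "spins" of the same source still share most of their word pairs, so the same overlap checks that catch copies catch spins:

```python
# A sketch of why spun content is detectable: "spinning" swaps single
# words for synonyms, but the sentence skeleton survives, so two spins
# of one source still share most of their word windows.
import random
import re

def spin(spintax, rng):
    """Expand {a|b|c} spintax, picking one option per group."""
    return re.sub(r"\{([^{}]*)\}",
                  lambda m: rng.choice(m.group(1).split("|")), spintax)

source = ("{Duplicate|Copied} content will be {spotted|found} and "
          "{filtered|removed}, so do not expect to see search results "
          "{littered|filled} with {copies|clones} of your articles")

rng = random.Random(1)
a, b = spin(source, rng), spin(source, rng)

def bigrams(text):
    """Set of overlapping two-word windows in `text`."""
    w = text.lower().split()
    return {" ".join(w[i:i + 2]) for i in range(len(w) - 1)}

shared = len(bigrams(a) & bigrams(b)) / len(bigrams(a) | bigrams(b))
print(round(shared, 2))  # the two "spins" still overlap heavily
```

Even if every spun group happens to differ between the two variants, the unchanged skeleton keeps the overlap well above what genuinely independent writing would produce.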
Would it be fair to say we could agree that Google does penalize duplicate content, in that it gives it a much less significant place in the rankings, but still allows backlinks to be indexed and counted, meaning duplicate content can still provide a benefit?
In which case it would be more beneficial to move on to whether the backlinks on duplicate content carry a similar amount of 'trust' to promote your own site when compared to purely unique content.
This is actually a very useful conversation to me, as I am writing a book on SEO right now, and would like to make it as comprehensive as possible.
I think it’s important to understand the difference between filters and penalties. Standard search results are filtered by default; host-crowding and dupe-content filters shield us from hundreds of overly similar results, giving us access to a wider selection of resources and making it easier to find the information we seek.
You can switch those filters off if you desire and look at the raw results, the filters as they are do a good job of making the information much more manageable and usable. Those pages that were filtered are not under any penalty; the most relevant pages are being shown. This could be viewed as host crowding on a grander scale.
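The host-crowding behaviour described above can be sketched in a few lines. This is a rough illustration of the assumed mechanism, not Google's actual code; the per-host limit of two is an assumption. The point is that results keep their rank order, and filtered pages are merely folded away rather than penalised, ready to reappear when the filter is lifted:

```python
# Rough sketch (assumed behaviour, not Google's code) of a host-crowding
# filter: results stay in rank order, only the first two pages per host
# are shown, and the rest are folded behind a "more results from this
# site" link. Folded pages keep their rank; nothing is penalised.
from collections import Counter

def host_crowd(results, per_host=2):
    """Split ranked (host, url) results into shown and folded lists."""
    seen = Counter()
    shown, folded = [], []
    for host, url in results:
        seen[host] += 1
        (shown if seen[host] <= per_host else folded).append((host, url))
    return shown, folded

ranked = [("a.com", "/1"), ("a.com", "/2"), ("b.com", "/1"),
          ("a.com", "/3"), ("a.com", "/4"), ("c.com", "/1")]
shown, folded = host_crowd(ranked)
print(shown)   # a.com keeps its top two slots; b.com and c.com show
print(folded)  # a.com's remaining pages are folded, not penalised
```

Lifting the filter is just concatenating the two lists back in rank order, which matches the description of 'see more results from this site' restoring the hidden pages unharmed.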
Leveraging duplicate content as a strategy to promote your money site isn’t about gaining rankings from the syndicated content; it’s about powering up the money pages. However, if you are trying to gain rankings from the syndicated articles themselves you are unlikely to have much success. This may look like many of the syndicated articles are being penalized, but at least one should escape the filter and, if it’s good enough, might even rank.
It’s a cleanup job in order to provide better results rather than a penalty. Pages that are filtered can be fixed but penalties are a whole different problem.
From my understanding, the filters that Google describes are those that turn results into 'see omitted results' or 'see more results from this site'.
However being penalized by google does not necessarily mean being removed from the search results, it normally means that your position is lowered in the SERPS.
Being penalized by Google is often a description of when something to do with that article trips a filter which lowers its position within the rankings.
However if an article remains in the rankings, but is lower in the rankings because of a penalty (maybe I should clarify this as duplicate content), will that lower the trust of a website it is placed on, and if so will that lower the quality of a backlink leading back to your own website?
Your understanding is correct; the filters can be lifted by clicking on 'see omitted results' or 'see more results from this site'. 'See omitted results' will remove the dupe filter and 'see more results from this site' will remove the host-crowding filter. The pages that are revealed were never under any penalty; rather, the more relevant or trusted pages were given preference.
I doubt a duplicate that remains in the unfiltered results will be penalised just because it’s a dupe; there is no such penalty as far as I understand.
Let me say though, this doesn’t mean you can syndicate crap across hundreds of low quality sites, you still require quality content hosted on pages that are indexed.
Perhaps another way to think about it is that unique content will always be given preference while duplicate content is subject to filtering. That does not allude to any penalty being applied. When you syndicate an article you will never know which one will be preferred. However, those that are filtered still pass benefits over to the page targeted in the links they contain. Just don't duplicate content from your main site or sites.