

  1. Gold Money posted 6 years ago

    My most recent hub was flagged as duplicate. It is not a duplicate hub, but I have published it elsewhere on the net. Is this why it was flagged? If so, can I get it unflagged? How badly does this affect my hubscore?


    1. LeanMan posted 6 years ago in reply to this

      If it is elsewhere on the net then it is a duplicate... rewrite it so that it is no longer worded the same... very simple.

    2. nicregi posted 6 years ago in reply to this

      Duplicate means either you have copied and pasted (above a certain % of an article) or the article has been published somewhere else.

      Duplicates will have an effect on search engine rankings and even on your promotions (ads) on the article.

      One way to get past it is to try changing the wording. Alternatively, delete the hub and rewrite it again as fresh content. I heard hubs that are flagged as duplicate need a lot of time to get the flag removed.
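
      To make the "above a certain %" idea concrete, here is a minimal sketch of a word-shingle overlap check in Python. The 5-word shingles and the 70% threshold are arbitrary assumptions for illustration; this is not HubPages' or any search engine's actual detection logic.

        # Rough sketch: estimate how much of one article overlaps another
        # using word shingles (short runs of consecutive words).
        # The shingle size and the 70% threshold are invented for illustration.

        def shingles(text, size=5):
            words = text.lower().split()
            return {tuple(words[i:i + size]) for i in range(len(words) - size + 1)}

        def overlap_percent(candidate, source, size=5):
            a, b = shingles(candidate, size), shingles(source, size)
            if not a:
                return 0.0
            return 100.0 * len(a & b) / len(a)

        hub = "an article about duplicate content and why it gets flagged " * 3
        republished = hub  # the same text published on another site
        print(f"overlap: {overlap_percent(republished, hub):.0f}%")  # 100%
        if overlap_percent(republished, hub) > 70:  # hypothetical threshold
            print("likely to be flagged as duplicate")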

  2. Peter Hoggan posted 6 years ago

    Duplicate threads are not that great an idea either.

  3. WryLilt posted 6 years ago

    "Duplicate" has two meanings:

    1. Another person illegally copying your work.
    2. The same content posted in two places on the internet.

    You'll find Google isn't highly impressed by either.

    1. Peter Hoggan posted 6 years ago

      Google filters duplicate content; it doesn't penalize it. Duplicates are still indexed, maintain PageRank, and pass link juice. Google doesn't have a problem with duplicate content.
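
      To make "pass link juice" concrete, here is a tiny, simplified PageRank sketch in Python showing that an indexed duplicate copy still contributes rank to the money page it links to. The link graph and damping factor are the standard textbook setup, invented for illustration; this is not Google's actual system.

        # Simplified PageRank: a syndicated duplicate that remains indexed
        # still passes rank through its link to the "money page".
        # Dangling pages simply leak rank here; good enough for a sketch.

        def pagerank(links, damping=0.85, iterations=50):
            pages = list(links)
            rank = {p: 1.0 / len(pages) for p in pages}
            for _ in range(iterations):
                new_rank = {p: (1 - damping) / len(pages) for p in pages}
                for page, outgoing in links.items():
                    if outgoing:
                        share = damping * rank[page] / len(outgoing)
                        for target in outgoing:
                            new_rank[target] += share
                rank = new_rank
            return rank

        links = {
            "money_page": [],
            "original_article": ["money_page"],
            "duplicate_copy": ["money_page"],  # indexed duplicate, links back
        }
        print(pagerank(links))  # money_page collects rank from both copies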

      @Gold Money, HubPages has strict guidelines about duplicate content; it's best to keep everything here unique and original. A few dupes of your own work amongst a larger body of hubs won't cause much harm. If every hub is duplicated, your hubscore will take a tanking.

      1. thisisoli posted 6 years ago in reply to this

        I disagree with this. Google did announce that they do not penalise duplicate content. However, in the same video they go on to say that they penalize spam, and duplicate content is one of the ways they describe spam content. If a page appears spammy and is largely filled with duplicate content (rather than, say, quoting a poem), then I do think there is a danger of being ranked lower.

        I have always seen unique content do better than duplicate content, and I have had a site de-indexed when all it did was draw duplicate content from other sources.

        While a single article may not be de-indexed for duplicate content, and while it will still pass link juice as long as it remains indexed, I feel that it is an unnecessary risk.

        That is, however, only my opinion, and I know Peter is doing well from his own marketing tactics, so I would not discount them without researching the topic and forming your own opinion!

    2. Lisa HW posted 6 years ago

      Duplicate content is said to essentially compete with itself, which is one reason it makes no sense to post it. I have some duplicate stuff (because it's stuff I think is "worthwhile writing" and I didn't care about traffic). Some of my highest scoring Hubs are duplicates and get lots of comments, which doesn't hurt them. I'm not advocating duplicates, by any means, because obviously there's a reason they're frowned on. Still, I haven't found it has affected my scores on here (to whatever extent scores on here matter). (I have some duplicates that I know have scores that reflect it - just not all of them. Then again, those that appear affected aren't the same quality of information or writing as the higher-scoring ones.)

      I'm under the impression that there's something to being as "consistently inconsistent" as I am, when it comes to the kinds of stuff I write and the sites where I post it.  (Nobody that "willy-nilly" and "odd" could possibly be a spammer.   lol  )  The stuff I have (duplicate or not) that doesn't do well traffic-wise is stuff that, even if I don't think it's bad writing, wouldn't attract traffic anyway.  It's not like I have a "zillion" of them or anything, but I have duplicates that do quite well traffic-wise.  (Maybe they'd do better if they weren't duplicates, of course; but it hasn't prevented them from getting traffic.)  I still pretty much think it's all about not coming across as a spammer, overall.

    3. Peter Hoggan posted 6 years ago

      I understand and appreciate your views on this, Oli; however, they are, I think, based on an assumption that duplicate content is spam.

      Firstly, if you are going to put duplicate content on your own site, you are in danger of those pages being filtered out of the results. Which page eventually "sticks" depends on the authority of the site hosting the article, how well linked the article is, and the IP address of the hosting site. So posting to a site that has been established longer and has greater authority could see the page on your own site being filtered.

      Also, the filter is not constant; the geographic location of the searcher and the IP address of the hosting site can come into play so that localized results can be delivered more efficiently. A page that is filtered from your results might show up in the results of someone searching on the same term from another country.
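
      As a toy illustration of that selection, the sketch below picks which copy of a duplicated article is shown and filters the rest, based on host authority, inbound links, and the searcher's location. The scoring weights and the geographic bonus are invented for illustration and are not how Google actually works.

        # Toy duplicate-content filter: among several copies of the same
        # article, show the one on the "strongest" host and filter the rest.
        # All weights and the geo bonus are made up for illustration.

        from dataclasses import dataclass

        @dataclass
        class Copy:
            url: str
            site_authority: float  # some 0-100 authority measure (assumed)
            inbound_links: int
            host_country: str

        def pick_visible_copy(copies, searcher_country):
            def score(c):
                geo_bonus = 10 if c.host_country == searcher_country else 0
                return c.site_authority + 0.5 * c.inbound_links + geo_bonus
            visible = max(copies, key=score)
            filtered = [c for c in copies if c is not visible]
            return visible, filtered

        copies = [
            Copy("http://my-new-site.example/article", 12, 3, "UK"),
            Copy("http://established-site.example/article", 78, 120, "US"),
        ]
        shown, hidden = pick_visible_copy(copies, searcher_country="UK")
        print("shown:", shown.url)                   # higher-authority host wins here
        print("filtered:", [c.url for c in hidden])  # the newer site gets filtered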

      Everything on your money site should be original and unique; duplicate content should be used to gain links through article syndication. My experience is that a well written, well syndicated article will return much more benefit than a single unique posting, and it's easier than having to continually rewrite.

      Duplicate content will be spotted and filtered, so don't expect to see search results littered with copies of your articles; it just won't happen. If it did, search results would become unusable. So when syndicating content it has to be understood that much of what you are syndicating will never be seen. The goal is to get those articles indexed along with the links they contain.

      This is where article spinning comes in; it is touted as a way to defeat the duplicate content filters. The majority of spun content is detectable, even the stuff produced by so-called "spinning experts". By using article spinning software you are deliberately setting out to dupe both search engines and readers. Now this is spam, and Google is getting to grips with it and removing it.

    4. thisisoli posted 6 years ago

      Would it be fair to say, as a point of agreement, that Google does penalize duplicate content in the sense that it gives it a much less significant place in the rankings, but still allows backlinks to be indexed and counted, meaning duplicate content can still provide a benefit?

      In which case it would be more beneficial to move on to whether or not the backlinks on the duplicate content carry a similar amount of 'trust' to promote your own site, when compared to purely unique content.

      This is actually a very useful conversation to me, as I am writing a book on SEO right now, and would like to make it as comprehensive as possible.

    5. Peter Hoggan posted 6 years ago

      I think it's important to understand the difference between filters and penalties. Standard search results are filtered by default; host crowding and dupe content filters shield us from hundreds of overly similar results, giving us access to a wider selection of resources and making it easier to find the information we seek.

      You can switch those filters off if you desire and look at the raw results, but the filters as they are do a good job of making the information much more manageable and usable. The pages that were filtered are not under any penalty; the most relevant pages are being shown. This could be viewed as host crowding on a grander scale.
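
      A small sketch of the host-crowding idea: cap how many results a single host can show and collapse the rest, the way 'see more results from this site' hides them. The cap of two per host and the example URLs are assumptions for illustration, not Google's implementation.

        # Illustrative host-crowding filter: at most two results per host
        # are shown; the rest are collapsed as "omitted" for that host.

        from urllib.parse import urlparse
        from collections import defaultdict

        def host_crowding_filter(ranked_urls, per_host=2):
            shown = []
            omitted = defaultdict(list)
            counts = defaultdict(int)
            for url in ranked_urls:              # assumed already ranked
                host = urlparse(url).netloc
                if counts[host] < per_host:
                    shown.append(url)
                    counts[host] += 1
                else:
                    omitted[host].append(url)    # behind "see more results..."
            return shown, dict(omitted)

        results = [
            "http://hubpages.example/article-1",
            "http://hubpages.example/article-2",
            "http://hubpages.example/article-3",
            "http://other-site.example/article",
        ]
        shown, omitted = host_crowding_filter(results)
        print(shown)    # two hubpages results plus the other site
        print(omitted)  # {'hubpages.example': ['http://hubpages.example/article-3']}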

      Leveraging duplicate content as a strategy to promote your money site isn't about gaining rankings from the syndicated content; it's about powering up the money pages. However, if you are trying to gain rankings from the syndicated articles themselves, you are unlikely to have much success. This may make it look like many of the syndicated articles are being penalized, but at least one should escape the filter and, if it's good enough, might even rank.

      It’s a cleanup job in order to provide better results rather than a penalty. Pages that are filtered can be fixed but penalties are a whole different problem.

      1. thisisoli posted 6 years ago in reply to this

        From my understanding, the filters that Google describes are those that turn results into 'see omitted' or 'see more results from this site'.

        However, being penalized by Google does not necessarily mean being removed from the search results; it normally means that your position is lowered in the SERPs.

        Being penalized by Google often describes a situation where something to do with that article trips a filter which lowers its position within the rankings.

        However, if an article remains in the rankings but is placed lower because of a penalty (maybe I should clarify this as duplicate content), will that lower the trust of the website it is placed on, and if so, will that lower the quality of a backlink leading back to your own website?

    6. Peter Hoggan posted 6 years ago

      Your understanding is correct; the filters can be lifted by clicking on 'see omitted' or 'see more results from this site'. 'See omitted' will remove the dupe filter and 'see more results from this site' will remove the host crowding filter. The pages that are revealed were never under any penalty; rather, the more relevant or trusted pages were given preference.

      I doubt that a duplicate which remains in the unfiltered results is likely to be penalised just because it's a dupe; there is no such penalty as far as I understand.

      Let me say, though, this doesn't mean you can syndicate crap across hundreds of low quality sites; you still require quality content hosted on pages that are indexed.

      Perhaps another way to think about it is that unique content will always be given preference while duplicate content is subject to filtering. That does not imply any penalty is being applied. When you syndicate an article you will never know which copy will be preferred. However, those that are filtered still pass benefits over to the pages targeted by the links they contain. Just don't duplicate content from your main site or sites.