Duplication

  1. Gold Money posted 13 years ago

    My most recent hub was flagged as duplicate. It is not a duplicate of another hub, but I have published it elsewhere on the net. Is this why it was flagged? If so, can I get it unflagged? How badly does this affect my HubScore?

    Thanks.

    1. LeanMan posted 13 years ago in reply to this

      If it is elsewhere on the net then it is a duplicate... rewrite it so that it is no longer worded the same... very simple.

    2. nicregi posted 13 years ago in reply to this

      Duplicate means either you have copied and pasted (above a certain % of an article) or the article has been published somewhere else.

      Duplicates will have an effect on search engines and even on your promotions (ads) on the article.

      One way to get past it is to change the wording. Alternatively, delete the hub and rewrite it from scratch. I've heard that hubs flagged as duplicate can take a long time to get the flag removed.
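
      To give a rough idea of what "above a certain %" could mean in practice, here is a minimal sketch (in Python) that estimates the word-level overlap between the version already published on the net and a rewritten draft. The file names and the 80% threshold are just illustrative assumptions; neither HubPages nor Google publishes the exact checks they run.

```python
from difflib import SequenceMatcher
from pathlib import Path

def similarity(text_a: str, text_b: str) -> float:
    """Return a 0-1 ratio of matching word sequences between two texts."""
    return SequenceMatcher(None, text_a.split(), text_b.split()).ratio()

# Hypothetical file names - substitute the published article and the rewritten hub.
published = Path("published_article.txt").read_text(encoding="utf-8")
draft = Path("hub_draft.txt").read_text(encoding="utf-8")

score = similarity(published, draft)
print(f"Word-level similarity: {score:.0%}")
if score > 0.8:  # assumed threshold for illustration, not an official figure
    print("Still reads as a duplicate - keep rewording.")
```

      If the ratio stays high after a rewrite, the wording probably has not changed enough to stop reading as a duplicate.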

  2. Peter Hoggan posted 13 years ago

    Duplicate threads are not that great an idea either.

  3. WryLilt posted 13 years ago

    "Duplicate" has two meanings:

    1. Another person illegally copying your work.
    2. The same content posted in two places on the internet.

    You'll find Google isn't highly impressed by either.

    1. Peter Hoggan posted 13 years ago

      Google filters duplicate content; it doesn't penalize it. Duplicates are still indexed, maintain PageRank, and pass link juice. Google doesn't have a problem with duplicate content.

      @Gold Money, HubPages has strict guidelines about duplicate content; it's best to keep everything here unique and original. A few dupes of your own work amongst a larger body of hubs won't cause much harm. If every hub is duplicated, your HubScore will tank.

      1. thisisoli posted 13 years ago in reply to this

        I disagree with this. Google did announce that they do not penalise duplicate content. However, in the same video they go on to say that they penalize spam, and their description of spam content includes duplicate content. If a page appears spammy and is largely filled with duplicate content (rather than, say, quoting a poem), then I do think there is a danger of it being ranked lower.

        I have always seen unique content do better than duplicate content, and have had a site de-indexed when all it did was draw duplicate content from other sources.

        While a single article may not be de-indexed for duplicate content, and will still pass link juice as long as it remains indexed, I feel that it is an unnecessary risk.

        That is, however, only my opinion, and I know Peter is doing well from his own marketing tactics, so I would not discount them without researching the topic and forming your own opinion!

    2. Lisa HW posted 13 years ago

      Duplicate content is said to essentially compete with itself, which is one reason it makes no sense to post it. I have some duplicate stuff (because it's stuff I think is "worthwhile writing" and I didn't care about traffic). Some of my highest-scoring Hubs are duplicates and get lots of comments, which doesn't hurt them. I'm not advocating duplicates, by any means, because obviously there's a reason they're frowned on. Still, I haven't found it has affected my scores on here (to whatever extent scores on here matter). (I have some duplicates that I know have scores that reflect it - just not all of them. Then again, those that appear affected aren't the same quality of information or writing as the higher-scoring ones.)

      I'm under the impression that there's something to being as "consistently inconsistent" as I am, when it comes to the kinds of stuff I write and the sites where I post it. (Nobody that "willy-nilly" and "odd" could possibly be a spammer. lol) The stuff I have (duplicate or not) that doesn't do well traffic-wise is stuff that, even if I don't think it's bad writing, wouldn't attract traffic anyway. It's not like I have a "zillion" of them or anything, but I have duplicates that do quite well traffic-wise. (Maybe they'd do better if they weren't duplicates, of course; but it hasn't prevented them from getting traffic.) I still pretty much think it's all about not coming across as a spammer, overall.

    3. Peter Hoggan posted 13 years ago

      I understand and appreciate your views on this, Oli; however, they are, I think, based on an assumption that duplicate content is spam.

      Firstly, if you are going to put duplicate content on your own site, you are in danger of those pages being filtered out of the results. Which page eventually "sticks" depends on the authority of the site hosting the article, how well linked the article is, and the IP address of the hosting site. So posting to a site that has been established longer and has greater authority could see the page on your own site being filtered.

      Also, the filter is not constant; the geographic location of the searcher and the IP address of the hosting site can come into play so that localized results can be delivered more efficiently. A page that is filtered from your results might show up in the results of someone searching on the same term from another country.

      Everything on your money site should be original and unique, with duplicate content used to gain links through article syndication. My experience is that a well written, well syndicated article will return much more benefit than a single unique posting, and it's easier than having to continually rewrite.

      Duplicate content will be spotted and filtered, so don't expect to see search results littered with copies of your articles; it just won't happen. If it did, search results would become unusable. So when syndicating content it has to be understood that much of what you are syndicating will never be seen. The goal is to get those articles indexed along with the links they contain.

      This is where article spinning comes in, touted as a way to defeat the duplicate content filters. The majority of spun content is detectable, even the stuff produced by so-called "spinning experts". By using article spinning software you are deliberately setting out to dupe both search engines and readers. Now that is spam, and Google is getting to grips with it and removing it.
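
      As a small illustration of why lightly spun text stays detectable, here is a sketch of shingle-based near-duplicate comparison. Google's actual methods are not public, and the sample sentences below are made up, so treat this as the general idea rather than how any search engine really works.

```python
def shingles(text: str, n: int = 3) -> set:
    """Return the set of n-word shingles in a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

# Made-up example: the "spun" version swaps a couple of words for synonyms.
original = ("duplicate content will be spotted and filtered so do not "
            "expect to see search results littered with copies of your articles")
spun = ("duplicate content will be noticed and filtered so do not "
        "expect to see search results filled with copies of your articles")

a, b = shingles(original), shingles(spun)
jaccard = len(a & b) / len(a | b)
print(f"Shingle overlap (Jaccard): {jaccard:.0%}")
```

      Two independently written articles on the same topic would normally share almost no three-word shingles, so an overlap anywhere near this level is a strong signal that one text was derived from the other.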

    4. thisisoli posted 13 years ago

      Would it be fair to say, then, that we agree Google does penalize duplicate content, in the sense that it gives it a much less significant place in the rankings, but still allows backlinks to be indexed and counted, meaning duplicate content can still provide a benefit?

      In that case it would be more beneficial to move on to whether or not the backlinks on the duplicate content carry a similar amount of 'trust' to promote your own site, compared with purely unique content.

      This is actually a very useful conversation to me, as I am writing a book on SEO right now, and would like to make it as comprehensive as possible.

    5. Peter Hoggan posted 13 years ago

      I think it's important to understand the difference between filters and penalties. Standard search results are filtered by default; host crowding and dupe content filters shield us from hundreds of overly similar results, giving us access to a wider selection of resources and making it easier to find the information we seek.

      You can switch those filters off if you desire and look at the raw results; as they stand, the filters do a good job of making the information much more manageable and usable. Those pages that were filtered are not under any penalty; the most relevant pages are simply being shown. This could be viewed as host crowding on a grander scale.

      Leveraging duplicate content as a strategy to promote your money site isn't about gaining rankings from the syndicated content; it's about powering up the money pages. However, if you are trying to gain rankings from the syndicated articles themselves, you are unlikely to have much success. This may make it look like many of the syndicated articles are being penalized, but at least one should escape the filter and, if it's good enough, might even rank.

      It's a cleanup job intended to provide better results rather than a penalty. Pages that are filtered can be fixed, but penalties are a whole different problem.

      1. thisisoli posted 13 years ago in reply to this

        From my understanding, the filters that Google describes are those that turn results into 'see omitted' or 'see more results from this site'.

        However, being penalized by Google does not necessarily mean being removed from the search results; it normally means that your position is lowered in the SERPs.

        Being penalized by Google often describes a situation where something about an article trips a filter that lowers its position within the rankings.

        However, if an article remains in the rankings but is lower because of a penalty (maybe I should clarify this as duplicate content), will that lower the trust of the website it is placed on, and if so, will that lower the quality of a backlink leading back to your own website?

    6. Peter Hoggan posted 13 years ago

      Your understanding is correct; the filters can be lifted by clicking on 'see omitted' or 'see more results from this site'. 'See omitted' will remove the dupe filter, and 'see more results from this site' will remove the host crowding filter. The pages that are revealed were never under any penalty; rather, the more relevant or trusted pages were given preference.

      I doubt that a duplicate remaining in the unfiltered results is likely to be penalised just because it's a dupe; there is no such penalty as far as I understand.

      Let me say, though, that this doesn't mean you can syndicate crap across hundreds of low-quality sites; you still require quality content hosted on pages that are indexed.

      Perhaps another way to think about it is that unique content will always be given preference while duplicate content is subject to filtering. That does not mean any penalty is being applied. When you syndicate an article you will never know which copy will be preferred. However, those that are filtered still pass benefits over to the pages targeted by the links they contain. Just don't duplicate content from your main site or sites.

     