
I have to hand it to this person

  1. Uninvited Writer posted 8 years ago

    This person has 11 hubs up which all basically say the same thing.

    They actually took the effort to rewrite each one slightly differently... unless they're copied and pasted; I didn't look. Unfortunately, you have to click on a tinyurl to check out the link, which I'm not willing to do :)


  2. Marc David posted 8 years ago

    They probably paid somebody to re-write an original, 11 different times (titles too) to avoid some duplicate content penalty.

    I think they missed the boat on "duplicate" content.

    If you re-write it, that's cool.  But go post it on sites that don't already have the same content.  :-)

    I'd venture to guess that tactic won't last on HubPages, because a real human will see through it as duplicate content when a computer might not.

  3. Inspirepub posted 8 years ago

    I guess the question is - how can we define "duplicate content" fairly from a human-detection perspective?

    Most people, when writing a Hub, will go and find information about a topic, and then write about it. The information won't be new and different, in most cases. I can't imagine someone here is the first person ever to write "how to groom an Old English Sheepdog", for example.

    Someone complained on the Forum that another Hubber had "stolen" their Hub, because the information was the same, but the words were different.

    Yet that's what most people are doing when they create Hubs - just usually not from other Hubs, but from Wikipedia or other authority sites.

    Original thought is so rare that it is not actually required to attain a university degree - in fact, it is not a requirement until you reach the level of a PhD.

    So, given that most people are reiterating information which is already about the place, what guidelines would you suggest to make the output look "non-duplicate" to a human reader?

    And, as a broader question, let's think about why we have the rule against duplicate content. My understanding is that it will prevent the hubpages.com domain from being devalued by search engines. And they use computerised algorithms to detect duplicate content.

    So, apart from a sense of "moral outrage" because someone took a short cut, are the ten variant pages actually harming us in any way?
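    For anyone curious how a computerised check of the kind mentioned above can miss a rewrite, here is a minimal sketch of word-shingle Jaccard similarity - a common textbook approach to near-duplicate detection, and purely illustrative; it is not the algorithm HubPages or any search engine actually uses:

    ```python
    # Illustrative sketch of shingle-based near-duplicate detection.
    # NOT the algorithm any search engine or HubPages actually uses.

    def shingles(text, k=3):
        """Return the set of overlapping k-word shingles in a text."""
        words = text.lower().split()
        return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

    def jaccard(a, b, k=3):
        """Jaccard similarity of two texts' shingle sets (0.0 to 1.0)."""
        sa, sb = shingles(a, k), shingles(b, k)
        if not sa or not sb:
            return 0.0
        return len(sa & sb) / len(sa | sb)

    original = "how to groom an old english sheepdog at home"
    verbatim = "how to groom an old english sheepdog at home"
    rewrite  = "grooming your old english sheepdog in your own home"

    print(jaccard(original, verbatim))  # 1.0 - an exact copy is caught
    print(jaccard(original, rewrite))   # low score - a light rewrite slips past
    ```

    A human reader compares meaning rather than word sequences, which is why a person can spot the ten variant pages as "the same thing" even when a surface-level check like this scores them as different.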


  4. darkside posted 8 years ago

    Looks like someone bought themselves a pack of PLR Articles.