
Writing Versus Rewriting

Updated on April 14, 2013

Warning: If It Seems True, It May Be Or It May Not

As a professional writer for 13 years at national, state, and local levels, I am seeing a disturbing trend I'd like to discuss here - a trend that is slowly but surely destroying the validity of what one reads on the Internet. This trend is "rewriting": people post spun-up versions of existing text and call the new material original content, although it isn't. Not by any stretch of the imagination.

The unfortunate part of this is that "rewriting" dilutes the original content and introduces errors, misunderstanding, and mistranslation. There is an old story about how one individual tells another a short story, but by the time the second individual tells a third, the third tells a fourth, and so on down the line through ten people, the story has changed so much that it is no longer the original story at all.

There is a lot of documentation and evidence now suggesting that today's Bible is nothing like the original writings it was created from, before the content was converted from one language to another. There is also evidence that no writer of the Bible lived in the era he claims to authoritatively describe in detail. The books of the Bible were written by authors who "borrowed" the names of Disciples; not one was a real disciple at all. The New Testament books were not written until a minimum of 80 years after the crucifixion, and most, if not all, of their authors weren't even born when the events surrounding Jesus took place. The stories and statements made in the time of Christ were passed down from person to person, each of whom had opinions, views, and verbal translations of their own, with more inaccuracies and opinions included in each retelling. So the authority and accuracy of Biblical writings are doubtful.

I discuss this because rewriting web content is causing this same inaccuracy, misunderstanding, and mistranslation today as well. I use Google heavily for my research. I recently found article after article on a single subject that were obviously clones of one another - sometimes as many as 12 to 15 articles at different locations.

Another subject I researched recently is websites that pay "writers" to produce content for website owners. These sites encourage going to other websites and copying paragraphs of information to acquire enough material to "write from," which translates to rewriting what is already written. They pay practically nothing per project - something like $2 to $3 per webpage full of text. They don't encourage creativity; they just cloak the fact that they want you to "rewrite" other material into "something new," and that is what you get paid to submit.

I learned that there are also software packages that let you copy and paste the text of someone's article into them; the software moves the sentences around, juggles the words within each sentence, then suggests synonyms to replace a high percentage of the words so the result doesn't look exactly like the original article. Google's bots aren't supposed to catch this pseudo-plagiarism, but Google also employs thousands of human page reviewers to find this and report it, along with other things. The user is supposed to proofread what the software came up with and smooth it out into an article that flows well. Unfortunately, some users barely speak English, assume the software came up with something good, and lack the ability to proofread properly, so they leave the article as-is. Then another non-English-speaking user "rewrites" that article, and a far worse one is created and, again, not proofread.
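To see just how mechanical this kind of "rewriting" is, here is a toy sketch of the sentence-shuffling and synonym-swapping such spinner software performs. The synonym table is invented for illustration; commercial spinners ship far larger thesauri but make the same kind of blind substitutions, with no grammar or sense checking - which is exactly why spun articles read so badly.

```python
import random

# Toy synonym table - invented for illustration. Real spinner software
# uses much larger word lists but substitutes just as blindly.
SYNONYMS = {
    "big": "large",
    "fast": "quick",
    "good": "fine",
    "buy": "purchase",
}

def spin(text, seed=0):
    """Naively 'rewrite' text: shuffle the sentence order, then swap
    any word found in the synonym table. No meaning is checked."""
    rng = random.Random(seed)
    sentences = [s.strip() for s in text.split(".") if s.strip()]
    rng.shuffle(sentences)  # reorder sentences
    spun = []
    for sentence in sentences:
        words = [SYNONYMS.get(w.lower(), w) for w in sentence.split()]
        spun.append(" ".join(words))
    return ". ".join(spun) + "."

original = "The deal was good. The car was big and fast."
print(spin(original))
```

Run it twice with different seeds and you get two "different" articles from the same source - which is the whole problem: the clones look distinct to a casual reader while carrying every error of the original, plus whatever the blind substitutions break.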

What this accomplishes is that an article that already had horrible mistakes is rewritten to contain even more of them in the second version. Both are published, and both are then rewritten further by others, so that still more mistakes appear in the third, fourth, and fifth generations of the incorrect originals. All of those get rewritten too, so that there are soon 20 mistake-riddled articles on websites shown to international audiences, and the original authoritative article is likely to be overshadowed by all the inaccurate clones "rewritten" by non-experts.

With just a few clicks through articles supposedly written by "experts" in a field, I can easily find mistakes so obvious that it is clear these "experts" were not experts at all and were simply rewriting original content in subjects where they had no experience. You can tell when a writer has experience if the subject is one in which you, the reader, also have experience. Even on a site full of "how-to" or "about something" articles, the editors in charge have absolutely no clue whether an article is true, half true, or complete fiction unless an editor happens to have worked in that field in the past, and that's typically not the case. Google makes an effort to rank websites using this lame "rewritten" content low so it isn't as likely to be found on the first or second page of searches, but with tens or hundreds of thousands of pages of "rewritten" content being posted daily, they're not winning the battle against pseudo-plagiarism.

I'm seriously concerned that the Internet is going to become an absolute mish-mash of weak, invalid, error-filled, badly written, or almost unreadable content that readers can't trust at all, and that they will soon learn to stop searching for authoritative content too. That's bad for the original content that was valid and accurate, because it will be suspect as well.

Some would argue that the Internet has been a mish-mash of lies and inexperienced writers for a long time. That's true, but there has always been a larger wealth of exceptional and accurate content to outweigh the trash. And if you weren't sure that what you read was valid, you could call up three, four, or five authoritative websites and compare them to learn what was true and what wasn't. Today you can barely be sure by calling up 15 to 20 pages. When researching a product to buy, I use up to 10 pages of content, including some I've trusted for years, and then go to Amazon, eBay, Best Buy, or some of the other huge retailers and read customer satisfaction reviews. Those are also tainted by people who write, or are paid by competitors to write, negative reviews meant to hurt the chain store or the product line. So I must also read upwards of 20 or 25 customer reviews and add those to the 15 to 20 pages of websites I read. It's unfortunate to have to go to this extent, but all this lying, deception, and rewriting forces us to do so.

Rewriting is creating a world wide web filled with so many invalid articles that comparing 3, 4, or 5 of them could lead one to conclude that a high percentage of websites agree on something when, in reality, most or all of the articles might be rewrites of each other. In that case, the articles don't independently agree; they are all, in a roundabout way, from the same source.
