
Google's manual for its unseen humans who rate the web

  1. sabrebIade posted 3 years ago

    I thought this was pretty interesting....
    "It's widely believed that Google search results are produced entirely by computer algorithms - in large part because Google would like this to be widely believed. But in fact a little-known group of home-worker humans plays a large part in the Google process. The way these raters go about their work has always been a mystery. Now, The Register has seen a copy of the guidelines Google issues to them."
    http://www.theregister.co.uk/2012/11/27 … rs_manual/

    1. Greekgeek posted 3 years ago in reply to this

      Hee. The Register just discovered something that many of us have known about for years. I think those guidelines have been leaking every few years since 2007, if not earlier.

      I posted a hub about Google's Quality Rater guidelines in 2011, and the leaked 2012 guidelines are floating around the web as well.

      The thing that many people don't understand -- including, perhaps, the author of that article -- is that Google's human quality raters' ratings are NOT used by Google's algorithm. If a Google Rater rated my page as "Useful," for example, that rating would not be considered by or known to Google's online algorithm when someone types a search query. It would not change my site's position up or down in search results.

      Instead, Quality Rater data is used at the R&D stage. Google engineers are using the quality raters' data to help them develop a better algorithm. For example:

      "Our raters seem to agree that A, B and C are great websites, D, E and F, are okay, and X, Y and Z are bad. So, what factors make A, B, C good? What factors make X, Y, Z bad? How can we teach our computer to recognize those factors, for the purpose of making the algorithm's judgment match human tastes?"

      The reason I look at the raters' guidelines -- even if they're only used for in-house R&D -- is that they give me insight into what Google, at least, thinks readers are looking for or shying away from. They base that on the data they've collected about user behavior. It's yet another piece of info to keep in mind when writing for a web audience. (A toy sketch of how rater labels could feed that R&D work follows below.)
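      To make that R&D idea concrete, here is a toy sketch in Python. It is not Google's actual process -- the page features, numbers, and ratings below are invented purely for illustration -- but it shows the general shape of "use human ratings as labels, then learn which page factors separate the good pages from the bad ones":

      # Toy illustration only: NOT Google's method. Feature names, values,
      # and ratings are invented for the example.
      from sklearn.linear_model import LogisticRegression

      # Hypothetical per-page features: [word_count, outbound_links, ad_density]
      pages = [
          [2400, 12, 0.05],   # page A -- human raters called it "great"
          [1800,  8, 0.10],   # page B -- "great"
          [ 600,  2, 0.40],   # page X -- "bad"
          [ 300,  1, 0.55],   # page Y -- "bad"
      ]
      labels = [1, 1, 0, 0]   # 1 = rated great, 0 = rated bad

      # Fit a simple classifier: which features separate "great" from "bad"?
      model = LogisticRegression(max_iter=1000).fit(pages, labels)

      # The learned weights hint at which factors track human judgment --
      # the kind of question the offline R&D work is trying to answer.
      for name, weight in zip(["word_count", "outbound_links", "ad_density"],
                              model.coef_[0]):
          print(f"{name}: {weight:+.3f}")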

  2. paradigmsearch posted 3 years ago

    There just might be some gold nugget nuances in there... Like don't let any content go longer than 4 months without being updated...

    1. sabrebIade posted 3 years ago in reply to this

      Yep...I noticed that

  3. WriteAngled posted 3 years ago

    This is really old news. The guide was first made public sometime last year.

    I used to answer questions on Google Answers many years ago, and was recruited from there onto the rating panel. At the time, it paid really well. I was in full-time employment, so I only rated for a couple of hours or so a day, but I still cleared hundreds of dollars per month.

    I resigned when they stopped paying UK raters directly into our bank accounts and wanted us to go through an employment agency for manual labourers and blue-collar workers. Not only did I find this deeply insulting, but I also could not be bothered with all the bureaucracy involved in signing up with the agency, especially since I was starting to expand my translation business, which eventually became my full-time activity.

    1. sabrebIade posted 3 years ago in reply to this

      You should have contacted the media then and scooped em!

  4. Rosie2010 posted 3 years ago

    Interesting read.  Thanks for the link, sabreblade.