Saturday, September 19, 2009

Ratings and the wisdom of the crowds (?)

This article in Technology Review discusses an issue I had thought about before but never actually tested. I am grateful to those who empirically study the ratings on websites such as IMDb and Amazon.

According to the article, the ratings on such sites do not really reflect the wisdom of the crowds. Rather, they reflect the opinions of a few biased individuals.

If you look at reviews on the day a movie opens, you will find the ratings higher than they are a few weeks later. It is rather fun to experiment with this. Those who see a movie on its release day may have been waiting for it for a while, and, without invoking any conspiracy theories, positive feelings toward an item (a movie, book, or new technology) influence the way humans make judgments. I like those passionate judgments, but it would be good to be able to tell whether an item is rated highly because it is really good, or because Mr. X has a special childhood story about it.
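The early-reviewer effect described above can be illustrated with a toy simulation. Everything here is invented for the sketch: I simply assume that opening-day reviewers (the fans who waited for the release) rate a bit higher on average than the broader audience that arrives later, and show how that skews the early average.

```python
import random

random.seed(42)

# Invented numbers for illustration only: fans who rush to see the film
# on opening day rate generously; the wider audience that trickles in
# over the following weeks is more lukewarm and more varied.
early = [random.gauss(8.5, 0.8) for _ in range(50)]    # opening-day fans
later = [random.gauss(6.5, 1.5) for _ in range(500)]   # general audience

def mean(xs):
    return sum(xs) / len(xs)

opening_day_score = mean(early)            # what you see on release day
eventual_score = mean(early + later)       # what you see weeks later

print(f"opening day:       {opening_day_score:.1f}")
print(f"a few weeks later: {eventual_score:.1f}")
```

With these (made-up) parameters, the opening-day average comes out noticeably higher than the eventual one, even though no individual reviewer is lying: the early sample is simply not representative of the crowd.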

From the same article, I agree with Niki Kittur that providing more information about participants' review patterns is one way to help people distinguish the wisdom of the crowds from the opinions of a biased few. This idea further integrates such technologies into wikis and rating systems.
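As a hypothetical sketch of what "exposing review patterns" could look like, one might summarize each reviewer's rating history so readers can judge how much weight to give a score. The reviewer names, scores, and the particular summary statistics below are all my own assumptions, not anything proposed in the article.

```python
from statistics import mean, pstdev

# Hypothetical reviewer histories (scores on a 0-10 scale), invented
# for illustration: one reviewer rates everything near the top, the
# other spreads ratings across the scale.
reviewers = {
    "alice": [9, 10, 9, 10, 9, 10],
    "bob":   [3, 7, 8, 5, 9, 6, 4],
}

def review_pattern(scores):
    """Summarize a reviewer's history: how many ratings, how high
    on average, and how much they vary."""
    return {
        "count": len(scores),
        "mean": round(mean(scores), 1),
        "spread": round(pstdev(scores), 1),
    }

for name, scores in reviewers.items():
    print(name, review_pattern(scores))
```

A near-zero spread paired with a very high mean is exactly the kind of signal that might prompt a reader to discount a glowing rating, without needing to accuse anyone of bad faith.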

What I do with the available technologies is not to wait a few months before buying a book or watching a movie, but to check multiple review sites (exercising my digital literacy skills, if you will) and actually read the comments people write to justify their ratings. Still, I am excited about new (yet fair and democratic) methods and technologies for assessing ratings.
