Peer Review: Ranking the movies and the users who rank them.
After three months of selecting movies I hadn't seen from near the top of my friends' charts at the movie-ranking website Flickchart, I have completed the Peer Review project. I watched 47 movies and one 10-part TV series, totaling 106 hours, and ranked 51 different Flickchart users according to how well their Top 20 movies scored on my own Flickchart. The data is in, and the results are: useless.
Really, all that number-crunching is largely beside the point. I do PopGap as a way of forcing myself to watch "good" movies that I otherwise would never have seen, and this seemed as good a gimmick as any, but my most significant finding during the process has been this: compiling data on different movie fans takes time away from watching and writing about movies, which is far more enjoyable and interesting to me and probably to anyone who reads this. Conclusion: I will be picking much simpler sources of new movies in the future. This was kind of a drag.
For anyone who does care about the data, it's not very interesting. Generally, the users whose charts scored highest for compatibility at the beginning of the process also scored highest at the end, and vice versa. In the two rare cases of significant score changes, each was driven by the individual Flickcharter significantly reshuffling which movies ranked in their Top 20, with no relation to the movie I picked from their chart. It was far more common for a Flickchart user's score to drop rather than rise over the three months, because each new movie I ranked pushed every movie below it lower on my chart, naturally lowering compatibility scores even for movies that I love.
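For the curious, here's a rough sketch of the mechanics behind that downward drift. I haven't reproduced my exact formula here; the `compatibility` function below is a simplified, hypothetical stand-in (reciprocal-rank scoring, with made-up movie lists), but it shows why every peer's score sinks as my chart grows.

```python
# Hypothetical sketch of a rank-based compatibility score. The real
# formula may differ; this is purely illustrative.

def compatibility(peer_top20, my_chart):
    """Score a peer's Top 20 against my own chart (both best-first).

    Each peer movie contributes a reciprocal-rank bonus: a movie at
    position r on my chart (0-based) is worth 1 / (r + 1), and movies
    I haven't ranked contribute nothing.
    """
    my_rank = {title: pos for pos, title in enumerate(my_chart)}
    return sum(1.0 / (my_rank[t] + 1) for t in peer_top20 if t in my_rank)

# Toy example: ranking a new movie above old favorites pushes every
# movie below it down one slot, so the same peer chart scores lower.
my_chart = ["Jaws", "Alien", "Heat"]
peer = ["Alien", "Heat", "Blade Runner"]
print(compatibility(peer, my_chart))   # 1/2 + 1/3 ~= 0.833

my_chart.insert(0, "Seven Samurai")    # a new #1 on my chart
print(compatibility(peer, my_chart))   # 1/3 + 1/4 ~= 0.583
```

Ranking one new movie near the top nudges a peer's score down even though nothing about their chart changed, which is exactly the drift I saw over the three months.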
Here's how the final results panned out for all three months of Peer Review: