Research & Insights

By Patrick Philips, February 2, 2011

And the crowd goes wild (revisited)

A few months ago, we created a crowdsourced ranking of NFL players according to their fantasy value. For those just tuning in, we took the Top 75 players in ESPN’s preseason preview as the experimental sample, paired them against each other, then had workers on the CrowdFlower platform select the better player of each pair.

Now that the regular season has finished, let’s see who did a better job of predicting the final ranking of this sample. Of the 75 players in the sample, only 28 finished in the Top 50. In other words, ESPN’s team of experts failed to predict nearly half of the most valuable players in 2010.

We brushed off this halting start by Team ESPN and looked at those players in the preseason sample who did finish in the Top 50. Did Team ESPN or Team CrowdFlower do a better job of ranking them? To find out, we looked at which of the two ranking systems contained more of the top players at a number of intervals.

For example, of the seven most valuable players this season, five were in the preseason sample. Team CrowdFlower correctly placed four of those five in its Top 7, which we’ll call a Precision Score at 7 of 0.8 (that is, 4 out of 5 players, or 80 percent). Team ESPN, on the other hand, placed only two of the five in its Top 7, corresponding to a Precision Score at 7 of 0.4.
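The metric above is straightforward to compute. Here is a minimal sketch in Python; the function name and the toy player labels are our own for illustration, not data from the actual study:

```python
def precision_at_k(predicted, actual, k, sample):
    """Precision Score at k: of the actual top-k players who were in
    the preseason sample, what fraction does a ranking also place in
    its own top k?"""
    # Restrict the actual top k to players that were in the sample,
    # since a ranking can only be scored on players it was given.
    eligible = [p for p in actual[:k] if p in sample]
    hits = [p for p in eligible if p in predicted[:k]]
    return len(hits) / len(eligible) if eligible else 0.0

# Toy data mirroring the example: 5 of the actual top 7 were in the
# sample, and the ranking caught 4 of them in its top 7.
actual = ["A", "B", "C", "D", "E", "F", "G"]
sample = {"A", "B", "C", "D", "E"}
predicted = ["A", "B", "C", "D", "X", "Y", "Z"]
print(precision_at_k(predicted, actual, 7, sample))  # → 0.8
```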

As you can see in the graph above, crowdsourced workers were at least as good as ESPN’s experts. For the Top 5 Rated players, Team CrowdFlower correctly predicted all three of the players in the preseason sample who finished in the Top 5, while Team ESPN picked zero. For the Top 20 Rated players, Team CrowdFlower predicted 9 of the 13 players in the preseason sample (Precision Score at 20 of 0.69) while Team ESPN predicted only 6 of the 13 (Precision Score at 20 of 0.46).

Across the board, crowdsourced workers never performed worse than a professional team of experts at predicting the value of Fantasy Football players. Approximately 80 percent of the time, crowdsourced workers were better. Most interestingly, the biggest gap between crowdsourced workers and experts came at the top of the rankings, for the most valuable players.