Ross links to Erik Lundgaard's argument that "once you control for marketing budgets and theater saturation (big things to control for, obviously), well-reviewed movies tend to outgross their badly-reviewed competitors," and thus the world is just. I'd be interested to know whether that effect is increasing over time. This month, Radar has an interesting article on the contemporary inability of big-budget movie stars to ensure box office success through sheer force of presence. The article quotes a lot of folks who seem puzzled by the phenomenon, but the explanation seems fairly clear: movie stars were effective for a reason.

A decade ago, if you wanted to see a movie and didn't have the Sunday LA Times around, you just went to the theater and walked into whatever looked good. There wasn't much accessible information. One way for a film to signal quality was to feature a major movie star, because a big star meant lots of money, which meant that even if the production had problems, it would be slick and entertaining. So movie stars had a major effect; they signaled a sort of trustworthy professionalism on the part of the studio.

Now, however, the net has made it much easier to access direct information on the relative quality of films, so not only do I have a better signal than casting, but it's become fairly clear that casting isn't much related to quality anyway. Moreover, review aggregators like Rotten Tomatoes and Metacritic have made that information much more authoritative: they create consensus where there used to be only opinion. If one aging white guy didn't like Hancock, I'd probably just assume he wasn't into the concept. But since no one likes Hancock, I probably won't see the movie. This may mean I miss some good movies -- Van Wilder, which I sort of love, got an 18 percent from Rotten Tomatoes -- but it also means I see fewer bad films. My sense is this is widely true, but I have no data on that.