College rankings

October 1, 2012

Joe Nocera has a nice piece in Friday’s Times about college rankings.  He argues that the popular magazine rankings of colleges are crude, at best:  “U.S. News likes to claim that it uses rigorous methodology, but, honestly, it’s just a list put together by magazine editors.”  He’ll get no disagreement from most academics, who know well how sloppy most magazine methodologies are, but is it possible we’re missing the point?

The larger point, I think, is that there’s enormous value in measurement systems — “metrics,” as they’re sometimes called.  Even crude measurements can be better than no measurement at all.  There are, however, two inherent difficulties with doing it well:  (i) many of the things we care about are difficult to measure and (ii) any specific measurement system can be gamed or even corrupted.  You need to take both into account when you use them.

There’s nothing special about colleges here; you see the same issue all over.  Profitability of firms:  there’s a reason accounting is a profession.  Performance evaluation of individuals:  there’s almost never a mechanical system that does this well.  Bond ratings:  ditto.  College football:  how do we know who the best teams are when they don’t play each other?  What does “best” even mean?  I’m sure you have your own examples.

So what’s the solution?  First, we need to work constantly toward better measurement.  At the same time, we need to use some judgment about any measurements that come our way.  It would be nice if there were an easier way, but there’s not.  Meanwhile, we’re working hard here to give value to our students.  We think they’ll notice, and that’s good enough for us.


2 Responses to “College rankings”

  1. Luis Cabral Says:

    Nice post. One is always reminded of Churchill’s famous remark regarding democracy. Something similar applies to metrics: it’s the worst possible method except for all the others.

  2. Richard Freedman Says:

    I assure you, based on considerable experience, that the results of many measurement systems are perverse, leading to wrong decisions. One trivial example is performance evaluation systems that systematically rated women below men. The value of pay systems is based on the accuracy of performance appraisal systems. I used to use an exercise in class that had students evaluate and recommend pay raises based on carefully developed data. Needless to say, they were almost all off base. So the defense of pay systems is performance appraisal measures, yet these measures generally have low accuracy.
