Brace Yourselves: College Rankings Season Approaches


Every August, faculty, staff, and alumni hold their collective breath when the US News and World Report college rankings are published. And every August, families with college-bound children snap up about 2.4 million copies of that one issue--a phenomenal newsstand sales rate, the profit from which is nearly equivalent to three months of subscription revenue.

Like the magazines that rate consumer products, this issue gathers data, throws them into the statistical mixer, and, voilà, complicated social and educational institutions are ordered from first to worst, allowing consumers to make a decision about a lifelong investment in a matter of seconds. ("Son, you better fire off an application to Whattsamatta U. I don't know where it is or if they have a writing major, but it sure ranks up there with the best of 'em!")

But unlike the editors of, say, Road and Track, who rank automobiles, these editors have not taken any of these institutions on a "test drive," which would allow qualitative assessments to accompany the quantitative. (How does this place feel? What are the people like? What do its graduates do?) Instead, the editors of US News rely heavily on simple input measures (most supplied by the institutions themselves) to assess the quality of an institution.

It's bad enough that their methodology is often criticized by statisticians and seems to change every year (after all, how many magazines would they sell if the "top 25" always showed up in the same order?). Even worse, the reductionist nature of such measures robs an institution of its distinctive character and encourages students and parents to rely on numbers that do little to help them understand which institution is right for them.

No college guide can give a complete description of the Oberlin experience--assigning a numerical score to an institution does not measure what is truly important: values, mission, student outcomes, and faculty-student relationships. Indeed, numerical rankings serve as an incentive to stop worrying about such noble pursuits and to concentrate instead on schemes to improve one's rank. There is considerable evidence that the rankings have affected some institutions' admissions policies and have encouraged generous rounding when it comes time to report SAT scores. Rather than encouraging colleges to focus on what really matters (are we helping our students develop intellectually, artistically, and emotionally in ways consistent with our mission?), the process pressures them to change policies and practices simply to move up the list.

We cannot allow such things to happen.

Can we therefore ignore our rank? Unfortunately, the answer is no. In a national survey of students at highly selective institutions, 23 percent said rankings were "Very Important" in deciding which school to attend. At Oberlin that number is lower (18 percent) but still significant. And according to a recent study out of UCLA, the most academically able students tend to give the most credence to such rankings.

What we can do is educate prospective families--and remind ourselves--about Oberlin's special character. We can encourage families to learn about Oberlin from a variety of sources, including personal contact and more qualitatively driven guides such as The Fiske Guide to Colleges and Barron's Profiles of American Colleges. We can also continue to work with publishers to broaden their assessments of institutions.

We know that Oberlin is intellectually powerful, artistically vibrant, and wonderfully diverse. Those are qualities no statistical formula can capture.

--Ross Peacock
Director of Institutional Research