By Kevin Carey
The Washington Monthly, Vol. 38, No. 9
Imagine you're about to put a chunk of your life savings into a mutual fund. Now imagine you peruse the various "best mutual fund" guides on the news rack, only to find they're all missing crucial pieces of information. The guides list where the fund managers went to college, how much investment capital they've attracted, and what kind of "experience" investors had at the annual fund meeting. But they don't tell you what you most want to know: What the funds' rates of return have been--or if they've ever made a dime for anyone. You might still decide to invest in a mutual fund, but it would be a heck of a crapshoot. And with their scorecard hidden, fund managers wouldn't be under much pressure to perform, let alone improve.
That imaginary mutual-fund market pretty much shows how America's higher-education market works. Each year prospective college students and their parents pore over glossy brochures and phone-book-sized college guides in order to decide how to invest their hard-earned tuition money--not to mention four years of their lives. Some guides, like the popular rankings published by U.S. News & World Report, base ratings on factors like alumni giving, faculty salaries, and freshman SAT scores. Others identify the top "party schools," most beautiful campuses, and most palatial dorms.
But what's missing from all the rankings is the equivalent of a bottom line. There are no widely available measures of how much learning occurs inside the classroom, or of how much students benefit from their education. This makes the process of selecting a college a bit like throwing darts at a stock table. It also means that colleges and universities, like our imaginary mutual-fund managers, feel little pressure to ensure that students learn. As anyone who's ever snoozed through a giant freshman psychology 101 lecture knows, sitting in a classroom doesn't equal learning; knowledge doesn't come by osmosis.
To be sure, determining the quality of a college education isn't as simple as calculating the yield of a mutual fund. But it's not impossible either. In fact, some reliable measures of student learning, engagement, and post-graduation success have already been developed. These measures reveal where professors are the most effective at teaching, where graduates readily find jobs, and where students walk away with little more than expertise in conspicuous beer consumption. So why haven't you heard about these measures? Because many school administrators don't want you to know. Putting their grades on the table is the last thing many colleges and universities want--especially since those grades would likely show that many of the elite colleges so prized by striving students are not backing up their lofty reputations by doing the best job of helping students learn.
Measures that work
There are three basic ways of trying to measure how well colleges educate students. The most obvious is to use some form of a standardized test. That's how K-12 schools are evaluated. Given the difficulty and controversy K-12 testing has entailed, using standardized tests for college students might seem impossible at first. Elementary and secondary students are at least expected to complete similar courses, to learn the same rules of punctuation and applications of the Pythagorean theorem. Undergraduate studies are far more diverse: Some students choose to spend four years immersed in Ovid, others in organic chemistry.
But there turns out to be an answer: Instead of testing discrete pieces of knowledge, test the higher-order critical thinking, analysis, and communication skills that all college students should learn (and which employers value most). The Collegiate Learning Assessment, recently developed by a subsidiary of the RAND Corporation, does exactly that. Instead of filling in bubbles with a No. 2 pencil, CLA test-takers write lengthy essays, analyzing documents and critiquing arguments. …