Boston Herald

College rankings are misleading – why use them?

— Los Angeles Times

Many high school seniors have been opening emails in recent weeks telling them whether they got into the colleges of their choice. Even as they do so, criticisms of the published college rankings that may have guided their preferences are cropping up — again.

A math professor at Columbia University is challenging the data that the Ivy League school reported to U.S. News & World Report, which earned it the No. 2 ranking this year.

A couple of weeks ago, in what must be the granddaddy of fake-data scandals, the ousted dean of Temple University’s business school received a 14-month sentence after he was convicted in federal court of sending bogus information to U.S. News & World Report to boost the school’s prestige. Claremont McKenna College, The George Washington University and many other schools have tweaked data to boost rankings.

But the ultimate issue with the rankings doesn’t lie with the cheaters. The problem is the rankings themselves. They can be a counterproductive way for families to pick schools — for example, a much less expensive school might offer an equal or better education than a more highly ranked but costlier one.

The most selective schools — Princeton, MIT and so forth — don’t need rankings to boost their reputation or applicant pool. And the differences between a school that might be 70th on the list and one that might be 90th are unlikely to have much of an effect on a student’s post-graduate prospects or college experience.

Few college applicants are probably aware that the single biggest factor U.S. News uses to rank schools is their reputation among officials at other colleges — officials who might or might not have deep knowledge of the schools they’re rating. That accounts for 20% of the score.

The second biggest factor is six-year graduation rates. But since low-income students are far less likely to graduate within that time period — or ever — than middle-class students, this is more an indication of student affluence than academic excellence. In fact, it can have the perverse effect of discouraging colleges from accepting more low-income students, lest it worsen their graduation rates.

U.S. News has made some positive changes in recent years. It dropped acceptance rate as a criterion; that metric had led colleges to market heavily to students who had almost no chance of admission, since lower acceptance rates equaled higher rankings. The rankings also began including the percentage of Pell Grant students who graduate within six years — a meaningful statistic indicating whether colleges are helping low-income students complete their education.

But many other factors used in ranking the schools have little bearing on a student’s experience. The rankings use alumni donations as a proxy for students’ happiness with their alma mater. That’s a pretty meager way to measure satisfaction.

What most high school students and parents need to know is whether a college offers a rich choice of courses with good instructors; whether graduates will leave with a load of debt; whether students will feel comfortable and engaged on campus; and whether they’ll be prepared for a fulfilling career.

College administrators bemoan the rankings, but they continue participating. They should stop going along with the charade and insist on being partners in drawing up more valid ways to evaluate higher education. What should matter most is how satisfied students and alumni are with their choice.
