The college rankings business is BIG business. Almost all American colleges and universities report annually on various institutional statistics about their incoming students. This data — which includes mean SAT/ACT scores, high school class ranks, GPAs, and the like — is then used by third parties (like Newsweek, Kiplinger’s, Forbes, and US News & World Report) to rank colleges.
The US News & World Report Best Colleges 2013, which is due out any day now, is by far the most widely circulated and influential of the bunch. It ranks colleges according to “16 different indicators of academic excellence”, most of which are based on this aforementioned self-reported data from the colleges about their incoming freshman class. These rankings have tremendous (and, in my view, disproportionate) influence on a college’s perceived prestige, pricing and popularity, and even — now here’s the real kicker — on its bond rating. Yes, the higher the ranking, the easier the school’s access to capital.
And so, despite the fact that these rankings tell us (the consumer) very little about the actual quality of the education AT these institutions, they are EXTREMELY important to a college’s administrators, board members and other fiduciaries. So important that it has led some to ‘misreport’ or manipulate certain admissions-related data.
Last Friday, for example, Emory University President James Wagner sent a letter to alumni disclosing that an internal investigation had uncovered ‘intentional errors in reporting standardized test scores and class rank dating at least to 2000.’ Seems Emory was reporting scores and class rank for admitted students vs. enrolled students, leading to higher numbers because many students at the top of Emory’s admission pool enrolled at other institutions. And just so you don’t think I’m picking on Emory (because I actually really like the educational quality at Emory), Claremont McKenna College (another school that delivers a quality education) announced in May that a dean there had been misreporting admissions data. And I suspect there may be more announcements of similar indiscretions from other highly competitive schools in the future.
Of course, US News officials said the effect on Emory’s ranking (No. 20) was small, ‘negligible’, in fact. What else are they going to say? The reality is this: The US News and other similarly subjective ‘rankings’ indicate which schools attract the top high school students. They tell us very little about the quality of the education those schools actually deliver to their undergrads. The point here is that when the 2013 rankings get released, you should take them with a grain of salt. I’m not suggesting that you ignore them, but rather use them as a single, small indicator and not as the sole reason to choose a school. Choose colleges that are the right fit for your family, starting with the academic program, how the school will nurture the student’s intellectual growth, how it will prepare your student for life after college, and whether it will meet your financial needs. This requires much more effort than an impulse buy of the most recent College Edition of Newsweek at the supermarket express lane.
Bottom line: just because one school is ranked 5th and another 15th does not mean that the former is better for your student.