The Ultimate Absurdity of College Rankings

May 10, 2013 | Updated Jul 10, 2013

Before anyone interprets this blog as an exercise in sour grapes, let me state for the record that Rochester Institute of Technology, where I serve as President, has recently (in various print and online publications) been ranked 27th in the world in engineering, 11th in the world and 2nd nationally in industrial design, 4th nationally in game design and development, 3rd nationally in film and animation, 3rd among all colleges nationally by the aviation industry, top-ten nationally for student internships, and top-ten nationally in nine other academic programs. We play in this realm as well as any other college or university of our size and scope.

So why do I see the various college ranking systems as absurd and a disservice to the public?

First, they attempt to force very dissimilar institutions through the same filter for the purpose of defining essentially meaningless "winners" and "losers". Why anyone would think that Princeton and North Carolina A&T should be compared head-to-head is beyond me. They admit very different students and apply very different resources to their educational programs. While the various ranking organizations all rank Princeton ahead of NCA&T, I am not at all confident that either of these fine universities adds more actual value for its individual students than the other does.

Attempts to evaluate the quality of colleges and universities should start with the mission of the institution and evaluate the institution's success in achieving that mission (this, in fact, is what accrediting agencies do). In such a process, a community college could well achieve as high a rating as an Ivy League school, and students could choose to go where their educational needs are best matched. Unfortunately, no published ranking scheme has taken this approach.

And most college ranking schemes actually measure the quality of students that a given college enrolls, not the quality of education those students receive. For example, the use of SAT scores and high school grade point averages, along with the percentage of enrolled students who were in the top 10% of their high school graduating classes, is clearly an input measure rather than an output measure. Even measures like four-year graduation rates are often really input measures, since many of the best high school students now enter college with a year or more of advanced placement credit. Furthermore, a high four-year graduation rate may simply indicate that academic standards are low, although this is clearly not true in most cases.

But the true silliness of these efforts finally came to me when two recent rankings were published online. First, one website published its annual ranking of the 25 colleges and universities that it claims provide the greatest return on investment, in terms of the earning potential of graduates after receiving bachelor's degrees. Not surprisingly, 17 of the top 25 colleges were engineering schools. It has been known for many years that, on average, engineering graduates receive higher starting salaries and earn higher average compensation over their careers than most other college bachelor's degree recipients. Since engineering schools educate more engineering students as a fraction of their total student enrollment than do other colleges and universities, we should expect them to do well in this kind of ranking system.

At about the same time, the group that oversees the Forbes college rankings (in my opinion the silliest of the lot) published its annual list of the top 25 colleges and universities with the worst professors, based on student evaluations posted on a professor-rating website. On their list were 8 engineering schools. Now anyone who has given any thought to the issues involved knows that engineering majors are among the most difficult to complete because of their academic rigor. In fact, a much higher percentage of students entering engineering programs fail to complete their engineering degrees than do those who choose other majors, on average. Professors who impose the highest academic standards on their students may not be popular at the time students take their courses, but they are frequently cited by graduates years later as the faculty whose teaching and mentoring they most appreciated in retrospect. So the large number of engineering schools on the "worst professors" list should not surprise anyone.

But here's the true indication of the value of such rankings: three schools, all first-rate engineering schools, are on both top-25 lists! This may not seem very significant at first, but there are more than 3,000 colleges and universities in the U.S., and the chance of this happening randomly is very small. So what can we conclude? That engineering bachelor degree programs are among the hardest to complete but that upon completion they have among the highest returns on investment? I'm sorry, ranking folks, but we already knew this.
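The "very small" chance of a three-school overlap can be made concrete with a quick back-of-the-envelope calculation. This is a sketch under a simplifying assumption of my own (that both top-25 lists are independent random draws from the roughly 3,000 U.S. institutions mentioned above), using the hypergeometric distribution:

```python
from math import comb

# Assumed model: two independent random draws of 25 schools
# each from a pool of N = 3,000 institutions.
N, K, n = 3000, 25, 25

# Expected number of schools appearing on both lists by chance.
expected = n * K / N  # 25 * 25 / 3000, about 0.2 schools

# Hypergeometric probability that the two lists share 3 or more schools.
p_ge_3 = sum(comb(K, k) * comb(N - K, n - k)
             for k in range(3, n + 1)) / comb(N, n)
# p_ge_3 works out to well under 1%.
```

Under this simple model, random lists would share about 0.2 schools on average, and the chance of sharing three or more is well under one percent, which supports the point that the overlap reflects something real about engineering schools rather than chance.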

Interestingly, RIT appears on neither top-25 list, and the reason is obvious. RIT is really a comprehensive university, despite its name. Only 20% of our undergraduates are enrolled in engineering programs, about the same fraction as are enrolled in fine-arts programs. My guess is that in order to appear on both top-25 lists a college or university would have to have a significantly greater fraction of its students taking engineering majors, which is certainly true for the three colleges mentioned above.

So, ranking people, do us all a favor and find another way to earn a living. We need a diverse set of higher education institutions to serve our nation's needs for an educated citizenry. We also need diversity in higher education to serve our increasingly diverse population. Attempts to rank all colleges and universities using a single set of measures work against the diversification of our colleges and universities to meet these needs, and are therefore a disservice to the public.