And the winner is … highly disputed: U.S. News & World Report college rankings satisfy some, but don’t explain the real picture as seen by faculty

They’re easy to recognize. Some touring students are wearing letter jackets, most have brought their parents, and each is carrying the red folder that designates them as prospectives.

Some are from Ohio, having heard about OWU from older siblings or family friends. But how do the students hailing from Florida or California or Washington hear about OWU?

Their school may have been visited by an Admissions counselor, or they, too, may have heard about the school from someone they know. But many of them—or, more likely, their parents—spotted OWU on a national ranking list like that of U.S. News & World Report and decided to give the Battling Bishops a shot.

But are these national ranking lists accurate? How are the statistics behind them determined? Does a university’s ranking even matter?

What’s the U.S. News & World Report ranking, and how is it determined?

In U.S. News & World Report’s 2011 Edition of America’s Best Colleges, Ohio Wesleyan University was ranked as one of the top liberal arts colleges. OWU, listed at number 102, was among 178 ranked liberal arts colleges.

But how are these rankings determined, and how does Ohio Wesleyan measure up to the other members of the Five Colleges of Ohio, colleges with which OWU competes for prospective students?

According to the U.S. News & World Report website, the 2011 Edition of America’s Best Colleges list is determined using quantitative data that higher-education experts have agreed reliably gauge a university’s quality. Colleges are first classified by type of institution; national universities and liberal arts colleges are two such classifications. U.S. News & World Report then collects data from each university and rates it on sixteen factors, including undergraduate academic reputation, student retention, amount of financial aid awarded, faculty resources, the number of alumni who donate funds and more.

This is the complex process that places Ohio Wesleyan at 102, an impressive showing considering that hundreds of liberal arts institutions operate in the U.S.

But it’s also the process that places the other members of the Five Colleges of Ohio—Denison University, the College of Wooster, Oberlin College, and Kenyon College—ahead of Ohio Wesleyan. Within the liberal arts classification, America’s Best Colleges puts Oberlin at 24; Kenyon at 33; Denison at 49; and Wooster at 71.
In terms of student population, the Five Colleges are comparable. As of 2010, Ohio Wesleyan was the second-smallest, boasting a student body of 1,919. Gambier, OH hosts Kenyon’s 1,632—the smallest student body—while Oberlin represents the largest of the Five with 2,974.

Despite Oberlin’s size, though, it also has a 31 percent acceptance rate—the lowest of the Five. That low rate means it’s harder to get in, ostensibly upping the ante in terms of academics and student performance. Ohio Wesleyan, on the other hand, has the highest acceptance rate, at 69 percent.

Ohio Wesleyan German professor Thomas Wolber has served on the Committee for Financial Aid and is interested in the role college rankings (specifically those of the Great Lakes Colleges Association, or GLCA) play at universities. Wolber said he tends to be skeptical of universities’ acceptance rate statistics.

“I’m no expert on how Wesleyan does it, but I do know other institutions are exaggerating these numbers,” he said. “For example, some institutions may deliberately and intentionally encourage students to apply who are clearly not qualified. They have no chance of coming. But yet the university encourages these students to apply, giving them false hope, for the simple reason [of creating] a lower acceptance rate.”

Getting accepted at a university is one thing; staying in is another. Ohio Wesleyan’s retention rates reflect this distinction: its 87 percent freshman retention rate, tied with Wooster’s, is the lowest of the Five Colleges, followed by Denison’s 90 percent, Kenyon’s 93 percent and Oberlin’s 94 percent.

Oberlin scored high in nearly all of the U.S. News & World Report indicators; on paper, Oberlin students receive the best education of any of the Five Colleges. But they also pay the most. According to the Integrated Postsecondary Education Data System (IPEDS), Oberlin’s tuition and fees for the 2010-2011 academic year totaled $41,577, up from $40,004 the previous year. Least expensive was Ohio Wesleyan: its 2010-2011 tuition and fees came to $36,398, an increase of $1,368 from the year before.

What about the Great Lakes Colleges Association?

To Ohio Wesleyan, national rankings such as U.S. News & World Report and the Princeton Review matter less than comparisons within the GLCA, according to Wolber.

“The GLCA is important because they are our immediate competitors,” Wolber said. “When students apply here, they also apply to and visit Kenyon, Denison, Wooster, Oberlin, Wittenberg and so on. And so we have to pay close attention to our sister institutions. We are not competing with some college in California or Florida. I think the national rankings of U.S. News & World Report, for example, are not particularly relevant to us.”

Depending on the source, there are 12 or 13 schools in the GLCA. (Antioch College went out of business in 2008; efforts are being made to reopen it, and it is sometimes included in GLCA lists.) The other schools in the association are: Albion College, Allegheny College, Denison, DePauw University, Earlham College, Hope College, Kalamazoo College, Kenyon, Oberlin, OWU, Wabash College and the College of Wooster.

The factors for comparison between GLCA schools are similar to those used in determining national ranking lists. Tuition, available majors, student-to-faculty ratios, sports facilities and more play a role in ranking within the GLCA.

Wolber said Ohio Wesleyan and the other GLCA schools are under constant pressure to be aware of these factors and remain competitive.

“If Denison builds a new tennis court, we have to build a new tennis court,” Wolber said. “If we build a new Meek Aquatic Center, then, of course, Kenyon has to do the same thing or they will be left in the dust. Everybody now is emphasizing dormitories, so that is the main reason why we are renovating Stuy…we have to do it, because we might lose students to our competitors. That is why the GLCA group is of paramount importance to our operations here.”

Although they aren’t necessarily private—databases such as IPEDS offer much of the information on GLCA schools—the GLCA statistics are not generally published for the public. Each institution self-reports its statistics and disseminates them to the other universities. Students are largely unaware of a GLCA university’s standing, although evidence of the university’s efforts to keep up with other GLCA schools is visible all around campus.

Wolber said many schools alter statistics, leaving the ranking lists, and the stats upon which they are built, largely untrustworthy.

“I do not know if the problem of airbrushing numbers is a problem at Wesleyan or not…but it happens elsewhere, because the institutions self-report their numbers. And either they deliberately lie, or it might be an inadvertent mistake. They might do false calculations, misinterpret numbers and report the false numbers.”

Wolber said the responsibility should fall on multiple people to create a system of checks and balances, but often there are simply too few people to recheck the statistics.

“It is difficult to double-check that. So people lie, falsify, fabricate numbers simply to keep their jobs, to make everybody happy, and life goes on and they’re not found out. But sometimes they are found out.”

Wolber was referring to a 2012 case at Claremont McKenna College, where a top administrator falsified students’ standardized test scores in an effort to rank higher on national college ranking lists.

To what extent do these rankings matter?

Have universities altered their operations in an effort to score better on ranking lists such as that of U.S. News & World Report? Not necessarily, according to OWU President Rock Jones.

“Most colleges and universities have become more focused on issues related to retention and graduation rates over the past two decades,” Jones said. “This may or may not be a direct response to U.S. News & World Report. In either case, it is a good thing for students.”

Wolber agreed. “Students don’t read [ranking lists]. They may buy a copy, or their parents may buy a copy… But do they really care if it’s ranked 83rd or 104th or whatever? I don’t see that. You look at the website, and you visit Admissions and look at their campus. Do they have what you hope to study? Do you like the teachers? Are the people friendly? And then, of course, tuition and tuition discount. Nobody worries about those national rankings anymore. Other factors seem to play a much more significant role than those rankings. I think they are overrated and exaggerated, and yet many colleges go ahead and tweak their numbers, misreport.”

Meagan Ferns, a junior who works as a tour guide for the Ohio Wesleyan Admissions Office, said in her experience, students typically aren’t concerned about where OWU is ranked on national lists like U.S. News & World Report.

“Most students don’t care about that,” Ferns said. “It’s usually the parents who are more concerned about how we’re ranked versus other schools in the area.”

Jones said national data collected by enrollment research firms show college guides are most important to students at the beginning of the search process.

“We consistently find that students looking at OWU are much more influenced by people they know who have attended OWU, their experience of a visit to the OWU campus, and their relationship with an OWU admission officer than by college guides.”

Jones said overall the categories tracked by national ranking lists are important, but the most crucial measure of a university’s effectiveness is student learning.

“[National ranking lists] do not measure student learning…nor do they measure outcomes, such as post-graduate fellowships, graduate and professional school placement, and placement in the work force. So while we should track the metrics measured by the guidebooks for other purposes, it is far more important to track student learning and the outcomes of an OWU education.”
