
The Rankings Are In: What Have We Learned?

Every year, admissions directors, advancement officials, and others working on college campuses wait anxiously for the announcement of the U.S. News and World Report college rankings. Just as film stars and music moguls await the Oscar and Grammy nominations, higher ed professionals wait to see whether their hard work—recruitment efforts, crowdfunding campaigns, alumni events, marketing strategies, and more—measures up and scores them a spot.

There are a variety of lists a school can land on, from “Best National Universities” and “Best Value” to the questionably titled “A Plus Schools for B Students.” In total, there are more than ten lists on which a school can find its place. When the rankings are released, however, most people focus on the two biggest: “Best National Universities” and “Best Liberal Arts Colleges.”

The top schools dominate the news cycle for a week or two before everyone starts looking ahead to the following year. So, as the U.S. News and World Report and colleges around the country start preparing for next year, what has this year taught us?

Little Has Changed

The lack of movement at the top is undoubtedly frustrating for schools that struggle to claim top spots in the rankings. Since 2014, Princeton has been ranked #1 on the “Best National Universities” list. Meanwhile, for an astounding fifteenth year, Williams College has been ranked the Best National Liberal Arts College in the country.

Schools are left in a vicious cycle, one that ultimately favors Princeton, Williams, Harvard, the U.S. Naval Academy, and other schools that have sat at the top of the list for years. Schools that rank higher are more likely to see increased enrollment and engagement, effectively reserving their spots on the list for years to come. As Forbes contributor Willard Dix argues: “College rankings just confirm what we already know.”

“Nontraditional” Schools Are Being Punished

The world of higher education was a much different place in 1983 when the U.S. News and World Report first started ranking colleges. However, in the 34 years since the first report was released, higher ed has changed tremendously. Unfortunately, the formula the publication follows to award its rankings has not.

Nearly 140 schools appear in the publication but aren’t given a rank. These “unranked” schools find themselves in this predicament for one of several reasons:

  • They may not require SAT or ACT scores for first-year students
  • They may enroll a large proportion of nontraditional (adult, international, etc.) students
  • They may be a school specialized in art, business, or engineering

In 2017, more schools are deciding not to require SAT scores. According to David Rosen, a contributor to The Washington Post and a professor at Trinity College, his school’s decision to become “test-optional” dropped Trinity 30 spots on the list this year. Meanwhile, the population of adult students is growing faster than the population of students in the 18-25 age group.

These changes highlight the growing accessibility of higher education, which is always a positive. However, they also mean that schools that are doing everything right—recruiting more prospects, engaging more students, and so on—are actually being punished for those efforts. Focusing more on “traditional” students will lead not only to a less accessible higher education, but to a less diverse and engaging one as well.

There Are Missing Pieces

While SAT scores may still be a factor in college rankings, plenty of other measures are missing from the publication’s formula, along with concepts that are difficult to quantify at all, like student experience.

The current formula includes the following weights (a quick sketch of how they combine into a single score follows the list):

  • Graduation and retention rates (22.5 percent)
  • Undergraduate academic reputation (22.5 percent)
  • Faculty resources (20 percent)
  • Student selectivity (12.5 percent)
  • Financial resources (10 percent)
  • Graduation rate performance (7.5 percent)
  • Alumni giving rate (5 percent)
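
For readers who want to see the arithmetic, here is a minimal sketch of how a weighted composite score like this comes together. The weights mirror the percentages above; the sub-score names, the 0-100 normalization, and the example values are illustrative assumptions, not the publication’s actual methodology or data.

    # Minimal sketch: combine normalized 0-100 sub-scores into one weighted score.
    # The weights mirror the percentages listed above; everything else here
    # (sub-score names, scale, example numbers) is a hypothetical illustration.
    WEIGHTS = {
        "graduation_and_retention": 0.225,
        "academic_reputation": 0.225,
        "faculty_resources": 0.20,
        "student_selectivity": 0.125,
        "financial_resources": 0.10,
        "graduation_rate_performance": 0.075,
        "alumni_giving_rate": 0.05,
    }

    def composite_score(subscores):
        """Weighted sum of normalized (0-100) sub-scores."""
        missing = set(WEIGHTS) - set(subscores)
        if missing:
            raise ValueError(f"Missing sub-scores: {sorted(missing)}")
        return sum(WEIGHTS[k] * subscores[k] for k in WEIGHTS)

    # Hypothetical school: strong reputation and retention, weaker alumni giving.
    example = {
        "graduation_and_retention": 92,
        "academic_reputation": 88,
        "faculty_resources": 75,
        "student_selectivity": 80,
        "financial_resources": 70,
        "graduation_rate_performance": 85,
        "alumni_giving_rate": 40,
    }

    print(round(composite_score(example), 1))  # roughly 80.9 on a 100-point scale

In this toy example, each point of academic reputation is worth 4.5 times as much as a point of alumni giving (0.225 versus 0.05), which is one way to see how heavily reputation drives the final number.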

Among the things the formula doesn’t include? Student safety, student debt, diversity, facilities, and the voice of the student body itself. If these factors played as large a role as reputation does in the formula above, how might the rankings change? Might we actually learn something from them?

Filed Under: Alumni Development, Campus Engagement, Enrollment Management
