Editorial: Best college rankings: Who really is the ‘best’?
September 14, 2014
Groans inevitably followed the release of U.S. News & World Report’s annual “National Universities Rankings” last week.
Upon learning Pitt’s ranking, for example, a “Really?” is a fair response. We’re tied for 62nd with five other schools: BYU, Clemson, Purdue and the Universities of Maryland and Georgia.
But what exactly makes us the 62nd “best” university in the nation? According to the satirical publication The Onion, one of U.S. News’ main criteria for the list involves the “aggregate incoming freshmen’s SAT, ACT and COWFACTS test scores.”
The last is, obviously, not true, but the SAT and ACT parts are. These scores are included in what U.S. News calls the “selectivity score,” one of the largest factors in the rankings and one based largely on incoming freshmen’s SAT and/or ACT scores. Schools that don’t consider SAT or ACT scores in their admissions process currently go unranked altogether. As The Onion implies, though, this is just about as relevant to a college’s worth as COWFACTS.
This is because the SAT and ACT have major problems. Scores on both tests correlate directly not with a student’s IQ, but with the student’s socioeconomic standing. According to the College Board’s 2009 report, students’ average critical reading, math and writing scores rose along with their family income bracket. For instance, students in the “$200,000 and above” income bracket averaged a 560 on the SAT writing section, while those in the “under $20,000” bracket averaged a 430.
Compounding that data is a multibillion-dollar test prep industry.
Naturally, parents want the best for their children and are willing to spend heavily to get their kids into the best universities possible. SAT and ACT prep books and classes are thus met with high demand and carry high prices that families at the lower end of the economic spectrum cannot afford.
So, when a student can effectively buy his or her way into a college, what does that really prove in terms of merit?
Or, in relation to U.S. News’ list, what does it prove in terms of “selectivity”?
Given these trends, the criteria behind U.S. News’ selectivity category cannot accurately depict the overall worth of a university. U.S. News should focus on less superficial, more practical factors to determine the nation’s top schools more precisely.
When selecting a school, factors like job placement and return on investment are more valuable to a student than the student body’s average SAT or ACT scores.
Job placement rates can reflect how effectively universities prepare students for their fields of study. U.S. News can compute these rates as the percentage of graduates who gain employment related to their degree within six months of graduation.
This would ensure that incoming freshmen know what to expect and where their school stands in the field they wish to study. For instance, the likelihood of eventually becoming a doctor should be vital information to any pre-medical student, and if this were a criterion for the list, Pitt would most likely rank much higher.
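To make the arithmetic concrete, here is a minimal sketch of how such a placement rate could be computed. The records and field names are hypothetical, invented purely for illustration; they are not U.S. News data or methodology.

```python
# Illustrative sketch only: hypothetical graduate records, not U.S. News data.
# A graduate counts as "placed" if they found a job related to their degree
# within six months of graduation.

graduates = [
    {"name": "Graduate A", "in_field_within_six_months": True},
    {"name": "Graduate B", "in_field_within_six_months": False},
    {"name": "Graduate C", "in_field_within_six_months": True},
    {"name": "Graduate D", "in_field_within_six_months": True},
]

placed = sum(1 for g in graduates if g["in_field_within_six_months"])
placement_rate = 100 * placed / len(graduates)

print(f"Job placement rate: {placement_rate:.0f}%")  # 75% for this toy sample
```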
U.S. News can pair this with return on investment, which gauges the cost-effectiveness of attending a particular university: in other words, whether the benefits of going to a certain school outweigh its costs. It can be calculated by comparing the debt graduates take on with the income they accumulate in the years after they complete their education. Before spending thousands of dollars on a college education, a student could then know whether the ends will justify the means.
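The return-on-investment comparison is similarly simple in principle. The sketch below uses entirely hypothetical debt and income figures to show the idea of weighing cumulative post-graduation earnings against the debt taken on to earn the degree; it is not a published formula.

```python
# Illustrative sketch only: every figure here is hypothetical.
# The idea: weigh the income graduates accumulate over a fixed number of
# years against the debt they took on to earn the degree.

average_debt_at_graduation = 35_000   # hypothetical dollars of student debt
average_annual_income = 50_000        # hypothetical post-graduation salary
years_after_graduation = 10

cumulative_income = average_annual_income * years_after_graduation
net_return = cumulative_income - average_debt_at_graduation
income_to_debt_ratio = cumulative_income / average_debt_at_graduation

print(f"Cumulative income over {years_after_graduation} years: ${cumulative_income:,}")
print(f"Net of debt: ${net_return:,}")
print(f"Income-to-debt ratio: {income_to_debt_ratio:.1f}x")
```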
The value of the list lies in its ability to guide students toward the best and most effective universities available. But the criteria must change, for the sake of students and for the sake of many schools’ reputations.