College ranking studies are flawed and arbitrary

Roze Brooks

College rankings look enticing in print, but a closer look at how media outlets and websites choose their criteria for prioritizing colleges reveals a process that is arbitrary and inconsistent.

The methodology of placing colleges in a Best of ______ list is confusing.

According to U.S. News and World Report, “16 key measures of quality” determine the merit of each college.

Things get technical with complicated math and phrases like “category weight” and “sub-factor weight.”

The charts list numerous ranking categories: undergraduate academic reputation, graduation and retention rates and financial resources, along with several others. Each category is broken into sub-factors such as acceptance rates, ACT scores and graduation rate performance.

Those categories are further broken down into National Universities/National Liberal Arts Colleges and Regional Universities/Regional Colleges.

Somewhere in this frenzy of number-crunching, a best colleges list is produced each year. Anyone reading these charts in an attempt to understand why their college wasn’t listed would likely not find an answer.

Criteria vary from study to study, confusing prospective students who may use rankings to decide where to attend college.

Further, it’s hard to label a college as the best of anything without visiting the campus.

Universities surrounded by neighborhoods with high crime rates tend to drop in rank based on criteria completely unrelated to the educational value of the school. Many of these campuses are also perfectly safe and do not reflect the crime rates of the surrounding neighborhoods.

Including student surveys alongside the research and opinions already compiled would be a better way to generate a list of tuition-worthy colleges.

No one knows how awesome or awful a school is better than the students.

[email protected]