Keeping School Rankings Lists in Perspective
School rankings and lists are alluring because they seem to offer efficient and authoritative data on the value of a school. But just as the potential of students cannot be reduced to an SSAT or SAT score, neither can the worth of an educational experience be distilled to a single number. This was reinforced this fall when Boston Magazine first tried to fix errors in its “top 50” list of area high schools, and then retracted the list altogether.
The original rankings used a formula that leaned heavily on SAT scores. Editors soon discovered, however, that erroneous data had skewed the outcome. One school had been credited with an average SAT score of 2,222 out of 2,400, which earned it the plum of third-best high school in the region. The magazine had overlooked, however, that the school had reported the average of its top 10 percent of students rather than of all its students. Another school had reported a median rather than a mean. When the numbers were recrunched, the school originally ranked third plummeted to 48th of the 65 schools evaluated.
To further complicate matters, it turns out that 26 of the schools evaluated—40 percent of the data sample—did not report SAT scores at all. To complete the formula, the magazine assigned each of those schools the average of the scores that other schools did report. Any school that declined to share its students' SAT scores was thus guaranteed a spot solidly in the middle of the rankings.
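For readers who want to see why this imputation trick guarantees a mid-pack finish, here is a minimal sketch of the logic. The school names and scores below are invented for illustration; only the method (filling missing values with the mean of reported values, then ranking) reflects what the magazine described.

```python
# Hypothetical data: three schools report an average SAT score, two do not.
reported = {"School A": 2222, "School B": 1650, "School C": 1480}
missing = ["School D", "School E"]

# Mean imputation: non-reporting schools get the mean of the reported scores.
fill_value = sum(reported.values()) / len(reported)

scores = dict(reported)
scores.update({name: fill_value for name in missing})

# Rank all schools by score, highest first.
ranking = sorted(scores, key=scores.get, reverse=True)
```

Because the fill value is by definition the mean of the reported scores, the non-reporting schools can never land at either extreme: they sit above every below-average school and below every above-average one, clustered together in the middle of the list.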
In addition, the rankings combined all types of high schools into one pool: public, parochial, and independent schools; inner-city and suburban schools; large and small schools; schools focused on special needs alongside mainstream schools. This is like comparing apples and soccer balls to determine which offers the "best" path to health. The rankings made no allowance for the distinct educational mission of each institution.
In the end, Boston Magazine decided to withdraw the list and revisit its methodology before publishing similar articles in the future. We are fortunate that this case study reveals the inner workings of the process. All rankings and lists rest on some effort to line schools up side by side and compare metrics that may or may not match up precisely. The process also depends on the researchers to decide which variables count as "important" in judging a school. Not to mention that any such initiative is open to common human error.
The holistic experience of an education is impossible to truly quantify. It is hard not to turn to lists and rankings when searching for information about schools, but this situation reminds us that such lists are subjective and do not paint a complete picture of a school. This is why we at The Bertram Group work so hard to help families get to know the academic and social culture of a school. The true value of a school cannot be measured with numbers or determined by ranking; it lies in how well the school's specific mission will serve the unique student for whom you are planning.
Holly McGlennon Treat specializes in helping families interested in independent junior and secondary boarding schools. She can be reached at firstname.lastname@example.org