Tuesday, May 22, 2012

High school rankings: Are they important?

Three separate high school rankings have appeared recently. These are the usual end-of-the-year reports that have been circulating in one form or another for years.

The Washington Post Challenge Index takes the total number of Advanced Placement, International Baccalaureate and Advanced International Certificate of Education tests taken each school year and divides by the number of seniors who graduate. It provides a national rank, a regional rank, and a state rank. (Novi's ranks: 756 national; 69 in the Midwest; and 7 in Michigan.)
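The Challenge Index arithmetic is simple enough to express in a few lines. Here is a minimal sketch in Python; the numbers in the example are made up for illustration and are not Novi's actual figures:

```python
def challenge_index(tests_taken: int, graduating_seniors: int) -> float:
    """Total AP/IB/AICE tests taken in a school year divided by
    the number of graduating seniors (the Challenge Index ratio)."""
    if graduating_seniors <= 0:
        raise ValueError("graduating_seniors must be positive")
    return tests_taken / graduating_seniors

# Hypothetical school: 1,200 exams taken, 400 graduating seniors.
print(challenge_index(1200, 400))  # 3.0
```

Note that the ratio counts tests, not students, so a school where a smaller group of students each takes many exams can score the same as one where most students take a few.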

Newsweek also compiles a list using a variety of factors: graduation rate, college matriculation, AP/IB test participation, AP/IB test scores, average ACT/SAT test scores, and AP courses offered per student. By this ranking Novi is number 472 among all the high schools in the United States.

A third ranking appears courtesy of U.S. News & World Report. Here Novi is not ranked either nationally or in Michigan, even though some of the schools that are ranked have scores lower than Novi's.

What does it all mean?

Jay Mathews, an education writer for the Washington Post, has an article that discusses the high school ranking phenomenon. The bottom line is that each list has a formula that includes some schools and excludes others. He admits that if a school appears on any of the three lists, it should be happy.

From my perspective, the lists try to identify a set of criteria and then measure schools against those criteria. Novi does well on two of the lists and is mysteriously left off the third.

External validation is important.

More important is a robust accountability system that we develop and measure ourselves against. We are in the process of developing that.

Two of our four district goals deal with student achievement:

  1. The Novi Community School District will ensure that each student will make no less than one year's growth in one year's time.
  2. The Novi Community School District will ensure that all students achieve at a high level. (There will be no achievement gaps.)

If we can accomplish these two goals, every student in our district will be prepared to leave our district and be successful. Our responsibility is to ensure that students learn, and these two goals keep us focused on that responsibility.

Rankings are interesting, and they make for some enjoyable debates. But what is more important is being able to clearly demonstrate that every student is making progress and that every student is learning. If we can do that, then I can rest well at night.


  1. Jay Mathews's method for ranking schools by the number of IB/AP tests taken is flawed. It has little relevance and does nothing more than skew rankings in favor of schools that offer both AP and IB.

    Unfortunately, it also has the effect of putting pressure on school leaders around the country to offer IB in order to improve their ranking. Jay has been a vocal advocate for IB for many years and is the author of "Supertest: How the International Baccalaureate Can Strengthen Our Schools".

    At some point, I would be very interested in seeing a report detailing the amount of money our district spends on this program relative to the number of students earning the IB diploma each year. While earning an IB diploma is quite an accomplishment, I'm not convinced that it offers non-diploma candidates any significant advantages.

    I know we're only just graduating our first group this year, but we should continue to monitor the program's cost-effectiveness closely.

  2. U.S. News & World Report has been ranking U.S. universities since 1983. In 1998, Jay Mathews, then affiliated with Newsweek, developed the first high school ranking system. Last year, Newsweek was sold in a fire sale for $1, and Mathews's list was relegated to the Washington Post.

    Do any of these lists truly reflect the quality of education delivered to our public school students? Nope. They sell magazines/newspapers and stroke the egos of administrators who supply the media with the stats.

    People should note that the USNWR 2012 list is based on 2009-2010 data, while WAPO's is based on 2010-2011 data. I haven't checked Newsweek's, as that left-wing rag should have gone the way of the dodo.

    Because IB has both SL and HL exams and can push students to also take the AP exams, IB schools have an unfair advantage in all three rating systems. AP students cannot "supplement" their transcripts with IB exams the way IB students can with AP.

    This is actually fairly ironic, since IB supporters claim IB is not about teaching to the test but then encourage the same students to sit eight hours' worth of advanced exams in a single subject rather than relying on IB alone.

    These "Lists" are nothing more than a PR scam and deserve to be ignored.