More Than One Path Out of the Bottom

Five years ago, the Obama administration’s School Improvement Grants (SIG) famously targeted extra resources to the nation’s most struggling schools. The feds defined “struggling” schools as those performing in the bottom 5% of their state.

As is often the case with sweeping policy initiatives, the results were mixed. It’s probably unsurprising, then, that when we recently looked at the bottom 5% of public schools in 50 mid-to-large-size cities for our report, Measuring Up, we found that around 40% of the schools that ranked in the bottom 5% of their state (based on proficiency) in the first year of our data stayed in the bottom 5% for the next two years.

But the results weren’t all bad news. In a handful of cities, none of the schools we identified in the bottom 5% stayed there for three years running. Given the limits of our data, we can’t say what (if anything) these go-to-zero cities did to address their lowest performers. We also don’t know what resources these schools had at their disposal or how much the results might reflect relative performance shifts by other schools in the state.

We can, however, see that these cities got their bottom 5% schools to zero, or close to zero, by different paths.

Those different paths are clear if we compare charts that track what happened to schools ranked in the bottom 5% of their state over three years in three cities: Memphis, New Orleans, and Washington, D.C. But first, we need a little explanation of how to read the charts, which are called “river plots,” or Sankey diagrams.

The charts track schools that were in the bottom 5% of their state in the first year of the data (2012) and what happened to their rankings over the next two years. On the left side of each chart, we show the share of schools in each city that ranked in the bottom 5% of the state (or, in the case of DC, the city) in 2012. The chart then tracks what happened to these bottom-ranked schools in 2013 and 2014.

Importantly, the width of each “flow” in the chart is proportional to the number of schools in one of three conditions: (1) orange flows represent schools stuck in the bottom 5%; (2) green flows represent schools that moved up and out of the bottom 5%; and (3) blue flows represent schools whose unique identifiers (which we use to track schools over time) exited the data set, suggesting that the school either closed for good or re-opened as a new school under a different identifier.
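To make those three conditions concrete, here is a minimal sketch of how one might label each starting school for each follow-up year, assuming a simple table of school identifiers and statewide percentile ranks. The data, column names, and helper function below are hypothetical stand-ins for illustration, not the actual Measuring Up data set or its field names.

```python
import pandas as pd

# Hypothetical input: one row per school per year, with the school's
# statewide proficiency percentile. These values are illustrative only.
ranks = pd.DataFrame({
    "school_id":  ["A", "A", "A", "B", "B", "C", "C", "C"],
    "year":       [2012, 2013, 2014, 2012, 2013, 2012, 2013, 2014],
    "percentile": [3, 4, 4, 2, 3, 4, 8, 12],
})

BOTTOM = 5  # bottom 5% of the state

# Schools that start in the bottom 5% in 2012.
start = set(ranks.loc[(ranks.year == 2012) & (ranks.percentile <= BOTTOM), "school_id"])

def condition(school_id: str, year: int) -> str:
    """Classify a starting school's flow in a given year."""
    row = ranks[(ranks.school_id == school_id) & (ranks.year == year)]
    if row.empty:
        return "exited"    # identifier left the data set (closed or reconstituted)
    if row.percentile.iloc[0] <= BOTTOM:
        return "stuck"     # still in the bottom 5%
    return "moved up"      # climbed out of the bottom 5%

for year in (2013, 2014):
    labels = pd.Series({sid: condition(sid, year) for sid in start})
    print(year, labels.value_counts().to_dict())
```

A charting library with Sankey support (plotly, for example) could then draw flows between these year-by-year labels to produce a river plot like the ones below.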

Let’s look at the flows in the three cities, beginning with what happened to New Orleans’ bottom-scoring schools in mathematics. The following chart reveals two things right away: first, none of the schools in New Orleans that started in the bottom 5% of the state in 2012 were still present in our data set in 2014 (their unique identifiers fell out of the data). These schools likely either closed or were reconstituted in some way. Second, a large share of closures or reconstitutions in New Orleans happened in 2013: you can see this because the width of the blue flow in 2013 captures most of the schools that were orange in 2012.

In New Orleans, the bottom 5% in math faced closure or reconstitution, mostly in 2013:

New Orleans Math Data

The next chart shows reading results in Memphis. Again, the majority of bottom-scoring schools from 2012 are “blue” by 2014. But one school—the thin green line at the top of the chart—improved its rank, moving up and out of the bottom 5% of the state. It’s also clear from the Memphis chart that, relative to New Orleans, Memphis closed or reconstituted most of its bottom-ranked schools in 2014 rather than in 2013.

In Memphis, the bottom 5% in reading faced closure or reconstitution, mostly in 2014:

Memphis Reading Data

Finally, Washington, D.C.’s chart for math proficiency shows an entirely different picture. Here, unlike in New Orleans and Memphis, the majority of bottom-scoring schools moved up and out of the bottom 5%, as shown by the two green flows on the top right of the chart. A few schools were closed or reconstituted in 2013 (the thin blue flow in the center); a few were also closed or reconstituted in 2014. We can also see some relapse: a flow that started in the bottom 5% in 2012 improved in 2013, but then returned to the bottom 5% in 2014.

In Washington, D.C., most of the bottom 5% schools in math improved in either 2013 or 2014; fewer faced closure or reconstitution:

Washington, D.C. Math Data

Three years of data isn’t much time to consider the fate of a city’s lowest-performing schools. But it’s enough to see that, at least among cities where few or no schools stayed at the bottom of their state for three consecutive years, leaders can address low performers through multiple pathways. Of course, the general pathways we describe here gloss over complex policies and strategies in these cities that deserve a closer look, from aggressive efforts to close and turn around low-performing schools in New Orleans, to the Achievement School District’s work in Memphis, to DC Public Schools’ improvement efforts under Chancellor Kaya Henderson and the DC Public Charter School Board’s oversight of the city’s charter sector.
