University of Arkansas Office for Education Policy

Rating From OEP: Thumbs Down!

In The View from the OEP on March 17, 2011 at 9:22 am

While it is not generally in the OEP job description to do movie reviews, I do believe it is appropriate for OEP to review reports on education that influence the policymaking process. Earlier this week, I was asked to comment on a story about a new report making the rounds at the Capitol on the effectiveness of charter schools. The story, in the Arkansas Dem Gaz, focuses on a report claiming that the success of charters in Arkansas is due solely to the fact that charters have more favorable demographics (that is, according to the group, kids from richer families).

In short, the report (co-sponsored with the AEA and our friends at AACF) is based on analyses and data that are quite simply … well … bad. In my comments in the Dem Gaz article, I raised a few of my concerns with the report. However, with the extra space afforded here in cyber-land, I will elaborate a bit on those concerns and organize them into two primary categories. First, the data used and the analyses conducted are inappropriate. Second, and perhaps more importantly, the conclusions drawn by the authors are not supported by the evidence.

Faulty Comparisons Using Bad Data

  • The regression analyses are based on grade levels (instead of individual students), use only one year of data with no attempt to measure student growth (and that year is not even the most recent year available), and include very strange choices of comparison schools. A simple sketch of this levels-versus-growth distinction follows this list.
  • Among the examples of inappropriate comparisons are the use of Farmington as a comparison district for Haas Hall (which has been located in Fayetteville for several years) and the use of Little Rock SD as a comparison for the Arkansas Virtual Academy (which has students from all across the state).
  • Moreover, the report erroneously suggests that Haas Hall and the Virtual Academy serve NO low-income children. That figure comes from free-lunch counts: neither school has a cafeteria, so neither serves any free lunches, and their free-lunch rates register as zero. This does NOT mean that 0% of the children in these schools are low-income; leaders at both schools report that roughly half of their students are poor.
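
To make the first bullet concrete, here is a minimal, hypothetical sketch of the difference between the report's approach and a student-level growth analysis. This is not the report's code (or OEP's); the data file and column names (student_id, school_id, score_2009, score_2010, frl, charter) are placeholders. The point is simply that regressing single-year, grade- or school-level averages on demographics cannot tell us whether students in charters are gaining more or less than comparable peers.

```python
# A minimal, hypothetical sketch -- not the report's code and not OEP's.
# Column names (school_id, score_2009, score_2010, frl, charter) are
# placeholders for illustration only.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("student_scores.csv")  # hypothetical student-level file

# Report-style analysis: collapse to school means for a single year and
# regress achievement LEVELS on demographics. Growth is never measured,
# so the charter coefficient mostly reflects who enrolls.
school_means = df.groupby("school_id", as_index=False).agg(
    mean_score_2010=("score_2010", "mean"),
    pct_frl=("frl", "mean"),
    charter=("charter", "first"),
)
levels_model = smf.ols("mean_score_2010 ~ charter + pct_frl",
                       data=school_means).fit()

# Student-level growth analysis: control for each student's own prior
# score, so the charter coefficient reflects gains from 2009 to 2010.
growth_model = smf.ols("score_2010 ~ score_2009 + charter + frl",
                       data=df).fit()

print(levels_model.params["charter"])
print(growth_model.params["charter"])
```

In the first model, the charter coefficient largely reflects which families enroll; in the second, it reflects what students gain over the year, which is the comparison the report never attempts.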

Conclusions Are Unrelated to Any Evidence

  • The report concludes that charters cause a problem for students left in “under-resourced” traditional public schools. I am not exactly sure what “under-resourced” means here: in 2009-10, for example, traditional public schools in Arkansas spent $11,717 per student, while charters spent $9,417 (data gathered from ADE Annual Statistical Reports).
  • Finally, and this is especially relevant this week, the authors conclude (without any connection to the data they used) that the state needs more accountability to close failing charters. It should be clear to any observer of education in Arkansas that charter schools actually face stricter accountability, not less. First of all, they receive no funding if no students make the active choice to enroll there. Charters must also go before the State Board regularly for renewal. And, of course, the State Board on Monday voted to shut down a Little Rock charter school, effective immediately, due to financial problems (click here for more on the reaction of UCPC students and parents). How much stricter should our state’s charter policies be?
  • Throughout, the report is also laden with internal inconsistencies. On page 1, the authors cite a Stanford study from 2009 showing that Arkansas charters outperformed traditional public schools, and on that basis they give credit to the State Board’s careful screening process for charters. Nonetheless, the authors then conclude that the evidence suggests the state needs more criteria and accountability for charter schools. Which is it: do charters perform well or not? Do we have a good screening process or not?

It is good that this group tried to assess student performance; it is not so good that the conclusions were entirely unrelated to the data and were likely drawn up well before any statistical analyses were conducted!

In fact, the authors’ interpretation of the student performance data (despite the flawed analyses) ended up being pretty reasonable and in line with what others have found. Charters likely perform just a little bit better than their traditional public school peers. Some do great work (e.g., KIPP), and some do a lousy job. The same is true of traditional public schools: most are pretty good, some are great, and some are not very good. The difference is that the State Board generally allows under-performing traditional public schools to stay open, and the students in those schools do not have the option to go elsewhere.

I am becoming more convinced that this whole debate is counter-productive in our state.  In my view, it would be far more productive if those of us in the education establishment would spend less time trying to limit the growth of charters and spend more time trying to improve the education we deliver to the 95% of students who attend traditional public schools.  Every minute that we spend lobbying policymakers to fight charter schools is a minute we’re NOT trying to help our teachers come up with even better strategies to serve all students across Arkansas.

— Gary Ritter
