University of Arkansas Office for Education Policy

Unpacking School Performance Ratings

In The View from the OEP on October 3, 2018 at 1:52 pm

Arkansas school performance ratings and A-F letter grades will be released to the public on October 12th.

Here’s what we think you can expect:

1) Most schools will get the same Letter Grade as they did last year.

2) Schools serving a smaller percentage of students eligible for free and reduced lunch will be more likely to get an “A” or a “B” than schools serving a population where a greater percentage of students experience economic hardship.

3) Arkansas’ growth measure, a powerful indicator of students’ academic improvement over time, will still be overshadowed by single-year achievement and will still be challenging for educators and parents to understand.

____________________

In advance of the release, we wanted to review the purpose of a school performance report.

This is the true baseline year for Arkansas’ new accountability system.  The state has worked diligently to develop an accountability system that will support student learning, as illustrated in the theory of action below.

[Figure: Theory of Action]

The idea is that if schools get good information about what is really happening in their buildings, then they can make changes that improve outcomes for students.

The continuous cycles of inquiry will take time to develop and build, and will require some new feedback systems to support schools and districts as they work to identify needs within their systems.

But does a school performance report really give schools the tools that they need to improve?  It’s complicated, and not just for Arkansas! A report released yesterday provides some insight into how stakeholders feel about school accountability across the nation.

One of the big benefits of school accountability identified in the report has been the increased transparency and quality of information about what is going on in schools. The ability of schools to interpret the data and develop a plan to improve learning and resource allocation is key to Arkansas’ plan, as presented below.

[Figure: Cycle of Inquiry]

Arkansas’ school personnel have been able to preview the school performance report since September 21, and we can tell that they have because ADE has made some updates to the reports and extended the private review deadline.

We appreciate the state letting school personnel review the information, and fully support the pursuit of high quality data in the system.  But we worry that school personnel are spending a lot of time trying to check the scores, as opposed to interpreting the data and developing a plan to support student learning.

One thing we think will help school personnel redirect their time from checking a bunch of data points to developing a plan to support student learning is to really understand what is driving the school performance score.

The Biggie- Achievement

The majority of the school performance score is determined by student achievement on the 2017-18 ACT Aspire English Language Arts and Mathematics assessments for students in grades 3-8.  Wealthier schools will generally have higher achievement.  We know that achievement on assessments is negatively correlated with student risk factors such as poverty.  You can check out the relationship in our data visualization.
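For readers who want to see what that kind of relationship looks like, here is a minimal sketch using made-up school-level numbers; the free/reduced lunch rates, achievement values, and the strength of the relationship below are invented for illustration and are not OEP’s actual data or visualization.

```python
import numpy as np

rng = np.random.default_rng(1)
n_schools = 200

# Hypothetical school-level data: share of FRL-eligible students and an
# achievement index. Both columns are simulated, not real Arkansas data.
frl_rate = rng.uniform(0.2, 0.95, size=n_schools)
achievement = 85 - 35 * frl_rate + rng.normal(0, 8, size=n_schools)

# Pearson correlation between poverty and achievement; with data shaped
# like this, the coefficient comes out clearly negative.
r = np.corrcoef(frl_rate, achievement)[0, 1]
print(f"r = {r:.2f}")
```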

Schools and the public have had information about achievement since scores were released early in the summer, much earlier than in previous years. Schools may have difficulty, however, calculating what their weighted achievement score would be because student ELA scores are not reported by the full range of categories used in the school performance calculations.  Additional information about cut scores for the full range of performance categories is included in the Final ESSA Decision Rules, and could be applied to student scores, but that would take quite a bit of time!
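To give a sense of the arithmetic involved, here is a minimal sketch of a weighted achievement calculation. The category names and point weights are placeholders we made up for illustration; the real categories and weights in the Final ESSA Decision Rules would need to be substituted before the numbers mean anything.

```python
# Illustrative point weights per performance category. These are NOT the
# official values -- the actual categories and weights come from the
# Final ESSA Decision Rules.
ILLUSTRATIVE_WEIGHTS = {
    "in_need_of_support": 0.0,
    "close": 0.5,
    "ready": 1.0,
    "exceeding": 1.25,
}

def weighted_achievement(student_categories):
    """Average per-student points and scale to a 0-100-style index."""
    points = [ILLUSTRATIVE_WEIGHTS[c] for c in student_categories]
    return 100 * sum(points) / len(points)

# Example: a hypothetical school with six tested students.
example = ["ready", "close", "exceeding", "in_need_of_support", "ready", "ready"]
print(round(weighted_achievement(example), 1))  # 79.2
```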

Note: As we mentioned in our earlier blog, achievement on the ACT Aspire pretty much stayed the same as last year, but the weighted achievement scores will be lower than last year for most schools because ACT Aspire modified the criteria for ELA readiness. Thankfully, because of the forward-thinking actions taken by the ADE to equate the scores with prior years and adjust the cut points for the letter grade, the lower achievement scores should not result in lower school letter grades across the state.

The Most Important (to us!)- Academic Growth

Here at OEP we feel that this is the most important piece of information in the school performance report because it reflects how much students at the school increased their academic performance over time compared to how much the average student improved.  We feel growth is a much more informative indicator of how schools are educating students than achievement, and are pleased that it is much less correlated with school poverty rates.

Unfortunately, schools can’t verify this growth information because it is calculated at the state level, using the relationship between current and historical test scores of every student in the state to develop a ‘predicted score’.  This score is then compared to each student’s actual score to determine if the student’s academic achievement as measured by the ACT Aspire assessment was more than expected, as expected, or less than expected.  These student-level scores are averaged at the school level and reported to the school as a transformed variable with a mean of 80 and a standard deviation of around 3. We are confident that the calculations are correct, and would advise schools to worry less about recalculating the values themselves (which they can’t do anyway, since they only have access to their own school’s data) and more about understanding what this indicator means.
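As a rough illustration of the mechanics described above, the sketch below simulates statewide scores, builds a simple ‘predicted score’ from prior-year results, and transforms the residuals to a scale with a mean of 80 and a standard deviation of about 3. The one-variable linear prediction and the simulated numbers are assumptions for illustration only; the state’s actual value-added model draws on richer score histories.

```python
# A minimal sketch of how a growth indicator like the one described above
# could be constructed. The prediction model (a simple linear fit on
# prior-year scores) and the simulated data are assumptions for
# illustration, not the state's actual calculation.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical statewide data: prior-year and current-year scale scores.
prior = rng.normal(425, 15, size=5000)
current = 0.9 * prior + rng.normal(45, 8, size=5000)

# 1) Predict each student's current score from their score history.
slope, intercept = np.polyfit(prior, current, 1)
predicted = intercept + slope * prior

# 2) Compare actual to predicted (more / about / less than expected).
residual = current - predicted

# 3) Standardize and transform to the reported scale (mean 80, SD ~3).
growth_score = 80 + 3 * (residual - residual.mean()) / residual.std()

# 4) Average the student-level growth scores for one school's students.
school_students = slice(0, 40)  # pretend the first 40 students attend one school
print(round(growth_score[school_students].mean(), 2))
```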

Many schools throughout the state are familiar with NWEA’s MAP Growth, where there is a target score required for students to meet annual growth.  This makes it easy for schools to identify whether students met or exceeded growth.  NWEA has 370 million test event records spanning more than 15 years, so they have a really good idea of how a typical student will increase their score over time.  ACT Aspire is a relatively new assessment, however, so we want to make sure that we aren’t ‘guessing’ how much a typical student ‘should’ increase.   Instead, the state uses real data to determine how much a typical student DID increase from one year to the next and then compares that to the performance of students with similar test score histories.

Growth is really the most meaningful at the student level.  If students in program X are not meeting growth expectations, while students in program Y are, then careful consideration should be given to re-allocating resources so more students can benefit from program Y.  In discussions with the ESSA advisory team yesterday, we were thrilled to hear that the state may be able to provide student-level growth information in the future which would be super valuable to school leaders as they develop a plan to enhance student learning and resource allocation.

The Most Distracting- SQSS

The School Quality and Student Success (SQSS) indicator is a mouthful, but is really the smallest contributor to the overall school performance score. Since parts of SQSS reflect achievement, it is not surprising that SQSS scores are also negatively correlated with school poverty rates (r = -0.48).

As we have said before, there are a lot of indicators included in this measure, and school personnel may be spending a lot of time focusing on each indicator and wondering if the data are accurate. For some indicators, schools could verify the data through their own systems by applying the business rules, but for other indicators they cannot.  The first time schools saw the SQSS indicators was last spring, and the data included in the current school performance reports were pulled soon after.  Because SQSS indicators represent systems in place at schools, such as attendance reporting practices and course enrollment, and because these systems may require some time to adjust, we don’t expect to see large (or meaningful) changes in these scores yet.

We like how SQSS indicators can help schools get more accurate information about what is happening at their school, but are looking forward to when they are presented in a way that schools can really use them in their strategic planning to support student outcomes.




What’s in a Grade? 

The school performance report also supports Arkansas’ legislation that every school must receive an A-F letter grade. The letter grade was designed to create a method for parents to easily understand the quality of a school, but does an A-F letter grade really give parents the information that they need about how a school is doing? One stakeholder in yesterday’s report captured the challenge of assigning schools letter grades:

“How do you make something that is simple enough to be understood, like an A through F rating system, but also incorporate a number of different factors that are complex enough to capture all of the things we want schools to do? Everything from math and reading to also discipline data or enrollment data or attendance data or all these other sort of facets of that system. So how do you make something that is usable and understandable, but also nuanced?”

Another stakeholder pointed out how the A-F grade represents what matters to the developers of the metrics:

“I sort of feel like the single rating of either A through F or on a number is sort of the worst impulses of accountability. Because not only are you saying what matters by its inclusion in that, but how much it matters, by how it’s weighted. So man, that takes a lot of faith in yourself that you can specify how much you should care about academics relative to attendance, relative to these other things.”

We agree, and have addressed before how, although the intention was for growth to weigh more heavily in the school performance reports and associated letter grade determination, single-year achievement still drives the results.  When schools where students make the largest improvements in achievement can still be saddled with a low grade due to the characteristics of the population they serve, we are sending the message that growth doesn’t really matter.   And as long as the performance index results in schools serving more advantaged students getting higher letter grades, we also send the message to parents that what makes a good school is not the learning that happens inside the building, but how large the houses are that surround the school.

We urge educators and parents to focus on the academic growth indicator, and view the purpose of a school performance report as the beginning of an ongoing conversation about how to continually increase student learning. 
