University of Arkansas Office for Education Policy

Play It Again Sam… Letter Grades and ACT Scores

In The View from the OEP on October 17, 2018 at 12:20 pm

It’s been a big week in Arkansas education, with A-F Letter Grades and ACT scores being released, but both scores are generally a repeat of last year’s results.

As expected, the majority of schools (67%) received the same letter grade as last year.  Some schools moved up a level (16%) or down a level (17%), but these shifts generally reflected schools crossing a classification threshold rather than significant changes in performance.  ACT scores for Arkansas’ 2018 graduates were the same as those for the Class of 2017. The average composite score of 19.4 and the determination that 17% of Arkansas graduates met the readiness benchmarks in all four tested areas are both below the national average and exactly the same as last year’s results.

How are school leaders, parents, and policy makers to interpret the static results?  Here at the OEP, we hope to share some tools for interpreting the new data. We also want to get into the weeds about how the Achievement and Growth scores interact in the ESSA model.

Before we jump into the details, we have some good news to highlight!

First- we congratulate ADE on getting the school performance information out so quickly this year. This helps school leaders evaluate how their schools are serving students and make timely changes to address areas where the numbers are low.

Second- we celebrate the transparency that ADE has built into the system, and we fully support Arkansas’ decision to provide all students the opportunity to take the ACT for free.  These are positive decisions for students!  We love the public availability of the school performance reports through myschoolinfo.arkansas.gov, especially the option to compare achievement and growth performance with similar schools.  This option is available within the Reports tab by selecting similar schools based on %FRL, racial percentages, and/or geographic proximity (see images below). If you haven’t checked this out, you should!

School filter

apply filter

The data is a lot to take in, and it can be tough to try to figure out what’s important and what the patterns are. We have posted this interactive data viz of the Letter Grades and associated scores to help you see a statewide picture.  You can find your school, and filter by academic growth score and poverty rate to see how schools that are similar to yours have scored.

OEP’s Interactive 2017-18 ESSA Data Visualization

ESSA Viz

Even with all the data out there, however, we are concerned that it may be difficult for many stakeholders to understand how their school is performing. In addition, we think each indicator provides insight independently, and so we made these simple one-pagers for easy reference.  To ease interpretation for stakeholders, we assigned each school a percentile rank (within its assigned grade span) for the Achievement, Growth, and School Quality indicators.  While this is too superficial for school personnel to act upon, we hope you find it a helpful communication tool. You can get them from our website officeforeducationpolicy.org, or here:

OEP’s One-Page School Summary of 2017-18 ESSA Data

One Pager
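For the curious, the percentile-rank idea is simple enough to sketch in a few lines of code. This is a hypothetical illustration: the school names, field names, and scores below are made up, not the actual ADE data layout.

```python
# Hypothetical sketch: assign each school a percentile rank for an indicator
# within its grade span, so schools are only compared to schools serving
# similar grades. All names and values are illustrative.
def percentile_ranks(schools, score_key):
    """Return {school_name: percentile} computed within each grade span."""
    by_span = {}
    for s in schools:
        by_span.setdefault(s["grade_span"], []).append(s)
    ranks = {}
    for span_schools in by_span.values():
        scores = [sch[score_key] for sch in span_schools]
        n = len(scores)
        for sch in span_schools:
            # percent of schools in the span scoring at or below this school
            at_or_below = sum(1 for x in scores if x <= sch[score_key])
            ranks[sch["name"]] = round(100 * at_or_below / n)
    return ranks

schools = [
    {"name": "A Elem", "grade_span": "elementary", "growth": 82.1},
    {"name": "B Elem", "grade_span": "elementary", "growth": 79.4},
    {"name": "C High", "grade_span": "high", "growth": 80.6},
    {"name": "D High", "grade_span": "high", "growth": 77.8},
]
print(percentile_ranks(schools, "growth"))
```

Ranking within grade span matters because elementary, middle, and high schools are scored on somewhat different components, so a raw statewide ranking would mix apples and oranges.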


Now we are going to dig into the school performance data, and consider the new letter grades. Don’t worry – there are pictures, and you can download the data we used here!

As noted above, the majority of schools (67%) received the same letter grade as last year, and movement up a level (16%) or down a level (17%) generally reflected schools crossing a classification threshold rather than significant changes in performance. Overall, the percentage of schools receiving each letter grade was similar to last year.

Figure 1: School Letter Grade Percentages, 2016-17 and 2017-18.

2017-18 LetterGrade Chart


Certain types of schools are more likely to get A’s and B’s.

Once again this year, we find that schools serving a lower percentage of students who participate in Free/Reduced Lunch (FRL) generally get better grades than schools that serve a more disadvantaged population.  As you can see in Figure 2 below, there is a strong negative relationship between FRL and ESSA scores (r = -0.69), although schools with the same %FRL span a range of about 20 ESSA points. Take note that many schools report 100% FRL, which is the result of districts participating in the Community Eligibility Provision (CEP).  We like the CEP, but some of these schools reported much lower FRL rates in years past, so we should use caution when considering the performance of these schools since it may not be an apples-to-apples comparison.

Figure 2: School ESSA Index Score and %FRL, 2017-18.

ESSA FRL
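If you want to check a correlation like the r = -0.69 above against your own data pull, Pearson’s r is straightforward to compute. The values below are toy numbers for illustration, not the actual school file:

```python
# Pearson correlation coefficient, computed by hand on toy %FRL and
# ESSA Index values (illustrative only; the real file has ~1,000 schools).
from math import sqrt

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

frl  = [35, 48, 60, 72, 85, 100]   # % Free/Reduced Lunch
essa = [78, 74, 70, 66, 63, 58]    # ESSA Index score
print(round(pearson_r(frl, essa), 2))  # strongly negative on this toy data
```

A value near -1 means higher-poverty schools reliably score lower on the index; values near 0 (like the growth indicator discussed below) mean the measure is largely independent of poverty.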

As we’ve said before, the ESSA index is mainly driven by achievement, and the two scores are almost perfectly correlated at r = .97.  This is frustrating because the model was supposed to weight growth more heavily than achievement, but in practice achievement scores overwhelm growth. In addition, you can see in Figure 3 that this year’s achievement is almost perfectly correlated with prior-year achievement (r = .94). So high-achieving schools (generally lower-FRL schools) tend to get better grades both years.

Figure 3: Weighted Achievement, 2016-17 and 2017-18.

Achievement 2 year


What about GROWTH?

The good news is that academic growth (our favorite) is less associated with school FRL rates than achievement is (see Figure 4; r = -.25). Growth is the indicator that measures how students scored compared to how well we predicted they would score based on prior achievement.  We feel it is very important to examine this indicator carefully, as it is the best reflection of the learning occurring in our classrooms.

Figure 4: Academic Growth and %FRL, 2017-18.

growth FRL

Interestingly, as you can see in Figure 5, academic growth is strongly correlated with prior year growth (r=.80).  This means that schools with high growth in 2016-17 also had high growth in 2017-18. The reverse is also true, unfortunately, as schools where students performed lower than predicted in 2016-17 demonstrated the same pattern in 2017-18.

Figure 5: Academic Growth 2016-17 and 2017-18.

Growth 2 year

This is great news for Arkansas educators!  We have an indicator of student learning that is only weakly correlated with school poverty, but seems to consistently identify schools as high, average, or low growth. Keep an eye out for the OEP awards, which will celebrate the high-growth schools!


But does increased growth relate to increased achievement, as we would think it should? The figure below represents the change in growth score from 2016-17 to 2017-18 and the change in achievement score over those same two years. Although increased growth does not always correlate with positive changes in achievement (because students can be achieving at a higher level than predicted but not necessarily making it into another proficiency category), there are few schools in the lower right quadrant where growth decreased and achievement increased. Of greatest concern are the schools in the lower left quadrant, where growth and proficiency both decreased since the prior year.

Figure 6: Change in Academic Growth and Achievement, from 2016-17 to 2017-18.

Growth Achievement


So what should school leaders, parents, and policy makers do?

  1.  Get into the data!  Understand and communicate how your school is performing on each indicator compared to similar schools.  If there is an indicator where scores are relatively high, build on that success.  If there is an indicator where scores are relatively low, consider what might be contributing to that score and work to change students’ school experiences. Stakeholders and policy makers should celebrate school successes, and work to support development in low-performing indicators.
  2. Focus on the learning every day! The ESSA Index, the ACT Aspire, and the ACT tests are once-a-year snapshots of performance.  Teachers should be using high-quality, ongoing formative assessments to understand where their students are and work to help move them forward in their learning. Stakeholders and policy makers should support schools as they work to create opportunities for growth for their students.
  3. Stay the course! Change takes time. To move the needle on student achievement and success in Arkansas, we need to put in the work and give the work time.  Changing assessments or backing away from rigorous, high-quality school performance analyses will only add instability to the system.

Arkansas education is on the right track, so let’s keep playing that tune.

Arkansas’ Struggling Readers

In The View from the OEP on October 10, 2018 at 3:12 pm

Today we are excited to release new research about Arkansas’ struggling readers. We thought that since school performance reports are being released this Friday, it would be a good time to remember that actual kids are behind the test scores used to generate the reports.  We hope you take a moment to reflect on who Arkansas’ struggling readers are, and how their reading skills develop through early high school.

We think this research is particularly important in light of all the effort that Arkansas educators are putting into improving early reading ability. By better understanding the historical improvement patterns of students who demonstrate low reading ability in third grade, we can better evaluate the effectiveness of the efforts to improve outcomes for struggling readers.

We examined the reading achievement of nearly 77,000 Arkansas students who were continuously enrolled in Arkansas public schools from 3rd grade through early high school. We hope you read the policy brief and more in-depth Arkansas Education Report, but we briefly summarize our findings here:


Who isn’t reading ‘on grade level’ in 3rd grade?

  • Students who qualify for free or reduced price lunches were twice as likely to be low-achieving readers in 3rd grade, compared to their more economically advantaged peers.
  • Students who are Black or Hispanic were twice as likely to be low-achieving readers in 3rd grade, compared to their White peers.

We know- you’re like “Duh,” any teacher could have told you that, but it is important to have the data, the facts, about our struggling readers.

  • Males are somewhat more likely than females to be identified as low-achieving readers but the difference is not as large as it is between economic and racial groups.
  • ELLs are somewhat more likely than non-ELLs to be identified as low-achieving readers but the difference is not as large as it is between economic and racial groups.

Do the students who demonstrate low reading achievement in 3rd grade ‘catch up’ to their peers over time and what are the characteristics of students who do?  Note- we use standardized scores (z-scores) to examine student achievement over time due to changes in assessment.  You can read more about the methodology in the full report.

  • Of students who were initially low-achieving in 3rd grade, 12% ‘caught up’ to average state reading performance by early high school.
  • Students who were economically advantaged, White, Hispanic and/or female students were more likely to reach average reading achievement by early high school than their Black, male, and economically disadvantaged peers.
  • Among over 6,000 Black students who were identified as low-achieving in 3rd grade, only 6% demonstrated average reading achievement by early high school.
  • All types of low-achieving students demonstrated large improvements between 3rd and 4th grades, although rates of improvement after 4th grade are very different for different types of students.
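For readers curious about the z-score methodology mentioned above, here is a minimal sketch: raw scale scores for each grade/year cohort are standardized against that cohort’s mean and standard deviation, which makes scores comparable even when the assessment changes. The cohort values below are toy numbers, not actual Arkansas data.

```python
# Sketch of z-score standardization: express each student's score relative
# to the statewide mean and standard deviation for that grade and year.
# A z-score of 0 is the state average; -1 is one SD below it.
from statistics import mean, pstdev

def to_z_scores(scores):
    """Convert raw scale scores for one grade/year cohort to z-scores."""
    m, sd = mean(scores), pstdev(scores)
    return [(s - m) / sd for s in scores]

cohort = [410, 425, 433, 440, 452]        # toy raw scale scores, one cohort
z = to_z_scores(cohort)
print([round(v, 2) for v in z])           # mean 0, SD 1 by construction
```

Because every grade/year is re-centered this way, a student who stays at, say, z = -0.8 from 3rd to 10th grade has held position relative to peers; a rising z-score means genuine catching up.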

 

Presented below are the standardized scores of initially low-achieving students from 3rd through 10th grade.  Results are presented by FRL participation and by race.

Figure 1: Average Reading Scores of Initially Low-Achieving Students: Grade 3 through 10 by Economic Disadvantage (FRL) Status

g3 reading frl

Figure 2: Average Reading Scores of Initially Low-Achieving Students: Grade 3 through 10 by Race

g3 reading race

None of these initially low-performing student groups, even White or economically advantaged students, caught back up to the state average as a group by early high school.

  • Hispanic and economically advantaged students gained almost half a standard deviation in achievement as a group, and White students gained approximately 0.4 standard deviations, while Black and low-income students gained closer to a quarter of a standard deviation.
  • Even though low-achieving Hispanic students initially have very low average scores, they make advancements comparable to those of White students, the most advantaged group. This is an exciting trend to observe because it indicates potential for a narrowing achievement gap between White and Hispanic students.

You might be thinking that there are differences in reading score improvement between Hispanic students who are identified as ELL and Hispanic students who are not. We were pleased to find that both ELL and Non-ELL students made large gains in reading achievement over time as presented in Figure 3.

Figure 3: Average Reading Scores of Initially Low-Achieving Students: Grade 3 through 10 by English Language Learner Status

g3 reading ELL

In summary, Arkansas students face large and persistent racial and socioeconomic disparities in third-grade reading scores. Moreover, few of our students who are struggling readers in third grade ever catch up to the state average. And these results are for relatively stable students: those who were continuously enrolled in our schools from grades 3-10.

Our hope is that Arkansas’ average reading scores will continue to increase and all students will grow to read proficiently, but it is evident that special attention needs to be given to low income and racial minority students and students who are struggling with basic reading skills in third grade.

Although some schools saw double-digit reading proficiency gains after RISE trainings in 2017, similar improvement was not reflected on 2018 assessments. Programs must be carefully monitored to determine what, if any, impact they are having on changing the long-term outcomes for students who, as demonstrated in this research, are likely to continue to struggle to read proficiently throughout their educational experience.

Schools and districts should carefully examine the progress of their struggling readers and consider the effectiveness of any interventions or programs that are being implemented.  Although this analysis uses state assessments as the measure of student achievement, schools and districts should examine multiple measures, including high-quality formative assessments, to evaluate progress in students’ reading.

We must continue to strive to ensure that all students are leaving elementary school as competent readers, equipped with the literacy foundation necessary for future academic success.

Unpacking School Performance Ratings

In The View from the OEP on October 3, 2018 at 1:52 pm

Arkansas school performance ratings and A-F letter grades will be released to the public on October 12th.

Here’s what we think you can expect:

1) Most schools will get the same Letter Grade as they did last year.

2) Schools serving a smaller percentage of students eligible for free and reduced lunch will be more likely to get an “A” or a “B” than schools serving a population where a greater percentage of students experience economic hardship.

3) Arkansas’ growth measure- a powerful indicator of students’ academic improvement over time- will still be overshadowed by single-year achievement and will still be challenging for educators and parents to understand.

____________________

In advance of the release, we wanted to review the purpose of a school performance report.

This is the true baseline year for Arkansas’ new accountability system.  The state has worked diligently to develop an accountability system that will support student learning, as illustrated in the theory of action presented below.

Theoryofaction

The idea is that if schools get good information about what is really happening in their buildings, then they can make changes that will improve outcomes for students.

The continuous cycles of inquiry will take time to develop and build, and will require some new feedback systems to support schools and districts as they work to identify needs within their systems.

But does a school performance report really give schools the tools that they need to improve?  It’s complicated, and not just for Arkansas! A report released yesterday provides some insight into how stakeholders feel about school accountability across the nation.

One of the big benefits of school accountability identified in the report has been the increased transparency and quality of information about what is going on in schools. The ability of schools to interpret the data and develop a plan to improve learning and resource allocation is key to Arkansas’ plan, as presented below.

Cycle

Arkansas’ school personnel have been able to preview the school performance report since September 21, and we can tell that they have because ADE has made some updates to the reports and extended the private review deadline.

We appreciate the state letting school personnel review the information, and fully support the pursuit of high-quality data in the system.  But we worry that school personnel are spending a lot of time trying to check the scores, as opposed to interpreting them and developing a plan to support student learning.

One thing that we think will help school personnel redirect their time from checking a bunch of data points to developing a plan to support student learning is to really understand what is driving the school performance score.

The Biggie- Achievement

The majority of the school performance score is determined by the 2017-18 ACT Aspire English Language Arts and Mathematics performance of students in grades 3-8.  Wealthier schools will generally have higher achievement.  We know that achievement on assessments is negatively correlated with student risk factors such as poverty.  You can check out the relationship in our data visualization.

Schools and the public have had information about achievement since scores were released early in the summer, much earlier than in previous years. Schools may have difficulty, however, calculating what their weighted achievement score would be because student ELA scores are not reported by the full range of categories used in the school performance calculations.  Additional information about cut scores for the full range of performance categories is included in the Final ESSA Decision Rules and could be applied to student scores, but that would take quite a bit of time!
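To make the weighted achievement idea concrete, here is a hypothetical sketch. The category names and weights below are placeholders for illustration only; the actual categories, weights, and cut scores are defined in the Final ESSA Decision Rules.

```python
# Hypothetical weighted achievement calculation: each performance category
# earns a different number of points, and the school's score is points per
# 100 tested students. These weights are illustrative placeholders, NOT the
# official values from the Final ESSA Decision Rules.
WEIGHTS = {
    "lowest_category":  0.0,
    "approaching":      0.5,
    "ready":            1.0,
    "exceeding":        1.25,
}

def weighted_achievement(category_counts):
    """Weighted achievement points per 100 tested students."""
    tested = sum(category_counts.values())
    points = sum(WEIGHTS[cat] * n for cat, n in category_counts.items())
    return 100 * points / tested

counts = {"lowest_category": 20, "approaching": 30, "ready": 40, "exceeding": 10}
print(round(weighted_achievement(counts), 1))
```

The key feature of a weighting scheme like this is that schools earn partial credit for students who are approaching readiness and extra credit for students who exceed it, rather than the all-or-nothing percent-proficient measure of older systems.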

Note: As we mentioned in our earlier blog– achievement on the ACT Aspire pretty much stayed the same as last year, but the weighted achievement scores will be lower than last year for most schools because ACT Aspire modified the criteria for ELA readiness. Thankfully, because of the forward thinking actions taken by the ADE to equate the scores with prior years and adjust the cut points for the letter grade, the lower achievement scores should not result in lower school letter grades across the state.

The Most Important (to us!)- Academic Growth

Here at OEP we feel that this is the most important piece of information in the school performance report because it reflects how much students at the school increased their academic performance over time compared to how much the average student improved.  We feel growth is a much more informative indicator of how schools are educating students than achievement, and are pleased that it is much less correlated with school poverty rates.

Unfortunately, schools can’t verify this growth information because it is calculated at the state level, using the relationship between current and historical test scores of every student in the state to develop a ‘predicted score’.  This predicted score is then compared to each student’s actual score to determine if the student’s academic achievement, as measured by the ACT Aspire assessment, was more than expected, as expected, or less than expected.  These student-level scores are averaged at the school level and reported to the school as a transformed variable with a mean of 80 and a standard deviation of around 3. We are confident that the calculations are correct, and would advise schools to worry less about re-calculating the values themselves (which they can’t do, since they have access only to their own school’s data) and more about understanding what this indicator means.
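The rescaling described above can be sketched in a few lines. This is our illustration of a standard linear transformation to a mean of 80 and SD of 3, not ADE’s actual code, and the raw values are toy numbers.

```python
# Sketch of the reported growth transformation: school-level growth (the
# average of student residuals, i.e., actual minus predicted scores) is
# rescaled so the statewide distribution has mean 80 and SD of about 3.
# Toy values only; not the official calculation.
from statistics import mean, pstdev

def rescale_growth(raw_scores, new_mean=80, new_sd=3):
    m, sd = mean(raw_scores), pstdev(raw_scores)
    return [new_mean + new_sd * (x - m) / sd for x in raw_scores]

raw = [-1.2, -0.4, 0.0, 0.5, 1.1]          # mean student residual per school
print([round(v, 1) for v in rescale_growth(raw)])
```

On this scale, a school near 80 grew about as much as predicted; scores a few points above or below 80 indicate students growing meaningfully more or less than expected.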

Many schools throughout the state are familiar with NWEA’s MAP Growth, where there is a target score required for students to meet annual growth.  This makes it easy for schools to identify if students met or exceeded growth.  NWEA has a data set of 370 million test event records spanning more than 15 years, so they have a really good idea of how a typical student will increase their score over time.  ACT Aspire is a relatively new assessment, however, so we want to make sure that we aren’t ‘guessing’ how much a typical student ‘should’ increase.  Instead, the state uses real data to determine how much a typical student DID increase from one year to the next and then compares that to the performance of students with similar test score histories.

Growth is really the most meaningful at the student level.  If students in program X are not meeting growth expectations, while students in program Y are, then careful consideration should be given to re-allocating resources so more students can benefit from program Y.  In discussions with the ESSA advisory team yesterday, we were thrilled to hear that the state may be able to provide student-level growth information in the future which would be super valuable to school leaders as they develop a plan to enhance student learning and resource allocation.

The Most Distracting- SQSS

The School Quality and Student Success (SQSS) indicator is a mouthful, but is really the smallest contributor to the overall school performance score. Since parts of SQSS reflect achievement, it is not surprising that SQSS scores are also negatively correlated with school poverty rates (r = -0.48).

As we have said before, there are a lot of indicators included in this measure, and school personnel may be spending a lot of time focusing on each indicator and wondering if the data are accurate. For some indicators, schools could verify the data through their own systems by applying the business rules, but for other indicators they cannot.  The first time schools saw the SQSS indicators was last spring, and the data included in the current school performance reports were pulled soon after.  Because SQSS indicators represent systems in place at schools, such as attendance reporting practices and course enrollment, and because these systems may require some time to adjust, we don’t expect to see large (or meaningful) changes in these scores yet.

We like how SQSS indicators can help schools get more accurate information about what is happening at their school, but are looking forward to when they are presented in a way that schools can really use them in their strategic planning to support student outcomes.




What’s in a Grade? 

The school performance report also supports Arkansas’ legislation that every school must receive an A-F letter grade. The letter grade was designed to create a method for parents to easily understand the quality of a school, but does an A-F letter grade really give parents the information that they need about how a school is doing? One stakeholder in yesterday’s report captured the challenge of assigning schools letter grades:

“How do you make something that is simple enough to be understood, like an A through F rating system, but also incorporate a number of different factors that are complex enough to capture all of the things we want schools to do? Everything from math and reading to also discipline data or enrollment data or attendance data or all these other sort of facets of that system. So how do you make something that is usable and understandable, but also nuanced?”

Another stakeholder pointed out how the A-F grade represents what matters to the developers of the metrics:

“I sort of feel like the single rating of either A through F or on a number is sort of the worst impulses of accountability. Because not only are you saying what matters by its inclusion in that, but how much it matters, by how it’s weighted. So man, that takes a lot of faith in yourself that you can specify how much you should care about academics relative to attendance, relative to these other things.”

We agree- and have addressed before how, although the intention was for growth scores to weigh more heavily in the school performance reports and associated letter grade determination, achievement still dominates.  When schools where students make the largest improvements in achievement can still be saddled with a low grade due to the characteristics of the population they serve, we are sending the message that growth doesn’t really matter.   And as long as the performance index results in schools serving more advantaged students getting higher letter grades, we also send the message to parents that what makes a good school is not the learning that happens inside the building, but how large the houses are that surround the school.

We urge educators and parents to focus on the academic growth indicator, and view the purpose of a school performance report as the beginning of an ongoing conversation about how to continually increase student learning.