University of Arkansas Office for Education Policy


Growth versus Proficiency in Arkansas

In The View from the OEP on April 5, 2017 at 12:55 pm

One of our favorite topics came up recently during the confirmation hearing for the federal Secretary of Education, when the question of "growth versus proficiency" as the best method for measuring schools was raised. This is an important question for students, parents, educators, and policymakers, and we think it is important for all to understand how these measures of school quality are being used in Arkansas, the difference between using growth and proficiency to evaluate school performance, and how these measures relate to school-level characteristics.

Proficiency measures whether students met the criteria set by the state to define 'grade level' performance on the state test. We like to think of it as the "You Must Be This Tall To Ride" signs at amusement parks.


Growth measures whether students grew academically from one year to the next as much as we expected them to grow based on what we know about them. This is similar to the marks on a wall that parents use to track their child's increasing height over time.

In today’s blog we examine Arkansas’ use of growth measures and how they relate to other school characteristics.

Proficiency rates have been the main measure of Arkansas school quality since No Child Left Behind, but growth has been included in the state's A-F school letter grades since 2014. Arkansas is currently developing a new plan to evaluate and report school quality and to identify schools in need of additional support under the federal requirements of the Every Student Succeeds Act (ESSA). In addition to these applications of proficiency and growth rates, under the Arkansas School Recognition Program, schools in the top 10% of the state in proficiency and those in the top 10% in growth are provided monetary rewards (you can find the list of top schools here, and read our prior blog about it).

Success is more than “Proficiency”

Over the past 15 years since No Child Left Behind, students, parents, educators and policymakers have realized that proficiency rates have serious limitations when describing the quality of a school.

  • Certain types of students are more likely to be proficient on the state test: those who come into kindergarten already reading versus those who may not have been exposed to books often at home, or students who were well fed, both before and after birth, as opposed to those suffering from food insecurity. In Arkansas, as elsewhere, non-minority, non-disadvantaged students tend to be more likely to meet state "proficiency" standards. When we use proficiency to evaluate schools, as we did under NCLB, it is very difficult for schools serving minority and disadvantaged students to be successful. Even if these schools are doing an excellent job educating their students, it is difficult for students who started school behind to meet grade-level expectations.
  • Proficiency targets can lead schools to focus on improving the skills of students who are 'close' to meeting the target, while neglecting high-performing students who are certain to exceed it and struggling students who are very unlikely to meet it by the end of the year.
  • States created their own assessments and set their own ‘proficiency’ criteria, meaning that a student ‘proficient’ in one state may not be ‘proficient’ in another.

How Can We Measure Student Growth?

To try to quantify how effective a school is at increasing the knowledge of its students, many states, including Arkansas, are measuring changes in individual student achievement over time. There are many different models for measuring growth, but Arkansas’ model uses a student’s prior academic performance on state assessments to predict where the student will likely score, then compares the actual score to the predicted score.

The process is illustrated in Figure 1. The dark blue dots represent an individual student’s score history on state assessments from third to seventh grade. Using these prior scores, researchers generate a trajectory (represented by the dotted line) that predicts what the student would be likely to score on the 8th grade test. If the student scores at the light blue dot, which is on the trajectory, then the student grew as expected. If the student scored at the level of the green dot, the student grew more than expected, and if the student scored at the level of the red dot, the student grew less than expected. It is important to note that student “Proficiency” is not indicated in the figure. This student may be well below, well above, or right at ‘grade level’, but this standard is not considered in relation to the student’s academic growth over time. You can learn more about the process here.

Figure 1: Example of Arkansas’ Longitudinal Growth Model


Students included in Arkansas' growth model were tested on the state assessment in English Language Arts and Math, in grades 2 through 10, and had prior state assessment scores. When predicting student scores, the model should use as many years of a student's score history as are available. Although two students may have scored similarly on state assessments in the year immediately prior to the prediction, they could have very different score histories; perhaps one was declining while the other was improving.
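The prediction-and-comparison step can be sketched in a few lines of code. This is a minimal illustration assuming a simple straight-line fit to a single hypothetical student's score history; Arkansas' actual longitudinal model is more sophisticated, and all scores below are invented.

```python
# Sketch of trajectory-based growth: fit a linear trend to a student's
# prior scale scores (grades 3-7), predict the grade 8 score, and compare
# the actual score to the prediction.
import numpy as np

prior_grades = np.array([3, 4, 5, 6, 7])            # grades with prior scores
prior_scores = np.array([410, 422, 431, 445, 452])  # hypothetical scale scores

# Fit a straight-line trajectory through the score history (the dotted line
# in Figure 1).
slope, intercept = np.polyfit(prior_grades, prior_scores, 1)

predicted_grade8 = slope * 8 + intercept  # the light blue dot
actual_grade8 = 470                       # hypothetical actual score

growth = actual_grade8 - predicted_grade8  # positive = grew more than expected
print(f"predicted: {predicted_grade8:.1f}, actual: {actual_grade8}, "
      f"growth: {growth:+.1f}")
```

A positive difference corresponds to the green dot in Figure 1, a negative difference to the red dot. Note that nothing in the calculation asks whether any of these scores are "proficient."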

This student-level growth model becomes a "Value-Added" model when the growth is attributed to the school. The student-level growth scores are averaged at the school level, and this average is a measure of how "effective" the school has been at educating its students, or, in other words, how much "Value" the school has "Added" to its students' learning.

Interpreting Value-Added scores

Schools that receive a growth score close to 0 enroll students who, on average, grew academically at the expected rate. Positive values indicate that students at the school, on average, grew more than expected, while negative values indicate that the average student at the school grew less than predicted. These value-added scores have a mean of 0, and the standard deviation at the school level is about 0.07.
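The averaging and interpretation steps can be sketched as follows. The school names and student growth values here are hypothetical; the 0.07 standard deviation is the approximate school-level value reported above.

```python
# Sketch: average student-level growth (actual minus predicted, on the
# model's scale) to get a school value-added score, then interpret it
# against the school-level standard deviation of about 0.07.
from statistics import mean

student_growth = {
    "School A": [0.12, 0.05, -0.02, 0.09],    # mostly above expectation
    "School B": [-0.10, -0.06, 0.01, -0.09],  # mostly below expectation
}

SCHOOL_SD = 0.07  # approximate school-level standard deviation

for school, growths in student_growth.items():
    value_added = mean(growths)
    sds = value_added / SCHOOL_SD
    label = "above expected growth" if value_added > 0 else "below expected growth"
    print(f"{school}: value-added {value_added:+.3f} ({sds:+.1f} SD) -> {label}")
```

With these invented numbers, School A averages +0.06 (a bit under one standard deviation above the mean) and School B averages -0.06.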

What do we know about school-level growth scores?

Arkansas has reported school level values from the current growth model for two years. The results can help inform educators, parents, and policymakers about schools that are making greater than predicted academic improvements, and which schools might need additional support to ensure that all students are growing academically.

So, quick question…

Given what you know about Arkansas' growth model, do you think the scores will be related to other school characteristics such as school size, the percentage of students eligible for Free/Reduced Lunch, or proficiency rates?

School size:

Do you think school size would be related to its students' academic growth? We have no reason to think that smaller or larger schools would produce better growth rates, but we want to use school size as an introduction to interpreting school-level value-added scores.

We examined the relationship between 2015-16 school value-added scores and the number of students included in the model. Students included were tested on ACT Aspire English Language Arts and Math in grades 3-10, or the ITBS in grade 2, and had prior state assessment scores.

The scatter plot of the values for all Arkansas schools is presented in Figure 2. The vertical axis presents the school value-added score and ranges from -0.5 to +0.5. The horizontal axis represents the number of students included in the growth model for the school and ranges from 0 to 3,000 (we trimmed a few extremely large schools from the graph for illustrative purposes).

As you can see from the scatter plot, there is essentially no relationship between the number of students tested at a school and the value-added score that the school received. The correlation is essentially zero at +0.08.
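A correlation check like this one can be sketched with simulated data. The school sizes and value-added scores below are randomly generated, not the actual Arkansas values; the point is only to show the computation behind the numbers we report here and in the following sections.

```python
# Sketch of the correlation check: Pearson correlation between school size
# (students included in the growth model) and school value-added score.
# Data are simulated so that size and growth are unrelated, as in Figure 2.
import numpy as np

rng = np.random.default_rng(0)
n_schools = 200

size = rng.integers(50, 1500, n_schools)         # students in the model
value_added = rng.normal(0.0, 0.07, n_schools)   # drawn independently of size

# corrcoef returns the 2x2 correlation matrix; the off-diagonal entry is r.
r = np.corrcoef(size, value_added)[0, 1]
print(f"correlation: {r:+.2f}")  # near zero when the variables are unrelated
```

The same `np.corrcoef` call produces the FRL (-0.48) and prior-proficiency (+0.57) correlations discussed below when run on the real school-level data.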

Figure 2: Arkansas’ School-level Value-Added Scores and Number of Students Assessed/Included in Growth Model, 2015-16.


Free/Reduced Lunch:

Do you think schools serving more economically-advantaged students would have higher growth rates? Or would you expect a school with high enrollment of disadvantaged students to have "more room to grow," and that this would be reflected in higher growth scores? We examined the relationship between 2015-16 growth values and rates of FRL eligibility for all Arkansas schools.

The scatter plot of the values for all Arkansas schools is presented in Figure 3. The vertical axis presents the school value-added score and ranges from -0.5 to +0.5. The horizontal axis represents the school-level percentage of students eligible for Free/Reduced Lunch, a proxy measure for low socio-economic status, and the values range from 100% to 0%.

As you can see from Figure 3 below, there is some relationship between a school's Free/Reduced Lunch rate and the value-added score that the school received. While at most Free/Reduced Lunch rates there are schools that received both higher and lower value-added scores, almost all schools with fewer than 30% of students eligible for FRL received positive growth values. The correlation between value-added and school-level FRL is moderately negative at -0.48.

Figure 3: Arkansas’ School-level Value-Added Scores and Percentage of Students Eligible for Free/Reduced Lunch Program, 2015-16.


Proficiency Rate:

Would you expect schools with higher-achieving students to have higher growth rates? Or would you expect schools with more low-performing students to have higher value-added scores because their students have "more room to grow"? We examined the relationship between 2015-16 growth values and proficiency rates for all Arkansas schools. Because 2015-16 proficiency rates would be directly impacted by student growth (higher growth would lead to higher proficiency), we used the 2014-15 school proficiency rates to examine the relationship between proficiency and growth.

The scatter plot of the values is presented in Figure 4. The vertical axis represents the school value-added score, and ranges from -0.5 to +0.5. The horizontal axis represents the percent of students meeting or exceeding benchmarks on the 2015 PARCC Math and ELA assessments in grades 3-10, and the values range from 0% to 100%.

As you can see from Figure 4, there is a moderate relationship between the percentage of students scoring proficient at a school and the value-added score that the school received: schools that had proficiency rates over 50% in 2015 also tended to have higher value-added scores in 2016. Schools with lower proficiency rates showed greater variability in value-added scores but were more likely to receive low value-added in 2016. The correlation between value-added and prior-year proficiency is moderately positive at +0.57.

Figure 4: Arkansas’ School-level Value-Added Scores (2015-16) and Percentage of Students Scoring Proficient, 2014-15.


In Figure 5, we overlay the scatter plot presented in Figure 4 with colored zones to facilitate visualizing the patterns in the data. The upper-left quadrant represents schools where students were below the state average in proficiency rates but demonstrated greater than expected growth (note that schools in the white band represent expected growth). The upper-right quadrant represents schools where students were above the state average in proficiency rates and demonstrated greater than expected growth. The lower-left quadrant represents schools where students were below the state average in proficiency rates and demonstrated lower than expected growth, and the lower-right quadrant represents schools where students were above the state average in proficiency rates but demonstrated lower than expected growth.
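The quadrant logic can be sketched as a simple classifier. The 50% state-average proficiency threshold and the 0.04 "expected growth" band are illustrative assumptions for this sketch, not the exact boundaries used in Figure 5, and the example schools are hypothetical.

```python
# Sketch of the Figure 5 quadrants: classify schools by prior-year
# proficiency (relative to an assumed state average) and by value-added
# growth (relative to an assumed "expected growth" band around zero).
STATE_AVG_PROFICIENCY = 50.0  # percent; illustrative threshold
GROWTH_BAND = 0.04            # +/- band treated as expected growth (white band)

def classify(proficiency_pct, value_added):
    """Return the quadrant label for one school."""
    if abs(value_added) <= GROWTH_BAND:
        return "expected growth"
    prof = "high proficiency" if proficiency_pct >= STATE_AVG_PROFICIENCY else "low proficiency"
    grow = "high growth" if value_added > GROWTH_BAND else "low growth"
    return f"{prof}, {grow}"

# Hypothetical schools, one per region of the figure
print(classify(35, 0.12))   # upper-left:  low proficiency, high growth
print(classify(72, 0.09))   # upper-right: high proficiency, high growth
print(classify(30, -0.10))  # lower-left:  low proficiency, low growth
print(classify(68, -0.08))  # lower-right: high proficiency, low growth
print(classify(55, 0.01))   # white band:  expected growth
```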

While no one would argue against celebrating the schools with high proficiency and high growth, and supporting the schools with low proficiency and low growth, many disagree about whether we should be more concerned about schools with low proficiency and high growth or schools with high proficiency and low growth.

Figure 5: Arkansas’ School-level Value-Added Scores (2015-16) and Percentage of Students Scoring Proficient, 2014-15 with Highlighted Quadrants.


How Consistent are Value Added Scores?

A common concern regarding value-added scores is that they are inconsistent, fluctuating up and down over time. Arkansas has only two years of value-added data, but we looked to see how consistent the values were. The scatter plot of the values is presented in Figure 6. The vertical axis represents the 2016 value-added score while the horizontal axis represents the 2015 value-added score. Both axes range from -0.5 to +0.5.

As you can see from Figure 6, there is a weak relationship between the school-level value-added scores from 2015 and 2016. Some schools that scored positively in 2015 had negative value-added for 2016, and vice versa. The correlation between school-level value-added from 2015 and 2016 is weak at +0.35.

Figure 6: Arkansas’ School-level Value-Added Scores,  2014-15 and 2015-16.


What does all this mean?

Arkansas’ longitudinal student growth model measures the academic improvement of students over time and attributes that “Value-Added” to the school that they attend. When we examine the value-added information for the two years for which data are available, we find that the values are moderately correlated with FRL rates and with the percentage of students proficient at the school in the previous year, and that the school-level value-added scores from 2015 and 2016 are only weakly related.

We strongly support measuring student level growth, and think it is definitely the right thing to do.

That said, we were surprised that the value-added scores were so correlated with prior-year proficiency and FRL rates. We were hoping they would be more independent, like school size, because when they aren’t, it makes us wonder why individual student growth would be related to these school characteristics.

We must remember, however, that we are just examining correlations, and ‘correlation is not causation’. Perhaps high-achieving schools present students with more rigorous material that promotes student growth. Maybe some low-performing schools have ineffective practices, or feel defeated after years of “not meeting proficiency”. The goal of using student growth is to isolate the impact of the school as much as possible, excluding external factors from our measurements of school quality. When we see the relationship between proficiency, poverty, and growth, we are concerned that we may not yet have achieved that goal.

We are also somewhat concerned that the value-added scores for the two available years are only weakly related at 0.35. On the one hand, things may change within a school that could have a positive or negative impact on school growth (teachers, principal, curriculum, etc.), and we may not WANT consistency but rather some variability to reflect the impact of those changes on student growth. On the other hand, we are somewhat surprised that the two years are so weakly correlated while proficiency is strongly correlated across the two years at 0.81.

Although the student growth model is well suited for measuring change even across different assessments, as Arkansas has experienced, perhaps all the changes have resulted in inconsistent growth data: not because of the model itself, but because of differences in test content and format, different performance expectations, students getting used to taking the assessments on computers, and so on.

Next Steps:

It may take some time for parents, educators, and policymakers to better understand what the value-added scores mean and how to act on the information to support student learning, but there are some things we recommend doing right now with regard to value-added scores.

  • Learn more about Arkansas’ value-added Scores (you can check this one off already!)
  • Check out your school’s value-added from 2015 and 2016 (you can access it here)
  • See how your school’s value-added score compares with “similar schools.” Remember to consider differences of 0.04 or less as essentially the same as your school.
  • State department personnel are likely reviewing schools that have had very low value-added scores for the past two years to see if additional support is needed, while schools that have had very high value-added scores should be reviewed to see if they are implementing any unique practices that could transfer to other schools.
  • Don’t get TOO wrapped up in the value-added scores yet. We look forward to examining the 2016-17 value-added scores to see if the relationship between these key variables is changing.
  • Chime in on ESSA planning.  You now have a better understanding than Betsy DeVos of the proficiency-versus-growth issue, and particularly how it impacts schools in Arkansas.  Let your voice be heard on the new state plan!

Perhaps this was more than you wanted to know about the ins and outs of growth and proficiency, but here at OEP, we believe the more we all know about the measures being used to evaluate student and school success, the better decisions we can make to support our schools.

Stop Scapegoating: Educating kids should be the focus

In The View from the OEP on January 4, 2017 at 12:35 pm

In case you missed it, we wanted to share our Op-Ed from the paper this weekend about charter school enrollment in Little Rock.


The approved expansion of two Little Rock-area charter schools led many to express fears that charter schools skim off the easiest-to-educate students and leave “those other kids” for traditional schools. Specifically, concerns were raised that charters would decrease the white population of Little Rock School District and increase the district’s percentage of poor students.

We at the Office for Education Policy also care about the interactions between public charter schools and traditional public schools and decided to investigate what the data had to say about these questions. We examined student-level enrollment and academic data from the 2008-09 to 2014-15 school years. We tracked annual student moves to understand who leaves the Little Rock district for charters and how those moves impact racial and socioeconomic integration.

We found that students who left the district for charters were typical, both demographically and academically, and their exits increased racial and socioeconomic integration in the district.

As a reminder, charter schools are public schools. Like traditional public schools, there is no cost for students to attend. Unlike traditional public schools, to which students are assigned based on their address, open-enrollment charters are open to any student. Charters are authorized to serve a specific number of students, so students must apply for a seat. If more students want to attend than there are seats, students are selected through a random lottery. Students who are not selected can remain on a wait list. Charter schools cannot select or reject student applications based on demographic or academic characteristics. Charters must administer all state exams and abide by identical accountability requirements.

About 15 percent of students (excluding graduates) leave the Little Rock School District each year for some other schooling option. We were surprised to find that nearly half of these students (7 percent) leave the Arkansas public school system entirely–they move out of state, drop out, or select private or home school settings. Some (6 percent) move to other public school districts; half move nearby to the North Little Rock or Pulaski County districts, and half move to other public schools in the state. Perhaps surprisingly, given all of the attention given to charter transfers, only 2 percent (fewer than one of every seven who leave) of students transfer from the Little Rock district to charter schools each year!

What do we know about these students?

First, the 2 percent of students who transferred into area charters were representative of the district student population as a whole. Students who moved to charters were 64 percent black and 19 percent white, compared to the district population of 67 percent black and 20 percent white. Socioeconomically, 61 percent of students who moved to charters were eligible for free/reduced lunch, while 69 percent of district students participated. Students who left for area charters were not more likely to be white or economically advantaged than the overall district population.

Students who left for area charters performed similarly on state assessments as students who remained. In four of the six years examined, there were no statistically significant differences in scores between students who left for charters and those who remained in the district. However, students who left for charters were average performers in their school in all years examined. This finding refutes the argument that charters poach the best students.

Further, we found that when students exited the district for charters, the schools they left behind became less racially and/or socioeconomically segregated.

Our findings contradict critics’ concern that charters increase racial and socioeconomic segregation. One fact we must acknowledge is that Little Rock district schools are already racially and socioeconomically segregated. Thus, when students exit, they are most often leaving segregated settings. We found that black students who leave tend to exit schools with an above-average percentage of black students, and white students leave schools with an above-average percentage of white students.

Residential segregation in Little Rock, as in many other cities throughout the U.S., results in racial and socioeconomic segregation of residentially assigned public schools. Charter schools allow for students to enroll regardless of ZIP code. Little Rock families who choose to sever the link between where they live and the school that their children attend are countering the racial and socioeconomic segregation of traditional public schools.

Those who are passionate about equity should stop demonizing charters and chasing the false argument that charters cause segregation; instead, we should focus our collective energy on providing affirming and effective learning environments for all Little Rock public school students–regardless of sector.

A wise school leader once said that “the students don’t care whether the sign outside the school says ‘Charter’ or not.” They simply need effective teachers who care about them and prepare them for the future.

Sarah C. McKenzie is the executive director of the Office for Education Policy at the University of Arkansas. Elise Swanson is a research assistant at the Office for Education Policy and a distinguished doctoral fellow in the Department of Education Reform at the University of Arkansas.

Editorial on 12/31/2016


Stanford Charter Schools Report: National Gains; Arkansas Decline

In The View from the OEP on June 25, 2013 at 10:17 am


Stanford’s Center for Research on Education Outcomes (CREDO) released a study today looking at charter school gains across the country, including right here in Arkansas.  This is the “most comprehensive study ever conducted of charter school performance,” with over one million charter students considered in the study.

The study finds that charter schools are making progress around the nation, on average. The twenty-six-state study showed significant gains in learning for impoverished students, black students, and English language learners in charter schools, as compared to their “virtual twins” in traditional public schools. Complementing this growth, these three groups have seen their enrollment in charters increase over the course of the study, from 2009 to 2013.

Here at the OEP, we are more than glad that these types of rigorous studies are being done to look at innovative solutions to improve education across the country.  With growth or decline, it is important for this information to be seen by politicians and parents alike, so that all can make informed decisions about the future of education.

Concerning our charter schools here in Arkansas, the study shows that while there was growth in the 2008-09 year (in math and reading), Arkansas charters have seen a slight decline in scores during 2009-11.  One potential explanation for this slow-down is that these analyses include the scores of four low-performing charter schools, which have since been closed. While this does not explain away all of the decline, it does show the Arkansas reauthorization system for charters does have teeth to close schools that cannot meet the standards our state expects.

The OEP has testified about the performance of a few charter schools in Arkansas to the State Board of Education.  What makes this system work is that these schools are held accountable.

As the best schools endure, we all hope that Arkansas charter schools will shine bright in the coming years, and that all schools around the state, whether traditional, charter, or any other kind, will continue to grow and meet the needs of our children.


Special note: We want to give a special “shout-out” to former OEP Graduate Researcher and Arkansas native Dr. James L. Woodworth who was one of the authors of this report.  We wish him all the best!