University of Arkansas Office for Education Policy

Growth versus Proficiency in Arkansas

In The View from the OEP on April 5, 2017 at 12:55 pm

One of our favorite topics came up recently during the confirmation hearing for the Federal Secretary of Education, when the question of "growth versus proficiency" as the best method for measuring schools was raised. This is an important question for students, parents, educators, and policymakers, and we think it is important for all to understand how these measures of school quality are being used in Arkansas, the difference between using growth and proficiency to evaluate school performance, and how these measures relate to school-level characteristics.

Proficiency measures whether students met the criteria set by the state for 'grade level' performance on the state test. We like to think of it as the "You Must Be This Tall To Ride" signs at amusement parks.


Growth measures whether students grew academically from one year to the next as much as we expected them to grow, based on what we know about them. This is similar to the marks on a wall that parents use to track their child's increasing height over time.

In today’s blog we examine Arkansas’ use of growth measures and how they relate to other school characteristics.

Proficiency rates have been the main measure of Arkansas school quality since No Child Left Behind, but growth has been included in the state's A-F school letter grades since 2014. Arkansas is currently developing a new plan to evaluate and report school quality and to identify schools in need of additional support under the federal requirements of the Every Student Succeeds Act (ESSA). In addition to these applications of proficiency and growth rates, under the Arkansas School Recognition Program, schools in the top 10% of the state in proficiency and those in the top 10% of the state in growth are provided monetary rewards (you can find the list of top schools here, and read our prior blog about it).

Success is more than “Proficiency”

Over the past 15 years since No Child Left Behind, students, parents, educators and policymakers have realized that proficiency rates have serious limitations when describing the quality of a school.

  • Certain types of students are more likely to be proficient on the state test: those who come into kindergarten already reading versus those who may not have been exposed to books often in the home, or students who were well fed, both before and after birth, as opposed to those suffering from food insecurity. In Arkansas, as elsewhere, non-minority, non-disadvantaged students tend to be more likely to meet state "Proficiency" standards. When we use proficiency to evaluate schools, as we did under NCLB, it is very difficult for schools serving minority and disadvantaged students to be successful. Even if a school is doing an excellent job educating its students, it is difficult to get students who started school behind to meet grade-level expectations.
  • Proficiency targets can lead schools to focus on improving the skills of students who are 'close' to meeting the targets, while neglecting high-performing students who are certain to exceed them and struggling students who are very unlikely to meet them by the end of the year.
  • States created their own assessments and set their own ‘proficiency’ criteria, meaning that a student ‘proficient’ in one state may not be ‘proficient’ in another.

How Can We Measure Student Growth?

To try to quantify how effective a school is at increasing the knowledge of its students, many states, including Arkansas, are measuring changes in individual student achievement over time. There are many different models for measuring growth, but Arkansas’ model uses a student’s prior academic performance on state assessments to predict where the student will likely score, then compares the actual score to the predicted score.

The process is illustrated in Figure 1. The dark blue dots represent an individual student’s score history on state assessments from third to seventh grade. Using these prior scores, researchers generate a trajectory (represented by the dotted line) that predicts what the student would be likely to score on the 8th grade test. If the student scores at the light blue dot, which is on the trajectory, then the student grew as expected. If the student scored at the level of the green dot, the student grew more than expected, and if the student scored at the level of the red dot, the student grew less than expected. It is important to note that student “Proficiency” is not indicated in the figure. This student may be well below, well above, or right at ‘grade level’, but this standard is not considered in relation to the student’s academic growth over time. You can learn more about the process here.

Figure 1: Example of Arkansas’ Longitudinal Growth Model

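To make the prediction step concrete, here is a minimal sketch in Python. It fits a simple straight-line trajectory through a hypothetical student's prior scores and compares the actual grade-8 score to the prediction. The scores, and the use of a plain linear fit, are our illustrative assumptions; the state's actual prediction model is more sophisticated.

```python
import numpy as np

# Hypothetical score history for one student: scale scores in grades 3-7.
grades = np.array([3, 4, 5, 6, 7])
scores = np.array([420.0, 438.0, 451.0, 467.0, 480.0])

# Fit a simple straight-line trajectory through the prior scores
# (a stand-in for the state's actual, more sophisticated model).
slope, intercept = np.polyfit(grades, scores, 1)

# Predict the grade-8 score from the trajectory (the dotted line in Figure 1).
predicted = slope * 8 + intercept

# Positive growth: the student grew more than expected (the green dot);
# negative growth: less than expected (the red dot).
actual = 497.0
growth = actual - predicted
print(f"predicted: {predicted:.1f}  actual: {actual}  growth: {growth:+.1f}")
```

Notice that proficiency never enters the calculation: the prediction and the residual depend only on the student's own score history.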

Students included in Arkansas' growth model were tested on the state assessment in English Language Arts and math in grades 2 through 10 and had prior state assessment scores. When predicting student scores, the model uses as many years of a student's score history as are available. Although two students may have scored similarly on state assessments in the year immediately prior to the prediction, they could have had very different score histories; perhaps one was declining while the other was improving.

This student-level growth model becomes a "Value-Added" model when the growth is attributed to the school. The student-level growth scores are averaged at the school level, and this average is a measure of how "effective" the school has been at educating students, or, in other words, how much "Value" the school has "Added" to its students' learning.
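Continuing the sketch above, the school-level aggregation is just an average of the student-level growth scores. The school IDs and growth values below are made up for illustration.

```python
import pandas as pd

# Hypothetical student-level growth scores (actual minus predicted),
# one row per tested student; school IDs are made up.
students = pd.DataFrame({
    "school": ["A", "A", "A", "B", "B", "C", "C"],
    "growth": [0.10, -0.02, 0.07, -0.12, -0.05, 0.01, 0.03],
})

# A school's value-added score is the average of its students' growth scores.
value_added = students.groupby("school")["growth"].mean()
print(value_added)
```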

Interpreting Value-Added scores

Schools with a growth score close to 0 enrolled students who, on average, grew academically at the expected rate. Positive values indicate that students at the school, on average, grew more than expected, while negative values indicate that the average student at the school grew less than predicted. These value-added scores have a mean of 0, and the standard deviation at the school level is about 0.07, so a school scoring +0.14, for example, sits two standard deviations above the state average.

What do we know about school-level growth scores?

Arkansas has reported school-level values from the current growth model for two years. The results can help inform educators, parents, and policymakers about which schools are making greater-than-predicted academic improvements and which schools might need additional support to ensure that all students are growing academically.


So, quick question…

Given what you know about Arkansas' growth model, do you think the scores will be related to other school characteristics such as school size, the percentage of students eligible for Free/Reduced Lunch, or proficiency rates?


School size:

Do you think school size would be related to its students' academic growth? We have no reason to think that smaller schools or larger schools would produce better growth rates, but we want to use size as an introduction to interpreting school-level value-added scores.

We examined the relationship between 2015-16 school value-added scores and the number of students included in the model. Students included were tested on ACT Aspire English Language Arts and Math in grades 3-10, or the ITBS in grade 2, and had prior state assessment scores.

The scatter plot of the values for all Arkansas schools is presented in Figure 2. The vertical axis presents the school value-added score and ranges from -0.5 to +0.5. The horizontal axis represents the number of students included in the growth model for the school and ranges from 0 to 3,000 (we trimmed a few extremely large schools from the graph for illustrative purposes).

As you can see from the scatter plot, there is essentially no relationship between the number of students tested at a school and the value-added score that the school received. The correlation is essentially zero at +0.08.

Figure 2: Arkansas’ School-level Value-Added Scores and Number of Students Assessed/Included in Growth Model, 2015-16.

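For readers who want to reproduce this kind of check, a correlation like this is a one-liner once you have a school-level file. The file name and column names below are our assumptions, not the actual published data layout.

```python
import pandas as pd

# Assumed school-level file: one row per school, with a value-added
# score and the number of students included in the growth model.
schools = pd.read_csv("school_vas_2016.csv")

# Pearson correlation between value-added and students included
# (the relationship plotted in Figure 2; the post reports about +0.08).
r = schools["value_added"].corr(schools["n_tested"])
print(f"correlation: {r:+.2f}")
```

Swapping in an FRL-rate or prior-year proficiency column gives the correlations reported below for Figures 3 and 4 in the same way.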

Free/Reduced Lunch:

Do you think schools serving more economically advantaged students would have higher growth rates? Or would you expect a school with a high enrollment of disadvantaged students to have "more room to grow", and that this would be reflected in higher growth scores? We examined the relationship between 2015-16 growth values and rates of FRL eligibility for all Arkansas schools.

The scatter plot of the values for all Arkansas schools is presented in Figure 3. The vertical axis presents the school value-added score and ranges from -0.5 to +0.5. The horizontal axis represents the school-level percentage of students eligible for Free/Reduced Lunch, a proxy measure for low socio-economic status, and the values range from 100% to 0%.

As you can see from Figure 3 below, there is some relationship between a school's Free/Reduced Lunch rate and the value-added score that the school received. While at most Free/Reduced Lunch rates there are schools that received higher and lower value-added scores, almost all schools with fewer than 30% of students eligible for FRL received positive growth values. The correlation between value-added and school-level FRL is moderately negative at -0.48.

Figure 3: Arkansas’ School-level Value-Added Scores and Percentage of Students Eligible for Free/Reduced Lunch Program, 2015-16.


Proficiency Rate:

Would you expect schools with higher-achieving students to have higher growth rates? Or would you expect schools with more low-performing students to have higher value-added scores because the students have "more room to grow"? We examined the relationship between 2015-16 growth values and proficiency rates for all Arkansas schools. Because 2015-16 proficiency rates would be directly impacted by student growth (higher growth would lead to higher proficiency), we used the 2014-15 school proficiency rates to examine the relationship between proficiency and growth.

The scatter plot of the values is presented in Figure 4. The vertical axis represents the school value-added score, and ranges from -0.5 to +0.5. The horizontal axis represents the percent of students meeting or exceeding benchmarks on the 2015 PARCC Math and ELA assessments in grades 3-10, and the values range from 0% to 100%.

As you can see from Figure 4, there is a moderate relationship between the percentage of students scoring proficient at a school and the value-added score that the school received: schools that had proficiency rates over 50% in 2015 generally also had higher value-added scores in 2016. Schools with lower proficiency rates had greater variability in their value-added scores but were more likely to receive low value-added scores in 2016. The correlation between value-added and prior-year proficiency is moderately positive at +0.57.

Figure 4: Arkansas’ School-level Value-Added Scores (2015-16) and Percentage of Students Scoring Proficient, 2014-15.


In Figure 5, we overlay the scatter plot presented in Figure 4 with colored zones to facilitate visualizing the patterns in the data (schools in the white band are demonstrating expected growth):

  • The upper-left quadrant represents schools where students were below the state average in proficiency rates but demonstrated greater than expected growth.
  • The upper-right quadrant represents schools where students were above the state average in proficiency rates and demonstrated greater than expected growth.
  • The lower-left quadrant represents schools where students were below the state average in proficiency rates and demonstrated lower than expected growth.
  • The lower-right quadrant represents schools where students were above the state average in proficiency rates but demonstrated lower than expected growth.
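As a sketch of how one might sort schools into these quadrants, the snippet below compares each school's prior-year proficiency to the state average and its value-added score to zero. The column names, the example rows, and the use of 0.04 as the half-width of the "expected growth" band (borrowed from the similar-schools guidance later in this post) are all our assumptions.

```python
import pandas as pd

# Made-up example rows; columns are assumptions, not the published layout.
schools = pd.DataFrame({
    "proficient_2015": [0.30, 0.62, 0.45, 0.75],
    "vas_2016": [0.09, 0.03, -0.08, -0.05],
})

state_avg = schools["proficient_2015"].mean()
band = 0.04  # assumed half-width of the "expected growth" white band

def quadrant(row):
    prof = "high proficiency" if row["proficient_2015"] >= state_avg else "low proficiency"
    if abs(row["vas_2016"]) <= band:
        return f"{prof}, expected growth"
    growth = "high growth" if row["vas_2016"] > 0 else "low growth"
    return f"{prof}, {growth}"

schools["quadrant"] = schools.apply(quadrant, axis=1)
print(schools)
```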

While no one would argue against celebrating the schools with high proficiency and high growth, or against supporting the schools with low proficiency and low growth, many disagree about whether we should be more concerned about schools with low proficiency and high growth or schools with high proficiency and low growth.

Figure 5: Arkansas’ School-level Value-Added Scores (2015-16) and Percentage of Students Scoring Proficient, 2014-15 with Highlighted Quadrants.


How Consistent are Value Added Scores?

A common concern regarding value-added scores is that they are inconsistent, fluctuating up and down over time. Arkansas only has two years of value-added data, but we looked to see how consistent the values were. The scatter plot of the values is presented in Figure 6. The vertical axis represents the 2016 value-added score, while the horizontal axis represents the 2015 value-added score. Both axes range from -0.5 to +0.5.

As you can see from Figure 6, there is a weak relationship between the school-level value-added scores from 2015 and 2016. Some schools that scored positively in 2015 had negative value-added for 2016, and vice versa. The correlation between school-level value-added from 2015 and 2016 is weak at 0.35.

Figure 6: Arkansas' School-level Value-Added Scores, 2014-15 and 2015-16.

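Checking this kind of year-to-year stability requires matching schools across the two files before correlating. The file and column names below are assumed for illustration.

```python
import pandas as pd

# Assumed file and column names for the two years of school-level scores.
v15 = pd.read_csv("school_vas_2015.csv")  # columns: school_id, value_added
v16 = pd.read_csv("school_vas_2016.csv")

# Match schools across years, then correlate the two scores
# (the relationship plotted in Figure 6; the post reports about 0.35).
both = v15.merge(v16, on="school_id", suffixes=("_2015", "_2016"))
r = both["value_added_2015"].corr(both["value_added_2016"])
print(f"year-to-year correlation: {r:+.2f}")
```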


What does all this mean?

Arkansas' longitudinal student growth model measures the academic improvement of students over time and attributes that "Value-Added" to the school that they attend. When we examine the value-added information for the two years for which data are available, we find that the values are moderately correlated with FRL and the percent of students proficient at the school in the previous year, and that the school-level value-added scores from 2015 and 2016 are only weakly related.

We strongly support measuring student-level growth, and think it is definitely the right thing to do.

That said, we were surprised that the value-added scores were so correlated with prior-year proficiency and FRL rates. We had hoped they would be more independent of school characteristics, as school size is, because when they are not, it makes us wonder why individual student growth would be related to those characteristics.

We must remember, however, that we are just examining correlations, and 'correlation is not causation'. Perhaps high-achieving schools are presenting students with more rigorous material that promotes student growth? Maybe some low-performing schools have ineffective practices, or are feeling defeated after years of "not meeting proficiency". The goal of using student growth is to isolate the impact of the school as much as possible, excluding external factors from our measurements of school quality. When we see the relationship between proficiency, poverty, and growth, we are concerned that we may not yet have achieved success.

We are also somewhat concerned that the value-added scores for the two available years are only weakly related at 0.35. On the one hand, things may change within a school that might have a positive or negative impact on school growth (teachers, principal, curriculum, etc.), and we may not WANT consistency, but rather some variability that reflects the impact of those changes on student growth. On the other hand, we are somewhat surprised that the two years are so weakly correlated while proficiency is strongly correlated across the two years at 0.81.

Although the student growth model is well suited for measuring change even across different assessments, as Arkansas has experienced, perhaps all the changes have resulted in inconsistent growth data: not because of the model itself, but because of the different test content and format, different performance expectations, students getting used to taking the assessments on computers, and so on.

Next Steps:

It may take some time for parents, educators, and policymakers to better understand what the value-added scores mean and how to act on the information to support student learning, but there are some things we recommend doing right now with regard to value-added scores.

  • Learn more about Arkansas’ value-added Scores (you can check this one off already!)
  • Check out your school’s value-added from 2015 and 2016 (you can access it here)
  • See how your school's value-added score compares with "similar schools" (remember to consider differences of 0.04 or less as essentially the same as your school's score).
  • State department personnel are likely reviewing schools that have had very low value-added scores for the past two years to see if additional support is needed, while schools that have had very high value-added scores should be reviewed to see if they are implementing any unique practices that could transfer to other schools.
  • Don’t get TOO wrapped up in the value-added scores yet. We look forward to examining the 2016-17 value-added scores to see if the relationship between these key variables is changing.
  • Chime in on ESSA planning.  You now have a better understanding than Betsy DeVos of the proficiency-versus-growth issue, and particularly how it impacts schools in Arkansas.  Let your voice be heard on the new state plan!

Perhaps this was more than you wanted to know about the ins and outs of growth and proficiency, but here at OEP, we believe that the more we all know about the measures used to gauge student and school success, the better decisions we can make to support our schools.
