University of Arkansas Office for Education Policy

Posts Tagged ‘School rewards’

Growth versus Proficiency in Arkansas

In The View from the OEP on April 5, 2017 at 12:55 pm

One of our favorite topics came up recently during the confirmation hearing for the Federal Secretary of Education, when the question of “Growth versus Proficiency” as the best method for measuring schools was raised. This is an important question for students, parents, educators, and policymakers, and we think it is important for all to understand how these measures of school quality are being used in Arkansas, the difference between using growth and proficiency to evaluate school performance, and how these measures relate to school-level characteristics.

Tall to Ride.png

Proficiency measures whether students met the criteria set by the state for ‘grade level’ performance on the state test. We like to think of it as the “You Must Be This Tall To Ride” signs at amusement parks.

Wall Growth.png

Growth measures if students grew academically from one year to the next as much as we expected them to grow based on what we know about them. This is similar to marks on a wall that parents use to track their child’s increasing height over time.

In today’s blog we examine Arkansas’ use of growth measures and how they relate to other school characteristics.

Proficiency rates have been the main measure of Arkansas school quality since No Child Left Behind, but Growth has been included in the state’s A-F school letter grades since 2014. Arkansas is currently developing a new plan to evaluate and report school quality and to identify schools in need of additional support under the federal requirements of the Every Student Succeeds Act (ESSA). In addition to these applications of proficiency and growth rates, under the Arkansas School Recognition Program, schools in the top 10% of the state in proficiency and those in the top 10% of the state in growth are provided monetary rewards (you can find the list of top schools here and read our prior blog about it).

Success is more than “Proficiency”

Over the past 15 years since No Child Left Behind, students, parents, educators and policymakers have realized that proficiency rates have serious limitations when describing the quality of a school.

  • Certain types of students are more likely to be proficient on the state test: those who enter kindergarten already reading rather than those who were rarely exposed to books at home, and students who were well fed, both before and after birth, rather than those suffering from food insecurity. In Arkansas, as elsewhere, non-minority, non-disadvantaged students tend to be more likely to meet state “Proficiency” standards. When we use proficiency to evaluate schools, as we did under NCLB, it is very difficult for schools serving minority and disadvantaged students to be successful. Even if a school is doing an excellent job educating its students, it is difficult for students who started school behind to meet grade-level expectations.
  • Proficiency targets can lead schools to focus on improving the skills of students who are ‘close’ to meeting the target, while neglecting high-performing students who will certainly exceed it and struggling students who are very unlikely to meet it by the end of the year.
  • States created their own assessments and set their own ‘proficiency’ criteria, meaning that a student ‘proficient’ in one state may not be ‘proficient’ in another.

How Can We Measure Student Growth?

To try to quantify how effective a school is at increasing the knowledge of its students, many states, including Arkansas, are measuring changes in individual student achievement over time. There are many different models for measuring growth, but Arkansas’ model uses a student’s prior academic performance on state assessments to predict where the student will likely score, then compares the actual score to the predicted score.

The process is illustrated in Figure 1. The dark blue dots represent an individual student’s score history on state assessments from third to seventh grade. Using these prior scores, researchers generate a trajectory (represented by the dotted line) that predicts what the student would be likely to score on the 8th grade test. If the student scores at the light blue dot, which is on the trajectory, then the student grew as expected. If the student scored at the level of the green dot, the student grew more than expected, and if the student scored at the level of the red dot, the student grew less than expected. It is important to note that student “Proficiency” is not indicated in the figure. This student may be well below, well above, or right at ‘grade level’, but this standard is not considered in relation to the student’s academic growth over time. You can learn more about the process here.

Figure 1: Example of Arkansas’ Longitudinal Growth Model

Growth Model

Students included in Arkansas’ growth model were tested on the state assessment in English Language Arts and Math in grades 2 through 10, and had prior state assessment scores. When predicting student scores, we should use as many years of a student’s score history as are available: although two students may have scored similarly on state assessments in the year immediately prior to the prediction, they could have had very different score histories; perhaps one was declining while the other was improving.
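As a rough sketch of the prediction step, a straight-line trajectory can be fit through a student's prior scores and extrapolated to the next grade. The grades, scale scores, and simple linear fit below are all hypothetical; Arkansas' actual longitudinal model is more sophisticated, but the compare-actual-to-predicted logic is the same.

```python
import numpy as np

# Hypothetical score history for one student on state assessments,
# grades 3 through 7 (scale scores invented for illustration).
grades = np.array([3, 4, 5, 6, 7])
scores = np.array([410.0, 425.0, 441.0, 452.0, 468.0])

# Fit a straight-line trajectory through the prior scores.
slope, intercept = np.polyfit(grades, scores, deg=1)

# Extrapolate the trajectory to predict the grade-8 score.
predicted_8 = slope * 8 + intercept

# Growth residual: actual minus predicted. Positive means the student
# grew more than expected (the green dot); negative means less (the red dot).
actual_8 = 485.0
residual = actual_8 - predicted_8
print(round(predicted_8, 1), round(residual, 1))
```

Note that proficiency never enters the calculation: the residual is positive or negative regardless of whether any of these scores clears the ‘grade level’ bar.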

This student-level growth model becomes a “Value-Added” model when the growth is attributed to the school. The student-level growth scores are averaged at the school level, and this average is a measure of how “effective” the school has been at educating students, or, in other words, how much “Value” the school has “Added” to its students’ learning.
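The aggregation step can be sketched in the same spirit. The school names and residual values below are invented for illustration; the point is simply that a school's value-added score is the average of its students' growth residuals.

```python
import numpy as np

# Hypothetical student growth residuals (actual minus predicted score,
# in the same units as the reported value-added scale), grouped by school.
residuals_by_school = {
    "School A": [0.12, -0.03, 0.08, 0.05],
    "School B": [-0.10, -0.04, 0.02, -0.08],
}

# A school's value-added score is the mean of its students' residuals.
value_added = {s: float(np.mean(r)) for s, r in residuals_by_school.items()}
print(value_added)  # School A grew more than expected; School B grew less.
```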

Interpreting Value-Added scores

Schools that receive a growth score close to 0 enroll students who grew academically at the expected rate. Positive values indicate that students at the school, on average, grew more than expected, while negative values indicate that the average student at the school grew less than predicted. These value-added scores have a mean of 0, and the standard deviation at the school level is about 0.07.
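Because the scores are centered at 0, the easiest way to read one is in standard-deviation units. A minimal sketch, using the school-level SD of about 0.07 cited above:

```python
# Interpreting a school's value-added score in standard-deviation units,
# using the mean of 0 and the school-level SD of about 0.07 cited above.
SCHOOL_SD = 0.07

def vas_in_sd_units(vas: float) -> float:
    """Return how many school-level SDs a value-added score sits from 0."""
    return vas / SCHOOL_SD

# A school at +0.07 is one SD above the average school; -0.14 is two below.
print(vas_in_sd_units(0.07), vas_in_sd_units(-0.14))
```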

What do we know about school-level growth scores?

Arkansas has reported school level values from the current growth model for two years. The results can help inform educators, parents, and policymakers about schools that are making greater than predicted academic improvements, and which schools might need additional support to ensure that all students are growing academically.


So, quick question…

Given what you know about Arkansas’ growth model, do you think the scores will be related to other school characteristics such as school size, the percentage of students eligible for Free/Reduced Lunch, or proficiency rates?


School size:

Do you think school size would be related to its students’ academic growth? We have no reason to think that smaller or larger schools would produce better growth rates, but we want to use school size as an introduction to interpreting school-level value-added scores.

We examined the relationship between 2015-16 school value-added scores and the number of students included in the model. Students included were tested on ACT Aspire English Language Arts and Math in grades 3-10, or the ITBS in grade 2, and had prior state assessment scores.

The scatter plot of the values for all Arkansas schools is presented in Figure 2. The vertical axis represents the school value-added score and ranges from -0.5 to +0.5. The horizontal axis represents the number of students included in the growth model for the school and ranges from 0 to 3,000 (we trimmed a few extremely large schools from the graph for illustrative purposes).

As you can see from the scatter plot, there is essentially no relationship between the number of students tested at a school and the value-added score that the school received. The correlation is essentially zero at +0.08.
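The correlations reported throughout this post are ordinary Pearson correlation coefficients, computed from two school-level columns. The five schools below are hypothetical; with the real data, the same computation would yield the +0.08 reported above.

```python
import numpy as np

# Hypothetical school-level data: number of students included in the growth
# model and the school's value-added score, for five invented schools.
n_tested = np.array([120, 340, 560, 900, 1500])
value_added = np.array([0.04, -0.06, 0.02, -0.01, 0.03])

# Pearson correlation coefficient between the two columns.
r = float(np.corrcoef(n_tested, value_added)[0, 1])
print(round(r, 2))
```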

Figure 2: Arkansas’ School-level Value-Added Scores and Number of Students Assessed/Included in Growth Model, 2015-16.

ENROLL_VAS.png

Free/Reduced Lunch:

Do you think schools serving more economically-advantaged students would have higher growth rates? Or would you expect a school with a high enrollment of disadvantaged students to have “more room to grow,” and that this would be reflected in higher growth scores? We examined the relationship between 2015-16 growth values and rates of FRL eligibility for all Arkansas schools.

The scatter plot of the values for all Arkansas schools is presented in Figure 3. The vertical axis represents the school value-added score and ranges from -0.5 to +0.5. The horizontal axis represents the school-level percentage of students eligible for Free/Reduced Lunch, a proxy measure for low socio-economic status, and the values range from 100% to 0%.

As you can see from Figure 3 below, there is some relationship between a school’s FRL rate and the value-added score that the school received. While at most Free/Reduced Lunch rates there are schools that received both higher and lower value-added scores, almost all schools with fewer than 30% of students eligible for FRL received positive growth values. The correlation between value-added and school-level FRL is moderately negative at -0.48.

Figure 3: Arkansas’ School-level Value-Added Scores and Percentage of Students Eligible for Free/Reduced Lunch Program, 2015-16.

FRL_VAS.png

Proficiency Rate:

Would you expect schools with higher-achieving students to have higher growth rates? Or would you expect schools with more low-performing students to have higher value-added scores because the students have “more room to grow”? We examined the relationship between 2015-16 growth values and proficiency rates for all Arkansas schools. Because 2015-16 proficiency rates would be directly impacted by student growth (higher growth would lead to higher proficiency), we used the 2014-15 school proficiency rates to examine the relationship between proficiency and growth.

The scatter plot of the values is presented in Figure 4. The vertical axis represents the school value-added score, and ranges from -0.5 to +0.5. The horizontal axis represents the percent of students meeting or exceeding benchmarks on the 2015 PARCC Math and ELA assessments in grades 3-10, and the values range from 0% to 100%.

As you can see from Figure 4, there is a moderate relationship between the percentage of students scoring proficient at a school and the value-added score that the school received: schools with proficiency rates over 50% in 2015 also tended to have higher value-added scores in 2016. Schools with lower proficiency rates showed greater variability in value-added scores but were more likely to receive low value-added scores in 2016. The correlation between value-added and prior-year proficiency is moderately positive at +0.57.

Figure 4: Arkansas’ School-level Value-Added Scores (2015-16) and Percentage of Students Scoring Proficient, 2014-15.

PRO_VAS.png

In Figure 5, we overlay the scatter plot presented in Figure 4 with colored zones to facilitate visualizing the patterns in the data. The upper-left quadrant represents schools where students were below the state average in proficiency rates but demonstrated greater than expected growth (schools in the white band demonstrated expected growth). The upper-right quadrant represents schools where students were above the state average in proficiency rates and demonstrated greater than expected growth. The lower-left quadrant represents schools where students were below the state average in proficiency rates and demonstrated lower than expected growth, and the lower-right quadrant represents schools where students were above the state average in proficiency rates but demonstrated lower than expected growth.

While no one would dispute that we should celebrate the schools with high proficiency and high growth, and support the schools with low proficiency and low growth, many disagree about whether we should be more concerned about schools with low proficiency and high growth or schools with high proficiency and low growth.

Figure 5: Arkansas’ School-level Value-Added Scores (2015-16) and Percentage of Students Scoring Proficient, 2014-15 with Highlighted Quadrants.

PRO_VAS_color

How Consistent are Value Added Scores?

A common concern regarding value-added scores is that they are inconsistent, fluctuating up and down over time. Arkansas has only two years of value-added data, but we looked to see how consistent the values were. The scatter plot of the values is presented in Figure 6. The vertical axis represents the 2016 value-added score while the horizontal axis represents the 2015 value-added score. Both axes range from -0.5 to +0.5.

As you can see from Figure 6, there is a weak relationship between the school-level value-added scores from 2015 and 2016. Some schools that scored positively in 2015 had negative value-added for 2016, and vice versa. The correlation between school-level value-added scores from 2015 and 2016 is weak at 0.35.

Figure 6: Arkansas’ School-level Value-Added Scores,  2014-15 and 2015-16.

VAS_VAS


What does all this mean?

Arkansas’ longitudinal student growth model measures the academic improvement of students over time and attributes that “Value-Added” to the school they attend. When we examine the value-added information for the two years for which data are available, we find that the values are moderately correlated with FRL rates and with the percentage of students proficient at the school in the previous year, and that the school-level value-added scores from 2015 and 2016 are only weakly related.

We strongly support measuring student level growth, and think it is definitely the right thing to do.

That said, we were surprised that the value-added scores were so correlated with prior year proficiency and FRL rates. We were hoping they would be more independent, like school size, because when they aren’t it makes us wonder why individual student growth would be related to these school characteristics.

We must remember, however, that we are just examining correlations, and ‘correlation is not causation’. Perhaps high-achieving schools are presenting students with more rigorous material that promotes student growth. Maybe some low-performing schools have ineffective practices, or their staff feel defeated after years of “not meeting proficiency.” The goal of using student growth is to isolate the impact of the school as much as possible, excluding external factors from our measurements of school quality. When we see the relationship between proficiency, poverty, and growth, we are concerned that we may not yet have achieved that goal.

We are also somewhat concerned that the value-added scores for the two available years are only weakly related at 0.35. On the one hand, things may change within a school that could have a positive or negative impact on school growth (teachers, principal, curriculum, etc.), and we may not WANT perfect consistency but rather some variability to reflect the impact of changes in the school on student growth. On the other hand, we are somewhat surprised that the two years are so weakly correlated while proficiency is strongly correlated across the two years at 0.81.

Although the student growth model is well suited to measuring change even across different assessments, as Arkansas has experienced, perhaps all the changes have resulted in inconsistent growth data: not because of the model itself, but because of the different test content and format, different performance expectations, students getting used to taking the assessments on computers, and so on.

Next Steps:

It may take some time for parents, educators, and policymakers to better understand what the value-added scores mean and how to act on the information to support student learning, but there are some things we recommend doing right now regarding value-added scores.

  • Learn more about Arkansas’ value-added Scores (you can check this one off already!)
  • Check out your school’s value-added from 2015 and 2016 (you can access it here)
  • See how your school’s value-added score compares with “similar schools.” Remember to consider differences of 0.04 or less as essentially the same as your school’s.
  • State department personnel are likely reviewing schools that have had very low value-added scores for the past two years to see if additional support is needed, while schools that have had very high value-added scores should be reviewed to see if they are implementing any unique practices that could transfer to other schools.
  • Don’t get TOO wrapped up in the value-added scores yet. We look forward to examining the 2016-17 value-added scores to see if the relationship between these key variables is changing.
  • Chime in on ESSA planning.  You now have a better understanding than Betsy DeVos of the proficiency-versus-growth issue, and particularly how it impacts schools in Arkansas.  Let your voice be heard on the new state plan!

Perhaps this was more than you wanted to know about the ins and outs of growth and proficiency, but here at OEP, we believe that the more you know about the measures being used to evaluate student and school success, the better decisions we can make to support our schools.

Rewards and Recognition

In AR Legislature, The View from the OEP on December 21, 2016 at 1:33 pm

Here at the OEP, we love to see schools get recognized for excellence.  Last Friday, the Arkansas Department of Education announced the Arkansas School Recognition and Reward Program (read the commissioners memo about the program here).

The ADE rankings of schools are posted on the OEP website here and you can look to see how your school fared in the performance and growth/graduation rankings.

Show me the Money!

The program is offering almost $7 million in reward funds to 158 schools (out of 1,037 schools in the state). State education funding is not often allocated at the school level, so this program is unique in distributing funds directly to schools rather than to districts.

Schools receive $100 per student for being in the top 5% of schools in the state and $50 per student for being in the top 6-10% of schools in the state.
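The reward formula is simple enough to express directly. The percentile thresholds below (a rank of 95 or above read as "top 5%") are our interpretation of how the rankings map to the two bands, so treat this as a sketch rather than the ADE's exact rule.

```python
def reward_amount(percentile_rank: float, enrollment: int) -> int:
    """Reward dollars for a school under the recognition program:
    $100 per student for the top 5% of schools, $50 per student for the
    top 6-10%. A rank of 95 or above is read here as 'top 5%'."""
    if percentile_rank >= 95:
        return 100 * enrollment
    if percentile_rank >= 90:
        return 50 * enrollment
    return 0

# A 400-student school: $40,000 in the top 5%, $20,000 in the top 6-10%.
print(reward_amount(97, 400), reward_amount(92, 400))
```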

Schools can spend the money on:

  • Non-recurring bonuses to faculty and staff,
  • temporary personnel to assist, maintain and improve student performance, or
  • educational equipment or materials.

A school committee including the principal, a teacher elected by the faculty, and a parent representative (selected by the PTA or another parental involvement group) determines how the school would like to spend the funds, and the proposal must then be approved by the ADE.

There are two categories of rewards: Performance and Growth/Graduation.

 Performance Rewards:

  • Performance awards are based on student performance on the 2015-16 state assessments in ELA and Math.

There are 51 schools in the top 5% and 52 schools between 6% and 10%:

  • 76 are elementary schools (14% of the state’s elementary schools),
  • 25 are middle and junior high schools (12% of the state’s middle and junior high schools),
  • 2 are high schools (less than 1% of the state’s high schools – only Haas Hall Fayetteville and Bentonville)

Not surprisingly, the schools rewarded for performance are less poor than the state overall: only 33% of students in the top 5% performance schools and 45% of students in the top 6-10% receive free-and-reduced lunch, while schools that did not receive a performance reward serve a 64% FRL population.

Although there is a relationship between student performance and poverty, there isn’t a strong correlation between performance rank and student poverty.  The figure below shows the relationship between each school’s Performance Rank and % FRL.  The schools in the green box are schools identified for performance rewards as they were ranked above the 90th percentile.  On the far left side of the figure, you can see a school with a performance rank of 99 and 0% FRL.  If you look to the right side of the figure, however, you can see that a school with 80% of students eligible for FRL received a performance rank of 95. This school, College Station Elementary from PCSSD, is an example of a school where students are high performing despite academic risk factors.

screen-shot-2016-12-21-at-11-23-59-am

 

As can be seen in the figure below, the Northwest and Central regions have the highest percentages of performance reward schools (48% and 32% respectively).

perf_legend

performance-state

 

 

Growth/ Graduation Rewards:

  • Growth awards are based on school-level growth in student performance from the 2014-15 to the 2015-16 state assessments in ELA and Math.
  • For high schools, this award is based on the ranking on their 2015 graduation rate.

Growth: There are 35 schools in the top 5% and 40 schools between 6% and 10%:

  • 50 are elementary schools (9% of the state’s elementary schools),
  • 25 are middle or junior high schools, in addition to a few small high schools (12% of the state’s middle and junior high schools).

Graduation: There are 15 high schools in the top 5% and 15 schools between 6% and 10% (11% of the state’s high schools).  More than half of these schools are 7-12 schools, and the average enrollment is less than 300 students.

We would expect student growth and graduation to be less correlated with student FRL participation, and the group is a bit more diverse, but the schools rewarded for growth and graduation are still less poor than the state overall: only 36% of students in the top 5% growth/grad schools and 49% of students in the top 6-10% receive free-and-reduced lunch, while schools that did not receive a growth/grad reward serve a 63% FRL population.

The figure below shows the relationship between each school’s Growth Rank and % FRL. High schools with graduation rankings are not included in the figure, to allow better examination of the relationship between poverty and growth. The schools in the green box are schools identified for growth rewards as they were ranked above the 90th percentile. On the far left side of the figure, you can see a school with a growth rank of 98 and 7% FRL. If you look to the right side of the figure, however, you can see that a school with 92% of students eligible for FRL received a growth rank of 92. This school, Pine Bluff Lighthouse College Prep, is an example of a school where students are demonstrating high academic growth despite academic risk factors.

screen-shot-2016-12-21-at-11-23-52-am

 

As can be seen in the figure below, the Northwest and Central regions have the highest percentages of growth/graduation reward schools (43% and 33% respectively).

growth-legend

growth-state

Closing Thoughts:

Hooray!   Congratulations to all the schools that received awards!   We love the use of a student-level growth model to reward schools that are making strong gains with their students but may not yet be achieving the highest levels of performance.  We did find it interesting, however, that there was so much overlap between the awards: only 1/4 of the schools that received an award for growth did not also receive an award for performance.

Hmmm…We are concerned that the reward money is flowing only to certain areas of the state as almost no schools in the Southwest or Southeast regions of the state received reward money for performance or growth/ graduation.

We hope that schools that didn’t receive a reward this year examine the data to see which similar schools DID. In both the performance and growth graphics, we can see schools with similar FRL percentages performing at very different levels.

We would also be interested in seeing how schools are spending the money and what impact that is having on teachers and students.

We hope you and yours enjoy the holiday and stay tuned for more analysis about student performance and growth!