University of Arkansas Office for Education Policy


Say Goodbye to the 70s- PARCC Scores Are Here

In The View from the OEP on November 12, 2015 at 1:57 pm

Seems like just yesterday it was the 70s:

In 2013-14, 78% of students scored proficient or advanced on state literacy assessments, and 72% scored proficient or advanced on state math assessments.

Today the State Board of Education approved the PARCC cut scores for grades 3-8 English Language Arts and Math, allowing us the first opportunity to see how well Arkansas students scored on the new, much discussed, and now abandoned test.

How Did Arkansas Do?

Table 1. Percent of Arkansas Students Scoring “Proficient” (Level 4) and Above on the 2014-15 PARCC Assessment

WHOA! Only about one in three Arkansas students scored proficient or better?

Last year’s test data showed more than twice that amount!  WHAT HAPPENED?

We got a new test!

PARCC is the first assessment aligned to Arkansas’ Common Core State Standards, which set a higher bar for student learning, emphasizing the need for students to demonstrate critical thinking, problem solving, and clear writing. PARCC results cannot be compared with the earlier Arkansas Benchmark results because PARCC is a different test measuring different expectations. This will be the only year of PARCC results, as Arkansas switched to ACT Aspire for assessment this school year.

In fact, these proficiency rates might sound familiar to you. That is because just two weeks ago the results of the National Assessment of Educational Progress (NAEP) were released, and the scores were very similar.

How Did Arkansas Compare to Other PARCC States?

You may remember that  one of the key benefits of PARCC was that we would be able to compare Arkansas student performance to the performance of students in other states.

So far seven states have released their scores for grades 3-8.  Note: Some additional states have released high school scores, but because of differences in testing requirements and implementation, cross-state comparison of high school results isn’t useful.

The six other states that have released (at least preliminary) PARCC results are New Mexico, Louisiana, Illinois, Ohio, New Jersey and Massachusetts. The states are VERY different in many ways, but a key characteristic related to assessment is poverty. We would expect states enrolling a greater percentage of students who are eligible for Free/Reduced Price Lunch (a proxy variable for poverty) to underperform states with fewer students eligible for Free/Reduced Price Lunch. The seven PARCC states that have released scores range in FRL percentages from New Mexico, with the greatest poverty at 68.5% FRL, to Massachusetts, with only 35.1% FRL. In the figures below, states are arranged from MOST FRL on the left to LEAST FRL on the right. Not surprisingly, Massachusetts outperformed New Mexico. Arkansas enrolls 60.9% of students eligible for FRL and is represented in the figures below by the RED bars.

Figure 1. Percent of Students Scoring “Proficient” (Level 4) and Above on the 2014-15 PARCC ELA Assessment

Note: Ohio did not report scores for 3rd grade ELA.

Figure 2. Percent of Students Scoring “Proficient” (Level 4) and Above on the 2014-15 PARCC Math Assessment

Notes: New Jersey 8th grade scores are not representative. Massachusetts allowed districts to choose between PARCC and the prior state assessment, and the split was fairly even. Reported PARCC results for MA are based on a large representative sample, matched on achievement and demographic variables prior to score availability.

What Does This Mean?

English Language Arts

  • Arkansas is performing similarly to what we might expect, given our student population. Arkansas students outperform students from New Mexico, and are not as likely to be proficient as students from Massachusetts.
  • In many grades, Arkansas students scored similarly to students from states which have less disadvantaged student populations (Illinois and Ohio).
  • Interestingly, Louisiana students outperformed Arkansas students in almost every grade, even though they are more likely to be disadvantaged.

Math

  • Arkansas is performing similarly to what we might expect, given our student population. Arkansas students outperform students from New Mexico, and are not as likely to be proficient as students from Massachusetts.
  • Interestingly, Louisiana students outperformed (or equaled) Arkansas students in every grade, even though they are more likely to be disadvantaged.
  • 8th grade math scores are variable, perhaps in part because some advanced students completed high school level assessments (Algebra or Math I) instead of 8th grade math.

How Does PARCC Compare to NAEP?

The scores are very similar overall, but there are some consistent patterns in how the two sets of results relate.

In Reading, PARCC proficiency rates are typically a little bit higher than NAEP. Arkansas 4th graders were 2 percentage points more likely to be proficient on PARCC, and Arkansas 8th graders were 5 percentage points more likely.

In Math, PARCC proficiency rates  are typically a little bit lower than NAEP at 4th grade, and quite varied at 8th grade. Arkansas 4th and 8th  graders were 8 percentage points less likely to be proficient on PARCC than on NAEP.

So What Now?

Arkansas has gotten a lot of feedback about its education system recently- and while it isn’t great news, we need to be sure we have the right takeaways as we continue to move Arkansas education forward.

  1. Face the Music: PARCC and NAEP scores give us a clear picture of how Arkansas students perform compared to other states.  Both assessments are sending the same message- about one in three Arkansas students are ‘on grade level’.
  2. Learn the Steps: Arkansas teachers, students and parents need frequent, high quality data to provide a clear picture of where students are academically.
  3. Practice the Moves: Teachers need training on how to EFFECTIVELY use assessment data for their students to transform the instruction in the classroom.
  4. Find a Partner: Arkansas should consider why Louisiana is consistently outperforming us.
  5. Strut Your Stuff!  We look forward to seeing Arkansas students demonstrate improved performance!  For comparable data, we will likely need to wait until NAEP 2017.

Arkansas ‘Waives’ Goodbye to NCLB Provisions

In The View from the OEP on August 15, 2012 at 11:54 am

On June 29th, 2012, the US Department of Education (USDE) announced that it had approved Arkansas’ ESEA waiver request. Here at the OEP, we read the 158-page ESEA Flexibility Request and summarized the major points into a 6-page policy brief that explains the major changes from the previous NCLB system, how the new accountability labels are calculated, and what the consequences are.

Here are the major points we cover in the brief:

Major Changes from NCLB

  1. The NCLB subgroups will be replaced by a Targeted Achievement Gap Group (TAGG), a super subgroup that includes English Learners, Economically-Disadvantaged students, and Students with Disabilities.
  2. It is no longer required that 100% of students be proficient by 2014; rather, schools and districts will be rated based on the improvements made from year to year in the % of students reaching proficiency and the % of students making adequate annual growth.

Accountability Labels

  • In addition to identifying Exemplary Schools, there are 5 accountability labels* under the new system:
    • Achieving-3-Year ACSIP (Arkansas Comprehensive School Improvement Plan)
    • Achieving-1-Year ACSIP
    • Needs Improvement
    • Needs Improvement-Focus
    • Needs Improvement-Priority
  • Achieving-3-year, Achieving-1-year, and Needs Improvement designations are based on a school’s performance on its Annual Measurable Objectives (AMOs).
  • Schools are expected to make improvements from current performance levels toward the targets of 100% student proficiency, student growth, and graduation rates for All Students and the TAGG. The AMOs specify the levels schools should be reaching each year in order to achieve these goals by 2017 (a rough sketch of how such annual targets can be computed follows this list).
  • The Exemplary designation is given to the top schools in four categories: high performance, high TAGG populations with high performance, high progress, and high TAGG populations with high progress.
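To make the AMO idea concrete, here is a minimal sketch of how annual targets of this kind can be computed. It assumes simple linear interpolation from a 2011 baseline toward the 2017 goal, which is our simplification; the exact formula Arkansas uses is described in the waiver request and the OEP brief.

```python
def amo_targets(baseline_pct, baseline_year=2011, goal_year=2017, goal_pct=100):
    """Annual targets interpolated linearly from a baseline toward the goal year.

    A simplified sketch only; Arkansas' official AMO calculation is described
    in the ESEA Flexibility Request and the OEP policy brief.
    """
    step = (goal_pct - baseline_pct) / (goal_year - baseline_year)
    return {year: round(baseline_pct + step * (year - baseline_year), 1)
            for year in range(baseline_year + 1, goal_year + 1)}

# A school starting at 60% proficient in 2011 would need to reach roughly:
print(amo_targets(60))
# {2012: 66.7, 2013: 73.3, 2014: 80.0, 2015: 86.7, 2016: 93.3, 2017: 100.0}
```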

The graph below details how each accountability label is determined:


Consequences

There are consequences for schools that do not reach their targeted goals. Among those consequences:

  • Achieving-3-Year and Achieving-1-Year schools will have to submit ACSIPs every three years and every year, respectively. Districts with Achieving schools will enjoy high autonomy.
  • Needs Improvement schools will also have to submit annual ACSIPs and will experience low to moderate intervention from the ADE depending on a school’s needs. Similarly, districts with Needs Improvement schools will enjoy only moderate district autonomy, with the amount of district intervention differentiated based on the progress made and the persistence of gaps.
  • Needs Improvement-Focus schools are required to diagnose elements that are not serving TAGG students and establish a Targeted Improvement Plan (TIP) aligned to the needs of TAGG students. The district will be required to allocate sufficient funds to support the implementation of the interventions. Persistent lack of progress will result in the application of any or all turnaround principles at the school level, including replacing school leadership or teachers.
  • Needs Improvement-Priority schools will be required to work with an external School Improvement provider to make progress on a 3-year Priority Improvement Plan (PIP). A continued lack of progress can lead to district academic distress (pending a change of the definition of “academic distress” by the Board of Education.) Districts that remain in “academic distress” for two years are subject to state takeovers.

Want to read more about the new accountability system under the ESEA Waivers? Click here to read the detailed OEP Policy Brief on the ESEA Waiver approval.

Math Proficiency to Increase this Spring!

In The View from the OEP on March 2, 2016 at 12:02 pm

This spring we should see an increase in the percentage of Arkansas students found to be meeting performance expectations (‘on grade level’) in math. Arkansas students will be taking a new assessment, the ACT Aspire, and according to a new study from AIR, the achievement standards are significantly lower in mathematics than they were on the assessment students completed last year.

How many Arkansas students are performing ‘on grade level’ seems like a straightforward question, but there are actually many different answers. States set their own criteria for what it means to be ‘on grade level’ and which assessment will be used to measure student achievement of the criteria. In the past, Arkansas measured student performance through the Arkansas Benchmark exams. Using these performance levels in Spring 2014, 78% of students in grades 3-8 were Proficient or Advanced in Literacy, and 72% were Proficient or Advanced in Math. In Spring 2015, however, Arkansas administered a new assessment called the PARCC test. Only 34% of Arkansas students Met or Exceeded Expectations in Literacy, and only 24% did so in Math. The difference, as we have discussed before, lies in the change in assessments and the criteria to be considered ‘on grade level’. This spring, Arkansas students will be measured against yet another standard, one that seems to set a significantly lower bar for ‘grade level’ performance, especially in mathematics.

How Do We Know?

 

There is an assessment given in every state that can be used as a common metric to compare state assessment results.  The National Assessment of Educational Progress (NAEP) in reading and math is administered to a representative sample of 4th and 8th grade students in every state every two years.  The authors of this study used equipercentile linking to benchmark state achievement standards against NAEP achievement levels. Think of it as determining the price of an item in dollars, euros and yen.
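If you are curious what equipercentile linking looks like mechanically, here is a minimal sketch with made-up score distributions (the numbers and scales are not from the AIR study): find the percentile of a cut score on one test, then read off the score at that same percentile on NAEP.

```python
import numpy as np

# Made-up score distributions for the same population of students.
# Real linking studies use state NAEP samples; this is illustration only.
rng = np.random.default_rng(0)
state_scores = rng.normal(200, 25, 10_000)   # state assessment scale
naep_scores = rng.normal(240, 30, 10_000)    # NAEP scale

def naep_equivalent(state_cut, state_scores, naep_scores):
    """Map a state-test cut score to the NAEP score at the same percentile."""
    pct = np.mean(state_scores < state_cut) * 100   # percentile rank of the cut
    return np.percentile(naep_scores, pct)          # NAEP score at that percentile

# If 'proficient' starts at 215 on the state scale, what NAEP score is that like?
print(round(naep_equivalent(215, state_scores, naep_scores)))
```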

Because NAEP is only administered in grades 4 and 8, the comparisons are limited to those grade levels, but it sure is interesting! The chart below shows the NAEP scaled score equivalent to being ‘on grade level’. For NAEP, this is the Proficient category; for PARCC it is termed “Meeting Expectations”; and for ACT Aspire it is being on track for “College Ready”. Overblown terminology aside, the information shows why Arkansas students are more likely to be ‘on grade level’ this spring.

Figure 1. NAEP-Equivalent Scores for PARCC and ACT Aspire Achievement Standards


What Does This Mean?

 

4th grade reading: ↔ Not significantly different.

The NAEP equivalent score to meet performance expectations is the same for PARCC and ACT Aspire: 232. This is slightly below the score needed to be identified as Proficient on the NAEP.

8th grade reading: ↓ Significantly lower.

The NAEP equivalent score to meet performance expectations is significantly lower for ACT Aspire than for PARCC. A NAEP score of 273 was equivalent to Met Expectations on the PARCC assessment, but students would be identified as college ready on the ACT Aspire with a NAEP score of 264. There is an effect size difference of .28. Neither score was commensurate with the score needed to be identified as Proficient on the NAEP.

4th grade math:  ↓ Significantly lower.

The NAEP equivalent score to meet performance expectations is significantly lower for ACT Aspire than for PARCC. A NAEP score of 252 was equivalent to Met Expectations on the PARCC assessment, but students would be identified as college ready on the ACT Aspire with a NAEP score of 235. There is an effect size difference of .55. While the PARCC score is slightly above the score needed to be identified as Proficient on the NAEP, the ACT Aspire equivalent score is well below.

8th grade math: ↓ Significantly lower.

The NAEP equivalent score to meet performance expectations is significantly lower for ACT Aspire than for PARCC. A NAEP score of 307 was equivalent to Met Expectations on the PARCC assessment, but students would be identified as college ready on the ACT Aspire with a NAEP score of 290. There is an effect size difference of .48. While the PARCC score is slightly above the score needed to be identified as Proficient on the NAEP, the ACT Aspire equivalent score is below.
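If you are wondering how a 17-point gap becomes an effect size of about half a standard deviation, the usual approach is to divide the gap (on the NAEP scale) by the standard deviation of NAEP scores for that grade and subject. As a rough back-of-the-envelope check, using an assumed NAEP SD of about 35 points (our assumption, not a figure from the study):

```latex
d = \frac{\text{PARCC cut} - \text{Aspire cut}}{SD_{\text{NAEP}}}
  \approx \frac{307 - 290}{35} \approx 0.49
  \quad \text{(close to the reported .48 for 8th grade math)}
```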

So What?

 

This cannot be interpreted as a move from a ‘bad test’ to a ‘good test’ or vice versa. The comparison to the NAEP standards for proficiency simply gives some context to the changing assessment results for Arkansas students. In addition, the NAEP standards for “Proficient” should not be interpreted as the criteria needed to be ‘on grade level’ or as predicting college success, but they can help us understand that higher proficiency rates on the ACT Aspire may not reflect actual increases in student achievement. As shown in this study, the achievement required to meet ACT Aspire’s performance standard is significantly lower than the PARCC standards in 8th grade English language arts and 4th and 8th grade math. Given this information, if math proficiency rates increase in 4th grade this spring, we should be cautious about assuming it is because students actually know more math.

The big issue is that we don’t actually know what being ‘on grade level’ or ‘proficient’ or ‘college ready’ means in terms of performance later in life. A slightly lower standard on the ACT Aspire could be problematic if the standards are too low to provide meaningful information about student ability, falsely indicating to teachers that the student is doing fine when in fact he or she does not have the skills needed to be successful in college or careers. Given that the ACT Aspire standards are still much more challenging than Arkansas achievement standards used to be, however, here at the OEP we aren’t too concerned about that.

After Spring 2017, when we have our second year of ACT Aspire assessment results, we will be able to more confidently identify positive or negative changes in Arkansas student achievement.

 

 

No *s!

In The View from the OEP on January 27, 2016 at 12:07 pm


When reviewing the preliminary PARCC assessment results at the December Education Caucus, Senator Clark told ADE staff that he wanted to see a data report “without asterisks”.  Although we didn’t see the document Senator Clark was referring to, the asterisks were presumably being used to indicate that although several states administered the same PARCC assessments last year, results between Arkansas and some other states were not directly comparable.  As noted in our recent blog on statewide PARCC results, here at OEP we found that even before considering the varied poverty levels of the states, it was appropriate to refrain from making direct comparisons in some cases.

In the 2014-15 school year, 5 million students in 11 states and the District of Columbia took the PARCC assessments in grades 3-11, but not all participating states had students in all grades taking the test. Education is a state responsibility, and states were free to set individual expectations for PARCC testing. In Massachusetts, for example, districts were allowed to choose whether they wanted students to complete the PARCC tests or the MCAS, the state test used previously. In Ohio, 3rd graders didn’t take the English Language Arts portion of the PARCC. In Arkansas, the Algebra II and 11th grade literacy tests were optional for districts. PARCC testing expectations for high school students were the most varied across the states, and therefore the most inappropriate to compare. Here at OEP we feel you can compare performance in the earlier grades, but you still must consider the possible impact of varying degrees of poverty across the states.

So, for ONCE we were taking the same annual test as 10 other states, and we STILL CAN’T COMPARE student performance?  *!  We understand, Senator, all of this is super frustrating.  You just want to know how Arkansas students are doing.

It’s frustrating for Arkansas parents too, because… how do we know if our kids are on track for success?

Here at OEP we wanted to share some suggestions to help parents and policymakers appropriately interpret various sources of information about student performance.

PARCC scores*

Only 1% of Arkansas students exceeded expectations in mathematics, and only 4% exceeded expectations in English Language Arts. We aren’t Lake Wobegon, where all the children are above average, but what a change! Proficiency percentages dropped from over 70% last year to the mid-30s with the change in assessment this year. The expectations for student performance are much higher on PARCC than on the prior assessments, and here at the OEP we feel they are more consistent with expectations on other key assessments such as the NAEP and ACT, and more reflective of the skills students will need to be successful in college and careers.

From a parent perspective, however, interpreting the PARCC scores can be confusing.  There is a lot of information out there about understanding the score, but we still aren’t quite sure what to make of it as parents.  The average score is provided for a student’s district, state and the PARCC states overall, but there is no percentile rank provided, so the best parents can tell about a student’s overall performance is that it was better or worse than average. 

We all know, of course, that one score isn’t the whole picture of a student. The PARCC site notes, “This information, along with grades, teacher feedback and scores on other tests, will help give a more complete picture of how well your child is performing academically.”

Here at the OEP, we always suggest using multiple sources of data- but get ready for more asterisks!

Grades*

Grades are SOO subjective. If a student gets a ‘B’ in a 5th grade class in School A, is there any reason to think that that same student would get a ‘B’ in a 5th grade class in School B? Not really! Teachers assign grades based on their own criteria. This can include completing homework, class participation, attendance, and performance on teacher-created projects and assessments. Sometimes grades are a better measure of compliance than of a student’s knowledge about a subject. According to the annual grade inflation report, in 2014 over 7% of students in Arkansas received a grade of ‘A’ or ‘B’ in their high school math classes and yet were not proficient on the state-mandated End of Course test. While this may not seem too concerning (we all suffered through high school math), 12% of high schools in the state were found to have widespread ‘grade inflation’. In these 39 schools more than 1 in 5 students didn’t pass the state test even though they were awarded an ‘A’ or ‘B’ in the relevant class. These students were getting a grade that indicated they were on track for success, but had not mastered the content. Student grades should definitely have an asterisk because they are NOT comparable between teachers, schools, districts or states.
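To make the arithmetic behind that ‘widespread grade inflation’ flag concrete, here is a minimal sketch with made-up student records (the annual report’s exact methodology may differ): compute the share of A/B students in a course who were not proficient on the matching End of Course exam, and flag schools where that share tops 20%.

```python
import pandas as pd

# Made-up student-level records, for illustration only.
students = pd.DataFrame({
    "school":         ["A", "A", "A", "B", "B", "B"],
    "math_grade":     ["A", "B", "C", "B", "A", "B"],
    "eoc_proficient": [True, False, False, False, False, True],
})

# A record is "inflated" if the student earned an A or B but missed proficiency.
ab = students["math_grade"].isin(["A", "B"])
students["inflated"] = ab & ~students["eoc_proficient"]

# Share of A/B students per school who were not proficient; flag schools over 20%.
rates = students[ab].groupby("school")["inflated"].mean()
print(rates)
print("Widespread grade inflation:", list(rates[rates > 0.20].index))
```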

Teacher Feedback*

Here at the OEP we value feedback from teachers about student performance. Teachers spend time helping students learn every day, and we respect their expertise and knowledge about kids. Experienced teachers have seen a lot of students pass through, and can have an informative perspective on how a student compares with the ‘typical’ student they see. A limitation is that ‘typical’ is based only on the students that teacher has experienced. Being a 5th grade classroom teacher in Arkansas for 7 years does not necessarily make you an expert on how a student is progressing toward preparation for college and careers. You might have a different frame of reference if you were teaching in a different school or another state. Teacher feedback should have an asterisk because it is one perspective that is NOT comparable between teachers, schools, districts or states.

Other Test Data

Hooray!  This is one piece of information that may not need an asterisk if the right type of assessment is used.  NWEA MAP assessments, used by many districts throughout Arkansas, can provide valid and reliable information on student performance and growth compared to a nationally representative sample.   MAP data shows how a student’s performance compares to other students across the country, and is available for students in grades K-12.  The individualized growth targets can keep track of how any student, regardless of performance level, is keeping pace, catching up, or falling behind his or her academic peers.

And now using this COOL NEW TOOL, students and parents can explore what colleges and universities students are on track to attend just by inputting MAP scores. AND- you can use this tool for students as early as fifth grade!

According to NWEA: “The College Explorer Tool uses correlations between MAP scores and college readiness benchmarks for the ACT to pinpoint the colleges and universities for which a student’s forthcoming scores would likely be near the median admissions scores. Additionally, the tool provides a quantitative profile of each institution using data from the U.S. Department of Education’s College Scorecard, which includes valuable information on cost of attendance and the average annual cost borne by families at different income levels. This crucial information shows students and their families how much they would likely need to borrow in order to complete their education at a given college, as well as graduates’ typical earnings.”
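We do not have access to NWEA’s actual linking values, but the idea behind the tool is straightforward. Here is a hypothetical sketch of that idea; the RIT-to-ACT anchor points, campus names, and cutoffs below are invented for illustration: project a MAP score onto the ACT scale, then see which campuses’ typical admission scores it lands near.

```python
# Hypothetical illustration of the idea behind the College Explorer Tool:
# project a MAP score onto the ACT scale, then compare to colleges'
# typical admitted-student scores. All numbers below are made up.

def projected_act(map_rit, linking_table):
    """Linearly interpolate a MAP RIT score onto the ACT scale."""
    pts = sorted(linking_table.items())
    for (lo_rit, lo_act), (hi_rit, hi_act) in zip(pts, pts[1:]):
        if lo_rit <= map_rit <= hi_rit:
            frac = (map_rit - lo_rit) / (hi_rit - lo_rit)
            return lo_act + frac * (hi_act - lo_act)
    return pts[0][1] if map_rit < pts[0][0] else pts[-1][1]

# Made-up RIT -> ACT anchor points (NOT NWEA's actual linking values).
linking = {210: 14, 230: 20, 245: 26, 260: 32}

# Made-up median ACT scores for a few campuses.
colleges = {"Campus X": 25, "Campus Y": 21, "Campus Z": 30}

score = projected_act(238, linking)
print(f"Projected ACT: {score:.0f}")
print("Near or above the median at:",
      [c for c, median in colleges.items() if score >= median - 2])
```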


Fifth grade might seem young to be thinking about post-secondary options, but the incredible thing about having this information available so early in the academic process is that it gives kids a heads-up in time to CHANGE THE TRAJECTORY. If a student wants to go to UofA, or CalTech, or University of North Dakota, they can see whether they are on track to meet the typical ACT/SAT score acceptance criteria. If the student is falling short, there is time to close the gap! Of course, there is way more to being successful in college and career than test scores, BUT such early information may help kids expand their horizons and reach their dreams.

Because the MAP is a stable assessment with nationally representative norms, its scores can also be used to nominate students for early talent identification programs, and special school-level growth norms can help determine if programs are effective. This computer-adaptive assessment with strong psychometric properties and a large national sample can help students, parents, teachers and school leaders look at some information with no asterisks.

Comparing Responsibly

In a quest for improving education for students, we are always trying to find what works- did Program A work better than Program B? It’s complicated because a lot of things can impact educational outcomes, and there are lots of different measures of educational outcomes. Outcomes may include graduation rates, college-going rates, income after high school, and health and well-being, but test scores are, for us here at OEP, an important one.

*side note here to school district folks- be sure to carefully check your graduation rate data from ADE- we hear it may be more ‘messy’ than in prior years*

Straight comparisons between PARCC test scores for Arkansas students and kids in other states may be inappropriate given differences in student demographics, but there are processes to adjust for poverty and compare school performance across states and even nations (check out schoolgrades.org). And - newsflash - there is never a PERFECT comparison for any kid or school or state, but we still need to try to responsibly compare outcomes and find out what is working to get students learning and ready for a successful life after school.
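As one concrete (and deliberately simple) version of such an adjustment, you can ask how each state performs relative to what its poverty level alone would predict. In the sketch below, the FRL figures for New Mexico, Arkansas, and Massachusetts come from the PARCC post above; the remaining FRL shares and all of the proficiency numbers are invented, and schoolgrades.org’s actual methodology is more sophisticated.

```python
import numpy as np

# State-level data: percent of students eligible for FRL and percent proficient.
# NM, AR, and MA FRL figures are from the post; everything else is made up.
states = ["NM", "LA", "AR", "IL", "OH", "NJ", "MA"]
frl    = np.array([68.5, 65.0, 60.9, 52.0, 50.0, 38.0, 35.1])
prof   = np.array([26.0, 33.0, 31.0, 36.0, 34.0, 44.0, 52.0])

# Fit a line predicting proficiency from poverty, then compare each state's
# actual result to the prediction (the residual).
slope, intercept = np.polyfit(frl, prof, 1)
residuals = prof - (intercept + slope * frl)

for state, resid in sorted(zip(states, residuals), key=lambda pair: -pair[1]):
    print(f"{state}: {resid:+.1f} points vs. what poverty alone would predict")
```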

In today’s world of powerful data and analysis techniques, there are responsible ways to compare without needing to use asterisks. Here at OEP we know that kids and schools and states all face different challenges, and make every effort to focus the conversation not just on ‘highest test score’ but on growth and improvement as well. Just comparing within Arkansas is tricky, but these conversations are more complicated when states are using different tests. PARCC assessments were intended to facilitate constructive comparison of student performance across state lines, but varied testing patterns mean we still need some asterisks.  Hopefully, we won’t need any asterisks next year to compare student performance in Arkansas to Alabama (the only other state using ACT Aspire).

SO- Senator Clark and parents across the state, here at OEP we feel your frustration. Unfortunately, we can’t change the education political landscape and have all kids in the country take the same test.  As policymakers and parents, however, we can change the conversation and ask our schools- what information do you have about how our students are learning and growing compared to peers across the country?

Because in real life- there aren’t any asterisks.

Back to School!!!

In The View from the OEP on August 22, 2012 at 11:16 am

Welcome Back Kotter!

As we are sure you have been told numerous times over this past week, WELCOME BACK!!! We hope everyone had a restful yet exciting summer vacation! We stayed busy in the OEP all summer (with a vacation here and there) – and even highlighted some of the big news stories that happened over the summer in a previous blog post.

Also during the summer, we spent some time gathering information for a few pre-start-of-school publications. For example, in early August, the Arkansas Department of Education released the Benchmark, End-of-Course (EOC), and Iowa Test of Basic Skills (ITBS) results from the 2011-12 academic year. We promptly released a policy brief summarizing statewide performance on these exams and reported that student scores had increased over previous years. In fact, there was a remarkable jump in literacy scores from the 2010-11 to the 2011-12 academic year. In 2010-11, 75% of test takers scored at the proficient or advanced level…but in 2011-12, that performance indicator grew by six percentage points to 81%! That is a substantial one-year gain. Indeed, Arkansas students have not had an annual gain that large since the 2005-06 academic year, when they grew seven percentage points in literacy over the previous year. You can read more about Arkansas’ student test performance in the 2011-12 Arkansas Test Results policy brief.

More recently, after reading the ADE’s 158 page document outlining Arkansas’ Elementary and Secondary Education Act (ESEA) Waiver Request, we condensed and presented the highlights in a six-page policy brief, described in a more abbreviated blog post on the subject. This brief discusses new ways schools will be measured in the absence of NCLB stalwart Adequate Yearly Progress – or AYP. We highlight the move toward Targeted Achievement Gap Groups (or TAGG), which encourages schools to focus on groups of students for whom the academic achievement gaps are most prominent. To learn more, read our ESEA Waiver Approval Update policy brief.

We have an exciting year in store for you at the OEP. In the coming weeks, we will be releasing our annual OEP Awards, where we recognize the top performing schools across the state on the Benchmark and End-of-Course exams. You can review last year’s OEP Awards here. In addition to the OEP Awards, we will also be releasing an update on the School Choice Law ruling, and a more in-depth look at student performance on the state exams – as well as many other publications as new and exciting information comes our way. It’s going to be an exciting year! We wish all educators the best of luck as we move forward through the school year!

We’d love to know what types of information you would like us to cover. Perhaps you have an idea for a research study or OEP Policy Brief. By all means, give us your input! Leave us a comment below and let us know what we can do to help!