University of Arkansas Office for Education Policy

No *s!

In The View from the OEP on January 27, 2016 at 12:07 pm


When reviewing the preliminary PARCC assessment results at the December Education Caucus, Senator Clark told ADE staff that he wanted to see a data report “without asterisks”.  Although we didn’t see the document Senator Clark was referring to, the asterisks were presumably being used to indicate that although several states administered the same PARCC assessments last year, results between Arkansas and some other states were not directly comparable.  As noted in our recent blog on statewide PARCC results, here at OEP we found that even before considering the varied poverty levels of the states, it was appropriate to refrain from making direct comparisons in some cases.

In the 2014-15 school year, 5 million students in 11 states and the District of Columbia took the PARCC assessments in grades 3-11,  but not all participating states had students in all grades taking the test.  Education is a state responsibility, and states were free to set individual expectations for PARCC testing.  In Massachusetts for example, districts were allowed to choose if they wanted students to complete the PARCC tests or the MCAS, the state test used previously.  In Ohio, 3rd graders didn’t take the English Language Arts portion of the PARCC.  In Arkansas, the Algebra II and 11th grade literacy tests were optional for districts. PARCC testing expectations for high school students were the most varied across the states, and therefore the most inappropriate to compare.  Here at OEP we feel you can compare performance in the earlier grades, but still must consider the possible impact of varying degrees of poverty across the states.

So, for ONCE we were taking the same annual test as 10 other states, and we STILL CAN’T COMPARE student performance?  *!  We understand, Senator, all of this is super frustrating.  You just want to know how Arkansas students are doing.

It’s frustrating for Arkansas parents too, because… how do we know if our kids are on track for success?

Here at OEP we wanted to share some suggestions to help parents and policymakers appropriately interpret various sources of information about student performance.

PARCC scores*

Only 1% of Arkansas students exceeded expectations in mathematics, and only 4% exceeded expectations in English Language Arts. We aren't Lake Wobegon, where all the children are above average, but what a change! Proficiency percentages dropped from over 70% last year to the mid-30s with the change in assessment this year.  Expectations for student performance are much higher on PARCC than on the prior assessments, and here at the OEP we feel they are more consistent with expectations on other key assessments such as the NAEP and ACT, and more reflective of the skills students will need to be successful in college and careers.

From a parent perspective, however, interpreting the PARCC scores can be confusing.  There is a lot of information out there about understanding the score, but we still aren't quite sure what to make of it as parents.  The average score is provided for a student's district, state, and the PARCC states overall, but no percentile rank is provided, so the best parents can tell about a student's overall performance is that it was better or worse than average.

We all know, of course, that one score isn't the whole picture of a student.  The PARCC site notes, "This information, along with grades, teacher feedback and scores on other tests, will help give a more complete picture of how well your child is performing academically."

Here at the OEP, we always suggest using multiple sources of data, but get ready for more asterisks!


Grades*

Grades are SOO subjective. If a student gets a 'B' in a 5th grade class in School A, is there any reason to think that that same student would get a 'B' in a 5th grade class in School B?  Not really!  Teachers assign grades based on their own criteria.  These can include completing homework, class participation, attendance, and performance on teacher-created projects and assessments.  Sometimes grades are a better measure of compliance than of a student's knowledge of a subject.  According to the annual grade inflation report, in 2014 over 7% of students in Arkansas received a grade of 'A' or 'B' in their high school math classes and yet were not proficient on the state-mandated End of Course test.  While this may not seem too concerning (we all suffered through high school math), 12% of high schools in the state were found to have widespread 'grade inflation'.  In these 39 schools, more than 1 in 5 students didn't pass the state test even though they were awarded an 'A' or 'B' in the relevant class.  These students were getting a grade that indicated they were on track for success, but had not mastered the content.  Student grades should definitely have an asterisk because they are NOT comparable between teachers, schools, districts or states.
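The "widespread grade inflation" screen described above - flagging schools where more than 1 in 5 students who earned an 'A' or 'B' were not proficient on the End of Course test - can be sketched in a few lines. The function name, data layout, and student records below are our own invention for illustration, not the grade inflation report's actual methodology or data.

```python
def flag_grade_inflation(records, threshold=0.20):
    """Flag a school for widespread grade inflation.

    records: list of (course_grade, proficient_on_state_test) pairs,
             one per student in the relevant course at one school.
    Returns True when the share of 'A'/'B' students who were NOT
    proficient on the state test exceeds the threshold (1 in 5).
    """
    ab_students = [(g, p) for g, p in records if g in ("A", "B")]
    if not ab_students:
        return False  # no A/B grades awarded, nothing to flag
    not_proficient = sum(1 for _, p in ab_students if not p)
    return not_proficient / len(ab_students) > threshold

# Invented example: 10 A/B students, 3 of whom missed proficiency (30%)
school_a = [("A", True)] * 4 + [("B", True)] * 3 + [("B", False)] * 3
print(flag_grade_inflation(school_a))  # True: 30% exceeds the 20% bar
```

Under this rule, a school stays unflagged as long as at least 4 of every 5 students awarded an 'A' or 'B' also pass the state test.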

Teacher Feedback*

Here at the OEP we value feedback from teachers about student performance.  Teachers spend time helping students learn every day, and we respect their expertise and knowledge about kids. Experienced teachers have seen a lot of students pass through, and can have an informative perspective on how a student compares with the 'typical' student they see.  A limitation is that 'typical' is based only on what that teacher has experienced.  Being a 5th grade classroom teacher in Arkansas for 7 years does not necessarily make you an expert on how a student is progressing toward preparation for college and careers; you might have a different frame of reference if you were teaching in a different school or another state.  Teacher feedback should have an asterisk because it is one perspective that is NOT comparable between teachers, schools, districts or states.

Other Test Data

Hooray!  This is one piece of information that may not need an asterisk if the right type of assessment is used.  NWEA MAP assessments, used by many districts throughout Arkansas, can provide valid and reliable information on student performance and growth compared to a nationally representative sample.   MAP data shows how a student's performance compares to other students across the country, and is available for students in grades K-12.  The individualized growth targets can track whether any student, regardless of performance level, is keeping pace with, catching up to, or falling behind his or her academic peers.
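The "keeping pace, catching up, or falling behind" check amounts to comparing a student's observed score growth against their individualized, norm-based growth projection. Here is a minimal sketch of that comparison; the function and the scores below are hypothetical illustrations, not actual NWEA MAP norms or projections.

```python
def growth_status(fall_score, spring_score, projected_growth):
    """Classify a student's growth relative to a norm-based projection.

    fall_score, spring_score: the student's test scores at two points
    in the year. projected_growth: the growth expected of academic
    peers who started at the same level (hypothetical values here).
    """
    observed = spring_score - fall_score
    if observed > projected_growth:
        return "catching up"      # growing faster than academic peers
    if observed == projected_growth:
        return "keeping pace"     # growing right alongside peers
    return "falling behind"       # peers are pulling ahead

# Invented scores: a student projected to grow 8 points over the year
print(growth_status(205, 213, 8))  # keeping pace: grew exactly 8
print(growth_status(205, 210, 8))  # falling behind: grew only 5
```

The useful feature of this framing is that a low-performing student who beats their projection is "catching up" even if their overall score is still below average - growth and status are reported separately.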

And now using this COOL NEW TOOL, students and parents can explore what colleges and universities students are on track to attend just by inputting MAP scores. AND- you can use this tool for students as early as fifth grade!

According to NWEA: "The College Explorer Tool uses correlations between MAP scores and college readiness benchmarks for the ACT to pinpoint the colleges and universities for which a students' forthcoming scores would likely be near the median admissions scores. Additionally, the tool provides a quantitative profile of each institution using data from the U.S. Department of Education's College Scorecard, which includes valuable information on cost of attendance and the average annual cost borne by families at different income levels. This crucial information shows students and their families how much they would likely need to borrow in order to complete their education at a given college, as well as graduates' typical earnings."

[Screenshot: NWEA College Explorer Tool results]
Fifth grade might seem young to be thinking about post-secondary options, but the incredible thing about having this information available so early in the academic process is that it gives kids a heads-up in time to CHANGE THE TRAJECTORY.  If a student wants to go to UofA, or CalTech, or the University of North Dakota, they can see if they are on track to meet the typical ACT/SAT acceptance criteria. If the student is falling short, there is time to close the gap!  Of course, there is way more to being successful in college and career than test scores, BUT such early information may help kids expand their horizons and reach their dreams.

Because it is a stable and nationally representative assessment, NWEA MAP scores can also be used to nominate students for early talent identification programs, and special school-level growth norms can help determine if programs are effective. This computer-adaptive assessment with strong psychometric properties and a large national sample can help students, parents, teachers and school leaders look at some information with no asterisks.

Comparing Responsibly

In a quest for improving education for students, we are always trying to find what works: did Program A work better than Program B? It's complicated, because a lot of things can impact educational outcomes, and there are lots of different ways to measure them. Outcomes may include graduation rates, college-going rates, income after high school, and health and well-being, but test scores are, for us here at OEP, an important one.

*side note here to school district folks- be sure to carefully check your graduation rate data from ADE- we hear it may be more ‘messy’ than in prior years*

Straight comparisons between PARCC test scores for Arkansas students and kids in other states may be inappropriate given differences in student demographics, but there are processes to adjust for poverty and compare school performance across states and even nations. And- newsflash- there is never a PERFECT comparison for any kid or school or state, but we still need to try to responsibly compare outcomes and find out what is working to get students learning and ready for a successful life after school.

In today’s world of powerful data and analysis techniques, there are responsible ways to compare without needing to use asterisks. Here at OEP we know that kids and schools and states all face different challenges, and make every effort to focus the conversation not just on ‘highest test score’ but on growth and improvement as well. Just comparing within Arkansas is tricky, but these conversations are more complicated when states are using different tests. PARCC assessments were intended to facilitate constructive comparison of student performance across state lines, but varied testing patterns mean we still need some asterisks.  Hopefully, we won’t need any asterisks next year to compare student performance in Arkansas to Alabama (the only other state using ACT Aspire).

SO- Senator Clark and parents across the state, here at OEP we feel your frustration. Unfortunately, we can’t change the education political landscape and have all kids in the country take the same test.  As policymakers and parents, however, we can change the conversation and ask our schools- what information do you have about how our students are learning and growing compared to peers across the country?

Because in real life- there aren't any asterisks.


