University of Arkansas Office for Education Policy

Archive for the ‘The View from the OEP’ Category

Cash Rewards for Computer Science!

In The View from the OEP on October 18, 2017 at 1:13 pm

Yesterday, the Arkansas Department of Education announced a program to drive more students to enroll and demonstrate success in a high-level computer science course. Students who complete an Advanced Placement Computer Science A course and receive a score of 3, 4, or 5 on the associated AP exam are eligible for a cash reward!  According to the announcement, an Arkansas public school student can receive $1,000 for scoring a 5 on the exam, $750 for scoring a 4, and $250 for receiving a score of 3.

But the rewards don’t just apply to students!  Schools get money for each qualifying score as well! Schools will receive $250 for each 5 on the AP CSA exam, $150 for each 4, and $50 for each 3.

Why is this important?

Advanced Placement Computer Science A is one of the highest-level Computer Science courses that has a quantitative assessment associated with it.  The course introduces students to computer science with fundamental topics that include problem solving, design strategies and methodologies, organization of data (data structures), approaches to processing data (algorithms), analysis of potential solutions, and the ethical and social implications of computing.

Advanced Placement (AP) exams are administered throughout the country in a wide variety of subjects. AP exams can serve as a consistent and nationally comparable measure of student content knowledge, and students are likely to be granted college credit for a score of 3, 4, or 5 on an AP exam. In an earlier blog about AP, we mentioned that Arkansas is one of a few states that provide AP testing at no cost to students.

AP CSA exam results in Arkansas compared to the country

In 2016, 46,480 public school students across the country completed the AP CSA exam, and 63% received a score of 3, 4, or 5.  In Arkansas, 298 public school students completed the AP CSA exam, and 29% of students received a score of 3, 4, or 5.  If the new incentive program had been in place, 14 Arkansas students would have received $1,000, 29 students would have received $750, and 44 would have received $250.
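As a quick back-of-the-envelope sketch (ours, not an official calculator), the 2016 counts above imply the following total payouts under the new program:

```python
# Hypothetical tally of what the 2016 AP CSA results would have paid out,
# using the per-score amounts from ADE's announcement and the Arkansas
# qualifying-score counts described above.
student_awards = {5: 1000, 4: 750, 3: 250}   # per-student reward by AP score
school_awards = {5: 250, 4: 150, 3: 50}      # per-school reward by AP score
qualifying_2016 = {5: 14, 4: 29, 3: 44}      # Arkansas students by AP score

student_total = sum(n * student_awards[score] for score, n in qualifying_2016.items())
school_total = sum(n * school_awards[score] for score, n in qualifying_2016.items())

print(student_total)  # 46750 -> $46,750 to students
print(school_total)   # 10050 -> $10,050 to schools
```

In other words, at 2016 performance levels the entire program would have cost the state well under $60,000.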

Females made up only 27% of AP CSA testers nationally, so Arkansas, where 23% of AP CSA exam takers were female, is about average. African American students were 6% of those tested in Arkansas, compared with only 4% of the national pool.

Increasing Access to Computer Science

Arkansas hit the national computer science education stage in 2015 with Act 187, which requires that public high schools and public charter high schools offer at least one computer science course at the high school level. As presented in the graph below, Arkansas has seen a sharp increase in the number of students enrolling in, and the number of districts offering, computer science classes.  From 2004-05 through 2012-13, about 300 students from 15 districts enrolled in computer science courses.  In 2016-17, over 6,600 students from 225 districts were enrolled in computer science courses.

[Graph: Computer science enrollment trend]

 

BUT- not yet equitable access to AP CSA

Most of these students are enrolled in classes other than AP CSA. In 2016-17, over 24% of computer science students were enrolled in “Essentials of Computer Programming”.  Comparatively, only 338 (5%) of computer science students were enrolled in AP Computer Science A in 2016-17, and only 33 school districts offered the class. When we consider what type of district provides access to AP Computer Science, we see that they are relatively large (4,500 students on average) and that more than half of the districts that offered AP CSA were in Northwest Arkansas, while it was offered in only 2 districts in Southeast Arkansas.  Please note that all districts are allowed to offer AP CSA, and decisions about which courses to offer to students are made by individual school districts.  If your district isn’t offering this course, we would love to know why!

In order to be eligible for the cash reward, students must enroll in the AP Computer Science A course. A student cannot just learn the material on their own and pass the test, so what if their school is one of the 87% of districts that does not offer the course?

According to ADE, there are six digital providers approved to teach AP CSA, but we are still checking into how a student gets signed up.

Our thoughts:

Here at OEP we like the idea of incentivizing students and schools to focus on computer science, but we are concerned that not all students have the same access to the course.  We fully support students taking the AP exam, as it is a more stable measure of student knowledge than teacher-assigned course grades, which are inconsistent across the state.  We would like to see the program changed, however, so students who do not have access to the course or who prefer to learn the material independently could still be eligible for the reward.

The student-focused goal should be the learning, not the seat time.

 


PSAT day! Are your students benefiting?

In The View from the OEP on October 11, 2017 at 12:42 pm
Throughout Arkansas today, many high school students are spending a few hours taking the PSAT.  Here at OEP, we are big on everyone understanding the purpose behind assessments, who is going to make what decisions based on the results, and how students can benefit from the assessment, so we wanted to review what the PSAT is, how it is being used in Arkansas, how it benefits (or doesn’t) Arkansas students, and what OEP recommends moving forward.

What is the PSAT?

The PSAT is an assessment developed for high school students that measures skills in Reading, Writing and Language, and Math.  The paper/pencil test takes about 3 hours to complete.

How the PSAT is being used in Arkansas:

  • Arkansas school districts are not required to administer the PSAT, but if a district agrees to administer the test to all 10th graders, it can do so at no cost to students or to the district. The PSAT typically costs $16 per student, but the Arkansas State Board of Education approved covering the costs using at-risk funding as allowed by Act 989.
  • Districts do not have to offer the test to any student. Districts that want to test only select students on the PSAT can do so, but the district/student must cover the associated cost.
  • All 11th grade test fees are always the responsibility of the school district/student.

Some states (Delaware, Colorado, and Michigan) are requiring students to take the PSAT and are planning on using the results in their state accountability systems.

The PSAT is not required for Arkansas students, and the results are not used in any aspect of the accountability system.  The PSAT administration does not replace the required 10th grade ACT Aspire administration in the spring, which is used as a measure in school accountability.

How the PSAT benefits Arkansas students:

Students can benefit from taking the PSAT in 10th grade in several ways.  The test serves as practice for the voluntary 11th grade PSAT, whose score qualifies students for National Merit Scholarship consideration. In addition, the PSAT is good practice for the SAT, a college entrance exam similar to the ACT that is required by some out-of-state colleges.
Participating in the 10th grade PSAT provides students and their schools with the opportunity to find out if students have the potential to be successful in Advanced Placement (AP) courses.  This can help identify students who may have been ‘flying under the radar’ for academic success in AP, and can serve as a particularly helpful tool for encouraging AP participation and enrollment of underrepresented academically prepared students. Schools and students receive AP Potential information in January, allowing time for students to discuss academic plans with counselors, teachers, and parents before selecting classes for their junior year.

A final bonus of PSAT participation in 10th grade is the opportunity to participate in Student Search Service, which connects students with information about educational and financial aid opportunities from nearly 1,700 colleges, universities, scholarship programs, and educational organizations.

When taken in 11th grade, the PSAT automatically enters students for consideration in the National Merit Scholarship competition.  From an initial pool of over 1.6 million students, this well-known program annually identifies 7,500 Merit Scholars to receive college scholarships.


What type of 10th graders are benefiting?

According to information provided by ADE, eighty-one (81) school districts elected to provide their 10th grade students the opportunity to complete the PSAT for free. This is less than one-third of the 262 school districts in Arkansas.

Although less than a third of Arkansas’ school districts are participating, nearly half of 10th graders in the state attend a participating district.  More than 17,000 10th grade students are getting access to a free PSAT, representing over $270,000 in test fees that are being covered by the state.
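A quick check of that figure (our illustration, using the approximate enrollment above and the standard $16 fee):

```python
# Rough check of the state's PSAT fee coverage, using figures from the post.
participating_10th_graders = 17_000  # "more than 17,000" students
fee_per_test = 16                    # typical PSAT cost per student, in dollars

total_fees_covered = participating_10th_graders * fee_per_test
print(total_fees_covered)  # 272000 -> "over $270,000" in test fees
```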

We were wondering about the characteristics of districts that chose to offer the PSAT to all 10th graders.  Overall, the districts seem representative of state demographics.  As a group, the participating districts serve students who are slightly more likely to be minority students than the state as a whole (47% of participating district students are minority compared to 36% statewide).   Participating districts also serve students who are about as likely to participate in FRL as the state (60% of participating district students are FRL compared to 63% of the state as a whole).

Although the PSAT-participating districts look similar to the state as a whole, the program is not reaching some students.  Regional differences are presented below.

 

[Graph: PSAT participation by region]

District participation in the 10th grade free PSAT program is highest in Southeast Arkansas, where over 50% of districts are participating, compared to the lowest participation of 21% in the Northeast region. In terms of overall 10th grade enrollment, Northwest Arkansas is providing free access to the PSAT to over 60% of 10th graders, but only 1 in 5 10th graders in the Northeast region are getting the opportunity.

When examined by student demographics, we find stark differences in access by region.   Over 70% of African-American students will take the test for free in the Northwest and Central regions, but only 1 in 5 African American students in the Northeast region are getting the opportunity.  Over 80% of Hispanic students in Northwest Arkansas will take the test for free, as will over half of the Hispanic students in Central and Southeast. Less than 15% of Hispanic students in Northeast and Southwest Arkansas will get the opportunity.


What does OEP recommend?

Here at OEP, we like how the state is willing to support all 10th graders taking the PSAT for free, but wonder about how meaningful an opportunity it is for students.

It is certainly a meaningful opportunity for students who are going to re-take the PSAT in 11th grade and may be one of the 3% of students who get selected to participate in the National Merit Scholarship competition. It seems prudent to note that although finalists are eligible for scholarships from colleges or corporations, only 2,500 students nationally win scholarships from National Merit, and these are a one-time payment of $2,500.

So, for most Arkansas students, the benefit will come if districts actively USE THE DATA to identify students for possible enrollment in AP.  Enrollment in AP is particularly helpful if students are on the college-bound track and if instruction in the course is high-quality. Due to the ACT Aspire testing, which is required in the spring of 3rd through 10th grades, teachers and counselors should ALREADY have good data about students and their academic performance.  ACT Aspire for 9th and 10th graders gives students a predicted ACT score, which is likely a much more relevant indicator for Arkansas students of whether they are ready to take an AP class.

We think the state should continue to promote the free 10th grade PSAT opportunity to districts, particularly those in the Northeast and Southwest regions, where African-American and Hispanic students are unlikely to get access to the test and subsequent information.  We also recommend that the state examine how many students are being identified for AP potential who are not already enrolled in AP courses.  Perhaps the schools are doing a good job of placing students in appropriate courses!

Most importantly, we need to be sure we are using our resources effectively to provide the best quality college and career counseling to all Arkansas students.

We would love to hear your thoughts…

 

Some Thoughts on Arkansas Teacher of the Year

In The View from the OEP on October 4, 2017 at 12:46 pm


Here at OEP, we wanted to extend our congratulations to Ms. Randi House, the 2018 Arkansas Teacher of the Year (ATOY)!  Ms. House teaches kindergarten in Conway, and as ATOY she receives a $15,000 award and, under Act 17, a year of paid administrative leave to work with ADE. Throughout the year, the ATOY creates professional development materials and provides technical assistance to Arkansas teachers and students.  In addition, the ATOY serves as a non-voting member of the State Board of Education, as an ambassador for education in Arkansas. The ATOY makes public appearances throughout the state and represents Arkansas in the National Teacher of the Year Competition.

Here at OEP, we love how the ATOY supports professional development and interacts with policy by sitting on the State Board.  We think hands-on experience with educational research might also be an interesting perspective to add and we would love to partner with Ms. House in researching a question of interest during her tenure.

According to ADE, the mission of the ATOY program is to promote the profession and recognize quality teachers who implement “best practices” in Arkansas public school classrooms. We know there are lots of great teachers in Arkansas’ schools, and we wondered about the process of identifying the Teacher of the Year.

 

 

How many ATOY have there been?

The National Teacher of the Year program began in 1952, but the Arkansas Department of Education lists ATOY back to 1959.  No ATOY is indicated for 1960 or 1961, so by our count Ms. House is the 58th ATOY!

Has an ATOY ever been selected as the National Teacher of the Year (NTOY)?

Not yet!  Arkansas is one of 17 states from which the NTOY has never been selected. Among our border states, Texas, Missouri, and Oklahoma have each had two NTOY, Tennessee has had one, while Louisiana and Mississippi join Arkansas in never having had a teacher selected for NTOY.

Where do ATOY come from?

Based on the information provided by ADE, ATOY have been elementary, middle, junior high, and high school teachers engaged in teaching a wide variety of subjects.  The 20 most recent ATOY are listed below:

[Table: The 20 most recent Arkansas Teachers of the Year]

ATOY have been selected from districts of varying sizes: in 2015 the ATOY was from Poyen, which enrolled 582 students, while the 2014 ATOY was from Little Rock, which enrolled over 22,000 students.  The percentage of students eligible for Free/Reduced Lunch in ATOY districts also varied, from 44% (White Hall, ATOY 1991) to 90% (Osceola, ATOY 1974).

Interestingly, over 40% of the ATOY came from districts located in the central region of the state, even though only 28% of Arkansas teachers are employed there.  Over time, 25 teachers from the central region have been selected, compared to eight from each of the other regions (the NW, NE, SW, and SE). Northwest Arkansas is noticeably underrepresented: the region employs over 35% of Arkansas teachers, but has produced only 14% of ATOY.  Although ATOY have been selected from Fort Smith, Russellville, Van Buren, Rogers, Springdale, and Fayetteville, we were surprised to see that no ATOY had been selected from Bentonville.

Why would there be such a discrepancy in where ATOY hail from?  We propose it is about the visibility of the program in schools and districts.  Although each district may select one teacher as its District Teacher of the Year and nominate that teacher for the ATOY, very few do.  Only 33 districts, or 12% of those eligible, submitted a candidate for ATOY 2018.

Why wouldn’t EVERY district submit a candidate?

We have no idea! Perhaps districts are reluctant to participate because they don’t want to ‘lose’ one of their teachers.  This is understandable, because we all want our teachers working with students, but it is important to provide teachers the opportunity to move outside of the walls of their classroom to learn more about their profession and what is happening around the state. Almost all ATOY teachers return to the classroom following their experience, bringing back new skills and enriching perspectives to their school.

The ATOY application process is free, straightforward, and open to all licensed teachers from pre-K to 12th grade who have taught for at least three years and who spend the majority of their time working with students in a classroom.  Over 20,000 teachers are eligible to be ATOY, but only 33 applied.

Candidates for ATOY complete an online form and submit a resume, three letters of recommendation, two artifacts that showcase the candidate’s teaching and/or students’ achievement, a form that indicates the school and district leadership support the candidate’s application, and a photo.

From the submitted applications, 16 regional finalists are selected (one representing each education service cooperative and one representing Pulaski County), and four state semi-finalists are selected from among the regional finalists. The selection panel visits each semi-finalist’s classroom before selecting the ATOY.

Interested in submitting an application?  

Having an ATOY can be very positive for your community and provide an opportunity to highlight the great work being done by ALL the teachers in your schools.

Here are some next steps:

  1. Start talking with your staff about identifying a couple of excellent teachers to celebrate as district teacher(s) of the year.  Some districts partner with local businesses to provide bonuses (free meals, gift cards, services like car washes or house cleaning, etc.).
  2. With a team at the district level, select the teacher you would like to submit as a candidate for ATOY.  Keep your eyes out for the 2019 ATOY application, which usually comes out in February and is announced through a Commissioner’s Memo.
  3. Help your district’s candidate for ATOY get the forms signed, the 3 letters of recommendation, and a nice portrait taken.
  4. Maybe your teacher will be selected as ATOY, and maybe Arkansas will get the opportunity to be recognized with a National Teacher of the Year.

While you are waiting for the 2019 application to come out, you can benefit from ATOY expertise by connecting with the Arkansas Exemplary Educators Network (AEEN). The statewide network is composed of veteran and current Arkansas Teachers of the Year and Milken Educator Award recipients who have volunteered to share their knowledge and expertise with other educators and groups across the state. These educators have a vast wealth of knowledge and experience in education, as well as strong leadership skills, and are willing to support your work, so take advantage of their expertise!

 

Assessment Literacy: Student Assessment Rights and ESSA

In The View from the OEP on September 27, 2017 at 11:57 am

Today, OEP was happy to present at the Education Innovation Summit in Little Rock about Assessment Literacy: Student Assessment Rights and ESSA.  For those of you who are not attending, we wanted to share the key points of our presentation.

Drawing on the work of the National Task Force on Assessment Education, of which we are honored to be a member, we describe what assessment literacy is and why it is critical to the success of Arkansas students.

Although some consider assessment to be a task that takes time away from teaching and learning and must be endured, the Task Force defines assessment as “The process of gathering information about student learning to inform education-related decisions.” Information can be gathered not only through summative assessments like the ACT Aspire, but also through formative assessments, interim assessments, performance assessments, and any other means of collecting information about what students know and are ready to learn.

In order to use the information gathered effectively, students, teachers, administrators, parents, community members, and policy makers must be assessment literate: that is, they must understand how student assessment can enable them to better carry out their role in education, believe that assessment can improve teaching and learning, and put into place activities and behaviors to act on these beliefs.

Although it may seem that these understandings, beliefs, and skills would be a basic part of the educational process, assessment is widely misunderstood. Those who adopt policy and laws and govern our schools, and those who teach our students or lead our schools, often lack this understanding due to a continued lack of appropriate pre-service preparation and in-service learning for educators; parents and students lack it due to ineffective communication about how student assessment can promote high quality student learning.

When these assessment literacy understandings, beliefs, and skills are in place, it can lead to positive outcomes for students. Research has shown that students who are more involved in their own learning – and assessment – achieve more. Effective use of formative assessment practices requires teachers to understand how on-going instructionally-embedded assessment can help all students achieve at higher levels, and administrator involvement in school improvement activity is related to higher student achievement.

The Task Force agrees that sound assessments have five characteristics that support student success:

  1. Begin with a clear sense of purpose
  2. Arise from a clear vision of the learning targets to be assessed
  3. Rely on high-quality assessment methods
  4. Communicate results effectively
  5. Keep students striving for success

The first criterion, “Begin with a clear sense of purpose,” means that all stakeholders understand Why we are conducting the assessment and Who will be making what decisions based on the results.  In some cases, like the annual state assessment, the purpose is to certify learning, or to identify whether students have achieved a pre-determined level of performance.  In other cases, like formative assessments, the purpose is for teachers to use the information gathered to inform future instruction by determining where students are in their understanding.   In some cases, assessments can serve as learning opportunities, where students learn for themselves what skills they have mastered and what skills they still need to develop.

Once the purpose of the assessment is clear, the second criterion, “Arise from a clear vision of the learning targets to be assessed,” comes into play.  The learning targets should: be clearly and completely stated in language educators can agree carries the same meaning, measure mastery of what’s important, be organized into learning progressions, be realistic in number, be translated into student-friendly terms, and consider student background, interests, and aspirations.

Only after the purpose and learning targets have been identified should an assessment be selected or created. The assessment must reflect the learning target(s), sample enough evidence to lead to dependable inference, rely on high-quality assessment methods to be sure information is valid and reliable, and minimize any sources of bias.

After the assessment is completed, the results must be communicated effectively to the intended user in a timely and understandable manner. The evidence shared must reflect student achievement accurately and support correct inferences on the part of the intended users, and the information shared must be precise and in terms the users (be they students, parents, teachers, or policymakers) understand.

Finally, a sound assessment must link the assessment process to student motivation in ways that keep all students striving for success! It should help students know where they are headed, where they are, and how to close the gap between the two.

Unfortunately, in Arkansas and throughout the country, these five criteria are rarely met by assessments.  Some states, however, have been working to ensure that they are.  The Michigan Assessment Consortium has created assessment literacy standards for students, teachers, administrators, and policymakers.  Oregon has a plan based on these principles that outlines a vision for assessment to empower meaningful student learning.  The plan includes the Student Assessment Bill of Rights, which communicates how students are entitled to the five sound assessment practices.

Although Arkansas doesn’t currently have assessment literacy standards, the success of the state’s new ESSA plan depends on stakeholders understanding, embracing, and taking action on the five assessment literacy principles. The Theory of Action states that “The ADE and (districts) will engage in continuous cycles of inquiry and improvement …. to identify and address the needs within their respective systems.” This theory assumes that the assessments provide meaningful information about clear learning targets and that students, teachers, and administrators understand what the assessments are telling them and are communicating the results effectively.

In all likelihood, however, the majority of students, parents, teachers, administrators, community members and policy makers do not really understand what assessments are telling them about student performance and learning in our state. Arkansas educators NEED to be assessment literate to complete the Theory of Action for Student Success. So, how do we get there?

Going back to college to get a degree in assessment is impractical, and many of those teaching in higher ed institutions may not have a deep understanding of assessment themselves, so it makes sense to make becoming assessment literate a school or district project.  We have some recommendations below:

  • Become more Assessment Literate!  A good first step would be to find out who in your district knows the most about assessment. There may be someone who knows about the ACT Aspire, and others who are strong in formative assessment.  Use them to learn more about the measures we use and what they mean.  In addition, there are microcredentials available through Bloomboard about data literacy and assessment, and you can also learn about these topics through resources available on assessmentliteracy.org!
  • Conduct a needs assessment: Determine the assessment literacy of your staff, and make a plan to get some high-quality professional development to improve their skills.
  • Use the Student Bill of Assessment Rights: This can create a baseline for discussion and ensure that the conversations around assessment literacy include students.
  • Start talking: The new ESSA plan provides a great conversation starter for talking with parents and community members about assessments, what they mean, and how they are used to make decisions.
  • Conduct an assessment audit: Have your school/district examine what assessments are being used. As a staff, identify the purpose of each assessment, what it is measuring, its quality (reliable? valid?), how the results are communicated (and to whom), and whether it supports student engagement.
  • Develop policies: Engage all affected stakeholders in the development of assessment policies and practices.
  • Use the data: Ensure that data from high-quality assessments are used to improve instruction and student achievement for all students.
  • Attach resources to priorities: Ensure efficient use of resources by checking to see if you are getting a return on investment from programs, and by planning strategically for the future.

Until students, parents, teachers, administrators, policymakers, and community members understand how student assessment can enable them to better carry out their role in education, believe that assessment can improve teaching and learning, and put into place activities and behaviors to act on these beliefs, we will not effectively meet the learning needs of all Arkansas students.

Assessment literacy is critical to student success.

 

September Education Committee Meetings — Little Rock, AR

In AR Legislature, The View from the OEP on September 19, 2017 at 4:57 pm

Two bipartisan education committee meetings held September 18th and 19th featured lively discussions on a myriad of topics.  Monday afternoon’s educational caucus meeting served as a “deeper dive” by the education committee before the larger committee meetings took place.  This week the caucus honored the 2017 Arkansas Teacher of the Year, Courtney Cochran.  Ms. Cochran presented briefly on the importance of legislators getting into the schools to see what is happening on the ground.

Teacher Cadets

The remainder of the caucus meeting was a presentation from the Office of Educator Effectiveness on the Teacher Cadets program.  The Teacher Cadets program allows high school students to develop a better understanding of the field of education and promotes growth in teacher preparation programs. Students participating in the program are paired with universities and earn concurrent credit toward education/teacher preparation hours. Teacher Cadets is a national program that has been in effect in Arkansas for 4 years and is currently operating in 58 Arkansas schools. Some resistance seemed to stem from the lack of data to determine whether the program is achieving its desired outcomes, as well as questions about why the program is not being utilized in more (if not all) schools, as it is essentially free to implement and open to all public schools. There seems to be a lack of Cadet programs in Southeast Arkansas, and the committee will continue to seek ways to engage districts to take part in this program.

Higher Education Funding

In the large session, representatives of many of the state’s colleges and universities were present to hear about the new funding calculations presented by the Arkansas Department of Higher Education (ADHE).  The meeting was “standing room only” as stakeholders awaited more information on how their future funding could be affected.  Act 148 of 2017 required the Arkansas Higher Education Coordinating Board to adopt policies developed by the ADHE necessary to implement a productivity-based funding model for state-supported institutions of higher learning. The ADHE presented the proposed funding addendum that would incentivize institutional productivity over funding based strictly on needs.  Maria Markham, director of ADHE, began the presentation by highlighting the fact that Arkansas currently ranks 49th in the nation in the number of adults with a bachelor’s degree or higher and 47th when all degree types are considered. These low rankings have been consistent over time, indicating that current and past funding models have had no impact on increasing student degree attainment in Arkansas.

The new funding model would work with additional “new money” that would need legislative approval, but recent comments by the governor suggest that money is available.  The committee used the amount of $10 million for its discussion, as that would cover the cost if all institutions were found to be “positive growth institutions” and were awarded the financial bonus.  The funding model would use criteria to rank a college or university’s performance against itself to ensure that the institution is improving student degree completion and affordability. Institutions could gain additional funds by operating efficiently or be penalized (capped at 2%) for not being efficient.

The program is designed to encompass the missions of varying institutions, so that “liberal arts focused” schools won’t be penalized; institutions are compared to themselves and to similar institutions nationally.  There are different models for 2- and 4-year institutions.  Two-year colleges that send students to universities will receive bonuses, as will the universities that graduate those students with existing credits. Universities’ scores would also be weighted on their ability to serve underserved student populations and graduate students in STEM fields, as well as on how they allocate money between student services and operations. All credit-bearing programs will be counted in the calculations, but there was some pushback from legislators regarding programs that are not credit-based but provide technical skills and education.

If this funding protocol was utilized today, the models show that among 4-year institutions, the University of Arkansas at Fayetteville would receive the largest percentage of additional funding, and Henderson State would receive the lowest.  Among 2-year institutions, Arkansas State University-Newport would receive the largest percentage of additional funding with North Arkansas College receiving the lowest.

Representatives were apprehensive about several aspects of the new funding model.  They wanted to make sure there are ways for institutions to be monitored, so they do not raise tuition or other expenses to cover the cost of the “penalties.”  They also want to make sure that the state isn’t further creating an achievement gap by shifting money from low-performing schools to high-performing schools.  Senator Chesterfield said it reminded her of the old song that says, “Thems that get’s is thems that’s got”.

The chairman will allow the comment period on the topic to remain open until the 25th in order to allow all voices to be represented, as many of the members present continued to have questions, comments, and concerns.

School Nurses

The committee heard an update from the Public Health Services Advisory Committee about the health issues of Arkansas public school students and how school nursing services can be improved.  The report noted that 38.8% of Arkansas children are obese or overweight and that, for the first time, obesity has surpassed attention disorders as the most frequent “chronic condition” for Arkansas school children.  The report also indicated that there are 950 school nurses reported in schools, while, based on student acuity levels, 884 nurses are needed.

NSL Funding and Expenditures

The committee received an annual report on calculations for the free/reduced lunch program as well as the manner in which National School Lunch (NSL) funds are spent by districts.  NSL funds are provided to schools where more than 70% of the students are eligible for the federal free/reduced price lunch program. While state law lists a number of approved uses for the funding and districts have some flexibility in the use of these funds, the majority of NSL funds support salaries and benefits for additional personnel such as curriculum specialists, instructional facilitators, counselors, social workers, and other personnel who support student learning. The committee sought measures to restrict the use of funds to areas that would be most effective in closing the achievement gap, but there was not clear direction regarding how to do so. In some cases, districts are not spending all of the funds provided, and four districts and one charter school were identified as having a portion of their NSL funding withheld by the ADE as a consequence for failing to spend at least 85% of funds previously distributed (per Act 1220 of 2011).
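The 85% spend-down rule from Act 1220 amounts to a simple threshold check. Here is an illustrative sketch, with hypothetical district figures and function names of our own invention (not actual ADE data or methodology):

```python
# Illustrative sketch only: the Act 1220 withholding rule described above.
# Figures and names are hypothetical, not actual ADE data or methodology.
MIN_SPEND_RATE = 0.85  # districts must spend at least 85% of NSL funds distributed

def spend_rate(spent, distributed):
    """Fraction of distributed NSL funds a district actually spent."""
    return spent / distributed

def subject_to_withholding(spent, distributed):
    """True if a district failed to spend at least 85% of its NSL funds."""
    return spend_rate(spent, distributed) < MIN_SPEND_RATE

# Hypothetical example: a district that received $1,000,000 and spent $800,000
print(subject_to_withholding(800_000, 1_000_000))  # 0.80 < 0.85, so True
```

A district spending $900,000 of the same $1,000,000 (90%) would clear the threshold and face no withholding.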

Issues of Equity

There was a presentation on the “issues of equity.”  This annual presentation compares equity among youth across the state, measuring opportunity, revenue, and expenditures as elements of equity.

Interim Study Proposals

The committee also heard from Rep. Meeks and approved the continuation of his sponsored interim study proposals (ISPs).  The first proposal would study the effects of introducing technology to school children, in order to find the optimal grade level to do so.  The second would study the effects of creating performance tasks for completion at all grade levels, rather than simply crediting attendance and hours.  Commissioner Key suggested that this process is taking place through personalized learning for individual students, but that it is not currently occurring in every classroom statewide.

XQ: The Super School Project

In The View from the OEP on September 13, 2017 at 12:31 pm


Last Friday, NBC, ABC, CBS, and Fox simultaneously broadcast a live show about reinventing American high schools.  Although over 8 million people tuned in, you may have missed it, because you were watching Netflix or a cable channel. Ironically, that was what the show was about: disruptive innovation. To us, it felt more like an MTV awards show, with stars we only vaguely recognize, kids dancing, and hip sets, but there are real ideas and smart education minds behind the glitz.

When you strip away the Hollywood, the message was familiar…

  • American high schools are not meeting students’ needs. Our country has fallen behind in high school completion and performance on international exams. Students are not prepared for college, although two-thirds of jobs now require some college education.
  • American high schools are not increasing social equity. Longstanding inequities in the quality of education provided to different groups of young Americans continue to produce wide achievement gaps separating students of color from their white counterparts, and low-income students from their more affluent counterparts.
  • American high schools cannot change unless many more people participate. Reconnecting communities to their schools and students to their communities is key in improving education for our kids.

XQ: The Super School Project began as a challenge to reimagine and design the next American high school.  Teams from across the country submitted ideas, and a year ago 18 schools were selected to implement their plans. Each school received $10 million to help turn the ideas into reality, and the new schools are up and running.


That’s some serious cash, and it comes from the Emerson Collective, the group that Laurene Powell Jobs (wife of Steve Jobs) uses to finance philanthropic projects. The funded schools have wide-ranging visions, including: experiential learning, individualized academics, biliterate teaching, entrepreneurship, entwining the school with other community organizations, serving students facing the challenges of homelessness and foster-care placement, environmental and social justice, civic contributions, and making the school like a modern, creative workplace.

Although there are a lot of ‘buzz words’ in the school descriptions, it is important to recognize that the projects were essentially crowd-sourced, and reflect the needs of the local community and education team that submitted them.  None of us have all the answers, or the time to individually reinvent the wheel, so check out the schools that were funded (or these other examples of innovative schools) and consider if any of the ideas would benefit your community.

While no Arkansas school received the $10 million in support, there ARE schools in Arkansas implementing a variety of innovative ideas, and the Office of Innovation for Education is supporting the work in 52 schools throughout the state. We love to see the innovation in these schools, and how the ADE is supporting their efforts.

We also like a bunch of the resources available on the XQ site (although, like the show, they are pretty heavily packaged). We think you could use these with your staff, school board, or community members to think about changes that you all want to make.

Are these schools really going to be Super? Will the innovations make positive change for students?  How will impacts be measured?  Are the innovations sustainable without the millions?  These are all good questions that we don’t have the answers to right now. All we do know is that we can do better by our students.

One of our favorite parts of the XQSuperSchool Live broadcast was entertainers sharing what they wished they had learned in high school.

“I wish I had learned that it’s okay to make mistakes”

“I wish I had learned how important it would be to work with other people.”

“I wish I had learned about what it is like to actually live in the real world.”

“I wish I would have learned Mandarin.”

“I wish I had learned to be more tech savvy.”

These are things that students may be wishing right now. The XQ website has lots of student voices, and it is fascinating to listen to them. They want to learn, but see that the current system isn’t giving them what they need.

 

What do you wish you had learned in high school? Take a moment to think, then comment below about what you wish you had learned. Then, take the next step and ask a student in your local high school what they really want to learn, and ask how you can help them achieve that goal.

 

 

ACT scores are down, but it’s okay.

In The View from the OEP on September 7, 2017 at 6:24 am

This morning, ACT scores for the graduating class of 2017 were released.  As we suggested in our blog yesterday, the scores were lower than those of every graduating class in over twenty years. While this may sound alarming, here at the OEP we recommend that folks put the data in context and celebrate the good decisions Arkansas has made regarding the ACT.

The statewide average composite score decreased from 20.2 for 2016 graduates to 19.4 for 2017 graduates.  As we mentioned yesterday, the decline is likely due to the fact that Arkansas tested 100% of graduates for the first time.  Declines were evident in every subject area:

  • English dropped from 19.8 to 18.9
  • Math dropped from 19.6 to 19.0
  • Reading dropped from 20.7 to 19.7 and
  • Science dropped from 20.2 to 19.5


Although there was an increase in the percentage of graduates tested, at least 90% of each graduating class has taken the ACT since 2013. As we discussed in yesterday’s blog, some of the students who completed the ACT may be different from those who participated in prior years. These students likely struggle academically and/or don’t consider themselves college-bound.  To make the scores more comparable across years, we suggest looking at the scores by academic preparation. Students who have taken ‘Core or More’ are those who enrolled in a more rigorous core curriculum in high school (4+ English courses, 3+ math courses, 3+ social studies courses, and 3+ natural science courses).
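The ‘Core or More’ definition above amounts to a simple threshold check on course counts. Here is a minimal sketch; the function name and transcript counts are our own illustration, not ACT’s actual coding:

```python
# Illustrative sketch of ACT's 'Core or More' preparation category:
# 4+ years of English, 3+ years each of math, social studies, and science.
def core_or_more(english, math, social_studies, science):
    """Return True if a student's course counts meet 'Core or More'."""
    return (english >= 4 and math >= 3
            and social_studies >= 3 and science >= 3)

# Hypothetical transcripts
print(core_or_more(4, 3, 3, 3))  # True: meets every threshold
print(core_or_more(4, 2, 3, 3))  # False: only 2 math courses
```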


Even when comparing only students with similar academic preparation (Core or More), scores declined slightly across the state. The statewide average composite score for ‘Core or More’ graduates decreased from 20.9 for 2016 graduates to 20.4 for 2017 graduates.  Declines were again evident in every subject area:

  • English dropped from 20.7 to 20.2
  • Math dropped from 20.2 to 19.9
  • Reading dropped from 21.4 to 20.8 and
  • Science dropped from 20.8 to 20.4

 

One of the great things about the ACT is that it is taken by students throughout the US. In some states, however, only a small percentage of students take the ACT.  It isn’t appropriate to compare ACT scores for Arkansas with states that don’t test all their students. There are, however, 20 states that have tested 100% of students within the past two years, so we can compare the performance of Arkansas graduates to the graduates in those states. The states and the percentage of graduates tested since 2007 are presented in the table below. Years where at least 90% of graduates were tested are highlighted in green.

Note that Colorado, Illinois, and Mississippi have been testing all (or close to all) of their students for 10 years.  Michigan, which had also been testing 100% of graduates for nearly a decade, switched to a statewide SAT in the most recent year, which explains why only 29% of 2017 Michigan graduates completed the ACT.  We kept Michigan in the table, however, because it provides a great example of how changes in the percentage of students assessed can result in big changes to average ACT scores.


States Testing at Least 90% of High School Graduates, 2007-2017

Although the percentage of students tested impacts ACT scores, so do the demographics of the students.  States that serve a more economically disadvantaged population tend to have lower ACT scores than states that serve populations with less economic instability. We grabbed the percentage of the population living in poverty from census.gov, added it to the table of states that test all (or nearly all) of their graduates on the ACT, and sorted it from the smallest percentage of the population in poverty to the greatest.  To this we added the average composite ACT scores for the prior 10 years to illustrate the performance trends.


Poverty Rate (2014) and Average ACT Scores for States Testing at Least 90% of High School Graduates, 2007 to 2017

Of the states that tested at least 90% of high school graduates on the ACT, Wyoming had the smallest percentage of its population living below the poverty level at 11.2%.  Mississippi had the highest poverty rate at 21.5%.  In 2017, Wyoming had one of the highest ACT composite scores at 20.2, while Mississippi had one of the lowest at 18.6.  Arkansas has a greater poverty rate than most states at 18.9%, and an average composite score of 19.4.

Remember that Colorado, Illinois, and Mississippi have been testing all (or close to all) their students for 10 years. Although Colorado and Illinois have a smaller percentage of people living below the poverty line than Arkansas does, they can serve to illustrate what happens to ACT scores over time when 100% of students are tested each year. In Colorado, the scores have fluctuated slightly, from 20.4 to 20.8, but have not consistently moved upward.  Illinois scores have also fluctuated slightly, from 20.5 to 20.9, until 2017, when scores rose, though perhaps this was due to a reduction in the percentage of graduates completing the ACT (from 100% to 93%). Mississippi has a higher poverty rate than any other state that widely tests high school students on the ACT, and although its scores are lower than Colorado’s and Illinois’, they too have fluctuated only slightly, from 18.4 to 19.0.

Kentucky and Alabama are the two states most similar to Arkansas in both ACT testing rates and poverty rates. Alabama tested 100% of graduates for the first time in 2015, and the statewide composite dropped from 20.6 to 19.1. The score stayed at 19.1 for 2016, but crept up to 19.2 for the class of 2017.  Kentucky tested 100% of graduates for the first time in 2009, and the statewide composite dropped from 20.9 to 19.4. Over the eight years since the initial drop, however, Kentucky has made consistent gains in ACT scores and has maintained a high of 20.0 since 2015.  Although Kentucky has a greater percentage of people living below the poverty line than Arkansas, the average ACT score for the state is now higher than Arkansas’.

Remember Michigan?  Michigan had tested 100% of students from 2008 to 2016 and had shown consistent improvement in average score over time.  As with other states, there was some variation, from 19.6 to 20.3, but in 2017 the score jumped to 24.1!  The reason? Only 29% of students completed the ACT this year.

So What:

These statewide scores over time illustrate several key points for Arkansas education stakeholders to consider.

  1. States experience a dip in scores when they begin to test all students.
  2. States that test a high percentage of graduates on ACT do not demonstrate large changes in average scores.  Most states hold relatively steady over time, with fluctuations of less than half a point.
  3. Poverty does not limit performance: Kentucky students are outperforming similar states.

Now What?

Arkansas needs to keep testing 100% of graduates for the long haul.  We can’t just stop because we don’t like the scores.  Arkansas’ ACT scores will not increase, however, unless educators and students do something differently!  This is a classic example of weighing the pig instead of making it fatter.


ACT is a meaningful test for students and parents, and student success on the ACT may soon become a part of how school quality is measured. While improvement may be difficult to see at the state level, changes can be implemented at the local level.  There are lots of resources available to support student success. Schools should be mindful to set realistic and meaningful goals for ACT improvement.

We LOVE that Arkansas is giving every student the opportunity to get a picture of his/her readiness for college and careers, and doing it early enough that students can use the information when making decisions. Although statewide scores decreased slightly, we think it is important to focus on the positive outcomes of the program rather than any decrease in scores. More students are getting more information about their achievement, and districts and schools have better information about the performance of their graduates.  This is a good thing, regardless of the statewide average scores.

What are your thoughts?

 

ACT scores may decline… But for good reasons!

In The View from the OEP on September 6, 2017 at 12:39 pm


Tomorrow, the state will release the ACT scores for the graduating class of 2017.  Here at OEP, we predict that the scores will be slightly lower than for prior graduating classes, BUT we think there are good reasons for that and caution against over-reacting to any decline.

We will put out another blog tomorrow reviewing the results, but wanted to spend some time today setting the stage for the results.

This report will present results for the first graduating class that had 100% of kids take the ACT in grade 11, so it provides some new insight into our high school graduates.  In prior years, students self-selected to take the ACT, and students who felt they were not on ‘the college track’ may not have taken the test.

We love how Arkansas is providing the opportunity for all students to take the ACT for free during 11th grade, and if the statewide scores are lower than for previous graduating classes we recommend considering the causes and examining the data carefully.

The students represented in the upcoming report were in 11th grade in 2015-16, which was the first year that the state tested 100% of students on the ACT for free during the school day.  The number of Arkansas juniors taking the ACT jumped from 8,700 in the prior year to over 31,000, more than tripling!  The numbers reported in the 11th grade report seemed positive, and everyone was pleased when the scores were essentially the same as in prior years, when fewer students had been tested.

So, if these students were scoring similarly to prior classes when they were juniors, why do we think the graduating class report will show lower performance compared to previous graduating classes?

We have two reasons:

A different group of 11th graders.  In prior years, only 30% of graduates who took the ACT had taken it in their junior year. For this graduating class, however, over 90% of graduates took the ACT in their junior year.

One-time score vs. most recent score.  The scores for the graduating class report represent the most recent ACT score for students who indicated that they were graduating. As we said in our previous blogs, students usually do better the second (or third, or fourth) time they take the ACT.  Since many students who are interested in going to college take the ACT multiple times in an attempt to increase their score, the higher performers will likely have taken the ACT at least once since their junior year and gotten a better score. Students not planning on going to college, however, may not have taken the ACT in the past, but in this group everyone did. These students probably did not re-take the test, so the score they received in 11th grade would be their only, and therefore most recent, score.

How can we tell if students are performing better or worse than before if different groups of kids were tested?

Due to the differences between the students tested, here at the OEP we recommend examining the performance over time of students who are similar to those who would have taken the ACT before the state provided testing for all.  These students are most likely represented by the “Core or More” level of preparation.  “Core or More” students report completing four or more years of English, and three or more years each of math, social studies, and natural science.  Since a broader population of graduates completed the ACT, the percentage of students who are “Core or More” will likely be smaller, but the students themselves will have taken a similar course load.

One great thing about the ACT is that it lets us compare the performance of Arkansas graduates to the performance of graduates in other states.  We have to be careful, however, to only compare Arkansas to the 18 other states that are testing all (or over 90%) of their students, and to consider the demographic characteristics of those states as well. We will have this comparison for you in tomorrow’s blog.

School and District reports are not provided by ACT, but can be obtained from your district. In prior years, some schools tested all students on the ACT, while others allowed students to self-select if they wanted to take the exam. Now that all students are given the opportunity, it makes the comparison of scores between districts more equitable.

So What:

We LOVE that Arkansas is giving every student the opportunity to get a picture of his/her readiness for college and careers, and doing it early enough that students can use the information when making decisions. The statewide scores will likely go down due to the inclusion of students who may not have taken the ACT without the state providing it. We think it is important to focus on the positive outcomes of the program rather than any decrease in scores.  More kids are getting more information about their achievement, and districts and schools have better information about the performance of their graduates.

Now What?

  1. Check out tomorrow’s blog where we will review the scores!
  2. Get the ACT info for your school/district and check for trends over time – being careful to examine “Core or More” students separately.
  3. Make sure your school/district is sharing information with all students about practicing for the test and the benefits of taking the ACT more than once!
  4. Don’t freak out- Arkansas is headed in the right direction by increasing access to, and transparency with, the ACT.

 

 

Where The Metrics are Made Up and the Points Don’t Matter: The School Quality and Student Success Indicator in ESSA

In The View from the OEP on August 30, 2017 at 1:08 pm

Earlier this month, the third draft of Arkansas’s ESSA plan was submitted to Gov. Asa Hutchinson for review. Given the focus on a successful beginning of the school year, many educators likely haven’t had time to examine this new draft, but feedback is due by August 31st, and we think you might want to give some on the plethora of measures now included in the School Quality and Student Success Indicator, especially if you are a high school administrator (please share this blog).  We will submit our suggestions (at the end of this post) today to ade.essacomments@arkansas.gov, and we suggest you send yours as well!

Reading through the updated info about the School Quality and Student Success Indicator (SQSS) we were reminded of one of our favorite shows: Whose Line Is It Anyway?

 

Points don't matter

 

Our last post about the ESSA plan was 3 months ago, after the 2nd draft was released.  We had served on the Accountability Advisory Team and were pretty pleased with the resulting plan.  In that blog, we encouraged stakeholders to submit feedback about the plan.  Unfortunately, it seems there were too many ‘suggestions’ about the School Quality and Student Success Indicator, because now, instead of 2 measures of school quality and student success, there are 11.   Keep in mind that this is in addition to the other indicators included in the plan: academic achievement, academic growth, graduation rate, and progress in achieving English Language Proficiency.

Here at the OEP, our view is that the inclusion of so many measures will make it more difficult for school leaders to determine if their school is meeting expectations for school quality and student success. We think ADE needs to reduce the number of measures being included in the School Quality and Student Success Indicator.  A few clear indicators will be better for our schools and students.

It’s kind of like if your doctor told you to eat healthier.  Eating healthier could mean different things to different people, depending on whether you need to lower your cholesterol, manage your diabetes, lose weight, or even put on some pounds.  Tracking your calories, fats, sodium, potassium, protein, carbohydrates, sugar, iron, folic acid, and vitamins A and C will give you a lot of information that may or may not make you healthier, and will likely make you give up altogether on your attempts to change your diet for the better. What might work, though, is if your doctor gave you a clear goal that is easy to keep track of, like “Eat more veggies” or “Stay under 40 grams of carbohydrates each day”.


What SQSS Was Before: As we discussed in our earlier blog about Arkansas’ ESSA plan, the first two versions of the ESSA plan included some of these clear goals in the form of measures for the School Quality and Student Success Indicator. As you can see in Table 16 from the 2nd draft, there were one or two indicators per grade span.


For ‘elementary schools‘, those enrolling a majority of their students in Kindergarten through 5th grades, the two measures were the percent of students who attended school at least 90% of the time (NOT Chronically Absent), and the percent of students ‘Reading Ready’ in 3rd grade (measured until the new K-2 assessments get implemented as the percent of 3rd grade students meeting or exceeding expectations on ACT Aspire reading).

For ‘middle schools‘, those enrolling a majority of their students in 6th through 8th grades, the two measures were the percent of students who attended school at least 90% of the time (NOT Chronically Absent), and the percent of students meeting or exceeding expectations on ACT Aspire science.

For ‘high schools‘, there was only one measure: the percent of graduates with one or more AP, IB, or concurrent credit courses.

Pretty straightforward: get kids to come to school, be ready to read by grade 3, make sure they are learning science in the middle grades, and make an effort to have students take rigorous courses in high school. We appreciated how these measures were easily collected and easily interpreted.

We liked the third grade on-level reading because of the focus it will put on early reading progress in K-2, and we also liked the fact that science is being included somewhere in the metrics.

We had concerns with chronic absenteeism because the metric may reflect events at home rather than the quality of the school. If I leave for work before the bus comes so I can’t help my 1st grade student get to the bus on time, or if I need an older sibling to stay home with the younger siblings if they are sick, my student is absent due to issues OUTSIDE of school rather than issues of school quality. Could schools implement programs and supports to help my family get to school more consistently?  Sure, but it may be difficult to impact the wide variety of issues that some students and their families may face, and we aren’t convinced that absenteeism is a valid and reliable measure of school quality.

We also have concerns with only examining the percentage of graduates with ‘advanced’ coursework. First of all, just taking a course is not a complete indicator of success.  Arkansas invests in making AP tests FREE for all Arkansas students, so we suggest that the percentage of graduates who ‘pass’ the AP test may be a better indicator of school quality and student success.  In addition, we would have liked to see the inclusion of other indicators of post-high school readiness: CTE program completers or the percentage of students receiving industry certifications.

 


 

Both chronic absenteeism and the ‘advanced coursework’ measures are particularly odd in the face of the Vision of the ADE and the tone of the ESSA plan.  Both the Vision and the plan talk about student-focused learning, but these measures are not student-focused at all. Although we didn’t love all of the metrics initially included in the School Quality and Student Success Indicator, they are not BAD measures: they are easy to collect and understand, there are only a maximum of 2 at each school, which makes it manageable, and the points are 10% or less of the overall score (MOST OF WHICH IS BASED ON STUDENT GROWTH- HOORAY!).

 


What SQSS Is Now: In the latest ESSA draft, however, the School Quality and Student Success Indicator is a mess.  The plan notes, “The School Quality and Success Indicator was a focus of significant stakeholder feedback during the public comment period.” As you can see in Table 15 from the 3rd draft, there are now a lot more indicators of school quality and student success.


 

Holy Moly!  That’s a lot of points and way more confusing than before! And now it is a greater percentage of the overall school score (15%).

Note that the indicators are now reported by grade level instead of by the grade span of the school.  This is because all these indicators have to be combined at the student level.  You can read more about the super not-easy-to-calculate six-step process on page 43 of the ESSA plan.

For ‘elementary schools‘, the indicators still include chronic absenteeism, but the percent of students ‘Reading Ready’ in 3rd grade has been expanded to 3rd grade and up. We liked the 3rd grade indicator because we felt it would help focus on K-2 early literacy skills, but now we feel like this is just double-counting scores already used in the Achievement indicator.  In addition, science achievement and growth are included for elementary schools, which we don’t mind, but it may add more noise and distract from the initial focus on early literacy!

For ‘middle schools‘, the indicators still include chronic absenteeism and the percent of students meeting expectations in science, but now also include science growth and the percentage of students meeting grade level expectations in reading.  We like including growth alongside science achievement, but, as with elementary schools, this feels like double-counting scores that are already included in the Achievement indicator.

High schools get hit hard with all these new metrics. Instead of just AP/IB/concurrent course-taking, there are now eleven measures:

1) Chronic absenteeism is now a measure at the high school level, and while we still have concerns about the value reflecting things other than school quality, we do feel that in high school attendance may be more of a student’s decision, and that most high schools could do more to get kids to come.

2 & 3) High schools are now, like elementary and middle schools, also examining science achievement and growth as measures of school quality and student success for students through 10th grade.

4) High schools will also be measured on the percentage of students reading at grade level through 10th grade.

5 & 6) There are two measures for the ACT for graduates: the high school receives a point for each student who scored at least 19 on the ACT (composite), and can get another half point per student who meets the college readiness subject benchmarks of 22 or over. We like including this measure because, as we have discussed, it has real meaning for students.  All 11th graders take the ACT, and since students’ best score is used, maybe schools will work harder to promote all students taking it more than once. Criteria for scoring points on WorkKeys, the ACT career readiness assessment, are still being examined.

7) The GPA of graduates is another proposed measure of school quality and student success. This is the one indicator we really DO NOT LIKE. It seems to us that this may lead to pressure on teachers to give ‘better’ grades.  We know that grades are subjective and inconsistent from class to class and school to school. Grades are not a valid or reliable measure of school quality or student success.

8, 10 & 11) In addition to the AP/IB/concurrent indicator that was in the earlier drafts, schools now also get points for students completing Service Learning courses and computer science courses.  Although we can see why these would be ‘on the list’ of measures for a quality school, we would still like to see a more career-focused indicator.

9) On-time credits is a new measure. We like the indicator that checks along the way for on-time credits, but aren’t sure how this looks in practice because it is new for Arkansas!
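To make the arithmetic in measures 5 & 6 concrete, here is a minimal sketch of our reading of the rule. The function name, the data structure, and the assumption that the half point requires meeting all of the subject benchmarks are ours for illustration, not ADE's actual implementation:

```python
# Sketch of the ACT points rule from measures 5 & 6: one point per
# graduate with a composite of at least 19, plus an extra half point
# when the student also meets the subject readiness benchmarks of 22+.
# (Whether "meets the benchmarks" means all four subjects is our
# assumption for illustration.)
def act_points(graduates):
    total = 0.0
    for g in graduates:
        if g["composite"] >= 19:
            total += 1.0
            if all(g[s] >= 22 for s in ("english", "math", "reading", "science")):
                total += 0.5
    return total

# Three hypothetical graduates: one fully benchmark-ready, one at the
# composite cutoff of 19, and one below it.
sample = [
    {"composite": 25, "english": 24, "math": 23, "reading": 26, "science": 22},
    {"composite": 19, "english": 20, "math": 18, "reading": 21, "science": 19},
    {"composite": 17, "english": 16, "math": 17, "reading": 18, "science": 17},
]
print(act_points(sample))  # 1.5 + 1.0 + 0 = 2.5
```

Even under this reading, notice how much the rule rewards the composite cutoff (a full point) relative to full subject readiness (half a point).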

 

See what we mean about keeping track of too many indicators?


 


What SQSS Should Be:

We suggest that ADE reduce the number of measures included in the School Quality and Student Success Indicator, and that the weight of the SQSS indicator in the Overall School Index be pulled back to 10%.  A few clear, measurable indicators will be better for our schools and students than a long list of weak ones, since none of the proposed measures are great measures of school quality or student success. Here’s what we would do:

“Reading Ready” just for 3rd grade.  We like the 3rd grade indicator because we feel it would help focus on K-2 early literacy skills, but when it is applied to all grades we feel like this is just double-counting scores already used in the Achievement indicator.

Science Achievement and Growth just for middle schools.  While we like the inclusion of science, we really need to have fewer, clearer measures. Elementary focus should be Literacy (reflected in part by “Reading Ready”) and Science is already included in the high school measures through ACT.

Combine all the course-taking measures into one, and give credit for any graduate that completed any of the listed classes. This is supposed to be about student-focused learning.  We want high schools to support students in pursuing their goals, not just enrolling in specific types of classes.

Leave out GPA. This is the one indicator we really do not like. It seems to us that this may lead to pressure on teachers to give ‘better’ grades.  We know that grades are subjective and inconsistent from teacher to teacher, class to class, and school to school. Grades are not a valid or reliable measure of school quality or student success and GPA should not be included in this indicator.

Start with on-time credits for just 9th grade, then add additional grades in subsequent years. Suddenly adding in this new measure at all grade levels (in addition to all the other measures) might be too much for the school to address at one time. A more measured approach would be less overwhelming for schools and could lead to better outcomes for students.

 

In summary, we think there should be fewer measures included in the School Quality and Student Success Indicator, and we suggest that they be allocated as presented in the table below.

OEP’s proposed School Quality and Student Success Indicator Measures


We still want to see more career- ready indicators, and look forward to indicators that would be more representative of school quality, but this would be a WAY better start than what is in Draft 3. ADE makes it clear that these are not final measures: “This system will transition and improve over time as additional school quality and student success indicators are developed, validated, and used to replace or augment initially proposed indicators.”

Make your voice heard. Email your comments to ade.essacomments@arkansas.gov today or tomorrow (before August 31st)!

ADE will submit the final plan to the U.S. Department of Education in September 2017.

Federal feedback is expected by the end of the year, and ADE will implement aspects of the plan beginning in the 2018-19 school year.

ACT the sequel: some key points missing

In The View from the OEP on August 23, 2017 at 12:51 pm


How did Arkansas students score on the ACT last spring? Were a higher percentage of 11th graders found to be ‘college-ready’?

Nope.

Statewide scores didn’t change much, but some districts saw big gains in ACT scores this year. We posted the 2017 ACT results by school and district here, and calculated a change from prior year so you can see if your school/district has improved on this important outcome measure.

Last spring, over 31,000 high school juniors completed the ACT, a measure of college readiness used by colleges throughout the nation.  This is the second year that Arkansas students have been able to take the test for free (it is normally $42.50), which provides us the first opportunity to compare the scores to the prior year.  Like ACT, we recommend looking at a longer trend, but prior to 2015-16 only about a quarter of Arkansas juniors elected to take the ACT so it doesn’t make good sense to compare to earlier years.

ACT measures achievement in English, Reading, Math, and Science and is scored on a scale of 1 to 36. Students who score below 19 are ineligible for the Academic Challenge Scholarship and are typically placed in remedial courses in college. As can be seen in Table 1, the English score increased in 2016-17, but the average score for Arkansas’ juniors was below 19 in all four subject areas and for the composite.

Table 1: Average ACT Scores for Arkansas 11th Graders, by Subject and Year.

 

| Year    | English | Mathematics | Reading | Science | Composite |
|---------|---------|-------------|---------|---------|-----------|
| 2015-16 | 18.1    | 18.6        | 19.0    | 19.1    | 18.8      |
| 2016-17 | 18.4    | 18.5        | 18.9    | 18.9    | 18.8      |

 

Although Arkansas’ post-secondary institutions have generally used ’19’ as the indicator of readiness for college, ACT has developed a more nuanced approach by following actual students and their success in college. ACT College Readiness Benchmarks are the minimum ACT score indicating students have a 50% chance of earning a B or better in the college course. These Readiness Benchmarks vary by subject area as indicated in the table below.  The lowest Benchmark is in English (18) and 49% of Arkansas’ juniors met that benchmark. Thirty percent met the Benchmark of 22 for Reading, and fewer than 1 in 4 (24%) met the Benchmark of 22 in math.  At 23, the score needed to meet the Benchmark in Science was the highest, and the least likely for Arkansas students to meet; only 22% of students are college-ready in science.

Table 2: Percentage of Arkansas 11th Graders Meeting ACT College Readiness Benchmark Scores, by Subject and Year.

 

| Year          | English (18) | Mathematics (22) | Reading (22) | Science (23) | All 4 |
|---------------|--------------|------------------|--------------|--------------|-------|
| % met 2015-16 | 49           | 25               | 31           | 24           | 14    |
| % met 2016-17 | 49           | 24               | 30           | 22           | 14    |

 

The average scores and the percentage of students meeting ACT College Readiness Benchmarks are essentially unchanged from last year.
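The benchmark counts in Table 2 come down to a simple cutoff check per subject. Here is a minimal sketch; the benchmark values are taken from the table, while the student scores and function name are made up for illustration:

```python
# ACT College Readiness Benchmarks, as listed in Table 2.
BENCHMARKS = {"English": 18, "Mathematics": 22, "Reading": 22, "Science": 23}

def percent_meeting(scores, subject):
    """Percent of students whose score meets or exceeds the subject benchmark."""
    cutoff = BENCHMARKS[subject]
    met = sum(1 for s in scores if s >= cutoff)
    return round(100 * met / len(scores))

# Ten made-up math scores; four are at or above the benchmark of 22.
math_scores = [15, 18, 20, 21, 22, 23, 25, 17, 19, 28]
print(percent_meeting(math_scores, "Mathematics"))  # 40
```

The "All 4" column applies the same check across every subject for each student, which is why it is so much lower than any single-subject column.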

This year also marks the first time we have had consistent historical assessment data for our 11th graders, because they were the first group of students to complete the ACT Aspire in 10th grade. We wondered how the earlier scores stacked up against the actual ACT, so we compared the percentage of students meeting or exceeding benchmarks when they were in 10th grade (on the ACT Aspire) with the percentage meeting or exceeding in 11th grade (on the ACT).  Although these are not matched at the student level and so may include some different students each year, the results are fairly consistent. As you can see below, however, students were somewhat more likely to meet readiness expectations on the ACT Aspire than on the actual ACT.

Table 3: Percentage of Arkansas 11th Graders Meeting Readiness Benchmark Scores, as 10th and 11th graders, by Subject and Year.

| Assessment (Year)    | Grade | English | Mathematics | Reading | Science |
|----------------------|-------|---------|-------------|---------|---------|
| ACT Aspire (2015-16) | 10    | 56      | 29          | 34      | 29      |
| ACT (2016-17)        | 11    | 49      | 24          | 30      | 22      |

 


So What?

Most 11th grade students indicated that they wanted to continue their education after high school.  Sixty percent indicated they wanted to pursue at least a Bachelor’s degree, while 10% indicated they were going to attend a 2-year college or pursue a vocational/technical degree.  While we don’t think any one assessment tells the whole story about a kid, a student’s ACT score is currently tied to all sorts of significant outcomes, including college acceptance, college scholarships, and NCAA eligibility, as well as being an indicator of workplace readiness.  Most kids aspire to continuing their education after high school, but the majority are not prepared.

 

Now What?

Increasing student ACT scores is good for students, and some districts saw double-digit gains in the percentage of students who met readiness benchmarks.  Compare your scores to schools that serve similar populations: how do your students stack up?  What can you do to support your students in achieving this goal? We have some recommendations for educators, parents, and students to help.

Now that we have a consistent assessment system that is connected to the ACT, teachers and school leaders should carefully examine the ACT Aspire data for their students this year, determine what skills the students are missing, and give them the information they need to get on track for success. Students are central to this discussion!  Help them understand what the ACT Aspire score means and what they can do to help themselves! Parents also need to understand: in our blog last week we mentioned that the majority of parents hoped their students would go to college, and they need good information to understand if their student is on the right path.

For high school students: We pointed out that the 10th grade ACT Aspire results are pretty indicative of student performance on the ACT and that students were least likely to meet Readiness Benchmarks in math, science, and reading.  There are lots of resources that schools should be letting students know about, and many are FREE for students who are eligible for Free/Reduced Lunch.

  1. Don’t take the ACT just one time!  Students who are eligible for Free/Reduced Lunch can get up to two fee waivers from ACT. If a student isn’t eligible, they should still sign up for the ACT at least once before the junior-year test. Test dates are about once a month and are listed here.
  2. Do timed practice! Understanding the format of the test and being familiar with how the timing feels can help students use time available to demonstrate their knowledge.
  3. Use available resources! Check out ACT Online Prep and ACT Kaplan Online Prep Live.  Students who receive a fee waiver for the ACT can get either of these for free for a year. Students can access the program online, and progress can be tracked at an individual or aggregate level, including time spent on the program, performance on the practice questions and tests, and areas in which an entire class may need help. Teachers can assign work in ACT Online Prep for students to complete as part of test prep within a classroom or as a learning enhancement.  The “live” version includes access to live instructors who teach the material and are available for questions. Schools that are at least 50% FRL get 50% off the per-student price.

 

What’s missing?

Here at OEP we are a little concerned about the accuracy of the ACT data reports due to students not reporting information and to ACT reports not linking to all high schools.

Part of the information shared by ACT is student-reported, but an increasing percentage of Arkansas students did not complete this information in 2016-17.  When asked about post-secondary educational aspirations, 30% did not respond to the question, double the rate of non-response from last year.  We also noticed a 10 percentage point decrease in the percentage of students identified as taking the “Core or More” curriculum. Students who complete the “Core or More” preparation perform better on the ACT, and since it is essentially Arkansas’ Smart Core (4+ English, 3+ Math, 3+ Social Studies, and 3+ Natural Science), we were surprised to see the percentage drop so significantly, from 67 to 57 percent.  It turns out this drop was due to the 35% of students who did not respond about what courses they had taken. When students don’t provide complete information, the resulting reports can be misleading.  We hope that schools and the ADE will examine the pattern of non-responses and put a plan in place so that ACT can provide the most accurate information to inform Arkansas policy.

ACT uses a different process than the ADE to link student scores to their high school.  We noticed that Bentonville’s new high school was not included in the summary report for the Bentonville School District, which could impact the district-level results.  While this is not a huge deal (except maybe for Bentonville!), the completeness and accuracy of ACT data becomes more important for schools as several pieces will be used as a school success indicator in Arkansas’ ESSA plan, which we will delve into more next week.

We are interested in your thoughts about the latest ACT results- please leave a comment or a question!