University of Arkansas Office for Education Policy

Archive for the ‘The View from the OEP’ Category

Are Arkansas’ High School Graduates Prepared to Succeed in College?

In The View from the OEP on April 17, 2019 at 1:50 pm

Earlier this week, the Arkansas Department of Education released 2018 Report Cards for schools, districts, and the state (press release). Included in these data was information on the percentage of Arkansas students who graduate from high school, go to college, and earn college credits.

While college is not the right path for every Arkansas student, college degrees are increasingly valued by employers (Harvard Business School Report). Although there has been some recent pushback on employers’ degree requirements, to compete in tomorrow’s job market, Arkansas schools will need to prepare a large proportion of their students to succeed in college. Unfortunately, this is an area where Arkansas has some ground to make up. Only 22 percent of Arkansans age 25 and older have a bachelor’s degree, which is 9 percentage points below the national average (31%), with only Mississippi (21.3%) and West Virginia (19.9%) having lower degree attainment (Census Bureau map).

Given our degree deficit and the priority the state has put on college readiness, we were eager to see if more of Arkansas’ students were hitting important milestones on the way to college graduation. We have written about this before (here and here), but at the time only three years of data were available. With yesterday’s release, we now have 5 years of data, but unfortunately, the story is not great.

There were, however, some encouraging signs in the data. For example, Arkansas students are graduating high school in greater numbers (see Figure 1). The state’s 4-year graduation rates inched up slightly, climbing to 89 percent in 2018. Minority students also saw increases over this period, with black students experiencing the biggest gains, increasing by 5 percentage points from 81 to 86 percent.

Figure 1: Arkansas 4-Year High School Graduation Rates 2014-2018


Even though Arkansas’ students are graduating from high school at higher rates, they are not more likely to attend college (see Figure 2). The college going rate remained relatively flat over most of the 5-year period ending in 2018. However, if the recent data is to be trusted, college going rates declined by 8 percentage points in 2018. Such a large drop makes us question the validity of the 2018 data, but if accurate, this would be a huge deal that demands more discussion and monitoring.

***IMPORTANT NOTE:  ADHE confirmed to OEP that the college-going data are not correct and they are re-calculating the values. The initial re-calculation reflects a statewide college going rate consistent with prior years. *** We will update the analyses below with the new college-going data when released, but the general information still applies.  

Figure 2: Arkansas College Going Rate 2014-2018


One important caveat about the state’s college going data is that the values only reflect students who attend a college in Arkansas. These rates miss Arkansas high school graduates who leave the state to pursue their post-secondary education, and so understate the state’s college going rate. The National Center for Education Statistics produced a report in 2017 that can help us estimate by how much the rates are understated (see report here). In the fall of 2016, there were 3,318 Arkansas residents attending a college outside of the state. Assuming these students come from roughly the previous four graduating classes (about 830 per class) and dividing by the roughly 30,000 Arkansas high school graduates per year yields an estimate of approximately 3 percent of each graduating class attending college out of state. So our best guess as to how much Arkansas’ college going rate is underestimated is somewhere between 2.5 and 5 percentage points. The upshot is that even accounting for underestimation, Arkansas’ high school graduates enroll in college at far lower rates than the national average of 67 percent (https://nces.ed.gov/programs/coe/indicator_cpa.asp).
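
To make that back-of-the-envelope arithmetic explicit, here is a minimal sketch in Python using only the figures cited above; the assumption that the 3,318 out-of-state enrollees come from roughly four graduating classes is ours, following the reasoning in the paragraph:

    # Back-of-the-envelope estimate of how much the in-state-only college-going
    # rate understates the true rate, using the figures cited above.
    out_of_state_enrollees = 3318   # Arkansas residents at out-of-state colleges, fall 2016
    cohorts_represented = 4         # assumption: they come from roughly 4 graduating classes
    grads_per_year = 30000          # approximate Arkansas high school graduates per year

    per_cohort_out_of_state = out_of_state_enrollees / cohorts_represented  # ~830 students
    understatement = 100 * per_cohort_out_of_state / grads_per_year         # ~2.8
    print(f"Estimated understatement: about {understatement:.1f} percentage points")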

As we have done in previous blog posts, we can use the state’s data on graduation and college going rates along with the data on college credit completion to estimate how many of today’s 9th graders we can expect to get to college and earn at least one year of credits (Figure 3). The picture is not encouraging. Of a group of 100 hypothetical 9th graders, we would expect 89 to graduate high school, 36 to enroll in college within the next year, and only 19 to complete one year’s worth of credits in the subsequent 2 years. So only 1 in 5 of today’s 9th graders would be expected to get to college and start down the path toward degree completion. And the story is worse for the state’s minority students.

Figure 3: Expected Education Attainment for Hypothetical 9th Grade Cohort

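For readers who want to trace the cohort arithmetic behind Figure 3, here is a minimal sketch. The college-going and credit-completion rates used below are approximations we back out from the 89, 36, and 19 figures cited above, not separately published values:

    # Hypothetical cohort of 100 ninth graders pushed through the milestones in Figure 3.
    # The college-going and credit-completion rates are approximations implied by the
    # 89 -> 36 -> 19 progression described above.
    cohort = 100
    grad_rate = 0.89               # 4-year high school graduation rate
    college_going_rate = 0.40      # share of graduates enrolling in college within a year (approx.)
    credit_completion_rate = 0.53  # share of enrollees earning a year of credits within 2 years (approx.)

    hs_grads = cohort * grad_rate                                        # ~89
    college_enrollees = hs_grads * college_going_rate                    # ~36
    earned_year_of_credits = college_enrollees * credit_completion_rate  # ~19
    print(round(hs_grads), round(college_enrollees), round(earned_year_of_credits))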

Part of the challenge is that even once students get to college, many are not ready: 63 percent of Arkansas’ high school graduates who enroll require remediation in one or more subjects. Remedial courses do not count toward degree completion, and can make persistence and degree completion seem out of reach.

College admission exams provide another measure of college readiness. All Arkansas high school students now take the ACT, and only 17 percent of 2018’s graduating class met the ACT’s college readiness benchmarks in all four tested subjects (i.e., English, math, reading, and science; Arkansas ACT report). Missouri, which also requires students to take the ACT and is similar to Arkansas in terms of its rural/urban mix, performed significantly better, with 22 percent of 2018 graduates meeting the college readiness benchmarks in all four subjects (Missouri ACT report). You can find a table summarizing states’ 2018 ACT performance here.

Although college readiness has been a priority for many years and the state’s institutions of higher education have been working to improve remediation and college persistence, it’s clear that we are not making fast enough progress. Arkansas is still far behind in college enrollment and degree completion, and we will need to redouble our efforts and try new strategies if we expect to compete on a global stage.

Note: If you want to dig into Arkansas’ graduation rate and college going rate data, the OEP website provides handy spreadsheets that stratify the data by district and school. You can find graduation rate data here and college going rate data here (the 2018 rate data table will be released after re-calculated data are released by ADE).

Are We Closing the Achievement Gap?

In The View from the OEP on April 10, 2019 at 1:23 pm

Yesterday, EducationNext discussed new research demonstrating a persistent 50-year achievement gap between “Haves and Have Nots”, and we wanted to chime in about what the gap looks like for students in Arkansas.

While the national and Arkansas-specific research use different data sources and methodologies, the conclusions are the same: the gap in academic performance between students with fewer and greater economic resources is large and isn’t getting smaller.

The national research used an index of student socioeconomic status (SES) developed from student-reported information about the education level of their parents and possessions that they have in their home. The achievement data come from four testing programs: National Assessment of Educational Progress (NAEP) Long-Term Trend;  Main NAEP; the Trends in International Mathematics and Science Study (TIMSS); and the Program for International Student Assessment (PISA). The researchers found that the gap in achievement between the highest 10% SES and the lowest 10% SES students has remained consistent over the last 50 years.  The conclusion holds when the ranges are expanded to the highest and lowest SES quartiles as well.

What about Arkansas’ Gap?

Here at OEP, we looked into the achievement gap for Arkansas students and found similarly disheartening results.

For the Arkansas analysis, we use the student-level indicator of Free-Reduced Lunch (FRL) program participation as the indicator of student socioeconomic status because we don’t have information on parental education or home possessions. FRL eligibility is based on family income, and is often used as an imperfect proxy for a student’s socioeconomic status. We divide students into two groups: those participating in FRL and those not participating. In 2008-09, the first year of our analysis, 56% of Arkansas students participated in FRL. The percentage increased to 59% in 2009-10, and has remained fairly consistent since then, ranging as high as 61% before declining slightly to 60% in 2017-18.

For the achievement data, we use ten years of the annual Arkansas state assessments, which have changed over time. For the first six years of our analysis we use data from the Benchmark and End-of-Course exams. In the 2014-15 school year, students in Arkansas completed the PARCC exam, before switching to the ACT Aspire exam from 2015-16 onward.

All of these test changes have made it essentially impossible for Arkansas schools to examine achievement gaps over time, because each test scores and reports performance in different ways. Although the assessments have varied over time, each provides a measure of how students statewide perform in literacy and mathematics. We think it is very important for education stakeholders to be able to examine performance over time, so we used a common standardizing procedure to track the relative performance of different groups.  We transform these scores into percentiles to aid in interpretation. The statewide average percentile in literacy or mathematics at each grade level is the 50th percentile each year. Note that the percentiles are standardized within year and state, meaning that they are not indicators of how ‘true’ achievement has changed over time or how performance compares to students in other states. Percentiles are used to compare the relative annual performance of Arkansas student groups over time and examine if the gap in achievement has changed (details about the methodology can be found at the end of this blog).

We found that in Arkansas, the difference in achievement between “Haves and Have Nots” has remained essentially unchanged over the past decade.

As seen in Figure 1, Arkansas’ literacy achievement gap closed slightly from 25 percentage points to 23 percentage points over the past 10 years. In 2008-09, students participating in the FRL program scored at the 39th percentile in literacy, while students who were not participating scored at the 65th percentile on average. Over time, the performance of non-participating students remained consistent, while FRL-participating student performance increased slowly to the 42nd percentile by 2017-18.

Figure 1: Arkansas Literacy Achievement by FRL Status, 2008-09 to 2017-18

As seen in Figure 2, Arkansas’ mathematics achievement gap increased slightly from 20 percentage points to 23 percentage points over the past 10 years. In 2008-09, students participating in the FRL program scored at the 42nd percentile in mathematics, while students who were not participating scored, on average, at the 62nd percentile. Student performance fluctuated over time for students in both groups, but the performance of non-FRL-participating students increased slightly over time, while the performance of their FRL-participating peers remained steady, leading to the slight widening of the achievement gap in mathematics.

Figure 2: Arkansas Mathematics Achievement by FRL Status, 2008-09 to 2017-18

It is interesting to note that the achievement gaps in both literacy and mathematics were smallest in the 2014-15 school year, the first (and only) year that Arkansas students completed the PARCC assessment. The gaps closed due to an increase in the average performance of FRL-participating students AND to a decline in the average performance of non-FRL participating students. It is tempting to hypothesize causes behind this narrower gap, but doing so would be futile: we have only a single data point, and the gap returned to established levels the following year.

Well Shoot!

As with the national research, we feel these are important findings to consider, as educational attainment is key to upward mobility. Both the national and Arkansas results show that students with fewer economic resources achieve at lower levels than their peers, and that these achievement gaps have persisted over time.

Schools in Arkansas and across the country, then, are not meeting the challenge of reducing these disparities. How can that be addressed?

We need to keep digging to get to the root of this achievement gap issue. OEP is examining these gaps in more depth, and will share more information about trends in grade-level achievement gaps, as well as gaps between racial/ethnic and gender groups. We look forward to highlighting the Arkansas schools and districts that HAVE closed achievement gaps for some student populations.

 


About the methodology:

Students in grades 3-8 were consistently assessed in both content areas, but there was variation in when high school students were assessed. For our analysis we used all grades assessed in literacy and/or math in a given year, and limited our analyses to students completing the general assessment.

Scale scores were standardized for each year within grade level and content area, creating a Z-score (mean of 0, standard deviation of 1).  These z-scores were then averaged across student groups and transformed into percentiles to ease interpretation. The statewide average percentile in literacy or mathematics at each grade level is the 50th percentile each year. Percentiles are standardized within year and state, meaning that they are not indicators of how actual achievement has changed over time or how performance compares to students in other states. Percentiles are used to compare the relative annual performance of Arkansas student groups over time and examine if the gap in achievement has changed. 
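
For readers who want to replicate the standardization, here is a minimal sketch in Python (pandas and scipy) on synthetic data. The column names, the synthetic data frame, and the normal-CDF step for converting an average z-score to a percentile are our illustrative assumptions; the OEP’s exact implementation may differ:

    import numpy as np
    import pandas as pd
    from scipy.stats import norm

    # Synthetic example data; column names and values are illustrative only.
    rng = np.random.default_rng(0)
    n = 1000
    df = pd.DataFrame({
        "year": rng.choice([2009, 2018], n),
        "grade": rng.choice(np.arange(3, 9), n),
        "subject": rng.choice(["literacy", "math"], n),
        "frl": rng.choice(["FRL", "Non-FRL"], n),
        "scale_score": rng.normal(600, 50, n),
    })

    # 1. Standardize scale scores within year, grade, and subject (mean 0, sd 1).
    grp = df.groupby(["year", "grade", "subject"])["scale_score"]
    df["z"] = (df["scale_score"] - grp.transform("mean")) / grp.transform("std")

    # 2. Average z-scores by FRL status within each year and subject.
    group_means = df.groupby(["year", "subject", "frl"])["z"].mean().reset_index()

    # 3. Convert average z-scores to percentiles so the statewide average sits at
    #    the 50th percentile each year (one common approach: the normal CDF).
    group_means["percentile"] = norm.cdf(group_means["z"]) * 100

    # 4. The gap is the percentile difference between non-FRL and FRL students.
    gap = group_means.pivot_table(index=["year", "subject"],
                                  columns="frl", values="percentile")
    gap["gap_pct_points"] = gap["Non-FRL"] - gap["FRL"]
    print(gap.round(1))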

What’s the deal with vouchers?

In The View from the OEP on April 3, 2019 at 2:06 pm

The Arkansas legislature is considering two bills, SB 539 and SB 620, that would make it easier for low-income students to attend private schools. These bills highlight two different approaches for expanding school choice by offering families money to cover a portion of their children’s private school tuition, an approach often referred to as vouchers. Discussions about private school vouchers tend to elicit strong reactions from supporters and opponents alike, making it difficult to wade through the noise and understand what they are, why they exist, and what we know about their impact. In this post we hope to help our readers answer these three questions.

What are vouchers?

Vouchers are publicly funded grants or scholarships used to pay for private school tuition. Vouchers are not new in education policy. As the National Conference of State Legislatures (NCSL) notes in its summary of vouchers’ history, “state support for private school education has existed in Maine and Vermont for nearly 140 years.” Vouchers have been used in higher education since 1965, offered by the federal government in the form of Pell Grants. However, there has been renewed interest in vouchers since the early 1990s when the first modern K-12 voucher program was established in Milwaukee.

Today twenty-three states offer some form of vouchers for private K-12 education, and they come in three different flavors: 1) tax-credit scholarships, 2) state-funded scholarships, and 3) education savings accounts. Eighteen states offer tax-credit scholarships, 15 offer state-funded scholarships, and 6 offer education savings accounts. While program design differs somewhat across states, most of these state-based voucher programs restrict eligibility to either low-income students or students with disabilities (i.e., students who have an individualized education program (IEP)). Arkansas currently offers a voucher program, the Succeed Scholarship Program, to students with disabilities. For more information on state-based voucher programs check out the NCSL Interactive Guide to School Choice Laws and EdChoice’s summary of state choice programs.

Why do vouchers exist?

Vouchers are a public policy tool that can be used to increase families’ schooling options. This is especially true for low-income families who, in the absence of a school choice program, often lack the resources to choose a school different from the one to which they are currently assigned based on their address. Wealthier families, on the other hand, have many more schooling options because they have the resources not only to choose amongst a wider set of neighborhoods but also to send their children to private schools. School choice programs, like vouchers, can help ameliorate the inequity in schooling options, providing low-income parents with the ability to choose the school that best fits their children’s needs.

Some social scientists, most famously Milton Friedman, have also argued that providing more schooling choices in an environment where families have few if any options will increase overall school quality because schools will need to get better to attract and keep students.

What do we know about the impact of vouchers?

Over the years there have been many studies of K-12 education vouchers. Several of these studies are high-quality randomized controlled trials (RCTs) which provide researchers with the best chance of establishing cause and effect. Researchers have studied vouchers in many different locations/contexts and with varying designs (e.g., targeted to low-income families vs not). Despite the amount of high-quality research around vouchers, the story about their impact is equivocal. You can find good summaries of the research in Chalkbeat, Journal of Economic Literature (JEL), and School Choice at a Crossroads ch.4.

The research has studied vouchers’ impact on three main categories of outcomes: 1) voucher students’ achievement on standardized tests, 2) voucher students’ educational attainment (e.g., high school completion, college enrollment, etc.) and longer-term outcomes (e.g., criminal behavior), and 3) the effect on traditional public schools (e.g., achievement, finances, segregation, etc.).

Student Achievement

The results for the impact of vouchers on student achievement are mixed. While several older studies consistently found either no effect or a slight positive effect on achievement, four recent studies have found negative effects and one has found positive effects (see Figure 1 below, omits North Carolina results). Research design may have played a role in the studies’ findings – many of the older studies are RCTs, giving us significant confidence in their results; however, two of the recent studies are as well. The differences in the context and design of the voucher programs are likely more explanatory. The more recent studies evaluated programs that are less tightly targeted to low-income families and students with disabilities and, in some cases, included significant regulation that limited the supply of private options. Another explanation for the mixed results could be that the local public schools used for comparison in the recent research have gotten pretty good at delivering results on state tests. Regardless, while the overarching impact of vouchers on student achievement is not completely clear, it’s likely that, even when well designed, their impact is small to neutral and that the design and context of the voucher program matters a lot.

Figure 1: Math and Reading Test Score Impacts from 12 Voucher Studies

Reproduced from the Brookings Institution Evidence Speaks Reports, Vol 2, #18.

The x-axis is the estimated impact in standard deviation units. For each study, the dot represents the estimated mean impact and the bars represent the 95 percent confidence interval around the mean.


Attainment and Longer-Term Outcomes

The research story for educational attainment and longer-term outcomes is more positive, although the evidence is not very strong. Several studies have found that vouchers increase high-school graduation and college enrollment. There is, however, mixed evidence on college completion with one study finding a positive impact and another finding none. A recent working paper produced by EDRE’s own Cory DeAngelis and Pat Wolf found that the Milwaukee voucher program reduced criminal activity and paternity suits among participants in adulthood. These results on longer-term outcomes are promising, but more evidence is needed to verify these initial findings.

Effect on Traditional Public Schools

Opponents of vouchers often claim that they harm traditional public schools by lowering the achievement of those left behind, decreasing resources, and increasing segregation. However, we are not aware of any rigorous evidence documenting these harms occurring in real-world programs. In fact, there is significant evidence that voucher programs improve traditional public school performance. As the JEL research summary linked above put it, “Evidence on both small-scale and large-scale programs suggests that competition induced by vouchers leads public schools to improve.” Likewise, there is significant evidence that voucher programs are either cost neutral or save the state and districts money. The evidence on vouchers and segregation is not as robust, but does suggest that programs that target low-income families do not have a racially segregative effect (e.g., Louisiana, Milwaukee). However, just because these harms have not been documented in existing programs does not mean that they can be ignored. Policymakers must be aware of the potential harms and do their best to mitigate these concerns when they design voucher programs.

What does that mean for Arkansas?

As noted in the intro, the Arkansas legislature is considering two voucher bills.

SB 620 would create a pilot, state-funded scholarship program for low-income Pulaski County students. This bill is currently in the Senate education committee.

SB 539 would offer tax credits to individuals and corporations who make donations to a fund that will provide scholarships for low-income students to attend private schools. This bill passed out of the Senate a few days ago and is now set to be considered in the House. Seventeen other states already offer similar tax-credit scholarship programs (visit EdChoice for more info on these programs). Our colleague Julie Trivett has produced an analysis of the potential fiscal impact of SB 539, and found that under conservative assumptions the proposed tax-credit scholarship program will be effectively cost neutral.

The bills strictly limit eligibility to low-income families and cap the resources available so that the programs start small, both features of programs that have demonstrated positive impacts in past research. The bills also require participants to take a norm-referenced standardized test and schools to report results. Given the research evidence on private school vouchers, we see no compelling reason for policymakers not to consider piloting the proposed voucher programs. While the proposed programs are unlikely to yield large positive results or cost savings, they will increase low-income families’ schooling options and may lead to small improvements for participants and traditional public schools alike. If either bill is passed, the legislature should include a requirement that the programs be rigorously evaluated so that we learn from the experience.

No time for field trips?

In The View from the OEP on March 13, 2019 at 11:38 am

Photo source:  Crystal Bridges Museum of Art

As Arkansas schools enter the final month before state testing, teachers may be focusing instructional time on test prep and forgoing other ‘non-tested’ subjects and activities. New research, however, finds that students who attended art-related field trips demonstrated increased engagement in school, higher levels of social-emotional skills, and, unexpectedly, higher scores on standardized tests!

The study is a longitudinal, randomized controlled trial, the gold standard for research. Conducted by Jay Greene, Distinguished Professor and head of the Department of Education Reform in the College of Education and Health Professions, and members of the University of Arkansas National Endowment for the Arts Research Lab that he directs, the study randomly assigned fourth- and fifth-grade public school students in Atlanta, Georgia, to attend three field trips throughout a school year. Students went to an art museum, a live theater production, and a symphony performance at the Woodruff Arts Center. A control group of students within the same schools did not attend the field trips.

You might not think that attending three field trips would lead to measurable, positive outcomes for students, but it did! You can read the working paper on the social-emotional effects by lead author Angela Watson here, and the paper by lead author Heidi Holmes Erickson that addresses students’ engagement and academic outcomes here, but here are the highlights:

  • Students who were randomly selected to attend the field trips showed significantly higher levels of social-perspective taking through survey items like, “How often do you attempt to understand your friends better by trying to figure out what they are thinking?” and “When you are angry at someone, how often do you try to ‘put yourself in his or her shoes’?” (These effects reflect the limited sample of students with higher academic performance, who were more likely to be able to read and interpret the questions.)
  • Students who were randomly selected to attend the field trips showed significantly higher levels of tolerance through the survey item, “I think people can have different opinions about the same thing.”
  • Students who were randomly selected to attend field trips reported more positive school engagement. They were less likely to agree that ‘school is boring’, and they had fewer disciplinary infractions in middle school than their control group peers.
  • Female students who were randomly selected to attend field trips were less careless in their survey answering, a measure of conscientiousness. Female students in the second year of field trips demonstrated even greater levels of conscientiousness, while female students who were not included in a second year of field trips exhibited the same level of conscientiousness as female students who never attended one of the field trips.

Researchers also examined academic outcomes, hypothesizing that there would be no differences in standardized test scores between the students who attended the field trips and those who did not, as three days away from traditional classroom instruction was unlikely to affect students’ academic performance on math or reading exams one way or the other.

  • BUT: students who were randomly selected to attend field trips performed significantly better on their end-of-year standardized tests in math and English Language Arts than students in the control group.

Greene and his research team are continuing the research with another group of students, and will learn more about the students’ long-term outcomes as they observe them through middle school, high school, and beyond.

In the meantime, here at OEP we think that schools should consider the importance of field trips and arts experiences in a well-rounded education.

Making college accessible, one field trip at a time

In The View from the OEP on March 6, 2019 at 11:55 am


In the upcoming months, many high school seniors around the country will commit to attending a college or university. According to data from the Arkansas Department of Education, about 40-50% of high school graduates in Arkansas enroll in an in-state 2- or 4-year college. As students look forward to this important milestone, we thought it was a good time to step back and think about all of the decisions students have to make to reach the point of deciding which college to attend. In particular, we want to focus on when and how students first start to get a realistic picture of what it is like to be a college student, and how those early experiences relate to students’ preparation for college.

Over the past two years, we at the OEP have been working with junior high schools and middle schools in the area to give eighth grade students information about college and to provide opportunities to visit the University of Arkansas-Fayetteville campus three times to learn more about the college experience. On these visits, students toured campus, participated in a college-readiness workshop, worked with academic departments, toured a dorm, and participated in an athletic event. (Oh, and of course they were able to experience the joys of a campus dining hall). These visits complement what schools are already doing to prepare students for their futures, such as offering career readiness courses, encouraging students to job shadow a professional in an interesting field, or making connections between coursework and potential careers.

Why did we focus on a campus experience, over and above providing information about college? The college-going process is complex, opaque, and confusing. Colleges can be difficult places to navigate, from making new friends, to adjusting to new academic expectations, to finding your way around campus. By creating an opportunity for students to navigate this type of environment and to meet successful students with similar backgrounds, we hope to make college seem less intimidating and more achievable.

Why did we focus on eighth grade? We knew we wanted to focus on early exposure to college, because research suggests many students get off a “college-preparatory” track in middle school. At the same time, students in the eighth grade are about to enroll in their first high school courses and to start building a high school GPA—in other words, they’re close enough to college for the message to resonate.

Of course, since we’re always interested in measuring the different ways schools are helping students, we had to evaluate these visits. So, within each of our partner schools, we randomized participating students to one of two groups. One group received an informational packet detailing postsecondary options in the state, discussing specific actions to take throughout high school to prepare for college, and educational requirements for different types of careers. The other group received the same packet of information and participated in the three campus visits. Then, we compared students’ responses to a survey asking about their attitudes towards and knowledge about college, as well as their course-taking decisions in ninth grade. You can read the full working paper here, but here are our main takeaways:

  1. Students who participate in the visits know more about college than students who just read the information on their own—students know more about the cost of college, what characteristics colleges look for in applicants, and how to earn college credit in high school, among other topics, if they have an experience to go along with the information.
  2. Students who participate in the visits have more conversations with school personnel about college than students who just receive printed information—in these conversations, students are talking about their college readiness, ACT scores they’ll need for their dream school, and other college-related topics.
  3. Students who participate in the visits are more academically diligent than students who just receive information—students who get a taste of the demands of college are more likely to fully complete a survey task.
  4. Students who participate in the visits are more likely to enroll in advanced math, science, and social science courses in ninth grade—students who participate in the visits are more likely to take advanced Geometry, pre-AP Biology, and pre-AP Civics in ninth grade, for example, than students who just receive printed information about college.
  5. Students who participate in the visits are less likely to want to go to a technical school after high school—as students gain a more realistic picture of the demands and benefits of a postsecondary education, they may be less likely to want a technical certificate and instead be more interested in other paths post-high school.

We’ve still got a lot of questions about how we can encourage students to prepare for college, particularly as more and more jobs require some sort of postsecondary training and college graduation rates stay flat. But it seems like providing field trips to a college campus is one strategy schools can pursue to help students think about all of their options for the future. If your school is interested in organizing a campus field trip (for any grade!), please reach out to us at the OEP! We would love to help you organize a fun, informational visit for your students that affirms their potential to succeed in any path they choose.


Discussion of the Proposed Arkansas Teacher Retirement System COLA Changes and Stress Testing

In The View from the OEP on February 27, 2019 at 1:19 pm

The Arkansas legislature is considering several bills that would affect the Arkansas Teacher Retirement System (ATRS), but two in particular have drawn the attention of teachers’ groups.

Here at OEP, we think it is important to understand these bills and the implications for current and future educators. We address the bills below, starting with HB1206, which would affect COLAs, before moving to HB1173, which would require plans to perform and publicly report the results of stress testing. HB1206 was withdrawn by its author this morning, but since COLAs will continue to be part of the policy dialog, it is still worthwhile to review the proposed changes. HB1173 is currently with the Joint Committee on Public Retirement and Social Security Programs, and is expected to be considered soon.


COLA Changes

What are COLAs? Cost of living adjustments (COLAs) are annual increases to retirees’ benefit payments that are meant to keep retirees from losing purchasing power due to inflation. When a teacher retires, he/she begins receiving a monthly retirement check based on years of service and average salary over his/her last few years on the job. If this base benefit amount remained constant throughout retirement, the teacher would lose purchasing power over time due to inflation. Prices of everything from groceries to healthcare tend to increase over time, requiring more dollars to purchase the same amount of goods and services. The purpose of COLAs is to increase retirees’ benefit payments at roughly the same rate as inflation so that they get consistent value from their checks throughout their retirement.

Because COLAs keep retirees’ monthly checks from losing value over time, they are a common part of plans that provide lifetime payments (i.e., annuities), including Social Security. Many public retirement systems’ COLAs were designed for a world in which inflation is consistently around 3 percent annually. Since the mid-90s, however, inflation has been far below that level. For example, average annual price inflation (i.e., the December-to-December change in CPI-U as measured by the Bureau of Labor Statistics) for the South region since 1995 has been 2.13 percent, and over the last 10 years it has been just 1.75 percent.

When COLAs provided by public retirement systems outpace inflation, retirees’ purchasing power over time is increased, rather than just maintained. Committing resources to COLAs that exceed actual inflation drives up plan cost and leaves less money to pay down pension debt and/or maintain benefit levels for new workers.

As governments across the country have struggled to get a handle on growing pension debt and rising retirement costs, many have made changes to COLAs. Since 2009, 30 states have reduced COLAs, and increasingly, governments are taking the logical step of linking COLAs to actual inflation (see NASRA report on COLAs). Many jurisdictions have also linked the provision of COLAs to their plans’ fiscal condition (e.g., COLAs can only be provided if the plan is greater than 90 percent funded). These types of changes not only keep plan costs in check but also ensure that COLAs fulfill their intended purpose of offsetting the negative effects of inflation on retirees’ purchasing power.

What changes were being proposed?

HB1206 (withdrawn today) would have modified the annual cost of living adjustments (COLAs) that retirees receive.  Under current law, ATRS retirees receive an automatic 3 percent COLA each year that is calculated using their starting benefit amount. HB1206 would have altered this by allowing the ATRS board to choose to provide an annual COLA, but the amount would be capped at “the lesser of three percent (3%) or the percentage change in the Consumer Price Index (CPI), South Region as determined by the United States Department of Labor over the one-year period ending in the December immediately preceding the date for which the redetermined amount is being calculated.” In other words, the proposed change would have made the COLA discretionary and would have reduced the potential COLA amount in years when actual price inflation is below 3 percent.

So, why make the proposed change? While we don’t have any particular insight into Representative House’s way of thinking, the change was likely proposed because over the last 10 years the COLA specified in current Arkansas law would have significantly outpaced inflation. Figure 1 shows the value of $1,000 in hypothetical benefit payments growing at actual price inflation over the 10 years between 2009 and 2019 (black line) compared to COLAs under current Arkansas law – a simple 3 percent annually (orange line). To keep retirees’ purchasing power constant, benefit payments would have needed to increase by about 19 percent over this period, or by $190 for each $1,000 in payments. However, if COLAs had been given according to current Arkansas law, retirees’ benefit payments would have increased by 30 percent or by $300 for each $1,000 in payments. If inflation persists at its currently low level, the state would be committing more resources than necessary to maintain retirees’ purchasing power – resources that could be used to pay down pension debt and prepare the plan for the next downturn.

Figure 1: 10-Year Comparison of Different COLA Structures.


Figure 1 also includes a blue line showing how the proposed change would have performed over the past 10 years. The COLA structure proposed in HB1206 would have tracked actual price inflation much better over this period, undershooting it by just a bit. However, while the proposed COLA structure would have performed well in today’s low inflation environment, it would significantly undershoot inflation if it were to rise. Figure 2 below is analogous to Figure 1 except it uses the last 20 years of inflation data instead of 10 years. As you can see, over this longer 20-year period the proposed COLA structure would have fallen short of actual inflation by a little more than the current law would have overshot it. To keep retirees’ purchasing power constant, benefit payments would have needed to increase by about 52 percent over this period, or by $520 for each $1,000. Current law would have increased benefit payments by around 60 percent while the proposed change would have increased payments by around 40 percent.

Figure 2: 20-Year Comparison of Different COLA Structures.


The overall takeaway from these exhibits is that neither current law nor the proposed change would track actual inflation particularly closely when modeled using recent data. The current law increasingly overshoots inflation as it falls below 3 percent, and while the proposed COLA change would have done better at low levels of inflation, it would significantly undershoot inflation if it were to rise toward 3 percent.

So what should we do?

Given the low inflation of the past 10 years, it is reasonable for the Arkansas legislature to want to adjust the structure of COLAs. Why commit more resources to COLAs than is necessary to maintain retirees’ purchasing power? It is also a positive step to make COLAs more responsive to actual price fluctuations by linking them to CPI. Such a change would continue to protect retirees while also more effectively managing plan costs. However, the COLA structure that was proposed in HB1206 could potentially fall well short of inflation if it were to rise above recent levels, likely necessitating additional modifications down the road.

If the legislature’s goal is to provide retirees with reasonable inflation protection while also not over-committing resources in periods of low inflation, then they should consider COLA structures that better track CPI within some boundaries. For example, if the legislature kept the capped, CPI-linked structure of HB1206 but changed from a simple to a compound COLA (i.e., apply the percentage increase to last year’s benefit amount rather than to the initial benefit), then COLAs would track CPI much more closely (see green line in Figure 3 below).
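
To make the differences between these COLA structures concrete, here is a small sketch comparing, for $1,000 in initial annual benefits, the current 3 percent simple COLA, the capped CPI-linked simple COLA proposed in HB1206, and a capped CPI-linked compound COLA. The inflation series below is a placeholder, not the actual CPI-U South data used in the figures:

    # Compare three COLA structures for $1,000 in initial annual benefits.
    # The inflation series is a placeholder; substitute the actual December-to-December
    # CPI-U South changes to reproduce the comparisons in Figures 1-3.
    initial_benefit = 1000.0
    inflation = [0.025, 0.017, 0.008, 0.015, 0.007, 0.021, 0.019, 0.021, 0.019, 0.023]

    def simple_3pct(benefit, inflation):
        # Current law: 3% of the *initial* benefit added every year.
        payments = [benefit]
        for _ in inflation:
            payments.append(payments[-1] + 0.03 * benefit)
        return payments

    def capped_simple(benefit, inflation):
        # HB1206: lesser of 3% or the CPI change, applied to the initial benefit.
        payments = [benefit]
        for cpi in inflation:
            payments.append(payments[-1] + min(0.03, cpi) * benefit)
        return payments

    def capped_compound(benefit, inflation):
        # Alternative: lesser of 3% or the CPI change, applied to last year's benefit.
        payments = [benefit]
        for cpi in inflation:
            payments.append(payments[-1] * (1 + min(0.03, cpi)))
        return payments

    def cpi_target(benefit, inflation):
        # Benefit needed to exactly keep pace with inflation.
        payments = [benefit]
        for cpi in inflation:
            payments.append(payments[-1] * (1 + cpi))
        return payments

    for name, fn in [("current 3% simple", simple_3pct),
                     ("HB1206 capped simple", capped_simple),
                     ("capped compound", capped_compound),
                     ("inflation target", cpi_target)]:
        print(f"{name}: ${fn(initial_benefit, inflation)[-1]:,.2f}")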

Regardless of the specifics, we strongly encourage the legislature to maintain adequate inflation protection for retirees and consider linking COLAs to actual inflation so that they more flexibly adjust to changing economic circumstances. We also reiterate our recommendation that any future changes to benefits, like COLAs, or contribution rates should only be made in the context of having a clearly defined funding policy and cost-sharing plan. 

Figure 3: 20-Year Comparison of Different COLA Structures with Proposed New Design.


 


Stress Testing

Why is Stress Testing Important?

As noted in an earlier post,  stress testing, as proposed in HB1173, is vital to the prudent and sustainable management of public pensions. We are glad that ATRS agrees that stress testing is important, and of course, the plan already performs some level of stress testing, as the executive director points out in his recent legislative summary. However, on an admittedly quick scan of ATRS’s publicly available documents, we were unable to find any stress testing results that provide projections of future cost under multiple scenarios. We could certainly be missing something, but if stress testing results are not readily available to all stakeholders, including plan members, legislators, and taxpayers, then the value of the exercise is significantly diminished. ATRS has a responsibility to be transparent with and accountable to a broad set of stakeholders.

Given the huge implications for the public school workforce and large potential call on taxpayer resources, it is perfectly reasonable for the Arkansas legislature to place more specific stress testing requirements on ATRS than are included in the general guidelines of the Government Accounting Standards Board (GASB) or the Actuarial Standards of Practice (ASOP). ATRS’s complaints about added cost, frankly, ring hollow given the stakes for teachers and the state and school budgets. Based on our experience, it’s very hard to see how the requirements proposed in HB1173 would meaningfully increase cost above what the plan is already paying their actuaries to do.

Sounds good!  So what’s the problem?

To us, ATRS’s opposition to HB1173 appears to be more about fighting to maintain as much independence as possible rather than any concern about cost, etc. However, in our view, this strategy is shortsighted. Ultimately, it is the legislature and the taxpayers they represent that backstop the plan and ensure that teachers get the benefits that they were promised. Given the challenging times retirement plans face, ATRS should seek out increased productive engagement with the legislature so that legislators have greater ownership and a stronger sense of responsibility to fully fund the plan when the inevitable next recession hits.

We have also been disappointed by the overheated rhetoric of some employee groups regarding stress testing. Arguably their members face the greatest potential impacts from unexpected cost increases, a significant risk given ATRS’s higher-than-average discount rate and many experts’ investment return expectations. It seems the substantive area of disagreement is with the specific scenarios required in HB1173, which the Arkansas Education Association (AEA) calls “worst case scenarios” in the post linked above. Rather than reject the useful exercise of stress testing out of hand, we encourage the AEA to propose improvements to HB1173 that would make the scenarios acceptable, providing their members and legislators with vital information that would help them plan for whatever the future may hold.

The Takeaway

Retirement policy can be very challenging. It is technically complex, politically charged, and has many legal uncertainties. We are thrilled that Arkansas has done better than most states managing its teachers’ retirement system, but costs have still risen and public retirement plans face some significant challenges going forward. Here at OEP, we think HB1206 (now withdrawn) and HB1173 propose positive improvements that would help ATRS meet the retirement needs of former, current, and future teachers, even if the details need a little work.

What’s Driving Teachers’ Strikes

In The View from the OEP on February 20, 2019 at 10:58 am

The op-ed re-posted below was written by new OEP faculty member Josh McGee (@jbmcgee on Twitter) and appeared in Monday’s USA Today. The piece argues that school budgets are being squeezed by large numbers of new staff and rising benefits costs. Paying for all those new people and their increasingly expensive benefits is leaving less money to effectively pay teachers. The re-posted version below includes Arkansas-specific data in indented brackets.

Teachers strike for higher pay because administration and benefits take too much money


In Denver on Feb. 11, 2019. (Photo: David Zalubowski/AP)

Growing administrative staffing and rising benefits costs are squeezing U.S. public school budgets nationwide.

Not long after Los Angeles teachers returned to work following a six-day strike last month, more than 5,000 teachers in Colorado’s largest school district went on strike demanding higher pay. The Denver strike was resolved after three days, but it’s likely that this is just the beginning of teacher activism in 2019. Teachers in California, West Virginia and Virginia are gearing up to fight. As the legislative season gets rolling, teacher pay and education funding are hot topics in statehouses across the country.

Given all this it would be easy to believe, as many do, that America’s schools are starved of funding. But that argument doesn’t fully match the data. While there is variation across states, school funding has increased dramatically over the past 40 years.

According to the National Center for Education Statistics, inflation-adjusted per-pupil spending on public education has more than doubled since the 1970s.

[Arkansas’ spending per pupil has grown faster than the national trend over the last 40 years. Inflation adjusted spending per pupil has more than tripled from $3,356 in 1969-70 to $10,310 in 2015-16. However, Arkansas’ per pupil spending is still well below the national average of $12,330]

So why all the unrest? To answer that, we need to take a look at how all that new money has been spent.

The first part of the answer is that U.S. public schools have added large numbers of instructional, administrative and support staff over the past four decades. Student-teacher ratios have decreased from 22 to 1 in 1970 to about 16 to 1 today. And since 2000, the number of public school administrators has increased more than five times faster than student enrollment, a fact that has not gone unnoticed by labor leaders.

[While the NCES doesn’t report data for Arkansas going back to 1970, the state’s student-teacher ratio has decreased slightly from 14.1 in 2000 to 13.7 in 2015. Arkansas’ student-teacher ratio is below the national average likely because of the state’s large number of rural schools.

NCES also does not report a state-level time series for staffing, but in 2015, district administrators and school principals made up 3.3 percent of all Arkansas public school staff, which is slightly below the national average of 3.9 percent.]

In his letter announcing the strike, Denver Classroom Teachers Association President Henry Roman wrote, “DPS has made its choice to keep critical funding in central administration, and not to apply more of those funds to the classroom where they would provide the greatest benefit for student learning.”

As part of the deal to end the strike in Denver, the district agreed to cut 150 administrative positions and eliminate large administrator bonuses.

Teachers, who made up about 60 percent of all public school staff in 1970, now make up less than half, despite there being more than a million more teachers in today’s classrooms. More employees means that the budget pie is divided more times, leaving fewer dollars for each individual teacher’s pay.

[As with other staffing data, the NCES doesn’t report state-level data going back to 1970. However, the share of teachers among all Arkansas public school staff decreased from 50.6 percent in 2000 to 48.6 percent in 2015.]

Money is going toward paying down debt now

At the same time, rising benefits costs are squeezing school budgets nationwide. While average inflation-adjusted teacher salaries have been relatively stagnant since 1990, benefits costs have risen from 16.8 percent of expenditures in 1990 to 23 percent of today’s much larger expenditure base.

[National average inflation-adjusted teacher salaries decreased by around 2 percent between 1990 and 2017, but in Arkansas they increased by approximately 14 percent. However, Arkansas’ average teacher salary is still about $10,000 below the national average – $48,616 vs $58,950.

Arkansas’ schools spend a lower percentage of their budgets on employee benefits, 19 percent, than the national average.]

More recently, the growth of retirement costs — in particular, payments to cover unfunded benefits earned by teachers for past service — has placed pressure on school budgets. Almost every state increased teachers’ retirement benefits in the booming 1990s. But the additional promises were not accompanied by responsible funding plans. Over-funded at the turn of the millennium, by 2003, teacher pension plans were collectively short by $235 billion. By 2009, pension debt had more than doubled, to $584 billion.

The strong bull market since the Great Recession has not put a dent in the shortfall, which now totals well more than $600 billion. As a result of pension-funding shortfalls, retirement costs per pupil have more than doubled since 2004, from about $530 to more than $1,300 today.

[Arkansas has done a better job than most states managing its teachers’ retirement system (ATRS), and as a result ATRS is better funded and its costs have not risen as much as the average teachers’ plan. For a more thorough discussion of the teacher retirement system’s finances, see our previous blog post here.]

Retirement costs now exceed 10 percent of all education expenditures on average across the country. Unfortunately, the majority of these contributions do not benefit teachers in today’s classrooms because roughly 70 percent of retirement contributions are going to pay down debt rather than for new benefits.

[Retirement costs now make up around 8 percent of Arkansas education expenditures. ATRS is 79 percent funded, and approximately 57 percent of the state’s annual contribution goes to pay down pension debt.]

Growing retirement costs for these legacy-benefit promises make it challenging for many school districts to maintain their current level of services, much less hire new teachers and support staff or give high-quality teachers a pay raise.

What protected teachers once is a danger now

Rising benefits costs are a big part of the L.A. school district’s budget woes, limiting funds available to meet the demands of the teachers’ union for more pay and support staff. That helps explain why teachers there settled for little more than was on the bargaining table when they chose to strike.

While the L.A. district has a $1.8 billion budget surplus today, it’s burning through it at an alarming rate, and rising benefits costs are much to blame. By the 2031-32 school year, the district expects to spend more than 50 percent of its budget on health care and pensions.

Even in Texas, considered by many to be a bastion of fiscal conservatism, pension debt has ballooned. The state and school districts now owe the Teacher Retirement System $46.2 billion in benefits that teachers have already earned, a total that is roughly equivalent to all of the state’s other debt combined.

[At the end of the 2018 fiscal year, Arkansas’ total long-term debt payable for bonds, capital leases, and notes was $3.9 billion. The state and its school districts owe ATRS $4.2 billion for benefits that teachers have already earned, a total that is slightly higher than all of the state’s other debt combined.]

To be sure, there is no immediate national “crisis,” insofar as most teacher pension plans are not on the brink of failure. But it’s clear that the rising cost of benefits that were meant to protect teachers is now endangering teacher pay and larger school funding in a way that was never anticipated. Indeed, school districts will likely be seeing red for some time — both at rallies and in their budgets.

Josh B. McGee, a senior fellow at the Manhattan Institute, is a research assistant professor in the Department of Education Reform at the University of Arkansas. Follow him on Twitter: @jbmcgee

Who’s Using Act 173?

In The View from the OEP on February 13, 2019 at 11:30 am

Today we look into who is enrolling in public schools under Act 173, which allows home school and private school students to enroll in their local school district. School districts are reimbursed by the state for one-sixth of the foundation funding amount per course in which the student enrolls (about $1,100 in 2017-18).  The Act was passed two years ago, and permits, but does not require, school districts to participate in the program. So we got to wondering, who is using Act 173 to enroll in public schools? We looked into it, and share our findings below.  You can read more in the associated policy brief.
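
As a quick illustration of the reimbursement arithmetic, the sketch below uses an approximate 2017-18 foundation funding amount; the exact figure is set in the state’s funding matrix, and the number here is ours for illustration only:

    # Act 173 reimbursement: districts receive one-sixth of the per-student
    # foundation funding amount for each course an eligible student takes.
    # The foundation amount below is an approximate 2017-18 figure, for illustration only.
    foundation_funding_2017_18 = 6713   # approximate per-student foundation amount ($)
    per_course = foundation_funding_2017_18 / 6
    courses_taken = 2                   # hypothetical student enrolling in two courses
    print(f"Per course: ${per_course:,.0f}; for {courses_taken} courses: ${per_course * courses_taken:,.0f}")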

How many students are enrolling in districts under Act 173?

Using data from 2017-18, the first year in which home school and private school students were eligible to enroll under the Act, we found that only 95 students enrolled in at least one course in their local district under Act 173.  This amounts to 0.02% of the public school population.

Are students who enroll under the Act demographically different from the public school population as a whole?

Figure 1. Demographic Differences Between Act 173 and District Public School Students, 2017-18


We found that a greater proportion of Act 173 students were White and a smaller proportion of them were Black or Hispanic compared to regularly-enrolled district students. While 86.3% of all Act 173 district students were White, only 55.9% of all regularly-enrolled Arkansas public school students were White. In contrast, only 5.3% and 3.2% of Act 173 students were Black or Hispanic respectively, while 25.5% and 13.1% of the overall Arkansas public school population reported those racial identities.

Interestingly, Act 173 students were significantly more likely to report having a disability than public school students as a whole. Overall, 35.8% of all Act 173 district students reported having a disability compared to just 13.7% of regularly-enrolled district students, a difference of 22 percentage points. Of the Act 173 students reporting a disability, the vast majority (30/34) reported having a speech or language impairment.

No students identified as Limited English Proficient (LEP) enrolled in district schools through Act 173, compared to 8.5% of regularly-enrolled district students that are identified as LEP.

Are Act 173 students concentrated in particular grade levels?

Yes!  Act 173 students disproportionately enrolled in high schools. Forty-four students used Act 173 to enroll in a total of 23 district high schools in the 2017-18 school year. Twenty-five Act 173 students enrolled in either middle school or junior high, and 26 used the program to enroll in elementary schools. Middle/junior high students enrolled in 15 different schools, whereas elementary students enrolled in only five schools, with the vast majority (21) enrolling in Baseline Elementary in Little Rock.

Figure 2. Number of Act 173 Schools and Students, by Level, 2017-18


Are certain geographic regions more likely to enroll students under Act 173?

Actually, as a share of the public school population, Act 173 students are fairly evenly distributed across the regions of the state. Most Act 173 students enrolled in schools in either Central Arkansas (34 students, or 35.8% of all Act 173 students) or Northwest Arkansas (31 students, or 32.6%). By district, the greatest number of Act 173 students enrolled in a school in Little Rock (25 students, or 26.3%). Central and Northwest Arkansas are the largest education regions by total number of public school students, with 142,932 and 172,634 students, respectively.

Figure 3. Number and Share of Act 173 Students, by Region, 2017-18


What type of districts enroll Act 173 students?

Given that Act 173 benefits districts by allowing them to serve more families in the community while also increasing district resources, we found it interesting that only 35 of Arkansas’ 227 traditional school districts enrolled any students under Act 173 during the 2017-18 school year. According to the latest data available, there are over 24,000 students enrolled in private schools in Arkansas and over 20,000 students being home schooled in the state. That is over 44,000 students that districts could be serving under Act 173!

District size was an important factor in predicting Act 173 participation. A district with enrollment one standard deviation above the mean (that is, roughly 4,700 students) was approximately three percentage points more likely to have students enrolled using Act 173 relative to a district at mean enrollment. Larger districts are generally in areas with a larger number of private and home schooled students who might benefit from Act 173.  These larger districts also offer more distinct courses that might attract such students.  However, only three of Arkansas’ ten largest school districts (Little Rock, Pulaski, and Fayetteville) reported enrolling any Act 173 students during the 2017-18 school year.
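
One simple way to produce that kind of estimate is a linear probability model of district participation on standardized enrollment. The sketch below is illustrative only; the file and column names are hypothetical, and we are not claiming this is the exact specification behind the figure above.

```python
# Hypothetical sketch: linear probability model of Act 173 participation
# on standardized district enrollment. File and column names are illustrative.
import pandas as pd
import statsmodels.formula.api as smf

districts = pd.read_csv("district_data_2017_18.csv")  # hypothetical data file

# 1 if the district enrolled any Act 173 students, 0 otherwise
districts["participates"] = (districts["act173_students"] > 0).astype(int)

# Standardize enrollment so the coefficient reads "per one standard deviation"
districts["enroll_z"] = (
    districts["enrollment"] - districts["enrollment"].mean()
) / districts["enrollment"].std()

lpm = smf.ols("participates ~ enroll_z", data=districts).fit()
print(lpm.params["enroll_z"])  # ~0.03 would correspond to a 3-percentage-point difference
```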

Table 1: Ten Largest School Districts with no Student Enrollment Under Act 173, 2017-18


Here at OEP, we think Act 173 is a great opportunity for private and home school students to gain value from attending a public school, and enrolling these students enhances the districts through additional revenue and broader community engagement.

Recommendations!

Although fewer than 100 students enrolled in public schools under Act 173 in its first year, we anticipate increased participation in the years to come and offer some suggestions that could boost participation.

1. Promote Act 173 Enrollment. The Act is designed to benefit students, by providing them access to more courses, and districts, by allowing them to serve additional students in their community and receive more resources. However, only a small number of students used the Act to enroll in courses in their zoned district in the first year. The modest initial enrollment in the program could be because families lack awareness of this opportunity, because demand for district courses by home and private schooled students is modest, or because districts have not elected to participate in Act 173.

Education officials in Arkansas should encourage districts to be more proactive in promoting these opportunities to private and home school students living in their district. They should also encourage districts to announce on their websites whether or not they are seeking to serve more children in their community through Act 173.

2. Highlight Available Resources. Demand for Act 173 enrollment is particularly strong among students with disabilities. Districts should highlight the resources they have available and the services they offer to support such students.

3. Provide Supplemental Funding. The students with disabilities making use of Act 173 tend to be more resource-intensive to educate than the average district student. As a result, Arkansas education officials should explore ways to provide supplemental funding to districts enrolling such students to offset potential challenges. Changing special education funds so that they are tied to specific students, instead of wrapping general funding into the matrix, is something that we have addressed previously, and we feel it is a more equitable way of providing resources to students who need them the most.

4. Expand Student Choices. Act 173 does not allow students to enroll in a school outside of the district in which they live. Additional students could benefit from the Act if nearby districts offered more attractive courses and students were able to enroll in them through a combination of Act 173 and the Public School Choice program. Better alignment between those two “consumer choice” initiatives would expand opportunities for students and districts while also providing state officials with a demand-driven measure of district school quality.

 

We look forward to bringing you more information about Act 173 enrollment as the data become available. Stay tuned!

Thoughts on Arkansas’ Teacher Retirement System

In The View from the OEP on February 6, 2019 at 11:38 am

This week, we are pleased to announce that Dr. Josh McGee has joined the team at OEP! McGee’s policy and research expertise will enhance OEP’s capacity to help policy makers and education leaders make evidence-informed decisions to improve Arkansas’ public education system.  Today, Dr. McGee shares his thoughts on Arkansas’ Teacher Retirement System.


Over the past two decades, teacher retirement benefits have been a significant topic of conversation in statehouses across the country. For a number of reasons, including longer lifespans and lower-than-expected investment returns, teachers’ retirement benefits in Arkansas and nationally are turning out to be more expensive than policy makers had expected. School districts and state governments have not been putting aside enough each year to fully cover the cost of the benefits their teachers have earned, and as a result, unfunded liabilities, or pension debt, have grown dramatically, as has the cost of paying down this debt.

EDRE’s own Robert Costrell has an excellent graph illustrating the rising cost of teachers’ pensions. On average in the U.S., the cost of retirement benefits per pupil has grown by nearly two and a half times since 2004, from $530 to $1,312 today. Teacher retirement costs now make up more than 10% of all education expenditures, and because retirement costs have increased faster than education budgets, in many places they are crowding out schools’ ability to increase pay, purchase supplies, adequately maintain buildings, etc. (see reports here and here). In response to rising retirement costs, nearly every state has reduced teachers’ benefits and/or increased teachers’ contributions. The majority of states’ and districts’ annual contributions, around 70 cents out of every dollar contributed, now goes to pay down pension debt rather than to pay for new benefits earned by today’s teachers.

The good news is that while Arkansas’ teacher retirement system (ATRS) has faced challenges similar to those facing other public pension plans, it is in better financial shape than the average public plan, and as a result, its costs have not grown nearly as steeply. Below are graphs depicting ATRS’s funding and cost per pupil. As presented in Figure 1, at the end of FY2017, the latest year for which data are available, ATRS was 79% funded with $4.2 billion in pension debt, which is better than the national average of 72% funded for public pension plans.  Although the annual employer cost of Arkansas’ teachers’ retirement benefits has risen by $242 per pupil since 2001, Figure 2 illustrates that it is still below the national per-pupil average in both dollar terms ($822 vs. $1,312 per pupil) and as a percentage of education expenditures (7.9% vs. 10.7%).
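
To see how the funded ratio and the pension debt fit together, here is a quick back-of-the-envelope calculation. The implied asset and liability totals are our own arithmetic from the two figures cited above, not ATRS-reported values.

```python
# Back-of-the-envelope: total liabilities and assets implied by a 79% funded
# ratio and $4.2 billion in pension debt (debt = liabilities - assets,
# funded ratio = assets / liabilities). Figures in billions of dollars.

funded_ratio = 0.79
pension_debt = 4.2

liabilities = pension_debt / (1 - funded_ratio)   # ~ $20.0 billion
assets = liabilities * funded_ratio               # ~ $15.8 billion

print(f"Implied liabilities: ${liabilities:.1f}B; implied assets: ${assets:.1f}B")
```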

Figure 1: Arkansas Teacher Retirement System Liabilities, Assets, and Debt, 2001-2017.


Figure 2: Employer Contributions per Pupil, US and Arkansas, 2001-2017 (Projected through 2023). Graph reposted from Robert Costrell’s testimony before the Arkansas Legislature’s Joint Committee on Retirement on September 11, 2018.


The fact that ATRS has remained in relatively good shape over the past two decades is a testament to the proactive, responsible steps that policymakers, working together with ATRS, have taken to keep costs in check while also ensuring a meaningful and secure benefit for the state’s teachers. That said, there are still significant risks on the horizon that the state would do well to understand and work to mitigate. Below is a brief discussion of three of the biggest challenges facing ATRS.

First, despite a nearly decade-long bull market since the Great Recession, Arkansas has made limited progress in paying down its pension debt. This is at least partially due to the backloaded repayment schedule (a.k.a. amortization), which is based on the expectation that the payments into the plan will grow by 2.75% annually. Because of this backloading, current contributions are not large enough to cover the interest on the pension debt, so under current funding policy, the debt is expected to grow for the next 10 years before finally declining until it is fully paid off in 29 years. This is akin to paying the minimum on a credit card – yes, you will eventually pay it off, but you’ll end up paying a whole lot more than the original amount and will have less financial resilience over a longer period of time. ATRS has acknowledged the value of accelerating the pension debt repayment schedule to avoid negative amortization, and we strongly recommend that the state consider doing so.
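
To make the credit-card analogy concrete, below is a stylized sketch of a level-percent-of-payroll amortization schedule. The starting debt ($4.2 billion), assumed return (7.5%), payment growth (2.75%), and 29-year horizon mirror the figures in this post, but the payment stream is derived for illustration and is not ATRS's actual schedule.

```python
# Stylized level-percent-of-payroll amortization of a $4.2B pension debt.
# Payments grow 2.75% per year; the debt accrues interest at the 7.5% assumed
# return. Early payments fall short of interest, so the debt grows before it shrinks.

debt0, rate, growth, years = 4.2, 0.075, 0.0275, 29   # $B, annual rates, horizon

# First-year payment that retires the debt in `years` payments growing at
# `growth` per year (present value of a growing annuity).
factor = (1 - ((1 + growth) / (1 + rate)) ** years) / (rate - growth)
payment = debt0 / factor

debt = debt0
for year in range(1, years + 1):
    interest = debt * rate
    debt = debt + interest - payment   # negative amortization whenever payment < interest
    if year in (1, 5, 10, 15, 20, 29):
        print(f"Year {year:2d}: payment ${payment:.2f}B, remaining debt ${debt:.2f}B")
    payment *= 1 + growth              # next year's payment is 2.75% larger
```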

Second, public workers are living longer than public pension plans currently expect, and this is especially true of teachers. That’s what the Society of Actuaries (SOA) found as it worked to update mortality tables for public employees (see news article here and SOA report here). While it’s really awesome that teachers are enjoying longer lives, the cost of retirement benefits is going to go up significantly if/when ATRS updates its mortality assumptions in the next few years. The state and ATRS should formally study changes in public employee mortality based on the SOA’s findings and plan experience, and they should aggressively update the plan’s mortality assumptions to ensure the state has the most accurate picture of future benefit costs.

Third, the state and ATRS are betting on a 7.5% investment return to finance a huge share of teachers’ retirement benefits. While ATRS recently lowered its return assumption, it is still higher than the national average of 7.4% and much higher than that of its most sophisticated peers, like the teachers’ retirement systems in New York City and California, both of which have lowered their assumed return to 7%. The assumed return is important because it is the key ingredient used to estimate how much money the state and districts need to set aside today to fully cover the cost of the benefits owed to teachers when they retire. Using a higher expected return means budgetary costs will be lower today; however, it also means making a bigger bet on the market to cover a larger share of benefit costs over time, and as a result, it significantly increases the risk that contributions will need to rise in the future to make up for investment returns that didn’t materialize. Investment returns falling short of expectations have been the single largest contributor to the current pension debt, and returns will continue to be a big driver of teachers’ benefit costs. To provide a sense of scale, ATRS estimates that if the assumed return were lowered by 1 percentage point to 6.5%, which is roughly in line with the recommendations of the Society of Actuaries Blue Ribbon Panel on Public Pension Funding (SOA BRP), then the pension debt would increase by more than $2.5 billion, or 60 percent. Given that we are likely headed into a period of lower investment returns and the next recession is lurking somewhere in the not-too-distant future, sticking with a high assumed return places future state and school budgets at significant risk, not to mention teachers’ retirements. The state and ATRS should work together to remove some funding risk by developing a plan to lower the assumed return and increase contributions over time, bringing the assumed return in line with the SOA BRP recommendations.
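
To give a feel for why the assumed return matters so much, here is a toy present-value calculation for a single future benefit payment. The payment amount and horizon are hypothetical; only the mechanics, which drive estimates like the $2.5 billion figure above, are the point.

```python
# Toy example: money that must be set aside today to cover a single $100,000
# benefit payment due in 25 years, discounted at 7.5% versus 6.5%.
# The cash flow and horizon are hypothetical.

benefit = 100_000
years = 25

def present_value(rate: float) -> float:
    return benefit / (1 + rate) ** years

pv_high = present_value(0.075)   # roughly $16,400
pv_low = present_value(0.065)    # roughly $20,700

print(f"At 7.5%: ${pv_high:,.0f}; at 6.5%: ${pv_low:,.0f} "
      f"({pv_low / pv_high - 1:.0%} more must be funded up front)")
```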

These three risks are not insurmountable, and Arkansas is certainly not anywhere close to a crisis that requires drastic action. It is very important, however, that the state be vigilant and seek to address potential issues well before they become larger problems. Like any system that relies on the power of compounding (i.e., exponential growth), problems with ATRS’s funding can get out of hand quickly if allowed to fester. This is why stress testing, as proposed in HB1173, and having a formal cost-sharing plan developed in advance are so important. Not performing routine stress testing is like driving without headlights – you may survive, but the potential for unexpected disaster is huge. We recommend that the state adopt stress testing requirements for all of its pension plans, including ATRS, so that policymakers better understand the risks they face down the road and can make plans to navigate through them effectively.

In addition, the importance of planning ahead cannot be overstated. Once a pension plan gets into funding trouble, without an established plan to address the problem, de facto cost-sharing will ultimately occur through ad-hoc changes that are almost guaranteed to disproportionately affect certain groups of employees (e.g., new teachers or retirees) and/or taxpayers (e.g., future vs. current). In contrast, a formal cost-sharing plan can distribute unexpected cost increases between taxpayers and employees in a predetermined, fair, and transparent manner. We recommend the state work with its pension plans to more clearly define its funding goals (here is an example from Texas) and the steps that would be taken should a plan experience unexpected cost increases. Additionally, we recommend that any future changes to benefits, like COLAs, or to contribution rates be made only in the context of a clearly defined funding policy and cost-sharing plan.

About Josh:


McGee most recently served as the Executive Vice President of Results-Driven Government at the Laura and John Arnold Foundation where he worked on a diverse set of issues ranging from retirement policy to how we address the national opioid epidemic. McGee is a Senior Fellow at the Manhattan Institute and is Chairman of the Texas Pension Review Board. McGee also serves on the boards of several nonprofits including MDRC, EdBuild, and the Equable Institute.

Class Size and Student Academic Growth

In The View from the OEP on January 30, 2019 at 11:36 am

Over the past two weeks, we have been examining relationships between teacher salary and student outcomes.  We first discussed the proposed increase to Arkansas’ minimum teacher salary, including identifying which districts currently pay the minimum salary scale.  We found little relationship between districts’ starting teacher salaries and either student academic achievement or academic growth.  Last week, we explored the relationship between teacher salaries and average class size, and this week we wanted to close the loop by examining how class size is related to student academic growth here in Arkansas. You can play with these data on our interactive viz.

Small classes are popular with parents and teachers alike. In a smaller class, we imagine that each student would get more personalized attention from the teacher, leading to greater academic gains. In Arkansas (and elsewhere around the world), however, small classes don’t seem to lead to consistently positive outcomes for students.

Check out the figures below to see the relationship between school average class size and school average academic growth. Because we already know that average class size in Arkansas varies by school level (elementary schools have the largest classes, while high schools have the smallest), we broke the visuals out by school level.


In Arkansas elementary schools, average class sizes ranged from 10 to 25 students, and there was a weak positive correlation between class size and academic growth (r = 0.2).  Further analysis revealed that students in schools with larger average class sizes actually demonstrated greater academic growth than their peers in smaller classes! The differences were statistically significant in each of the past two years (the only years for which ESSA growth data are available).

Figure 1. Average class size and average content growth score, elementary schools, 2017-18


In Arkansas middle schools, average class sizes ranged from 5 to 25 students, and there was essentially no relationship between average class size and student growth. Average class size was not a statistically significant predictor of student growth, and results were consistent over the past two years.

Figure 2. Average class size and average content growth score, middle level schools, 2017-18


In Arkansas high schools, average class sizes ranged from 5 to 20 students, and there was essentially no relationship between average class size and student growth. Average class size was not a statistically significant predictor of student growth, and results were consistent over the past two years.

Figure 3. Average class size and average content growth score, high schools, 2017-18


Then we got to wondering: if a school decreased (or increased) class size, how would that relate to changes in academic growth? Our theory was that if a school reduced its average class size from one year to the next, the average growth of students in that school would increase.

So we calculated the change in school average class size from 2016-17 to 2017-18, and the change in school growth score in that same time, plotted the results, and ran some regressions!
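
For the curious, the calculation itself is straightforward. A minimal sketch in Python, with hypothetical file and column names, looks something like this:

```python
# Hypothetical sketch of the change-on-change analysis: merge two years of
# school-level data, compute year-over-year differences, and run a regression.
import pandas as pd
import statsmodels.formula.api as smf

y17 = pd.read_csv("schools_2016_17.csv")   # hypothetical: school_id, class_size, growth, frl_pct
y18 = pd.read_csv("schools_2017_18.csv")

merged = y17.merge(y18, on="school_id", suffixes=("_17", "_18"))
merged["class_size_change"] = merged["class_size_18"] - merged["class_size_17"]
merged["growth_change"] = merged["growth_18"] - merged["growth_17"]

# Change in growth regressed on change in class size, controlling for % FRL
model = smf.ols("growth_change ~ class_size_change + frl_pct_18", data=merged).fit()
print(model.summary())
```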

Figure 4 shows the change in average class size and in student academic growth score from 2017 to 2018 for elementary schools.  The green square indicates the quadrant where we would expect schools to show up: reduced class sizes and increased growth.  There are some schools there, but there are also some schools in the red square, indicating reduced class size and decreased growth.  You will notice, however, that the majority of schools show up on the left side of the chart, indicating that they increased average class size from one year to the next.

Figure 4. Change in average class size and average content growth score, elementary schools, 2016-17 to 2017-18


Just eyeballing these elementary schools, you can see that some experienced an increase in student academic growth, while others demonstrated declines in academic growth from one year to the next.  Further analysis, however, revealed that increasing average class size by one student was associated with a reduction of less than 1 point in the change in growth score (holding school % FRL constant)!

Figures 5 and 6 show the change in average class size and student academic growth score from 2017 to 2018 for middle and high schools, respectively.  Like the elementary schools, the majority of these schools increased average class sizes, and statistical analysis showed that increasing average class size by one student was associated with a reduction of less than half a point in the change in growth score (holding school % FRL constant)!

Figure 5. Change in average class size and average content growth score, middle level schools, 2016-17 to 2017-18


Figure 6. Change in average class size and average content growth score, high schools, 2016-17 to 2017-18


So, what we have learned about average class size is that in Arkansas schools there is not a direct relationship between smaller classes and greater academic growth.  When class sizes increase, the negative relationship with year-to-year changes in growth is statistically significant but practically insignificant at all school levels because the change is extremely small.

Of course, this is not a causal analysis, and there are many variables that we are not controlling for. One major limitation is that we are using school-level class sizes and growth scores. If the data were available, analyzing at the classroom level would be better, but it would also produce very small sample sizes, which raises other concerns. On the plus side, we do know that academic growth isn’t correlated with the typical confounding variables like %FRL and school size.

After the research presented in our past three blogs, we now know that larger class sizes are associated with higher teacher salaries and do not appear to meaningfully affect student growth, so school districts should carefully consider their staffing patterns.

So what IS associated with increased student achievement?  You know what we think: high-quality instruction all day, every day.

 


Regression Details

Elementary:

2016-17:  School average class size significantly predicted school-level academic growth even after controlling for school % FRL, b = .25, t(518) = 4.75, p < .001.  The model explained a significant proportion of the variance in growth scores, R2 = .12, F(2, 518) = 63.24, p < .001.

2017-18:  School average class size significantly predicted school-level academic growth even after controlling for school % FRL, b = .22, t(518) = 3.77, p < .001.  The model explained a significant proportion of the variance in growth scores, R2 = .09, F(2, 518) = 27.83, p < .001.

Change: The change in school average class size from 2016-17 to 2017-18 significantly predicted the change in school-level academic growth even after controlling for school % FRL, b = -.30, t(518) = -4.15, p < .001.  The model explained a significant proportion of the variance in growth score changes, R2 = .03, F(2, 518) = 8.63, p < .001.

Middle:

2016-17: School average class size did not significantly predict school-level academic growth.

2017-18:  School average class size did not significantly predict school-level academic growth.

Change: The change in school average class size from 2016-17 to 2017-18 significantly predicted the change in school-level academic growth even after controlling for school % FRL, b = -.18, t(196) = -2.40, p < .05.  The model explained a significant proportion of the variance in growth score changes, R2 = .03, F(2, 196) = 3.52, p < .05.

High:

2016-17:  School average class size did not significantly predict school-level academic growth.

2017-18:  School average class size did not significantly predict school-level academic growth.

Change: The change in school average class size from 2016-17 to 2017-18 significantly predicted the change in school-level academic growth even after controlling for school % FRL, b = -.19, t(292) = -2.29, p < .05.  The model explained a significant proportion of the variance in growth score changes, R2 = .02, F(2, 292) = 3.242, p < .05.
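
For transparency, the models above are simple school-level OLS regressions of average growth on average class size, controlling for % FRL, run separately by school level. A sketch of that specification (with hypothetical file and column names) follows.

```python
# Sketch of the level regressions reported above: school average growth on
# average class size, controlling for % FRL, by school level.
# File and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

schools = pd.read_csv("schools_2017_18.csv")  # hypothetical: level, class_size, growth, frl_pct

for level in ["Elementary", "Middle", "High"]:
    subset = schools[schools["level"] == level]
    fit = smf.ols("growth ~ class_size + frl_pct", data=subset).fit()
    print(f"{level}: b = {fit.params['class_size']:.2f}, "
          f"p = {fit.pvalues['class_size']:.3f}, R2 = {fit.rsquared:.2f}")
```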