This post is part of our ongoing analysis of 2024 education legislation. See our bill tracker here for all analyses.

Mississippi First has spent the last year closely researching the impact of MAEP on districts throughout the state in order to determine how best to direct additional resources to students with the greatest needs. Having built up an intimate knowledge of the intricacies of MAEP—as well as alternative funding models—we were surprised to see the Parents’ Campaign share an analysis this past weekend (and then another update on Monday, April 1) purporting to demonstrate that “MAEP Outperforms INSPIRE in Every School District.” We were skeptical as we have analyzed both the House and Senate numbers extensively since their release by the Legislative Budget Office, but, as policy wonks, we approached the Parents’ Campaign’s numbers as we would any of the other research we evaluate on a daily basis—with critical thinking, fairness, and a very close eye for detail.

Having now reviewed the Parents’ Campaign’s work, we conclude that their analysis contains a number of flaws that refute its top-line findings and undermine the credibility of anyone willing to publish them. The most basic problem is that Mississippi school finance estimates beyond FY26 are nearly impossible to make with any validity because of the number of assumptions required. The Parents’ Campaign not only fails to defend its assumptions; it does not even list or explain them. What, exactly, is the Parents’ Campaign trying to hide by not providing any explanation?

Credible financial projections of any kind require not only a description of the sources used for baseline numbers but also a detailed methodology because they almost always rely on assumptions. The validity of these baseline numbers and assumptions is at the heart of the trustworthiness of any projection. Without a methodology provided by the Parents’ Campaign, we had to spend considerable time figuring out what their assumptions were before we could evaluate them. That process revealed methodological and analytical issues that would not pass muster at any research institution or organization that adheres to even basic methodological standards. These issues include:

  • Presenting long-term estimates for school funding models that require future-year recalculation based on future-year expenditures, which are themselves subject to both future state and local spending increases;
  • Assuming that district allocations under MAEP will increase by nearly 25% by FY35 while simultaneously assuming only minimal increases for INSPIRE, with no given justification;
  • Initially using inaccurate FY25 projections for district allocations under the INSPIRE formula that undercount projected spending by tens of millions of dollars, although the correct numbers had been available for weeks;
  • Failing to properly reflect that INSPIRE includes a three-year “hold-harmless” provision for all districts, including those affected by the 2002 hold-harmless guarantee;
  • Improperly adding to the Senate proposal a perpetual “hold-harmless” provision for the 2002 hold-harmless districts;
  • Assuming that student enrollment, attendance, demographics, and course enrollment remain unchanged until 2035 in every district;
  • Assuming that assessed value (and therefore local contribution) neither grows nor shrinks at any point until 2035;
  • Assuming a constant inflation rate for each year; 
  • Using a flawed measure of poverty to claim equity under MAEP.

In this analysis, we will explain first why long-term projections for MAEP, the original Senate proposal, or INSPIRE are nearly impossible to make. We will then expand on why the assumptions that the Parents’ Campaign has used are faulty, including the most basic and obvious flaw: initially using incorrect numbers for INSPIRE in “Year 1.”

Inexplicable Long-Term Projections Under MAEP and the Senate MAEP Plan

On its face, the Parents’ Campaign analysis projects costs of MAEP, the original Senate proposal, and INSPIRE over an 11-year time period, with estimates in FY25, FY27, FY31, and FY35. Over that time period, the Parents’ Campaign claims that state MAEP spending under the Senate proposal will increase from $2.94 billion in FY25 to $3.66 billion in FY35, a 24.4% increase. They provide no basis for this projection.

Every four years, the MAEP “base student cost,” which determines the lion’s share of MAEP funding, is recalculated based on actual state and local expenditures at districts that are deemed by the State Board of Education to be “successful” and “efficient.” The next recalculation will occur in the 2026 legislative session for FY27, then in FY31 and FY35 thereafter. Under the Senate proposal to revise MAEP, in the intervening years (meaning FY25 and FY26, then FY28, etc.), the base student cost would grow at a rate equal to 25% of the base cost multiplied by the average annual inflation rate over the past twenty years. Currently, the base student cost grows at a rate equal to 40% of the base cost multiplied by the current inflation rate. As a result, MAEP would increase at a slower rate in non-recalculation years under the Senate proposal. (The Senate proposal does not change the process of calculating the base cost or allocating dollars to districts, but it does increase the local share from 27% of the operational costs of the formula to 29.5%, or 28 mills, whichever is less.)
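The two growth rules for non-recalculation years can be sketched as a quick calculation. The base cost and both inflation rates below are hypothetical placeholders, not official figures; only the 40%/25% multipliers come from the proposals described above.

```python
# Sketch of annual base student cost growth in non-recalculation years under
# current MAEP law versus the Senate proposal. Dollar and inflation figures
# are hypothetical illustrations.

def growth_current_law(base_cost: float, current_inflation: float) -> float:
    # Current law: growth equals 40% of the base cost times current inflation.
    return base_cost * 0.40 * current_inflation

def growth_senate_plan(base_cost: float, avg_inflation_20yr: float) -> float:
    # Senate proposal: growth equals 25% of the base cost times the
    # twenty-year average annual inflation rate.
    return base_cost * 0.25 * avg_inflation_20yr

base = 6_700.00         # hypothetical per-student base cost
inflation_now = 0.034   # hypothetical current inflation rate
inflation_avg = 0.025   # hypothetical 20-year average inflation rate

print(f"Current law:     ${growth_current_law(base, inflation_now):,.2f} per student")
print(f"Senate proposal: ${growth_senate_plan(base, inflation_avg):,.2f} per student")
```

Under almost any plausible pairing of current and twenty-year-average inflation, the smaller multiplier yields slower growth, which is the point made above.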

For FY25, we can make projections based on actual data for both MAEP and the Senate MAEP plan, as all the required information is now available for the FY25 fiscal year. Beginning next year (FY26), however, we have to start making assumptions about such factors as student attendance; student poverty; district average salary levels for teachers of special education, CTE, and gifted; annual inflation or its twenty-year annual average; and assessed value (for local contribution), not only at the state level but at the district level for every district. The sheer number of assumptions alone presents challenges to precision, and reasonable people could disagree about whether any one of those many assumptions is fair. By FY27, however, we would no longer be debating the reasonableness of these many assumptions, because we would first have to agree on a much bigger variable: how the base student cost of MAEP might behave in a recalculation.

The exactitude of the base student cost calculation in MAEP (the reason that some people call it “objective”) is precisely the reason that it is difficult to predict in the near term, let alone the long term. Although the base cost formula is known, the data needed to run it is not, and cannot be, known yet, whether for FY27 or for years as many as eleven away. By statute, MDE must pick districts based on recent “success” and “efficiency,” so we cannot predict which districts will be included in these calculations before the actual recalculation year. Changes in the districts selected could have large effects on the base cost because both state and local money are included in the expenditure data, and both could increase or decrease differentially between now and recalculation for the yet-to-be-chosen school districts. The recent trend of improved district grades may also create a much larger pool of “C” districts for MDE to choose among than in the past, increasing the number of possible combinations of districts. Even trend data from previous recalculation years is unreliable: the particular time frame at issue matters far more to a recalculation than any previous trend, since the actual appropriation decisions of state and local policymakers vary so widely year to year and over time.

The operational costs of MAEP (the part of the formula driven by the base student cost) are the bulk of the formula, so this generates large uncertainty in estimates even before we begin picking assumptions for nearly every other factor described in the previous paragraph. Making projections twice more for recalculation years up to eleven years away, knowing these problems, only compounds the uncertainty of any estimates. In short, without a team of demographers, school finance experts, and time travelers, any MAEP estimates for FY27 and beyond are simply wild speculation.

And all of this says nothing about the fact that, regardless of estimates, if the legislature does not fully fund either MAEP or the Senate MAEP plan in every year projected, all such future estimates are completely meaningless.

Inexplicable Long-Term Projections Under INSPIRE

Long-term projections under INSPIRE are similarly difficult to make. Since INSPIRE restarts the clock on recalculation, INSPIRE’s first recalculation year will be FY29, which will happen during the 2028 legislative session. Between now and then, projecting numbers for INSPIRE requires assumptions about all the elements that go into calculating it: student enrollment (and the resulting sparsity); student demographics (poverty, concentrated poverty, English learners, and disability status); CTE enrollment; and local assessed value. INSPIRE also includes both an inflation component beginning in FY28 and a three-year, phased-out hold-harmless provision for all districts before then, so we have to make assumptions about these as well. It is easy to understand why any estimates beyond FY25 become more and more uncertain.

Regardless, we again run into the recalculation problem in FY29, as the “base student amount” under INSPIRE is also subject to two recalculations during the timeframe of the Parents’ Campaign analysis, albeit by a working group of superintendents rather than by a statutory formula. There is no guarantee that the legislature will appropriate according to the advised recalculation under INSPIRE, but the legislature also does not appropriate according to the recalculation under MAEP, even though statute requires it to do so. Fairness requires that if one assumes the legislature will appropriate according to a future, guesstimated MAEP base student cost calculation, one must also assume the legislature will behave the same way for a future INSPIRE base student cost offered by the educator taskforce. Fairness also requires using the same guesstimated future base cost number (adjusted for attendance versus enrollment) for both MAEP and INSPIRE and then calculating full district allocations accordingly, as there is no reason to believe an educator taskforce composed of superintendents would purposefully lowball their own schools. The fact that the Parents’ Campaign does not do this shows that their intention is not to compare apples to apples, but to compare apples to orangutans and hope that the public doesn’t notice.

Other Serious Problems with the Parents’ Campaign Analysis

In addition to the basic flaw of purporting to know the future without providing any justification as to how, the Parents’ Campaign also makes a series of obvious errors in their methodology.

Inaccurate FY25 Projections for District Allocations Under INSPIRE
Funding projections for “Year 1” (i.e., FY25) are the easiest to model, and they also anchor any long-term projections (though, as we have explained, long-term projections are exceedingly difficult to model with any validity). However, the Parents’ Campaign analysis initially used inaccurate FY25 projections for district allocations under INSPIRE, underestimating statewide FY25 INSPIRE funding by roughly $35 million.

For example, according to the most recent projections from LBO, Jackson Public Schools would be projected to receive an allocation of $136,230,569 under INSPIRE in FY25. The Parents’ Campaign originally cited this figure as $129,195,947—undercounting the LBO projection by over $7 million (a difference of roughly 5.4%). Statewide, these inaccuracies added up: their analysis estimates the total cost of INSPIRE to be $2,966,438,668 in FY25—roughly $35 million less than the $3 billion included in the education appropriations bill (House Bill 1823) passed by the House. 
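The arithmetic behind these discrepancies is straightforward to check. The Jackson Public Schools figures and the Parents’ Campaign statewide total come directly from the text; the $3 billion House Bill 1823 total is the rounded figure cited above, so the computed statewide gap is approximate.

```python
# Arithmetic check of the discrepancies described above, using the figures
# cited in the text.

lbo_jps = 136_230_569        # LBO FY25 INSPIRE projection for Jackson Public Schools
pc_jps = 129_195_947         # Parents' Campaign's original figure for JPS

undercount = lbo_jps - pc_jps
pct_of_pc_figure = undercount / pc_jps * 100

pc_statewide = 2_966_438_668   # Parents' Campaign FY25 statewide INSPIRE total
hb1823_total = 3_000_000_000   # rounded House Bill 1823 total cited above

print(f"JPS undercount: ${undercount:,} ({pct_of_pc_figure:.1f}%)")
print(f"Approximate statewide gap: ${hb1823_total - pc_statewide:,}")
```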

Although the Parents’ Campaign has since updated their analysis, the correct figures had been available for weeks before they sent out their first email. Whether the oversight was sloppy or deliberate doesn’t matter: once misinformation starts churning, it takes the truth a long time to catch up.

Incorrect Hold-Harmless Assumptions
Both the INSPIRE Act and the Senate proposal to revise MAEP would repeal the current “hold-harmless” provision under MAEP, which prevents any district from receiving less in state funding than it received in 2002. The existing hold-harmless provision still applies to 10 districts that have experienced significant drops in student enrollment since 2002. Both school funding proposals would replace this perpetual hold-harmless provision with a temporary one, ensuring that no district would initially receive less in FY25 than it received in FY24. After FY25, however, these districts would likely begin to receive less funding, though INSPIRE funding could decrease by no more than 3% annually until FY28, whereas the hold-harmless provision under the Senate MAEP proposal would expire entirely beginning in FY26.
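The difference between the two temporary provisions can be sketched as a funding floor for a hypothetical district whose formula amount falls below its FY24 funding. All dollar amounts are illustrative, and we assume (our reading of the provisions) that the INSPIRE 3% annual cap compounds year over year from the FY24 amount and ends after FY28.

```python
# Sketch of the funding floor each proposal would guarantee a hypothetical
# district. Dollar figures are illustrative, not actual district data.

def inspire_floor(fy24_funding: float, fiscal_year: int) -> float:
    if fiscal_year == 2025:
        return fy24_funding                       # no decrease allowed in FY25
    if 2026 <= fiscal_year <= 2028:
        # At most a 3% decrease per year (assumed to compound from FY24).
        return fy24_funding * 0.97 ** (fiscal_year - 2025)
    return 0.0                                    # no floor after FY28

def senate_maep_floor(fy24_funding: float, fiscal_year: int) -> float:
    # Senate MAEP plan: hold harmless applies in FY25 only.
    return fy24_funding if fiscal_year == 2025 else 0.0

fy24 = 4_400_000.0      # hypothetical FY24 funding
formula = 4_100_000.0   # hypothetical formula amount in later years

for fy in range(2025, 2029):
    inspire = max(formula, inspire_floor(fy24, fy))
    senate = max(formula, senate_maep_floor(fy24, fy))
    print(f"FY{fy % 100}: INSPIRE ${inspire:,.0f} vs Senate MAEP ${senate:,.0f}")
```

The sketch shows the shape of the difference: under INSPIRE the district glides down gradually through FY28, while under the Senate plan it drops to its formula amount in FY26.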

The Parents’ Campaign analysis appears to erroneously assume that INSPIRE would not include a phased-out hold-harmless provision, while a revised MAEP would include a perpetual hold-harmless provision. This has serious implications for projected funding in districts like West Tallahatchie, which benefit substantially under the existing hold-harmless provision in MAEP.

Table 1. Parents’ Campaign Estimates for West Tallahatchie versus Correct Numbers

|      | PC Senate MAEP (ERROR) | PC INSPIRE (ERROR) | INSPIRE with HH | Difference between Corrected HH estimates |
|------|------------------------|--------------------|-----------------|-------------------------------------------|
| FY24 | $4,403,534 (MAEP)      | $4,403,534.91      | N/A             |                                           |

According to the Parents’ Campaign analysis, West Tallahatchie would see an immediate decrease in funding under INSPIRE beginning in FY25. This is patently inaccurate, as the INSPIRE Act includes a temporary hold-harmless provision for FY25 that ensures that no district would receive lower funding. The Parents’ Campaign analysis also posits that West Tallahatchie would see an increase in funding under a revised MAEP in both FY25 and FY27. Under the Senate proposal, West Tallahatchie would be held harmless in FY25 (meaning they would likely receive the same amount of funding in FY25 as in FY24, not more), but they would not be protected at all in FY27—meaning we would expect a decrease in funding under MAEP in FY27. Instead, the analysis claims that West Tallahatchie would see an increase in funding. This is obviously inaccurate.

It is hard to model what exactly West Tallahatchie might be eligible for under the Senate plan in FY27 because of all the problems that we list above. However, if we assume every factor stays exactly the same for two years (a very strong assumption, given our discussion above) and that the base cost of INSPIRE does not increase for three years (also likely a poor assumption), then applying the accurate INSPIRE hold-harmless provision shows that the Parents’ Campaign systematically underestimates the amount of money West Tallahatchie would receive. The same would be true for other 2002 hold-harmless districts.

Flawed Measure of Poverty
Because INSPIRE’s focus on championing equity has been well-touted in comparison to the inequitable MAEP, the Parents’ Campaign attempts to highlight that “high-poverty” districts would benefit more from a revised MAEP. However, the Parents’ Campaign measures poverty using the percentage of students qualifying for free and reduced-price lunch (FRPL) in each district. As we have written about extensively, FRPL has become a fundamentally flawed measure of poverty: every student in a district participating in the Community Eligibility Provision qualifies for free lunch once at least 40% of students1 have qualified for at least one federal poverty program or are designated as homeless, migrant, or in foster care (see this additional explanation of the “Community Eligibility Provision” and why FRPL is no longer an accurate proxy for student poverty). Thus, when the Parents’ Campaign analysis references districts with “the highest poverty rate possible in MDE’s reporting (>=95%),” they are simultaneously referring to districts with as few as 40% of students truly in poverty and districts with as many as 100% of students truly in poverty.
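A toy example makes the distortion concrete. The two districts below are hypothetical, and the function is a deliberate simplification: it assumes a district at or above the CEP threshold adopts CEP and reports roughly 100% FRPL, while FRPL roughly tracks the identified student percentage otherwise.

```python
# Toy illustration of why FRPL collapses as a poverty measure under the
# Community Eligibility Provision (CEP). Districts are hypothetical.

CEP_THRESHOLD = 0.40  # threshold at the time of writing (see footnote)

def reported_frpl(identified_pct: float) -> float:
    # Simplification: at or above the threshold, a CEP district can offer
    # free lunch to all students and report ~100% FRPL; below it, FRPL is
    # assumed to roughly track the identified student percentage.
    return 1.0 if identified_pct >= CEP_THRESHOLD else identified_pct

district_a = 0.40   # 40% of students actually identified as low-income
district_b = 0.95   # 95% actually identified

# Both districts report the same "poverty rate" under FRPL:
print(reported_frpl(district_a), reported_frpl(district_b))  # → 1.0 1.0
```

Two districts with very different real poverty levels become indistinguishable in FRPL-based reporting, which is exactly the problem with using FRPL to claim equity.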

Coincidentally, MAEP also utilizes this flawed and outdated measure of student poverty to determine which students qualify for the 5% “at-risk” weight on top of the base student cost. The result of this flaw within the MAEP formula is that a majority of districts in Mississippi receive the 5% “at-risk” weight for every student, even in districts where only a fraction of these students are actually in poverty.

A far more accurate measure of student poverty is “direct certification” or the “identified student percentage”—essentially the actual percentage of students who are participating (or whose families are participating) in a means-tested federal antipoverty program, such as Head Start or the Supplemental Nutrition Assistance Program (SNAP) or who are categorically eligible due to situations such as homelessness or migrant status. This measure of poverty allows for differentiation between districts with moderate and high levels of student poverty, which is both useful from a research standpoint and for the purpose of directing resources to students with the greatest need. This is the measure of poverty that the INSPIRE formula utilizes to determine the percentage of students who are eligible for the “low-income” weight (a weight that is six times as large as the MAEP “at-risk” weight), and this is also the measure of poverty that we have used to more accurately evaluate equity in the two school funding proposals.
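Since the text gives MAEP’s at-risk weight (5%) and describes INSPIRE’s low-income weight as six times as large (i.e., 30%), the per-student difference is easy to compute. The shared base amount below is hypothetical and used only so the two add-ons are directly comparable; in reality the two formulas have different base amounts.

```python
# Rough per-student comparison of the two poverty weights described above,
# using a single hypothetical base amount for comparability.

base_amount = 6_700.00  # hypothetical per-student base amount

maep_addon = base_amount * 0.05     # MAEP's 5% "at-risk" weight
inspire_addon = base_amount * 0.30  # six times as large: 30% low-income weight

print(f"MAEP at-risk add-on:       ${maep_addon:,.2f}")
print(f"INSPIRE low-income add-on: ${inspire_addon:,.2f}")
```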

Is a Fairer Future-Year Comparison Possible?

A better future-year comparison of INSPIRE and MAEP may be possible, but it would take far more expertise and far more time than the Parents’ Campaign apparently has to do a more credible job. Even so, no one can see the future with clarity. With so many variables, it is better for policymakers to make decisions based on what they can actually know to be true: the FY25 projections based on real data. These projections provide a clear understanding of which types of school districts each model benefits and allow policymakers to make the best decision in real time.

1 At the end of this school year, the threshold for CEP has moved to only 25% of students needing to participate in federal programs or be categorically identified.