By Charlotte Chinana
_____________________________________________________________
“I am concerned that the percentages of files reviewed from district to district was not proportionally equal.”
“Our District’s Special Education units for 2011-2012, based on 2010-2011 official counts, decreased by 1.8%. The audit was initially described as an investigation of districts where funding units [had] increased. We are still not sure why our district was selected when our units went down.”
“We have never had the opportunity to refute the findings of the audit.”
“It makes me wonder if it was already predetermined that we would get an in depth audit.”
_____________________________________________________________
These are just some of the comments provided to the state’s Legislative Education Study Committee (LESC) by district superintendents regarding the audits conducted by the Public Education Department (PED) under the guidance of Secretary-designate Hanna Skandera.
Back in April of this year, Skandera called for an audit of 34 school districts across the state after her office reported a potential discrepancy in enrollment data. What you might not recall (or may not have known was happening at all) was last week’s interim LESC meeting (May 25-27), at which the PED audit results were among the most anticipated topics slated for discussion.
During the interim committee meeting, LESC analyst Craig Johnson presented a staff report that highlighted several notable findings. Although supportive of the effort to ensure an accurate distribution of funds, the LESC staff, according to Mr. Johnson’s presentation, had a number of concerns regarding the PED’s audit, including questions about:
- the PED’s selection of districts to be audited, which was based (almost exclusively) on an ‘apples-to-oranges’ comparison of data;
- the use of only two years’ worth of data for the audit, versus a broader analysis spanning several years’ worth of information;
- why it was necessary to expedite the audit timeline, and whether data accuracy could be assured, given the short amount of time the department allotted itself for the project.
The LESC staff’s sentiment was also echoed by the Legislative Finance Committee (LFC):
“In an April 19 letter, LFC Deputy Director Charles Sallee wrote, ‘The aggressive schedule for the data validation audits, the narrow scope, and limited audit procedures are not designed to fully assess drivers in new reported units, nor identify fraud or gaming of the formula.’”
A Tale of Dueling Data Sets
The LESC staff report noted that, while the PED cited a “nearly 116% increase” in reported “funding unit” data, describing it as a comparison of 2010-2011 “80th day” reporting data to 2011-2012 “80th day” reporting data, the PED had in fact compiled its figures by comparing the 2011-2012 “80th day” data to the 2010-2011 “80th/120th day” average (the “final funded” numbers).
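To see why the baseline matters, consider a purely hypothetical district (these figures are illustrative only, not drawn from the audit): if it reported 1,000 units on the 80th day of 2010-2011 and 1,030 units on the 80th day of 2011-2012, the apples-to-apples growth is 3%. But if its 2010-2011 final funded figure (the 80th/120th day average) happened to be 1,010 units, measuring the new 80th day count against that baseline yields roughly 2%. In other words, mixing the two kinds of counts produces a different growth rate from the same underlying data.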
According to the PED analysis, the unit increase looks quite substantial:
[Table: PED comparison of reported funding units]
In a memo dated April 12, the PED noted an “enormous increase” in the number of “funding units” being reported by the state’s school districts, and suggested that there had been a questionable increase in “[funding] unit growth” despite only a “1% increase in student enrollment.” This purported increase gave the PED cause for suspicion, and subsequently became a primary reason for conducting the audit.
According to the LESC analysis, the unit increase appears to be less drastic:
[Table: LESC comparison of reported funding units¹]
¹ The final funded run uses the average of 80th and 120th day data with adjustments for 40th day data on growth and new programs. The preliminary funded run uses the average of 80th and 120th day data with a projection for 40th day numbers.
The LESC staff report also noted that the audit methodology used by the PED wasn’t a “sufficient way to clearly identify suspected instances of formula chasing” (or “gaming” of the system). It also observed that the audit procedures and tools used by the PED focused on assessing special education compliance, rather than on the intended objective of ensuring the accuracy of data reported for funding purposes.
Reliable Results
In a letter to the PED, dated April 19, the LESC staff outlined several concerns about the focus and expected outcomes of the audit, including “reservations about the department’s identification of school districts to be audited” based on only two years’ worth of data. According to the letter, LESC staff suggested that using trend data would give the PED “more useful and telling information about district practices and priorities” than a two-year comparison.
The letter also highlighted a major issue with the April 27 deadline that Skandera’s department set for initial audit findings:
“…LESC staff are concerned not only that this deadline is much sooner than necessary but also that it provides too short a timeframe to examine data sufficiently and to report findings, let alone sufficient time for school districts to respond to the findings.”
According to the LESC staff report, this timeline effectively gave the PED audit team nine working days to conduct the various steps of the audit AND report its preliminary findings. From the PED’s standpoint, it was necessary to conduct the audit quickly: more than one-third of the state’s school districts (as well as one-third of the state’s charter schools) had been flagged for the audit, and there is a statutory requirement for budget approval by July 1.
LESC Survey of Audited School Districts
Prior to the interim committee meeting, the LESC sent out an online survey on May 9th to the superintendents of the 34 school districts audited, asking for feedback about the process. Of the 34 superintendents who received the survey, 30 had responded by May 20th.
From the district response report:
“While some districts expressed support for the audit, survey responses, in general, demonstrated uncertainty or apprehension about the audit. Concerns about the audit expressed by districts in the survey include:
- media coverage;
- the nature of the selection process;
- the short time frame involved; and
- the response from PED.
Some districts noted that the audit procedures appear disconnected from one of the stated purposes of the audit, which was to identify ‘gaming of the system to receive additional funds.’”
Through the combination of the online survey results and public testimony from several district superintendents who came to Santa Fe for the meeting, the LESC, LFC, and PED received a wealth of anecdotal feedback regarding the practicality of such a truncated timeline. While there was general consensus that timely and accurate reporting of data is a reasonable enough request, many districts expressed concerns ranging from the lack of entrance and exit conferences, to problems encountered while trying to submit documents (e.g., busy fax lines), to the PED’s failure to ask districts to respond to the audit findings (related to any compliance issues a district was flagged for).
“Several districts also noted that the budget review process was affected by the audit, most commonly due to a delay in receiving budget documents from PED.
- ‘Significantly slowed our ability to complete budget.’
- ‘Administration delayed work on budget recommendations pending outcome of the special audit based on the possibility that projected membership numbers could be affected.’
- ‘We did not get our 910B5 until later than usual, and that made it very difficult to get our budget done in a timely fashion.'”
Where Do We Go From Here?
According to the next steps outlined by the PED, an additional in-depth audit will be conducted for nine districts deemed to have a combination of “major compliance issues” along with “severe data quality issues.” Hopefully, the issues raised by the LESC and LFC will be addressed, so that this next round of audits really does ensure that “information is being reported accurately and taxpayer dollars are protected.”