St. Louis is Doing a “Poor” Job with Local Taxing Districts

“Poor” is never the rating you want. Unfortunately, that’s what St. Louis got in a recent audit of the city’s local taxing districts. The audit found that “in the areas audited, the overall performance of this entity was Poor” and the city needs to “significantly improve operations.”

Local taxing districts are political subdivisions that fund specific improvements and services through taxes and fees, so taxpayers should be concerned by this bad rating. The government is taking and distributing hard-earned taxpayer money through these districts, but is it doing so responsibly?

Here are the findings as summarized on page 2 of the audit report:

Findings

So, is St. Louis responsibly taking and distributing taxpayer money? The answer is no.

The state auditor found serious problems with the creation and implementation of local taxing districts, which collect substantial amounts of taxpayer money each year. According to the report, the city's community improvement districts (CIDs) bring in approximately $10 million each year, and in the last fiscal year its transportation development districts (TDDs) brought in $3.8 million. This is bad news for taxpayers; the last thing we want is the city playing fast and loose with our money.

Show-Me analysts have long been critical of special taxing districts in Missouri and have called for reform of these districts in the past. It's good that this problem is getting more attention, but reform is long overdue. When it comes to handling taxpayer dollars, the city needs to be doing a lot better than "poor."

 

Making Growth Understandable

School test scores are a snapshot. If the test is a good one, it tells us how much a student knows at any given time, but it doesn’t tell us how much he’s learned over the course of a school year. For that you need to know how well the student scored in the past and measure that against the present. That’s called “growth data.” A recent Data Quality Campaign (DQC) publication highlights how important growth data is for parents, and suggests ways to help parents find and interpret growth information.

The DQC publication is a resource for parents, explaining why growth data is important for understanding student progress and how it can provide insight into their child’s school. It even explains different types of growth measurements in non-academic terms and could help parents work through jargon that may be on a school report card.

Unfortunately, the Missouri school report cards produced by the Department of Elementary and Secondary Education (DESE) don’t clearly explain growth data like the DQC does. In fact, the report cards don’t effectively inform parents about student growth at all, let alone explain what growth means.

To be sure, Missouri school report cards currently have a "growth" column in a section labeled "Federal (ESSA) Data." The screenshots below are from three different district report cards. The growth numbers are all indecipherable.

Federal ESSA data

The explanation on the report card doesn't help much either. The report card states that numbers above 50 percent represent positive growth, and that an "S" or "N" indicates whether the data was statistically significant or not. However, DESE doesn't indicate how much growth the 50 percent benchmark represents, or even what defines growth. This "information" is not useful for parents who want to gauge how their child's school is performing.

DQC’s approach actually informs parents. To explain how growth is calculated, DQC asks, “Did teachers help students in this school do better than we expected them to perform, even if they didn’t get to a grade-level target?” Framed in this way, there is context to the meaning of growth data and what it tells us about a school. DESE and the DQC are both talking about growth, but the different ways they present and communicate the information can make a major difference.

Parents should be able to easily find how much a school teaches students each year. But as long as student growth information is hidden behind statistical jargon and vague definitions, parents may never know how much students are learning at their child’s school.

Waddell & Reed and the Border War

Having been granted $62 million worth of Missouri state subsidies, Waddell & Reed is asking for an additional $40 million in local tax breaks. The surprise here is that the additional local incentives are supported by the mayor of Kansas City, who campaigned for office promising to reform economic development incentives.

What is less surprising is that the governor of Kansas, who embarked on a highly publicized economic development truce with the governor of Missouri, is critical of the additional Waddell & Reed incentives. The Kansas governor is quoted in The Kansas City Star as saying:

My executive order limiting the use of state incentives was premised in part on Missouri local units of government bringing their property tax incentives to a level playing field with Kansas. Without that action, a true ceasefire cannot occur.

There always was less to the border war truce than people wanted to admit. The agreement was vague and full of loopholes, and I wrote as much in the Kansas City Business Journal in August. No piece of legislation or executive order is as important as a leader’s resolve, and without the ability to say no to incentives, the truce is meaningless.

 

Why Can’t Missouri Be Like . . . Illinois?

Missouri’s Department of Elementary and Secondary Education (DESE) released its school report cards earlier this year in an attempt to fulfill the transparency requirements in the national Every Student Succeeds Act. DESE’s report card either missed or barely met many of the requirements listed in the federal law. The deadline for one specific requirement—reporting on spending per student at the school level—has been pushed back to June 2020, allowing states more time to collect the data. Missouri has not yet published the school spending data; it will (hopefully) appear on the 2018–19 Missouri school report cards.

Of course, there’s no reason to wait until the last minute to report. Nineteen other states, including Illinois, have already released school-level spending ahead of the deadline. But Illinois takes it a step further and breaks out that spending by subcategory, including spending for instructional purposes, teacher salary and benefits, and classroom supplies. The state also has a high-quality, organized school report card website that allows people to easily compare schools. Parents and school leaders can compare schools’ spending and academic performance at the same time.

The screenshot below shows a few randomly selected schools in Illinois and their spending comparisons, and also shows how much of school funding comes from different sources (local, state, federal, or evidence-based funding). Further comparisons might reveal districts where one school spends more money per student but does poorly in academics, while another school receives less money per student but does very well academically.

Spending graph

Information about school-level funding will provide more detail and context for how schools are performing. Parents in Illinois and other states that have already published this information can use it to form a more complete picture of school performance. Why does it seem like DESE always waits until the last possible minute to comply?

 

DESE’s APR Summary Reports Hide Achievement Gaps

If you want to find out how Missouri students are performing, you might think you could go to the Department of Elementary and Secondary Education’s (DESE) website to find out. After all, it’s DESE’s job to house the state’s education data. But you’d be wrong, because DESE isn’t nearly as helpful or transparent as it should be.

The Annual Performance Report (APR) on the DESE website does contain some information. However, as Show-Me Institute writers have pointed out, this report doesn’t show how many students are performing at grade level. It lacks clear labels, leaving the reader confused about what the terms mean. It also doesn’t present the raw data; it gives only results, on an undefined 100–500 color-coded scale, after DESE has gone through the APR calculations. Those calculations are not explained.

Within the APR summary report is the subgroup achievement section. That’s a likely place to look for achievement gaps, but this section isn’t helpful either. It is hard to tell how students in different subgroups are doing because DESE uses a “super subgroup.” The super subgroup is a combination of scores from Black students, Hispanic students, students who qualify for free and reduced-price lunch (indicating a low family income), students with disabilities, and English Language Learners. There is no information about performance from each group separately.

In contrast, resources like the National Assessment of Educational Progress (NAEP), also known as the “Nation’s Report Card,” show students’ academic achievement and disparities between different groups of students.

Missouri’s NAEP results can easily be broken out to provide a clear picture of achievement gaps. The starkest gaps are for students with disabilities. Only 6 percent of Missouri’s 8th-grade students with disabilities were proficient in math, and only 8 percent were proficient in reading. That’s 26 and 25 percentage points below the state average, respectively. Other subgroups, including students who qualify for free and reduced-price lunch programs and Black and Hispanic students, also have lower rates of proficiency than the state average, as shown in the graph at the top of this post.

The NAEP results provide valuable information for parents about how Missouri students are performing. DESE’s APR Report, with its “super subgroup” that hides more than it reveals, leaves parents in the dark.

Why should Missouri parents have to go searching beyond DESE’s website for information they need? They shouldn’t. DESE should be helping, not hindering, parents (and taxpayers) who want to know how their schools are performing.

 

An Update on Economic Development Policy in Kansas City

No sooner had the Show-Me Institute published “Some Positive Signs on Economic Development Incentives in Kansas City” than one of the points of optimism fell away. What does this mean for reform in Kansas City?

The Strata project—in which the taxpayers of Kansas City were asked to invest $63 million in public subsidies for a $132 million office tower with no known tenants—was flawed from the start. I previously noted that some of the claims regarding the need for the project were false, and even the mayor said, “Strata should fail.”

The city council reworked the deal, reducing the public incentive to $36 million, indicating that the developers’ initial claims for the need for public investment were questionable all along. The mayor still opposed the new deal, but the subsidy was approved by the council on a 7–4 vote.

Activists opposing the deal demanded that the mayor veto the measure, which would have required 8 votes of the council to override. Despite voting against the measure, the mayor chose not to veto it. Why not? Why didn’t the mayor exercise his power to try to stop something he says he is against?

Research indicates that economic development incentives such as these do not change behavior in 75 percent of cases. Even those in charge of the city’s economic development policy concede the benefits are “extremely difficult to quantify.” If policymakers want to protect taxpayers from wasteful subsidies, they must start saying no.

 

Support Us

The work of the Show-Me Institute would not be possible without the generous support of people who are inspired by the vision of liberty and free enterprise. We hope you will join our efforts and become a Show-Me Institute sponsor.

Donate