Grading Missouri Schools with Susan Pendergrass and Avery Frank

Education |
By Susan Pendergrass, Avery Frank and Zach Lawhorn | Episode Length 25 min

Visit the site: moschoolrankings.org/

Susan Pendergrass and Avery Frank join Zach Lawhorn to discuss MOSchoolRankings.org, the Show-Me Institute’s website that assigns letter grades and GPAs to Missouri schools and districts using publicly available academic and spending data. They explore how the site works, why Missouri has lagged behind other states on accessible school report cards, and how the governor’s executive order requiring A through F grades may change that. They also discuss the most common objections to grading schools, how growth and proficiency data account for differences in student populations, the status of report card legislation in the 2026 session, and more.

Listen on Spotify

Listen on Apple Podcasts 

Listen on SoundCloud

Episode Transcript

Zach Lawhorn (00:00): Welcome to the Show-Me Institute podcast. I’m Zach Lawhorn from Show-Me Opportunity, and today I’m joined by Susan Pendergrass and Avery Frank from the Show-Me Institute. Susan, welcome back to the podcast as a guest — we’re really making a habit of this. Today we’re going to talk about MOSchoolRankings.org, a website that was launched a few years ago at this point. We’re going to talk about some updates, some new data, and some improvements that have been made to the site. But for the handful of people who haven’t yet visited MOSchoolRankings.org, Susan, just give us a primer: what is it, and what’s the idea of the site? Then we’ll talk about the upgrades.

Susan Pendergrass (00:35): Yeah, MOSchoolRankings has been the subject of a couple of ironic moments in history, one being that we decided to launch this in 2018-19. We decided that because we’ve complained a lot about how the state doesn’t do informative report cards that parents can understand — simple, ideally with a letter grade, because everyone gets that. And we looked at a model used by the Fraser Institute in Canada, where they also rank order all the schools. So you can see that, compared to the rest of the schools in the state, this school is number one and that one is number 2,500. So we decided we would rank order and assign letter grades based only on academic measures, which is really pretty groundbreaking. In 2018-19 we picked the only academic measures really available: proficiency in reading; proficiency in math; proficiency in reading and math for only low-income students, to get a measure of achievement gaps, or how districts serve low-income students; a measure of how a particular school or district would be expected to do in reading and math based on the percentage of low-income students it serves; the growth model that was developed and is used by the state; plus ACT scores and graduation rates. So there would be at most 10 measures for each school, and we assigned letter grades to each using a very simple curve. We took the full range of scores — for example, graduation rates might go from 75% to 100% — divided it into five equal sections, and assigned letter grades: an F would be 75% to 80% and an A would be 95% to 100%. We did the same thing for all 10 measures — took the range, divided by five, and assigned the letter grades that way. Because it’s a curve, you get most of the schools and districts in the middle: Cs, 2.0 grade point averages. And we decided that once we set those grade intervals, we wouldn’t change them, so that we could see over time whether Missouri schools are doing better or worse than they did in 2018-19.
Same for districts. And of course we had no idea there would be a global pandemic. The next year’s data, in 2019-20, was not usable, and then we get into 2021, still difficult with schools reopening. There was some pressure at that time to recalibrate all the grades around the COVID environment, but we didn’t. We stuck with our 2018-19 letter grades, and we now have six years of data on there. We kept 2018-19 so that we can see whether schools have caught up or not from what happened during COVID. And from the first year, we took the 10 letter grades and combined them into a GPA, just like you would see on a college or high school report card. Very simple approach — an A is worth four points and an F is worth zero points, and we combine them into a GPA.

What we did this year is we took the GPA and just made that a letter grade. Same GPA, same rank order, but for folks who don’t readily get the GPA thing, we just made the GPA also a letter grade. It’s kind of helpful and a little weird because you don’t get an overall letter grade on your high school report card or your college report card, but we took your overall GPA and turned it into a letter grade. At the same time, the governor in January signed an executive order requiring the state to create report cards and have a single letter grade on them. So we were already in the process of doing this, and our newest data on the website also reflect the single letter grade for each school and each district. We just happened to do it at the same time as the governor’s executive order. So it’s going to be really interesting to be able to compare our site and our letter grades to what the Department of Elementary and Secondary Education comes up with. It shouldn’t be the case that ours are dramatically different than theirs — we use proficiency, growth, and graduation rates just like they do — but ours is equally weighted, and time will tell how theirs are weighted.
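For readers who want to see the arithmetic Susan describes, here is a minimal sketch: the equal-interval curve over each measure's range, the A=4/F=0 GPA, and the new step of turning the overall GPA back into a letter. The `letter_grade` and `gpa` functions follow the episode's description; the cutoffs in `gpa_to_letter` are assumptions, since the episode does not state the exact thresholds the site uses.

```python
def letter_grade(score, lo, hi):
    """Divide the observed range [lo, hi] into five equal bands, F through A."""
    bands = ["F", "D", "C", "B", "A"]
    width = (hi - lo) / 5
    # Clamp so a score at the very top of the range still lands in the A band.
    idx = min(int((score - lo) // width), 4)
    return bands[idx]

GRADE_POINTS = {"A": 4, "B": 3, "C": 2, "D": 1, "F": 0}

def gpa(grades):
    """Combine a school's letter grades into a GPA, A = 4 ... F = 0."""
    return sum(GRADE_POINTS[g] for g in grades) / len(grades)

def gpa_to_letter(value):
    """Turn the overall GPA back into a single letter grade.

    NOTE: these cutoffs are hypothetical; the episode does not specify them.
    """
    for cutoff, letter in [(3.5, "A"), (2.5, "B"), (1.5, "C"), (0.5, "D")]:
        if value >= cutoff:
            return letter
    return "F"

# Graduation-rate example from the episode: the range runs 75% to 100%,
# so each band is 5 points wide, 95-100 is an A, and 75-80 is an F.
print(letter_grade(97, 75, 100))  # A
print(letter_grade(76, 75, 100))  # F
```

Because the bands are derived from the observed range rather than fixed cut scores, this is a curve: most schools cluster in the middle bands, which is why the statewide typical grade comes out as a C.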

Zach Lawhorn (04:49): All right, before we move on, I want to make one thing perfectly clear, because you used “we” and “our” a lot, and then you said “they” — we use the same data they use. So when people hear that Show-Me Institute has this website that grades schools and assigns GPAs, talk to me about the methodology, the data — what data are you using and where specifically did you get it?

Susan Pendergrass (05:11): So what we do is get all of it from DESE. The test score data isn’t posted publicly — we request it through a data request to DESE, and they give it to us. We download the graduation rate data from DESE. We download the ACT data from DESE. That’s all of the data behind the letter grades. We simply take it from DESE. It’s the same data in the APR scores, the same data used for MSIP 6 — the same tests, the same test scores. We don’t make any of it up. The only thing we do is put it on a curve and assign it a letter grade.

I should have mentioned that four years ago we added finance data to the website, so it’s kind of a dual website — one side is academic, one side is finance. That’s because every school district in the state submits a massive, comprehensive financial report to DESE every year, called the Annual Secretary of the Board Report, and it has an enormous amount of revenue and expenditure data — hundreds of lines of it, across something like 14 pages, and very complicated. We decided to download those from DESE and convert them into something that a reasonable person could understand: just revenues and expenditures and donut charts. We tried to make that as accessible to folks as we could. So if you look at the academic data for a district, you can go over and look at the finance data and see how much they’re spending and how they’re spending it — down to the most granular detail: how much did they spend on substitute teachers, how much on advertising, how much on gas for the buses. So all of that is in there too, and we think that gives a really good, comprehensive look at every school district.

Zach Lawhorn (07:00): And Avery has been heavily involved in this process, including the data checking. Tell me a little bit about what that process has been like. And Susan described what she hopes the website has accomplished — when you work on MOSchoolRankings.org, what do you hope it accomplishes? What’s your goal?

Avery Frank (07:18): Well, I hope it makes this data accessible to average, everyday folks — to teachers, administrators, and parents — because it’s very hard to interpret. It’s very messy. The Annual Secretary of the Board Report she’s talking about — those are very hard to compile into one central location, very hard to understand, with a lot of jargon. One of our missions is to make our education system as transparent and as accessible to parents and average citizens as possible. So we put it all together in one place where they can look at it and hopefully do some investigating themselves. It might be hard for us to spot some of the spending outliers, but if a parent who knows their district pretty well looks and sees they’re spending a lot on electricity, or buildings, or textbooks, they might think: wait, this seems way out of the normal range — and then they can go investigate and be better informed to hold their school districts and schools accountable, both on the grade side and the finance side.

Zach Lawhorn (08:30): And Susan, as we so often do here in Missouri, let’s talk about what other states are doing. Is this idea of easily accessible, easily understandable report cards for schools a novel idea, or have other states been doing this for a while?

Susan Pendergrass (08:47): Well, Florida was kind of the leader in letter grades for schools and districts. They started in the late 90s, more than 25 years ago. They started putting letter grades on schools and districts, and they immediately coupled that with choice: if a child goes to a D school for two years or an F school for one year, they don’t have to go there — they can choose a different public school, which makes a lot of sense.

During the last Trump administration, there was a big push for understandable report cards. Every state is required by federal law to produce report cards — if you take federal money, you have to make a report card for every school in the district. What those look like is kind of up in the air, but they’re supposed to meaningfully differentiate between schools and districts. Missouri has gotten by with, like Avery said, a school report card initially written at the 16th-grade level — college level — very jargony, with a lot of acronyms. Box checked, we’ve got report cards. No one could understand them, but that’s fine. There has been a push at the federal level, with hackathons and websites to show you how to make good ones. There’s a large foundation called ExcelinEd that has devoted multiple resources to what makes a good report card. So there’s a push for this, and Missouri really resisted it until the executive order by the governor this year.

What Missouri does — and I think it’s the opposite of leading — is put the word “accredited,” “partially accredited,” or “unaccredited” on districts, and out of about 520 districts, only a handful — maybe six — aren’t fully accredited; 98% are. So they use a system where everyone passes, and it’s really misleading for parents. Worse, when St. Louis became fully accredited even though individual school buildings weren’t, they put “fully accredited” posters on the buildings. I think parents want this information. Parents talk at soccer fields or after-school programs — they kind of know if their school is doing okay or not. But no one is helping them get really easy-to-understand information. Lots of other states do letter grades, and states that stopped doing letter grades, like Indiana, are going back to them. It’s the one thing that everyone understands. So we are not in any way breaking new ground here.

Avery Frank (11:23): And again, with transparency and accessibility — I think Susan is definitely right about DESE following the letter of the law, not the spirit of the law, because they really do report just the numbers. There’s not a lot of context for them. If you see a district that says 40% proficient in English, is that really good? Is that bad? How does that compare to everyone else? You can’t just report the data flat out for one district, because you don’t know the context. Maybe 40% for a district that’s 100% low-income would be excellent, but 40% for a Clayton or a Ladue would be horrible. So you have to have context both for the types of students who are there and for the growth of that district. Are they doing better than they have in the past? And are they doing better in comparison to everyone else? Because if everyone is failing, the scale is going to adjust. If you have a lot of people failing and some really succeeding, that breaks the curve, and we have to start looking at what those other districts are doing, because it shows that good performance is possible. That’s why I really think a report card with relative context — based on who their students are and how the rest of the state is doing — is really important.

Zach Lawhorn (12:45): All right, so Susan, my understanding when we started this project a few years ago was that our hope was that the state of Missouri would kind of take the baton — that we would start this, but it would be great if the state was able to produce an easily accessible, understandable school report card that Show-Me Institute and Show-Me Opportunity had nothing to do with. Am I correct?

Susan Pendergrass (13:12): That’s right. It was like six or seven years ago when we started with 2018-19 data, and we just posted 2024-25. I didn’t want to be in the school report card business — I don’t work for DESE — but there was a vacuum of information in the state that we decided to fill. And we have said that we’re committed to filling it until the state takes over. That could happen with the new report cards. They have access to all the data, better data than we have access to — student-level data. They can do much more in-depth analysis and I suspect they will. The governor’s executive order includes something called “growth to proficiency,” which is a new model that the state is going to have to create using experts in the field. Maybe they’ll be better. I suspect that when DESE puts out the report cards for the first time with letter grades, there’s going to be a lot of conversation. There’s going to be a lot of pushback. I don’t think many people whose kids are in F schools will be shocked, but I think some people whose kids are in maybe C schools will be shocked because they’re under the impression their kids are in A schools. It’s going to be interesting. Typically when you survey parents, they give their own kid’s school very high marks, so it’s going to be a dose of reality for a lot of folks. And I think that’s the conversation that we’ve been wanting to start for a long time, because if you just listened to what the state and legislators say, you would think that Missouri is doing just fine — and we’re not.

Zach Lawhorn (14:42): And Avery, Susan mentioned pushback. As you’ve been working on this project and following the governor’s executive order for the state to produce A through F report cards, what are some of the common objections to putting letter grades on schools, or really just making school performance and spending data more accessible?

Avery Frank (15:07): Honestly, the most pushback I hear is against the Missouri Assessment Program, or the MAP itself. A couple of senators said that it’s a “useless autopsy” and that we shouldn’t tie any incentives to a flawed test, because a lot of people want a test that tests throughout the year — more of a formative assessment rather than a summative assessment at the end of the year. But the MAP is a good test at the end of the year because we get to have everyone take the same test at the same time and then compare the results. That’s what it’s really for, and there’s a lot of pushback on that idea in general.

If we don’t have those kinds of tests, we can’t see how everyone’s doing relative to one another. There wouldn’t be any context if we’re not comparing to one another. If everyone’s doing their own test and their own grades, they can see how they’re performing relative to themselves, but they can’t see how they’re performing relative to one another. Of course there’s also some pushback about which type of grade should be weighted more — should we weight growth more, total proficiency more, expected proficiency versus actual proficiency more? There are going to be arguments for which rating scale should be used and what the weighting should be, because that will favor different districts.

Susan Pendergrass (16:47): Here’s the pushback we get: schools aren’t letter grades, schools aren’t test scores, teachers do so many things that have nothing to do with how kids do on a test, letter grades are racist and classist because it’s mostly low-income children of color who go to the D and F schools, and if we point that out then we are being racist towards them. We are not acknowledging the hard work of teachers. There’s already a video circulating against school report cards because this is not how schools should be measured — because they do so much more. I hear the same tropes over and over.

On the other hand, I think it was President George W. Bush who said: if you don’t measure it, you can’t fix it. The reality is we might not want to look at our bank balance or the scale, but if we just say, no, I’m so much more than my credit score, then we’ll never fix it. And this is what Missouri’s been doing for a long time — let’s not make anyone feel bad. We don’t want the kids to feel bad, the parents to feel bad, the teachers to feel bad. In the report card discussion happening right now — the legislature is considering report card legislation in addition to the executive order — somebody even asked: why couldn’t every school be an A? They really want to believe that we can create an environment where everyone feels good about what’s happening.

But in the states that have been doing this for a long time, like Florida — not only has Florida had letter grades for decades, but as too many schools and districts get A’s and B’s, they raise the bar. They move the goalposts to push schools and districts harder. As a result, Florida fourth graders are top 10 in the country on the national test, where we’re in the 30s — more like 36 to 38 out of 50 states. Florida is top 10 because they keep pushing themselves, and this is how you push. The pushback on report cards is basically: it makes people feel bad, it’s racist, and it doesn’t acknowledge all the work that schools do.

Zach Lawhorn (18:54): Okay, so let’s engage with the context argument that a school is more than a letter grade. As the legislature moves through this process now, moving on from the governor’s EO to actually forming legislation, Susan, as they design the criteria, what are some of the things they should keep in mind that can hopefully account for some of that context?

Susan Pendergrass (19:24): DESE in doing the executive order report cards is looking at proficiency, growth, and growth to proficiency. But it’s going to be really interesting, especially in how they weight them. What we found with our letter grades is some districts do really well on proficiency and not so well on growth, because their kids come in better prepared. In some of the higher-income districts, kids aren’t getting a year’s worth of growth in a year, and I would argue that they should. And then you see some real standouts that serve more disadvantaged students — their proficiency numbers are pretty low, but their growth is more than expected. Their growth is higher than the statewide average. Basically, the state reports growth in terms of whether it’s higher or lower than the statewide growth, and some of them have higher-than-average growth. Those are schools and districts we should be looking at really closely to see what they’re doing and how they’re doing it.

How they weight the measures is going to make a big difference, because if they weight growth really high, then some of the districts you think are the highest performing in the state will be B’s and C’s. They’re looking for schools and districts that are getting kids the furthest down the road, not just the benchmark of proficiency.

Zach Lawhorn (20:49): Okay, so for people who are not familiar with growth and proficiency — is it correct to say that if the claim is it’s unfair to grade schools because they serve different student populations, that is acknowledged and accounted for in these models?

Susan Pendergrass (21:08): Yeah, and people who just believe their kids go to a fantastic school are going to have to keep believing it regardless of what the letter grade is. But it is going to find those high-flying performers that are doing really well with growth and growth to proficiency, even if their test scores are low. And then you’re going to have some schools that just don’t have good proficiency and don’t have good growth, and a lot of their kids are below basic. So this growth-to-proficiency model is about how you get the lowest performers to move hopefully up toward grade level, and it’s going to point those out as well. I’m looking forward to seeing exactly what they come up with. The executive order has some flexibility in it so that the experts and statisticians putting it together can determine the best mix. It’s going to be really interesting to see how it turns out and to see that first set of grades in September.

Zach Lawhorn (22:05): Avery, we’ve got MOSchoolRankings.org, then we’ve got the governor’s EO, and currently the legislature is working on legislation. So as we sit here in the second half of the 2026 session, what’s the status of the legislation?

Avery Frank (22:22): The legislation passed out of the House already and they’re hearing it in the Senate now. It’s undergoing some changes. We’ll see how it turns out in the Senate. There was a school climate survey that was attached to it that’s up in the air as well. We will see what the final bill looks like. Hopefully the legislation sticks close to the governor’s EO, which was really good in my opinion. There are a lot of great aspects to it. There’s going to be a lot of senators trying to advocate for their district — some are going to want more weight towards proficiency, some are going to want more weight towards growth, some are going to want no ratings at all because their districts are doing badly and they want to cover it up. So there are going to be a lot of different political moves trying to mess with the grading scale, and I hope it sticks as close to the EO as possible because I really do think it was a well-written EO.

Susan Pendergrass (23:33): I agree. The legislature can do what they want — if they pass a really good school report card bill, that’d be great. But I wonder if it wouldn’t be smarter to let the executive order play out and get that first set of grades and see how they look. Then the legislature next January can start thinking about what would be a better way of doing it. They’re kind of jumping the gun by wanting to get it into legislation. And I suspect, like Avery said, it’s possible that some lawmakers are thinking they don’t like the EO and they can do something with the law to water it down. But I don’t think a watered-down version is going to end up getting to the governor’s desk. So I think the EO is probably the most watered-down version that would get to the governor’s desk, and what might make more sense is to reconsider it in the future when we know how it’s even going to work.

Zach Lawhorn (24:35): All right, well, it sounds like, as with all things, once the political process kicks in, there’s a lot to be considered and debated. For now, until the state of Missouri produces something great and Susan and Avery get to spend more of their time on other projects, you can go to MOSchoolRankings.org and find performance-level data, GPAs, letter grades, and spending data. Susan and Avery, before we wrap up, is there anything we haven’t covered that you want to make sure we highlight?

Susan Pendergrass (25:17): Yeah, just one thing — when it first came out in 2020, it took folks a while to understand that when grades are curved, you get a lot of Cs. If the statewide average is a C, then a C means you’re at the statewide average. If you get a B, you’re better than the statewide average. If you get a D, you’re worse. I think people — maybe thinking of ourselves or our kids or our grandchildren — think the only good grade is an A and a B is okay. It’s really not that. A C is average. A C is a good grade. It means you’re at the statewide average. A B is better and an A is better than that. We didn’t use grade inflation where everyone gets an A.

Zach Lawhorn (26:07): And on the site you can find the full methodology — we post all that. There’s a glossary of terms. And you can download the full data set. So if you go to MOSchoolRankings.org and you say these people are full of it, you have access to the same data that Susan and Avery had.

Susan Pendergrass (26:29): Transparency was always our goal with this whole thing — these aren’t my numbers. Our goal throughout has been to make a transparent system. I’ve had members of the media writing stories who find it easier to download our data set rather than go to 10 different DESE files. Our finance data set is a lifesaver for folks, because we took something very complex and made it accessible. I’ve had people use our data in lawsuits — people arguing about which school is better. I think a lot of folks have gotten comfortable with our method and now use our rankings when they come out. A lot of schools are doing better than they did before the pandemic — not every school is doing worse, so you can find those schools too. I’ve had school boards ask us to present on how it works, and I do think we’ve had a lot of buy-in on the method. And one thing I can say in our defense is we haven’t changed anything — everything is the same as it’s been for seven years. There was a time when DESE switched how they calculated the growth numbers, from being centered on zero to centered on 50, or the reverse, and we have to make changes as DESE makes changes. But other than changes that DESE has made, we haven’t changed one thing. We now have line graphs so you can look at how your school was doing in 2018-19 and see how it’s doing six years later. That’s all really important.

Avery Frank (28:07): The website has a lot of cool features. It’s very interesting if you want to do some research on both the finance side and the academic side. There’s a misconception in education that more money equals better results. And this is just directly pulled from MOSchoolRankings — Valley Park has 34% free and reduced-price lunch students, they spend $36,000 per student, and they got a C. But then you look at Festus, which has 28% free and reduced-price lunch students, they spend $13,000 per student, and they received an A. There are a lot of districts like that. You can compare and ask: wait, these districts spend a lot more money, they have the same types of students, but they’re doing a lot worse. You can use that data to show that it’s not just about money. And the last thing I’d add is that we have both schools and school districts. So if you want to see how your district as a whole is doing, you can look at that. And if you want to look at your specific school within your district, you can compare schools within your district and across the state, which is also a very cool feature.

Zach Lawhorn (29:18): And you mentioned spending data — if you go to the home page of MOSchoolRankings.org, in the upper right-hand corner there’s a button that says “Rank by Spending,” and it’s a whole new world from the performance data to the spending data.

Susan Pendergrass (29:31): Any feedback is welcome, right?

Zach Lawhorn (29:34): Yeah, we take notes. We take comments. Okay, one more time: MOSchoolRankings.org. Go to the website, find your school. Susan, Avery, thank you very much.

Produced by Show-Me Opportunity

Susan Pendergrass

About the Author

Before joining the Show-Me Institute, Susan Pendergrass was Vice President of Research and Evaluation for the National Alliance for Public Charter Schools, where she oversaw data collection and analysis and carried out a rigorous research program. Susan earned a Bachelor of Science degree in...
Avery Frank

About the Author

Avery Frank earned a Bachelor of Arts degree in economics (with honors) and political science from Sewanee: University of the South in 2022. He also studied at the London School of Economics in 2021 and was inducted into the Phi Beta Kappa and Pi Sigma Alpha Honor Societies. His research interests...
Zach Lawhorn

About the Author

Zach Lawhorn joined Show-Me Opportunity in 2018. He earned his bachelor's degree at the University of Missouri. Prior to joining SMO he worked in marketing and strategic communications for the University of Missouri School of Health Professions and was a senior media producer at Mizzou Video...
