Academic Decline - Part 4
I just became aware this morning that the Brake, Brown and Sharp campaign posted another page addressing me directly. Apparently it went up at some point last week. I struggled with whether to write this at all on the eve of the election, but I will post a reply since 1) I know there are people who do their research the night before and the day of voting and 2) the BBS campaign has again personally attacked me. For reference, I’ll include a link to download the page I’m responding to at the end.
Please note, I am one person with a full-time job and a family. I am not a campaign of three candidates with a campaign manager, volunteers and tens of thousands of dollars to spend in an attempt to buy control of the local school board.
As such, I’ve tried to focus this page on areas where I think the BBS campaign’s misinformation is most likely to confuse voters or misinform them about the reality of our schools. If you’re an undecided voter who legitimately thinks they make a good point that I’ve failed to address, please reach out via the contact page. I will be monitoring submissions through the close of the polls tomorrow and will do my best to respond to any questions.
That I cherry picked which data to respond to.
First, in their previous attack on me, the BBS campaign said that if they could disprove one aspect of my website, every aspect of my reporting on and analysis of publicly available data shouldn’t be believed. So their accusation is literally that I’m employing the standards they advocate.
Second, when an entire argument is built on top of an entirely incorrect starting point, it’s not cherry picking to focus just on the starting point. It’s efficiency.
That this website or any individual pages on it are mostly opinion.
No, this website is mostly summary and analysis. I have consistently provided far more links to data than anything the BBS campaign has provided to back any of their misinformation. And from the start I have welcomed people to form their own opinion and reach out to me to discuss anything they question.
That I ignored much of their COVID analysis.
I did. Because it was stupid. It was so openly stupid as to be disingenuous, and I think Carmel voters are smart enough to see that. I’ve tried to keep the tone of this site civil, but I don’t know a more polite word than stupid for their previous analysis of wholly irrelevant data.
Their new arguments are stupid too. For example, consider this italicized quote from their discussion of the amount of instruction between the 2019 and 2021 iLEARN tests versus between the 2021 and 2022 iLEARN tests:
2019 – 2021:
6 months in person, 2 months of remote, 7 months of hybrid, 15 total months of learning
13 of 14 schools had decreases in iLEARN scores
2021 – 2022:
7 months of in-person and 7 total months of learning
10 of 13 schools had increases in iLEARN scores
So, is Mr. May REALLY trying to say that 1 month less of in-person learning, 2 months MORE of remote learning, 7 months MORE of hybrid learning and 8 more months of total learning is what caused our scores to decline as versus grow? How would 8 additional months of learning lead to lower scores?
My god. These people might soon be in control of our school board.
Yes. I am saying that raising kids two grade levels over two years, with less than half of their instruction in person, is much more challenging than raising them one grade level over one year with the entirety of their instruction in person.
Beyond this, their points generally revolve around ‘well if some scores go up and some go down, that means something!’ Yes and no.
Yes, in some cases it does. As previously explained, in the case of Carmel schools, it shows a huge impact from redistricting.
In other cases, yes, it may reveal that one school is doing something particularly well or poorly and should be compared to other schools. The BBS campaign acts as if they should be applauded for stating this when it’s wholly unrelated to the misinformation they were originally spreading. It’s something that everyone already knows and that the schools already consider. It also applies at all times and is an odd thing to concentrate on as an attempt to minimize the effects of the pandemic.
In yet other cases, no, it means nothing. There is random variance to life. If you tested the same kids at the same school every day for a week, you would not get five identical proficiency rates. Some would be a little higher and some would be a little lower. That’s how things work.
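That random wobble is easy to demonstrate. Here's a small hypothetical simulation: the same 100 kids "tested" five times, each kid passing with the same underlying 70% probability every time. The numbers (100 students, 70% true proficiency, five runs) are illustrative, not from any real school.

```python
# Hypothetical illustration of sampling variance: "test" the same 100
# kids five times, each kid passing with the same underlying 70%
# probability, and watch the measured proficiency count wobble from
# run to run purely by chance.
import random

random.seed(42)  # fixed seed so the sketch is reproducible
rates = []
for _ in range(5):
    passes = sum(1 for _ in range(100) if random.random() < 0.70)
    rates.append(passes)

print(rates)  # five similar-but-not-identical proficiency counts
```

Five draws from the exact same population, and you will not get five identical rates. That's the baseline noise any school-to-school comparison has to clear before it means anything.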
That the BBS’s previous attempt at regression analysis wasn’t stupid.
It was. The attempt to defend it is as well.
It’s clear that someone at the campaign googled some variation of ‘regression analysis percentage bad’, found a scholarly article saying it was inaccurate and copied a link to it in their article. It’s equally clear that they did not understand what it said.
There are two ways we can talk about including percentages in statistical analysis. Let's use data from Mohawk Trails to illustrate them. From 2021 to 2022, enrollment at Mohawk Trails increased from 297 to 337 and the number of English language learner students increased from 1 to 28. So, when it comes to percentages, we could say:
1) Mohawk Trails increased its number of students learning English by 2,700%.
2) Mohawk Trails saw its percentage of students learning English increase from 0.34% to 8.31%.
The scholarly article that the BBS campaign found and didn’t understand is about measuring percentage change from a baseline, which is #1 above. And yes, including a 2,700% increase would absurdly skew the numbers and make analysis impossible.
The article has nothing to do with #2, which is a perfectly valid and accepted way of normalizing data when comparing it for multiple populations of different sizes.
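The distinction is just arithmetic, and it can be made concrete with a few lines of Python using the Mohawk Trails figures quoted above:

```python
# Two ways of expressing "percentage" for Mohawk Trails' English
# language learner (EL) counts (figures quoted in the post).
enroll_2021, enroll_2022 = 297, 337  # total enrollment
el_2021, el_2022 = 1, 28             # EL students

# (1) Percentage change from a baseline -- explodes when the baseline
#     is tiny, which is what the cited article warns about.
pct_change = (el_2022 - el_2021) / el_2021 * 100
print(f"Change from baseline: {pct_change:.0f}%")

# (2) Share of the population -- normalizes for differing school sizes
#     and stays on a sane 0-100 scale.
share_2021 = el_2021 / enroll_2021 * 100
share_2022 = el_2022 / enroll_2022 * 100
print(f"Share of enrollment: {share_2021:.2f}% -> {share_2022:.2f}%")
```

Method #1 yields 2,700%; method #2 yields a move from 0.34% to 8.31%. Only the first distorts a regression; the second is standard normalization.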
As for showing my work, I’m obviously fine with that. It’s an admittedly simple model as it was created quickly to demonstrate how obscenely off-base the BBS campaign’s analysis was.
I started by aggregating the net change to percentage of students in disadvantaged categories.
I then compared the aggregate change against the shift in iLEARN scores and ran a regression analysis.
Three numbers to note. I’ll start with the one that the BBS campaign’s resident data maimer would immediately latch onto: an R-square of 0.5298. That tells us the model accounts for only about 53% of the variance in test score changes across all schools.
This is to be expected, given that several schools had almost no change and 9 out of 13 had a net change of less than 5%. But that low R-square, coupled with the other data, tells us that where there were significant changes to the percentages of disadvantaged students, they explain a lot more of the score variance.
First, we see that the X Variable, which is the net change in the percentage of disadvantaged students, has a coefficient of -0.79. That means that for every 1 percentage point increase in the share of disadvantaged students, we can expect a 0.79 percentage point decrease in proficiency. Second, we see that the X Variable is highly statistically significant: its p-value is 0.004, well below the conventional 0.05 significance threshold.
In other words, we can say with high confidence that for schools with big changes to the percentage of disadvantaged children served, those changes had a big impact on proficiency.
I’m guessing this will be the last page I add to the site, as the election is tomorrow. Please don’t vote for these people. They have demonstrated again and again and again that they are happy to misrepresent and lie about things they understand and make confidently wrong assertions about things they don’t understand. Our schools deserve so much better.
As promised, here is the link to the wreck of their second attack on me.