Thanks for writing this, very interesting! I am curious why Steubenville hasn't been studied more closely since 2017 (it appears).
Also, is there more granular testing data within San Francisco, to know whether there are other (positive) outliers among schools within the district that may yield evidence of other factors that can improve performance?
Finally, SFUSD started a new reading curriculum this year (and is piloting a new math curriculum at some schools), so there may be messy results in the next few years. For example: teacher difficulties in implementing the new curriculum at first, students adjusting to the new curriculum, reading resource aides being added to (or dropped from) schools to assist with the new curriculum, etc.
Great work as always. Someone should hire you to do this. It’s too bad you can’t back out the percentage of kids in private school for chart 3. What percentage of private school kids take the SBAC?
This one is easy. The SBAC is only taken by public school kids. Private schools can use whatever assessment they want or none. I believe MAP tests are common in SF.
Maybe you could give us a score for S.F. only assuming that 100% of the private school parents have a college education.
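A back-of-the-envelope version of that adjustment could look like the sketch below. All input numbers here are hypothetical placeholders, not actual SFUSD figures; the idea is just to back out the public-school parents' rate from the citywide rate under the 100%-college-educated assumption for private-school families:

```python
# Back out the college-education rate among public-school parents,
# assuming (as an upper bound) that 100% of private-school parents
# are college-educated. All inputs below are made-up placeholders.
overall_rate = 0.60    # college-educated share among ALL SF parents (hypothetical)
private_share = 0.25   # share of SF kids in private school (hypothetical)

# overall_rate = private_share * 1.0 + (1 - private_share) * public_rate
public_rate = (overall_rate - private_share) / (1 - private_share)
print(round(public_rate, 3))  # 0.467 with these made-up inputs
```

With plausible inputs, the implied public-school rate would sit well below the citywide rate, which is the direction of the correction being suggested.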
When my first kid started kindergarten I discovered many parents had already taught their kids how to read! 🤯 I scrambled and used "Teach Your Child to Read in 100 Easy Lessons" at home.
"Steubenville is not small (it is 75% of San Francisco’s size)" Here it says Steubenville City District has 2769 students and 6 schools. That's not 75% of SFUSD's 49,000... Am I missing something? https://nces.ed.gov/ccd/districtsearch/district_detail.asp?Search=2&DistrictID=3904482&ID2=3904482&details=1
Aaagh! You're right. I'd been spending so much time looking at 3rd grade reading data where SFUSD had 3,600 students that I didn't adjust my frame of reference. I'll make a correction.
Thank you for posting. What exactly is meant by, “doesn’t carry over for middle school and high school”. I’ve seen in multiple places that 3rd grade reading is predictive of a ton of outcomes… are you saying that doesn’t hold in that district in Ohio? Is the significance of that measurement misunderstood then?
Most of what I know about this issue in Steubenville was gleaned from this podcast:
https://edtrust.org/rti/extraordinary-districts-episode-3-steubenville-ohio/
Thanks for the great article. Interesting to see how predictive a district's rate of parents' college education is of 3rd grade reading achievement.
I downloaded your data and analysis from your last chart and created another chart that you might find useful. In this case, I sorted by county, and within each county by how much each district outperformed or underperformed expectations. I would have liked the counties to also be sorted by average outperformance, but I didn't see a way to change that order from alphabetical in the interface.
https://www.datawrapper.de/_/fgkUn/
Here is also a version in which they're not grouped by county. Just sorted from most out-performing to most under-performing. That's here: https://www.datawrapper.de/_/YiGkH/
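The county-level ordering described above (counties ranked by average over/underperformance, districts sorted within each county) is straightforward outside the charting interface. A minimal pandas sketch, using made-up district names and residuals (actual minus predicted score, positive = outperforming):

```python
import pandas as pd

# Hypothetical example data: one residual per district.
df = pd.DataFrame({
    "district": ["A", "B", "C", "D"],
    "county":   ["Alameda", "Alameda", "Orange", "Orange"],
    "residual": [2.0, -1.0, 5.0, 1.0],
})

# Rank counties by their mean residual, best-performing first.
county_order = df.groupby("county")["residual"].mean().sort_values(ascending=False)

# Make county an ordered categorical so sorting follows that ranking,
# then sort districts within each county by their own residual.
df["county"] = pd.Categorical(df["county"], categories=county_order.index, ordered=True)
out = df.sort_values(["county", "residual"], ascending=[True, False])
```

The sorted frame could then be re-uploaded to the charting tool in the desired row order.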
Thanks. I can't believe I didn't calculate relative performance by county myself. It's a trivial calculation and would have saved me a lot of time when I created the dummy variable for whether a district is in LA or Orange.
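For what it's worth, that LA/Orange dummy variable is a one-liner in pandas. A minimal sketch with hypothetical district rows:

```python
import pandas as pd

# Hypothetical district list; only the county column matters here.
districts = pd.DataFrame({
    "district": ["Long Beach USD", "Irvine USD", "Fresno USD"],
    "county":   ["Los Angeles", "Orange", "Fresno"],
})

# 1 if the district is in LA or Orange County, else 0.
districts["in_la_or_oc"] = districts["county"].isin(["Los Angeles", "Orange"]).astype(int)
```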
Thank you for this analysis. 1. Wouldn't it be helpful if NAEP data were available for all districts, so we could compare across states at the district level? In California, district-level results are available only for Fresno Unified School District (USD) through 2019, Los Angeles USD, and San Diego USD. 2. Great podcast showcasing Steubenville, OH. There are lessons to be learned. https://edtrust.org/rti/extraordinary-districts-episode-3-steubenville-ohio/
As a general principle, I'd love to see more education data that is comparable across the country. The lack of comparability is why I hardly ever refer to the experience in other states. But California is a big, diverse state, and there is plenty of data already available on how schools and districts are doing if we're willing to learn from it. What I'd really like is not even more data but more context about the assessment data we already collect. For example, it would be very informative to be able to break out SBAC scores by curriculum provider; or by whether a class is English-only, language immersion, or bilingual; or by average teacher experience; or by class size.
I agree that's a great podcast. It's the same one I listened to (well, skimmed if I'm honest) before writing the article.