For reasons we can’t explain, a June 29, 2010 study that showed “on average, charter schools had no significant impacts on student achievement in math and reading” drew zero attention from New York media — including education blogs.
The 15-state, 36-school study, funded by the U.S. Department of Education and conducted by the prestigious Princeton-based Mathematica Policy Research, Inc., was announced one day after the New York State Senate passed legislation raising the state’s charter school cap from 200 to 460 — a result of a months-long lobbying campaign by charter advocates.
Until the legislature’s approval, New Yorkers had been bombarded by TV commercials, print ads, and editorials and op-ed pieces in the Daily News, the Post, and the New York Times — all urging legislators to lift the charter cap to improve New York’s chance of winning federal funding in Round 2 of Race to the Top, the Obama administration’s school reform incentive program.
But critics suggested that the RttT eligibility goal was just cover for a well-funded opinion campaign aimed at pressuring legislators to increase the number of charter schools. Ultimately, the legislation incorporated several limitations, including a stipulation that new charters may not be operated by for-profit companies.
The Mathematica study is notable for meeting the “gold standard” for comparisons between charters and traditional public schools (TPSs): It compares charter middle school students with matched TPS peers who had sought charter admission via lotteries but had not been selected.
This lottery-based methodology avoids the selection bias that discredits studies comparing charter students with TPS students whose families may not have been motivated or able to seek charter admission for their child.
“This report helps us make sense of previous charter school studies that have generated a wide range of findings,” said Phil Gleason, senior fellow and lead author. “In this study—the most comprehensive and geographically diverse using charter school lotteries to date—our findings are consistent with prior evaluations that focused on a broad range of schools. We found that the average charter school in our sample did not have positive impacts on students’ math or reading achievement.”
In a Mathematica Q&A document that accompanied the announcement, other details emerge:
“On average, the study charter schools had a positive effect on lower-income and lower-achieving students, and a negative effect on higher-income and higher-achieving students;
“charter schools did not affect … attendance, homework completion, student behavior, and parents’ involvement in their child’s education;
“charter schools had a positive effect on students’ and parents’ satisfaction with their schools;
“there was weaker evidence that charter schools with longer school days and/or years, as well as those with higher per-student revenues from public and private sources, had more positive (or less negative) effects.”
Remarkably, only three major newspapers in the United States acknowledged the Mathematica study in the days following its release: The Washington Times, The Christian Science Monitor, and the Dallas Morning News.
Given its provenance and its results, the report deserves wider attention. The full study can be found here.