What Went Wrong with Math Instruction in New York?
How the New York State Education Department’s “Math Briefs” run afoul of good education science — and how to fix them | Attacks on Excellence, Issue #4
Editor’s Note: This article discusses a New York petition asking Commissioner Rosa to retract the NY math briefs. You can find the petition and associated letter here.
The refrain has been broadcast widely: New York math achievement, as measured on a respected national test, is in the dumps. On the 2024 National Assessment of Educational Progress (NAEP), only 37% of New York 4th graders scored proficient, versus 39% for the nation overall.1 The situation was worse for 8th graders, with only 26% of students scoring at or above the proficiency cut-off, again below the national average.1 All other Northeastern states performed better. The highest-performing state, Massachusetts, outperformed New York by 17 and 13 points, respectively, across the two grades sampled.1 Results were even more disappointing for students historically scoring below the 50th percentile on the test: those above it recovered pandemic losses, while those below it scored worse than they did pre-pandemic. That is, the gap between the haves and have-nots continued to grow between 2022 and 2024. On average, Louisiana, Mississippi, Alabama, and Kentucky outperformed New York in 4th grade math, even before adjusting for differences in students' socioeconomic status.
A frustrating backdrop for these findings is that New York outspends every other state on education, with per-pupil expenditures exceeding $36k annually.2 Importantly, averages don't tell the full story; some New York districts are financially stressed, as funding is uneven and comes from multiple sources.3 The cost of living in New York is high, and school funding supports more than math and literacy instruction, although these are obvious priorities. Nevertheless, the data suggest that funding isn't the crux of the problem. An alternative explanation is that math instruction, on average, is poor across the state, driven by New York State Education Department (NYSED) standards that do not align with empirical research.4 These invalid state standards drive the selection of poor curricula and teaching methods, frustrating teachers and fueling a downward spiral.
This brings us to the New York Numeracy Initiative. In the spring of 2025, Governor Hochul rightly drew attention to unacceptable student math outcomes. As part of this initiative, NYSED commissioned a set of eight numeracy briefs, authored by Dr. Deborah Ball and colleagues at the University of Michigan.5 The briefs were described as summarizing the state of the science on how to deliver top-quality "evidence-based" math instruction. After publication, the briefs were shared widely and were a focal point of statewide professional development sponsored by NYSED in May of 2025.
This all sounds well and good, except that the recommendations put forward in the briefs are entirely antithetical to the published science on math instruction. Both national and international scholars have raised concerns about the briefs, highlighting the authors' disregard of high-quality peer-reviewed evidence. Rather than drawing on that evidence, the briefs double down on long-discredited theories of instruction. Several of the debunked theories endorsed in the briefs overlap with educational myths toppled by the Science of Reading movement, a grassroots initiative to compel curriculum writers, administrators, and state leaders to acknowledge the very large body of evidence supporting comprehensive explicit instruction to teach literacy, including phonics-based instruction.6 The authors of the New York numeracy briefs ignored large bodies of empirical research in (a) cognitive science, (b) school and educational psychology, (c) special education, and (d) various other learning sciences that inform best practice in teaching mathematics to diverse audiences.
Recently, my colleagues and I authored a letter explicitly outlining a sampling of the mischaracterizations, factual inaccuracies, and omissions in those briefs. In a subsequent episode of the podcast Chalk and Talk with Dr. Anna Stokke, I stated that the degree of inaccuracy is such that the briefs are best described as pseudoscientific. One of my favorite definitions of pseudoscience, from Lilienfeld and colleagues, describes it as a "false or sham science" that "possess[es] the superficial appearance of science but lack[s] its substance."7 The briefs fit these criteria because they advance discredited myths about learning while portraying themselves as evidence-based and scientific.
Here are some takeaways about the more egregious aspects of the briefs, from our letter:
The briefs disparage "explicit instruction," a broad term for the family of instructional routines found to be highly effective in conferring knowledge to students through clear modeling, demonstration, and practice. Contrary to the messaging in the briefs, predominant use of explicit instruction works well for disabled and non-disabled students alike.8,9
The briefs mischaracterize short, timed tests of student competence as unnecessary or harmful. Research shows that brief, timed tests emphasizing fluent responding, such as math curriculum-based measurement, are critical for informing teaching and are far more accurate than untimed quizzes, surveys, or informal teacher appraisal.10 Further, such tests do not cause math anxiety.11
The briefs claim that deliberate practice and recall activities centered on basic facts and procedural algorithms aren't helpful. On the contrary, research shows that repeated practice of fundamental math procedures and facts is essential to complex problem solving, just as it is to mastering any skill.
The briefs assert that discovery learning should be prioritized over explicit instruction and occur early in the learning sequence. Research shows the exact opposite: teachers should provide explicit instruction first, and only then use discovery learning principles, after students have the tools to manage their own learning, thus minimizing frustration and supporting early successes.12
How could this be? How could university affiliated professional development providers double-down on practices shown decades ago to be ineffective, ignoring hundreds of published empirical studies and meta-analyses? Shouldn’t our universities be our most trusted sources of information? The Reading Wars provides a helpful context. As highlighted by Emily Hanford’s “Sold a Story,” vendors and professors pushing reading pseudoscience flourished for decades despite an enormous body of widely accessible research showing such practices were ineffective, seemingly operating in a self-constructed research vacuum. It took a grassroots movement and an unignorable reading crisis to finally bring the battle to a tipping point, spilling over to the public with the aid of journalists, community advocates, and politicians. This now needs to happen for math.
The central dichotomy of The Reading Wars pitted carefully planned, scripted instruction focused on the core elements of decoding and language against an emphasis on constructivist learning centered on authentic literacy experiences. The science of math shows that many of the lessons gleaned from the science of reading carry over, such as (a) explicitly teaching and modeling fundamental principles, (b) ensuring fluency in core skills, (c) measuring to inform instruction, and (d) saving exploratory and "hands-on" learning for when students are ready for it. The skills and targets are different; the means to attain them the same.
In practice, this includes having students memorize their core math facts using brief, timed activities; providing quick timed tests of fluency; explicitly modeling standard algorithms up front; promoting engaging teacher-directed learning dense with student responses; and minimizing student confusion whenever possible.13 This isn't to say these are the sole practices that should be emphasized across a given instructional arc. Rather, the science of math (and the science of learning more generally) dictates that these practices must come first, with "discovery" and "experiential" activities occurring only after demonstrated mastery of fundamentals.14 We wouldn't ask a baseball pitcher to experiment with novel throws before mastering a fastball, curveball, and changeup through extensive practice, nor a novice chess player to "discover" strategy before explicit instruction in piece movement through modeling and rehearsal. It should come as no surprise that math and literacy work the same way. Just as in The Reading Wars, one problem may be that many educators, including many academics, still see explicit instruction as synonymous with stiff, boring instruction, a belief perhaps rooted in certain philosophies or personal opinions, but not science. As The Reading Wars tells us, doubling down on mistaken philosophical beliefs yields very real and harmful outcomes for students; in New York's case, for millions of children, reverberating across generations in the form of lost social and financial opportunities.
The numeracy briefs are hardly the first instance of latching onto philosophical, as opposed to scientific, reasoning at the expense of New York students. Many curriculum authors have taken great license with their programs, often operating in a research vacuum. For example, the lead author of Illustrative Math, the program adopted citywide by New York City, recently stated that there are no "magic bullet" curricula and that their program situates teacher modeling and instruction last, not first, in the instructional routine, instead emphasizing multiple strategies and discovery initially.15 During the program's five-year rollout, New York City test scores have decreased. Meanwhile, programs that have driven robust math achievement have existed for decades, all sharing common features of teacher-directed explicit instruction, refined over iterations. For example, a 2018 meta-analysis by Stockard and colleagues concluded that Direct Instruction programs (a suite of programs founded on explicit instruction principles) significantly improved math achievement with moderate to large effects.16 The authors of the New York math briefs do not acknowledge this research.
It doesn't have to be this way. We can embrace the science of learning and math right now for the benefit of New York students. In doing so, we can spare students unnecessary frustration and nurture a generation ready to use math competitively across fields. While not everyone will aspire to be a mathematician, strong vocational skills require traditional math competency as well.17 We can reduce teacher frustration, burnout, and attrition; reestablish confidence in our public education system; and save the state millions in remediation costs, lost income, and endless policy and curriculum cycles. We can truly address educational inequity by shrinking worsening achievement gaps, rather than dressing poor instruction up in equity language.
New York is not the only state to put forward questionable recommendations for math instructional practices and curricula. For example, recent guidelines from the California Department of Education reflect very similar inaccuracies and mistaken beliefs,18 again camouflaged in the language of evidence-based practice and equity. The 2024 NAEP scores revealed widespread concerns across states, and invalid standards, guidance, materials, and instruction are likely causes, and thankfully fixable ones, too. If New York can pivot toward evidence-based practice, as defined by empirical findings from rigorous experimental studies published in peer-reviewed journals and/or through publicly funded research, the state can serve as a model of adoption.
It will require more than just changing standards and recommendations, though. As Mississippi demonstrated,19 change also requires well-organized, hierarchical coaching structures and specific materials that offer examples and routines at each step. More recently, the work of Sir Nicholas Gibb, who is credited with turning around England's math performance, has been well publicized. His approach included adopting many of the evidence-based recommendations described above.20
If you’d like to assist, please sign my petition. Ask your local state leadership to support the use of evidence-based practices to inform math instruction in schools. Bring attention to the inaccuracies in the briefs. Highlight the problem with your local school board. Resolving The Reading Wars took a grassroots movement; literacy researchers couldn’t do it alone. Help us do it for math.
1. National Center for Education Statistics. (2025, January). 2024 NAEP mathematics assessment: Results at grades 4 and 8 for the nation, states, and districts (NCES 2024-217). U.S. Department of Education. https://nces.ed.gov/use-work/resource-library/report/statistical-analysis-report/2024-naep-mathematics-assessment-results-grades-4-and-8-for-the-nation-states-and-districts/
2. Citizens Budget Commission. (2025, January 17). Highest costs, middling marks: New York school spending and results (Issue Brief). https://cbcny.org/research/highest-costs-middling-marks
3. Office of the New York State Comptroller. (2025, January). Fiscal Stress Monitoring System — School districts: Fiscal year 2023-24 results. https://www.osc.ny.gov/files/local-government/publications/pdf/2024-fsms-schools.pdf
4. New York State Education Department. (2017). New York State Next Generation Mathematics Learning Standards. https://www.nysed.gov/sites/default/files/programs/curriculum-instruction/nys-next-generation-mathematics-p-12-standards.pdf
5. New York State Education Department. (2025). Numeracy Initiative [Web page]. https://www.nysed.gov/standards-instruction/numeracy-initiative
6. Petscher, Y., Cabell, S. Q., Catts, H. W., Compton, D. L., Foorman, B. R., Hart, S. A., Lonigan, C. J., Phillips, B. M., Schatschneider, C., Steacy, L. M., Terry, N. P., & Wagner, R. K. (2020). How the science of reading informs 21st-century education. Reading Research Quarterly, 55(1), S267–S282. https://doi.org/10.1002/rrq.352
7. Lilienfeld, S. O., & Landfield, K. (2008). Science and pseudoscience in law enforcement: A user-friendly primer. Criminal Justice and Behavior, 35, 1215–1230.
8. Fuchs, L. S., Malone, A. S., Preacher, K. J., Cho, E., Fuchs, D., & Changas, P. (2023). Next generation fraction intervention and the long-term advantage of interleaved instruction. Journal of Educational Psychology, 115(2), 211–230.
9. VanDerHeyden, A. M., McLaughlin, T., Algina, J., & Snyder, P. (2012). Randomized evaluation of a supplemental grade-wide mathematics intervention. American Educational Research Journal, 49(6), 1251–1284. https://doi.org/10.3102/0002831212462736
10. McNeil, N. M., Jordan, N. C., Viegut, A. L., & Ansari, D. (2025). What the science of learning teaches us about arithmetic fluency. Psychological Science in the Public Interest, 26(1), 10–57. https://doi.org/10.1177/15291006241287726
11. Codding, R. S., Goodridge, A. E., Hill, E., Kromminga, K. R., Chehayeb, R., Volpe, R. J., & Scheman, N. (2023). Meta-analysis of skill-based and therapeutic interventions to address math anxiety. Journal of School Psychology, 100, Article 101229. https://doi.org/10.1016/j.jsp.2023.101229
12. Ashman, G., Kalyuga, S., & Sweller, J. (2020). Problem-solving or explicit instruction: Which should go first when element interactivity is high? Educational Psychology Review, 32(1), 229–247. https://doi.org/10.1007/s10648-019-09500-5
13. Binder, C. (1996). Behavioral fluency: Evolution of a new paradigm. The Behavior Analyst, 19(2), 163–197. https://doi.org/10.1007/BF03393163
14. Kirschner, P. A., Sweller, J., & Clark, R. E. (2006). Why minimal guidance during instruction does not work: An analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching. Educational Psychologist, 41(2), 75–86. https://doi.org/10.1207/s15326985ep4102_1
15. Napolitano, J. (2025, October 15). Illustrative Math's CEO on what went wrong in NYC and why Pre-K math is up next. The 74. https://www.the74million.org/article/illustrative-maths-ceo-on-what-went-wrong-in-nyc-and-why-pre-k-math-is-up-next/
16. Stockard, J., Wood, T. W., Coughlin, C., & Rasplica Khoury, C. (2018). The effectiveness of Direct Instruction curricula: A meta-analysis of a half century of research. Review of Educational Research, 88(4), 479–507. https://doi.org/10.3102/0034654317751919
17. Goldstein, D. (2025, September 25). Why reading and math scores are plunging for U.S. students – and what it means. The New York Times. https://www.nytimes.com/2025/09/25/us/reading-math-scores-declines-impact.html
18. California Department of Education. (2024). Mathematics framework for California public schools, kindergarten through grade twelve: Chapter 2: Teaching for equity and engagement (Adopted November 2023). Sacramento, CA: Author. https://www.cde.ca.gov/ci/ma/cf/documents/mathfwchapter2.pdf
19. Spencer, N. (2024). Comprehensive early literacy policy and the "Mississippi Miracle". Economics of Education Review, 103, Article 102598. https://doi.org/10.1016/j.econedurev.2024.102598
20. Gibb, N., & Peal, R. (2025). Reforming Lessons: Why English schools have improved since 2010 and how this was achieved. Routledge.