Tom Loveless

My letter to the Editor of Notices of the AMS


I was both surprised and disappointed to see the following claim published in the January 2024 issue of Notices of the American Mathematical Society, in “Metacognition in the Mathematics Classroom”:

When we taught 82 middle school students in a youcubed summer camp we taught the students these strategies as they worked on open tasks. At the end of the four-week camp the students had increased their achievement on standardized tests by the equivalent of 2.8 years (Boaler et al, 2021, see 3).

Editors should have immediately questioned the validity of such a statement. The magnitude of the gain, representing an effect size of 0.91 standard deviations, defies belief. An analysis by NWEA estimates that US seventh graders would need an additional 5.9 months of learning in math to recover from the pandemic. The claim here is that 18 days of instruction produced more than four times that amount (2.8 years is roughly 34 months). Matthew Kraft analyzed 747 randomized controlled trials of education interventions, a design considered strong for estimating causal effects, and calculated a mean effect size of 0.16 across reading and math combined and 0.11 for math alone. The alleged summer camp gain is several times larger.

The study of the youcubed summer camp did not have a strong design. It did not identify a control group for estimating learning gains. Selection into the treatment was non-random: students were recruited for the summer camps, so those who showed up were in and those who didn’t were out. The same four “tasks,” created by the Mathematical Assessment Research Service, or MARS, were used for both the pre- and post-test assessments, with the pre-test administered on the first day of camp and the post-test on the final day. Reusing the same four problems over such a short interval is a legitimate concern. Moreover, the publishers of MARS describe their assessments and tasks as “prototype materials” that “are still in draft and unpolished form,” needing “further trialing before inclusion in a high stakes test.”

I discuss additional weaknesses of the summer camp studies as part of a critique of the California Math Framework published in Education Next. Two developments subsequent to that publication are important for AMS readers to consider. First, the state board adopted an approved list of assessments that the state’s charter schools can use for documenting learning gains. MARS tasks were evaluated but not approved. Second, researchers working for the state of California removed the claim of 2.8 years of summer camp growth from the version of the California State Math Framework ultimately adopted by the state board of education.

It’s a pity that AMS Notices allowed this dubious claim to be repeated in its pages.

Tom Loveless