The Research Doesn’t Add Up

This commentary is based on the 2021 draft version of the California Mathematics Framework. An updated version of this critique, based on the final draft (3/14/2022) of the framework (Second Field Review), is available here.


Executive Summary

The authors of the draft framework make highly misleading claims that are not supported by research.


The Research Doesn’t Add Up

The most significant flaw of the draft 2022 Math Framework is that it misrepresents the very literature and data that it uses to justify itself.

“A review of much of the research cited, however, reveals that what the Framework describes as “clear” is often actually pretty murky, hotly disputed, or contradicted by other research, misleadingly stretched to cover situations for which it was not intended, or, in some instances, just plain wrong.”

Example 1: A prime example is the Burris, et al Rockville Centre study (Ch. 1, Lines 456-463), which the draft framework cites as foundational to its proposals. However, the intervention studied at the Rockville Centre, NY middle school is essentially the opposite of what the draft framework proposes.

Rockville Centre, NY implemented a universal (detracked) math acceleration intervention in grades 6-8, with all students completing Algebra 1 in grade 8 and with the express goal, and metric, that students would complete advanced math courses (Algebra 2, Precalculus, and Calculus) in high school.

In contrast, the draft framework proposes a universal (detracked) math deceleration, delaying Algebra 1 for all students until grade 9. Moreover, the draft framework discourages calculus completion in high school, labeling the goal of completing calculus in high school a ‘rush to calculus’ (Ch. 1 Lines 430-433, 441-443) and ‘...an incorrect conclusion that Calculus is an important high-school goal’ (Ch. 8 Lines 121-123). It further claims that students need not take Precalculus if they choose to attempt Calculus in high school (Ch. 8 Lines 843-847).

This is nearly the opposite of the intervention that Burris, et al (2006) studied.

From Burris, et al:

Such findings regarding American mathematics curricula led SIMS scholar Edward Kifer (1993) and TIMSS scholar William Schmidt (2004) to propose that all students take an algebra-based course in the eighth grade (pg. 107).

Per the excerpt above, the Burris, et al findings support all students taking Algebra 1 in grade 8. The draft framework, by contrast, does not support any students taking Algebra 1 in grade 8.

The draft framework dubiously posits that its proposed universal (detracked) math deceleration intervention is supported by the results of the successful universal (detracked) math acceleration intervention that Rockville Centre NY implemented. The draft framework touts only the ‘universality,’ or ‘detracking,’ portion of the Rockville Centre intervention, while studiously ignoring the integral math acceleration intervention that took place in grades 6-8 at Rockville Centre, NY.

Burris, et al 2006’s findings do not support the draft framework’s recommendations; this is a misrepresentation.

Example 2: Misleading claims have also been made about the ‘results’ of the SFUSD math pilot program (Ch. 1 Lines 471-476), which, with its intervention of detracking and delaying Algebra 1 to grade 9, is widely understood to be the model for the draft 2022 Math Framework. Specifically, the draft framework makes claims about the benefits at SFUSD both of detracking and of delaying Algebra 1 to grade 9:

“Educators in the San Francisco Unified School District found similar benefits when they delayed any students taking advanced classes in mathematics until after tenth grade and moved the algebra course from eighth to ninth grade. After making this change the proportion of students failing algebra (the repeat rate) fell from 40 percent to eight percent, and the proportion of students taking advanced classes rose to a third of the students, more than any other number in the history of the district (Boaler et al, 2018).” (Ch. 1 Lines 471-476)

Families for San Francisco, in its ‘Inequity in Numbers’ report on the SFUSD math program, refutes this claim:

“The (Algebra) repeat rate (failure rate) did come down, but only because in 2015 SFUSD eliminated the requirement to pass the Algebra 1 California Standards Test (CST) exit exam as a condition of progressing (from Algebra 1 on to Geometry).

“The effect of this change was later partially acknowledged by the (SFUSD) Math department in the speaker’s notes in one of their (SFUSD Math Data) presentation slides in 2020: "The drop from 40% of students repeating Algebra 1 to 8% of students repeating Algebra 1, we saw as a one-time major drop due to both the change in course sequence and the change in (math) placement policy."” (see Slide 3, ‘Algebra 1 Repeat Rate, Ethnicity,’ speaker notes at the bottom of the slide)

Again, according to Families for San Francisco, the change in SFUSD (math) placement policy was the main reason for the reduction in the Algebra 1 failure rate (repeat rate):

“Previously, all students in 8th grade were placed in Algebra 1 (prior to SFUSD’s math intervention). In SFUSD, continuing to 9th grade Geometry was a result of a student’s grade in Algebra 1 AND scoring proficient on the California state exam. If students did not meet this criteria, they had to repeat Algebra 1 in 9th grade. Often, students had to retake Algebra 1 multiple times.”

SFUSD itself, in its ‘Math Data’ presentation, likewise identified the change in (math) placement policy as a reason for the reduction in Algebra 1 repeat rates (failure rates), characterizing it as a one-time drop (see Slide 3, ‘Algebra 1 Repeat Rate, Ethnicity,’ speaker notes at the bottom of the slide, quoted above).

Similarly, Families for San Francisco refutes the second part of the draft framework claim quoted above – that after the intervention ‘the proportion of students taking advanced classes rose to a third of the students, more than any other number in the history of the district’:

“Enrollment in advanced math classes at SFUSD has gone down, not up, and SFUSD has produced no data about pass rates.”

“Advanced math is commonly understood to mean courses beyond Algebra 2, including Precalculus, Statistics, and Calculus; however, SFUSD’s claim that its “Advanced Math” enrollment has increased depends entirely on counting students enrolled in its “compression course” -- a third-year course combining Algebra 2 with Precalculus. The problem with this framing is that the University of California (UC) rejected SFUSD’s classification of its compression class as an advanced math course due to its failure to meet UC standards for Precalculus content. Once we exclude the enrollment data for the compression course, the enrollment number for advanced math shows a net decrease from 2017-2018 (the final cohort prior to the implementation of the new math course sequence).”

https://www.familiesforsanfrancisco.com/updates/inequity-in-numbers

Thus, the draft framework has made misleading and possibly false claims about the impact of detracking and of delaying Algebra 1 to grade 9 on SFUSD’s algebra repeat rate and on SFUSD student enrollment in advanced math classes. No claim of student benefit can be supported.

Further, when the draft framework makes its claim about SFUSD’s results (quoted above), it cites (Boaler et al, 2018) as support.

The full citation for this reference appears at the end of Chapter 1 (Ch. 1 Lines 987-989). It is:

Boaler, J.; Schoenfeld, A.; Daro, P.; Asturias, H.; Callahan, P. & Foster, D. (2018). How One City Got Math Right: Pathways that work. The Hechinger Report, October 9, 2018 (https://hechingerreport.org/opinion-how-one-city-got-math-right/)

This citation leads to an opinion piece in The Hechinger Report, October 9, 2018 (https://hechingerreport.org/opinion-how-one-city-got-math-right/). In this opinion piece, the authors state the following, including a link to the SFUSD math program’s ‘results’:

“Every system wants to help students with a math interest learn the subject at high levels because students get a leg up when it comes to getting into college and moving into careers in the fields of science, technology, engineering and math (STEM). San Francisco Unified thought carefully about the best way to do this.

This article shares some of the remarkable results that this district achieved when it changed its approach to mathematics pathways. The changes were supported by researchers from Stanford, Berkeley and Silicon Valley.”

That ‘results’ link in the Hechinger Report opinion piece points to a Stanford Research Institute (SRI) Education report on the SFUSD math program (http://www.sfusdmath.org/uploads/2/4/0/9/24098802/sfusd_middle_school_sli-math-2016.pdf). SRI Education’s report, purportedly an evaluation of the SFUSD math program, rests on student answers to a single MARS math task. Claims about results in math learning typically rely upon exams and assessments made up of multiple items, all of which have been tested for validity and reliability. We can find no evidence that MARS math tasks have met these standards.

About MARS Math Tasks:

MARS (Mathematics Assessment Resource Service) math tasks are materials created by the Mathematics Assessment Project (MAP), part of the Math Design Collaborative initiated by the Bill & Melinda Gates Foundation. https://www.map.mathshell.org/background.php#whatismap

MARS Math tasks can be found under ‘The Materials’ here: https://www.map.mathshell.org/background.php#material

Summative (MARS) Assessment Tasks can be found publicly available here, for example: https://www.map.mathshell.org/tasks.php

Unsuitability of MARS math tasks as an Outcome Measure/Assessment:

The Stanford Research Institute’s (SRI) report on SFUSD’s program uses student answers to a single MARS math task as its outcome measure. However, MARS math tasks do not meet robust, independent standards as an outcome measure, unlike California’s CAASPP test (https://www.caaspp.org/) or the NWEA MAP test (https://www.nwea.org/), for example. MARS math tasks are subjectively scored and lack evidence of reliability or predictive validity as an evaluation instrument. They are not a suitable outcome measure.
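
To make the reliability point concrete, here is a minimal sketch (ours, not drawn from the source or from SRI) of how inter-rater reliability for a hand-scored task could be quantified. The rater scores are hypothetical rubric scores (0-4) assigned by two independent raters to the same ten student responses; it is this kind of agreement evidence that we could not find published for MARS math tasks.

    # Minimal inter-rater reliability check on hypothetical hand-scored data.
    from sklearn.metrics import cohen_kappa_score

    scores_rater_a = [4, 3, 2, 4, 1, 0, 3, 2, 2, 4]  # rater A's rubric scores (hypothetical)
    scores_rater_b = [4, 2, 2, 3, 1, 0, 3, 1, 2, 4]  # rater B's rubric scores (hypothetical)

    # Quadratic weighting penalizes larger disagreements more heavily,
    # the usual convention for ordinal rubric scores.
    kappa = cohen_kappa_score(scores_rater_a, scores_rater_b, weights="quadratic")
    print(f"Weighted Cohen's kappa: {kappa:.2f}")  # values near 1.0 indicate strong agreement

A credible outcome measure would publish such agreement statistics, along with predictive validity evidence, from large, independent samples; no such figures appear to exist for MARS tasks.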

Most education studies that “meet standards” (see the US Department of Education What Works Clearinghouse, https://ies.ed.gov/ncee/wwc) use robust, independent measures: measures with strong evidence of consistency (reliability) that have been administered to thousands, if not millions, of students each year, including students of different socioeconomic backgrounds. These measures are carefully and consistently administered and scored, the entire outcome-measurement process is independent of the original study, and the measures are validated for whether they predict later student success (predictive validity).

None of this applies to MARS math tasks. We could not find a publisher for MARS math task tests other than these websites (https://www.map.mathshell.org/index.php, http://web.archive.org/web/20030303050154/http://www.nottingham.ac.uk/education/MARS/eval/own.htm), where there were several references to MARS tests being “drafts” and where the stimulus tests and answers are publicly available (no test security). We could not find measures of reliability or predictive validity (whether they predict later academic success) for MARS tasks as an outcome tool. The MARS measure consists of several math problems that teachers hand score, leading to potential inconsistency and subjectivity.

MAP itself admits that its MARS math task test materials are in draft form. From https://www.map.mathshell.org/tests.php:

The purpose of these (MARS math tasks tests) is to provide examples of the type of tests students should be able to tackle, if the aspirations of the Common Core State Standards are to be realized. The methodology of the tests and their balance of different task types is discussed in more detail here.

Note: please bear in mind that these materials are still in draft and unpolished form.

The Summative Assessment Tasks may be distributed, unmodified, under the Creative Commons Attribution, Non-commercial, No Derivatives License 3.0. All other rights reserved. Please send any enquiries about commercial use or derived works to map.info@mathshell.org.

Note: please bear in mind that these prototype materials need some further trialing before inclusion in a high-stakes test.

https://www.map.mathshell.org/tests.php

That student learning was measured via answers to just one MARS math task casts grave doubt on the validity of SRI Education’s evaluation of SFUSD’s math program. As such, SRI’s evaluation of SFUSD’s program is not credible, and the draft framework cannot claim that SFUSD’s program shows student benefit.

The claims of beneficial SFUSD math program ‘results’ have thus been discredited. The results of the SFUSD math program do not support a conclusion of student benefit from either detracking or math deceleration (delaying Algebra 1 to grade 9), hardly the model on which to base a shift in math curriculum for six million California schoolchildren.

Example 3: The draft framework claims a study (Ch. 1 Lines 463-471) shows the success of 8 unnamed Bay Area (California) school districts that ‘detracked’ middle school mathematics, akin to what the draft 2022 framework proposes to do. The citation is missing from Chapter 1, but Chapter 7 cites the same paper; the link given at the end of Ch. 7, however, leads to a 2021 version of the paper, different from the 2018 version specified in the written citation in the draft framework (Ch. 1 Lines 463-471). The Ch. 7 citation is:

Boaler, J. & Foster, D. (2018). Raising Expectations and Achievement: The Impact of Wide Scale Mathematics Reform Giving All Students Access to High Quality Mathematics. https://www.youcubed.org/resources/raising-expectations-achievement-mathematics/

(Ch 7, lines 428-429 and the citation references at the end of Ch. 7)

We could not locate a 2018 version of this paper. An earlier version, believed to be from 2014, can be found here: https://bhi61nm2cr3mkdgk1dtaov18-wpengine.netdna-ssl.com/wp-content/uploads/2017/03/Raising-Expectations.pdf

A Google Scholar search (https://scholar.google.com/) shows that ‘Raising Expectations…’ by Boaler and Foster is an unpublished paper that has rarely been cited. There is no indication in Google Scholar that it was peer reviewed.

The paper ‘Raising Expectations…’, which discusses a detracking and teacher professional development intervention, does not identify any of the 8 (unnamed Bay Area) districts it studies, nor any of the 25 comparison districts. Nor does it provide any district-level results for either the California Standards Test (CST) or the MARS math tasks, the outcome metrics mentioned in the paper. As such, it provides no specific, verifiable data on the results of its intervention. It also provides sparse detail on its study design and little detail on the criteria used to recruit the districts eventually studied. And the use of MARS math tasks as an evaluation tool is problematic and unsuitable, as explained above.

Evaluations predicated upon MARS math tasks lack credibility.

‘Raising Expectations…’ (the unpublished paper on 8 unnamed Bay Area school districts) lacks detail on study design, does not provide verifiable results, offers unsupported conclusions, and uses a non-normed, non-credible evaluation tool. As such, ‘Raising Expectations…’ does not provide credible evidence to support the draft framework’s proposals.

Example 4: Adjacent and linked to the Ch. 1 mention of the 8 unnamed Bay Area school districts paper (‘Raising Expectations…’) (Ch. 1 Lines 463-471) is an additional citation intended to further buttress the draft framework’s claim that detracking leads to increased student math achievement:

The overall achievement of the students after the de-tracking significantly increased. The cohort of students who were in eighth-grade mathematics in 2015 were 15 months ahead of the previous cohort of students who were mainly in advanced classes (MAC & CAASPP 2015). (Ch. 1 Line 467-471)

Yet the full citation for this reference (at the end of Ch. 1, Lines 1062-1063), ‘MAC & CAASPP (2015) Technical Report, Years 2014 and 2015, Educational Data Systems, 15850 Concord Circle, Morgan Hill’, provides no evidence of student benefit due to detracking. Instead, the citation leads to a technical report on the administration of CAASPP testing in 2014 and 2015 (https://www.cde.ca.gov/ta/tg/ca/documents/caaspp14techrpt.pdf).

Perhaps the draft framework meant to lead the reader instead to this Silicon Valley Mathematics Initiative (SVMI) report: “Student Achievement on MAC/MARS and California’s State Tests,” by David Foster. Even so, the SVMI report presents no specific, verifiable data or evidence for the claim that detracking, alone or in combination with teacher professional development, increases student achievement. And once again, the paper relies upon the non-credible, non-normed MARS math tasks as its assessment tool. As stated above, MARS math tasks are subject to rater bias and are psychometrically inadequate as outcome measures. Evaluations predicated upon MARS math tasks lack credibility.

A search on Google Scholar for the SVMI report produces no results.

Example 5: STEM professionals critiqued the draft framework for citing research papers in ways inconsistent with their actual findings, including on detracking. The draft framework claims “...there is research demonstrating negative effects of ability grouping and tracking on those in the “high” group or track as well...” (Becker, Neumann, Tetzner, Böse, Knoppick, Maaz, Baumert, & Lehmann, 2014; Mulkey, Catsambis, Steelman, & Crain, 2005) (Ch. 9 Lines 1039-1042). However, this is a misrepresentation. Mulkey, Catsambis, et al (2005) found that tracking had positive effects for lower-achieving males and all females, with the only caveat that some higher-achieving males may suffer a ‘depressed self-concept’ when comparing themselves to other high-achieving classmates. The draft framework omits the main finding of the Mulkey, Catsambis research, which was to ‘reaffirm tracking has persistent instructional benefits for all students.’

The other study cited in the same passage (Ch. 9 Lines 1039-1042), Becker, Neumann, et al (2014), studied an unrelated situation: the psychological effect on students of being placed into a selective high school in Germany.

Neither study cited by the draft framework supports its claim about negative effects of ability grouping and tracking.

The STEM professionals’ critique about the draft framework’s detracking claims can be found here: (https://gdoc.pub/doc/e/2PACX-1vQvuzlJ8MWthsqOhRLxQc5akGS0JkgThz3umqO3K-WQiXFhWiq9qw-9iYdTyC652Ftjvv5nHvgGYTEv).

Section 4 of the STEM professionals’ critique:

4) RESEARCH EVIDENCE

The CMF repeatedly cites research papers in a way that is inconsistent with the actual findings: (This is not a judgment about the papers themselves, only about the way they are used to claim the recommendations are evidence based.)

  • The CMF says regarding acceleration or differentiation prior to high school that “there is research demonstrating negative effects of ability grouping and tracking on those in the ‘high’ group or track as well”, citing this and this. The first paper (Becker, Michael, Neumann, Marko, et al 2014 - (Ch. 9 1039-1042 and 1283-1286)) is about the negative psychological effect of being placed into a selective high school in Germany (an entirely unrelated situation) and the second (Mulkey, Catsambis 2005 - Ch. 9 Line 1039-1042 and 1382-1384) says its results “reaffirm that tracking has persistent instructional benefits for all students” (!) while ascribing the negative effects of tracking for those in the high group to having more high-ability classmates.
  • The CMF avoids citing other works such as Figlio and Page, that followed a random sample of several thousand students, and concluded that they found “no evidence that detracking America’s schools … will improve outcomes among disadvantaged students.” or Card and Giuliano that found that tracking has “significant positive effects on the reading and math scores of high achievers, with the gains concentrated among Black and Hispanic students.” https://gdoc.pub/doc/e/2PACX-1vQvuzlJ8MWthsqOhRLxQc5akGS0JkgThz3umqO3K-WQiXFhWiq9qw-9iYdTyC652Ftjvv5nHvgGYTEv

Mulkey, Catsambis 2005 ‘The long-term effects of ability grouping in mathematics: A national investigation’ can be found here: https://fr.art1lib.org/book/8150754/cf74a2

The Becker study can be found here: https://psycnet.apa.org/buy/2014-01451-001

Thus the draft framework misrepresents research about tracking: it portrays research as though it supports detracking when it does not.

Example 6: With its proposal to universally delay Algebra 1/Integrated Math I until grade 9 (Ch. 8 Lines 335-338), the draft framework impedes a student’s ability to complete calculus in high school: there are only 4 years (starting in grade 9) in which to complete 5 years’ worth of content (Algebra 1, Geometry, Algebra 2, Precalculus, Calculus). The draft framework tries to sidestep the impediment it has itself created by claiming that students can skip Precalculus (Ch. 8 Lines 843-847) and still take Calculus in grade 12. Its claim:

“Research has shown that taking a precalculus class does not increase success in calculus (Sonnert & Sadler, 2014), and recent innovative approaches for students in California community colleges have shown that students who move from Algebra 2 to supported calculus classes are more successful than those who go through prerequisite courses (Mejia, Rodriguez, & Johnson, 2016).” (Ch. 8 Lines 843-847)
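
Before turning to the STEM professionals’ critique, the scheduling arithmetic behind the impediment can be made concrete with a short sketch (ours, for illustration only; the course names are the standard sequence listed above):

    # The standard five-course sequence leading to Calculus.
    sequence = ["Algebra 1", "Geometry", "Algebra 2", "Precalculus", "Calculus"]

    # Under the draft framework, Algebra 1 begins in grade 9, leaving
    # four school years (grades 9-12) at one math course per year.
    years_available = 4

    shortfall = len(sequence) - years_available
    print(f"{len(sequence)} courses, {years_available} years: "
          f"{shortfall} course must be skipped, doubled up, or compressed.")
    # -> 5 courses, 4 years: 1 course must be skipped, doubled up, or compressed.

This is the gap the draft framework proposes to close by having students skip Precalculus.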

In their critique of the draft framework (https://gdoc.pub/doc/e/2PACX-1vQvuzlJ8MWthsqOhRLxQc5akGS0JkgThz3umqO3K-WQiXFhWiq9qw-9iYdTyC652Ftjvv5nHvgGYTEv), STEM professionals dispute this claim, saying ‘...skipping precalculus is only feasible for very few advanced students...’ They go on to document the draft framework’s misrepresentation of the research it cites for the claim that students could skip Precalculus and take Calculus directly, showing that the cited research does not support the claim at all:

Section 4 of the critique:

“In order to support the viability of taking calculus in 12th grade following the MIC (Mathematics: Investigating and Connecting) pathway or a traditional/integrated pathway with Algebra 1 taken in 9th grade, the CMF cites the following (Chapter 8, page 32):

  • The CMF suggests that this paper shows that it might be possible to skip precalculus altogether, but the paper is focused on pre-calculus in college, noting that this population often already took pre-calculus in high school!
  • The CMF also claims that this paper shows that “students who move from Algebra 2 to supported calculus classes are more successful than those who go through prerequisite courses.” but the paper talks about moving from algebra to statistics, and doesn’t discuss calculus at all. (https://gdoc.pub/doc/e/2PACX-1vQvuzlJ8MWthsqOhRLxQc5akGS0JkgThz3umqO3K-WQiXFhWiq9qw-9iYdTyC652Ftjvv5nHvgGYTEv)

Summary: The above are just a few examples of the misuse, misstatement, and misrepresentation of research in the draft 2022 math framework; there are more. A number of articles and letters have been written about these shortcomings, as has the white paper ‘Inequity in Numbers’ by Families for San Francisco, which disputes and refutes the purported ‘benefits’ of the SFUSD math program.

Conclusion: For the foundational studies that underpin the draft framework’s detracking and math deceleration proposals, the record is as follows:

  1. (Burris, et al 2006) has been misrepresented as supportive of the draft framework’s proposals when it is not. Burris, et al (2006) supports benefit from universal math acceleration, not from the universal math deceleration the draft framework recommends.
  2. (Boaler et al, 2018) is an opinion piece, not a study, and the underlying report about the SFUSD math program (http://www.sfusdmath.org/uploads/2/4/0/9/24098802/sfusd_middle_school_sli-math-2016.pdf) lacks credibility: its evaluation instrument (MARS math tasks) is not psychometrically sound. As such, it cannot support the draft framework’s detracking and deceleration proposals.
  3. (‘Raising Expectations…’, the unpublished paper on 8 unnamed Bay Area school districts) provides no verifiable evidence of benefit from a detracking and professional development intervention, offers unsupported conclusions, and again uses an evaluation instrument (MARS math tasks) that is not psychometrically sound.
  4. (MAC & CAASPP 2015) does not show student benefit from detracking; it is simply a technical report on the administration of the CAASPP test in 2014-2015.
  5. (Mulkey, Catsambis, Steelman, & Crain, 2005) does not support the draft framework’s claim of negative effects of ability grouping and tracking; it supports the opposite, as it ‘reaffirm(s) tracking has persistent instructional benefits for all students,’ with little downside. The other study, (Becker, Neumann, Tetzner, Böse, Knoppick, Maaz, Baumert, & Lehmann, 2014), has no relevance to the draft framework’s claims about negative impacts of ability grouping and tracking.
  6. The two studies cited by the draft framework to justify its claim that students can skip Precalculus and take Calculus directly do not support that claim. (Sonnert & Sadler, 2014) discusses the poor math preparation of college calculus students, in particular deficient Precalculus teaching in high school, but does not suggest students can skip Precalculus before taking Calculus in high school. (Mejia, Rodriguez, & Johnson, 2016) discusses how accelerated pre-statistics pathways at community colleges have helped students move into college-level statistics courses in place of the traditional math pathway, but likewise does not suggest students can skip Precalculus before taking Calculus in high school.

To justify such a large shift in guidance for California mathematics education, the research underlying the draft 2022 math framework’s proposals should be robust and rock solid rather than misrepresented.

Citations

Burris, C. C., Heubert, J. P., & Levin, H. M. (2006). Accelerating Mathematics Achievement Using Heterogeneous Grouping. American Educational Research Journal, 43(1), 105-136.
https://www.semanticscholar.org/paper/Accelerating-Mathematics-Achievement-Using-Grouping-Burris-Heubert/89bab06af3343fb9fdf24e74fab1cacda8b1b4ce
Boaler, J., Schoenfeld, A., Daro, P., Asturias, H., Callahan, P., & Foster, D. (2018). Opinion: How One City Got Math Right: Pathways that work. The Hechinger Report, October 9, 2018.
https://hechingerreport.org/opinion-how-one-city-got-math-right
SRI Education. Final Report: Year 3 SFUSD STEM Learning Initiative Evaluation.
http://www.sfusdmath.org/uploads/2/4/0/9/24098802/sfusd_middle_school_sli-math-2016.pdf