


An Analysis of Presses of Third-Grade Student Prompts for

the Amount and Characteristics of Writing

Tutita M. Casa

Juliana R. MacSwan

Kara E. LaMonica

Madelyn W. Colonnese

University of Connecticut

Janine M. Firmender

Saint Joseph’s University

Paper submitted to the NCTM Research Conference, April 2015.

Abstract

Curriculum guidelines and professional organizations’ recommendations lack details about how often and how much students should write in mathematics and what characteristics should define their writing. This study contributes an analytic framework that addresses presses for the amount, length, and characteristics of mathematical writing. Using this framework, we analyzed 1,949 writing prompts in the student books of nine comprehensive Grade 3 curriculum resources. Findings indicate marked variation in how often and how much students are pressed to write. Most prompts press students to explain what they did to solve a problem or why they solved it in a given way, most often about number concepts, with most pressing for procedures. The greatest percentage of prompts press students to write about their own solutions and do not press them for specific writing features.

Keywords: Curriculum, Writing, Elementary, Mathematics

Grade 3 Student Prompts Across Nine Comprehensive Curriculum Resources: An Analysis of Presses for the Amount and Characteristics of Writing

Even though teachers of mathematics inarguably have their students write more than just numbers, “because mathematics is so often conveyed in symbols, … written communication about mathematical ideas is not always recognized as an important part of mathematics education” (National Council of Teachers of Mathematics [NCTM], 2000, p. 60). Yet research indicates that writing can enhance student learning (Borasi & Rose, 1989; Rothstein, A., Rothstein, E., & Lauber, 2003) and, “at its best, writing is learning” (The National Commission on Writing, 2003, p. 51). When writing, students are active learners (Kasparek, 1996) and are encouraged to clarify, analyze, compare, reflect on their thinking, and synthesize information, leading to a deeper conceptualization of mathematics (Emig, 1977; Irmscher, 1979; Odell, 1980; Vygotsky, 1962, as cited in Kasparek, 1996; Rothstein, A., Rothstein, E., & Lauber, 2003). Mathematical writing also can aid teachers in understanding their students’ reasoning processes and misconceptions (Yang, 2005). Undoubtedly, then, the number of times and the amount students are asked to write, together with the ways prompts press students to include particular characteristics in their writing, will influence the extent of these benefits on students’ learning of mathematics.

It is therefore not surprising that NCTM (cf. 1989, 1991, 2000) alludes to the importance of engaging students in mathematical writing. However, NCTM and current national curriculum documents lack explicit guidance with reference to mathematical writing. For example, the 2000 Principles and Standards for School Mathematics notes that “communication is an essential part of mathematics and mathematics education” (NCTM, p. 60), but it does not provide a clear definition of what this writing should entail or how often and to what extent students should write. It also notes that “students’ descriptions of problem-solving strategies and reasoning should become more detailed and coherent” and that third, fourth, and fifth graders’ writing “should be more coherent than in earlier grades” (p. 194) without elaborating on what constitutes coherence. Additionally, the Common Core State Standards for Mathematics (CCSS-M; National Governors Association Center for Best Practices [NGA] & Council of Chief State School Officers [CCSSO], 2010a) only implies the inclusion of mathematical writing in the third mathematical practice, which states that “Students at all grades can … read the arguments of others” (p. 7, emphasis added).

Teachers, curriculum writers, and researchers may find some initial guidance for mathematical writing in the Common Core State Standards for English Language Arts (CCSS-ELA; NGA & CCSSO, 2010b). The Writing Standards propose that students write regularly across disciplines for multiple reasons, a position also espoused by The National Commission on Writing (2003). While mathematical writing is not expressly described in the Writing Standards either, educators may rely on the general descriptions of types of writing found in Appendix A (NGA & CCSSO, 2010c) as potential types of mathematical writing. These text types include arguments, which offer “a reasoned, logical way of demonstrating that the writer’s position, belief, or conclusion is valid.” Also, “informational/explanatory writing conveys information accurately. … [to] help readers better understand a procedure or process, or to provide readers with an enhanced comprehension of a concept” (p. 23).

With only cursory attention given to mathematical writing in curriculum guidelines, questions remain about how often and to what extent students should be writing in mathematics and what characteristics define such writing. The importance of addressing these questions has become more urgent given that students’ written mathematical reasoning will be evaluated on the Common Core State Standards (CCSS)-related assessments. In particular, the Partnership for Assessment of Readiness for College and Careers (PARCC, 2010) asserts that students will be assessed on their ability to communicate a “well organized and complete response” (p. 13). Similarly, the Smarter Balanced Assessment Consortium (SBAC, 2010) identifies writing as an essential characteristic of its Performance Tasks.

Unequivocally, it is an opportune time for the mathematics and writing education communities to attend to mathematical writing and thereby offer guidance to teachers and curriculum writers. In turn, researchers’ investigations can be based on more precise guidelines, and assessment developers can utilize findings to refine test questions and rubrics. Regardless, it is immediately important that we understand the ways in which students currently are being pressed to write during mathematics class. Ultimately, students can benefit from this collective work.

Because curriculum materials feature prominently in the teaching and learning of mathematics (Grouws, Smith, & Sztajn, 2004; Reys, B., Reys, R., & Chavez, 2004; Tyson-Bernstein & Woodward, 1991), they are resources that allow us to investigate this phenomenon. In fact, a national survey of over 7,700 teachers revealed that those teaching mathematics at the elementary level overwhelmingly reported relying on a commercially available text or program (Banilower et al., 2013). Student textbooks in particular have been identified as central to the implementation of standards such as the CCSS-M (Schmidt & Burroughs, 2013), and these curriculum resources provide students an introduction to the processes of the mathematics community (Herbel-Eisenmann, 2007), which include mathematical writing. It is therefore important to better understand how often, at what length, and with what characteristics writing prompts press students to write within their student books.

Research Questions

Since “teachers decide what to teach, how to teach it and what sorts of exercises to assign their students” (Reys, B., Reys, R., & Chavez, 2004, p. 62) based largely on available curriculum resources, it is important to review these resources to understand how students are positioned to engage in the practice of mathematical writing. We conjectured that the writing prompts press students to write a given number of times, certain amounts, and in particular ways. This research focused on curriculum resources for the third grade for two reasons: it is the first grade level at which the CCSS-ELA (NGA & CCSSO, 2010b) calls for students to write across the curriculum, and Grade 3 is the initial grade level evaluated summatively on state-level tests as well as on the CCSS-M assessments. The following research questions guided our investigation:

1. How often are students asked to write in commonly used and commercially available comprehensive Grade 3 student books?

2. How do writing prompts in commonly used and commercially available comprehensive Grade 3 student books press students to produce writing with particular characteristics, and how frequently do they call for such characteristics?

Methods

We next describe the methods undertaken to answer these research questions. We begin with a description of the decisions made to identify our data sources. The analytic framework we developed for this study is then described. Lastly, we explain how the writing prompts identified for this investigation were analyzed.

Data Sources

Since “very little systematic information on which materials are being used in which schools” (Chingos & Whitehurst, 2012, p. 1) is available, we sought to narrow our focus of potential data sources to examine. We considered commercially available curriculum resources that would give us a sense of the writing expectations offered to a vast number of students during an entire academic year. This led us to consider which curriculum series, and which resources offered within each package, we would rely on to identify the writing prompts.

Identification of curriculum series. We contacted our State Department of Education mathematics consultant and three local mathematics specialists who recently had looked into purchasing series for their own districts and asked for their advice about potential series to investigate. We also used Education Market Research’s 2011 National Survey of Mathematics to cross-reference and augment the initial list of recommendations. We made sure to include any series originally developed in the 1990s with funding from the National Science Foundation that aimed to bolster the quality of mathematics curriculum resources in response to guidelines set forth in NCTM’s Curriculum and Evaluation Standards for School Mathematics (1989; Tarr et al., 2008), including Everyday Mathematics, Investigations in Number, Data, and Space, and Math Trailblazers (Bargagliotti, 2012), with the premise that they may intuitively address the aforementioned calls for mathematical writing. Lastly, given that one of our primary rationales for this study was rooted in the assumption that teachers are being positioned to address calls set forth in the CCSS-M (NGA & CCSSO, 2010a) and CCSS-ELA (2010b), we only investigated materials published after the final documents were released in 2010[1] and that were available on the market; this precluded study of some other commonly used elementary series, such as Growing with Mathematics, last published in 2004. Altogether, we identified nine series: Everyday Mathematics, which according to Education Market Research (2011) was closely or selectively followed by almost 20% of their survey sample; enVisionMATH (13.5%)[2]; Saxon Math (8.1%); Investigations in Number, Data, and Space (5.4%); Math Expressions (1.8%); Math in Focus: Singapore Math (1.4%); Math Trailblazers (1.4%); and Go Math! and My Math (no usage data available).

Identification of resources within the curriculum series. We next identified the resources within the Grade 3 materials for each series that students would most likely interact with directly during instruction[3]. Although this investigation was not focused on the enacted curriculum, where we might have considered the teacher manual as a data source, it is reasonable to assume that the student book is the main resource that meets the aforementioned criteria. In fact, Choppin (2011) found that teachers did not omit any activities from student books, so investigating these sources was apropos. The following series included just one version of the student edition (albeit some with two volumes): a hardcover textbook, a consumable, or both with identical pages[4]:

1. enVision Math Common Core (Charles et al., 2012), hereafter referred to as enVision;

2. Everyday Mathematics, Student Math Journal, Volumes 1 and 2 (Bell et al., 2012a; 2012b), or Everyday Math;

3. Go Math Common Core Edition, student workbook (Dixon, Larson, Leiva, & Adams, 2012), or Go Math;

4. Investigations in Number, Data, and Space, Student Activity Book, Common Core Edition (TERC, 2012), or Investigations;

5. Math Expressions Common Core, Volumes 1 and 2, student workbooks (Fuson, 2013a; 2013b), or Math Expressions;

6. My Math, Student Edition, Volumes 1 and 2 (Carter et al., 2013a; 2013b), or My Math; and

7. Saxon Math, Intermediate 3, Student Edition (Hake, 2012), or Saxon Math.

Of special note are two series that offered two different student editions to be used during instructional time. Math in Focus offered a hardcover textbook and workbooks that contained different problems. The textbook notes to students that, “You don’t write in the [image of green rectangular space] in this textbook. This book has a matching Workbook. When you see the pencil icon [“on your own” icon that has an image of a pencil], you will write in the Workbook” (Kheong, Ramakrishnan, & Choo, 2013a, p. xv). Following this directive, we identified writing prompts predominantly from the workbooks. The textbook, however, did include some “‘Reading and Writing Math’ Math Journal” pages that explicitly noted the expectation to write, and, thus, we included those writing prompts as data. Math Trailblazers likewise indicated in the lessons contained in its Teacher Lesson Plans (i.e., teacher guidebook) that the Student Guide and the Student Activity Book both are considered “student books” (Beal et al., 2014a, p. 2), so we drew data from both student resources. Specifically, we drew on the following student resources:

8. Math in Focus: Singapore Math (hardcover versions 3A and 3B; Kheong, Ramakrishnan, & Choo, 2013a, 2013b) and Workbooks 3A and 3B (2013c; 2013d), or Math in Focus; and

9. Math Trailblazers: Common Core State Standards, Student Activity Book, Volumes 1 and 2, Fourth Edition (Beal et al., 2014a; 2014b) and Student Guide, Volumes 1 and 2, Fourth Edition (2014c; 2014d), or Trailblazers.

We relied on hardcopies rather than electronic versions for all resources. We also depended on consumables, when available, to ascertain the amount of space provided for students to write; the exceptions were enVision and Saxon Math, which offered only textbooks, the previously noted Math in Focus Math Journal pages in the textbook, and the Trailblazers Student Guide textbook. We reference the consumables and textbooks as “student books” henceforth.

Identification of writing prompts. Many researchers who previously have analyzed curriculum resources focused solely or in part on the mathematical content (e.g., elementary-level content by Baker et al., 2010; arithmetic average by Cai, Lo, & Watanabe, 2002). While mathematical content is central to any given mathematics curriculum resource, we found during our initial perusal of the student books that identifying actual prompts requiring students to write was not as straightforward a process as we had originally assumed, such as when students are asked to “write” yet very limited space is provided for them to do so.

There were instances where an indication for students to write made us consider the notion of “writing” in and of itself. Although students were positioned to put pencil to paper with many prompts—particularly with the consumables—we included as suitable data sources only questions prompting students to address the CCSS-ELA’s (NGA & CCSSO, 2010b) Grade 3 Writing Standards, which in part have them “write informative/explanatory texts to examine a topic and convey ideas and information clearly” and arguments “supporting a point of view with reasons” (p. 20). Deciding to exclude particular prompts helped us further our working definition of “writing.” Specifically, we excluded prompts that pressed students to, for instance: label (e.g., graphs, place value, units); write a number sentence or equation; fill in blanks; exclusively show their steps to a solution (e.g., “Show your work”); copy information; only draw; and solely provide a numerical answer (e.g., “How many…?”) or word (e.g., yes/no, true/false).

Instead, we pursued prompts that obligated students to write minimally one complete sentence. We initially relied on several indicators, such as the directions themselves (e.g., “write a…”), subheadings (e.g., “Writing to Explain” in enVision), and icons (e.g., a pencil with the word “writing” on it in Investigations and one with “Write Math” in Go Math). However, because these indicators did not always seem to press students to write at least a sentence, we further had to ascertain whether a given prompt in fact pressed students to write directly or, in the case of prompts more focused on procedures, merely presented the possibility to write. One characteristic we relied on for the consumables was the presence of space for students to write, which sometimes included lines. We also considered the prompt stems and identified some that may call for students to write at least one sentence (e.g., “explain what,” “explain your reasoning,” “tell why,” “explain how you decided,” “what do you know about,” “describe,” “compare”). In some instances, we were compelled to place ourselves in the position of a student using the resource because the tasks surrounding the prompt might press them to write a particular amount.

At times, students also were given the option to write and/or include another representation. Noting that the CCSS-M (NGA & CCSSO, 2010a) calls for students to model with mathematics and the CCSS-ELA (NGA & CCSSO, 2010b) guides them to illustrate ideas, we decided to include such prompts (e.g., Trailblazers’ “Show or tell how you ___.”) because the opportunity to write presented itself; it also is reasonable to assume that third graders would write and not just draw. Additionally, at times students were asked to write only if they disagreed or agreed with a given answer (e.g., Math Expressions’ “Is my answer right? If not, please…”). Again, this instance presented the option to write, so we included such prompts in the data corpus.

Analytic Framework

We were obligated to develop an analytic framework to carry out our study because of the cursory description of writing in curriculum-related documents and the sometimes ambiguous presentation of the prompts in the student books. Charalambous, Delaney, Hsu, and Mesa (2010) highlight the importance of a framework so as not to approach a study with “disparate criteria with no underlying structure or organization” (p. 122). We found Usiskin’s (2010) roads analogy describing the development and implementation of curriculum resources informative in framing our endeavors. In particular, he envisions curriculum writers as the engineers building the roads: “Some of the roads [curriculum writers] take, however … are new roads. At times they are uneven because they are unpaved or pass through unchartered territory” (p. 26).

Our assumptions. Other researchers who have analyzed mathematical content presented in curriculum resources have relied on types of understanding of the given topic and an abundance of available literature (e.g., see Cai, Lo, and Watanabe’s 2002 investigation of arithmetic average; Haggarty and Pepin’s 2002 research focused on angles; Mesa’s 2004 investigation of views of functions; Charalambous et al.’s 2010 study of the addition and subtraction of fractions; Mayer, Sims, and Tajika’s 1995 study of problem solving with addition and subtraction of signed whole numbers; Schmidt, Wang, and McKnight’s 2005 study and Schmidt and Houang’s 2012 research comparing general U.S. content standards with those of top-performing TIMSS countries). However, while mathematical content is ubiquitous in any given mathematics curriculum resource, as already discussed, we found during our initial review of potential data sources that identifying actual prompts in the student books that require students to write was not a clear-cut endeavor.

Given that writing is considered a process (see the communication standard in NCTM, 2000) or practice (see the third mathematical practice in NGA & CCSSO, 2010a) and arguably is less defined than mathematical content areas, we were compelled to draw upon multiple resources attempting to define such an effort. We also needed to ensure that our analytic framework applied to any and all forms of writing prompts. Like Thompson, Senk, and Johnson (2012), who analyzed a mathematical process—namely, reasoning and proof—found in high school textbooks, we proceeded with fundamental assumptions while developing our analytic framework[5]. We adapted two of their related assumptions for our own purposes:

1. The framework should be applicable to various types of writing and across all content areas.

2. The framework should be applicable across all curriculum resources regardless of their characteristics (e.g., instructional goals, alignment with guidelines set forth by NCTM).

While a plethora of mathematics education literature is available on other process-oriented practices, like reasoning and proof, limited work has been done in the area of mathematical writing (Ryve, 2011). Cynthia Langrall’s (2014) recent editorial reminding mathematics education researchers of the continued importance of linking research to practice, together with Usiskin’s (2010) proposition that “it is natural to examine documents at the national level” (p. 26), led us to focus on resources available to and guiding teachers nationwide. We thus modified Thompson, Senk, and Johnson’s (2012) third assumption and added a fourth specific to our purposes:

3. The framework should build upon national mathematics education recommendations.

4. The framework should identify ways in which students might be pressed by the curriculum resource to construct written responses with particular characteristics.

Initial resources guiding the development of our framework. We sought to identify explicit references to writing that would serve as guides for our analytic framework. NCTM’s 1991 Professional Standards for Teaching Mathematics seminally discussed the role of discourse by calling for worthwhile tasks that “promote communication about mathematics” (p. 25) and addressed the teacher’s and students’ roles and tools for enhancing it. In 2000, the Principles and Standards followed with the Grades 3-5 communication standard that in part called for students to “communicate their mathematical thinking coherently and clearly” (p. 194). Unfortunately, writing was only cursorily mentioned in both steering documents, and they did not provide us with specific enough direction (e.g., “Understanding that writing about one’s ideas helps to clarify and develop one’s understandings would make a task that requires students to write explanations look attractive” [1991, p. 27] and, teachers “should expect students’ writing to be correct, complete, coherent, and clear. Especially in the beginning, teachers need to send writing back for revision” [2000, pp. 198-199]). We consequently turned to other offerings in both documents that plausibly pertain to writing, albeit not explicitly. Guidance on worthwhile tasks (1991) had us consider the content, the openness of problems, and facility with procedural skills and understanding of concepts. Further, the Principles and Standards’ (2000) communication, representation, and reasoning and proof standards respectively had us consider students attending to others’ arguments, using representations to help communicate their arguments, and evaluating and again considering others’ arguments.

We next reviewed the CCSS-M’s (NGA & CCSSO, 2010a) Standards for Mathematical Practice (MP) for further and more current guidance. The most obviously applicable MP is the third, which calls for students to construct viable arguments and critique the reasoning of others. Essentially, this MP focuses more on the process of creating such arguments than on expectations for writing. While writing is not explicitly mentioned, it can be assumed, given that students should be reading others’ written arguments. We reviewed the other MPs, and none directly addressed writing. Nonetheless, we deduced that MP 1 (make sense of problems and persevere in solving them), MP 4 (model with mathematics), and MP 6 (attend to precision) indirectly address writing by respectively calling for students to write substantially, include representations with their writing, and incorporate the use of academic language, symbols, and labels.

Given our focus on writing, we also referenced the CCSS-ELA (NGA & CCSSO, 2010b). In fact, the expectation for students to write across disciplines begins in Grade 3. Three text types and purposes are mentioned, including a) opinion pieces, b) informative/explanatory texts, and c) narratives, with the first two influencing our analytic framework. While “opinions” do not apply to the mathematical domain, this term is used to describe the development of argumentative writing in Grades K-5 (2010c). We ascertained that “provide reasons that support the opinion” (2010b, p. 20) alludes to the expectation that students be pressed to include mathematical reasoning in their writing. Constructing a written mathematical explanation, as with informative/explanatory texts, also should include detail.

Since mathematics is not explicitly noted in the Grade 3 Writing Standards (NGA & CCSSO, 2010b), we also referenced information about the CCSS-related mathematics assessments available at the time of this study to further guide our analytic framework. However, many of the writing stems in the sample items from PARCC and SBAC provided limited guidance for our analytic framework given that they included only the items themselves (e.g., with stems like “explain”) and no rubrics outlining expectations.

Overall, we experienced a phenomenon similar to that of Thompson and colleagues (2012), whose two initial sources guiding their framework “all had some aspects that [they] found useful, [yet] each failed to capture some nuances…that [they] believed were important to capture” (p. 258). Simply put, none of the aforementioned documents explicitly addressed mathematical writing, and they provided only a broad framework that would have limited our ability to thoroughly address our research questions. As recommended by Charalambous et al. (2010), we consequently used a top-down as well as a bottom-up approach and inductively reviewed the writing prompts.

Our framework. In line with other researchers investigating characteristics of written curriculum materials, such as textbooks (e.g., Baker et al., 2010; Herbel-Eisenmann, 2007; Thompson et al., 2012), we acknowledge that such investigations inherently cannot shed light on any enactments of these resources, such as the guidance provided by teachers or students’ interpretations of the writing prompts. Thus, the framework that we present showcases “an optimistic or broad view of the potential opportunities that students have to engage with” (Thompson et al., 2012, p. 261) mathematical writing. Nonetheless, while the aforementioned documents (i.e., NGA & CCSSO, 2010a, 2010b; NCTM, 1991, 2000) ambiguously addressed writing, our inductive review of the prompts and reliance on these resources resulted in a comprehensive set of categories that are reflected in our research questions and that we explain next.

Considering students’ opportunities to write resulted in two categories. First, since students should regularly write for “shorter time frames (a single sitting or a day or two)” (NGA & CCSSO, 2010b, p. 21), it was natural for us to investigate how often students are pressed to write. Relatedly, features of the workspace also press for an amount of writing to produce: the amount of space provided may compel students to produce written responses of a given length, which in turn may affect the depth of their ideas.

In addition, the writing prompts appeared to press students regarding contextual characteristics. Noticeably, “[t]he content is unquestionably a crucial consideration in appraising the value of a particular task” (NCTM, 1991, p. 27). A press for addressing particular mathematical content in the writing might call for certain features in written responses that are inherent to the respective topic (e.g., visual representations in prompts asking students to write about geometry). We relied on the CCSS-M (NGA & CCSSO, 2010a) document itself, which presents the Grade 3 content standards on pages 23-26. Overall, the different levels of specificity (e.g., listing shapes that students must know along with others that are examples of ones they should know) together with some overlapping ideas (e.g., solving problems involving multiplication as well as understanding its properties) necessitated coding at various levels. For prompts addressing Operations and Algebraic Thinking (OA), Number and Operations in Base Ten (NBT), Number and Operations—Fractions (NF), and Geometry, we relied on the domain, or “larger groups of related standards” (p. 5), if the prompt addressed at least one standard within it. The Measurement and Data domain was too broad; thus, we focused on clusters, or “groups of related standards” (p. 5), to code for measurement prompts that press students to “Solve problems involving measurement and estimation of intervals of time, liquid volumes, and masses of objects” (p. 24); data prompts that press students to “Represent and interpret data” (p. 25); and Geometric Measurement: Area and Geometric Measurement: Perimeter prompts that press students to attend to these respective measurements. In these cases too, the prompt needed to minimally address one standard within the cluster. We coded all other prompts as not addressing a third-grade content standard.

The CCSS-ELA (2010b) calls for third graders to write regularly for various purposes across disciplines. Hudson, Lahann, and Lee (2010) similarly recommend that those looking to adopt new curriculum resources ask themselves, “In what ways do the tasks in the curriculum require students to justify and communicate their reasoning in various ways” (emphasis added, p. 226)? Thus, the press for a type of writing is a noteworthy consideration. National documents call for students to write arguments and explanatory texts (NGA & CCSSO, 2010b) as well as communicate about tasks that have them reason and consider alternative solutions (NCTM, 1991), evaluate arguments (NCTM, 2000), and justify their reasoning (NGA & CCSSO, 2010a). Other researchers recognized the need to code for writing types (e.g., Charalambous et al.’s 2010 identification of prompts calling for students to submit only an answer, an answer and mathematical sentence, an explanation, or a justification), yet our interest in sharing a more in-depth view of student writing prompts, to better understand the current landscape of mathematical writing, compelled us to implement a more nuanced investigation, as seen in Table 1.

|Table 1 |
|Framework for Coding the Press for a Type of Writing |
|Code |Definition |Examples |
|Compare and/or contrast |The prompt presses students to write how something is similar and/or different. |How are ___ and ___ related? How are ___ and ___ different? |
|Compose a problem |The prompt presses students to write a problem that follows a specified format. |Write a story problem that ___. Write a real-world problem using ___. |
|Define a term |The prompt presses students to define a term or use the definition of a term in their response. |What does it mean to ___? Write the meaning of ___. |
|Describe observations |The prompt presses students to write about something that is observable in the given problem, such as patterns and components of a graph. |Describe any patterns you see. What do you notice about ___? |
|Explain what |The prompt presses students to write about “what” they did to solve a problem. Students are not pressed to write beyond their steps, procedures, or solutions by expanding on their response with “because…” While students may have reasoned as they solved the problem, the prompt does not call for them to write about the reasoning they used for the steps in the problem. Rather, the prompt calls for students to write about the outcome of any reasoning. |Explain how to solve it. Do you agree with ___? If not, show or tell how you would help ___. Explain how you can use ___ to ___. |
|Explain why |The prompt presses students to write about “why” they solved a problem in a given way. That is, students should explain “because…” The prompt calls for them to write about the reasoning process they used to solve the problem. |Explain your reasoning. Explain why. Explain why you agree or disagree. |
|Rewrite a sentence |The prompt presses students to rewrite a sentence. |Rewrite the sentence to make it true. |
|Write a question |The prompt presses students to write a question about a given problem or scenario. |Write a question that cannot be answered by the graph. |
|Other |The prompt does not have characteristics of any of the aforementioned codes. |--- |

NCTM (1991) has teachers ask, “If there is an explanation of a procedure, such as calculating a mean, does that explanation focus on the underlying concepts or is it merely mechanical” (p. 26)? Thus, we also coded prompts that already had been identified as either “explain what” or “explain why” and as addressing the number-related domains (i.e., OA, NBT, or NF) for a press to write about procedures or concepts[6] (see Table 2). We relied on Baroody, Feil, and Johnson’s (2007) definitions, which they adopted from de Jong and Ferguson-Hessler (1996), to define our codes. Specifically, they define “…procedural knowledge as mental ‘actions or manipulations’ (p. 107), including rules, strategies, and algorithms, for completing a task” and “…conceptual knowledge as ‘knowledge about facts, [generalizations], and principles’ (p. 107)” (original emphasis, p. 123).

|Table 2 |
|Framework for Sub-coding the Press to Write About Procedures or Concepts for Prompts Already Coded as Explain What or Explain Why and OA, NBT, or NF |
|Code |Definition |Examples |
|Conceptual explanation |The prompt calls for an explanation that can be written by proposing a generalization or principle. |Explain your thinking. Why is it important to ___? Why do you need [this certain step]? |
|Procedural explanation |The prompt calls for an explanation that can be written in a step-by-step manner to describe rules, strategies, and/or algorithms. |How did you find the answer? What strategy would you use to find ___? Explain. Explain your work. |
|Note: OA = Operations and Algebraic Thinking; NBT = Number and Operations in Base Ten; NF = Number and Operations—Fractions |

Further, students should communicate using representations (NCTM, 2000) to “include illustrations when useful to aid comprehension” (NGA & CCSSO, 2010b, p. 20), incorporate models, and use precise mathematical language (NGA & CCSSO, 2010a). We consequently coded prompts for the presence or absence of a press for the inclusion of a specific writing feature. Such prompts require students to incorporate a particular component to structure the writing, such as giving an example, making a list, drawing a diagram, including the steps they used to solve the problem, or using certain words or symbols.

Lastly, MP 3 (NGA & CCSSO, 2010a) calls for students to consider their own and others’ arguments and reasoning. Prompts might therefore press students to write about their own or others’ solutions. As seen in Table 3, this can be done either implicitly or explicitly so that students write about their own solutions, one solution that has been provided, or two or more presented possibilities. Inherently, this framing sets up students to attend to certain solutions in their writing. So, for instance, if students are prompted to explain why a given solution is incorrect, agreeing with that solution essentially is off the table.

|Table 3 |
|Framework for Coding the Press for the Number of Solutions to Consider in the Writing |
|Code |Definition |Examples |
|Consider only their own solution |Students are pressed to write about their own solution. |Explain your thinking. Explain how you found your answer. |
|Consider one solution |Students are pressed to consider one solution that has been provided either explicitly or implicitly. |Student A thinks this. Tell why he is wrong. What is the mistake? |
|Consider two or more solutions |Students are pressed to consider two or more solutions that have been provided. This can be done explicitly or implicitly. |Student A thinks this and Student B thinks that. Which one do you agree with, and why? Do you agree or disagree, and why? |

Data Coding Procedures

We had to determine a way to consistently measure the press for an amount of writing to produce because the student books varied in style. For those that included lines for students to write on, including Go Math, Math Expressions, and My Math, as well as most writing prompts in Everyday Math, we measured the first line to the nearest quarter inch and multiplied this number by the total number of lines to determine the amount of space. Investigations and Trailblazers, as well as prompts in the Everyday Math “Math Boxes,” offered space to write but no lines. To estimate the amount of writing space, we measured the space between two lines on another page to determine the spacing between lines (e.g., 0.375 inch). Next, we measured the space between the prompt and the first line on another page and subtracted the line spacing to find the space between the prompt and the writing (e.g., 0.750 inch - 0.375 inch = 0.375 inch). We relied on the standard length of a full line of text on another page to estimate the length of each line of writing. We also measured the space between the last line and the bottom of a page or Math Box to determine the bottom margin. Ultimately, we multiplied the length of one line by the number of lines. Similar to Mayer, Sims, and Tajika (1995), who measured the area in square inches of the space “occupied by exercises, irrelevant illustrations, relevant illustrations, and explanation” (p. 446), we did not have a second rater code this category due to its objectivity. (We obviously could not code the enVision and Saxon Math textbooks. We also did not code prompts that asked students to write on the back of the page or on a separate page because of the great variance in the amount of space.)
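To make the arithmetic concrete, the following is a minimal sketch, not the authors’ actual instrument, of how the measurements described above combine; the function and variable names are hypothetical, and the unlined case assumes the number of implied lines is the available vertical space divided by the book’s usual line spacing.

```python
# Hypothetical sketch of the space-estimation arithmetic described above.

def lined_space(first_line_length_in: float, num_lines: int) -> float:
    """Books with printed lines: length of the first line (to the nearest
    quarter inch) multiplied by the number of lines."""
    return first_line_length_in * num_lines

def unlined_space(vertical_space_in: float, line_spacing_in: float,
                  standard_line_length_in: float) -> float:
    """Books with blank space only: assume the space holds as many implied
    lines as fit at the book's usual line spacing, each as long as a
    standard full line of text (an assumption, not the authors' wording)."""
    implied_lines = int(vertical_space_in / line_spacing_in)
    return implied_lines * standard_line_length_in

# Example values mirroring the 0.375-inch line spacing mentioned in the text.
print(lined_space(6.75, 4))            # four 6.75-inch lines -> 27.0 inches
print(unlined_space(1.5, 0.375, 6.0))  # four implied 6-inch lines -> 24.0 inches
```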

We relied on Glaser and Strauss’ (1967) constant comparative method for all other codes. We continuously compared prompts to determine their similarities and differences to find patterns in the writing prompts. We maintained a codebook as a living document that detailed our ongoing adjustments in categories, their definitions, and examples. For instance, initially we determined that the writing prompts sought to have students provide their own solutions when asked if they agreed with a given solution. We subsequently realized that students were being positioned to consider two solutions that they could record in writing, including if they agreed or disagreed. Thus, we added the “consider two or more solutions” code to the press for the number of solutions to consider in the writing category. We coded each writing prompt using the descriptors noted in the analytic framework.

At least two researchers met and coded a sample of writing prompts for these categories to ascertain the utility of the codebook and made adjustments to it as needed. Then, two researchers independently coded each writing prompt, and a third scorer individually coded any discrepancies. We discussed any resulting disagreements to come to a consensus, including for writing prompts we were unsure how students might answer (e.g., Go Math’s “How do you know your answer is reasonable?” and Everyday Math’s “How do you know your answer makes sense?”). As Thompson, Senk, and Johnson (2012) note, coming to a consensus “precludes any measure of inter-coder agreement” (p. 264). We met regularly throughout this process, and any researcher could call a meeting whenever we felt that an adjustment to the codebook was substantial enough to call our procedures into question.

Data Analysis

We entered the codes for each writing prompt for the aforementioned categories into a data file in IBM SPSS Statistics 19. We entered the codes for each type of press for writing as categorical variables. For example, we coded the press for the number of solutions to consider in the writing as “1,” “2,” or “3” if the students were pressed to write about only their own solution, one other person’s solution, or two or more other solutions, respectively. We coded other categories, such as the press for a type of writing, for the presence (“1”) or absence (“0”) of a descriptor (e.g., explain what). Coding the data in this way allowed us to calculate frequency counts and percentages for each type of press for each student book individually as well as across all of the curricula. We also calculated the mean and range of the amount of space students were provided for writing for the press for an amount of writing to produce category.
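The analysis itself was carried out in SPSS; the following is only an illustrative sketch, in pandas rather than SPSS, of the kind of frequency and summary calculations involved. The column names and the toy records are hypothetical.

```python
# Illustrative sketch only: frequency counts, percentages, and space statistics
# of the kind described above, computed with pandas instead of SPSS.
import pandas as pd

# Hypothetical coded prompts (1 = own solution, 2 = one other, 3 = two or more).
prompts = pd.DataFrame({
    "book": ["Go Math", "Go Math", "My Math", "Trailblazers"],
    "solutions_press": [1, 3, 1, 2],
    "explain_what": [1, 0, 1, 0],      # presence (1) / absence (0) of the descriptor
    "space_inches": [9.0, 12.5, 8.0, 14.0],
})

# Frequency counts per student book and percentages across all books.
counts = prompts.groupby("book")["solutions_press"].value_counts()
percentages = prompts["solutions_press"].value_counts(normalize=True) * 100

# Mean and range of the space provided, per student book.
space_stats = prompts.groupby("book")["space_inches"].agg(["mean", "min", "max"])

print(counts, percentages, space_stats, sep="\n\n")
```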

Given the significance of the press for a type of writing code, we also analyzed the relationship between it and the press to write about procedures or concepts in order to further explore the ways in which students are pressed to write mathematically. As previously explained, only explain what or explain why writing prompts were coded for the press to write about procedures or concepts. We also calculated the frequency counts and percentages for the number of writing prompts that were coded as explain what-conceptual, explain what-procedural, explain why-conceptual, and explain why-procedural.
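Assuming the same kind of coded data, a cross-tabulation like the hedged sketch below captures the explain what/explain why by procedural/conceptual analysis described in this paragraph; the column names and records are again hypothetical.

```python
# Hypothetical sketch of the cross-tabulation of the press for a type of
# writing (explain what / explain why) against the procedural/conceptual
# sub-code, expressed as percentages of all sub-coded prompts.
import pandas as pd

subcoded = pd.DataFrame({
    "type": ["explain what", "explain what", "explain why", "explain why", "explain what"],
    "knowledge": ["procedural", "procedural", "conceptual", "procedural", "conceptual"],
})

crosstab = pd.crosstab(subcoded["type"], subcoded["knowledge"], normalize="all") * 100
print(crosstab.round(1))
```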

Results

How Often Students are Pressed to Write

Altogether we analyzed 1,949 writing prompts from the student books across the nine commercially available comprehensive Grade 3 curriculum resources. Our first research question asked how often students were prompted to write. The results indicate that how often students are pressed to write varies greatly, with the number of writing prompts per student book ranging from 59 to 486 (see Table 4).

To understand how often students are pressed to write across a 180-day school year, a minimum for most states (Education Commission of the States, 2013), we converted the number of writing prompts in each curriculum to reflect the average number posed per week or day[7]. Three student books, including those from the Everyday Math, Math in Focus, and Saxon Math series, had between 50 and 100 writing prompts, translating to about one to two questions per week. Another three, including Investigations, Math Expressions, and Trailblazers, had between 150 and 200 writing prompts or about one per day. The other three resources, including enVision, Go Math, and My Math, had between 350 and 500 writing prompts suggesting a minimum of two written responses per day.
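The conversion is simple arithmetic; the sketch below reproduces it under the stated 180-day (roughly 36-week) assumption, using a few of the per-book prompt counts reported in Table 4.

```python
# Back-of-the-envelope conversion of prompt counts to weekly and daily rates,
# assuming a 180-day (~36-week) school year; counts are taken from Table 4.
SCHOOL_DAYS, SCHOOL_WEEKS = 180, 36

prompt_counts = {"Math in Focus": 70, "Investigations": 181, "My Math": 486}
for book, n_prompts in prompt_counts.items():
    print(f"{book}: {n_prompts / SCHOOL_WEEKS:.1f} prompts/week, "
          f"{n_prompts / SCHOOL_DAYS:.1f} prompts/day")
```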

Press for an Amount of Writing to Produce

Due to the wide variation in how often students were pressed to write, we were interested in the amount of writing students were pressed to produce. Similar to the number of writing prompts, there was a wide range in the mean amount of space provided for students to write, from as little as 9.4 inches (a little more than 1 line) to as much as 22.5 inches (about 3 lines; see Table 4). Every student book analyzed included at least one prompt with less than one complete line of space. One student book, Investigations, included writing prompts with 1 inch or less of a line for students to write their responses. The difference between the minimum and maximum amount of space provided in each curriculum was at least 24.5 inches (about 4 lines).

|Table 4 |
|Number of Writing Prompts and Amount of Space Provided for Students to Write, by Curriculum Resource |
|Curriculum Resource |Number of Prompts Pressing Students to Write |Length of Space Provided to Write: M (inches) |Length of Space Provided to Write: Range (inches) |
|enVision |345 |a |a |
|Everyday Math |80 |16.1 |4.0 - 49.0 |
|Go Math |405 |9.4 |2.0 - 27.0 |
|Investigations |181 |22.5 |0.0 b - 104.0 |
|Math Expressions |154 |11.0 |4.0 c - 32.5 |
|Math in Focus |70 |15.0 |4.3 - 39.3 |
|My Math |486 |9.7 |2.0 - 26.5 |
|Saxon Math |59 |a |a |
|Trailblazers |169 |14.0 |4.0 d - 54.0 |
|Note: a The enVision and Saxon Math student resources included only textbooks, requiring students to write their responses elsewhere; therefore, we were not able to determine the amount of space for writing. b Investigations contained one writing prompt with no additional space for students to write; the least amount of space provided, excluding that prompt, was 1 inch. c Math Expressions includes five writing prompts labeled “Math Journal,” each situated at the bottom of the page with no additional space for students to write; these prompts were not analyzed for the space provided because we deemed it likely that students are expected to write on a separate page. d The Trailblazers student resources included a consumable workbook and a textbook; the textbook required students to write their responses elsewhere, so we were not able to determine the amount of space for writing for that subset of writing prompts. |

Press for Addressing Particular Mathematical Content in the Writing

As seen in Table 5, the writing prompts across the student books addressed the Grade 3 CCSS-M (NGA & CCSSO, 2010a) content domains and clusters to varying extents. The writing prompts were most frequently coded as OA (27.8%) or NBT (16.5%). In the case of enVision (n = 345) and Math Expressions (n = 154), over 50% of the writing prompts were coded for these two domains. There also was a high frequency of prompts coded as OA or NBT in the Go Math, Math in Focus, and My Math resources.

Investigations, Trailblazers, and Saxon Math did not follow this pattern. Investigations had the highest number of writing prompts coded as MD-d (39.8%, or 72 prompts), indicating data content. Trailblazers had about the same number of prompts across OA, NBT, and MD-d, together accounting for about 60%, or 105, of the writing prompts in the curriculum. Across the student books, less than 10% of the writing prompts pressed students to write about measurement, area, or perimeter. Lastly, all of the student books had some writing prompts pressing students to write about content not reflected in the Grade 3 content standards; Saxon Math had the highest number of such prompts (25 of 59 prompts, or 42.4%).

|Table 5 |
|Number and Percentage of Writing Prompts in Curriculum Resources Coded by Grade 3 CCSS-M Content Standards |

|Curriculum Resource |Press for a Writing Feature Not Included |Press for a Writing Feature Included |
|enVision (n = 345) |330 (95.7%) |15 (4.3%) |
|Everyday Math (n = 80) |76 (95.0%) |4 (5.0%) |
|Go Math (n = 405) |389 (96.0%) |16 (4.0%) |
|Investigations (n = 181) |146 (80.7%) |35 (19.3%) |
|Math Expressions (n = 154) |120 (77.9%) |34 (22.1%) |
|Math in Focus (n = 70) |55 (78.6%) |15 (21.4%) |
|My Math (n = 486) |464 (95.5%) |22 (4.5%) |
|Saxon Math (n = 59) |55 (93.2%) |4 (6.8%) |
|Trailblazers (n = 169) |158 (93.5%) |11 (6.5%) |
|All Curricula (N = 1,949) |1,792 (91.9%) |156 (8.0%) |

Press for Writing About Their Own or Others’ Solutions

Table 9 reveals that students were most often pressed to write about their own solutions across all resources (63.2%), ranging from 42.3% (enVision, n = 345) to 93.8% (Everyday Math, n = 80) within a student book. Six of the nine resources prompted students to write about one solution or about two solutions at about the same frequency. Math Expressions (n = 154) and Saxon Math (n = 59) prompted students to write about two solutions (23.4% and 42.4%, respectively) more often than about one solution (5.8% and 8.5%, respectively). Conversely, Math in Focus (n = 70) prompted students to write about one solution (37.1%) more frequently than two solutions (2.9%).

|Table 9 |
|Percentage of Writing Prompts Coded for the Press for Writing About Their Own or Others’ Solutions, by Curriculum Resource |
|Curriculum Resource |Own Solution |One Solution |Two Solutions |
|enVision (n = 345) |146 (42.3%) |89 (25.8%) |110 (31.9%) |
|Everyday Math (n = 80) |75 (93.8%) |3 (3.8%) |2 (2.5%) |
|Go Math (n = 405) |278 (68.6%) |77 (19.0%) |50 (12.3%) |
|Investigations (n = 181) |143 (79.0%) |22 (12.2%) |16 (8.8%) |
|Math Expressions (n = 154) |109 (70.8%) |9 (5.8%) |36 (23.4%) |
|Math in Focus (n = 70) |42 (60.0%) |26 (37.1%) |2 (2.9%) |
|My Math (n = 486) |306 (63.0%) |91 (18.7%) |89 (18.3%) |
|Saxon Math (n = 59) |29 (49.2%) |5 (8.5%) |25 (42.4%) |
|Trailblazers (n = 169) |103 (60.9%) |34 (20.1%) |32 (18.9%) |
|Across All Curricula (n = 1,949) |1,231 (63.2%) |356 (18.3%) |362 (18.6%) |

Discussion

While having students write has been alluded to in mathematics education standards (cf. NGA & CCSSO, 2010a; NCTM, 1989, 1991, 2000), the adoption of the CCSS by the great majority of states (“Standards in Your State,” n.d.) presents a current need for teachers to focus on this practice in their instruction. However, the standards do not specify “how to begin making essential changes to implement these standards” (NCTM, 2014, p. 4). Many (cf. Chingos & Whitehurst, 2012; Mesa, 2004) have recognized the central role of curriculum in instruction that may support mathematical writing. Recently, 85% of elementary teachers teaching mathematics reported relying on commercially available programs, with 81% of them covering 75% to 100% of the text (Banilower et al., 2013). Given that a significant number of teachers utilize curriculum resources to carry out standards, we are obligated to review the extent to which and how students’ books press them to write in mathematics.

The results of this study indicate differing levels of attention given to writing across the student books from the curriculum programs. Like Herbel-Eisenmann (2007), our intent is not to undermine the work of curriculum developers. We recognize that full implementation and interpretation of the CCSS, particularly the intersection of the mathematics and English language arts standards, is ongoing, comprehensive, and costly. Nonetheless, we heed Mark, Spencer, Zeringue, and Schwinden’s (2010) recommendation that “curriculum leaders need improved mechanisms for obtaining crucial information about the quality or effectiveness of textbooks” (p. 209), so next we humbly include suggestions regarding the extent and ways in which third graders could be pressed to write. Given that Houghton Mifflin Harcourt, McGraw-Hill, and Pearson together account for 97% of the market share of comprehensive elementary-level mathematics curriculum resources (Banilower et al., 2013), we are hopeful that the results of this study can inform later work across multiple curriculum programs. Throughout this discussion, we also offer limitations and suggestions for future research.

Implications Stemming from the Results

Overall, the results of this research reflect Usiskin’s (2010) observation that, “at no time in history has a greater variety of curriculum materials been available” (p. 27). Seemingly, this assortment may have contributed to the great variation across the student books in several coding categories focused on the extent and manner in which students are pressed to write mathematically. While teachers may have numerous resources from which to draw, the results further suggest a need for more explicit guidance on mathematical writing. Given the groundwork laid in developing our analytic framework, throughout this discussion we respectfully offer suggestions that might help guide future work and delineate our calls for the mathematics education field to explicitly address mathematical writing.

We also remind readers that we are not suggesting that the results of this study indicate that these practices are taking place instructionally; additional research would need to investigate actual practices. Instead, we have limited our focus to the opportunities presented via the third-grade student books included in the previously mentioned curriculum resources.

How often students are pressed to write. The frequency with which students write in mathematics class may influence the extent to which they deepen their learning through writing. All of the resources we investigated pressed students to record at least one sentence, which is considerably more than the prompts from the textbooks used in Cyprus, Ireland, and Taiwan analyzed in Charalambous et al.’s (2010) study of addition and subtraction of fractions, which only “required single answers” (p. 139). However, the number of writing prompts in the current study differed greatly across resources, with Math in Focus pressing students to write the fewest times, at about once every 4 days, and My Math pressing them to do so the most, at about two-and-a-half times per day.

It is important to note that the prompts identified as data for this study were not the only work that students were to complete in the student books; during instructional time, students also were pressed to answer questions requiring other types of responses, such as numerical answers. Given that the Grade 3 CCSS-ELA (NGA & CCSSO, 2010b) recommends students write “for shorter time frames (a single sitting or a day or two) for a range of discipline-specific tasks” (p. 21), questions remain as to how often and for how much time students should be pressed to write in mathematics. Those contemplating this decision may consider prioritizing the kinds of prompts posed to address time limitations and identifying those that are more worthwhile in developing students’ abilities to write the argumentative and informative texts recommended in the CCSS-ELA (NGA & CCSSO, 2010b).

Press for an amount of writing to produce. Students may interpret the amount of space provided as the expected extent of their responses, such as a brief response conveyed in less than one line across the page. The student books across the curriculum resources varied greatly in the amount of space provided for third graders to write their responses, from roughly half a line to eight lines of text. We speculate that curriculum publishers may have included more or less “white space” in resources without lines for aesthetic reasons, yet this space nonetheless might press students to write particular amounts. We believe that more attention ought to be given to the utility and possible interpretation of white space by students. It would be worthwhile for the mathematics education community to identify appropriate expectations for the amount of writing across various writing prompts (e.g., which ones students should respond to briefly and which would require at least one paragraph) and for researchers to investigate developmental trajectories to provide teachers with benchmarks on which to base their instruction.

Press for addressing particular mathematical content in the writing. The mathematical content that is the focus of students’ writing may call for particular text characteristics, such as visual representations. The overall findings suggest that the writing prompts across all curricula reflected three of the four Grade 3 CCSS-M critical content areas (NGA & CCSSO, 2010a). More than half of the prompts focused on the number-related OA, NBT, and NF domains that address two critical areas: developing an understanding of multiplication and division strategies as well as of unit fractions. Number concepts may press students to record numerical and symbolic representations, including equations. As such, if students focus their writing in these areas, they might not realize that mathematical writing, including proof, can also entail prose. This may be particularly problematic with problems involving procedures, which we discuss later.

While most prompts addressed number concepts, less than 10% of all prompts focused on fractions. The National Math Panel (2008), in addressing the proficiencies necessary for the learning of algebra, noted U.S. students' underdeveloped understanding of fractions. This calls into question why third graders were pressed relatively infrequently in this critical area when the literature suggests that writing allows for more in-depth understanding of concepts (Borasi & Rose, 1989; Rothstein, A., Rothstein, E., & Lauber, 2003).

One can argue that geometry, too, may bolster Grade 3 students' realization of the role representations play in communicating their thoughts (NCTM, 2000). However, although it is one of the four critical areas, it accounted for only 8.2% of writing prompts. Similarly, even though area and its relation to rectangular arrays arguably can be conveyed using representations and constitute another of the four critical areas, only 3.1% of all prompts addressed this topic. Taken together, students might not have as many opportunities to bolster their learning of geometry and measurement through writing.

In addition to reflecting some of the content highlighted across the CCSS-M (2010a), curriculum authors of individual series perhaps felt that some topics are more or less conducive to writing. This may help explain why the other MD clusters that are not considered critical areas, including measurement, data, and perimeter, together with prompts addressing content not in the Grade 3 content standards, accounted for 10.2% of all prompts. Nonetheless, since the results varied considerably across the resources, one must question whether this is the case overall.

Given our anecdotal experience with teachers lamenting that there is limited time to write in their mathematics classes, we believe it is important for mathematics educators to identify the content areas that may be more worthwhile areas of focus. Consideration may be given to what content might encourage particular forms of reasoning and evidence (e.g., symbolic). Certain content also may benefit from writing-to-learn advantages, such as establishing a more solid foundation in the understanding of fractions. In the long term, some content calling for particular features may help establish a stronger foundation than others for the more formal written proof expected in later years.

Press for a type of writing. Presses for particular types of writing are central to the conversations needed to further define mathematical writing. While NCTM (1991) did describe worthwhile mathematical tasks as ones that "promote communication about mathematics" (p. 25), this and other national curriculum documents do not clearly explicate the types of writing tasks teachers should pose to their students. The CCSS-ELA (2010b) Writing Standards do identify opinion/argumentative and informative/explanatory texts as two of the three predominant types of writing for third graders, yet they do not specifically address mathematics. Regardless, the emphasis on these text types may help explain why the results indicate that, on average, the student books pressed students to explain what or explain why more than half of the time. In addition, there were presses for other types of writing (e.g., describe an observation, compose a word problem), each accounting for less than 10% of all prompts yet indicating plausible additional purposes for writing.

While the mathematics education community is unclear about the types of writing that may be worthwhile endeavors, we have posited that reasoning-focused writing might be the most valuable (Cohen, Casa, Miller, & Firmender, in press). Reasoning is a cornerstone of mathematics, as evidenced in several standards, including the 2000 NCTM Reasoning and Proof process standard and the titles of the CCSS-M's (2010a) MP 2, MP 3, and MP 8. In fact, the CCSS-ELA (2010c) also emphasizes that students should "provide reasons" (p. 20) to support their positions in argumentative writing. However, seven of the nine student books prompted students to explain what more frequently than explain why. We posit that a prompt pressing third graders to explain what they did to solve a problem does not compel them to share their reasoning.

In deliberating which types of prompts to emphasize, we further must consider how, through the CCSS, our nation aims to prepare our youth for college and careers (NGA & CCSSO, 2010a, 2010b). Kevin P. Lee, a mathematician who wrote "A Guide to Writing Mathematics" (n.d.), explains: "In your mathematics writing, you will want to be communicating to the reader why and how you arrived at a solution. You will also want to convince your reader that your particular reasons and your particular means to the solution are correct" (p. 9; original emphasis). Thus we set forth that a greater proportion of writing prompts should press students to explain why.

Press to write about procedures or concepts. The press for a type of writing may further be influenced by the nature of the mathematics, namely procedural and conceptual understanding. The number of explain what and explain why writing prompts coded as procedural reflects only those positioning students to write at least one complete sentence. There were many more prompts across the books that asked students to record their procedure numerically or symbolically and, consequently, were not analyzed in this study. Nonetheless, across student books, almost three-quarters of the prompts pressed students to write about procedures and almost half of them to write about concepts.

According to the 1995 NCTM Assessment Standards, procedural skills are "best assessed in the same way they are used, as tools for performing mathematically significant tasks" (p. 11). For instance, "Explain how to find the length of each side of a triangle with sides of equal length, and a perimeter of 27 inches" (Go Math, p. 435) presses students to write about their procedures, yet students arguably can convey those procedures more efficiently by recording their steps. Prompts that press students, for instance, to argue the merits of someone else's reasoning might be more valuable (NCTM, 1991).
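
As a brief illustration of this point (a worked example of our own based on the quoted prompt, not drawn from the resource), a student recording the steps symbolically could simply write 27 ÷ 3 = 9, so each side is 9 inches; a full written explanation of the same procedure adds length without necessarily adding reasoning.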

Press for the inclusion of a specific writing feature. Mathematical writing of any type can encourage various writing features. For instance, the CCSS-M (NGA & CCSSO, 2010a) calls for students to "construct viable arguments" (p. 6). It is reasonable to suggest that students need to learn how to develop an argument that has specific characteristics, such as the inclusion of a claim, evidence, and warrants, as proposed by Toulmin (1958). They further would need to understand what "counts" as mathematical evidence and what can serve as warrants.
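
To illustrate what such features might look like in a third grader's writing (a hypothetical example of our own, not drawn from the analyzed resources), a Toulmin-style argument about fractions might include a claim (3/4 of a strip is more than 2/4 of the same strip), evidence (a strip partitioned into four equal parts shows three shaded parts compared with two), and a warrant (when two fractions refer to the same-sized whole and the same unit fraction, the fraction composed of more unit fractions is greater).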

Students may perceive tasks that do not attend to particular writing features as ambiguous. To overcome this, students "often urge the teacher to make such tasks more explicit, thereby reducing or eliminating the difficult sense-making aspects of the task" (Stein, Grover, & Henningsen, 1996, p. 462). Although another way to address this challenge is to include such guidance in the writing prompts themselves, the vast majority of the writing prompts analyzed did not include this support. Curriculum authors may have assumed that local norms would provide students with support. Nonetheless, the question of how much guidance should be present within a task before the prompt becomes too explicit remains (Stein et al., 1996). Relatedly, the mathematics education community may address related questions about the amount of guidance (e.g., Should the support change throughout the school year, with more provided at the beginning?) and about guidance for different purposes (e.g., Should the support differ between assessments and instructional tasks?).

Press for writing about their own or others' solutions. Attention to the audience also may influence the ensuing writing. In most cases, students were pressed to write about their own solutions. While this can assist students in developing their own perspectives, it does not necessarily encourage them to "critique the reasoning of others" (NGA & CCSSO, 2010a, p. 6, emphasis added). One explanation for why prompts asking students to write about their own reasoning were more prominent may be that curriculum authors have not yet had the opportunity to tease out the nuances of MP 3. Another explanation might be that they were more keenly focused on aligning the curriculum materials with the mathematical content standards. Notwithstanding, teachers, curriculum authors, and assessment developers would benefit from guidance from the mathematics education community regarding the extent to which students should be positioned to write critiques of others' reasoning.

Implications Stemming from the Methodology

We encountered challenges similar to those of Charalambous and colleagues (2010), who noted "inherent challenges in textbook analysis" (p. 145). Throughout the coding process, we found the need to position ourselves as third graders in order to decipher the plausible intent of the writing prompt. Arguably our greatest challenge was trying to determine, as accurately as we could, the ways in which students were being pressed; at times, we simply needed to infer them after conferring. We were surprised at how often we were positioned to do so. While the lack of explicit mathematical writing guidelines may have contributed to this phenomenon, a focus on the writing prompts and attention to their wording is advisable. For instance, we were unsure how students were being pressed by the frequently included Go Math prompt, "How do you know that your answer is reasonable?" We anticipated that third graders might think—and write—that what they did to solve a given problem obviously should be "reasonable." It is important to reemphasize that the student books are a curriculum resource; thus, teachers, together with their students, help establish local norms that would press them to write responses that are more mathematically acceptable.

Teachers and students would benefit from writing prompts that more clearly convey the intent of the task. Like Schmidt and Burroughs (2013), we recognize that publishers have had a relatively limited amount of time to faithfully address all calls set forth in the CCSS-M (NGA & CCSSO, 2010a), and attending to guidelines that cut across the mathematics and English language arts CCSS (2010b), such as mathematical writing, adds yet another layer of complexity. Nonetheless, we anticipate that focusing specifically on the writing prompts would help address this current drawback.

Implications Stemming from the Development of the Analytic Framework

One contribution of the study reported herein is the development of an analytic framework. The extent of this work was compounded by the fact that there currently are no national guidelines that explicitly address mathematical writing. While we systematically attended to this limitation of the field and developed an operational framework, the limitation we revealed has numerous implications that go beyond this investigation. Mathematics education researchers are compelled to further define "writing," clarify the types of writing that are in line with the field's vision, and provide recommendations about the amount of time and extent to which students should write. In turn, they, along with teachers, curriculum authors, and assessment developers, would benefit from a more cohesive and clear vision from which to expand this work. Ultimately, the outcomes of this effort would help guide research endeavors.

One such plausible and welcome implication for subsequent curriculum resource design is for student books to be more explicit and thoughtful about which prompts are intended for students to respond to in writing. Whereas we were challenged to deduce whether a prompt, in fact, was meant to encourage students to write (e.g., using an icon, blank space) or not (e.g., very limited space to write), students and teachers alike would benefit from clear direction, as would researchers expanding on this work. We recognize that teachers utilize student books as resources for their instruction (Choppin, 2011), and we put forth that these resources should allow teachers to make informed decisions, which, in part, requires writing prompts that have been carefully considered.

Conclusion

The results of this study provide a benchmark for several constituencies whose work is interwoven. Researchers may address the understudied area of mathematical writing and utilize or build upon the analytic framework developed for this study. Third-grade teachers can deduce the extent to which the student books in their curriculum resources reflect the mathematical writing expectations represented in the CCSS-M (NGA & CCSSO, 2010a), the CCSS-ELA (2010b), and either the PARCC or SBAC assessments. Curriculum authors, who are influenced by the standards (Reys, B., Reys, R., & Rubenstein, 2010), may most directly benefit, since the results of this research can be incorporated directly into future offerings. Assessment developers also may employ the analytic framework to further the development of test prompts and rubrics.

As Stein, Grover, and Henningsen (1996) suggest, "a task can be viewed as passing through three phases: first, as curricular or instructional materials; second as set up by the teacher in the classroom; and third, as implemented by students during the lesson" (p. 460). The study presented here is situated in the first phase or, as Mesa (2004) termed it, a "hypothetical enterprise." Analyzing the implemented curriculum would be a necessary next step.

Lastly, we recognize that other curriculum resources are or will become available. A replication of this study with those resources and across other grade levels also is necessary for the field to better understand the possible and appropriate shifts called for in the CCSS-ELA (2010b) K-5 and 6-12 Writing Standards. Since curriculum resources will be revised as the intentions of the CCSS are further discussed, we realize the need for a follow-up study in the coming years to ascertain the extent to which students are pressed to write in their mathematics classes.

References

Baker, D., Knipe, H., Collins, J., Leon, J., Cummings, E., Blair, C., & Gamson, D. (2010). Journal for Research in Mathematics Education, 41(4), 383-423.

Banilower, E. R., Smith, P. S., Weiss, I. R., Malzahn, K. A., Campbell, K. M., & Weis, A. M. (2013). Report of the 2012 national survey of science and mathematics education. Chapel Hill, NC: Horizon Research.

Bargagliotti, A. E. (2012). How well do the NSF funded elementary mathematics curricula align with the GAISE report recommendations? Journal of Statistics Education, 20(3), 1-26.

Baroody, A. J., Feil, Y., & Johnson, A. R. (2007). An alternative reconceptualization of procedural and conceptual knowledge. Journal for Research in Mathematics Education, 38(2), 115-131.

Beal, S., Beissinger, J. S., Berndt, D., Cape, E., Ditto, C., Jones, L. … Stoelinga, T. (2014a). Math trailblazers: Common core state standards, student activity book, vol. 1, (4th ed.). Dubuque, IA: Kendall Hunt.

Beal, S., Beissinger, J. S., Berndt, D., Cape, E., Ditto, C., Jones, L. … Stoelinga, T. (2014b). Math trailblazers: Common core state standards, student activity book, vol. 2, (4th ed.). Dubuque, IA: Kendall Hunt.

Beal, S., Beissinger, J. S., Berndt, D., Cape, E., Ditto, C., Jones, L. … Stoelinga, T. (2014c). Math trailblazers: Common core state standards, student guide, vol. 1, (4th ed.). Dubuque, IA: Kendall Hunt.

Beal, S., Beissinger, J. S., Berndt, D., Cape, E., Ditto, C., Jones, L. … Stoelinga, T. (2014d). Math trailblazers: Common core state standards, student guide, vol. 2, (4th ed.). Dubuque, IA: Kendall Hunt.

Bell, M. et al. (2012a). Everyday mathematics: The University of Chicago school mathematics project, student math journal, grade 3, vol. 1. Chicago, IL: McGraw Hill Education.

Bell, M. et al. (2012b). Everyday mathematics: The University of Chicago school mathematics project, student math journal, grade 3, vol. 2. Chicago, IL: McGraw Hill Education.

Borasi, R., & Rose, B. J. (1989). Journal writing and mathematics instruction. Educational Studies in Mathematics, 20(4), 347-365.

Cai, J., Lo, J. J., & Watanabe, T. (2002). Intended treatment of arithmetic average in U.S. and Asian school mathematics textbooks. School Science and Mathematics, 102(8), 391-404.

Carter, J. A. et al. (2013a). My math, student edition, grade 3, vol. 1. Chicago, IL: McGraw Hill Education.

Carter, J. A. et al. (2013b). My math, student edition, grade 3, vol. 2. Chicago, IL: McGraw Hill Education.

Charalambous, C. Y., Delaney, S., Hsu, H.-Y., & Mesa, V. (2010). A comparative analysis of the addition and subtraction of fractions in textbooks from three countries. Mathematical Thinking and Learning, 12(2), 117-151.

Charles, R. I. et al. (2012). enVision math common core, grade 3. Upper Saddle River, NJ: Pearson Education.

Chingos, M. M., & Whitehurst, G. J. (2012). Choosing blindly: Instructional materials, teacher effectiveness, and the Common Core. Washington, DC: Brown Center on Education Policy at Brookings: 1-27.

Choppin, J. (2011). Learned adaptations: Teachers’ understanding and use of curriculum resources. Journal of Mathematics Teacher Education, 14(5), 331-353.

Cohen, J., Casa, T. M., Miller, H. C., & Firmender, J. (in press). Characteristics of second graders’ mathematical writing. To be published in School Science and Mathematics.

de Jong, T., & Ferguson-Hessler, M. G. M. (1996). Types of qualities of knowledge. Educational Psychologist, 31, 105-113.

Dixon, J. K., Larson, M., Leiva, M. A., & Adams, T. L. (2012). Go math! Common core edition, grade 3. Orlando, FL: Houghton Mifflin Harcourt Publishing Company.

Emig, J. (1977). Writing as a mode of learning. College Composition and Communication, 28(2), 122-128.

Education Commission of the States. (2013, March 15). Number of instructional days/hours in the school year. Retrieved from

Education Market Research. (2011). National survey of mathematics. Retrieved from

Fuson, K. C. (2013a). Math expressions common core, volume 1. Orlando, FL: Houghton Mifflin Harcourt Publishing Company.

Fuson, K. C. (2013b). Math expressions common core, volume 2. Orlando, FL: Houghton Mifflin Harcourt Publishing Company.

Glaser, B. G., & Strauss, A. (1967). The discovery of grounded theory. Chicago: Aldine.

Grouws, D. A., Smith, M. S., & Sztajn, P. (2004). The preparation and teaching practices of U.S. mathematics teachers: Grades 4 and 8. In P. Kloosterman & F. Lester (Eds.), The 1990 through 2000 mathematics assessments of the National Assessment of Educational Progress: Results and interpretations (pp. 221-269). Reston, VA: National Council of Teachers of Mathematics.

Haggarty, L. & Pepin, B. (2002). An investigation of mathematics textbooks and their use in English, French and German classrooms: Who gets an opportunity to learn what? British Educational Research Journal, 28(4), 567-590.

Hake, S. (2012). Saxon math, intermediate 3, student edition. Orlando, FL: Houghton Mifflin Harcourt Publishing Company.

Herbel-Eisenmann, B. A. (2007). From intended curriculum to written curriculum: Examining the “voice” of a mathematics textbook. Journal for Research in Mathematics Education, 38(4), 344-369.

Hudson, R. A., Lahann, E. P., & Lee, J. S. (2010). Considerations in the review and adoption of mathematics textbooks. In B. J. Reys, R. E. Reys, & R. Rubenstein (Eds.), Mathematics curriculum: Issues, trends, and future directions (pp. 213-229). Reston, VA: National Council of Teachers of Mathematics.

Kasparek, R. F. (1996). Effects of using writing-to-learn mathematics. Paper presented at the Annual Meeting of the American Education Research Association: New York, NY.

Kheong, F. H., Ramakrishnan, C., & Choo, M. (2013a). Math in focus: Singapore math, hardcover student book 3A. Singapore: Houghton Mifflin Harcourt.

Kheong, F. H., Ramakrishnan, C., & Choo, M. (2013b). Math in focus: Singapore math, hardcover student book 3B. Singapore: Houghton Mifflin Harcourt.

Kheong, F. H., Ramakrishnan, C., & Choo, M. (2013c). Math in focus: Singapore math, workbook 3A. Singapore: Houghton Mifflin Harcourt.

Kheong, F. H., Ramakrishnan, C., & Choo, M. (2013d). Math in focus: Singapore math, workbook 3B. Singapore: Houghton Mifflin Harcourt.

Langrall, C. W. (2014). Linking research to practice: Another call to action? Journal for Research in Mathematics Education, 45(2), 154-156.

Lee, K. P. (n.d.). A guide to writing mathematics. Retrieved from

Mark, J., Spencer, D., Zeringue, J. K., & Schwinden, K. (2010). How do districts choose mathematics textbooks? In B. J. Reys, R. E. Reys, & R. Rubenstein (Eds.), Mathematics curriculum: Issues, trends, and future directions (pp. 199-212). Reston, VA: National Council of Teachers of Mathematics.

Mayer, R. E., Sims, V., & Tajika, H. (1995). A comparison of how textbooks teach mathematical problem solving in Japan and the United States. American Educational Research Journal, 32(2), 443-460.

Mesa, V. (2004). Characterizing practices associated with functions in middle school textbooks: An empirical approach. Educational Studies in Mathematics 56, 255-286.

National Council of Teachers of Mathematics. (1989). Curriculum and evaluation standards for school mathematics. Reston, VA: Author.

National Council of Teachers of Mathematics. (1991). Professional standards for teaching mathematics. Reston, VA: Author.

National Council of Teachers of Mathematics. (2000). Principles and standards for school mathematics. Reston, VA: Author.

National Council of Teachers of Mathematics. (2014). Principles to actions: Ensuring mathematical success for all. Reston, VA: Author.

National Governors Association Center for Best Practices & Council of Chief State School Officers. (2010a). Common core state standards for mathematics. Washington, DC: Authors. Retrieved from

National Governors Association Center for Best Practices & Council of Chief State School Officers. (2010b). Common core state standards for English language arts and literacy in history/social studies, science, and technical subjects. Washington, DC: Authors. Retrieved from

National Governors Association Center for Best Practices & Council of Chief State School Officers. (2010c). Common core state standards for English language arts & literacy in history/social studies, science, and technical subjects: Appendix a: Research supporting, key elements of the standards, glossary of key terms. Washington, DC: Authors. Retrieved from

The National Math Panel. (2008, March). Retrieved from

Odell, L. (1980). The process of writing and the process of learning. College Composition and Communication, 31(1), 42-51.

Partnership for Assessment of Readiness for College and Careers. (2013, July). Performance level descriptors – grade 3 mathematics. PARCC, 1-18. Retrieved from

Smarter Balanced Assessment Consortium. (2012, April). Performance task specifications. SBAC, 1-8. Retrieved from

Reys, B. J., Reys, R. E., & Chavez, O. (2004). Why mathematics textbooks matter. Educational Leadership, 61(5), 61-66.

Reys, B. J., Reys, R. E., & Rubenstein, R. (Eds.). (2010). Mathematics curriculum: Issues, trends and future directions. Reston, VA: National Council of Teachers of Mathematics.

Rothstein, A., Rothstein, E., & Lauber, G. (2003). Write for mathematics. Thousand Oaks, CA: Corwin Press.

Ryve, A. (2011). Discourse research in mathematics education: A critical evaluation of 108 journal articles. Journal for Research in Mathematics Education, 42(2), 167-198.

Schmidt, W. H., Wang, H. C., & McKnight, C. C. (2005). Curriculum coherence: An examination of US mathematics and science content standards from an international perspective. Journal of Curriculum Studies, 37(5), 525-559.

Schmidt, W. H., & Houang, R. T. (2012). Curricular coherence and the common core state standards for mathematics. Educational Researcher, 41(8), 294-308.

Standards in Your State. (n.d.). In Common Core State Standards Initiative. Retrieved from

Stein, M. K., Grover, B. W., & Henningsen, M. (1996). Building student capacity for mathematical thinking and reasoning: An analysis of mathematical tasks used in reform classrooms. American Educational Research Journal, 33(2), 455-488.

Tarr, J. E., Reys, R. E., Reys, B. J., Chávez, O., Shih J., & Osterlind, S. J. (2008). The impact of middle-grades mathematics curricula and the classroom learning environment on student achievement. Journal for Research in Mathematics Education, 39(3), 247-280.

TERC. (2012). Investigations in number, data, and space student activity book, Common core edition, grade 3. Upper Saddle River, NJ: Pearson Education.

The National Commission on Writing for America’s Families, Schools, and Colleges. (2003). The neglected “R”: The need for a writing revolution. Downloaded on 7/30/14 from

Thompson, D. R., Senk, S. L., & Johnson, G. J. (2012). Opportunities to learn reasoning and proof in high school mathematics textbooks. Journal for Research in Mathematics Education, 43(3), 253-295.

Toulmin, S. E. (1958). The uses of argument. London: Cambridge University Press.

Tyson-Bernstein, H., & Woodward, A. (1991). Nineteenth century policies for twenty-first century practice: The textbook reform dilemma. In P. Altback (Ed.), Textbooks in American society: Politics, policy, and pedagogy (pp. 91-104). Albany, NY: State University of New York Press.

Usiskin, Z. (2010). The current state of the school mathematics curriculum. In B. J. Reys, R. E. Reys, & R. Rubenstein (Eds.), Mathematics curriculum: Issues, trends, and future directions: Seventy-second yearbook (pp. 25-39). Reston, VA: National Council of Teachers of Mathematics.

Vygotsky, L. S. (1962). Thought and language. Cambridge, MA: MIT Press.

Yang, D. C. (2005). Developing number sense through mathematical diary writing. Australian Primary Mathematics Classroom, 10(4), 9-14.

-----------------------

[1] In fact, all nine series studied claim either in their titles and/or websites to be aligned with the CCSS.

[2] Banilower et al. (2013) also reported that the most common series used by elementary teachers in their national sample were enVision and Everyday Mathematics.

[3] Series varied on their inclusion of prompts for assessment and homework purposes, with some devoting resources separate from the student book and others incorporating these types of assignments. We hypothesized that it may be more common for curriculum resources to include writing prompts on assessments and homework sheets, or that the proportion of such prompts may differ from the proportion designed for instruction. Since this potentially could have skewed the results and limited our implications, we decided to exclude all writing prompts included for assessment and homework purposes.

[4] We are indebted to the sales representatives for allowing us to access the curriculum resources investigated in this study.

[5] Thompson, Senk, and Johnson’s (2012) fundamental assumptions include: “1. The framework should be broad enough to capture various forms of both proof and disproof in different content areas; 2. The framework should be useable to analyze any high school mathematics textbook, whether aligned more with goals of reform or traditional approaches to teaching; 3. The framework should be useable to analyze fundamental characteristics of both the narrative and the exercise sets (e.g., homework tasks) of textbooks; [and] 4. The framework should build upon existing related research.” (p. 257).

[6] While students might reason to answer the question, we were interested in identifying what students might be pressed to record in writing. Also, although a procedure might be a component of the prompt, students might be asked to conceptually respond to the given procedure (e.g., why there is a “1” above the tens place in the sample addition problem that has been provided).

[7] Similar to other researchers, like Mesa (2004), we took the position that students would have the opportunity to answer each writing prompt.
