


Knowledge Accumulation:

Citation Analysis of Center for Education Reports, 2001-2003

Sarah Makela

University of Chicago and NORC

David Schalliol

University of Chicago and NORC

Barbara Schneider

University of Chicago and NORC


August 2004

Table of Contents

Introduction 2

Citations Over Time 2

Influence of CFE Reports in Journals 6

Tracking the Influence of Scientific Research in Education through Citation Analysis 10

In-Depth Look at Shavelson & Towne (2002) Citations 16

Summary of In-Depth Look at Shavelson & Towne (2002) Citations 34

Conducting an ISI Web of Science Citation Analysis 35

Appendices

Appendix A: Sample CFE Report Citation Table 42

Appendix B: ISI Journal Coverage 44

Introduction

The purpose of this report is to document, for the Center for Education (CFE), citation counts for 42 reports published from 2001 through 2003. The citation analysis of peer-refereed journals described in this report was compiled using the ISI Web of Science Social Sciences and Arts & Humanities Citation Indexes.[1] This report details the number of citations over time by CFE publication and by citing journal, looks closely at the citations of the most influential CFE report, and concludes with instructions on how to conduct an ISI citation analysis.

Citations Over Time

Listed below in Table 1 are the CFE reports and the frequency of citations for the years 2001-2004, as of July 2004. Of the 42[2] reports analyzed, just over 45 percent (19) were cited one or more times. Three of the reports, Adding It Up (2001), Knowing What Students Know (2001), and Scientific Research in Education (2002), account for more than 70 percent (80 of 114) of the citations. Citations of Scientific Research in Education have roughly doubled each year since its 2002 publication and will presumably rise further before 2004 comes to a close. By contrast, citations of Adding It Up and Knowing What Students Know slowed in 2003, and with more than half of 2004 past, there are no signs of a dramatic surge this year. The remaining 16 cited reports had minimal impact in the peer-refereed journals between 2001 and July 2004: seven were cited just once, four were cited twice, two were cited three times, two were cited four times, and one was cited five times.

Table 1: CFE Reports Published 2001-2003, Citations by Year, January 2001-July 2004

| |Publication Name |Number of Citations by Year |

| | |2001 |2002 |2003 |2004 |Total |

|2001 | | | | | | |

| |Classroom Assessment and the National Science Education Standards: A Guide for Teaching and Learning | |1 |2 | |3 |

| |Educating Teachers of Science, Mathematics, and Technology: New Practices for a New Millennium | |2 |2 | |4 |

| |Improving Mathematics Education: Resources for Decision Making | | | | |0 |

| |Investigating the Influence of Standards: A Framework for Research in Mathematics, Science, and Technology Education | | | | |0 |


| |Knowing What Students Know: The Science and Design of Educational Assessment|1 |7 |6 |2 |16 |

| |NAEP Reporting Practices: Investigating District-Level and Market-Basket Reporting | | | |1 |1 |

| |The Power of Video Technology in International Comparative Research in Education | | | | |0 |

| |Science, Evidence, and Inference in Education: Report of a Workshop |1 |1 | | |2 |

| |Testing Teacher Candidates: The Role of Licensure Tests in Improving Teacher Quality | | |3 | |3 |

| |Understanding Dropouts: Statistics, Strategies, and High-Stakes Testing | |1 |1 | |2 |

|2002 | | | | | | |

| |Enhancing Undergraduate Learning with Information Technology | | |1 | |1 |

| |Evaluating and Improving Undergraduate Teaching in Science, Technology, Engineering, and Mathematics | | |2 | |2 |

| |Helping Children Learn Mathematics | | | | |0 |

| |Improving Learning with Information Technology: A Report of a Workshop | | | | |0 |

| |Knowledge Economy and Postsecondary Education: Report of a Workshop | | |1 |1 |2 |

| |Learning and Understanding: Improving Advanced Study of Mathematics and Science in U.S. High Schools: | |1 |2 |1 |4 |

| |Biology Panel Report (on-line only) | | | | |0 |

| |Chemistry Panel Report (on-line only) | | | | |0 |

| |Physics Panel Report (on-line only) | | | | |0 |

| |Mathematics Panel Report (on-line only) | | | | |0 |

| |Methodological Advances in Cross-National Surveys of Educational Achievement| | |1 | |1 |

| |Performance Assessments for Adult Education: Exploring the Measurement Issues: Report of a Workshop | | | | |0 |

| |Reporting Test Results for Students with Disabilities and English-Language Learners: Summary of a Workshop | | |1 | |1 |

| |Scientific Research in Education | |6 |13 |26 |45 |

| |Studying Classroom Teaching as a Medium for Professional Development: Proceedings of a U.S.-Japan Workshop | | | |1 |1 |

| |Technically Speaking: Why All Americans Need to Know More About Technology | |4 | |1 |5 |

| |Technology and Assessment: Thinking Ahead—Proceedings from a Workshop | | | | |0 |


|2003 | | | | | | |

| |Committee on Performance Levels for Adult Literacy: Letter Report | | | | |0 |

| |Improving Undergraduate Instruction in Science, Technology, Engineering and Mathematics: Report of a Workshop | | | | |0 |

| |Measuring Access to Learning Opportunities | | | | |0 |

| |Monitoring International Labor Standards: Human Capital Investment: Summary of a Workshop | | | | |0 |

| |Monitoring International Labor Standards: National Legal Frameworks, Summary of a Workshop | | | | |0 |

| |Monitoring International Labor Standards: Quality of Information, Summary of a Workshop[3] | | | | |0 |

| |Monitoring International Labor Standards: Summary of Domestic Forums | | | | |0 |

| |Next Steps in Mathematics Teacher Development, Grades 9–12 | | | | |0 |

| |Planning for Two Transformations in Education and Learning Technology: Report of a Workshop (August) | | | | |0 |

| |Understanding Others, Educating Ourselves: Getting More from International Comparative Studies in Education (February/April) | | | | |0 |

| |What is the Influence of the National Science Education Standards?: Reviewing the Evidence, A Workshop Summary | | |1 | |1 |

| |TOTAL |7 |31 |39 |37 |114 |

• Source, CFE Reports List: Center for Education, National Research Council, National Academies, 2004

• Source, CFE Reports Citation Frequency: ISI Web of Science, Social Sciences Citation Index, 1956-2004; Arts & Humanities Citation Index, 1975-2004

In Appendix A, we provide an example of the tables detailing each CFE report’s citation information, including Author, Corporate Author, Title, Journal, Type of Document, Abstract, Times Cited, Publication Date, Publication Year, Volume, and Issue. Uncited CFE reports do not have such tables. The full complement of CFE reports is available in a password-protected area of the Data Research and Development Center evaluation team website.

Table 2 shows the number of citations each year by the year the reports were published. The top three cited reports were published in 2001 and 2002. For these publication years, a “lag effect” is apparent: citations increased markedly in the year or two after publication. The 2003 reports had only one citation among them in 2003 and have received no citations thus far in 2004. Note that these tabulations cover only the first six months of 2004; even so, the 2002 reports already show a continuing upward trend in citation counts, while it is too early to tell whether the 2001 reports will exhibit a similar pattern.

Table 2: CFE Reports Published 2001-2003, Citation Totals by Year, January 2001-July 2004

| |2001 |2002 |2003 |2004 |Total |

|2001 Reports |7 |20 |17 |7 |51 |

|2002 Reports |-- |11 |21 |30 |62 |

|2003 Reports |-- |-- |1 |0 |1 |

|TOTAL |7 |31 |39 |37 |114 |

Source: ISI Web of Science, Social Sciences Citation Index, 1956-2004; Arts & Humanities Citation Index, 1975-2004
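As a sketch (not part of the original analysis), the roll-up in Table 2 can be reproduced in a few lines of Python; the per-year figures below are copied directly from the table itself, so the script simply re-derives the row, column, and grand totals.

```python
# Reproduce the Table 2 roll-up: citations per citation year,
# grouped by the year the CFE report was published.
citation_years = [2001, 2002, 2003, 2004]
by_publication_year = {
    2001: [7, 20, 17, 7],   # 2001 reports
    2002: [0, 11, 21, 30],  # 2002 reports (no pre-publication citations)
    2003: [0, 0, 1, 0],     # 2003 reports
}

# Row totals: citations accumulated by each publication-year cohort.
row_totals = {year: sum(counts) for year, counts in by_publication_year.items()}

# Column totals: citations received across all cohorts in each year.
column_totals = [sum(counts[i] for counts in by_publication_year.values())
                 for i in range(len(citation_years))]

grand_total = sum(row_totals.values())

print(row_totals)     # {2001: 51, 2002: 62, 2003: 1}
print(column_totals)  # [7, 31, 39, 37]
print(grand_total)    # 114
```

The re-derived totals match the TOTAL row and the rightmost column of Table 2.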

Influence of CFE Reports in Journals

Table 3 lists the 60 journals[4] that cited CFE reports and the number of citations by year, sorted from most to fewest citations. Journal for Research in Mathematics Education accounted for the most CFE citations (17), which is not surprising given CFE’s mathematics focus. Qualitative Inquiry had the second-highest number of citations (12), followed by Educational Leadership and Educational Policy (5 each). Five journals accounted for three citations each, nine journals for two, and 42 for one.
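As a quick arithmetic check (a sketch, using only the figures stated in the paragraph above), the journal-level distribution can be tallied to confirm it accounts for all 60 citing journals and all 114 citations:

```python
# Tally the journal-level citation distribution described in the text.
top_journals = [17, 12, 5, 5]   # JRME, Qualitative Inquiry, Ed. Leadership, Ed. Policy
tail = {3: 5, 2: 9, 1: 42}      # citations per journal -> number of journals

n_journals = len(top_journals) + sum(tail.values())
n_citations = sum(top_journals) + sum(c * n for c, n in tail.items())

print(n_journals)   # 60
print(n_citations)  # 114
```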

For a deeper understanding of the context in which CFE reports are cited, we conducted a detailed analysis of the 45 Scientific Research in Education citations. See page 10.

Table 3: Number of Citations per Journal by Year, January 2001-July 2004

|Citing Journal |Number of Citations By Year |

| |2001 |2002 |2003 |2004 |Total |

|JOURNAL FOR RESEARCH IN MATHEMATICS EDUCATION |5 |5 |4 |3 |17 |

|QUALITATIVE INQUIRY |  |  |  |12 |12 |

|EDUCATIONAL LEADERSHIP |  |2 |1 |2 |5 |

|EDUCATIONAL POLICY |  |  |3 |2 |5 |

|AMERICAN EDUCATIONAL RESEARCH JOURNAL |  |  |3 |  |3 |

|JOURNAL OF THE LEARNING SCIENCES |  |1 |1 |1 |3 |

|PHI DELTA KAPPAN |1 |1 |1 |  |3 |

|SCHOOL PSYCHOLOGY QUARTERLY |  |3 |  |  |3 |

|TEACHERS COLLEGE RECORD |  |1 |  |2 |3 |

|APPLIED MEASUREMENT IN EDUCATION |  |1 |  |1 |2 |

|COMPUTERS IN HUMAN BEHAVIOR |  |2 |  |  |2 |

|EDUCATIONAL PSYCHOLOGY REVIEW |  |  |2 |  |2 |

|EXCEPTIONAL CHILDREN |  |1 |1 |  |2 |

|INTERNATIONAL JOURNAL OF TECHNOLOGY AND DESIGN EDUCATION |  |1 |  |1 |2 |

|JOURNAL OF EARLY INTERVENTION |  |2 |  |  |2 |

|JOURNAL OF RESEARCH IN SCIENCE TEACHING |  |1 |1 |  |2 |

|REVIEW OF EDUCATIONAL RESEARCH |  |1 |1 |  |2 |

|SCIENCE EDUCATION |  |  |2 |  |2 |

|ALTERNATIVE THERAPIES IN HEALTH AND MEDICINE |  |  |  |1 |1 |

|AMERICAN BEHAVIORAL SCIENTIST |  |  |  |1 |1 |

|AMERICAN PSYCHOLOGIST |  |  |  |1 |1 |

|ANNALS OF THE AMERICAN ACADEMY OF POLITICAL AND SOCIAL SCIENCE |  |  |1 |  |1 |

|ANNUAL REVIEW OF PSYCHOLOGY |  |  |  |1 |1 |

|ANTHROPOLOGY & EDUCATION QUARTERLY |  |  |1 |  |1 |

|APPLIED PSYCHOLOGICAL MEASUREMENT |1 |  |  |  |1 |

|BIOCHEMISTRY AND MOLECULAR BIOLOGY EDUCATION |  |  |1 |  |1 |

|BIOSCIENCE |  |  |1 |  |1 |

|CONSERVATION BIOLOGY |  |  |1 |  |1 |

|ECONOMICS OF EDUCATION REVIEW |  |  |  |1 |1 |

|EDUCATION AND URBAN SOCIETY |  |  |1 |  |1 |


|EDUCATIONAL RESEARCH |  |  |  |1 |1 |

|EDUCATIONAL STUDIES |  |  |  |1 |1 |

|ETR&D-EDUCATIONAL TECHNOLOGY RESEARCH AND DEVELOPMENT |  |  |1 |  |1 |

|ELEMENTARY SCHOOL JOURNAL |  |1 |  |  |1 |

|EVALUATION AND PROGRAM PLANNING |  |  |1 |  |1 |

|FOCUS ON EXCEPTIONAL CHILDREN |  |  |1 |  |1 |

|FRONTIERS IN ECOLOGY AND THE ENVIRONMENT |  |  |1 |  |1 |

|HARVARD EDUCATIONAL REVIEW |  |  |  |1 |1 |

|IEEE TECHNOLOGY AND SOCIETY MAGAZINE |  |  |1 |  |1 |

|IEEE TRANSACTIONS ON EDUCATION |  |1 |  |  |1 |

|INSTRUCTIONAL SCIENCE |  |  |1 |  |1 |

|JOURNAL OF CHEMICAL EDUCATION |  |  |1 |  |1 |

|JOURNAL OF EXPERIMENTAL CHILD PSYCHOLOGY |  |  |1 |  |1 |

|JOURNAL OF HIGHER EDUCATION |  |  |  |1 |1 |

|JOURNAL OF LEARNING DISABILITIES |  |  |  |1 |1 |

|JOURNAL OF SPECIAL EDUCATION |  |1 |  |  |1 |

|JOURNAL OF TEACHER EDUCATION |  |  |1 |  |1 |

|MARINE TECHNOLOGY SOCIETY JOURNAL |  |1 |  |  |1 |

|NATURE NEUROSCIENCE |  |1 |  |  |1 |

|PERSPECTIVES IN EDUCATION |  |  |  |1 |1 |

|PHYSICS TODAY |  |1 |  |  |1 |

|PSYCHOLOGICAL SCIENCE |  |  |  |1 |1 |

|PSYCHOMETRIKA |  |1 |  |  |1 |

|READING RESEARCH QUARTERLY |  |  |  |1 |1 |

|RESEARCH-TECHNOLOGY MANAGEMENT |  |1 |  |  |1 |

|SCHOOL EFFECTIVENESS AND SCHOOL IMPROVEMENT |  |  |1 |  |1 |

|SCIENCE |  |  |1 |  |1 |

|SCIENTIST |  |1 |  |  |1 |

|TEACHING AND TEACHER EDUCATION |  |  |1 |  |1 |

|THEORY INTO PRACTICE |  |  |1 |  |1 |

|TOTAL |7 |31 |39 |37 |114 |

Source: ISI Web of Science, Social Sciences Citation Index, 1956-2004; Arts & Humanities Citation Index, 1975-2004

With the number of citations per CFE report and their citing journals established, Table 4 lists the journals citing the four reports that were cited five or more times. For reports cited in more than one article of a journal, the frequency appears in parentheses.

Table 4: Reports Cited Five or More Times and the Journals in which They were Cited, January 2001-July 2004

|Publication Name |Total Citations|Citing Journal |

|Scientific |45 |ALTERNATIVE THERAPIES IN HEALTH AND MEDICINE |

|Research in | | |

|Education (2002) | |AMERICAN BEHAVIORAL SCIENTIST |

| | | |

| | |AMERICAN PSYCHOLOGIST |

| | | |

| | |ANNALS OF THE AMERICAN ACADEMY OF POLITICAL AND SOCIAL SCIENCE |

| | | |

| | |ANTHROPOLOGY & EDUCATION QUARTERLY |

| | | |

| | |EDUCATION AND URBAN SOCIETY |

| | | |

| | |EDUCATIONAL LEADERSHIP |

| | | |

| | |EDUCATIONAL POLICY |

| | | |

| | |EDUCATIONAL PSYCHOLOGY REVIEW (2) |

| | | |

| | |EDUCATIONAL RESEARCH |

| | | |

| | |EDUCATIONAL STUDIES |

| | | |

| | |ETR&D-EDUCATIONAL TECHNOLOGY RESEARCH AND DEVELOPMENT |

| | | |

| | |EXCEPTIONAL CHILDREN |

| | | |

| | |FOCUS ON EXCEPTIONAL CHILDREN |

| | | |

| | |HARVARD EDUCATIONAL REVIEW |

| | | |

| | |INSTRUCTIONAL SCIENCE |

| | | |

| | |JOURNAL FOR RESEARCH IN MATHEMATICS EDUCATION (7) |

| | | |

| | |JOURNAL OF EARLY INTERVENTION (2) |

| | | |

| | |JOURNAL OF HIGHER EDUCATION |

| | | |

| | |JOURNAL OF LEARNING DISABILITIES |

| | | |

| | |JOURNAL OF THE LEARNING SCIENCES |

| | | |

| | |QUALITATIVE INQUIRY (12) |

| | | |

| | |READING RESEARCH QUARTERLY |

| | | |

| | |SCHOOL PSYCHOLOGY QUARTERLY (2) |

| | | |

| | |TEACHERS COLLEGE RECORD |

| | | |

|Adding It Up: |19 |ANNUAL REVIEW OF PSYCHOLOGY |

|Helping Children | | |

|Learn Mathematics | |COMPUTERS IN HUMAN BEHAVIOR |

|(2001) | | |

| | |EDUCATIONAL POLICY |

| | | |

| | |ELEMENTARY SCHOOL JOURNAL |

| | | |

| | |EXCEPTIONAL CHILDREN |

| | | |

| | |JOURNAL FOR RESEARCH IN MATHEMATICS EDUCATION (7) |

| | | |

| | |JOURNAL OF EXPERIMENTAL CHILD PSYCHOLOGY |

| | | |

| | |JOURNAL OF SPECIAL EDUCATION |

| | | |

| | |PERSPECTIVES IN EDUCATION |

| | | |

| | |PHI DELTA KAPPAN (2) |

| | | |

| | |SCHOOL PSYCHOLOGY QUARTERLY |

| | | |

| | |TEACHERS COLLEGE RECORD |

| | | |


|Knowing What |16 |AMERICAN EDUCATIONAL RESEARCH JOURNAL (2) |

|Students Know: The| | |

|Science and Design| |APPLIED MEASUREMENT IN EDUCATION |

|of Educational | | |

|Assessment (2001) | |APPLIED PSYCHOLOGICAL MEASUREMENT |

| | | |

| | |BIOSCIENCE |

| | | |

| | |COMPUTERS IN HUMAN BEHAVIOR |

| | | |

| | |EDUCATIONAL LEADERSHIP |

| | | |

| | |EVALUATION AND PROGRAM PLANNING |

| | | |

| | |JOURNAL FOR RESEARCH IN MATHEMATICS EDUCATION |

| | | |

| | |JOURNAL OF THE LEARNING SCIENCES |

| | | |

| | |NATURE NEUROSCIENCE |

| | | |

| | |PHI DELTA KAPPAN |

| | | |

| | |PSYCHOMETRIKA |

| | | |

| | |REVIEW OF EDUCATIONAL RESEARCH |

| | | |

| | |TEACHERS COLLEGE RECORD |

| | | |

| | |THEORY INTO PRACTICE |

| | | |

|Technically |5 |IEEE TRANSACTIONS ON EDUCATION |

|Speaking: Why All | | |

|Americans Need to | |INTERNATIONAL JOURNAL OF TECHNOLOGY AND DESIGN EDUCATION (2) |

|Know More About | | |

|Technology (2002) | |RESEARCH-TECHNOLOGY MANAGEMENT |

| | | |

| | |SCIENTIST |

| | | |

Source: ISI Web of Science, Social Sciences Citation Index, 1956-2004; Arts & Humanities Citation Index, 1975-2004

These journals are largely education-focused and aimed at a mixed practitioner and researcher audience, with the exception of discipline-based journals such as Bioscience, Applied Psychological Measurement, and Nature Neuroscience.

Tracking the Influence of Scientific Research in Education through Citation Analysis

As mentioned previously, the most cited CFE report published between 2001 and 2003 is Richard J. Shavelson and Lisa Towne’s Scientific Research in Education (2002). We therefore examined this report closely, collecting the specific articles in which it was cited and compiling information that describes the context in which the report was used.

Table 5 below provides for each citing article:

• The total number of citations

• Whether other National Academy reports are cited

• A brief description of the article

• Whether Shavelson & Towne (2002) was mentioned in the abstract

• The number of in-text citations of Shavelson & Towne (2002)

Table 5: Summary of Citing Article Attributes, January 2001-July 2004

|Citing Author(s) |Total Citations |Other National |Description of Citing Article |Mention in |# of In-Text |

| | |Academy Citations | |Abstract |Citations |

|Barron (2003) |90 |No |Mixed methods study of collaborative |No |1 |

| | | |interactions among sixth graders, concluding | | |

| | | |that quality of interaction influences | | |

| | | |problem-solving and learning. | | |

|Bloch (2004) |42 |No |Critical ontological and epistemological |Yes |Numerous |

| | | |analysis of the document, identifying its | | |

| | | |strengths and weaknesses from a post-modernist | | |

| | | |approach. | | |

|Boruch, et al. (2004) |87 |2 |Methodological piece on place-randomized trials|No |1 |

| | | |using examples of trials in diverse settings to| | |

| | | |illustrate the approaches, challenges, and | | |

| | | |scientific value added of the method. | | |

|Buysse, Sparkman, and |51 |1 |Evaluates and advocates communities of practice|No |Numerous |

|Wesley (2003) | | |to “scrutinize and improve” educational | | |

| | | |practices. | | |

|Cannella and Lincoln |8 |No |Continuation of Lincoln and Cannella (2004 |N/A |Numerous |

|(2004 10(2): 165-174)| | |10(1)). | | |

|Cannella and Lincoln |47 |No |Postmodernist editorial about the need for |N/A |1 |

|(2004 10(2): 298-309)| | |diverse research methods in education (and all)| | |

| | | |research as well as their acceptance by those | | |

| | | |with the authority to create standards of | | |

| | | |knowledge. Critique of NRC report as report | | |

| | | |that does the opposite of the above agenda. | | |


|Ferrini-Mundy (NCTM |6 |1 |Description of NCTM’s Standards Impact Research|N/A |2 |

|Research Advisory | | |Group’s (SIRG) plans for research around | | |

|Committee) (2002) | | |Principles and Standards for School Mathematics| | |

| | | |(NCTM, 2000)—particularly regarding the | | |

| | | |documents’ impact and strengths and weaknesses,| | |

| | | |and the direction of future SIRG Standards | | |

| | | |work. | | |

|French (2003) |151 |No |Review of the literature on the paraeducator’s |N/A |1 |

| | | |role in special education programs. | | |

|Gorard (2004) |28 |No |Call for the British Educational Research |No |1 |

| | | |Association (BERA) to work to improve | | |

| | | |educational research in the UK rather than just| | |

| | | |defend its current state. | | |

|Hamann (2003) |32 |No |Recommendation to the field of anthropology to |No |1 |

| | | |use the work of Laura Nader as a guide for | | |

| | | |practice, notably her take on anthropology’s | | |

| | | |place in the context of “scientifically based” | | |

| | | |education research, e.g., she cautions that | | |

| | | |“‘Big Science’, or official science, is a | | |

| | | |social vehicle for silencing or subordinating | | |

| | | |other kinds of knowledge.” | | |

|Hammond (2004) |N/A |N/A |Book review |N/A |N/A |

|Howe (2004) |31 |No |Critique of “neoclassical” and “mixed methods” |No |Numerous |

| | | |experimentalism (mixed methods being | | |

| | | |represented by the document), while sketching | | |

| | | |an alternative “mixed-methods interpretivism” | | |

| | | |with a different epistemology. | | |

|Kamil (2004) |21 |No |A discussion of the current state of |N/A |1 |

| | | |quantitative reading research, with a bulk of | | |

| | | |the article describing areas of experimental | | |

| | | |quantitative reading research. | | |

|Kezar and Talburt |4 |No |Editorial introduction to the journal issue |N/A |2 |

|(2004) | | |explaining the necessity of pluralism in | | |

| | | |education research and summarizing how the | | |

| | | |issue supports that goal. | | |


|Lather (2004) |62 |No |Postmodern, post-colonial, feminist |No |Numerous |

| | | |perspective talk given as Egon Guba Invited| | |

| | | |Lecture at the AERA annual meeting (2003) | | |

| | | |critiquing the NRC report, NCLB, and the | | |

| | | |narrowing of acceptable research in the | | |

| | | |continuation of the “Science Wars.” | | |

|Lewis-Snyder, et al. |37 |No |A study applying criteria from the Task |No |2 |

|(2002) | | |Force on Evidence-Based Interventions | | |

| | | |(EBIs) in school psychology to a | | |

| | | |group-based design intervention study, | | |

| | | |finding evidence supportive of the program.| | |

|Lincoln and Cannella |19 |No |Argues against the report’s |No |Numerous |

|(2004; 10(1)) | | |misunderstanding and rejection of | | |

| | | |postmodernism as well as the general | | |

| | | |movement towards “methodological | | |

| | | |fundamentalism,” as represented by the | | |

| | | |report. | | |

|Lincoln and Cannella |59 |No |Identifies a general movement and funding |No |2 |

|(2004; 10(2)) | | |increase in the social sciences towards | | |

| | | |“right wing” scholarship, particularly that| | |

| | | |which attacks research identifying and | | |

| | | |examining inequalities. | | |

|Lincoln and Tierney |17 |No |Critical evaluation of institutional review|No |1 |

|(2004) | | |boards and how they act as impediments to | | |

| | | |non-“conventional” research. | | |

|Lowther, et al. (2003)|46 |No |Study of the impact of providing students |No |1 |

| | | |with their own laptops using a matched | | |

| | | |treatment-control group design. Found | | |

| | | |greater computer usage and writing and | | |

| | | |problem-solving advantages among students | | |

| | | |with own laptops. | | |

|Lyon and Chhabra |27 |1 |Argues for scientific methodology in |N/A |2 |

|(2004) | | |education research and evaluation. | | |

|Maxwell (2004) |18 |No |Postmodern critique of a particular |No |1 |

| | | |conception of “science” as quantitative | | |

| | | |science while simultaneously emphasizing | | |

| | | |qualitative research as legitimately | | |

| | | |scientific. | | |


|Mayer (2003) |16 |No |A call for evidence-based practice, rather |No |6 |

| | | |than doctrine-based, and issue-driven | | |

| | | |research, rather than method-driven. | | |

|Mayer (2004) |36 |1 |Review of research on discovery learning, |No |1 |

| | | |concluding that guided discovery is better | | |

| | | |than pure discovery. | | |

|Middleton, et al. |24 |1 |Call for Agenda for Research Action in |No |1 |

|(2004) | | |Mathematics Education. | | |

|National Council of |10 |1 |A discussion of the implications for |N/A |1 |

|Teachers of | | |mathematics education of No Child Left | | |

|Mathematics (NCTM) | | |Behind. Concludes by asking whether Dept. | | |

|(2003) | | |of Educ. will limit research funding to | | |

| | | |only randomized experiment and clinical | | |

| | | |trial studies. | | |

|Natriello (2004) |N/A |N/A |Book review |N/A |N/A |

|Ng (2003) |63 |No |Regarding teacher shortages in urban |No |1 |

| | | |schools, recommends looking beyond teacher | | |

| | | |certification efforts to instead view the | | |

| | | |problem from an organizational perspective,| | |

| | | |considering school systems’ “deeply rooted | | |

| | | |characteristics and inequalities.” | | |

|Odom and Strain (2002)|25 |No |An analysis of scientific evidence from |No |3 |

| | | |single-subject research on practices in | | |

| | | |early intervention/early childhood special | | |

| | | |education. | | |

|Pearson (2004) |97 |1 |Survey and argument about reading |No |1 |

| | | |instruction and research. Focuses on | | |

| | | |balance of research methods. | | |

|Popkewitz (2004) |32 |No |Critique of the “bedrock” of science, |Yes |Numerous |

| | | |arguing that the very ills the report seeks| | |

| | | |to eliminate may be reinstantiated by it. | | |

|Pressley, et al. |116 |No |Call for expanded scope of federal |No |1 |

|(2004) | | |educational research to include, for | | |

| | | |example, experimental research and ideally | | |

| | | |meta-analyses of these experiments. | | |

|Reigeluth (2003) |13 |No |Evaluative examination of the role of |No |1 |

| | | |knowledge, research, and technology in | | |

| | | |developing better ways of using the | | |

| | | |internet in education. | | |


|Ryan and Hood (2004) |35 |No |Argues that projects like the NRC’s |No |Numerous |

| | | |threaten quality education evaluation and | | |

| | | |research and that education practitioners | | |

| | | |and the public should be included in a | | |

| | | |debate about improving education. | | |

|Schorr, et al. (2003) |51 |No |A study of the impact of state mathematics |No |1 |

| | | |testing on teaching practices, finding that| | |

| | | |without effective professional development,| | |

| | | |testing prompts little change. | | |

|Silver (2002) |6 |No |Editorial calling for and establishing the |N/A |1 |

| | | |role of trust and an absence of conflicts | | |

| | | |of interest in peer-reviewed journals | | |

| | | |(particularly JRME) as a way of | | |

| | | |establishing credible scholarship. | | |

|Silver (2003) |8 |No |Editorial calling for a widespread internal|N/A |2 |

| | | |evaluation of the current state of | | |

| | | |education research. | | |

|Simon (2004) |10 |No |Discussion of the quality of qualitative |N/A |1 |

| | | |mathematics education research with an | | |

| | | |emphasis on identifying the most pertinent | | |

| | | |issues, e.g., moving beyond descriptive | | |

| | | |studies and what empirical work merits | | |

| | | |publication. | | |

|Snyder, et al. (2002) |45 |No |Study of the methods used in 450 |No |3 |

| | | |quantitative studies that comprise the | | |

| | | |literature review of the Division for Early| | |

| | | |Childhood Recommended Practices project | | |

| | | |with the goal of evaluating scientific | | |

| | | |rigor. | | |

|St Pierre (2003) |19 |No |The article begins a discussion about power|No |Numerous |

| | | |relations and epistemological foundations | | |

| | | |of the present debate about educational | | |

| | | |research. | | |

|Stoiber (2002) |32 |No |A response to a series of articles on |No |5 |

| | | |evidence-based interventions (EBIs) in | | |

| | | |school psychology, including | | |

| | | |recommendations for improving the | | |

| | | |initiative’s viability. | | |


|Turner, et al. (2003) |22 |No |A discussion of C2-SPECTR, a web-accessible|No |1 |

| | | |randomized controlled trials (RCT) citation| | |

| | | |register, and the increasing importance of | | |

| | | |RCTs. | | |

|Winn (2003) |17 |No |Starting as a response to Mayer (2003), a |No |1 |

| | | |call for education researchers to use all | | |

| | | |methods appropriate to their research | | |

| | | |question at their disposal, which includes | | |

| | | |both experimental and nonexperimental | | |

| | | |methods. | | |

|Woodward (2004) |153 |1 |A “historical review” of U.S. math |No |1 |

| | | |education from the 1950s, organized around | | |

| | | |three themes: 1) broad sociopolitical | | |

| | | |forces; 2) math research trends; 3) | | |

| | | |learning and instruction theories. | | |

|Zick and Benn (2004) |14 |No |A description and evaluation of a 7-week |No |1 |

| | | |course designed to teach complementary and | | |

| | | |alternative medicine (CAM) practitioners | | |

| | | |about research methodology. | | |

In-Depth Look at Shavelson & Towne (2002) Citations

For a more in-depth understanding of the ways Shavelson & Towne (2002) is cited, we provide for each citing article its full citation followed by a brief description of the way Shavelson & Towne (2002) was used in the article, the specific passage(s) in which the report was cited[5], and finally any other National Academy reports cited in the article. If the report is cited in the abstract, the abstract stands in for the brief description.

• Barron, Brigid. 2003. “When Smart Groups Fail.” The Journal of the Learning Sciences 12(3):307-359.

Cites Shavelson & Towne (2002) in terms of benefits of coding and quantification of group interactions.

“Quantification is useful as it can provide a way to (a) examine links between interactional processes and outcomes such as the quality of the group product and individual learning, (b) estimate the degree of between-group variability in a given population, and (c) advance our ability to specify important processes. Quantification of conversations also allows us to rule out simple explanations for variability in collaboration such as the possibility that no group member generated correct ideas. Coding with reliable schemes and subsequent quantification can also provide convincing evidence for researchers who are less confident in qualitative analytic work (National Research Council, 2002). For these reasons, measures of interaction were developed that would reflect how participants were responding to one another. Two main measures were the kind of responses that were given to proposals for solution, and the connectedness of a proposal to the content of the immediately preceding conversations. These are described in the Methods section” (p. 314).

• Bloch, Marianne. 2004. “A Discourse that Disciplines, Governs, and Regulates: The National Research Council's Report on Scientific Research in Education.” Qualitative Inquiry 10(1):96-110.

Abstract: “This article discussed the National Research Council (2002) report’s six guiding principles that define good scientific educational research. The principles and discussion focus on good educational research in terms of the “rigor” of the research as well as the “scientific” base and principles adhered to in doing the research. This article critically examines the normative base from which the consensual definitional of rigorous research and good science were drawn for the report and by its authors, and looks at the margins where other research falls, by definition. By using a variety of examples of research from around the world that highlights the need for complex historical, political, and cultural lens to examine policy, research, knowledge and power constructions, Bloch suggests that the report represents only one truth among many, limiting further inquiry. By examining the relation between knowledge and truth, the danger in a limited perspective is underlined” (p. 96).

“In the recent National Research Council (NRC) (2002) report titled Scientific Research in Education, edited by Richard Shavelson and Lisa Towne, new ways to imagine “quality” in scientific research are proposed. Shavelson, Towne, and a panel of experts in educational research in the United States were asked to respond to sharp critique about the “messiness” of educational research and the proposition that extreme views have “compounded the ‘awful reputation’ (Kaestle, 1993) of educational research and diminished its promise” (NRC, 2002, p. 20). In developing the positions presented by the NRC panel of 16 experts and 3 study coordinators, the report is very clear that both quantitative and qualitative research that follows six key scientific principles (see next section) and is “rigorous” can be considered good research in the field. In addition, the report suggests that the split between basic and applied educational research is a false divide, as is the divide between quantitative and qualitative (hard/soft) research. According to the report and its authors and experts, all rigorous research that follows the scientific guidelines and principles expressed in the report can be good or high quality research; by implication, other research is messy, poor, and contributes to the bad reputation educational research apparently has in the general scientific community, or at least within the federal government” (p. 96-97).

• Boruch, Robert, Henry May, Robert Turner, Julia Lavenberg, Anthony Petrosino, Dorothy De Moya, Jeremy Grimshaw, and Ellen Foley. 2004. “Estimating the Effects of Interventions That Are Deployed in Many Places: Place-Randomized Trials.” American Behavioral Scientist 47(5):608-633.

Cite Shavelson & Towne (2002) as an example of a group that recommends the use of randomized trials.

“Groups that advise government agencies have offered direct and indirect recommendations about randomized trials that use places as the units of allocation and analysis. In education, the National Academy of Sciences Committee on Scientific Principles in Education Research recognized as important the conduct of randomized trials in which schools, school districts, and classrooms were randomly assigned to different regimens (Shavelson & Towne, 2001)” (p. 614).

Other NAS citations:

Coyle, S. L., Boruch, R. F., and Turner, C. F. (Eds.). 1991. Evaluating AIDS Prevention Programs (Expanded ed.). Washington, DC: National Academy of Sciences.

Reiss, A. J. and Roth, J. A. (Eds.). 1993. Understanding and Preventing Violence. Washington, DC: National Academy of Sciences Press.

• Buysse, Virginia, Karen L. Sparkman, and Patricia W. Wesley. 2003. “Communities of Practice: Connecting What We Know with What We Do.” Exceptional Children 69(3):263-277.

Shavelson & Towne (2002) is used in a variety of contexts, from evidence supporting a position to an example of a critique of an idea.

“… there is little consensus about what educational researchers need to know and be able to do (Shavelson & Towne, 2002)” (p. 264).

“A recent report published by the National Research Council concluded that ‘the sharp divide between education research and scholarship and the practice of education in schools and other settings’ is one of the fundamental reasons for the lack of public support for education (Shavelson & Towne, 2002, p. 14). According to this report, the disconnect that exists between educational research and practice stems, in part, from the fact that researchers and practitioners operate in vastly different worlds in which, historically, most researchers have been men, while most teachers have been women” (p. 264).

Other NAS citation:

National Research Council. 1999. Improving Student Learning: A Strategic Plan for Education Research and Its Utilization. Committee on a Feasibility Study for a Strategic Education Research Program. Commission on Behavioral and Social Sciences and Education. Washington, DC: National Academy Press.

• Cannella, Gaile S. and Yvonna S. Lincoln. 2004. “Dangerous discourses II: Comprehending and Countering the Redeployment of Discourses (and Resources) in the Generation of Liberatory Inquiry.” Qualitative Inquiry 10(2):165-174.

Shavelson & Towne (2002) is again the focus of an investigation of “conservative discourses” and “methodological fundamentalism.”

“In the previous special issue of Qualitative Inquiry, titled Dangerous Discourses: Methodological Conservatism and Governmental Regimes of Truth (Lincoln & Cannella, 2004), the authors analyzed the National Research Council (NRC) report Scientific Research in Education to illustrate the dangers, inconceptualizations, forms of legitimation, and methodologies of present day reconstructions of the discourses of research. The NRC report is a U.S. government–requested project designed to clearly define the nature of research that is to be labeled as representing quality and therefore useful and fundable. Accurately referred to as methodological fundamentalism (House, in press), contemporary conservative research discourses as exhibited in the report have returned to a Western elitist high modernism that would reinscribe a narrow form of experimentalism as the “gold standard” for judging legitimacy and quality. Similar to the backlash against the civil rights and women’s movements during the past 15 to 20 years, diverse research philosophies and methodologies (e.g., critical theory, race/ethnic studies, feminist theories) that are conceptualized from within critical dispositions and have proposed to include the voices and life conditions of the traditionally marginalized are not “just” attacked but are directly rejected (NRC, 2002, “reject[s] the postmodernist school of thought,” p. 25). These conservative discourses are being produced from within the United States and other government complexes of power (and reinforced from various societal institutional positions) in a manner that would inscribe and provide resources to a universalist regulatory science” (p. 165).

• Cannella, Gaile S. and Yvonna S. Lincoln. 2004. “Epilogue: Claiming a Critical Public Social Science - Reconceptualizing and Redeploying Research.” Qualitative Inquiry 10(2):298-309.

Use of Shavelson & Towne (2002) as an example of the narrowing definition of research acceptable for government funding and of the construction of acceptable knowledge.

“We have previously stated that qualitative research is well established and will not disappear, a perspective that we continue to maintain … However, we must be aware that these gains [in acceptance of a diverse set of research methodologies, including critical ones] may just be illusions, visions that actually misrepresent the shifts that are occurring in conceptualizations of research purposes and methodologies. Those of us in education in the United States have been reminded of this possibility by the passage of the No Child Left Behind Act of 2001 and the National Research Council (2002) report, both government actions that narrowly define educational research that will receive funding and be considered of quality” (p. 299-300).

• Ferrini-Mundy, Joan (NCTM Research Advisory Committee, Standards Impact Research Group). 2002. Journal for Research in Mathematics Education 33(5):313-318.

Cites NRC as part of the shift in “the terrain for mathematics education research” that occurred after NCTM’s Research Advisory Committee reported on its activities in the Journal for Research in Mathematics Education in 2001.

“The National Research Council (NRC) has released a report, Scientific Research in Education (NRC, 2002b), which provides guiding principles for scientific inquiry and discusses design for conducting scientific research, in response to increasing controversy about what counts as “scientific” research in education” (p. 313).

Cites NRC in terms of the distinction between rigorous research and advocacy work.

“Rigorous research, by definition, cannot be conceptualized as advocacy work (Howe, 2002; NRC, 2002b). Thus, SIRG is not interested in research that argues blindly for the implementation of Principles and Standards. Rather, SIRG hopes to support research that conceptualizes the Standards as a good-faith effort to summarize current knowledge about high-quality mathematics teaching, learning, and curriculum” (p. 317).

Other NAS citation:

National Research Council (NRC). 2002. Investigating the Influence of Standards: A Framework for Research in Mathematics, Science, and Technology. Washington, DC: National Academy Press.

• French, Nancy K. 2003. “Paraeducators in Special Education Programs.” Focus on Exceptional Children 36(2):1-16.

Shavelson & Towne (2002) is used as evidence for the benefits of small class size.

“Although none of the research on the impact of paraeducators on attitudinal factors, or on social or academic engagement factors compared the data to student-achievement data, one might extrapolate that increased student engagement and independence, as well as a higher adult-to-student ratio, would be associated with increased academic achievement, given the well-established findings about the positive effects of student engagement (Marzano, 2003) and small class size (Glass & Smith, 1979; Shavelson & Towne, 2002). Yet, such extrapolation remains unconfirmed by empirical information” (p. 6).

• Gorard, Stephen. 2004. “The British Educational Research Association and the Future of Educational Research.” Educational Studies 30(1):65-76.

Cites Shavelson & Towne (2002) in terms of the U.S. situation of government-imposed guidelines on education research.

“One of the dangers for educational research, if we do not improve, is that funding will dry up (and the ESRC Teaching and Learning Research Programme, costing £30 million, has a big and frightening responsibility to the tax-payer in this respect). More likely, however, is that we will find ourselves in the same situation as our US colleagues (NSF, 2002). They now face a law that defines what government-funded educational research is and how they should do it. We need to improve in order not to have such legal criteria imposed on us, to retain the freedom to both support and challenge the use of particular methodological approaches independent of political fashion. Our research must relate to popular notions of what evidence is, but it can still be rigorous, apolitical, radical and independent” (p. 73).

• Hamann, Edmund T. 2003. “Imagining the Future of the Anthropology of Education if We Take Laura Nader Seriously.” Anthropology & Education Quarterly 34(4):438-449.

Cites Shavelson & Towne (2002), noting that input from an educational anthropologist led to the inclusion of ethnography in the report’s definition of science. Further mentions Scientific Research in Education in terms of a conflict between the report’s contents and its usage.

“Educational anthropologist Margaret Eisenhart helped assure that the National Research Council’s (2003) recent report, Scientific Research in Education, did not exclude ethnography from its definition of science, but we should neither rest on our laurels nor assume that our frequent reliance on the natural laboratory of humans acting in the complex real world is not under threat. In precisely the ways of privileging “Big Science” that Nader identifies, No Child Left Behind formally embraces randomized trials as the gold standard of research, a posture that Grover J. (Russ) Whitehurst, director of the U.S. Department of Education’s Institute of Education Sciences, emphasized in his remarks at the 2003 American Educational Research Association conference. Indeed, as Eisenhart (2002) points out, there is a discrepancy between what the NRC report says and how it is being used, particularly regarding the policies on scientifically based education research. Given the emphasis on science-based research, it behooves us to insistently contribute to defining what “scientifically based” means, ask how the current charge to conduct such research is being interpreted and enacted, and, when it excludes our repertoire of inquiry, to contest the policy’s implementation” (p. 442).

• Hammond, Paula. 2004. “Book Review: Scientific Research in Education.” Educational Research 46(1):91-92.

More a cursory summary than a review. The author is concerned with non-U.S. applications of the book; broadly finds the book “interesting” and sees it as providing useful insight for readers looking to promote “a relatively strong scientific culture” within education.

• Howe, Kenneth R. 2004. “A Critique of Experimentalism.” Qualitative Inquiry 10(1):42-61.

Applauds Shavelson & Towne’s (2002) endorsement of a role for mixed-methods research, but criticizes the report’s close linkage to neoclassical experimentalism and its privileging of “quantitative-experimental” methodologies over “qualitative-interpretive” ones in determining “what works.”

“Mixed-methods experimentalism is the view exemplified in the National Research Council (2002) report Scientific Research in Education (SRE)” (p. 49).

“SRE says things such as “At its core, scientific inquiry is the same in all fields” (National Research Council, 2002, p. 2) and the “accumulation of knowledge is the ultimate goal of [all] research” (National Research Council, 2002, p. 24)” (p. 49).

• Kamil, Michael L. 2004. “The Current State of Quantitative Research.” Reading Research Quarterly 39(1):100-107. (Subarticle within set of articles entitled “New Directions in Research.” “The Current State of Quantitative Research” is not listed individually in the journal table of contents.)

Cites Shavelson & Towne (2002) in terms of a call for research questions to guide methodologies, as opposed to “searching for singular solutions to complicated problems” (p. 101).

“Recently there have been renewed calls for research methodologies to be guided by the questions being asked (Shavelson & Towne, 2002)” (p. 101).

• Kezar, Adrianna and Susan Talburt. 2004. “Introduction: Questions of Research and Methodology.” The Journal of Higher Education 75(1):1-6.

While research methods have become more varied in recent decades, Kezar and Talburt suggest that Shavelson & Towne (2002) could “narrow” research methods through “legislat[ing] valid research as ‘scientific.’”

“At the same time, we are aware that current political conditions may encourage a retraction rather than an expansion of research approaches. We refer specifically to the National Research Council’s (NRC) (2002) Committee on Scientific Principles for Education Research’s recent report, Scientific Research in Education. This report signals an epoch in which the federal government would legislate valid research as “scientific” in increasingly narrow and normative ways, as evidenced by the NRC’s emphasis on experimental research and its ambivalence about multiple approaches to research that do not come from or create a research community based in consensus (Feuer, Towne, & Shavelson, 2002). In such a moment, it is increasingly important that scholars of higher education learn to articulate the values of a plurality of research approaches and of specific types of inquiry framed as nonscientific” (p. 2).

“Some may see this embracing of complexity and expansion of research as dangerous in an environment in which the National Research Council is calling for narrower standards that do not reflect such complexity and openness. But we as a field should engage with these conversations, especially within this political and historical climate” (p. 5).

• Lather, Patti. 2004. “This IS Your Father's Paradigm: Government Intrusion and the Case of Qualitative Research in Education.” Qualitative Inquiry 10(1):15-34.

Shavelson & Towne (2002) is used as an example and cause of the narrowing of research acceptable as scientific by the federal government and other conservative interested parties.

“Although I will attend some to the consequent effort to address congressional disdain regarding educational research via the National Research Council’s (NRC) (2002) report Scientific Research in Education, my primary interest is in the structure of the situation” (p. 16).

“It turned to the attempts of the NRC (2002) report to negotiate between the federal government and the educational research community what it means to do scientific educational research. In spite of the efforts of the NRC report toward a “big tent” of legitimate methods in educational research (Feuer, Towne, & Shavelson, 2002), Lagemann seemed adrift in addressing how calls for generalizability, objectivity, replicability, and a unified theory of science reinscribe a science under duress for some 30 years” (p. 17).

• Lewis-Snyder, Gretchen, Karen Callan Stoiber, and Thomas R. Kratochwill. 2002. “Evidence-Based Interventions in School Psychology: An Illustration of Task Force Coding Criteria Using Group-Based Research Design.” School Psychology Quarterly 17(4):423-465.

Cite Shavelson & Towne (2002) regarding the influence of contextual factors on evidence-based interventions (EBIs) and the potential unfeasibility of random assignment in school situations.

“In addition, we have suggested that researchers need to recognize, consider, and examine the unique context of the school setting, in addition to the unique needs and constraints faced by school staff, such as staff time, school space limitations, and school climate. These contextual factors can have powerful and differential influences on how interventions are designed and implemented and on outcomes obtained (Kratochwill & Stoiber, this issue; Shavelson & Towne, 2002).”

“However, it is important to note that random assignment is often not feasible in school situations (Kratochwill & Stoiber, this issue; Shavelson & Towne, 2002; Stoiber & Kratochwill, 2000).”

• Lincoln, Yvonna and Gaile S. Cannella. 2004. “Dangerous Discourses: Methodological Conservatism and Governmental Regimes of Truth.” Qualitative Inquiry 10(1):5-14.

Shavelson & Towne (2002) is used as an example of, and analyzed as, a governmental act constituting a “backlash” against “diverse forms of research.”

“In some respects, the current controversies surrounding the stringent language regarding what counts as research embedded in the No Child Left Behind Act of 2001 and the recent National Research Council (NRC) report (see Feuer, Towne, & Shavelson, 2002) resembles the reportedly true, but probably prophetic, story attributed to Bourdieu [De-centering the Western Canon if those originally in power could not control the center]” (p. 5).

“The return of high modernism, the backlash against diverse forms of research, and recent direct governmental actions (such as the NRC report) that would create a science for the “common good” (Foucault, 1991, pp. 94-95) are awakening some scholars to the dangers in these present day reconstructions of the discourses of research” (p. 6).

• Lincoln, Yvonna S. and Gaile S. Cannella. 2004. “Qualitative Research, Power, and the Radical Right.” Qualitative Inquiry 10(2):175-201.

Shavelson & Towne (2002) is used as an example of antifeminism for its “old-style” calls for “academic research standards and quality.”

“Antifeminism is expressed in at least four ways: (a) the use of sound bites such as “political correctness” to discredit diverse academic voices; (b) planned attacks on feminism by women co-opted by the conservative Right, co-opted for a variety of reasons and in a variety of ways; (c) old-style dismissal of women; and (d) old-style calls for academic research standards and quality such as those in the National Research Council (2002) report” (p. 185).

• Lincoln, Yvonna S. and William G. Tierney. 2004. “Qualitative research and institutional review boards.” Qualitative Inquiry 10(2):219-234.

Shavelson & Towne (2002) is used as an example of “backlash against qualitative research.”

“Or there is, perhaps, something far more ominous occurring: a backlash against qualitative research in all its nonrationalistic or postmodern forms (Lincoln & Cannella, 2002; National Research Council, 2002; No Child Left Behind Act of 2001)” (p. 223).

“Definition of research. Another problem that appears to be emerging is in the definition of research now widely adopted by many IRBs (and now codified in the National Research Council’s [2002] report, Scientific Research in Education)” (p. 230).

• Lowther, Deborah L., Steven M. Ross, and Gary M. Morrison. 2003. “When Each One Has One: The Influences on Teaching Strategies and Student Achievement of Using Laptops in the Classroom.” Educational Technology Research and Development 51(3):23-44.

Cite Shavelson & Towne (2002) in terms of the need for further research using “experimental-type methodology” (p. 43).

“Given the ex post facto design employed in this study, the present results can be considered only suggestive rather than conclusive about the benefits of the laptop program. Additional research, using experimental-type methodology (Shavelson & Towne, 2002) is needed to demonstrate similar findings, in diverse contexts and with other student groups (e.g., disadvantaged populations)” (p. 43).

• Lyon, G. Reid and Vinita Chhabra. 2004. “The Science of Reading Research.” Educational Leadership 61(6):12-17.

Cite Shavelson & Towne (2002) in “Qualitative and Quantitative Research” section in terms of the validity of and need for both qualitative and quantitative studies.

“Research methodologies fall into two major types—qualitative and quantitative. Both types of research can be valuable and relevant to improving education. Both are equally scientific—if we align the right method with the specific question or questions being addressed (Shavelson & Towne, 2002; Wiersma, 2000)” (p. 14).

“The type of research question asked tells us which research method (or combination of methods) to employ. Qualitative research has a role in helping to tease out the how or why in quantitative studies designed to find out what works. But qualitative research cannot produce generalizable results identifying instructional strategies that will improve reading achievement. Only quantitatively based studies—experimental or quasi-experimental—can do that. However, both quantitative and qualitative research methods are important and necessary if we are to develop the fullest understanding of which specific instructional approaches are most effective for which students and why (Shavelson & Towne, 2002)” (p. 14).

Other NAS citation:

Snow, C., Burns, S., and Griffin, P. (Eds.). 1998. Preventing reading difficulties in young children. Washington, DC: National Academy Press.

• Maxwell, Joseph A. 2004. “Reemergent Scientism, Postmodernism, and Dialogue Across Differences.” Qualitative Inquiry 10(1):35-41.

Cites Shavelson & Towne (2002) both regarding Lather’s (2004) critique of the report, with which he largely agrees, and in defense of other aspects of the report.

“I also agree with Lather’s (2004) critique of the National Research Council report Scientific Research in Education (Shavelson & Towne, 2002), which is that despite the committee’s attempt to promote an inclusive view of science, the report is grounded in narrow and outdated assumptions about what it means to be “scientific,” assumptions that do not even validly represent actual practice in the natural sciences, let alone apply to educational research. My grounds for this critique may be somewhat different from hers because I am sympathetic to many of the characteristics of science that the authors of this report advocate. I believe that we can retain the essential features of qualitative inquiry—a concern with meaning and interpretation, an emphasis on the importance of context and process, and an inductive and hermeneutic strategy—and still defend our work as scientific (Maxwell, 1996, 2002, in press-a, in press-b)” (p. 36).

• Mayer, Richard E. 2003. “Learning Environments: The Case for Evidence-Based Practice and Issue-Driven Research.” Educational Psychology Review 15(4):359-366.

Cites Shavelson & Towne (2002) as support for his call for evidence-based practice and issue-driven research.

“Winn’s essay, “Current Trends in Educational Technology Research: The Study of Learning Environments,” is useful because it raises the larger question of what constitutes “the study of learning environments.” The starting point for discussions about the nature of educational research is agreement that educational studies should be scientific. What does it mean to do scientific research in education? This is a question that has generated much discussion in our field (Lagemann, 2000; Levin and O’Donnell, 1999; Shavelson and Towne, 2002; Slavin, 2002)” (p. 360).

“Views on the nature of scientific studies of education can be extreme (Phillips and Burbules, 2000). On one extreme are the positivists who argue for quantitative experimental studies of behavior as the only route to progress. On another extreme are postmodernists who hold that all research is hopelessly biased by the researcher and all views can be equally valid. Like many educational thinkers (Phillips and Burbules, 2000), I reject both the narrow view of science and the relativistic view of science. Thus, I begin with the premise that science is a way of understanding fundamental educational issues, such as how people learn with technology. Two important aspects of a scientific approach to educational research concern the role of evidence and the role of methods for generating evidence (Shavelson and Towne, 2002)” (p. 360).

• Mayer, Richard E. 2004. “Should There Be a Three-Strikes Rule Against Pure Discovery Learning? The Case for Guided Methods of Instruction.” American Psychologist 59(1):14-19.

Cites Shavelson & Towne (2002) in terms of the importance of scientifically rigorous research.

“Second, psychology can offer a powerful methodology for testing theories of how to help people learn. To determine the value of any theory of learning, it is useful to derive clear predictions and to test them with valid evidence from scientifically rigorous research (Shavelson & Towne, 2002)” (p. 18).

Other NAS citation:

Bransford, J.D., A.L. Brown, and R.R. Cocking (Eds.). 1999. How People Learn. Washington, DC: National Academy Press.

• Middleton, James A., Barbara Dougherty, M. Kathleen Heid, and Beatriz D’Ambrosio. 2004. “An Agenda for Research Action in Mathematics Education: Beginning the Discussion.” Journal for Research in Mathematics Education 35(2):74-80.

Cite Shavelson & Towne (2002) as one of four projects that comprise “the base activity on which the Agenda we are proposing must be built” (p. 78), although the NCTM envisions its agenda as having a broader scope.

“Adding it Up (Kilpatrick, Swafford, & Findell, 2001) consolidates much of the evidence in learning and teaching number concepts and operations in a manner appropriate for practitioners, and the National Research Council (Shavelson & Towne, 2002) has given serious thought to the nature of scientific evidence in research on education” (p. 77-78).

Other NAS citation:

Kilpatrick, J., J. Swafford, and B. Findell (Eds.). 2002. Adding It Up: Helping Children Learn Mathematics. Washington, DC: National Academy Press.

• National Council of Teachers of Mathematics (NCTM) Research Advisory Committee. 2003. “Educational Research in the No Child Left Behind Environment.” Journal for Research in Mathematics Education 34(3):185-190.

Cites Shavelson & Towne (2002) as a lead-in to the concluding question: “Will the Department of Education, in actually deciding what to fund, take the broader view of scientifically based research? Or will only randomized experiments and clinical trials be funded?” (p. 189).

“What remains unclear is the extent to which nonexperimental research will be eligible for funding. As noted in the NRC (2002) report:

‘Differences in the phenomena typically under investigation do distinguish the research conducted by physical and social scientists. For example, the social and cultural work of sociologists and cultural anthropologists often do not lend themselves to the controlled conditions, randomized treatments, or repeated measures that typify investigations in physics or chemistry. Phenomena such as language socialization, deviancy, the development of an idea, or the interaction of cultural tradition with educational instruction are notoriously impervious to the controls used in the systematic investigations of atoms or molecules. Unlike atoms or molecules, people grow up and change over time. The social, cultural, and economic conditions they experience evolve with history. The abstract concepts and ideas that are meaningful to them vary across time, space, and cultural tradition. These circumstances have led some social science and education researchers to investigative approaches that look distinctly different from those of physical researchers, while still aligning with the guiding principles outlined in Chapter 3’ (p. 81)” (p. 189).

Other NAS citation:

Kilpatrick, J., J. Swafford, and B. Findell (Eds.). 2002. Adding It Up: Helping Children Learn Mathematics. Washington, DC: National Academy Press.

• Natriello, Gary. 2003. “Book Review: Scientific Research in Education.” Teachers College Record 106(2):329-337.

A thorough summary and critique of what the author deems to be an important and well-conceived statement on scientific education research.

• Ng, Jennifer C. 2003. “Teacher Shortages in Urban Schools: The Role of Traditional and Alternative Certification Routes in Filling the Voids.” Education and Urban Society 35(4):380-398.

Cites Shavelson & Towne (2002) in terms of a statement on educational reform as a “contested field.”

“This explanation hides the more deeply rooted characteristics and inequalities of the school system such that policy makers cannot be held accountable to resolving them and the role of values as well as multiple—and sometimes unintended, uncontrollable—effects that make educational reform a contested field (Shavelson & Towne, 2002) become neutralized” (p. 395).

• Odom, Samuel L. and Phillip S. Strain. 2002. “Evidence-Based Practice in Early Intervention/Early Childhood Special Education: Single-Subject Design Research.” Journal of Early Intervention 25(2):151-160.

Cite Shavelson & Towne (2002) first more generally regarding evidence-based educational practice and then regarding single-subject methodology.

“In the fields of general education and special education, great emphasis has been placed on basing educational practice on scientific evidence (Cardine, 1995; Shavelson & Towne, 2002)” (p. 151).

“In fact, in a report of a recent committee convened by the National Academy of Sciences (NAS) to review the status of scientific research in education and make recommendations for future research, single-subject methodology was never mentioned (Shavelson & Towne, 2002)” (p. 151).

“Nevertheless, rigorous single-subject methodology (Kazdin, 1982; Tawney & Gast, 1984; Wolery & Dunlap, 2001) certainly meets the principles of scientific research the NAS committee established: (a) conducting an empirical investigation, (b) linking findings to a theory of practice, (c) using methods that permit direct investigation, (d) providing a coherent chain of reasoning, and (e) replicating and generalizing across studies (Shavelson & Towne, 2002)” (p. 151).

• Pearson, P. David. 2004. “The Reading Wars.” Educational Policy 18(1):216-252.

Cites Shavelson & Towne (2002) as encouraging “complementarities and convergence,” not competition, among research methods.

“So let us talk about complementarities and convergence among methods rather than competition and displacement of one worldview with another. This is the message of the recent report on educational research by a committee empanelled by the National Academy of Science (Shavelson & Towne, 2002)” (p. 235).

Other NAS citation:

Snow, C., S. Burns, and P. Griffin (Eds.). 1998. Preventing Reading Difficulties in Young Children. Washington, DC: National Academy Press.

• Popkewitz, Thomas S. 2004. “Is the National Research Council Committee's Report on Scientific Research in Education Scientific? On Trusting the Manifesto.” Qualitative Inquiry 10(1):62-78.

Abstract: “The National Research Council Committee’s report, in part a response to congressional legislation, pursues an outline for the foundations of an education science necessary for policy making. This article focuses on these foundations and argues that they have little to do with science and its relation to policy may not be as fruitful as the committee believes. The so-called bedrocks of science are based on, at best, weak premises and an unrigorous understanding of the sociology, history, and philosophy of science. There is a nostalgia for a simple and ordered universe of science that never was. The resulting models of science embody a technological sublime that weaves together the procedures of administration and engineering with utopian visions and discourses of progress ordered by the expansion of its expertise. Finally, although the resulting framework was to relieve educational research of its “awful” reputation, the report may reinstitute and reinstantiate those very practices” (p. 62).

“The National Research Council Committee’s (2002) report titled Scientific Research in Education (hereafter the Report), in part a response to congressional legislation, pursues three tasks. It (a) outlines the foundations of all science, (b) uses these assumptions as the basis of an educational science, and (c) considers how a federal agency can be organized to fund and foster the growth of this science and provide the expertise for policy making” (p. 62).

• Pressley, Michael, Nell K. Duke, and Erica C. Boling. 2004. “The Educational Science and Scientifically Based Instruction We Need: Lessons from Reading Research and Policymaking.” Harvard Educational Review 74(1):30-61.

Cites Shavelson & Towne (2002) as another example of what the authors promote for educational research: diverse, creative instructional science.

“We hope that this article heightens sensitivity to an emerging position about the research methods that can inform instructional debates. We need many types of data and a variety of methods of synthesizing them. As we produce new analyses and syntheses, new and often better understandings of excellent instruction will emerge. Excellent instructional science will have to be as diverse and creative as the excellent instruction it is intended to inform (see also Shavelson & Towne, 2002). Although it is probably too much to expect that instructional science can ever be depoliticized entirely, we must move away from congressional mandates about forms of instruction that are based on a narrow band of research favored by the sitting federal administration. Unchecked, the White House and Congress will always make decisions about education politically, without regard for the broad range of educational science that must be understood to make intelligent decisions about curriculum and instruction based on evidence.”

• Reigeluth, Charles M. 2003. “Knowledge Building for Use of the Internet in Education.” Instructional Science 31:341–346.

Shavelson & Towne (2002) is used to support the point that education research will likely continue to be undervalued by practitioners unless more dialogue about design theory occurs.

“Dialogue. Given the distinction between descriptive theory and design theory, it seems that design theory is more useful to help practitioners use the Internet in education. So why are we researchers not devoting more of our efforts to developing design theory? Is it because we don’t realize its greater value? Is it because we don’t know how to develop such theory? Is it because we suspect that journals won’t publish such scholarly work? Perhaps our field could benefit from more dialogue about these questions. Without such reflection, educational research may continue to be perceived as of little value by most practitioners (National Research Council, 2002)” (p. 343).

• Ryan, Katherine and Lisa K. Hood. 2004. “Guarding the Castle and Opening the Gates.” Qualitative Inquiry 10(1):79-95.

Abstract: “The passage of No Child Left Behind legislation and the publication of the National Research Council’s Scientific Research in Education have generated much discussion and criticism of the call for educational research and evaluation that is more scientific. Using examples from their own field research and evaluation experiences, the authors illustrate their concerns with the concepts and principles of scientific-based research. They propose both qualitative and quantitative methods are critical for studying the structural, political, systemic issues that surround complex educational issues and organizations. The authors call for the creation of a space for publication engagement to include the public’s collective wisdom and experiences to discuss and deliberate educational issues and generate possible solutions” (p. 79).

“Like many educational communities, the College of Education at the University of Illinois at Urbana held a seminar and panel devoted to “scientifically based research” (SBR) and the National Research Council (NRC) (2002) report” (p. 79).

• Schorr, Robert Y., William A. Firestone, and Lora Monfils. 2003. “State Testing and Mathematics Teaching in New Jersey: The Effects of a Test Without Other Supports.” Journal for Research in Mathematics Education 34(5):373-405.

Cites Shavelson & Towne (2002) in support of a statement on the status of education research.

“There is a small body of research that has looked at states with more performance-based assessments with and without stronger sanctions (Koretz, Mitchell, Barron, & Keith, 1996; Stecher & Barron, 1999) or with improved learning opportunities (Stecher & Mitchell, 1995), but in a field that lacks replication (Shavelson & Towne, 2002), the body of work from which to generalize is limited” (p. 399).

• Silver, Edward A. 2002. “Preserving Trust: Avoiding Conflicts of Interest in the Publication Process.” Journal for Research in Mathematics Education 33(4):235-238.

Shavelson & Towne (2002) is used as a credible source to open an argument that peer review and journal publication are important for advancing education research, and that such an environment is fostered through trust and the appropriate handling of conflicts of interest.

“The Journal for Research in Mathematics Education is an outlet for the publication of high-quality scholarship on mathematics teaching and learning. In particular, it is a place for scholars to publish the findings of research studies that pertain to a wide variety of issues in this broad domain. As the National Research Council noted in its recent report on educational research (NRC, 2002), it is through the process of peer review and journal publication that research questions, methods, and findings are made available for professional public scrutiny. Peer review and publication are central to the development and dissemination of scientific knowledge” (p. 235).

• Silver, Edward A. 2003. “Improving Education Research: Ideology or Science?” Journal for Research in Mathematics Education 34(2):106-109.

Cites Shavelson & Towne (2002) as a “penetrating analysis of the complexities of education research” and uses politicians’ controversial advocacy of NIH standards as a launching point for an argument for greater discussion and analysis of methodological and evidential issues within the education research community.

“A recent effort to examine the status and quality of education research was undertaken by a committee of the (U.S.) National Research Council [(NRC) 2002], and its report outlines some critical features of scientifically credible education research and several design principles to guide the creation of an improved education research enterprise. But efforts to improve education research from inside the field are running headlong into initiatives launched by politicians outside the field… some politicians in the United States have turned to medicine. Ignoring, or perhaps dismissing, the penetrating analysis of the complexities of education research offered by the NRC report, they have turned to the compelling model offered by the National Institutes of Health (NIH)” (p. 106).

• Simon, Martin A. 2004. “Raising Issues of Quality in Mathematics Education Research.” Journal for Research in Mathematics Education 35(3):157-163.

Cites Shavelson & Towne (2002) as part of the argument that simply undertaking a study does not automatically make it worthy of publication; rather, (1) the study must contribute to knowledge in the field, and (2) its claims must be strongly warranted.

“In the natural sciences, many research studies do not yield publishable results. Research studies are of unpredictable length because the researcher cannot foresee what interesting results will be obtained. Is there a parallel in mathematics education? I believe so. ‘Many scientific studies in education and in other fields will not pan out’ (National Research Council, 2002, p. 25). Too often, mathematics education researchers gather data for a predetermined amount of time and then expect to publish the results of the study” (p. 160).

• Snyder, Patricia, Bruce Thompson, Mary E. McLean, and Barbara J. Smith. 2002. “Examination of Quantitative Methods Used in Early Intervention Research: Linkages With Recommended Practices.” Journal of Early Intervention 25(2):137-150.

Cites Shavelson & Towne (2002) first more generally regarding the increasing emphasis on evidence-based practice, and then more specifically regarding the need for different research methodologies for different practices and the point that research design alone does not make a study scientific.

“The increased emphasis on evidence- or scientifically-based practice (e.g., Dunst, Trivette, & Cutspec, 2002; National Research Council, 2002) compels those in early intervention/early childhood special education (EI/ECSE) to evaluate the extent to which the quantitative research informing practices is convincing, compelling, and consistently produced” (p. 137).

“Several authors have suggested, however, that uniformly viewing randomized controlled clinical trials as the sole source of evidence for informing practice might be unwise because different types of recommended practices might require the use of different types of research methodologies (Dunst, et al., 2002; National Research Council, 2002)” (p. 145).

“As noted in the publication Scientific Research in Education (National Research Council, 2002), research design is one aspect of a larger process of rigorous inquiry and the design of a study does not itself make it scientific” (p. 145).

• St. Pierre, Elizabeth Adams. 2003. “Refusing Alternatives: A Science of Contestation.” Qualitative Inquiry 10(1):130-139.

Argues that Shavelson & Towne’s (2002) lack of epistemological awareness and rejection of “extreme” postmodern inquiry are confused and misinformed, and that a “voracious desire to center and control marks the end of the democratic politics that inspire our educational system.”

“The flurry of concern in 2002 following the publication of the National Research Council’s (NRC) (2002) report, Scientific Research in Education, followed by the November 2002 special issue of the American Educational Research Association’s Educational Researcher that focused on the NRC report and a multitude of formal and informal communications among educational researchers about the fate of science in the age of scientific-based research (SBR) and evidence-based research (EBR), has abated to some extent as the U.S. Department of Education puts into place the bureaucratic structure that will define, fund, and disseminate what counts as quality scientific educational research” (p. 130).

“In this request for proposals, a randomized experimental trial is clearly “superior,” and the validity of qualitative evidence which is not “hard evidence” (NRC, 2002, p. 130), is suspect” (p. 130).

• Stoiber, Karen Callan. 2002. “Revisiting Efforts on Constructing a Knowledge Base of Evidence-Based Intervention Within School Psychology.” School Psychology Quarterly 17(4):533-546.

Cites Shavelson & Towne (2002) as support for her perspective on the direction evidence-based interventions (EBIs) in school psychology should follow.

“In constructing a knowledge base of EBIs, our vision was to resist the "hierarchical" distinctions between "quantitative" and "qualitative" methodologies that historically produced tensions between researchers and practitioners. Consistent with the National Research Council (NRC) (see Shavelson & Towne, 2002), we viewed such distinctions as "outmoded" and counterproductive.”

“Furthermore, the context of schools does not permit a precise, concise, or unified view of intervention research within school psychology. In this regard, it was critical that the approach taken did not reflect the assumption that scientific rigor will ensure the use of our products in school psychology intervention practices. This perspective is consistent with the social context perspective emphasized by the NRC's (Shavelson & Towne, 2002) position that ‘scientific quality and rigor are necessary, but not sufficient, conditions for improving the overall value of education research’ (p. 25).”

• Turner, Herbert, Robert Boruch, Anthony Petrosino, Julia Lavenberg, Dorothy de Moya, and Hannah Rothstein. 2003. “Populating an International Web-Based Randomized Trials Register in The Social, Behavioral, Crime, and Education Sciences.” Annals of the American Academy of Political and Social Science 589:204-223.

Cites Shavelson & Towne (2002) for part of the definition of randomized controlled trials (RCTs).

“Randomized controlled trials (RCTs) are studies in which individuals or groups of individuals, including organizations, are randomly assigned to different interventions in order to evaluate the relative effectiveness of the interventions. When implemented well, RCTs produce unbiased estimates of these relative effects (Boruch, 1997; Rossi, Freeman & Lipsey, 1999). The freedom from bias is assured by the random assignment of individuals or groups of individuals to the different regimens. There is no systematic difference among the groups so composed (Shavelson & Towne, 2002; Mosteller & Boruch, 2002). This in turn produces a fair comparison. Alternatives to RCTs do not carry the same assurance because they are not randomized” (p. 3).

• Winn, William. 2003. “Research Methods and Types of Evidence for Research in Educational Technology.” Educational Psychology Review 15(4):367-373.

Cites Shavelson & Towne (2002) as part of this response to Mayer (2003), discussed above. The citation concerns Winn’s own work, which is consistent with Mayer’s (2003) arguments regarding the need for scientifically sound cognition studies.

“Mayer’s reading of my article was doubly puzzling in the light of three other manuscripts I was working on concurrently with the article in question. These are not principally concerned with the matter of this discussion. However, they express opinions consistent with Mayer’s. In the first, I advocate a return to a more rigorously scientific and more direct examination of learning, an activity that educational researchers seem to have drifted away from since the mid-eighties (Winn, 2003, pp. 88–89). In the second (Winn, in press), I similarly argue that researchers should conduct careful studies of cognition so that research can be put onto a sound scientific footing. I cite the same important report by Shavelson and Towne (2002) that Mayer cites in his critique. The third (Winn, et al., 2002) describes an experimental study of a learning environment that follows essentially the same methodology that Mayer holds up as a model for educational research (Mayer, 2003). Students were assigned randomly to treatments and some of the data underwent regression analysis and analysis of covariance. Conclusions were based on the tests of significance these statistics provided” (p. 368).

• Woodward, John. 2004. “Mathematics Education in the United States: Past to Present.” Journal of Learning Disabilities 37(1):16-31.

Cites Shavelson & Towne (2002) as supporting his assertion that “research designs alone will not yield a satisfactory answer to the question, ‘What works?’”

“To be sure, the tenor of these criticisms coincides with political views inside the second Bush administration. It is ironic that math educators agree that quantitative research methods could yield a more balanced and reliable foundation for continued reform efforts (Burrill, 2001; Cohen, Raudenbush, & Ball, 2002; Research Advisory Committee, 2002). However, there are a number of important technical and theoretical considerations that must be taken into account. First, research designs alone will not yield a satisfactory answer to the question, “What works?” Scientific Research in Education (National Research Council, 2002), which was commissioned by the National Research Council during the growing political debate over educational research and practice, made this clear. This report attempted to reconcile the tensions between different types of research methods by noting that different questions require different methodologies.” (p. 26).

Other NAS citation:

National Research Council. 1989. Everybody Counts: A Report to the Nation on the Future of Mathematics Education. Washington, DC: National Academy Press.

• Zich, Suzanna M. and Rita Benn. 2004. “Bridging CAM Practice and Research: Teaching CAM Practitioners About Research Methodology.” Alternative Therapies in Health and Medicine 10(3):50-56.

The evaluation of a research methodology course for CAM practitioners followed the evaluation model outlined in Shavelson & Towne (2002).

“In assessing the impact of this course, we followed a model of evaluation outlined in a scientific report of educational research by the National Academy of Sciences. This report describes 3 questions that guide the evaluation process: (1) What occurred? (2) What were the outcomes? And (3) What explains the outcomes?” (p. 51).

Summary of In-Depth Look at Shavelson & Towne (2002) Citations

Shavelson & Towne (2002) was written as a consensus document, and while it may reflect the consensus of its authors, it does not command consensus in the broader scholarly community. Within that community, some view the report quite positively, while others take issue with its message as well as its implications as a document tied to governmental authority. Judging by its citations, more articles in the journals surveyed cite the document in a positive or neutral manner than in a negative one. Those who disagree with or are concerned about the report’s position, however, have largely been published in two issues of Qualitative Inquiry responding to the report and to other recent governmental efforts to influence the quality of scholarly research. Even within some of these critical articles there is qualified support for the report, suggesting that discussion in peer-refereed venues lends itself to more complex, nuanced analysis.

Conducting an ISI Web of Science Citation Analysis

The first step in searching for citations through the Web of Science is to open the site in a web browser at the URL: . On the main page of the site, click the link for “Web of Science.”

[pic]

Once on the main page of the Web of Science, select the databases and, if desired, the years appropriate for the search. In our case, we left the year range open across the entire database to catch potentially mislabeled citation years. We then selected the Social Sciences Citation Index (1956-present) and the Arts & Humanities Citation Index (1975-present). Once the correct indexes have been selected, click “CITED REFERENCE SEARCH.”

[pic]

Searching for citing documents in the Web of Science is more complicated than might be expected. The problems are twofold, stemming from factors both external and internal to ISI. On the one hand, ISI must deal with irregular and incorrect citations in journal articles; because it generally preserves citations as they appear in the citing articles, those irregularities propagate into its database (it is assumed that some corrections are made). On the other hand, ISI’s own publication and author listings are irregular and sometimes even cryptic. For example, the document “Knowing What Students Know: The Science and Design of Educational Assessment” is listed in the database under at least the following titles, or “short names”:

KNOWING STUDENTS KNO

KNOWING WHAT STUDENT

KWOWING STUDENTS KNO

with the following authors:

PELLEGRINO J

PELLEGRINO P

PELLEGRION JW

PELLIGRINO JW

Further complicating the author search was the erratic citation of documents with “corporate authors,” that is, documents authored by organizations or committees rather than individuals. In some cases the corporate author was listed; in others, the editors were, both with the same “short name” problems described above.

As a result, a basic “Cited Reference Search” using the “CITED AUTHOR” or “CITED WORK” field with a full title or author’s name does not reliably retrieve its intended target. To deal with these problems, we developed a set of strategies for locating documents in the database.
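The core difficulty is that the same work appears under several truncated and misspelled “short names.” As a purely illustrative sketch (this is not part of the ISI interface; the helper name and 20-character width are assumptions), fuzzy string matching shows how such variants can be grouped with a single target:

```python
from difflib import get_close_matches

# Variant "short names" observed in the ISI database for
# "Knowing What Students Know" (truncations and typos included):
variants = [
    "KNOWING STUDENTS KNO",
    "KNOWING WHAT STUDENT",
    "KWOWING STUDENTS KNO",
]

def short_name(title, width=20):
    # Approximate the database's truncated form: uppercase, cut to width.
    return title.upper()[:width]

target = short_name("Knowing What Students Know")

# Fuzzy matching catches truncated and misspelled variants that an
# exact "CITED WORK" search would miss.
matches = get_close_matches(target, variants, n=5, cutoff=0.6)
print(matches)
```

All three variants clear the 0.6 similarity cutoff, which is why scanning a neighborhood of the index, rather than exact lookup, is the workable strategy.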

In the case of volumes with a clear editor or author, we searched for those authors, typically by last name alone, or by locating their names in the database’s list of all publications attributed to the “NRC.” Because so few records are listed under their corporate authors, this strategy was not typically useful for such documents.

After completing the author search or when no clear author or editor was available, the first step was to use the “Cited Work Index” feature in the “Cited Reference Search” section of the Social Sciences Index to search for titles of the cited documents.

[pic]

The “Cited Work Index” is alphabetical and is searched using the “MOVE TO” command, which, rather than searching for all records containing a given string, jumps to the place in the alphabet where the search string belongs. Experience parsing words ultimately led to faster document retrieval, and the most effective strategy was to start with the first syllable of the first word and build the string from there. For example, in the case of “Classroom Assessment and the National Science Education Standards: A Guide for Teaching and Learning,” the search strings “CLASS” and “CLASS AS” were entered first. Finding nothing relevant, “CLASSR” and then “CLASSROOM” were entered, bringing up a list of potentially relevant documents. The number indicated, 118, includes all documents listed under the title “CLASSROOM ASSESSMENT” in the database; it is from this list that legitimate citations must be identified.
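The prefix-building strategy just described can be sketched as a small helper (hypothetical, for illustration only; the starting length of 5 is an assumption matching the “CLASS” example):

```python
def candidate_prefixes(title, start=5):
    # Progressively longer prefixes of the uppercased title, suitable
    # for feeding into an alphabetical "MOVE TO" index one try at a time.
    t = title.upper()
    return [t[:n] for n in range(start, len(t) + 1)]

# First few attempts for "Classroom Assessment ...":
print(candidate_prefixes("Classroom Assessment")[:5])
# → ['CLASS', 'CLASSR', 'CLASSRO', 'CLASSROO', 'CLASSROOM']
```

Lengthening the prefix narrows the alphabetical neighborhood until the relevant cluster of records appears.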

[pic]

These documents were then “ADDed” into the Cited Reference search to determine whether they were the appropriate documents. When a version of a document corresponded with its expected author (e.g., “COMM-CLASSR ASS N,” the abbreviated “corporate author” for the Committee on Classroom Assessment and the National Science Education Standards), its citations were retrieved and added to the citation search by clicking the “FINISH SEARCH” link.

[pic]

The documents matching the criteria selected above are then displayed. At this point, they should be “Submitted” to the “Marked Records” list using one of the three options from the menu on the right (“Selected records”, “All records on this page”, or “Records __ to __”).

[pic]

Once added to the “Marked Records” list, one has the ability to download the citations in a variety of formats with a variety of attributes. At the request of the CFE, we selected “Author”, “Title”, “Source”, “Abstract”, “Document Type”, and “Times Cited”, and then “Saved to file” the list as “Tab Delimited [(Mac) or (Windows)]” for inclusion in the spreadsheets corresponding to Appendix A that make up the electronic appendices of this report.

[pic]

Once we downloaded the files, Excel’s tab-delimited import feature was used to read the data into each spreadsheet for portability and ease of display.
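The same tab-delimited exports can also be read programmatically. A minimal Python sketch with made-up rows (the field names match those we selected, but the data here are hypothetical; real exports also carry Abstract and Document Type columns):

```python
import csv
import io

# A miniature stand-in for an ISI "Tab Delimited" export file.
export = (
    "Author\tTitle\tSource\tTimes Cited\n"
    "AUTHOR ONE\tARTICLE ONE\tJOURNAL A\t4\n"
    "AUTHOR TWO\tARTICLE TWO\tJOURNAL B\t1\n"
)

# DictReader keys each row by the header line, so columns can be
# referenced by name when building the appendix spreadsheets.
rows = list(csv.DictReader(io.StringIO(export), delimiter="\t"))
total_cited = sum(int(row["Times Cited"]) for row in rows)
print(len(rows), total_cited)  # → 2 5
```

In practice one would pass the downloaded file to `csv.DictReader` directly rather than an in-memory string.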

[pic]

Appendix A: Sample CFE Report Citation Table[6]

Adding It Up: Helping Children Learn Mathematics, 2001

Title

ACADEME-BULLETIN OF THE AAUP
ACADEMIC PSYCHIATRY
ADULT EDUCATION QUARTERLY
ADVANCES IN HEALTH SCIENCES EDUCATION
AIDS EDUCATION AND PREVENTION
AMERICAN EDUCATIONAL RESEARCH JOURNAL*
AMERICAN JOURNAL OF EDUCATION
ANTHROPOLOGY & EDUCATION QUARTERLY*
APPLIED MEASUREMENT IN EDUCATION*
AUSTRALIAN EDUCATIONAL RESEARCHER
BRITISH EDUCATIONAL RESEARCH JOURNAL
BRITISH JOURNAL OF EDUCATIONAL STUDIES
BRITISH JOURNAL OF EDUCATIONAL TECHNOLOGY
BRITISH JOURNAL OF SOCIOLOGY OF EDUCATION
CHINESE EDUCATION AND SOCIETY
COMPARATIVE EDUCATION
COMPARATIVE EDUCATION REVIEW
COMPUTERS & EDUCATION
CURRICULUM INQUIRY
EARLY CHILDHOOD RESEARCH QUARTERLY
ECONOMICS OF EDUCATION REVIEW*
EDUCATION AND URBAN SOCIETY*
EDUCATIONAL ADMINISTRATION QUARTERLY
EDUCATIONAL EVALUATION AND POLICY ANALYSIS
EDUCATIONAL GERONTOLOGY
EDUCATIONAL LEADERSHIP*
EDUCATIONAL POLICY*
EDUCATIONAL RESEARCH*
EDUCATIONAL REVIEW
EDUCATIONAL STUDIES*
EDUCATIONAL TECHNOLOGY & SOCIETY
ELEMENTARY SCHOOL JOURNAL*
ETR&D-EDUCATIONAL TECHNOLOGY RESEARCH AND DEVELOPMENT*
FOREIGN LANGUAGE ANNALS
GENDER AND EDUCATION
HARVARD EDUCATIONAL REVIEW*
HEALTH EDUCATION RESEARCH
HIGHER EDUCATION
INNOVATIONS IN EDUCATION AND TEACHING INTERNATIONAL
INSTRUCTIONAL SCIENCE*
INTERACTIVE LEARNING ENVIRONMENTS
INTERNATIONAL JOURNAL OF ART & DESIGN EDUCATION
INTERNATIONAL JOURNAL OF EDUCATIONAL DEVELOPMENT
INTERNATIONAL JOURNAL OF SCIENCE EDUCATION
JOURNAL FOR RESEARCH IN MATHEMATICS EDUCATION*
JOURNAL OF ADOLESCENT & ADULT LITERACY
JOURNAL OF AMERICAN COLLEGE HEALTH
JOURNAL OF COLLEGE STUDENT DEVELOPMENT
JOURNAL OF COMPUTER ASSISTED LEARNING
JOURNAL OF CURRICULUM STUDIES
JOURNAL OF ECONOMIC EDUCATION
JOURNAL OF EDUCATION POLICY
JOURNAL OF EDUCATIONAL AND BEHAVIORAL STATISTICS
JOURNAL OF EDUCATIONAL RESEARCH
JOURNAL OF EXPERIMENTAL EDUCATION
JOURNAL OF GEOGRAPHY IN HIGHER EDUCATION
JOURNAL OF HIGHER EDUCATION*
JOURNAL OF LEGAL EDUCATION
JOURNAL OF LITERACY RESEARCH
JOURNAL OF MORAL EDUCATION
JOURNAL OF PHILOSOPHY OF EDUCATION
JOURNAL OF RESEARCH IN READING
JOURNAL OF RESEARCH IN SCIENCE TEACHING*
JOURNAL OF SCHOOL HEALTH
JOURNAL OF SOCIAL WORK EDUCATION
JOURNAL OF TEACHER EDUCATION*
JOURNAL OF TEACHING IN PHYSICAL EDUCATION
JOURNAL OF THE LEARNING SCIENCES*
LANGUAGE LEARNING
LANGUAGE LEARNING & TECHNOLOGY
LEARNING AND INSTRUCTION
MINERVA
OXFORD REVIEW OF EDUCATION
PERSPECTIVES IN EDUCATION*
PHI DELTA KAPPAN*
QUEST
READING RESEARCH AND INSTRUCTION
READING RESEARCH QUARTERLY*
READING TEACHER
RESEARCH IN HIGHER EDUCATION
RESEARCH IN SCIENCE EDUCATION
RESEARCH IN THE TEACHING OF ENGLISH
REVIEW OF EDUCATIONAL RESEARCH*
REVIEW OF HIGHER EDUCATION
REVIEW OF RESEARCH IN EDUCATION
RUSSIAN EDUCATION AND SOCIETY
SCHOOL EFFECTIVENESS AND SCHOOL IMPROVEMENT*
SCIENCE EDUCATION*
SCIENTIFIC STUDIES OF READING
SECOND LANGUAGE RESEARCH
SOCIOLOGY OF EDUCATION
STUDIES IN HIGHER EDUCATION
TEACHERS COLLEGE RECORD*
TEACHING AND TEACHER EDUCATION*
TEACHING OF PSYCHOLOGY
TEACHING SOCIOLOGY
TESOL QUARTERLY
THEORY INTO PRACTICE*
URBAN EDUCATION
YOUNG CHILDREN
ZEITSCHRIFT FUR PADAGOGIK

Source: ISI Web of Science (), Social Sciences Citation Index, 1956-2004

-----------------------

[1] A search of ProQuest Digital Dissertations’ () Dissertation Abstracts database found no references to the 42 2001-2003 CFE reports analyzed in this report. The Educational Resources Information Center (ERIC) clearinghouse is currently under redevelopment, and as a result, searching for report titles is impossible. For example, a search for the string “Adding it up” returns all records containing the words “adding,” “it,” and “up,” a list exceeding the maximum number of records ERIC will display. This problem is expected to be fixed on September 1, 2004.

[2] We did not include “CD-ROM (Proceedings and unedited transcript) (October 1)” in our analysis.

[3] The list of reports provided by CFE included after this report title a second title, Information to Measure Compliance with International Labor Standards: Summary of a Workshop, and a note questioning whether this was the same report with a different title, released November 2002. We did not find this alternative title searching the National Academies of Science Press website ( ) and did not include it in our ISI search.

[4] Twenty-seven of these journals are included in the Education & Educational Research Subject Category. See Appendix B for more information on ISI’s journal coverage.

[5] For the articles that cited Shavelson & Towne “numerous” times, only the first few citation passages are provided.

[6] Abstracts were removed from this sample table for ease of viewing within standard document size.
