Washington, D.C. Comprehensive Assessment System (DC CAS)

Technical Report

Spring 2010 Operational Test Administration

September 9, 2010

CTB/McGraw-Hill

Monterey, California 93940


Developed and published by CTB/McGraw-Hill LLC, a subsidiary of The McGraw-Hill Companies, Inc., 20 Ryan Ranch Road, Monterey, California 93940-5703. Copyright © 2010 by CTB/McGraw-Hill LLC. All rights reserved. No part of this publication may be reproduced or distributed in any form or by any means, or stored in a database or retrieval system, without the prior written permission of the publisher.


Table of Contents

List of Tables ...................................................................................................... 5

Section 1. Overview ........................................................................................... 9

Purpose of the DC CAS Assessments in Reading, Mathematics, Science/Biology, and Composition ..............9

Highlights of This Technical Report................................................................................................. 9

Suggestions for How to Use This Technical Report ....................................................................10

Section 2. Test Content, Design, and Development for Reading, Mathematics, Science/Biology, and Composition ................................................ 11

2010 Test Design and Coverage of the Content Strands.............................................................14

Composition Test ..............................................................................................................................19

2010 Test Development Procedures...............................................................................................19

Organization of Test Booklets and Other Test Materials............................................................20

Section 3. Student Participation ..................................................................... 22

Tests Administered ............................................................................................................................22

Eligibility for Participation in DC CAS .......................................................................................... 22

Participation in the 2010 DC CAS Test Administrations and Use of Data for Analysis and Score Reporting .........................................................23

Section 4. Test Administration Guidelines and Requirements .................... 31

Overview .............................................................................................................................................31

Guidelines and Requirements for Administering DC CAS.........................................................32

Materials Orders, Delivery, and Retrieval ......................................................................................33

Secure Inventory ................................................................................................................................33

Section 5. Evidence for Reliability and Validity ............................................. 35

Construct, Purpose, and Interpretation of Scores ........................................................................35

Internal Consistency Reliability........................................................................................................36

Reliabilities of Content Strand Scores............................................................................................. 37

Conditional Standard Error of Measurement................................................................................40

Classification Consistency and Accuracy........................................................................................45

Classification Consistency.................................................................................................................45

Classification Accuracy .....................................................................................................................45

Differential Item Functioning..........................................................................................................49

Results of the Differential Item Functioning Analyses ................................................................50

Section 6. Reliability and Validity of Hand-Scoring....................................... 54

DC CAS Scoring Process .................................................................................................................54

Selection of Scoring Raters...............................................................................................................54

Recruitment ........................................................................................................................................54

The Interview Process.......................................................................................................................54

Training Material Development....................................................................................................... 55

Preparation and Meeting Logistics for Rangefinding Prior to 2010 Operational Scoring......55

Training and Qualifying Procedures in 2010 .................................................................................56


Breakdown of Scoring Teams.......................................................................................................... 56

Monitoring the Scoring Process ...................................................................................................... 57

Hand-Scoring Agreement .................................................................................................................58

Selection of the 2010 Writing Prompts ..........................................................................................59

Section 7. IRT Analyses................................................................................... 63

Calibration and Equating Models.................................................................................................... 63

Goodness of Fit to the IRT Models ............................................................................................... 64

Item Calibration .................................................................................................................................66

Establishing Upper and Lower Bounds for the Grade Level Scales for the Base Years: 2006 for Reading and Mathematics, 2008 for Science/Biology ..............................66

Year to Year Equating Procedures.................................................................................................. 67

Anchor Set Review Process..............................................................................................................69

Anchor Item Parameter Comparisons............................................................................................69

Scaling Constants ...............................................................................................................................71

Section 8. Standard Setting............................................................................. 74

Section 9. Percent Indices for the State and for Content Areas and Content Strands .............................................................. 77

State Percent Index for Content Areas........................................................................................... 77

Percent Index Score for Content Strands ......................................................................................77

Calculating Proficient Percent Index using Expected Percent of Maximum Score.................77

Performance At or Above Proficient.............................................................................................. 78

Cut Scores for Performance At or Above Proficient for Percent Index Scores ......................78

Section 10. Results .......................................................................................... 80

Test and Item Characteristics...........................................................................................................80

DC CAS Performance Level Distributions....................................................................................82

Means and Standard Deviations by Race/Ethnicity and Gender...............................................82

Correlations.........................................................................................................................................86

Correlations of Strand Scores and Total Content Area Scores...................................................89

Section 11. DC CAS 2010 Field Test ............................................................... 95

References........................................................................................................ 96

Appendix A: Checklist for DC Educator Review of DC CAS Items .............. 98

Appendix B: DC CAS Composition Scoring Rubrics .................................. 100

Appendix C: Internal Consistency Reliability Coefficients for Examinee Subgroups ...................................................... 102

Appendix D: Classification Consistency and Accuracy Results for Each Proficiency Level in Each Grade and Content Area Assessment .............. 105

Appendix E: Classification Consistency and Accuracy Estimates for All Proficiency Levels for Examinee Subgroups .............................. 108

Appendix F: Items Flagged for DIF Using Mantel-Haenszel Procedures .. 121

Appendix G: Operational Item Adjusted P Values....................................... 134


List of Tables

TABLE 1. DC CAS 2010 TEST STRAND DESCRIPTIONS: READING, MATHEMATICS, SCIENCE/BIOLOGY, AND COMPOSITION ................................................................................11

TABLE 2. DC CAS 2010 TEST DESIGN: READING, MATHEMATICS, AND SCIENCE/BIOLOGY ........14

TABLE 3. DC CAS 2010 OPERATIONAL TEST FORM BLUEPRINTS: READING....................................15

TABLE 4. DC CAS 2010 OPERATIONAL TEST FORM BLUEPRINTS: MATHEMATICS.........................16

TABLE 5. DC CAS 2010 OPERATIONAL TEST FORM BLUEPRINTS: SCIENCE/BIOLOGY .................18

TABLE 6. DC CAS 2010 OPERATIONAL TEST FORM SCORING RUBRICS: COMPOSITION..............19

TABLE 7. NUMBERS OF EXAMINEES WITH VALID TEST ADMINISTRATIONS IN 2010: READING..23

TABLE 8. NUMBERS OF EXAMINEES WITH VALID TEST ADMINISTRATIONS IN 2010: MATHEMATICS ................................................................................................24

TABLE 9. NUMBERS OF EXAMINEES WITH VALID TEST ADMINISTRATIONS IN 2010: SCIENCE/BIOLOGY ................................................................................................24

TABLE 10. NUMBERS OF EXAMINEES WITH VALID TEST ADMINISTRATIONS IN 2010: COMPOSITION ................................................................................................24

TABLE 11. NUMBER (AND PERCENTAGE) OF STUDENTS IN SPECIAL PROGRAMS WITH TEST SCORES ON THE 2010 DC CAS IN READING, MATHEMATICS, SCIENCE/BIOLOGY, OR COMPOSITION ................................................................................................25

TABLE 12. NUMBER (AND PERCENTAGE) OF STUDENTS RECEIVING ONE OR MORE SPECIAL EDUCATION TEST ADMINISTRATION ACCOMMODATIONS IN READING, MATHEMATICS, SCIENCE/BIOLOGY, OR COMPOSITION ................................................................26

TABLE 13. NUMBER (AND PERCENTAGE) OF STUDENTS RECEIVING ONE OR MORE SELECTED SPECIAL EDUCATION TEST ADMINISTRATION ACCOMMODATIONS IN READING, MATHEMATICS, SCIENCE/BIOLOGY, OR COMPOSITION ................................................27

TABLE 14. NUMBER (AND PERCENTAGE) OF STUDENTS RECEIVING ONE OR MORE ENGLISH LANGUAGE LEARNER TEST ADMINISTRATION ACCOMMODATIONS IN READING, MATHEMATICS, SCIENCE/BIOLOGY, OR COMPOSITION ................................................28

TABLE 15. NUMBER (AND PERCENTAGE) OF STUDENTS CODED FOR ELL PROFICIENCY LEVELS 1–4 IN READING, MATHEMATICS, SCIENCE/BIOLOGY, OR COMPOSITION .................29

TABLE 16. NUMBER (AND PERCENTAGE) OF STUDENTS RECEIVING ONE OR MORE SELECTED ENGLISH LANGUAGE LEARNER TEST ADMINISTRATION ACCOMMODATIONS IN READING, MATHEMATICS, SCIENCE/BIOLOGY, OR COMPOSITION ................................................30

TABLE 17. INTERNAL CONSISTENCY RELIABILITY COEFFICIENTS FOR THE 2010 DC CAS OPERATIONAL TESTS ................................................................................................37

TABLE 18. COEFFICIENT ALPHA RELIABILITY FOR READING STRAND SCORES.............................38

TABLE 19. COEFFICIENT ALPHA RELIABILITY FOR MATHEMATICS STRAND SCORES..................39

TABLE 20. COEFFICIENT ALPHA RELIABILITY FOR SCIENCE/BIOLOGY STRAND SCORES ..........40

TABLE 21. DC CAS 2010 NUMBER CORRECT TO SCALE SCORE CONVERSIONS WITH ASSOCIATED STANDARD ERRORS OF MEASUREMENT (SEM): READING ..................................41

TABLE 22. DC CAS 2010 NUMBER CORRECT TO SCALE SCORE CONVERSIONS WITH ASSOCIATED STANDARD ERRORS OF MEASUREMENT (SEM): MATHEMATICS .........................42

TABLE 23. DC CAS 2010 NUMBER CORRECT TO SCALE SCORE CONVERSIONS WITH ASSOCIATED STANDARD ERRORS OF MEASUREMENT (SEM): SCIENCE/BIOLOGY ..................44

TABLE 24. EXAMPLE OF CONTINGENCY TABLE WITH TWO CUT SCORES.........................................46
