Washington, D.C. Comprehensive Assessment System (DC CAS)

Technical Report

Spring 2012 Test Administration

Washington, D.C.

Comprehensive Assessment System

(DC CAS)

December 31, 2012

CTB/McGraw-Hill

Monterey, California 93940

Copyright © 2012 by the District of Columbia Office of the State Superintendent of Education

Technical Report for Spring 2012 Test Administration of DC CAS


Developed and published by CTB/McGraw-Hill LLC, 20 Ryan Ranch Road, Monterey, California 93940-5703. Copyright © 2012 by the District of Columbia Office of the State Superintendent of Education. All rights reserved. Only authorized customers may copy, download, and/or print the document, located online at . Any other use or reproduction of this document, in whole or in part, requires written permission of the District of Columbia Office of the State Superintendent of Education.


Table of Contents

List of Tables ................................................................................................................................................................... 5

Section 1. Overview ........................................................................................................................................................ 7

Section 2. Item and Test Development .......................................................................................................................... 8

Overview ........................................................................................................................................................................ 8

Content Standards .......................................................................................................................................................... 8

Item Development .......................................................................................................................................................... 8

Test Development .......................................................................................................................................................... 9

Test Design..................................................................................................................................................................... 9

Section 3. Test Administration Guidelines and Requirements ................................................................................. 17

Overview ...................................................................................................................................................................... 17

Guidelines and Requirements for Administering DC CAS .......................................................................................... 18

Materials Orders, Delivery, and Retrieval .................................................................................................................... 19

Secure Inventory .......................................................................................................................................................... 19

Section 4. Student Participation .................................................................................................................................. 20

Tests Administered ....................................................................................................................................................... 20

Participation in DC CAS .............................................................................................................................................. 20

Definition of Valid Test Administration ...................................................................................................................... 21

Special Accommodations .............................................................................................................................................. 21

Section 5. Scoring .......................................................................................................................................................... 29

Selection of Scoring Raters .......................................................................................................................................... 29

Recruitment .................................................................................................................................................................. 29

The Interview Process .................................................................................................................................................. 29

Training Material Development ................................................................................................................................... 29

Preparation and Meeting Logistics for Rangefinding ................................................................................................... 30

Training and Qualifying Procedures ............................................................................................................................ 30

Breakdown of Scoring Teams ...................................................................................................................................... 31

Monitoring the Scoring Process ................................................................................................................................... 31

Section 6. Methods ........................................................................................................................................................ 33

Classical Item Level Analyses ..................................................................................................................................... 33

Item Bias Analyses ....................................................................................................................................................... 33

Calibration and Equating .............................................................................................................................................. 35

Goodness of Fit ............................................................................................................................................................ 35

Year-to-Year Equating Procedures .............................................................................................................................. 37

Establishing Upper and Lower Bounds for the Grade Level Scales............................................................................. 38

Reliability Coefficients ................................................................................................................................................ 39

Standard Errors of Measurement .................................................................................................................................. 40

Proficiency Level Analyses .......................................................................................................................................... 40

Classification Consistency ........................................................................................................................................... 40

Classification Accuracy................................................................................................................................................ 41

Section 7. Standard Setting .......................................................................................................................................... 47

Grades 3–10 Reading Cut Score Review ..................................................................................................................... 48

Grade 2 Reading and Mathematics Standard Setting ................................................................................................... 49

Grades 4, 7, and 10 Composition Standard Setting ...................................................................................................... 49

Final, Approved DC CAS Cut Scores .......................................................................................................................... 49

Section 8. Evidence for Reliability and Validity ........................................................................................................ 52

Reliability ..................................................................................................................................................................... 52

Validity......................................................................................................................................................................... 53

Item Level Evidence..................................................................................................................................................... 53

Classical Item Statistics ................................................................................................................................................ 53

Inter-Rater Reliability .................................................................................................................................................. 54

Differential Item Functioning ...................................................................................................................................... 55

Test and Strand Level Evidence ................................................................................................................................... 55


Operational Test Scores ............................................................................................................................................... 55

Strand Level Scores ...................................................................................................................................................... 56

Standard Errors of Measurement .................................................................................................................................. 56

Proficiency Level Evidence ......................................................................................................................................... 57

Correlational Evidence across Content Areas .............................................................................................................. 58

References ..................................................................................................................................................................... 99

Appendix A: Checklist for DC Educator Review of DC CAS Items ...................................................................... 101

Appendix B: DC CAS Composition Scoring Rubrics .............................................................................................. 103

Appendix C: Operational and Field Test Item Adjusted P Values ........................................................................ 105

Appendix D: Internal Consistency Reliability Coefficients for Examinee Subgroups ......................................... 146

Appendix E: Classification Consistency and Accuracy Estimates for All Proficiency Levels for Examinee

Subgroups .................................................................................................................................................................... 152


List of Tables

Table 1. DC CAS 2012 Operational Test Form Blueprints: Reading ............................................................................. 11

Table 4. DC CAS 2012 Operational Test Form Blueprints: Composition...................................................................... 16

Table 5. Number and Percent of Examinees with Valid Test Administrations on the 2012 DC CAS in Reading,

Mathematics, Science/Biology, or Composition ..................................................................................................... 23

Table 6. Number and Percent of Students in Special Programs with Test Scores on the 2012 DC CAS in Reading,

Mathematics, Science/Biology, or Composition ..................................................................................................... 24

Table 7. Number and Percent of Students Coded for ELL Access for Proficiency Levels 1–4 in Reading,

Mathematics, Science/Biology, or Composition ..................................................................................................... 25

Table 8. Number and Percent of Students Receiving One or More English Language Learner Test Administration

Accommodations in Reading, Mathematics, Science/Biology, or Composition ..................................................... 26

Table 9. Number and Percent of Students Receiving One or More Special Education Test Administration

Accommodations in Reading, Mathematics, Science/Biology, or Composition ..................................................... 27

Table 10. Number and Percent of Students Receiving One or More Selected Special Education Test Administration

Accommodations in Reading, Mathematics, Science/Biology, or Composition ..................................................... 28

Table 11. DC CAS 2012 Numbers of Operational Items Flagged for Poor Fit During Calibration ............................... 43

Table 12. Correlations Between the Item Parameters for the Reference Form and 2012 DC CAS Operational Test

Form ........................................................................................................................................................................ 44

Table 13. Scaling Constants Across Administrations, All Grades and Content Areas ................................................... 45

Table 14. LOSS and HOSS for Relevant Grades in Reading, Mathematics, Science/Biology and Composition .......... 46

Table 15. Final Cut Score Ranges .................................................................................................................................. 51

Table 16. DC CAS 2012 Classical Item Level Statistics ................................................................................................ 59

Table 17. DC CAS 2012 Operational Inter-Rater Agreement for Constructed Response Items: Reading ..................... 60

Table 18. DC CAS 2012 Operational Inter-Rater Agreement for Constructed Response Items: Mathematics .............. 61

Table 19. DC CAS 2012 Operational Inter-Rater Agreement for Constructed Response Items: Science/Biology ........ 62

Table 20. DC CAS 2012 Operational Inter-Rater Agreement for Constructed Response Items: Composition .............. 63

Table 21. DC CAS 2012 Field Test Inter-Rater Agreement for Constructed Response Items: Reading ........................ 64

Table 22. DC CAS 2012 Field Test Inter-Rater Agreement for Constructed Response Items: Mathematics ................ 65

Table 23. DC CAS 2012 Field Test Inter-Rater Agreement for Constructed Response Items: Science/Biology ........... 66

Table 24. Numbers of Operational Items Flagged for DIF Using the Mantel-Haenszel Procedure: Reading ................ 67

Table 25. Numbers of Operational Items Flagged for DIF Using the Mantel-Haenszel Procedure: Mathematics ......... 69

Table 26. Numbers of Operational Items Flagged for DIF Using the Mantel-Haenszel Procedure: Science/Biology ... 70

Table 27. Numbers of Operational/Field Test Items Flagged for DIF Using the Mantel-Haenszel Procedure:

Composition ............................................................................................................................................................ 71

Table 28. Numbers of Field Test Items Flagged for DIF Using the Mantel-Haenszel Procedure: Reading ................... 72

Table 29. Numbers of Field Test Items Flagged for DIF Using the Mantel-Haenszel Procedure: Mathematics ........... 74

Table 30. Numbers of Field Test Items Flagged for DIF Using the Mantel-Haenszel Procedure: Science/Biology ...... 75

Table 31. Total Test Scale and Raw Score Means and Reliability Statistics .................................................................. 76

Table 32. Coefficient Alpha Reliability for Reading Strand Scores ............................................................................... 77

Table 33. Coefficient Alpha Reliability for Mathematics Strand Scores ........................................................................ 78

Table 34. Coefficient Alpha Reliability for Science/Biology Strand Scores .................................................................. 79
