
An Analysis of ITBS Results for Eighth Grade Students at St. Pancratius School

2011-12 School Year

Team Awesome
Megan Fissel, Jason Longoria, Beverly Saar, Jill Sweet
7/3/2012

This paper describes the results of the Iowa Test of Basic Skills (ITBS) assessment, taken by the eighth grade students of a private school within the Archdiocese of Los Angeles. The types of data and their reliability are discussed. Result validity and scores are analyzed, and the application of the data is discussed. Questions about the test and the data it provides are also explored.

Introduction

Each fall, schools in the Archdiocese of Los Angeles participate in a diocesan-wide assessment of basic academic skills. The instrument is the Iowa Test of Basic Skills (ITBS). Developed at the University of Iowa over seventy years ago, and the subject of ongoing research, the ITBS has been used successfully across the nation since 1935. Millions of students are tested each year, in both public and private schools.

This standardized, norm-referenced achievement test is designed to rank student achievement against that of other eighth grade test-takers across the country in the areas of reading, language, math, science, social studies, and sources of information. The test measures how the students in the participating school perform in relation to other similarly aged students on valued academic skills. Because the skills assessed vary somewhat by grade level, it is important to note that the data discussed here refer to the assessment of eighth grade students, using the Level 14 test version.

The ITBS Test

The Iowa Test of Basic Skills (ITBS), or Iowa Test, is used to provide an assessment of student performance in the major content areas: vocabulary, word analysis, listening, reading comprehension, language, math, social studies, and science. Tests of varying difficulty, Levels 5 through 14, are given to students from kindergarten through eighth grade to assess their performance in these areas. This information can then be used to inform teachers of the areas of study that need the most focus.

The test is structured to give the students every opportunity to succeed. The goal of the test is to assess performance and gather reliable data on how the students are advancing each year through curriculum levels. Depending on the test and subject matter, the amount of time given to complete a section varies. For example, Levels 5 and 6 of the ITBS are designed to be developmentally appropriate for young children: at these levels, all tests are untimed and, except for the Reading test, are read aloud by the teacher (The Riverside Publishing Company, 2010). Accommodations for students with disabilities and English language learners are made by following the guidelines determined by each school district. Appendix I shows the number of items and time allotted for each section of the ITBS test. The data examined in this report come from eighth grade students taking the Level 14 exam.

Students work through multiple choice questions and fill out a Scantron sheet to record their answers. The Scantrons can be graded by hand, scored with optical scanning equipment, or sent to a scoring center. The data is then reported in the Interactive Results Manager, which allows each school's results to be assessed and compared quickly; that is the case with the data presented here. Sample questions from a Level 14 exam are shown below.

1. What would you find in an almanac?

A)   A description of how flowers pollinate

B)  The names of all the senators in this year’s congress in Washington D.C.

C)  The meaning of senator

D) A map of the continents

2. Which would tell you whether a parody is funny or serious?

F)   Dictionary

G)   Atlas

H)   Encyclopedia

J)   Almanac

3. Which would be the best source to find out how a suspension bridge is constructed?

A)   Dictionary

B)   Atlas

C)   Encyclopedia

D)   Almanac

Score Interpretation

Performance data is presented in a variety of ways which, used collectively, effectively inform faculty of the strengths and weaknesses of the group. The score data used here include the group's percentile rank (PR), standard score (SS), and grade equivalent (GE). Each of these scores assesses performance differently and is discussed and analyzed separately below. These scores are reported in Appendices II and III.

The ITBS is a nationally standardized test, administered in the same way to students in a specified reference population across age and grade groups. It is a norm-referenced test (NRT): it compares students' performance to that of other students rather than to a fixed criterion. Score interpretations compare the test taker's scores to the scores of other students in the nation who take the test at roughly the same time of the school year. This allows educators to analyze the performance of their students in relation to others at the same grade level.

This assessment produces developmental standard scores and percentile ranks. For the data in this report, Average Standard Scores and the Grade Equivalents computed from those Average Standard Scores are presented. The Average Standard Scores here describe the average score for the 24 students in Mrs. Sweet's 8th grade class who took the ITBS exam in September 2011. Results are reported for the Reading, Language, Math, Core, Social Studies, Science, Sources of Info, and Composite subject areas. To interpret the Standard Scores, the values associated with typical performance in each grade are used as reference points.
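As an illustration of how a group Average Standard Score is formed, a minimal sketch in Python is shown below. The individual scores are hypothetical and are not the actual data from Mrs. Sweet's class.

```python
# Hypothetical developmental standard scores for one subject
# (illustrative only; NOT the actual scores of Mrs. Sweet's class).
reading_standard_scores = [244, 250, 255, 258, 262, 266, 271, 274]

# The group's Average Standard Score is simply the mean of the
# individual developmental standard scores for that subject.
average_ss = sum(reading_standard_scores) / len(reading_standard_scores)
print(f"Average Standard Score (Reading): {average_ss:.1f}")
```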

The Grade Equivalent Scores are numbers that describe a student's location, or in this case a class average's location, on an achievement continuum. This continuum is a number line with the lowest level of knowledge or skill at one end and the highest level of development at the other. For this data, the Grade Equivalent of Average Standard Scores is the grade equivalent of the average scores for this class, which took the ITBS exam in the fall of 2011.

The Grade Equivalent is a decimal number that describes performance in terms of grade level and months. The digits to the left of the decimal point represent the grade, and those to the right represent the month within that grade. For example, the Grade Equivalent of Average Standard Scores of 9.8 on the Vocabulary and Reading Comprehension tests means that the average score of Mrs. Sweet's 24-student class is most similar to the scores of a typical group of students in the eighth month of ninth grade taking the same test. These scores show a GE that is higher than the students' grade level, indicating that the class performed above grade level on this assessment. This pattern is consistent across nearly all curriculum areas, with Social Studies the only subject in which the class average was approximately at grade level.
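A minimal sketch of this interpretation, assuming only that the GE is reported as a grade-and-month decimal (the 9.8 value above is used; the helper function is purely illustrative):

```python
def split_grade_equivalent(ge: float) -> tuple[int, int]:
    """Split a grade equivalent such as 9.8 into (grade, month within grade)."""
    grade = int(ge)
    month = round((ge - grade) * 10)  # tenths of the GE represent months
    return grade, month

# GE of 9.8 reported for Vocabulary and Reading Comprehension.
grade, month = split_grade_equivalent(9.8)
current_grade = 8  # the class taking the Level 14 test

print(f"Typical of students in month {month} of grade {grade}")
print(f"Roughly {9.8 - current_grade:.1f} grade-equivalent years above grade level")
```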

For individual students, the GE is useful and convenient for measuring growth from one year to the next and for estimating a student's developmental status in terms of grade level. The GE estimates a student's developmental level in terms of performance; however, it does not provide a recommendation for grade placement. A GE score that is much higher than the student's grade level is simply one indicator of exceptional performance.

The assumption of validity of the ITBS rests on several factors. First, the tests were developed at the University of Iowa in 1935 and remain the subject of ongoing research. Furthermore, the ITBS has been used successfully across the nation since that time. Finally, the tests are used to test millions of students each year in both public and private schools.

Use of Data

The data collected from this assessment is used to identify improvement strategies at the school, grade, and individual student levels. First, the data informs faculty, administration, and parents about the general success of the school by ranking students' performance, subject by subject, against that of students from other schools across the country. This information also allows us to identify areas of strength and weakness within grade level groups, to inform instruction, and to shape school improvement plans.

In addition to the group percentile results, grade level equivalency averages are also considered. The grade equivalency data tells us how our students compare to other students without the age-norm constraint. For parents, this data has more meaning, as most parents have a better understanding of grade level than of percentile. If they are used to seeing classroom test scores in excess of 90% and then receive standardized test scores that read 78th percentile, it is natural for them to be concerned that classroom tests are not adequately challenging. However, when we can show that the student performed about as well as a typical student of the same age, or even a year older, would score, the percentile data is put into perspective.

At the individual student level, the data reports what percent of our students ranked high, medium, or low against similar students. Alongside individual scores, this allows us to identify needs for intervention, as well as to monitor response to existing interventions. In some cases, the curriculum needs to be more challenging; for others, reasonable adjustments and/or additional differentiation may be indicated.

Tracking scores over the students' time within the Archdiocese also informs us about the effectiveness of our instruction, at both the group and individual levels. For students targeted for intervention, this data is particularly useful for evaluating whether or not the interventions are having an effect on student performance. Furthermore, tracking provides the ability to identify unusual performance in a particular year and to seek an explanation for it.

Significance

Testing early in the year informs teachers of the strengths and weaknesses of the current group so that curricular adjustments can be made in the current and preceding grades, if necessary. For example, Social Studies was identified as a weak area in grades 6-8. An examination of the teaching style and activities presented during instruction will be conducted, and the middle school faculty will work together to suggest differentiation strategies and, perhaps, integration with language arts for greater instructional depth.

Reading scores and grade level equivalency data are used with students immediately following receipt of the ITBS scores. In grade 8, the preponderance of reading material is chosen by the students based on their interests. To ensure that students can both understand and be somewhat challenged by their reading selections, Lexile levels are assigned to individual students based on their reading scores on this exam.

Every 5-6 years, our school participates in a re-accreditation process conducted by the Western Association of Schools and Colleges. ITBS scores inform the narrative of our school's competency and progress, as well as the creation of our action plan. While the school examined here is private, and therefore unaffected by score requirements for federal funding, the school's traditionally impressive scores are a marketing tool used for recruiting new families to the school.

Questions Regarding Quality of Data and Appropriateness of Use

As a private school, St. Pancratius is not required to administer the California state exam (the California STAR test, or CST) and has instead chosen to implement the Iowa Test of Basic Skills. One advantage is that this test may be given at any point in the year. By administering the test in September, scores are received quickly enough for teachers to plan instruction and adjust the curriculum based on how their current students are performing in the various content areas. This early testing also allows for interventions for students who are below grade level and lets teachers examine specific areas of the curriculum on which students are performing poorly. When used in this manner, the use of such standardized test scores is appropriate, ethical, and helpful for the students, both individually and as a whole class. Further, the ITBS is relatively inexpensive to administer, at about $10-15 per student (Finne, 2012).

While the ITBS may have advantages and may be considered a superior test to the CST, using the ITBS to the exclusion of the CST makes it nearly impossible to compare the students at this private school with their peers at public schools, in the same city or across the state. However, at a private institution, where students are likely to be high-achieving and to come from families with high levels of education, a measure that allows for nationwide comparison is advantageous, since California is consistently a low-performing state when it comes to public education. Further, is there any evidence that the ITBS is not biased against any significant groups of students (ethnically, socio-economically, or linguistically)? When the students at this school cannot be compared with students from other groups (i.e., public school students) on the same test, it is difficult to analyze potential bias or achievement differences among certain groups.

The ITBS, as a national standardized test, is not only used to measure achievement levels of schools across the country; scores may also be used for acceptance into Gifted and Talented programs as well as the National Honor Society (Test Prep Guide for the ITBS Assessment). These prestigious applications of high ITBS scores mean that test prep material is available for parents who wish to pay to improve their child's scores. This raises fairness questions when test results such as these are used outside of the classroom or school, and when some families can afford to try to improve their child's scores and others cannot.

On a similar note, to what extent is the ITBS used in a manner that may affect students' futures? For example, are these scores used to determine placement into differentially leveled classrooms in later grades? If so, to what extent are test scores used, and to what extent is academic achievement in the classroom (such as grades and teacher observations) an additional factor in placement decisions? Every major test-making organization tells schools that tests should not be used as the basis for decisions about retention, graduation, or placement (Norm-referenced Achievement Tests, 2007). Further, the stated purpose of norm-referenced tests is to "rank and sort students, not to determine whether students have learned the material they have been taught" (Norm-referenced Achievement Tests, 2007).

Finally, to what degree does high achievement on the ITBS correlate with high achievement in elementary, middle, high school, and college?

Conclusion

The Iowa Test of Basic Skills is a nationally accepted, norm-referenced method of assessing how a student performs compared to other students. The assessment and subsequent interpretation of the scores allow administrators, staff, faculty, and parents to understand the students' current level of understanding, as well as their understanding of material over time, and to make any necessary changes to lead the students toward the greatest success. While useful for comparing individual groups or broader national groups, this test does not allow for adequate comparison in states, like California, that do not require it in all schools. However, when used to collect specific data, this test proves to be adequate, reliable, and interpretable by faculty and administrators.

References

Abbott, M., & Joireman, J. (2001). The Relationships Among Achievement, Low Income, and Ethnicity Across Six Groups of Washington State Students (Technical Report #1). Washington School Research Center. Retrieved June 25, 2012.

Finne, L. (2012). Respected Iowa Test of Basic Skills is the Most Cost-effective Way to Meet National Testing Requirement. Retrieved June 25, 2012.

Norm-Referenced Achievement Tests. (2007, August 17). The National Center for Fair and Open Testing. Retrieved June 25, 2012.

The Riverside Publishing Company. (2010). Iowa Tests of Basic Skills. Retrieved June 18, 2012.

Test Prep Guide for the Iowa Test of Basic Skills® (ITBS®) Assessment. (2010, May; revised April 2012). Retrieved June 25, 2012.

Appendix I: Number of Items, Types and Testing Time Allotments

|Grade: 8 - Test Level: 14 |Number of Items |Time (min) |
|Practice Questions |1-3 each test |N/A |
|Vocabulary |42 |15 |
|Word Analysis |35 (2) |20 (2) |
|Listening |31 (2) |25 (2) |
|Reading/Reading Comprehension |52 | |
|    Session 1 |25 |25 |
|    Session 2 |27 |30 |
|Language | | |
|    Spelling |42 |12 |
|    Capitalization |34 |12 |
|    Punctuation |34 |12 |
|    Usage and Expression |43 |30 |
|Mathematics | | |
|    Math Concepts and Estimation |49 |25 (2) |
|    Math Problem Solving and Data Interpretation |32 |30 |
|    Math Computation |32 |15 |
|Total Core Battery |360 |211 |
|Social Studies |43 |30 |
|Science |43 |30 |
|Sources of Information | | |
|    Maps and Diagrams |31 |30 |
|    Reference Materials |38 |25 |
|Total Complete Battery |515 |326 |

Appendix II: Group Display of National Percentile Rank

[pic]

Appendix III: Group Display of Standard Score and Grade Level Equivalents

[pic]
