


Running Head: EXPLORING STUDENT RESPONSE TECHNOLOGY: iRESPOND

Evaluation Plan

Exploring iRespond Utilization at Hightower Trail Middle School

Carlene Bailey

MEDT 8480

University of West Georgia

Introduction

Clickers have become a fact of life in Cobb County, Georgia classrooms. These tools offer students immediate feedback and can reduce the time teachers spend grading papers. Despite these obvious benefits, questions remain as to whether incorporating this technology enhances student learning or simply adds another gadget to an already cluttered learning environment.

The “clicker” currently being utilized is the iRespond-Lite. Information is available at . This tool allows students to respond to multiple-choice, true/false, multiple-response, yes/no, content item, numeric fill-in-the-blank (using the numeric keypad), and survey questions, with up to five answer choices available on multiple-choice and multiple-select items.

The benefits that proponents often cite for incorporating this technology are numerous and promising. A review of the available literature on this topic supports several conclusions. Results in the literature were varied: some studies referred to the enjoyment students and teachers get from using personal response systems. Kenwright (2009) explained that students were more likely to respond and participate with the use of this personal response tool. Others focused on how the systems improved attendance by making learning fun and engaging (Kolikant, Drane, & Calkins, 2010). One study differed in suggesting that users of the system should not expect instantaneous results and that change happens only over a period of time (Kolikant et al., 2010).

This evaluation will explore the implementation and effectiveness of this technology in the middle school classroom, specifically at Hightower Trail Middle School in Cobb County, Georgia.

Hightower Trail Middle School is a large suburban middle school in Marietta, Georgia, with a strong focus on excellence.

The evaluation client of this process will be Laura Montgomery. Laura is the assistant principal of Hightower Trail Middle School (HTMS). She has served in this role for five years and was an instructor at the same school for many years prior. She is a strong advocate for improving the learning environment through both student and teacher success.

The mission of this evaluation is to initially explore iRespond as a tool to enhance the learning experience, and hopefully as a result, better understand what it takes to effectively incorporate other exciting new technologies in the classroom.

Purpose

The purpose of this evaluation is to determine the benefits and effectiveness of the iRespond system and its implementation as it is being utilized at HTMS. The final conclusions of this research will provide insight and direction for future technology implementations and offer suggestions for teachers and administrators who do not fully understand the possible benefits or drawbacks of this student response technology.

Does the iRespond tool promise more than it delivers, or is the success of a student response tool, such as this, largely dependent on teacher acceptance and eager utilization? Students are often quoted as enjoying and looking forward to classes when the system is used.

Research is vital to explore the effective use of any technology. This evaluation project will be helpful in guiding and focusing the use of the iRespond tool as we explore the link between teacher training, technology, and ultimately improved student outcomes. As Levin & Hansen (2008) explained, “Some training may be necessary… Instructors should discuss how the course technology is relevant, or useful, to the students. The more likely students view the technology to be useful or relevant, the better the students will perform in the course” (p. 7).

Students and teachers who are thinking about using any personal response system can benefit from the information derived from this evaluation. School systems at large engaged in making decisions about whether or not to purchase systems such as this would also benefit from this information. Technology may seem appealing and effective at first glance, but without careful analysis and evaluation, schools could stand to sacrifice valuable resources on the latest fad.

Evaluation Questions

This evaluation will address the following questions:

1. How effective was the iRespond system implementation at providing teachers with the preparation and instruction necessary for a successful launch, and in what ways could the implementation have been improved?

2. In what ways does the iRespond system enhance or detract from student learning?

3. What additional uses of the iRespond system have teachers discovered beyond the initial required assessments?

4. Are there differences in the effectiveness and utilization of the iRespond system based on grade level or subject content?

Another area of interest is the barriers to technology implementation from the instructor's point of view. Is the instructor resistant to change? Are there fears and hesitancy toward implementing any new technology? The best technology in the world is worthless if the person charged with using it believes it has no value.

Methods

The participants in this research study will be the academic instructors at HTMS. This year, all academic instructors received a classroom set of iRespond remotes along with the software and hardware necessary to use it in the classroom. Some required assessments were expected of them, and all the instructors were trained in its setup and use. The instructors will be sent a letter (Appendix B) and encouraged to complete an online survey to provide answers to the evaluation questions.

Schedule of Tasks:

|Item to be Completed                                   |Date              |
|Evaluation Questions                                   |February 23, 2011 |
|Evaluation Plan (includes the following)               |March 7, 2011     |
|  - Background Information on Program to be Evaluated  |                  |
|  - Evaluation Questions (previously agreed upon)      |                  |
|  - Sampling Plan                                      |                  |
|  - Evaluation Instruments                             |                  |
|  - Data Collection Plan                               |                  |
|Completion of Data Collection                          |March 18, 2011    |
|Draft Report for Evaluation Client to Review           |March 30, 2011    |
|Final Evaluation Report                                |April 11, 2011    |
|Presentation to Evaluation Client and Organization (optional according to agreement with Evaluation Client) |April 11, 2011 |

Follow-up procedures to ensure more complete data collection will involve a second sending of the email to non-participants, as well as face-to-face interviews with select teachers should adequate representation not be achieved.

Upon final approval of the evaluation findings, the report will be posted on a Wiki page and links to this page will be sent to all participants that provide an email address on the survey instrument.

Evaluation Instruments

The evaluation instrument (Appendix A) can also be viewed online here. This instrument is a Google Docs survey form, and all data are stored in an online spreadsheet. The online nature of the documents allows for easy distribution and aggregation of the data.

Questions provide both quantitative and qualitative information and can be divided into four categories. Some questions seek to establish background information about the participant, such as grade level taught, years of experience, and subject area. A second focus is the effectiveness of the iRespond implementation, while a third is how effective the tool can be in a learning setting. The fourth focuses on the participant's attitudes toward technology implementations in general.

The structure and order of the questionnaire may seem disorganized at first, but several questions with the same focus are asked in different formats to provide for response validity. Questions requiring a more thoughtful response are scattered through the survey so as not to overwhelm the participant. Multiple-choice questions allowing only one response, along with rating questions, should provide hard numeric data to help answer the evaluation questions. Open-ended responses, while more difficult to analyze, should provide anecdotal evidence and be a source of helpful ideas.

Data Analysis

This evaluation project will use a mixed approach, collecting both quantitative and qualitative data. Validity of the data will be supported by surveying the entire relevant population, which is feasible given the fairly limited scope of the project. All academic instructors at Hightower Trail Middle School received and have used the iRespond system, and the survey data will come from this entire population. Additional validity will be achieved through multiple data sources within the same instrument: key questions are asked in different formats, providing evidence that responses are valid and thoughtful.

Data will be collected in a spreadsheet and imported into a database format allowing for effective filtering and reporting. The overall population results of the key questions regarding effectiveness of iRespond, teacher preparation, and the teachers’ desire to utilize will be tabulated and presented in bar graph format.

Relationships between years of service, subject taught, and grade level will be analyzed in contingency tables versus tool utilization, technology effectiveness, and preparation adequacy.
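Once the survey spreadsheet is exported, contingency tables of this kind can be produced directly. The sketch below is a minimal, hypothetical example using pandas: the column names and response values are assumptions for illustration, not the actual headers of the survey export.

```python
# Hypothetical sketch: column names and response values below are assumed,
# not taken from the actual Google Docs survey export.
import pandas as pd

# Example rows standing in for exported survey responses.
responses = pd.DataFrame({
    "years_teaching": ["1-3", "4-10", "11-20", "4-10", "21+", "1-3"],
    "subject": ["Math", "Science", "Math", "Language Arts", "Science", "Math"],
    "utilization": ["required only", "additional uses", "additional uses",
                    "required only", "additional uses", "required only"],
})

# Contingency table of years of service versus tool utilization.
table = pd.crosstab(responses["years_teaching"], responses["utilization"])
print(table)
```

The same `crosstab` call can be repeated with `subject` or grade level as the index, and the resulting counts feed directly into the bar graphs planned above.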

Inductive categories may be used to classify anecdotal responses, depending on the quantity of such data. For example, students may respond with many varied comments whose ultimate sentiment is that the experience was fun. Teacher suggestions for additional iRespond uses could be classified in a similar fashion.
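As a first pass, the inductive categorization described above can be approximated with simple keyword matching. This is a hypothetical sketch: the category names and keywords are assumptions for illustration, and real qualitative coding would refine them against the actual responses.

```python
# Hypothetical sketch of a first-pass inductive coding scheme.
# Category names and keywords are assumptions, not a validated scheme.
CATEGORIES = {
    "enjoyment": ["fun", "enjoy", "excit"],
    "technical trouble": ["crash", "batter", "connect"],
}

def categorize(comment: str) -> list[str]:
    """Return every category whose keywords appear in the comment."""
    text = comment.lower()
    matches = [name for name, keys in CATEGORIES.items()
               if any(k in text for k in keys)]
    return matches or ["uncategorized"]

print(categorize("The kids thought it was really fun!"))
print(categorize("Remotes would not connect to the receiver."))
```

Comments that match no category fall into "uncategorized" and can be reviewed by hand, which is how new inductive categories would emerge.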

Conclusion

The information from this evaluation will be presented directly to the evaluation client for approval. The approved final copy of the report will be posted on a wiki page with links made available to all participants for review and comment. The evaluation client may wish to share the findings with the technology department at the board office for use in future technology implementations.

Program evaluation is critical to help understand new technologies. This evaluation project will provide insights helpful in the continued use of the iRespond tool as well as in exploring the links between teacher training, technology, and ultimately, student success.

References

Berry, J. (2009). Technology Support in Nursing Education: Clickers in the Classroom. Nursing Education Research, 30(5), 295-298.

Campbell, J., & Mayer, R. E. (2009). Questioning as an Instructional Method: Does it Affect Learning from Lectures? Applied Cognitive Psychology, 23(1), 747-759.

Cole, S., & Kosc, G. (2010). Quit Surfing and Start “Clicking”: One Professor’s Effort to Combat the Problems of Teaching the U.S. Survey in a Large Lecture Hall. The History Teacher, 43(3), 397-410.

Kaufman, R., Guerra, I., & Platt, W. A. (2006). Practical Evaluation for Educators. Thousand Oaks, California: Corwin Press.

Kenwright, K. (2009). Clickers in the Classroom. TechTrends, 53(1), 74-77.

Kolikant, Y. B., Drane, D., & Calkins, S. (2010). “Clickers” as Catalysts for Transformation of Teachers. College Teaching, 58(1), 127-135.

Levin, M. A., & Hansen, J. M. (2008). Clicking to Learn or Learning to Click: A Theoretical and Empirical Investigation. College Student Journal, 42(2), 1-11.

Nagy-Shadman, E., & Desrochers, C. (2008). Student Response Technology: Empirically Grounded or Just a Gimmick? International Journal of Science Education, 30(15), 2023-2066.

Appendix A

iRespond Implementation and Effectiveness Survey

Thank you for taking the time to fill out this brief survey. You may participate in this survey and remain anonymous if you like. I am conducting this research project as part of a course in program evaluation at the University of West Georgia. This survey will only be completed by teachers at Hightower Trail Middle School and will look at the effectiveness of the iRespond student response tool. Information collected will be available for review upon completion of this project. So please take a moment and share your opinions as your ideas may be helpful to the rest of us. Thanks again, Carlene Bailey 7th Grade, HTMS carlene.bailey@


1. What grade level do you teach?

( ) 6th grade

( ) 7th grade

( ) 8th grade

( ) Other

2. What subject area do you teach?

( ) Language Arts

( ) Mathematics

( ) Social Studies

( ) Science

( ) Other

3. Choose the response below that best describes you and your relation to new technologies.

( ) Technology and I don't get along very well. It takes too long to implement with limited benefits.

( ) I have found some success with incorporating technology, but it is a slow and time-consuming process.

( ) I like to use technology and the latest tools in my classroom, but other factors prevent me from utilizing it more.

( ) My use of technology in the classroom seems to be just right at this time.

( ) Other

4. Have you personally set up and used the new iRespond student response system?

( ) Yes, and I have completed the required assessments.

( ) Yes, and I have used the tool for additional learning activities beyond the required assessments.

( ) No, but I was involved with another teacher who set up and used the iRespond.

( ) No, I haven't used the tool at all.

5. Was the iRespond system presented to you with adequate information and training to easily implement it in your classroom?

( ) Yes, training on the iRespond was adequate and classroom set-up was easy.

( ) Yes, the information seemed complete, but set-up and utilization was still challenging.

( ) No, I needed more training and practice prior to using the iRespond.

( ) The training was adequate but the written instructions were confusing.

( ) Other

6. How many years have you been teaching?

( ) 1-3

( ) 4-10

( ) 11-20

( ) 21+

7. What could have been done differently to make the implementation of the iRespond system easier and more effective? This is not a required question. Only answer this if you feel the iRespond implementation could have been improved.

8. Did your students enjoy using the iRespond system?

( ) Yes, they were delighted and can't wait to use it again.

( ) Yes, they seemed intrigued but haven't been asking to use it again.

( ) No, they were confused and frustrated with the tool.

( ) They seemed neither negative nor positive about using iRespond.

( ) Other:

9. Check all of the following statements that apply to you and your students. You may check as many or as few as you wish.

[ ] My students were engaged and excited to use the iRespond.

[ ] I feel that the iRespond can invigorate the learning process by providing immediate feedback to the students and the teacher.

[ ] The iRespond is just another technology gadget that will not replace more traditional teaching methods.

[ ] Other than taking a quiz, I see no benefit to using iRespond.

[ ] I think this tool could be used during traditional lecture classes to engage students' attention and retention of the material discussed.

[ ] Students are distracted from the lesson when they have a remote in their hands to play with.

[ ] I think we have just begun to scratch the surface of how this tool can enhance the learning process.

[ ] I look forward to learning about more ways to use the iRespond.

[ ] I will use it when I have to, but I will not yet change the way I teach.

10. Besides the required assessments, are there other times you have used the iRespond tool? If so, please briefly describe how you incorporated the iRespond in your classroom. You may also explain how you plan to use the tool in the future. This is an optional question. Only answer if you have used the iRespond for additional instructional activities.

11. Rank the iRespond system on a scale of 1 - 10 based on its use in the classroom.

|Waste of money. |1 |2 |3 |4 |5 |6 |7 |8 |9 |10 |Excellent teaching tool. |

12. Rank the iRespond implementation on a scale of 1 - 10 based on how well it prepared you to use the tool.

|Totally inadequate preparation. |1 |2 |3 |4 |5 |6 |7 |8 |9 |10 |I was perfectly prepared. |

13. Share any comments you may have received from your students about the use of the iRespond system.

14. On a scale of 1 - 10, how likely are you to use this tool in the classroom beyond what is required?

|Will only use it when required. |1 |2 |3 |4 |5 |6 |7 |8 |9 |10 |Eager to learn more and use it often. |

15. If you would like to view the results of this survey, please provide your email address below. If you wish to remain anonymous, leave this box blank. This concludes the survey. Press the submit button when done.


Appendix B

To: Academic Teachers at Hightower Trail Middle School

CC: Laura Montgomery (Evaluation Client)

From: Carlene Bailey (Evaluator)

Hello Fellow HTMS Teachers,

I am conducting a research project as part of a course in program evaluation at the University of West Georgia. This is a short fifteen-question survey and should take only a few minutes of your time to complete. This activity will hopefully provide helpful information to all of us about better ways to utilize the new iRespond system in our classrooms.

Thank you in advance for taking the time to fill out this brief survey. You may participate in this survey and remain anonymous if you like. This survey will only be completed by the teachers at Hightower Trail Middle School, and will look at the effectiveness of the iRespond student response tool. Information collected will be available for review upon completion of this project. So please take a moment and share your opinions as your ideas may be helpful to all of us.

iRespond Implementation and Effectiveness Survey Link

Thanks again,

Carlene Bailey 7th Grade, HTMS

carlene.bailey@
