October 31, 2012 - Modeling Instruction Program




Grant # 12-4057, Carver Charitable Trust, awarded January 20, 2012

Iowa Physics Modeling Professional Development Program

Grant amount: $87,225

FINAL REPORT to the Carver Charitable Trust Board of Trustees

Overview

This report spans two years of the Iowa Physics Modeling Workshops, which received funding through the Carver Charitable Trust in January 2011 at Iowa State University and January 2012 at the University of Iowa. It communicates the final evaluation of the Modeling Workshop held at ISU in the summer of 2011. It also includes a partial report of participant data and reflections from the summer 2012 workshop at the University of Iowa; the student impact information for that workshop takes the school year to gather, so a final report on that component will be sent next fall. Text repeated from last year's report has been lightened to reduce redundancy and to clarify additions to previously submitted reports.

2011 Profile of ISU Participants

The 2011 Modeling Workshop at ISU hosted 20 funded educators and two self-funded educators. A month prior to the workshop there were three participants on a waiting list, and several inquiries arrived well after the roster had been finalized. The participants covered a wide range of experience and physics content knowledge. They included one practicing educator with research-level experience in physics, physics majors, out-of-field science majors pursuing or holding endorsements in physics, several math educators who were the only staff members qualified to teach physics and were pursuing endorsement, and two recent graduates seeking teaching positions. Teaching experience was just as diverse, averaging roughly 10 years. Content knowledge varied similarly: a number of participants achieved maximum pre-test scores, several scored well below the identified threshold for conceptual understanding, and the majority fell in between.

2012 Profile of U of I Participants

The 2012 Modeling Workshop at the University of Iowa hosted 24 funded participants; one withdrew after the first week because of a conflict with training required for a position accepted after committing to the U of I workshop. The workshop also included two self-funded participants from a school district in Omaha, NE. The participant count was allowed to increase because most participants chose to commute to the workshop, freeing housing funds for additional participants. As in the previous year, several candidates remained on a waiting list after the participant list was finalized. The make-up of participants was diverse, though similar in nature to the 2011 workshop. Participants were evenly distributed in experience: about a fourth were entering their first year of teaching or were within their first five years of experience, half had between five and 15 years of experience, and the remaining fourth had 15 years or more. Academic backgrounds were just as diverse as those of the ISU participants, ranging from those pursuing endorsement, to out-of-field (non-physics) majors with endorsements, to previously practicing engineers who had entered the teaching profession, to two middle school teachers wishing to deepen their content knowledge and prepare for future physics openings in their districts. As at the ISU workshop, participant FCI scores ranged from below the threshold for conceptual understanding to mastery of Newtonian physics.

Impact Assessment:

Participant FCI Data

The 2012 participants produced a data set similar to the 2011 group's. Demonstrating conceptual understanding on the FCI is a goal for participants, and there is a correlation between instructor performance and their students' performance on the FCI. The effectiveness of Modeling workshops is typically determined by normalized gain on the FCI: a measure of how much participants improve relative to how much they could possibly improve. The standard set by the ASU Modeling workshops, established over the duration of NSF funding and through ASU's MNS program, is a normalized gain of .5. FCI scores are interpreted with 18 as the established threshold for conceptual understanding of Newtonian concepts and 24 or greater as mastery of Newtonian concepts. Of the 25 participants completing the 2012 workshop, 11 demonstrated mastery entering the workshop, with scores greater than 26 out of 30. As at the ISU workshop, this group consisted of participants with degrees in engineering, physics, or chemistry, or those who had been teaching physics for a significant period of time. The remaining participants were distributed above and below the Newtonian threshold, similarly to the ISU group. Both the 2011 ISU and 2012 U of I participant data are presented in appendix 1; reference to that table may be helpful for the evaluation commentary that follows.
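The normalized-gain calculation described above can be sketched briefly; the example scores here are hypothetical, not actual participant data, and the 30-point maximum and the 18/24 thresholds are as stated in the report.

```python
# Sketch of the FCI normalized-gain calculation (FCI is scored out of 30).
MAX_SCORE = 30
THRESHOLD = 18  # conceptual understanding of Newtonian concepts
MASTERY = 24    # mastery of Newtonian concepts

def normalized_gain(pre, post, max_score=MAX_SCORE):
    """Fraction of the possible improvement actually achieved."""
    return (post - pre) / (max_score - pre)

# A hypothetical participant moving from the conceptual-understanding
# threshold to the mastery cutoff realizes half of the available improvement:
g = normalized_gain(pre=18, post=24)
print(g)  # 0.5
```

Note that a participant entering with a perfect pre-test has no room to improve, which is the complication discussed in the averaging methods below.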

Several methods were used to determine normalized gains for both groups. Initially, an average of participants' individual normalized gains was computed; it appears in the same row as the average pre- and post-FCI scores for the groups. This method yielded an average normalized gain of .48 for the 2011 workshop and .40 for the 2012 workshop. It proved problematic, however, because it includes the twenty percent of participants with perfect pre-test scores. These participants could produce at best a gain of zero and at worst a negative gain (none in either group produced a negative one), and averaging in these zeros is the primary reason the normalized gains fell short of the ASU standards.

Normalized gains were also computed from the participants' average pre-test and post-test scores. This method produces a normalized gain reflective of all participants yet avoids the problem of averaging in zeros for participants with perfect pre-test scores. With this method, the 2011 workshop produced a normalized gain of .51 and the 2012 participants a gain of .58. Compared to the ASU standard for Modeling workshops, the 2011 result matches what would be expected and the 2012 result exceeds it.
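The difference between the two averaging methods can be illustrated with a small set of hypothetical scores (not the actual participant data): a single perfect pre-test drags the average of individual gains down, while the gain computed from group averages is unaffected by the zero.

```python
# Hypothetical FCI scores (out of 30), for illustration only.
MAX_SCORE = 30
pre  = [30, 20, 10, 15]
post = [30, 26, 22, 24]

def gain(p, q, m=MAX_SCORE):
    """Normalized gain; recorded as 0 for a perfect pre-test,
    since no improvement is possible."""
    return 0.0 if p == m else (q - p) / (m - p)

# Method 1: average of individual normalized gains
avg_of_gains = sum(gain(p, q) for p, q in zip(pre, post)) / len(pre)

# Method 2: normalized gain of the group averages
avg_pre, avg_post = sum(pre) / len(pre), sum(post) / len(post)
gain_of_avgs = (avg_post - avg_pre) / (MAX_SCORE - avg_pre)

print(f"{avg_of_gains:.2f}")  # 0.45 -- dragged down by the perfect pre-test
print(f"{gain_of_avgs:.2f}")  # 0.60
```

Here every participant who could improve achieved a gain of .6, which the second method recovers exactly; the first method reports .45 only because the perfect pre-test contributes a zero.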

Participants' normalized gains were also averaged excluding those with perfect pre-test scores. Averaging over all participants who could produce a net positive gain provided mixed results for the two groups. For the 2011 group, this method produced a gain of .58, compared to the gain of .51 computed from the pre- and post-test averages. The 2012 group produced a gain of .50, compared to .58 from the pre- and post-test averages. However, the 2012 group did have one high-scoring individual produce a 28 pre-test and a 27 post-test, a drop that is likely statistically insignificant yet yields a -.5 normalized gain. Factoring this individual out yielded a .55 gain. One additional participant in the 2012 group, clearly a statistical anomaly, produced a 12 pre-test and a 4 post-test, for a -.44 normalized gain. Factoring out this anomaly along with the other led to a .6 normalized gain, comparable to the .58 figured from the average pre- and post-tests.

As expected, the workshops appear to be generating aggregate participant data consistent with published results from the ASU NSF and MNS Modeling workshops. To further evaluate participant data against the grant goal of producing well-qualified physics instructors, an evaluation of participants entering the workshop below mastery was conducted. The 2011 group included 10 such participants, six of whom finished at or above mastery on the post-test; this sub-group produced normalized gains of .53 and .49 when figured with the methods described above. The 2012 group included 13 participants below mastery, twelve of whom finished at or above mastery on the post-test. This sub-group produced an average normalized gain of .62 when including the statistical anomaly described previously, and gains of .71 and .75 when excluding the anomaly and using the methods described previously.

Impact Assessment:

2011 ISU Participant workshop reflections

In addition to the preliminary evaluation of participants' conceptual understanding, this section presents testimonial evidence from participant reflections. Selected segments from weekly reflections are provided; entire communications can be made available upon request. The selections attempt to highlight the workshop's effectiveness, while weaknesses identified in reflections are discussed in relation to adjustments for future workshops. Three excerpts taken from weekly reflections, as well as two participants' overall workshop reflections, are provided.

“After five days at the workshop I have learned an uncountable number of things. At least once per day we have covered a specific idea that I have struggled with teaching every semester. Specific examples would include the derivation of the v2 equation, the weight mass relationship for a static body, impetus misconception. I cannot wait to get into the classroom next school year.

I also love how the modeling curriculum tends to lend itself toward a more meaningful system of assessment. This is important when you are working towards a deep meaningful understanding of nature from your students.” –Eric Grebe, Newton

“When I stop and think about the past week it is difficult to not first talk about where I was as a teacher before beginning this course. I knew before coming to this course that I had serious deficiencies in teaching physics. It was not until Tuesday of this week that I realized just how serious those deficiencies are. The reasons are not important, but my personal responsibility to address them is mandatory as an educator.

The first week of this course has been a week of exponential growth for me in content and pedagogy alike. Knowing the math behind physics is all I thought I needed to teach it. My approach to understanding physics concepts was to go to the back of the chapter, learn how to do the math in the problems, and teach that to my students. This week I had many aha moments concerning the science behind the math. I had never taken the time to do qualitative problems before and had many of the misconceptions that others talked about “students” having.” – Jamey Smith, Central Lee

“My predominant feeling from week 2 was, man, my head hurts. That might have just been due to the humidity in the room, but I felt like my brow was furrowed a good bit of the time on Monday and Tuesday. That’s good though, in my book, frustration and confusion are very important parts of the learning process. Ultimately, I came around because we put a lot of effort in the workshop into exposing more of the things that we probably worried about when we first start teaching, but over time we let slide: Circular definition of work, obvious misuses of a consistent energy concept by students, obvious failures on our part to convey the utility of energy as an explanatory theme in physics. I feel like I had good activities for kinematics that exposed the ideas organically, but my energy stuff always came off feeling really contrived. The articles relating to the issues in Energy were very helpful. I can tell that I really need to put some thought into how I’m going to approach energy and being sure I’m thinking a couple of moves ahead in the energy unit.”

– Matt Harding, Iowa City West

Modeling Workshop Reflection:  Overall

I can say without reservation that this workshop was the greatest professional development experience that I have participated in.

Among the things that made this workshop worthwhile:

-Exposure to and practice with advanced physics teaching techniques

-The ability to interact and collaborate with like-minded individuals

-A deeper appreciation for the methods of physics

-A deeper understanding of fundamental physics concepts

-Exposure to resources for further professional development (i.e. Physics Education Research)

-A broader appreciation for what physics can offer the general population of students

-Exposure to a more accurate and descriptive method of grading (i.e. Standards-Based Grading)

-Learning how to apply many of the ideas that are taught in teacher preparation courses

Ryan Johnson, Ottumwa

From the instructors’ perspectives, the selected reflections highlight some of the perceived successes of the ISU Modeling Workshop. The reflections were purposefully selected from a set of participants diverse in content knowledge and pedagogical knowledge. Despite that diversity of background and experience, these reflections indicate that Modeling workshops are conducted in a manner allowing all participants to engage in high-quality professional development that promotes deep reflection and professional growth.

2012 Participant Reflection

In contrast to the instructor-selected reflections from a variety of 2011 participants, a single 2012 participant reflection is provided below. The instructors felt this reflection articulated ideas many participants chose to reflect on, and it was exceptionally clear and thorough in showing how a Modeling workshop can resonate with an individual; we thought it would be more effective than providing similar selections from several individuals. With the exception of bolding the weekly headings, the reflection is reproduced as the participant sent it at the end of the workshop.

Participant Will Swain’s Reflections

Week one reflection:

One week ago (before the course began) I thought my physics teaching was OK with room for improvement. After listening to the modelers I see my 1st year teaching physics was a year of wasted opportunities to teach a group of kids what they really needed to know. My physics course will completely change next year.

The ground work for this transformation had been laid for years. I knew my students (in all subjects) were not thinking as much as I wanted. I knew from test data (both standardized and my own) that my students were weak in interpreting data of any type, including graphs. I had already incorporated a fragmented version of a modeling approach for much of my astronomy course and I had already incorporated much discussion and writing about the Nature of Science in 10th grade biology. I was ready to put it all together for physics… but I wasn’t sure how…

What I needed from this workshop were specific ideas as to how to incorporate the modeling approach into my class room and that is exactly what I am getting. The systems that Shannon, Brad, and many participants use are simple and intuitive and I feel like an idiot for not thinking of at least some of this stuff on my own. White boards, genius. What a simple perfect system.

Speaking of feeling like an idiot, in addition to methodology I also needed to enhance my own content knowledge. I thought I had excellent graphing skills, I didn’t, I thought I had mastery of basic Newtonian concepts; it became apparent that some of my knowledge was fragmented. I was teaching and emphasizing the things I thought I understood well and glossing over things I felt less comfortable with. I rationalized that the things I didn’t teach were not that important. This workshop helped me see that these lazy rationalizations were negatively impacting my teaching and more importantly student learning. Whereas graphing seemed like silly busy work that I never bothered with before, I now see it as a valuable tool in the construction of the models of motion and as valuable problem solving tool. After seeing how the modelers use it, it is now difficult to imagine not using it.

Week two Reflection

I wrote in my week one reflection about the many teaching epiphanies that were the logical fallout from a non-algebraic approach to teaching physics. Week two had fewer ah-ha moments as the approach (which was no longer new to me) did not change from week one, we were now just applying this approach to other Physics models.

The assessment talk was the highlight. Two years ago I heard Sean speak …one day after I went to a 21st century PD course, (I needed the credit and it fit my schedule). In the 21st century skill course they talked about “soft skills” and behaviors that students would need in the workforce; the importance of teaching those behaviors, responsibility, getting assignments in on time, etc... Then I went to a formative assessment PD where Sean was a guest speaker. The juxtaposition of these two PD courses with back to back with widely differing views as to what to assess was quite interesting. I went back to school thinking about changing my assessment practices (leaning towards SBG) but I had a few doubts and some specific concerns as to how to pull this off. In biology (I wasn’t teaching physics at that time) it was difficult to even come up with the Standards. In the end dogmatic Inertia won the day and I changed little. We know from learning theory the importance of revisiting ideas… Well here we are again two years later. And this time there is no conflict in my mind regarding SBG. I will do standards based assessment for my physics class next year.

Sean’s ideas about assessment will work in almost any course, but they really complement the modeling approach to Physics perfectly. I believe the reason this course is such a logical fit is that in a modeling approach it is clear to you and to your students what you are trying to accomplish. Because the goals/ standards are so clear, standard based assessment (SBA) becomes a powerful approach to guide the learning of the students.

If I do the thought experiment: “WRIAEB how well a course is taught and the appropriateness of SBA for that course,” I would predicted positive relationship. If in a modeling approach to Physics the transition to SBA seems simple and in my other courses it seems difficult, that makes me wonder about the quality of my other courses. In the end, the modeling approach to physics will likely cause me to approach the teaching of my other courses differently. My Biology, Human Biology, and Astronomy courses will likely improve as a result of me attending this workshop. The Carver folks are getting more out of a modeling approach to Physics than they realize.

Week 3 reflection

The workshop has value because we get an opportunity to not only learn, but to pay attention to how we feel as we learn.

At the beginning of Tuesday morning’s challenge lab neither my partner nor I had a clear idea of how to approach the problem. I felt frustrated. I didn’t want to be rude but I felt his idea was needlessly complex. It took some convincing but we went my route. That was about the last good decision we made as a team. After that he lost me with his approach I lost him with my approach. We worked in isolation. When we compared numbers we were not even close. Turned out my number was close while his was way off. After the test my partner, who is not one to let things drop, kept at his calculations. He used a graphic approach just for fun and showed it to me. It was solid, there were no flaws that I could see and yet the answer was slightly different from mine. I revisited my calculations and I found that I had two offsetting errors. After the activity was over, we worked together much better than we did while we were under a time constraint, which brings me to my take home points from analyzing how I felt and how I learned.

#1 When people are under a tight time constraint they behave/learn differently. Not everyone processes information or solves problems at the same speed. Slow processing has little to do with intelligence but we assume they are synonymous so we set up tasks to reward the fast thinkers. These hurry up activities, which are common (and fun!), might do a disservice to the slow processing kids. Poor learning behaviors like shutting down, low confidence, not asking questions, are potentially reinforced with these timed activities for a child (or adult) that processes a concept slowly.

#2 My partner and I would have worked better together if we felt there was time to LISTEN. Usually communication problems are a result of listening issues not speaking issues. We had difficulty articulating our thoughts because we were in the process of forming them and trying to communicate them while someone else was talking to us. This was a set up for communication failure. (Part of the fun as a teacher is to watch the Mayhem, but it is likely not the best way to foster good listening skills) A better approach that is more conducive to teaching communication skills might be to give everyone the problem to solve, have them work on it silently to collect their thoughts, and then group them.

#3 Having students explain how they got their wrong answer and teasing out where in the process they went wrong, is more illuminating than the correct solution. It is better to find your own error than to have others point it out, so this should be the first approach. Conversations about; multiple approaches, standard error ideas, worst possible angles to choose are all important things to discuss after the lab. Taking time to do this is important.

#4 In my Physics course last year I taught (covered) more than just mechanics (optics, Electromagnetic energy, nuclear energy, thermodynamics, etc…) In hindsight there is no way that I gave students enough time with the activities and concepts to learn mechanics and all of these topics. Hammering out fragmented understanding of a concept takes time and not giving enough time leads to frustration.

Impact Assessment:

2011 Participant mid-year updates

Of the 22 workshop participants, nine have indicated fully implementing Modeling curricular resources with fidelity. One other participant has implemented the majority of materials, with room for open-ended explorations. Of the remaining 12: one completed Modeling activities but supplemented with other research-based activities rather than Modeling curricular materials; two implemented to some extent but had to compromise fidelity because of curricular constraints; one abandoned implementation as a result of mandated curricular assessments; two implemented parts with populations with specific needs; one could not implement due to course schedule; three pursued educational advancement and are not currently teaching; and two did not respond with a mid-year update but indicated in previous communication that they are implementing.

In addition to addressing level of implementation, participants were also required to address several other questions. Most importantly, they commented on how their participation has helped them make progress toward the goals they have for their students. In general, participants' perceptions indicated significant increases in understanding of physics concepts, improved collaboration and teamwork, a far greater degree of self-efficacy and self-regulation, improved student data analysis and interpretation skills, and improvements in effective communication and representation of concepts. Several respondents also noted improved problem-solving ability, ability to pose effective questions, a deeper understanding of the nature of science, and greater understanding of how to learn. As to how attending the workshop has improved their practice, participants overwhelmingly indicated an increased ability to diagnose their students' understanding and appropriately address student misunderstandings. Other insights related to practice include a renewed enthusiasm for teaching, improved understanding of how students learn, greater insight into how to appropriately scaffold concepts, improved ability to question students and guide them to greater understanding through effective questioning, improved ability to explicitly teach the nature of science, and improved ability to hold students accountable. These testimonials provide insight into the effectiveness of the ISU Modeling Workshop and to a great extent relate directly to broader educational reform goals.

Impact Assessment:

2011 Participant TPI Results

The Teaching Perspectives Inventory (TPI) was taken prior to the workshop and at the conclusion of the first year of implementation; the delay was intentional given the short duration of the workshop. All participants but one completed an initial TPI, but several individuals never returned a final TPI at the conclusion of the school year, and some forwarded only a partial TPI summary addressing the perspectives alone. Fortunately, the sample includes teachers distributed reasonably well across FCI performance. The TPI identifies five perspectives across three areas: beliefs, actions, and intentions. For any given perspective the highest possible score is 45 and the lowest is 9; in general, averages for any category are in the mid-thirties, and dominant perspectives are those with higher scores. A general descriptor of each perspective follows:

Transmission- Effective teaching requires a substantial commitment to subject matter or content.

Apprenticeship- Effective teaching enculturates students into a set of social norms and habits of mind.

Developmental- Effective teaching is planned and conducted from a learner’s point of view.

Nurturing- Effective teaching assumes long term persistent effort to achieve comes from the heart as well as the mind.

Social Reform- Effective teaching seeks to change society in substantial ways.

As we might expect for participants providing both a pre and post TPI, the intention category showed the highest frequency of significant positive change across beliefs, actions, and intentions. We can speculate this is the result of participants reflecting on their first year of implementation and recognizing there is much work to be done to achieve competency with Modeling methods and curricular materials. There were positive changes in participants' beliefs as well; these were not as great as the changes in intentions, although they occurred with similar frequency. Perhaps the smaller change reflects participants holding similar beliefs prior to the workshop yet becoming more convinced as a result of implementation. Finally, we saw less change, and a lower frequency of positive change, in the action category. This might be expected given that intentions experienced the greatest changes and action would follow any intention to change. This leads us to believe the belief changes can be largely attributed to the workshop's effect on participants, since one would naturally expect belief changes to occur after action has taken place.

In relation to the teaching perspectives, most participants were fairly well balanced across the perspectives, with many having dominant developmental and/or apprenticeship perspectives entering the workshop and most having social reform as the lowest-scoring perspective. We would expect to see changes in the developmental and apprenticeship categories given the nature of Modeling instructional practices and their theoretical underpinnings. Most candidates experienced a positive change in one of these categories, and many in both. Developmental gains can be largely attributed to an instructional design that provides concrete experiences first and relies heavily on representational strategies that make abstract concepts more concrete. Modeling instructional design can also help explain positive changes in apprenticeship perspectives: the majority of Modeling instruction emphasizes the collection and validation of empirical data, argumentative discourse, evaluating claims, evidence-based reasoning, and the testing and validation of basic scientific models. This activity closely mirrors authentic scientific practice, making an apprenticeship perspective practical. Negative changes were most frequent in the nurturing category. Decreases in nurturance are likely attributable to strategies that place heavy demands on the learner, and to clear standards for proficiency that help teachers separate personal and emotional relationships from the goals of the course. Interestingly, positive changes in transmission were not expected, and several individuals had significant positive changes in social reform. The positive changes in transmission seem to contradict the changes in development, yet Modeling instructional strategies tend to make explicit the tacit decisions an expert might otherwise leave unstated. Perhaps these gains are attributable to transmitting science as a way of knowing rather than delivering content. The large positive changes in social reform are likely attributable to participants able to promote Modeling-based practices as broader, more generalizable habits of mind, transferable across many domains and lending themselves to informed social decision making. While no clear correlation between TPI scores and FCI scores can be identified, it is worth noting that participants with developmental scores near 40 and transmission scores in the low thirties all had reasonably high FCI gains, while the one participant with a transmission score of 40 and a developmental score in the mid-thirties had the lowest FCI gains.

Impact Assessment:

Student FCI Data and VASS Profiling

To be consistent with first-year modelers completing a Modeling workshop at ASU, students would be expected to produce an average post-test percentage of 52%, or a normalized gain of 0.35. This corrects a previous report indicating that first-year modelers typically produced an average normalized gain of .42. The graphic below summarizes ASU's findings for over 7,500 students.

A table of ISU participants' student data is provided in appendix 2 to supplement the commentary that follows. Sixteen participants sent FCI results for students, with only five returning VASS survey data; the low VASS response was likely due to the request occurring at the end of the school year and the labor-intensive nature of compiling the results. The 2011 participants' student data yielded the results expected for a three-week Modeling workshop. Average student normalized gain, figured from the average of teacher gains and from average pre- and post-test data, yielded gains of .40 and .39 respectively. These gains are slightly better than what is expected from novice modelers after attending a three-week workshop. More importantly, two of the 16 participants reporting data could not fully implement due to curricular constraints or district-mandated tests, and three others fully implemented only six or seven of the nine units in the mechanics curriculum, leaving out an important unit addressed by at least five items on the FCI. Determining average normalized gain only for participants who fully implemented, or who implemented a majority while completing the mechanics units, yields gains of .46 and .45 respectively. For comparison, the gains have been superimposed over a graphic from Hake's extensive study comparing interactive engagement methods (like Modeling) to traditional instruction at high schools, colleges, and universities. The overall average for participants' students is plotted as an orange star, and the blue star represents only the students of participants who fully implemented.

Demographic information is provided with the FCI results, with only one clear trend: student performance shows no relation to rural, mid-sized, or urban population settings. Trends in percent free and reduced lunch and percent minority could not be clearly established. Had all participants fully implemented the curriculum, it might have been possible to infer relations for these demographics. ASU findings indicate Modeling's effectiveness regardless of SES or minority demographics. Perhaps a similar trend will surface after the 2012 U of I data are compiled with the 2011 ISU student FCI results.

Two other findings supplement the average student performance for the 2011 ISU participants. First, of the 487 students taking the FCI, 73 students (15%) scored at or above the mastery level. Second, one participant who was familiar with interactive engagement methods, including modeling, and regularly administered the FCI provided longitudinal data for his AP and regular physics courses. The longitudinal data clearly indicate an effect on this teacher's student performance in both courses.

[pic]

[pic]

The VASS profiles obtained are consistent with what would be expected. The VASS survey probes students along scientific and cognitive dimensions and profiles them into the following views: folk, low transitional, high transitional, and expert. Folk views are associated with naïve realism and passive learning; expert views are associated with scientific realism and active, critical learning. No students were classified with an expert profile, and individual VASS scores reported for students correlated well with performance on the FCI. Students performing well on the FCI post-test typically exhibited a high transitional profile. Average VASS scores support the same generalization: for the small sample available, the higher the students' average VASS profile, the greater the normalized gain.

Summary of Findings

The findings presented provide solid evidence that the 2011 ISU Modeling workshop was a success. Both participant and student data are consistent with published findings from the highly recognized ASU Modeling Instruction Program. Participant FCI data demonstrate that the majority of candidates performing below mastery levels on the FCI can reach mastery after attending a three-week workshop. TPI data indicate positive changes in the expected developmental and apprenticeship categories, with most participants establishing positive changes in their intentions after the first year of implementation. Finally, student data indicate performance on the FCI that is clearly better than traditional instruction and slightly above what is expected from interactive engagement methods. Second-year participant FCI data demonstrate improvement compared to the 2011 ISU data, and we are hopeful this trend will continue through second-year student data.

Regards,

Iowa Modeling Workshop Consortium

Craig Ogilvie, Ph.D., Professor of Physics, Iowa State University

Mary Hall Reno, Ph.D., Professor and Department Chair, Physics, University of Iowa

Shannon McLaughlin, Physics Instructor, Norwalk High School

Jeff Weld, Ph.D., Director, Iowa Math & Science Education Partnership

Appendix 1

|2011 ISU Modeling Workshop Participant FCI Scores (out of 30) |
|Number |FCI Pre-Test |FCI Post-Test |Normalized Gain |
|1 |30 |30 |0.00 |
|2 |30 |30 |0.00 |
|3 |30 |30 |0.00 |
|4 |30 |30 |0.00 |
|5 |29 |29 |0.00 |
|6 |28 |28 |0.00 |
|7 |18 |18 |0.00 |
|8 |11 |14 |0.16 |
|9 |15 |19 |0.27 |
|10 |20 |23 |0.30 |
|11 |26 |28 |0.50 |
|12 |20 |26 |0.60 |
|13 |14 |24 |0.63 |
|14 |27 |29 |0.67 |
|15 |23 |28 |0.71 |
|16 |23 |28 |0.71 |
|17 |18 |29 |0.92 |
|18 |19 |30 |1.00 |
|19 |27 |30 |1.00 |
|20 |28 |30 |1.00 |
|21 |29 |30 |1.00 |
|22 |29 |30 |1.00 |
|Averages |22.44 |26.28 |0.48 |
|Normalized gain from average pre and post |0.51 |
|Average normalized gain, N=5 through N=22 |0.58 |

|ISU Participants Starting Below Mastery on FCI |
|FCI Pre-Test |FCI Post-Test |Normalized Gain |
|18 |18 |0.00 |
|11 |14 |0.16 |
|15 |19 |0.27 |
|20 |23 |0.30 |
|20 |26 |0.60 |
|14 |24 |0.63 |
|23 |28 |0.71 |
|23 |28 |0.71 |
|18 |29 |0.92 |
|19 |30 |1.00 |
|Averages |18.1 |23.9 |0.53 |
|Normalized gain from averages |0.49 |

|2012 U of I Modeling Workshop Participant FCI Scores (out of 30) |
|Number |FCI Pre-Test |FCI Post-Test |Normalized Gain |
|1 |22 |NA |0.00 |
|2 |30 |30 |0.00 |
|3 |30 |30 |0.00 |
|4 |30 |30 |0.00 |
|5 |30 |30 |0.00 |
|6 |12 |4 |-0.44 |
|7 |28 |27 |-0.50 |
|8 |29 |29 |0.00 |
|9 |29 |29 |0.00 |
|10 |23 |24 |0.14 |
|11 |25 |26 |0.20 |
|12 |23 |25 |0.29 |
|13 |28 |29 |0.50 |
|14 |26 |28 |0.50 |
|15 |22 |27 |0.63 |
|16 |20 |27 |0.70 |
|17 |26 |29 |0.75 |
|18 |18 |27 |0.75 |
|19 |21 |28 |0.78 |
|20 |19 |28 |0.82 |
|21 |12 |27 |0.83 |
|22 |23 |29 |0.86 |
|23 |16 |28 |0.86 |
|24 |18 |29 |0.92 |
|25 |16 |29 |0.93 |
|26 |29 |30 |1.00 |
|Averages |23.27 |27.16 |0.40 |
|Normalized gain from average pre and post |0.58 |
|Average gain, N=6 through N=26 |0.50 |
|Average gain, N=7 through N=26 |0.55 |
|Average gain, N=8 through N=26 |0.60 |

|U of I Participants Starting Below Mastery on FCI |
|FCI Pre-Test |FCI Post-Test |Normalized Gain |
|23 |24 |0.14 |
|23 |25 |0.29 |
|22 |27 |0.63 |
|20 |27 |0.70 |
|18 |27 |0.75 |
|21 |28 |0.78 |
|19 |28 |0.82 |
|12 |27 |0.83 |
|23 |29 |0.86 |
|16 |28 |0.86 |
|18 |29 |0.92 |
|16 |29 |0.93 |
|12 |4 |-0.44 (anomaly) |
|Averages, excluding the anomaly |19.25 |27.33 |0.71 |
|Normalized gain from averages |0.75 |

Appendix 2

|2011 ISU Student FCI, VASS, and Demographic Data |
|Teacher Number |N |FCI Post |FCI Pre |% Gain |N Gain |VASS |Implementation |% Minority |% F & R |
|Average N Gain, all participants | | | | |0.40 | | | | |
|Cohort averages, full implementation | |18.07 |8.46 |32.04 |0.45 | | | | |
|Average N Gain, full implementation | | | | |0.46 | | | | |

Appendix 3

|Iowa Regents Modeling Workshop Budget Proposal |
|Number |Item |Unit Cost |Total |

|Participant Independent Costs |
|1 |IMSEP Administration Costs |$4,125 |$4,125 |
|0 |UoI admin support, registration, email, logistics, & benefits |$2,500 |$0 |
|1 |Evaluation costs |$2,500 |$2,500 |
| |Other | |$0 |
| |Other | |$0 |
| |Management Costs | |$6,625 |
|4 |Regents Professor Visits |$150 |$600 |
| |University Faculty | |$0 |
|2 |Facilitator Stipend |$4,500 |$9,000 |
|2 |Facilitator Expenses (Housing, Travel, Meals) |$1,000 |$2,000 |
| |Facilitator Costs | |$11,000 |

|Participant Dependent Costs |
|20 |Stipend @ $150/day |$2,250 |$45,000 |
|20 |Housing @ $50/day for 20 days |$1,000 |$20,000 |
|20 |Meal Stipend |$150 |$3,000 |
|20 |Teacher Supplies (Binder and Printed Materials) |$50 |$1,000 |
|20 |Participant Total Cost |$3,450 |$69,000 |

| |Total Expenditure | |$87,225 |
