California State University, Northridge



2018-2019 Annual Program Assessment Report Guide

Please submit the report to your department chair or program coordinator, the Associate Dean of your College, and to james.solomon@csun.edu, Director of the Office of Academic Assessment and Program Review, by September 30, 2019. You may, but are not required to, submit a separate report for each program, including graduate degree programs, that conducted assessment activities, or you may combine programs in a single report. Please include this form with your report in the same file and identify your department/program in the file name.

College: Science and Mathematics
Department: Chemistry and Biochemistry
Program: BA/BS/MS
Assessment liaison: Thomas Minehan

Please check off whichever is applicable:
A. ____√____ Measured student work within program major/options.
B. ____√____ Analyzed results of measurement within program major/options.
C. ________ Applied results of analysis to program review/curriculum review/revision major/options.
D. ________ Focused exclusively on the direct assessment measurement of General Education Arts and Humanities student learning outcomes.

1. Overview of Annual Assessment Project(s).

On a separate sheet, provide a brief overview of this year's assessment activities, including:
- an explanation of why your department chose the assessment activities (measurement, analysis, application, or GE assessment) that it enacted
- if your department implemented assessment option A, identify which program SLOs were assessed (please identify the SLOs in full), in which classes and/or contexts, what assessment instruments were used and the methodology employed, the resulting scores, and the relation between this year's measure of student work and that of past years (include as an appendix any and all relevant materials that you wish to include)
- if your department implemented assessment option B, identify what conclusions were drawn from the analysis of measured results, what changes to the program were planned in response, and the relation between this year's analyses and past and future assessment activities
- if your department implemented option C, identify the program modifications that were adopted, and the relation between program modifications and past and future assessment activities
- if your program implemented option D, exclusively or simultaneously with options A, B, and/or C, identify the basic skill(s) assessed and the precise learning outcomes assessed, the assessment instruments and methodology employed, and the resulting scores
- in what way(s) your assessment activities may reflect the university's commitment to diversity in all its dimensions, but especially with respect to underrepresented groups
- any other assessment-related information you wish to include, including SLO revision (especially to ensure continuing alignment between program course offerings and both program and university student learning outcomes), and/or the creation and modification of new assessment instruments

Preview of planned assessment activities for 2019-20. Include a brief description as reflective of a continuous program of ongoing assessment.
2. Overview of Annual Assessment Project(s). Provide a brief overview of this year's assessment activities.

The following assessment activities took place this year:

Measure Student Work (Option A)
- Assessed basic knowledge in general chemistry, organic chemistry, and biochemistry (SLO 1) using standardized exam questions in course finals.
- Assessed students' ability to keep a laboratory notebook in Chem 334L using our departmental notebook rubric.
- Administered a signature assignment for longitudinal assessment of knowledge (SLO 1) in gateway (Chem 321) and capstone (Chem 401) courses.
- Assessed graduate students' scientific oral communication abilities in literature and thesis seminars, relevant to SLO 2m: Organize and communicate scientific information clearly and concisely, both verbally and in writing.

Analyze Results of Measurement (Option B)
- Analyzed student performance trends in general chemistry and organic chemistry.
- Compared current student lab notebook performance to previous assessments from several years ago.
- Reviewed the results of the signature assignment and made suggestions for an alternative exam source.
- Reviewed evidence pertaining to SLO 2m: Organize and communicate scientific information clearly and concisely, both verbally and in writing.

The department chose these activities so as to (1) encourage faculty to continue doing assessment in their courses each year, so as to identify trends over multiple semesters and weaknesses in student comprehension that need to be addressed at the individual course level and in the program as a whole, and (2)
move forward with our longitudinal assessment program, identifying possible sources of the lack of significant change in student performance between the gateway and capstone courses that was identified in previous assessments.

A. Measure Student Work

SLOs addressed: SLO 1: Assess basic knowledge in the following areas of chemistry: general chemistry, organic chemistry, and biochemistry, both at the individual course level and for the program as a whole.

1. Assess basic knowledge in general chemistry (SLO 1) using standardized exam questions. Alignment with core competencies: critical thinking, quantitative literacy.

General Chemistry Assessment: Our general chemistry courses, Chem 101 and Chem 102, serve majors, minors, and a large number of non-majors. Students taking Chem 101 in Fall 2018 and Spring 2019 were assessed using 14 questions from the ACS standardized exam in general chemistry. A previously established benchmark for success on these assessments is 8 or more of the 14 questions answered correctly. In one section of 60 students in Fall 2018, 19 students (32% of the class) performed at or above the benchmark score. In another section of 64 students in Spring 2019, 15 students (23% of the class) performed at or above the benchmark score. Using the same set of questions, another instructor found that 64%, 38%, and 45% of the Spring 2017, Fall 2017, and Spring 2018 Chem 101 classes, respectively, achieved the benchmark score or higher. Finally, one instructor gave 12 questions from the ACS standardized exam in general chemistry as both a pre- and post-test (first week of class and final week of class) in the Spring 2019 Chem 101 course (77 students). On the pre-test the average score was 4.4 out of 12 (37%), and on the post-test the average score was 6.2 out of 12 (52%). Gratifyingly, 76% of the students received a higher score in the final week than in the first week, indicating value-added learning throughout the course.
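The benchmark and pre/post tallies above are simple counts over score lists. As an illustration only (the score distribution below is hypothetical; only the method mirrors the report), the calculations can be sketched as:

```python
# Sketch of the benchmark and pre/post calculations described above.
# The score lists are hypothetical; only the counting method is the report's.

def benchmark_rate(scores, benchmark):
    """Return (count, fraction) of students scoring at or above the benchmark."""
    passed = sum(1 for s in scores if s >= benchmark)
    return passed, passed / len(scores)

def improvement_rate(pre, post):
    """Fraction of students whose post-test score exceeds their pre-test score."""
    improved = sum(1 for before, after in zip(pre, post) if after > before)
    return improved / len(pre)

# Example: a 60-student section in which 19 students answer >= 8 of 14 correctly
scores = [8] * 19 + [5] * 41          # hypothetical distribution
count, rate = benchmark_rate(scores, benchmark=8)
print(f"{count} students ({rate:.0%}) at or above benchmark")  # 19 students (32%)
```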
e. Analysis of the Results of Measurement

The assessment results in Chem 101 vary from semester to semester, although it is apparent from the data presented above that the recent trend in student performance is downward. Despite clear improvements in student learning throughout the Chem 101 course, the number of students achieving the benchmark level of performance in most sections has slipped well below 50% of a given class. Although it is not desirable to "teach to the exam," it may be helpful for general chemistry instructors to review the ACS exam questions and identify general topics with which students seem to have more difficulty. This review may lead to a change of emphasis on some topics covered in the corresponding lecture course.

Recently, one Chem 101 instructor administered a survey at the end of the course asking students to reflect on which resource provided in the course was most valuable for their learning: homework, lectures, exams, quizzes, the textbook, or discussion activities. Responses were almost equally distributed among homework, lectures, the textbook, and exams; however, only 5% of the class found discussion activities to be a valuable learning resource. Perhaps this indicates a prevailing student attitude that discussion sections are "less valuable" than the lecture portion of the course. This attitude may prevent some students from taking full advantage of group problem-solving activities to improve their performance in the course, especially in more advanced chemistry courses.

2. Assess student engagement in Chem 100, Principles of Chemistry.

As is typical of GE courses in the College of Science and Mathematics, Chem 100 (as well as General Chemistry I and II, Chem 101 and Chem 102) is populated by both science majors (primarily chemistry/biochemistry and biology students) and non-science majors.
Measuring student engagement helps faculty and staff across the university understand the time and effort students put into their studies and which areas need improvement from the students' perspective. With the goals of identifying significant predictors of students' academic achievement and providing more learning opportunities for students in gateway chemistry courses, one instructor who taught Chem 100 in Spring 2019 examined data collected from Chem 100 students through the National Survey of Student Engagement (NSSE). The data were obtained from Institutional Research at CSUN; a total of 1,093 Chem 100 students took the NSSE survey before 2019. The instructor analyzed student responses to the 10 items designated as Student Engagement Indicators (SEIs) in the survey and compared the results to the national data on the NSSE website. The SEIs comprise four categories: Academic Challenge, Learning with Peers, Experiences with Faculty, and Campus Environment. The data were coded and combined using the same method as the national NSSE survey, and each item score ranges from 0 to 60, with higher scores indicating higher student engagement.

e. Analysis of the Results of Measurement

Compared to national data from institutions of similar size, Chem 100 students at CSUN reported higher scores on most of the survey items, the exceptions being student-faculty interaction and quality of interactions. When the scores were compared by academic level (first-year students and seniors), the same trends were observed. For both first-year CSUN students and seniors, in the category of Academic Challenge, students scored higher on every item than students at the same academic level in comparable institutions.
The average scores of about 600 CSUN first-year students were 37.4, 42.3, 28.1, and 41.3 for reflective and integrative learning, higher-order learning, quantitative reasoning, and learning strategies, respectively, compared to average scores of 34.9, 37.3, 27.0, and 37.6 for 3,800 students at similar institutions. CSUN seniors (N = 400) scored 38.8, 42.6, 33.3, and 41.9 on the same items, compared to 37.8, 39.5, 28.1, and 38.3 for seniors (N = 4,800) at similar institutions. In the category of Learning with Peers, first-year CSUN students scored 38.9 and 41.3 for collaborative learning and discussions with diverse others, compared to 31.8 and 37.8 in the national data; CSUN seniors scored 38.3 and 44.9 on these two items, compared to 32.5 and 38.7 nationally. In the Experiences with Faculty category, both first-year CSUN students and seniors rated effective teaching practices higher than the national data (42.5 and 41.7, compared to 39.1 and 39.7); however, students rated student-faculty interaction lower (19.4 and 24.4, compared to 22.1 and 24.9). Similarly, in the last category, Campus Environment, both first-year CSUN students and seniors rated supportive environment higher than the national data (38.9 and 38.3, compared to 35.9 and 32.0) but rated quality of interactions lower (38.4 and 40.5, compared to 42.6 and 42.3). Across all items, the differences between CSUN students and the national data ranged from 0.5 to 7.1 points on the 60-point scale, that is, between about 1% and 12%. The biggest difference was in collaborative learning: first-year CSUN students reported collaborative learning with others 12% more than the national data, and CSUN seniors reported discussing with students from diverse backgrounds 10% more than the national data.
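The percentages quoted above are simply the raw score gap divided by the 60-point scale maximum. As a quick check, using two of the first-year item scores reported above:

```python
# Express SEI score gaps (0-60 scale) as percentages of the scale maximum,
# as done in the text. Item scores are the first-year values quoted above.
MAX_SCORE = 60

items = [
    # (item, CSUN first-year score, national first-year score)
    ("collaborative learning", 38.9, 31.8),
    ("student-faculty interaction", 19.4, 22.1),
]

for name, csun, national in items:
    gap = csun - national
    print(f"{name}: {gap:+.1f} points = {gap / MAX_SCORE:+.1%} of the scale")
# collaborative learning: +7.1 points = +11.8% of the scale
# student-faculty interaction: -2.7 points = -4.5% of the scale
```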
The areas needing improvement for Chem 100 students are student-faculty interaction (5% lower for first-year students and 1% lower for seniors) and quality of interactions (7% lower for first-year students and 3% lower for seniors).

A multiple regression analysis was carried out to identify which categories of the SEIs were statistically significant predictors of Chem 100 students' academic achievement. The predictors included gender, underrepresented minority status, academic level, Pell grant eligibility, parent education, high school GPA, SAT math score, SAT verbal score, and the four SEI categories. The model showed five significant predictors of Chem 100 students' campus GPAs: underrepresented minority status, academic level, high school GPA, SAT verbal score, and Experiences with Faculty. Among the four SEI categories, Experiences with Faculty was the only significant predictor of Chem 100 students' campus GPAs after controlling for all the student background variables. According to the NSSE survey, Experiences with Faculty contains two major items: Student-Faculty Interaction and Effective Teaching Practices. Student-Faculty Interaction involves the frequency of students talking about course topics or ideas outside of class, discussing their academic performance and career plans with a faculty member, and working with a faculty member on activities other than coursework, such as committees and student groups. Effective Teaching Practices include instructors clearly explaining course goals and requirements, teaching course sessions in an organized way, using examples or illustrations to explain difficult points, providing feedback on a draft or work in progress, and providing prompt and detailed feedback on tests or completed assignments.

In summary, Chem 100 students reported higher engagement in the majority of learning activities at CSUN as compared to national data, especially collaborative learning and discussions with diverse others.
Student experiences with faculty were identified as the most impactful factor influencing Chem 100 students' academic performance. To improve that performance, units such as the Department of Chemistry and Biochemistry, Student Academic Advisement, and Faculty Development should consider collaborating on projects to improve Chem 100 students' experiences with faculty.

3. Assess basic knowledge in organic chemistry (SLO 1) using standardized exam questions. Alignment with core competencies: critical thinking, quantitative literacy.

The organic chemistry courses Chem 333 and Chem 334 are taken in sequence by all Chemistry BS, Chemistry BA, and Biochemistry BS majors. In addition, a large number of non-majors (especially from Biology) take this sequence as required by their programs. Due to the emergency situation during the last two weeks of the Fall 2018 semester, we were able to collect final assessment data for organic chemistry only for Spring 2019. An initial assessment (pre-test) consisting of 4 questions covering fundamental topics in general chemistry was given by an instructor in Chem 333 in Spring 2019. Of the 16 students who took the assessment, 69% (11/16) answered 3 or more of the 4 questions correctly, and the average score was 75% (3/4). For another instructor, six multiple-choice questions from an ACS standardized exam in organic chemistry were incorporated into the Chem 333 course final exam (Spring 2019). Of the 57 students who took the final exam, 18 students (32%) achieved the benchmark score of 4 or more of the 6 questions (66%) answered correctly; 32 students (56%) answered 3 or more of the 6 correctly. For comparison, in Fall 2016, ten questions from an ACS standardized exam in organic chemistry were incorporated into the course final exam for Chem 333 (Organic Chemistry I).
Of the 66 Chem 333 students who took that final exam, 36 students (54%) achieved the benchmark score of 6 or more of the 10 questions answered correctly.

e. Analysis of the Results of Measurement

The recent data presented above indicate that even with a sufficient general chemistry background, fewer than 50% of Chem 333 students achieve the benchmark score on the course final assessment questions. Furthermore, in comparison to three years ago, a greater percentage of students in a given class now fall below the benchmark level of performance. This is alarming, especially since this course has a mandatory discussion/recitation section, which has been shown in previous assessments to increase student learning and problem-solving skills. The qualitative data available from Chem 101 on student attitudes toward the value of discussion sessions is especially informative here. Perhaps instructors need to emphasize the value and importance of discussion activities (for earning better grades!) on the first day of class. It would also help if organic chemistry instructors devoted a portion of their lecture time to problem-solving and student-engagement activities. Student performance in this traditionally high D/U/F-rate course needs to be continuously monitored, and appropriate adjustments to pedagogy made, to ensure that organic chemistry does not become a permanent stumbling block in student progress across multiple majors.

4. Assess basic knowledge in biochemistry (SLO 1) using standardized exam questions. Alignment with core competencies: critical thinking, quantitative literacy.

Chem 464 is a one-semester biochemistry course taken primarily by non-majors. In Spring 2019, 39 students in Chem 464 were given 8 questions taken from the ACS standardized exam in biochemistry on their course final exam. The benchmark for success in this assessment was 5 or more questions correct out of 8; the average score in Spring 2019 was 3.9/8 (49%).
Of the 39 students who took the assessment, 16 (41%) answered 5 or more of the 8 questions correctly. For comparison, in Spring 2017, 42 students in Chem 464 were given 10 questions taken from the ACS standardized exam in biochemistry on their course final exam. The benchmark for success in that assessment was 6 or more questions correct out of 10, and 24 students (57% of the class) achieved the benchmark level or higher.

e. Analysis of the Results of Measurement

The recent data presented above indicate a decrease in the percentage of students achieving the benchmark score on the biochemistry assessment in Chem 464 in comparison to data obtained a few years ago. This result may be due to the differing question sets selected for the two assessments. One suggestion that would allow comparison of our students' performance with national averages is to administer the entire ACS standardized exam in biochemistry on the last day of class, time permitting. This has been a useful comparison for general chemistry and organic chemistry instructors, allowing them to gauge where our students stand with respect to others assessed across the nation in the same subject area.

A. Measure Student Work

SLOs assessed: SLO 4: Work effectively and safely in a laboratory environment, including the ability to follow experimental chemical procedures and maintain a proper lab notebook.

Assess students' ability to keep a laboratory notebook in Chem 334L and Chem 462L. Alignment with core competencies: written communication.

Chem 334L, Organic Chemistry II Laboratory, is taken roughly midway through the BS/BA Chemistry and Biochemistry majors, and students in this lab have already taken three required laboratory courses as prerequisites: Chem 101L, Chem 102L, and Chem 333L. With this preparation, students in this course are expected to be able to properly maintain a laboratory notebook, an important skill for all practicing scientists.
In Fall 2018, 11 Chem 334L lab notebooks from Chemistry and Biochemistry majors were assessed by our lab TAs using the departmental lab notebook rubric. Out of a possible 20 points, the average score was 16/20 (80%), and seven students (64%) achieved the benchmark score of 15/20 or higher. In Spring 2019, 5 lab notebooks from Chemistry and Biochemistry majors were assessed using the same rubric. Out of a possible 20 points, the average score was 15/20 (75%), and four students (80%) achieved the benchmark score of 15/20 or higher. Although this appears to be a good indication of our students' ability to keep their lab notebooks properly, the highest scores were obtained in categories for which students prepared their notebooks outside of the lab: abstract, TOC entry, page numbers, completed table of amounts and physical properties of all reactants and solvents, procedure flowchart, etc. Poorer performance is consistently observed in the categories "record of in-lab observations," "notation of changes to experimental procedures," and "conclusions and comparison of results to literature values."

The laboratory courses Chem 462L (Biochemistry II Laboratory) and Chem 464L (Principles of Biochemistry Laboratory) have a lab notebook component that is part of the course grade. Chem 462L is the second course in the two-semester biochemistry lab sequence for our BS Biochemistry majors; Chem 464L is a single-semester biochemistry lab course for BS Chemistry and BS Biology majors. Students are expected to maintain a proper lab notebook for recording pre-lab notes, experimental data, and in-lab observations. As part of the pre-lab preparation, students are expected to write down the purpose of the lab, sketch a flowchart of the experimental protocol, and prepare tables for recording their experimental data based on instructions in the lab manual.
Students must record all data collected during the labs in their own notebooks, even when they work in groups. Notebooks are randomly but regularly checked and graded throughout the semester. Chem 462L in Spring 2019 had 8 students; they received an average grade of 2.5 ± 0.7 out of 4.0 on the first notebook check and an average grade of 2.6 ± 0.7 out of 4.0 for the final notebook grade. Chem 464L in Spring 2018 had 18 students; they received an average grade of 2.8 ± 1.2 out of 4.0 on the first notebook check and an average grade of 2.8 ± 0.7 out of 4.0 for the final notebook grade.

f. Comparison to data from previous years

In the 2016-2017 academic year, 29 Chem 334L lab notebooks from Chemistry and Biochemistry majors were assessed by our lab TAs using the departmental lab notebook rubric. Out of a possible 20 points, the average score was 16.3/20 (81.5%). Thus, lab notebook performance in Chem 334L appears to be consistent over the years. In the 2013-2014 academic year, laboratory notebooks were assessed in Chem 411, Chem 422, and Chem 433. Although these courses are taken after Chem 334L by our majors, the average scores for the notebook assessment were in the range of 15.1-18.1/20, which tells us little about whether our students are improving their notebook-keeping skills. Indeed, instructors also noted poorer performance in the same categories mentioned above ("record of in-lab observations," "notation of changes to experimental procedures," and "conclusions and comparison of results to literature values").
Since the importance of in-lab record-keeping for practicing scientists cannot be overstated, we are currently amending our notebook rubric to place greater emphasis (65% of the total points) on in-lab observations, deviations from protocol, conclusions, and post-lab reflection on the results obtained.

For Chem 462L and Chem 464L there is not yet a comparison available with student performance from previous years for this instructor. However, student notebook scores in Chem 462L improved slightly over the Spring 2019 semester, whereas in Chem 464L the notebook scores remained the same throughout the semester. In the future, it is hoped that students will take to heart the instructor feedback from the first notebook check, so that a more significant improvement can be observed by the final check.

A. Measure Student Work

c. Administered a signature assignment for longitudinal assessment of knowledge (SLO 1) in gateway (Chem 321) and capstone (Chem 401) courses.

Signature assignment administration, year 4: Previously, twenty multiple-choice questions spanning all subdisciplines of chemistry (general, organic, inorganic, analytical, physical, and biochemistry) were assembled into an assignment with the input of the department faculty (Appendix B). For the past three years this assessment has been implemented in Canvas and given as an assignment to students in Chem 321 (Analytical Chemistry I, the gateway course for majors and minors) and Chem 401 (Inorganic Chemistry, the capstone course for majors and for "unclassified" graduate students who would like to demonstrate proficiency in inorganic chemistry).
Since the previous three assessments showed no significant difference in performance on the assignment between the gateway and capstone courses, we are currently considering a move to the ACS DUCK (Diagnostic of Undergraduate Chemistry Knowledge) exam for our longitudinal assessment program (see below).

Results of assessment: In Fall 2018 Chem 321 (44 students), the average score was 10.6 correct out of 20 (53%), and 17 students (39% of the class) achieved the benchmark score of 12 correct out of 20. In Spring 2019 Chem 321 (50 students), the average score was 11.74 correct out of 20 (59%), and 28 students (56% of the class) achieved the benchmark score of 12 correct out of 20. In Spring 2019 Chem 401 (69 students), the average score for those who completed the assignment was 10.4 correct out of 20 (52%), and 21 students (30% of the class) achieved the benchmark score of 12 correct out of 20. While there is negligible difference in performance between the Fall 2018 Chem 321 class and the Spring 2019 Chem 401 class, the Spring 2019 Chem 321 students did significantly better on the assessment (both in average score and in the number of students achieving the benchmark) than the Spring 2019 Chem 401 students. This is an unusual result, especially because between Chem 321 and Chem 401 students take upper-division courses in analytical chemistry (Chem 422), physical chemistry (Chem 351, Chem 352), and biochemistry (Chem 464, Chem 461/462), as well as corresponding lab courses and upper-division experimental courses (Chem 411, Chem 433). Hence the trend in student performance observed on this assessment is quite surprising. A possible explanation is that the Chem 321 instructor now gives extra credit to students who answer at least 20% of the questions correctly, so that they take the assessment more seriously.
g. Analysis of the results and suggestions for improvement

This is the fourth year in which this assignment has been given in the gateway and capstone courses, and the fourth year in which the trends in student performance between gateway and capstone have not been what is expected. In fact, the current year's data indicate a decrease in the percentage of students achieving the benchmark score. While this may in part be attributed to the performance-driven extra-credit points in Chem 321, in past years when no such incentive was given, student performance on this assessment was essentially the same in the gateway and capstone courses. Three years ago, several of the assignment questions were modified to clarify what is being asked or to further differentiate the response options. Even with that change, there is still an alarmingly small difference in class average between Chem 321 and Chem 401. The results indicate that our students are struggling not only with concepts encountered in upper-division courses (questions 17-19) but also with foundational knowledge related to molecular structure, acid-base chemistry, and orbital theory (questions 4, 11, and 20). Several department faculty have suggested changing the assignment to the ACS DUCK (Diagnostic of Undergraduate Chemistry Knowledge) exam, and this option is being seriously considered for implementation this year. An advantage of using this exam is that the ACS provides data on student performance on the exam (and on individual questions) from educational institutions across the U.S., a very useful comparison for our faculty in assessing where our students stand at the national level.

A. Measure Student Work

Assess graduate students' scientific oral communication abilities in literature and thesis seminars, relevant to SLO 2m: Organize and communicate scientific information clearly and concisely, both verbally and in writing.
Alignment with core competencies: oral communication, information literacy.

In Spring 2018, the oral presentation rubric (developed in the Department of Chemistry and Biochemistry and used to assess the literature and thesis seminars) was modified slightly to include one different category. The categories scored prior to Spring 2018 were organization, understanding of scientific content, style/delivery, use of visual aids, and ability to answer questions. The categories, beginning in Spring 2018, are organization, quality of chemical/biochemical content, understanding of scientific material, delivery and use of visual aids, and ability to answer questions (Appendix D). The department wanted a larger portion of the score to be dedicated to content and a smaller portion to presentation style. As before, performance in each category is rated on a scale of 0-20, and the rubric provides descriptions of "A"-range (17-20 points), "B"-range (14-16 points), "C"-range (12-13 points), and "D"-range (10-11 points) performance. Each semester, faculty attending the seminars fill out the rubrics and forward them to the seminar coordinator, who tabulates the results for each category and computes an average score for the literature and thesis seminars.

Results for 2018-2019: Over the academic year, 14 MS Chemistry or Biochemistry students presented literature seminars, with average scores of 17.1/20, 16.3/20, 16.1/20, 16.1/20, and 15.5/20 in the categories of organization, quality of chemical/biochemical content, understanding of scientific material, delivery and use of visual aids, and ability to answer questions, respectively. The average total score for all 14 literature seminars was 81.2/100 (B+).
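The coordinator's tabulation described above amounts to averaging each rubric category across the faculty raters and then summing the five category averages for a total out of 100. A minimal sketch of that bookkeeping (the rater scores shown are hypothetical, not actual seminar data):

```python
# Sketch of the seminar-score tabulation: average each rubric category across
# the faculty raters, then sum the category averages for a total out of 100
# (5 categories x 20 points each). Category names follow the revised rubric.

CATEGORIES = [
    "organization",
    "quality of chemical/biochemical content",
    "understanding of scientific material",
    "delivery and use of visual aids",
    "ability to answer questions",
]

def tabulate(rubrics):
    """rubrics: one dict per faculty rater, mapping category -> score (0-20)."""
    averages = {c: sum(r[c] for r in rubrics) / len(rubrics) for c in CATEGORIES}
    total = sum(averages.values())
    return averages, total

# Example: two hypothetical raters scoring the same seminar
raters = [dict.fromkeys(CATEGORIES, 16), dict.fromkeys(CATEGORIES, 18)]
averages, total = tabulate(raters)
print(total)  # 85.0
```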
In Fall 2018 and Spring 2019, 12 MS Chemistry or Biochemistry students presented thesis seminars, with average scores of 17.9/20, 17.2/20, 17.5/20, 16.9/20, and 16.8/20 in the categories of organization, understanding of scientific content, style and delivery, use of visual aids, and ability to answer questions, respectively. One of the 12 seminars is not included in the averages because of anomalous activities that occurred during and after the seminar and affected the grade. The average total score for the other 11 thesis seminars was 86.3/100 (A-).

h. Analysis of the Results of Measurement

The results indicate that, on the whole, graduate students are doing well in their oral seminars, since the average scores in most categories are in the 16-17 range. The trend from previous years has continued: thesis seminar grades are slightly higher than literature seminar grades, and the weakest category tends to be the ability to answer questions. For many years the scores for both seminars were increasing. The likely reason for the increase was thought to be more rigorous pre-talk preparation, particularly practice talks. It is now common for students to give 3-4 practice talks in advance of their seminar date, inviting both faculty and students from a variety of subdisciplines within the department. Until a few years ago, students who attended the practice talks did not provide much constructive feedback to their peers; in recent years, as they have become more familiar with critically evaluating presentations, they have participated more, and the overall quality of the seminars has increased as a result. The students who continue to receive low scores are those who tend not to involve their research mentors and who have significantly fewer practice runs in advance of their seminars. These practice sessions have clearly been extremely valuable to both the audience and the presenting student.
Since the scoring categories were changed in Spring 2018, the average scores for both literature and thesis seminars have decreased. This is not a reflection of the quality of our students; in fact, it can be argued that the students now joining the program are frequently better qualified than ever before, and the rigorous pre-talk practice sessions have continued. There are two likely reasons for the decrease in average seminar scores since Spring 2018. The first is a shift in scores caused by the increased weight of categories that score content and the decreased weight of categories that score presentation style; this, however, does not explain the across-the-board decrease in every category, including categories that did not change. The more likely reason is that the faculty now expect more from the students and are grading them accordingly. Despite the rubric, such variation is still very possible, especially since the newer faculty (hired in the past ~12 years), who constitute an increasing portion of the evaluators, tend to score more strictly than many of the older faculty.

3. Preview of planned assessment activities for next year. Include a brief description and explanation of how next year's assessment will contribute to a continuous program of ongoing assessment.

In the next year, in addition to our normal department-wide assessment activities, we plan to collect data on our program SLOs 5 (Effectively utilize modern chemical instrumentation to obtain data and perform research) and 6 (Perform qualitative and quantitative chemical analyses, including the application of computer technology for such analyses) and compare the results with our previous assessment of these SLOs five years ago. We will also continue with longitudinal program assessment in the gateway and capstone courses using a revised instrument: the ACS DUCK exam.