Wolverine Access: Faculty Center


Wolverine Access: Faculty Center Heuristic Evaluation SI 622

Tiffany Chao Saurabh Koparkar

Jane Loegel Megan Morrissey

February 15, 2008


Table of Contents:

Introduction to Wolverine Access: Faculty Center
    Target Population
Introduction to Survey
    Survey Goals
Methodology
Questionnaire Design
    Question Selection
    Question Design
    Question Length
Results & Analysis
    Demographics
    Key Findings
    Improvements to Teaching Support
    Improvements to Administered Survey
Conclusion
References
Appendices



Introduction to Wolverine Access: Faculty Center

The Wolverine Access: Faculty Center site functions as an online resource tool to aid faculty, Graduate Student Instructors and other teaching staff at the University of Michigan in coordinating and managing their class sections. This system is currently known as "Teaching Support" and is the primary resource for faculty and instructors to perform administrative functions for their courses such as:

- view class rosters with student photographs
- manage class waitlists
- obtain advisee information
- manage and view class schedules
- submit grades

New features proposed for Faculty Center include class and faculty searches and additional external links to outside resources within the University, such as the Students with Disabilities Office and Counseling and Psychological Services. The class and faculty search would allow faculty and instructors to coordinate meetings with one another. The site is used primarily at the beginning and end of the semester, and with the new improvements in Faculty Center, site use may increase. Overall, this system is a one-stop shop for faculty and instructors to manage their courses and perform the related administrative functions.

Target Population

The users of the Wolverine Access: Faculty Center are current faculty, graduate student instructors, and instructional staff who are teaching courses in a given semester at the University of Michigan-Ann Arbor. There are more than twice as many faculty as graduate student instructors; however, it is unclear how many faculty actually teach each semester.

Target Demographics

[Pie chart: faculty 71%, GSIs 29%; N = 7,776]

Faculty (full-time/part-time in 2006): 5,542
Graduate Student Instructors (Fall and Winter Terms 2007-2008): 2,234
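The percentages in the chart above follow directly from the raw counts; a quick arithmetic check (the rounding to whole percentages is our assumption about how the chart was labeled):

```python
# Counts reported in the demographics figure.
faculty = 5_542  # full-time/part-time faculty in 2006
gsis = 2_234     # GSIs for Fall and Winter Terms 2007-2008

total = faculty + gsis
faculty_share = round(100 * faculty / total)
gsi_share = round(100 * gsis / total)

print(total)          # 7776
print(faculty_share)  # 71
print(gsi_share)      # 29
```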



Introduction to Survey

Survey Goals

The primary goal of this survey was to determine how satisfied the user population is with the current Teaching Support site and what changes should be implemented in its successor, Faculty Center.

We explored the following areas:

- User attitudes about the usability and usefulness of the product.
- What users feel is important to them with regard to the usability of the product.
- Which user interface features of the product are preferred by users.

We also felt that learning about the demographics of our user population might give us a sense of how satisfied users are with Teaching Support. Because comfort with technology can vary with age as well as with experience and level of exposure, demographics may help explain differences in user satisfaction with Teaching Support.

Methodology

Team 4-Square is evaluating the Wolverine Access: Teaching Support site in order to provide recommendations for usability improvements to Faculty Center. In the formative stages of our project, we identified our user population as the faculty, staff, and Graduate Student Instructors (GSIs) at the University of Michigan. Teaching Support is accessible only to instructors, who must use a password to access their records. With this in mind, our survey sample consisted of individuals from these groups. We launched our survey using an online survey website chosen for its user-friendly interface and its option to export data for interpretation.

Through a pilot survey, our team could determine whether our instrument was well developed and user-friendly and whether it would provide us with the appropriate data. We selected a sample of seven faculty members and GSIs and sent each an email with the survey link, asking them to participate. Four of the seven people participated, providing us with enough data to launch our final survey. The answers provided by our pilot users proved to be very consistent: all four used the class roster, grade roster, and student photo roster features most often, citing the class and photo rosters as the most useful features.

Another purpose of the pilot survey was information gathering: to determine what data we needed in our final survey, the members of 4-Square sought a preliminary understanding of our target audience's firsthand use of the site. The users


all agreed that they wanted the new Teaching Support site to contain features that allowed them to view their exam schedules and perform faculty searches, letting them find a colleague's name and view their class schedule. One of our respondents stated that they wanted a Teaching Support website that was easier to navigate and better integrated across departments.

In preparation to launch the final survey, we drafted an email message with the survey link and decided to contact the department coordinators within the colleges of the University, along with each department of the College of Literature, Science, and the Arts (LSA). Instead of contacting faculty members and GSIs individually, we felt that the department coordinators would have access to internal email lists that would be a great asset in distributing our message and reaching our target population. These email addresses were collected from department websites. Our correspondence was written in vibrant language to entice potential subjects to take our survey and contribute to the improvement of Teaching Support.

The survey was live for six days. A reminder email was sent to our target population on the fourth day to ensure a viable sample, and we obtained 50 respondents. Through the survey website, our response data were downloaded into a Microsoft Excel spreadsheet and analyzed.

To analyze our data, we ran a series of cross-tabulation (crosstab) tests in SPSS (Statistical Package for the Social Sciences). After entering the raw data from our survey results into SPSS, we ran a series of correlations between variables, yielding clear patterns in how users interact with the Teaching Support website.
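We ran our crosstabs in SPSS, but the same kind of cross-tabulation can be sketched in plain Python; the roles and feature-use values below are illustrative, not our actual survey variables:

```python
from collections import Counter

# Hypothetical (role, uses_photo_roster) response pairs; these are
# illustrative values, not data from our survey.
responses = [
    ("faculty", "yes"),
    ("GSI", "yes"),
    ("faculty", "no"),
    ("GSI", "yes"),
    ("faculty", "yes"),
]

# Counting each (role, answer) pair gives the cells of a crosstab,
# analogous to the SPSS crosstab tests described above.
crosstab = Counter(responses)
print(crosstab[("faculty", "yes")])  # 2
print(crosstab[("GSI", "yes")])      # 2
print(crosstab[("faculty", "no")])   # 1
```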



Questionnaire Design

Question Selection

The group brainstormed questions for the survey using data from various sources, such as user interviews conducted earlier in the semester as well as the Teaching Support interface features. Twenty questions were developed for the pilot survey. After the pilot survey results were obtained, we found that respondents thought the questions were broad and lacked depth. The questions were subsequently modified by changing the wording and sentence construction. More Likert scale questions were added, since the pilot survey had only two questions of this type. To give users a chance to express and elaborate on their thoughts about the current system, free-response questions were added. Rather than stacking these questions at the end of the survey, which would have been dull and monotonous, we staggered them among the other question types. The pilot survey's general Likert scale question about the 'Help' feature of Teaching Support was made more close-ended, to find out how and why the Help feature was used.

The survey recorded user demographics such as gender and age. Computer and internet literacy was examined in terms of proficiency with a Likert scale question. Additional demographic data were recorded, such as the user profile (professor or GSI), how long they have been teaching at the university, and their Teaching Support use per semester. These questions were chosen because surveys are excellent tools for characterizing a user population. We also selected questions that evaluated users' attitudes toward Teaching Support, such as listing its most and least usable features.

Question Design

The questions collecting demographic data, such as age and gender, were simple multiple-choice, single-answer questions. These questions gathered data about:

- the users' affiliation with the university's schools and departments, as teachers or as students (in the case of GSIs)
- their teaching profile, such as professor or GSI
- the program level they teach: undergraduate, graduate, or doctoral
- the number of semesters they have taught at the university
- their computer and internet literacy skills

These were all close-ended multiple-choice questions with single or multiple answers. Some questions used a Likert scale to elicit information from the user.
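Likert responses like these are typically coded numerically before analysis; a minimal sketch, assuming a five-point agreement scale (the labels and weights are our illustrative assumption, not the survey's actual coding):

```python
# Map five-point Likert labels to numeric scores (illustrative coding).
SCALE = {
    "strongly disagree": 1,
    "disagree": 2,
    "neutral": 3,
    "agree": 4,
    "strongly agree": 5,
}

# Hypothetical answers to one attitude question.
answers = ["agree", "strongly agree", "neutral", "agree"]

scores = [SCALE[a] for a in answers]
mean_score = sum(scores) / len(scores)
print(mean_score)  # 4.0
```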



Questions 16 to 20 and 23 were close-ended Likert scale questions that helped us gauge user attitudes toward various user interface features. These questions were designed using the data we obtained from the user interviews. For example: [screenshot of an example Likert scale question]

Some questions include text boxes for users' open-ended responses, though such questions are few so that users are not overwhelmed by the responses expected of them. Some open-ended text-box questions follow up on a previous close-ended Likert scale question. For example: [screenshot of an example follow-up question]



A few questions offer multiple checkbox answers in order to elicit all possible responses from the participant for a given Teaching Support feature. For example: [screenshot of an example checkbox question]

The questions were formulated to be as easily understood and as lucid as possible, so that users would be less likely to abandon the survey due to confusion or frustration.

Question Length

There were a total of 26 questions, and the survey took 15-20 minutes to complete. Covering the goals of the survey while keeping its length reasonable was important; we felt that our questions struck a good balance between the two. The few open-response questions allowed users to analyze and think more about Teaching Support and the answers they had previously selected, and they were a necessary gauge of users' attitudes and opinions about the system. Another reason for including free-response answers was that our pilot survey was considered broad and lacking in depth.
