Round II Data Collection and Analysis: Remote Testing



EDIT 752: Spring 2016
Kimberly Smith
Cara Crawford
Dustin Norwood
Ahmad Almufarreh
Joshua Flores
Group E – Deliverable II – EDIT 732, Fall 2015

Contents
1. Background
2. Research Goals
2.1 Overview
2.2 Study Goals
3. Research Methodology
4. Participants
4.1 Recruitment
4.2 Summary of Participants
5. Research Questions
6. Protocol and Data Collection
6.1 Data Collection Protocol – Remote Testing
6.3 Data Analysis
7. Summary of Results
7.1 Calendar
7.2 Professor Ratings
7.3 Grade Point Average
7.4 Contacting OIPS
7.5 General UX Questions
8. Recommendations
8.1 Calendar
8.2 Professor Ratings
8.3 Contacting OIPS
8.4 General UX Recommendations
9. Future Research
10. Artifacts
10.1 userzoom full interface

1. Background

For new international students, the prospect of moving and adjusting to a new country for their studies can be daunting. Because they come from other countries, these students deal with stressors that domestic students do not face.
To assist international students attending George Mason University with immigration compliance, academics, and cultural adjustment, we have teamed up with the Office of International Programs and Services (OIPS) to create an app named "Arrivata," which will serve as a virtual concierge to guide them through their studies.

Initial feedback on our prototype from prospective users was positive overall, with many students expressing interest in its further development, and much of that feedback was used as a springboard for further research and development. Since the initial testing, we have updated the methods for accessing OIPS materials, improved the tracking system for important deadlines, and made the checklists within the app actionable to further improve the user experience. This prepared us for the first round of data collection. Our official first round of user testing was conducted using focus groups and yielded valuable insights about the direction of this project. Most students indicated interest in ways to connect with others as a means of improving their academic experience. That feedback shaped this second round of research and development, summarized below.

2. Research Goals

2.1 Overview

The purpose of this round of testing was to evaluate the overall user experience when interacting with the app interface. This round of testing utilized userzoom, an online user experience testing engine, with target users to collect opinions about the functionality and usefulness of the app.
2.2 Study Goals

This round of testing centered on the following study goals, which guided our research questions and methods:
- Determine if and how students would use an app such as Arrivata
- Identify strengths and weaknesses to determine if the app is on track to meet the needs of potential users
- Involve end users in the iterative design process
- Identify areas where the user experience could be improved

3. Research Methodology

Our first round of testing utilized focus groups to gather broad information about what potential users want from an app such as Arrivata. For this round of testing we wanted to gather more information about the actual execution of the app thus far, so we utilized the userzoom remote survey tool and a small amount of hands-on testing to gather data for prototype improvement.

The primary benefit of using remote user experience testing was that it allowed us to take a granular look at how potential users perceived our app features thus far. All previous rounds of testing, including initial prototype testing, focused on the question "What do you want to see?" whereas with individual user testing we were able to ask "What works and what doesn't?"

The secondary benefit of this method is that we were able to dramatically increase the number of test participants because the testing was performed asynchronously. Given the nature of our participants, it was very difficult to recruit and schedule time for testing during the first round of research, which led to a relatively small number of participants. In this round, however, we were able to triple our number of responses, which provided us with a wealth of information about the current state of the project.

4. Participants

4.1 Recruitment

Participants in this study were all international students drawn from the international student body at George Mason University. The majority were recruited with the help of our contacts at OIPS through a pre-prepared announcement sent out to students. We also directly recruited international students sitting at the study tables on the second floor of the Johnson Center to increase the number of responses. In some cases the users completed the testing on the spot, and in others the students completed the testing on their own time.

4.2 Summary of Participants

Overall, 21 participants were recruited for the study, and of those, 19 completed it from start to finish. Testing took place from 4/21/2016 to 4/24/2016. During this round we did not ask demographic questions prior to testing, as they did not have any discernible effect on results in the previous round of testing, and we wanted users to focus their attention primarily on the app itself. Userzoom did capture technical data such as operating system, browser information, and screen resolution, but since testing was performed on computers and not mobile devices, this data was irrelevant for our purposes.
5. Research Questions

Our data collection for this round used both quantitative and qualitative data to answer the following research questions about the execution of our prototype at this stage of development:

Calendar
- Was the user able to effectively navigate this portion?
- Was it easy to navigate through calendar data?
- What issues did the user have with navigating through the calendar?

Professor Ratings
- Was the user able to effectively navigate this portion?
- Did the content meet their needs and expectations?
- Was it easy to navigate the ratings section?
- Was the information accessible in a timely manner?
- Did the language and visuals aid in finding teacher ratings?
- Was anything visually out of place or unnecessary?
- Was the execution of this section preferable?

Grade Point Averages
- Was the user able to effectively navigate this portion?

Contacting OIPS
- Was the user able to effectively navigate this portion?

General user interface questions
- Did the language and visuals make sense?
- How likely is it that the user would recommend the app to others?
- Rate the importance of the following features from least to most useful:
  - Registering for classes and tracking degree completion status
  - Getting help with immigration paperwork
  - Viewing class descriptions/teacher ratings from other students
  - Viewing a campus map with all buildings marked
  - Access to all grades and GPA
  - Finding a part-time job
  - Finding other students with similar backgrounds
  - Finding information on clubs and organizations
- Rate the frequency of the following activities from once per semester/never to daily:
  - Talk with academic advisor
  - Contact OIPS with questions
  - Check the campus map
  - Check grades in Blackboard
  - Attend official student events on campus
- Which methods (in person, via computer, via mobile device) are used most for completing the following tasks:
  - Contact advisor or OIPS
  - Complete classwork
  - Communicate with classmates outside of class
  - Check grades, assignments, etc. on Blackboard
  - Look for jobs
  - Research housing
- What pieces of information did the user wish they knew before coming to campus for the first time?
- Which of the following are reasons for visiting the OIPS website?
  - Documents and forms
  - Visa and immigration information
  - Event schedules and information
  - Staff and office information
  - Tax information
  - Volunteer or job information
  - Other (blank)
  - User has never visited the OIPS website
- What suggestions could be offered about the app's functionality?

6. Protocol and Data Collection

6.1 Data Collection Protocol – Remote Testing

Once participants were recruited, they were sent a link to the remote testing site; those who completed the testing in person with the aid of a researcher were taken directly to the testing portal. The link brought the participant to the Arrivata Research Study Consent Form, an abridged version of the consent form used in the first round of testing. Upon agreement, users were taken to the Welcome screen, which explained the purpose of the study and the usage scenarios to follow.

The first scenario asked users to find the last day to drop classes using the calendar. Upon clicking the Start task button, a second window with the prototype was launched, with a copy of the task below it and two buttons: Success and Abandon. Users were to complete the task and then select Success or Abandon based on their ability to complete it. Users were then asked a series of follow-up questions about their experience completing the usage scenario. The process continued for the remaining usage scenarios, followed by a series of general questions about the interface and content. Following completion of the survey, the system automatically recorded responses, indicated that the participant had completed the study, and thanked the users for their participation.

6.3 Data Analysis

Data analysis was heavily simplified through the use of userzoom, as all data was instantly tabulated by the system.
From the userzoom reporting dashboard, all results were exported to Excel for further analysis. Since all user responses for each question were well organized by the system, we were able to pay careful attention to all responses and take note of special insights.

7. Summary of Results

7.1 Calendar

95% of students were able to complete the calendar task, which asked them to find the last day to drop classes. 83% of those users said that it was easy to navigate through the calendar data.

7.2 Professor Ratings

95% of students were able to complete the task of finding professor ratings, and the majority generally found it easy to navigate this scenario. Many students did, however, indicate that the visual design of this section conflicted with existing mental models of how a ratings page should appear. Only 63% of students gave a favorable view of the organization of this section.

- "I don't know if I missed it, but I only saw the reviews students wrote but not the actual rating."
- "I thought the menu option was too long. I had to scroll down to see all of them. Perhaps have some sort of "pop-up" menu on top where I can click to see all options at once!"
- "ADD GRAPHS AND REVIEW STARS TO VISUALIZE PROFESSOR RATING"
- "I wish that the review would rate the amount of effort in the class"
- "Too much words"

7.3 Grade Point Average

All users were able to complete this task without issue.

7.4 Contacting OIPS

Only 74% of students had success contacting OIPS.

7.5 General UX Questions

Users generally understood the visuals and language used in the app. However, 25% were neutral in their opinion of the UX language.
Users would generally recommend this app to friends.

Order of feature importance:
1. Registering for classes and tracking degree completion status
2. Getting help with immigration paperwork
3. Viewing class descriptions/teacher ratings from other students
4. Viewing a campus map with all buildings marked
5. Access to all grades and GPA
6. Finding a part-time job
7. Finding other students with similar backgrounds
8. Finding information on clubs and organizations

Frequency of activities (0 = never, 7 = daily): [chart]

Methods used to complete tasks: [chart]

Students wished they knew more about the following categories before arriving:
- Jobs
- Housing
- Transportation and parking
- Food
- Shopping
- Healthcare issues
- Academic advising
- Technology (email)
- Immigration
- Student organizations

Why do students visit the OIPS website? [chart]

Final suggestions for app functionality:
- "Just make sure you update the app periodically based on the students usage, make as user friendly as possible. Good design works best, make it look visually beautiful."
- "Haven't noticed the "back" button, if you click on one of the options and want to go back"
- "My stats and Academic are overlapping"
- "Live chat with OIPS representative - Change the GPA scale icon. Not very clear. - I am not sure if it shows or links to OIPS events schedule."

8. Recommendations

8.1 Calendar

83% of users said it was easy to navigate the calendar data. Calendars are one of the most commonly used apps on mobile devices, so this section should have received higher marks.

Recommendation: Search for ways to make the calendar more intuitive; possibly incorporate a high-level rather than monthly view.

8.2 Professor Ratings

Most students did not understand the text-based visual design of the professor ratings section.

Recommendation: Remove extraneous text and convert the section to a star format next to teacher names indicating quality of teaching and class difficulty. When a name is clicked, students should then see individual ratings.

8.3 Contacting OIPS

Only 74% of students had success contacting OIPS.
Students preferred contacting an advisor in person 63% of the time.

Recommendation: Create multiple outlets for contacting OIPS, such as setting up an in-person meeting, making a direct phone call, or sending a message through the app.

8.4 General UX Recommendations

- Students want to know more about registration, immigration paperwork, and viewing teacher ratings.
- Prior to arriving, students need to know more about life on campus and how to get started. Consider adding a section based entirely on pre-arrival content.
- Include a back button so users can return to essential parts of the app.
- Students primarily use the OIPS website for documents, immigration, and job information. This means that they essentially do not use it for social features, and a gap exists which needs to be filled. Continue to build out social features within the app to help bring students together and share practical information with one another.

9. Future Research

Based on results from this round of research, it is clear that there is ample room for improvement to the existing user interface. Once the recommendations listed in section 8 have been implemented, it would be best to perform a second round of remote user testing using userzoom. Preferably, the same main areas would be tested again, with the addition of more in-depth social features. Should that round of testing yield "green light" results for existing features, it would then be prudent to move on to other areas students expressed interest in, such as navigating life in the Northern Virginia area.

10. Artifacts

10.1 userzoom full interface