HES3360 – Human Factors



Swinburne University of Technology
HES3360 – Human Factors
Case Study – Augmented Reality Glasses – the iSunny
Ha Le - 62358915/10/2011

Case Study: The iSunny – Augmented Reality Glasses

iSunny is an augmented-reality pair of sunglasses that provides a platform for applications, giving users rich information and functionality related to their current place, context, and tasks. The major technologies used in the glasses are eye tracking, gesture recognition, voice recognition and synthesis, 3D stereoscopic image generation, 3D sound mapping and generation, and mobile internet. This paper discusses some basic functionalities and applications: GPS navigation; context-aware information such as local services or product reviews and ratings; real-time traveller information and translation; and support for elderly and disabled people in daily tasks through assistive vision and audio functionality. Using eye tracking and other natural interface technologies such as speech and gesture, iSunny can quietly immerse itself in the user's life and provide relevant information in a natural and non-disruptive way.

Contents

Front-end Analysis
User analysis
Methods of user analysis
Environment Analysis and resulting Personas
Data Collection Method
Scenario Creation
Task Analysis
Visual Sensory Analysis
Personas' visual ability and limitations analysis
System and environment visual analysis
The Lens
Visual Receptor system analysis
Sensory processing limitation
Bottom-up versus Top-down processing
Auditory System Analysis
Personas' auditory ability and limitations analysis
Noise problem
Speech Communication problems
Hearing loss
Alarms and warning
System's alarm criteria
Environmental alarms
Sound localization
Tactile and Haptic senses
Cognition analysis
Selective Attention
Perception
Working Memory
Long-term Memory
Decision making process analysis
Heuristics and Biases
Dependency of Decision Making on the Decision Context
Improving Human Decision Making
Display
Personas' Scenarios
Alerting Displays
Labels
Monitoring
Multiple Displays
Display layout
Navigation Displays and Maps
Commands Displays
Maps
Controls
System Control Devices
Physical buttons
Eye-tracking control
Voice Control Analysis
Anthropometry and Workspace Design
Anthropometric factors
Component Layout
Frequency of use
Importance of use
Sequence of use
Consistency
Control-display compatibility
Clutter avoidance
Functional grouping
References
Front-end Analysis

User analysis

Methods of user analysis

In order to perform front-end analysis on the system, the set of users or operators has to be identified, with details about characteristics such as age, gender, and education level. There are several approaches to identifying potential users for a particular system. Firstly, common characteristics can be derived by sampling the existing population of users of a current solution; this approach has the advantage of being based on real, solid data gathered through a systematic and scientific process. However, it usually fails to support the design of a system that aims to target a new set of users or to extend to a wider range of users. Another approach is creating concrete and understandable "personas" that represent the major characteristics of the population. This approach keeps the design user-oriented and results in products that are usable for anyone who fits the described personas, which normally cover a large portion of potential users. This approach is used in the following analysis, where three personas have been developed to examine the basic functional requirements of the system.

Environment Analysis and resulting Personas

Johnny is a 22-year-old backpacker from Melbourne. He has been travelling around the world for about a year, to places where very little English is used, such as western China or remote areas of South America. He has had a lot of difficulty not just navigating around, but also understanding the culture and enjoying the arts, because he does not speak or read the language. Johnny feels like he is missing out, spending far too much time getting somewhere only to realise that he can only scratch the surface of the local culture. Being Gen Y, Johnny is very competent with technology.

Mary is a 62-year-old retiree; she lives with her husband in a quiet suburb called Lonsdale. The couple live on a fairly tight budget, as they still have to pay off their mortgage while the pension is the main income for both of them. Mary has to plan their expenses very carefully, but with so many things to worry about it becomes a headache. She has to go through all the local advertisements to find the best deals, regularly check local garage sales for possible bargains, and even take note of where to buy groceries at the best price and plan her route to save every dollar.

Barry is a 40-year-old man; he was born blind yet has learned to live very happily. However, his blindness clearly makes him more vulnerable than a sighted person: he can hit objects left scattered around the house, have trouble when crossing the road, or even walk into a power pole on the kerb. It would be great for Barry to have a device to warn him of potential dangers in range.
He has little experience with complex computer or smartphone interfaces.

Data Collection Method

In order to effectively identify the required tasks, different methods need to be used at different stages of the design. Initially, observation should be used to identify the key general tasks that users most need to perform; observation is cheap to carry out and reveals the most obvious needs. From these initial basic tasks, a carefully designed questionnaire survey can then be used to dig down and reveal further details of each task, how users would like it to be done, and how critical the task is to them. Questionnaires are relatively more expensive, but can target a large number of potential users. Their major disadvantage is the limited nature of the responses: they only confirm or reject our own view of the tasks, and provide no information about what we may have missed. That is why individual or group interviews need to be carried out just before designing the product. These interviews reveal the real, personal view potential users have of the tasks and the prospective product, and may provide unexpected information.

Scenario Creation

Applying the aforementioned method of task identification, the following persona scenario was designed to demonstrate a potential major task that needs to be performed:

Johnny left his backpacker hostel in Shanghai for Tong Li, a famous historical village where people still live in the old style. Johnny had been looking forward to this trip for a long time. When he got there, he was amazed by the ancient feel of the village. He saw so many things he had never seen before, but he quickly realised that there was no one there to tell him what the distinctive carvings on the walls meant, or what those unfamiliar objects were for. Hundreds of questions kept running through Johnny's head, but there was no description of anything in the village except for some poorly translated English names. Johnny felt like he had missed out by only seeing the surface of a rich and ancient culture.

Task Analysis

The goal identified in the above scenario is to easily and intuitively get rich information about objects, activities, or virtually anything the user is interested in. The main function is finding information about items or activities of interest:

Task: Identify an item of interest.
 Sub-task: The user focuses on the item with their eyes.
 Sub-task: The system displays a confirmation message.
 Sub-task: The user confirms.
Task: Search for related information.
 Sub-task: The system finds all possibly related information.
 Sub-task: The user indicates relevant items.
Task: Display information.
 Sub-task: The system displays the selected items.
 Sub-task: The user further narrows down the relevant information.

Visual Sensory Analysis

Personas' visual ability and limitations analysis

As iSunny interacts most with the user's visual sensory system (for most users at least), research into the personas' visual sensory abilities and limitations was carried out as follows.

Being a Computer Science student, Johnny is severely short-sighted; he has to wear prescription glasses almost all the time. However, as he is in his 20s, his eyes are very sensitive to colour and perform well in low-light conditions.

Mary, on the other hand, is a lot older than Johnny; due to her age, her eyes are far-sighted. Furthermore, she has very poor night vision: she cannot drive at night and sometimes does not even want to walk down the street in the evening.
The last persona, Barry, our vision-impaired persona, was unfortunately born without vision, so this part of the system will not apply to him. Later sections, however, provide Barry with an alternative interface.

Persona Scenario

Johnny has just arrived in Guangzhou and rented a car at the airport. Putting on his new iSunny, he cruised down the highway towards the city; the navigation system seemed to work just fine, showing clear instructions guiding him to his hotel. However, as soon as he got onto the freeway he realised that his short-sightedness was not corrected by the glasses at all, and he could not read the highway signs very clearly. A bit nervous driving in that condition, he focused on the turn-by-turn navigation. Then he entered a tunnel, and iSunny brightened the display slowly. But when he came out, the sun was too bright, and iSunny could not adjust its shading quickly enough; Johnny was blinded for a few seconds. He almost hit the car in the next lane and also missed his exit from the highway.

System and environment visual analysis

The Lens

Because of its wide range of users and their individual visual limitations, e.g. Mary is far-sighted and Johnny is short-sighted, iSunny needs to be compatible with prescription lenses. It would be better still if iSunny could be multifocal, minimising the workload and tiredness caused by users having to adjust their focal length (the Accommodation property).

Visual Receptor system analysis

Firstly, system-supporting information should be displayed in the peripheral region, so that it does not block the foveal region (the Location property), which users need in order to focus on and complete their tasks. Detailed task-relevant information, however, such as reviews and ratings of a particular product, or a snippet of a Wikipedia article about an artefact, should be displayed in the foveal region to take advantage of its higher acuity.

Furthermore, due to its wide range of applications and users, iSunny needs to work in different lighting environments and should adapt quickly to changes in lighting conditions, assisting the user's visual adaptation. For Mary, it could be bright daylight when she walks down to the shopping strip, then much darker when she enters a small gallery. For Johnny, it could be a cloudy day or a poorly lit room (possibly lit that way to preserve the artefacts). One possible way to achieve this is for iSunny to quickly and automatically adjust its shade, similar to transition lenses.

Sensory processing limitation

The automated adaptation in shading should be synchronised with adjustments to the display contrast of information on iSunny to ensure optimum visibility, e.g. darker text with a light grey backing in daylight, and white text with a dark backing in low-light places. In addition, the system could use image-processing techniques to improve the contrast of the viewed scene or objects, improving visibility especially for Mary, a user with poor night vision. The system will also be designed to avoid colour-blindness problems by not relying on colour: all information displayed on iSunny will have a distinctive shape representative of its function (e.g. a triangle with an exclamation mark (!) in the middle means warning) and will be placed at a consistent location, with colour only a redundant backup.
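As an illustration of how the automated shading and the display contrast could be kept in sync, the following minimal sketch maps an ambient-light reading to a lens shade level and a text colour scheme. The sensor interface, the lux thresholds, and the update behaviour are assumptions for illustration only, not a specification of iSunny.

```python
# Minimal sketch: keep lens shading and display contrast in sync with
# ambient light. The lux thresholds and the 0..1 shade scale are
# illustrative assumptions, not measured design values.

def choose_presentation(ambient_lux: float) -> dict:
    """Pick a shade level (0 = clear, 1 = darkest) and a text scheme."""
    if ambient_lux > 10_000:      # bright daylight
        return {"shade": 0.8, "text": "dark", "backing": "light grey"}
    if ambient_lux > 500:         # overcast sky or a bright interior
        return {"shade": 0.4, "text": "dark", "backing": "light grey"}
    return {"shade": 0.0, "text": "white", "backing": "dark"}  # tunnel, gallery, evening

def step_towards(current_shade: float, target_shade: float, max_step: float = 0.2) -> float:
    """Move the shade towards its target in bounded steps per update tick.
    With a fast update rate this reacts quickly to a sudden change (such as
    Johnny's tunnel exit) while avoiding visible flicker."""
    delta = max(-max_step, min(max_step, target_shade - current_shade))
    return current_shade + delta
```

For example, on leaving a tunnel into full sunlight the target shade jumps from 0.0 to 0.8, and with an update every few hundred milliseconds the lens would settle at the new level within about a second, addressing the delay that blinded Johnny in the scenario above.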
Bottom-up versus Top-down processing

Depth perception

Regarding how information is perceived and processed, firstly, in order to assist the user's bottom-up processing, information displayed by iSunny should be in 3D form, using a stereoscopic display, with snippets of information displayed virtually, directly beside the objects or activities they relate to. For Johnny this could be an overlaid 3D map of the city, with symbols and icons placed directly above their actual positions as Johnny sees them. Another important aspect is that, as more and more information is displayed, it should be arranged in short form, with clear, familiar fonts and simple symbols to improve legibility.

Visual search and detection

Top-down processing, on the other hand, can be improved by using consistent designs and colours for symbols, for example black/yellow for warnings and white/blue for public transport. This helps speed up the user's visual search (through expectancy). Danger warning symbols, such as a potentially troublesome area in an unfamiliar city for Johnny, should also use a distinctive colour (bright red/white) and a slightly larger size to capture the user's attention (the concept of conspicuity).

Auditory System Analysis

Personas' auditory ability and limitations analysis

As in the previous Visual Sensory Analysis section, the study of the auditory system starts with the personas' auditory abilities and limitations.

Our vision-impaired persona, Barry, has very sensitive ears to compensate for his lack of vision; he can pick up very subtle sounds, like people stepping on leaves far away, and can distinguish very well between similar sounds. Mary's hearing, on the other hand, is rather limited due to her age; she has problems concentrating when exposed to street noise for long periods, and she can only hear rather loud sounds or speech. Being the youngest persona, Johnny's hearing is good and typical for his age group.

Persona Scenario

Barry was wearing iSunny to get down to the shops for some batteries to replace the dead ones in his radio. It was a Saturday morning and the street was packed with people; there was so much noise for Barry's sensitive ears that he put on his noise-cancelling headphones and paired them with iSunny. The headphones seemed to work fine and he found things much more comfortable. After a while the headphones made his ears tired and a bit sore, but he tried to cope, as he just could not stand the noise. Suddenly he vaguely heard somebody yelling something; he took his headphones off and realised that he was about to enter a roadwork site. The site manager had been calling for him to stay out, but he could not hear. Barry was scared when he was told that he would almost certainly have fallen into a hole had he entered the site.

Noise problem

As a major portion of the system's potential usage is in outdoor environments (e.g. exploring a city, going shopping, or finding a café in a crowded shopping strip), iSunny is strongly subject to noise at quite high intensities and frequencies: traffic (70-100 dB), people chatting (60 dB), mechanical noise such as cars (70 dB), trams or trains (100 dB), and other noise (Wickens et al., 1998). Assuming an average frequency of 1000 Hz, the equivalent loudness of this noise is from 60 to 100 phons, which approaches the threshold of feeling of 120 phons.
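Because sound levels combine on an energy basis rather than adding linearly, the ambient level assumed for these environments can be sanity-checked with a short calculation. The individual source levels below are simply representative values taken from the figures quoted above.

```python
import math

# Rough, illustrative combination of simultaneous noise sources in dB.
# The individual levels are representative of the figures quoted above;
# real exposure depends on distance, duration, and frequency content.
sources_db = {"traffic": 80, "people chatting": 60, "passing car": 70}

total_intensity = sum(10 ** (level / 10) for level in sources_db.values())
combined_db = 10 * math.log10(total_intensity)

print(f"Combined ambient level = {combined_db:.1f} dB")  # about 80 dB
```

The loudest source dominates the combined level, which is why intermittent events such as a passing tram set the peaks while the steady 60-70 dB background determines long-term comfort.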
According to Wickens et al. (1998), when the Time Weighted Average (TWA) of noise is above 85 dB, ear protection is required. Since the louder noise sources (trucks, buses, the subway) occur much less often, ear protection is not critically required for iSunny applications. It might nevertheless be desirable to reduce stress and improve comfort for users, as more than 20% of people report being "highly annoyed" by noise above 70 dB, which can be considered the environmental average for the system's applications (Wickens et al., 1998).

Regarding Temporary Threshold Shift (TTS), as users will mostly be exposed to less intense noise (60-70 dB), TTS will be minimal.

A straightforward approach to the noise problem is noise-cancelling headphones, which should be a useful feature, especially as they ensure that audio information is perceived properly. However, noise cancelling can be dangerous, as it may prevent users from hearing alarms or warnings. This can be partly rectified by cancelling only low-frequency noise, letting high-frequency alarms through, and giving users the option of setting the frequency threshold. Another problem to address is the comfort and style of wearing headphones or earplugs; earplugs are apparently the better choice in terms of style, but they need to be designed to fit comfortably.

Speech Communication problems

Noise-cancelling headphones, however, point to another major noise-related problem: speech communication in a noisy environment. For example, Mary would like to chat with her husband or a friend while shopping, as would Johnny, and Barry with his mates.
Although the problem is very hard to eliminate completely, advanced noise cancelling can be used to maximise the speech-to-noise ratio, by setting a band-pass filter where the Articulation Index is from 4/1 to 5/1 (Wickens et al., 1998), letting mainly the speech signal through. Where other people's speech is itself noise, the system will have to rely on the human's natural ability to focus on relevant sound sources, as human ears are highly effective at this task and no current method performs better.

Hearing loss

Hearing loss is a major problem for the elderly, like Mary, where certain sounds, especially at high frequencies, can only be heard if they are amplified (an increase of 10-20 dB). To address this, iSunny will need a sound-boosting mode that adaptively amplifies high-frequency sounds (possible alarms).

Alarms and warning

As iSunny is used mainly as a personal assistant, alarms are an important aspect to consider during the design phase. Research into the potential alarms to be designed into the system, and into other peripheral alarms, follows.

System's alarm criteria

For the design of the system, some of the typical and important alarms are: a low-battery alarm, and alarms for potentially troublesome areas or hazardous roads. For our blind persona, Barry, many more alarms need to be provided: dangerous traffic, hazardous walkways, and potentially dangerous weather conditions (thunder, heavy rain, etc.).
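One way this alarm set could be catalogued, before its presentation is designed, is as a simple table of alarms, criticality levels, and the personas they apply to. The criticality labels and groupings below are illustrative assumptions drawn from the discussion above.

```python
# Illustrative catalogue of system alarms; names, criticality labels and
# persona groupings follow the discussion above and are assumptions only.
ALARMS = [
    # (alarm,              criticality, applies to)
    ("low battery",        "advisory", {"Johnny", "Mary", "Barry"}),
    ("troublesome area",   "caution",  {"Johnny"}),
    ("hazardous road",     "caution",  {"Johnny", "Mary"}),
    ("dangerous traffic",  "warning",  {"Barry"}),
    ("hazardous walkway",  "warning",  {"Barry"}),
    ("dangerous weather",  "caution",  {"Barry"}),
]

LEVELS = ["advisory", "caution", "warning"]

def alarms_for(persona: str, minimum: str = "advisory") -> list[str]:
    """List the alarms a persona should receive at or above a criticality level."""
    floor = LEVELS.index(minimum)
    return [name for name, level, personas in ALARMS
            if persona in personas and LEVELS.index(level) >= floor]
```

For instance, alarms_for("Barry", "warning") returns only the dangerous-traffic and hazardous-walkway alarms, the two that must always break through any noise cancelling.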
The characteristics of these alarms depend on their level of criticality. Less critical alarms, such as low battery, should be designed so that they are immediately informative but not startling, and do not disrupt the understanding of more important alarms such as traffic warnings (Wickens et al., 1998). More significant alarms need to be above the background ambient level; in our system these alarms should be around 90 dB (30 dB above the 60 dB average noise), but must not exceed the level that is dangerous to hearing (about 90 dB) (Wickens et al., 1998). They should also be accompanied by voice and/or visual instructions on how to deal with the situation (e.g. directions out of a troublesome area, slow-down instructions for hazardous roads, stop directives when dangerous traffic is approaching) (Wickens et al., 1998). The rate of false alarms also needs to be kept sufficiently low, without missing real events, to ensure users do not come to mistrust the system (Wickens et al., 1998); furthermore, an alarm-testing function would confirm proper operation and improve users' trust.

Environmental alarms

Apart from the system alarms, other alarms from the environment were also analysed to ensure there is no confusion or overlap. The most common alarms in users' environments are car horns, emergency vehicles, walk signals, fire alarms, and the like. These are generally already well designed, with distinctive and informative sounds; our system must not mask, overlap, or produce sounds similar to them, a constraint that applies especially to the noise-cancelling functionality.

Another consideration for environmental alarms is assisting the top-down processing of alarms when bottom-up processing is not sufficient in an unfamiliar environment. For international travellers like Johnny, understanding unfamiliar alarms and warnings in a foreign language is likely to be vital. Top-down processing may treat these as unimportant, since the user has not learned to expect them, especially when focusing on a particular task or object.
Unfamiliar alarms can be addressed by add-ons, specific to a country or region, that detect common local alarms or warnings and notify users through both the visual and audio channels (redundancy), in a more familiar manner.

Sound localization

Sound localisation can be used to provide a useful functionality, "Seeing with Sound", for our vision-impaired persona Barry. It allows Barry to get a sense of his surrounding environment, which is mapped to a 3D sound space highlighting potentially dangerous points, such as steps, holes in the walking path, or cars within a defined range or moving faster than a speed threshold. Each of these features is associated with a distinctive sound, and by locating the sound sources in 3D, Barry can get a sense of where the objects are. With this functionality, Barry can use his hypersensitive hearing to compensate more efficiently for his loss of sight. A major drawback of this approach, however, is that the large number of signals and alarms produced by Seeing with Sound makes discriminating between them much harder; this can only be minimised by appropriate design using the various alarm design dimensions: pitch, envelope, rhythm, and timbre.

Tactile and Haptic senses

The use of tactile feedback in our system is minimal, because the design relies on indirect command and control. Haptic feedback is, however, used to improve the ease of use of the physical buttons, e.g. the On/Off button and the Auto/Manual switch. These buttons are designed with distinctive and intuitive shapes: the On/Off control is an extruded round button with two states, Pressed (On) and Released (Off), which can be felt and located easily on the right-hand arm of the frame. The Auto/Manual control is a sliding button, also extruded but on the left-hand arm; this design allows the user to feel the position of the button and operate it easily. The different forms of the two controls (push button versus slider) ensure users will not mix them up.

Cognition analysis

To examine the concepts of cognition in our case study, the following scenario was developed.

Barry was coming back home from the city after having coffee with his old friend Bruce. While he was on his way to the tram stop, he got a call from his insurance company reminding him that he was two weeks late with a premium payment and needed to pay that day to avoid a significant fine; it was already 4:30 pm. The lady on the phone told him he could pay by credit card right then over the phone, which made him feel a bit relieved. But then he realised he needed to hang up the call to find his wife Alex's credit card details, which were stored in a memo on his phone. So he asked for the phone number to dial back; it was 9351 4464, and he repeated it twice to make sure he would remember it. It was, however, a lot harder than he thought to find where the memo was stored, as the phone's voice navigation was mostly buried in the city's rush-hour noise. He finally found the memo, and as the numbers were read out loud Barry tried to memorise them; it took him more than ten attempts to be sure. But when he tried to call the lady back, he had already forgotten the phone number he needed to dial. Frustrated, he tried to concentrate on remembering the numbers while also trying to retrieve them from the call history, when suddenly a car horn made him realise that he was starting to cross the wrong road, as the audible walk signals of the two crossings were placed too close to each other.
Startled, he stepped back, gave up on trying by himself, and called his wife to settle the matter.

Another scenario was created to examine Mary's cognitive workload while using our system.

Mary was driving down the Glenferrie shopping strip for groceries on a Friday afternoon in heavy traffic. While she was waiting at the corner of Burwood Road to make a right turn, an advertisement for a newly opened local store with very good introductory prices grabbed her attention. She was reading it enthusiastically until the horn of the car behind alerted her to turn. A bit startled, she stepped on the accelerator to make the turn, but failed to yield to a pedestrian who was about to cross, as the ad was still displayed in her right-side peripheral vision. Realising how dangerous the situation was, Mary took iSunny off.

Selective Attention

In the first scenario above, Barry focused on the phone's call history and incorrectly perceived the walk signal, yet the car horn was noticed immediately because of its salience; Mary was in a similar situation when she focused on the ad instead of the traffic. Applied to our system, the alarm system clearly needs to be able to capture Barry's complete attention so that he can react to a warning in time to avoid an accident. It should therefore be distinctive enough to get his attention immediately, despite all other possible distractions and noise. Barry crossed the road because he expected the walk sound to mean it was safe to do so (top-down factors). It was also knowledge-driven factors that made Barry devote most of his attention to the task of paying the insurance, as he knew how valuable attending to that task was (i.e. avoiding a fine for late payment).

One possible way to help Barry in situations like this is the "Seeing with Sound" functionality discussed in the auditory analysis. On the other hand, as Barry needs to perceive his surrounding environment through audio feedback constantly, minimal effort should be required: feedback sounds should be ear-friendly (around 50-60 dB) to minimise fatigue.

Perception

Perception in the Seeing with Sound system (see Auditory System Analysis - Sound localization) can be maximised by using distinctive but consistent forms of audio feedback. Familiar sounds, alarms, and signals will reduce Barry's cognitive workload by making the most of both top-down and bottom-up analysis, while distinctive sounds ensure that important information is not missed even when Barry is not paying full attention to the system. It is also very important to assist bottom-up analysis of the system's signals by selectively cancelling environmental noise and avoiding degraded signal forms, as when Barry could not hear the phone's voice navigation over the street noise.

The system's multiple signals also have to be highly distinguishable (avoiding confusion), to reduce the cognitive workload Barry would otherwise spend telling similar messages apart. This can be done by categorising audio feedback into different groups (a small vocabulary) according to their level of importance: each vital alarm, such as an incoming car or a hole in the footpath, should have one single distinctive sound.
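A small, fixed "vocabulary" of cues for these vital alarms could look like the following sketch. The specific pitch, rhythm, and timbre values are placeholders chosen only so that the cues are easy to tell apart, following the alarm design dimensions mentioned earlier.

```python
# Sketch of a small earcon vocabulary for vital hazards in Seeing with Sound.
# Pitch, rhythm and timbre values are illustrative placeholders, chosen only
# to keep the cues distinct from one another.
VITAL_EARCONS = {
    "incoming car":     {"pitch_hz": 880, "rhythm": "fast pulse",  "timbre": "square"},
    "hole in footpath": {"pitch_hz": 440, "rhythm": "triple beep", "timbre": "sine"},
    "step or kerb":     {"pitch_hz": 660, "rhythm": "double beep", "timbre": "triangle"},
}

def cue_for(hazard: str) -> dict:
    """Return the single distinctive cue for a vital hazard; anything not in
    the vital vocabulary falls back to the general 3D soundscape instead."""
    return VITAL_EARCONS.get(
        hazard, {"pitch_hz": 550, "rhythm": "slow pulse", "timbre": "sine"})
```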
Less vital but still important information, such as path guidance and obstacle warnings, could be conveyed through a three-dimensional map of gradually varying sound, allowing simple extraction of information with minimal workload.

Working Memory

In the scenario above, Barry's working memory capacity was not sufficient to store two long, unrelated strings of numbers. Even though he had tried to memorise the first phone number, with all the digits held as separate chunks Barry was not able to keep them in working memory for long enough. The confusability of the digit series 4464 (easily recalled as 4644) made it even harder to make the right choice. Another factor was that Barry's attention had to be allocated to many other tasks, such as interpreting the phone's voice navigation and waiting for the walk signal; this significantly reduced his ability to store and retrieve information from working memory.

In order to assist users' working memory, our system should be able to store vocal or visual information by recording it and replaying it on request (e.g. a voice memo), reducing the memory load. A visual echo can also be used, such as flashing the turning arrow when approaching an intersection, in combination with voice navigation. Confusability also needs to be kept to a minimum by making sure the system's symbols, signals, and instructions are distinctive.

Long-term Memory

Long-term memory of how to use the system is hard to develop in users, so the system should use standardised symbols and figures, e.g. play/pause symbols, an icon-based interface (as used in many mobile interfaces such as iOS), and a tab-based task-switching method (as used in most popular browsers).

Decision making process analysis

As iSunny is a device designed to assist users in normal daily tasks, most user decisions follow the descriptive decision model. The following scenarios are used to analyse users' behaviour when they make decisions; the heuristics and biases occurring during the process are also discussed.

Heuristics and Biases

Our traveller persona, Johnny, has to make decisions about accommodation almost every week, sometimes a few times a week. As he is unfamiliar with each city, he relies entirely on internet sources and traveller sites for the best and cheapest place to spend the night (knowledge-based behaviour). However, as information from these sites is generally scattered, incomplete, and sometimes contradictory, Johnny is subject to many errors in acquiring and perceiving information. He might choose the first reasonably priced place that comes up in a search (cue primacy) without considering whether it is up to date (unreliable cues), and ignore other important facts such as whether the place is close to public transport or whether the neighbourhood is safe (limited number of cues). Based on these biased cues, Johnny picks out a few potential accommodation candidates (limited number of hypotheses). He might start by choosing a place a bit further from the inner city, as he has done in many other places to save money (availability heuristic). Then, from experience, he filters out places close to shopping strips or entertainment facilities, because such places are generally more expensive (representativeness heuristic).
Even though he might find a place fairly near the city and other facilities at a similar price, Johnny still insists on his initial decision, thinking the better options are fake deals (overconfidence and cognitive tunnelling).

Our other persona, Mary, has to make more instantaneous decisions more often. For instance, one day Mary's fridge breaks down just after she has filled it with groceries for the whole week. Unable to find any cheap option on a local classifieds site (rule-based behaviour, limited information), she has to decide whether to go over her budget to buy a new fridge or watch her groceries go bad (limited options); obviously the latter outcome is not acceptable. Mary seems to have no choice but to use her credit card.

Dependency of Decision Making on the Decision Context

Most of the personas' decisions fall into either the rule-based or the knowledge-based category, as skill-based decisions do not need any additional information or recommendation from the system. The primary aim of the system is therefore to bring knowledge-based decisions as close to the rule-based level as possible, by providing sufficient, relevant information to the user. For example, finding and selecting accommodation is a complicated knowledge-based task; however, if provided with clear spreadsheets showing sufficient information about local hotels and hostels, Johnny would be able to pick out the most appropriate one easily using common-sense rules.

Improving Human Decision Making

In order to assist Johnny with his decision-making process, iSunny is designed to avoid biases by providing comparable and sufficiently complete information about available accommodation, presented together on the city map (display), or better still in a short spreadsheet of the critical criteria. The system will also be able to process the information and make recommendations based on a large number of factors that would be too complicated for a human to consider thoroughly (expert system). In rather unexpected situations such as Mary's, iSunny can help by first presenting some of the most frequent and easy-to-fix fridge problems, such as a blown fuse, and showing Mary step-by-step instructions to fix it (decision trees). After that, iSunny can load more complete and up-to-date information from multiple sources to help Mary choose the best option for buying a fridge; it may combine and compare eBay, Gumtree, Facebook Marketplace, and local classifieds to pick out the best deal (expert system).

Display

Personas' Scenarios

In order to effectively analyse the display components of the system, the following scenarios examine possible troubles users might come across while performing their tasks. The first scenario is about Johnny, our traveller persona.

Johnny was strolling down the roads of Shanghai; he had just arrived the day before and was exploring the city. His friend, Sheng Li, had told him about a small street where local artists sell their hand-made treasures, and Johnny was very eager to get there. He had looked the map up online the night before and printed a copy. Unfortunately, when he took the map out of his pocket, he realised the print quality was horrible: all the text was blurred together and Johnny could not make out any of the already complicated Chinese characters. Disappointed, he decided just to walk around and try his luck with whatever the city had to offer. After a while he realised that everyone around him was looking at him, not very pleasantly; he felt a bit uneasy and tried to figure out where he was.
Opening the blurred map again, he managed to figure out where he was, thanks to the larger road names that were still readable. Then he realised that the area was marked with a Chinese symbol he did not understand. After a few minutes searching his memory, Johnny remembered there was something in the Shanghai edition of Lonely Planet about these symbols. He opened the book, but it was not indexed at the end; another five minutes searching through it and he found, in a footnote, that the symbol meant "troubled area", where the drug dealers and prostitutes work.

The following scenario describes the auditory display that Barry, our blind persona, uses.

Barry was walking down the street by himself, trying to catch the train to the city. His handheld GPS was doing an alright job of showing him where to turn, but it did not help him avoid the branches and small puddles that the rain the night before had left behind. When he reached Flinders St Station, he tried to find Haddon's, the café where he was supposed to meet his friend Fred for brunch. Barry had never been there, so he activated Point of Interest mode on his handheld GPS, and it started talking about every single thing around him. There were so many shops, stores, and cafés there, right in the middle of the city, that the GPS just kept going on and on, and Barry got tired of listening. Then finally he heard the name "Haddon's" come up, but unfortunately the address and distance were spoken before the name, so Barry did not catch them. He had to restart the POI mode and caught it the second time.

Applying the thirteen principles of display design to our system, the following parts discuss how the user's experience can be improved through the display components: navigation, social and communication displays, location- and context-aware information displays, and information-mining displays.

Alerting Displays

In order to assist Barry, iSunny shall support obstacle and danger detection and warning, in combination with the basic navigation functionality. It maintains environment monitoring by labelling potential hazards (advisory level) with distinctive sounds (Discriminability, P5) according to their nature and level of danger (Pictorial realism, P6). When Barry's direction indicates that he might get himself into danger (warning level), the system provides an additional vibration warning (Multiple resources, P4).

For our traveller, Johnny, social and communication information is used less frequently (only when a new update is available), so it should be displayed only in notification form. Warning displays (e.g. a troublesome area) should be armed with an auditory and flashing signal to attract attention (Multiple resources, P4).

Labels

To help Johnny make the most of his travels in a safe manner, iSunny will improve the legibility of labels (street names, symbols, and signs) through overlaid digital maps on the head-mounted display, with clear text at proper contrast, allowing zooming and search by voice (Information access cost, P8), with meaningful signals/symbols and conventional colour codes (e.g. black/yellow for warning signs) (Consistency, P13). The labels are placed in 3D, using the stereoscopic display, right next to the entities (motels, hospitals, etc.) that they represent (Location proximity, P9).

Monitoring

Barry needs to use his ears and the Seeing with Sound functionality to monitor his surrounding environment almost all the time, to compensate for his loss of vision.
This functionality needs to be designed, firstly, for the best legibility, where audible and distinguishable sounds are allocated to the objects naturally related to them (e.g. a door would have a light "knocking" sound, a tree a sound similar to wind blowing through it). Secondly, as the monitored environment changes continuously, all signals in Seeing with Sound should be analogue; for example, an object-related sound gets louder as the object gets closer (Moving part, P7). Also, by interpreting Barry's current steps and direction, warnings can be produced in advance to prevent Barry from tripping over an object (Prediction and Sluggishness).

Multiple Displays

As there are many different functionalities and modes in iSunny, they are presented in different displays, each relevant to a particular task: navigating, finding places, shopping assistant, disability assistant mode, and so on. They should be displayed separately to avoid confusion. The system might recognise the user's intention and switch automatically (e.g. switching to shopping mode when Mary walks into a store), but it also needs to be able to switch to a particular display manually on the user's voice command.

Display layout

To avoid a cluttered display, only task-relevant information should be displayed, i.e. turning instructions while navigating, pricing and ratings when shopping, hotels when searching for accommodation, etc. It should not, however, interfere with the current task, i.e. it should not cover the object or scene of interest, which is normally located in the foveal region (the primary visual area).

The displays should follow organisational grouping principles, where related functionality such as Navigation and Point-of-Interest is kept together (display relatedness).

Frequency of use needs to be taken into account as well: the Social and Communication mode is less frequently used (only when a new update is available), so it should be displayed only in notification form. Warnings are treated similarly, but because of their importance they should be armed with a flashing signal and an audible sound to attract attention.

Figure 1 - System Display and Control (Virtual) (Courtesy of Google Maps – Street View)

Navigation Displays and Maps

Commands Displays

As our vision-impaired persona Barry cannot read a map, a commands display is his only option for navigation. In addition to normal turn-by-turn instructions, iSunny shall provide real-time, context-relevant guidance, such as warnings about hazards on the walking path, walk-light reminders, and caution warnings.

Maps

To ensure ease of use, the maps functionality should follow the traditional electronic map format: top-down view, a satellite image option, current location, and current orientation. For safety reasons, however, a full map should only be (seemingly) projected onto a flat surface (e.g. a table), to avoid occupying the user's vision.
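This safety rule can be expressed as a simple gating decision: render the full map only when a usable flat surface is in view, and otherwise fall back to the compact turn-by-turn arrow. The surface-detection input and the area threshold in this sketch are hypothetical assumptions, not part of the iSunny specification.

```python
# Sketch of the flat-surface rule for full-map display. Plane detection is
# assumed to be provided by the glasses' sensing pipeline; the input format
# and the minimum-area threshold are illustrative assumptions.
MIN_SURFACE_AREA_M2 = 0.2

def choose_map_presentation(surfaces: list[dict]) -> str:
    """surfaces: e.g. [{"horizontal": True, "area_m2": 0.6}, ...]"""
    usable = [s for s in surfaces
              if s.get("horizontal") and s.get("area_m2", 0.0) >= MIN_SURFACE_AREA_M2]
    return "full map on surface" if usable else "turn-by-turn arrow only"
```

With no suitable surface in view, the user only ever sees the small overlaid arrow, so the map can never occupy the forward field of view while walking or driving.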
The navigation function should follow the standard turn-by-turn GPS design, with overlaid arrows pointing in the turning direction and voice-guided turns.

Figure 2 - Map shown on table-like surfaces (Courtesy of Google Maps)

Controls

System Control Devices

Physical buttons

iSunny has the following physical buttons: an On/Off button and a Manual/Automatic switch, whose characteristics are described in the following table.

Button | Physical feel | Shape | Labelling and location | Size
On/Off | Extruded (1 mm) plastic button; pressed is On, released is Off | Round | "Power", next to the button (3 mm) | Relatively small (3-5 mm in diameter) due to constraints in frame size
Manual/Automatic mode switch (on-screen feedback (M) or (A)) | Extruded (1 mm) slider switch; front slot is Auto, back slot is Manual | Round button in a sliding slot | "Auto"/"Manual", above the slider, close to the slider positions | Relatively small (3-5 mm in diameter) due to constraints in frame size

Figure 3 - Physical buttons
Figure 4 - Physical button labelling

The decision to have only two physical buttons is intended to reduce confusion and improve speed of action. The two physical controls are vitally important, as they provide users with a reliable fail-safe switch in case the system (or its automated mode) malfunctions. For example, Mary can immediately switch iSunny off if it blocks her vision while driving.

Eye-tracking control

Apart from the two physical buttons, iSunny is a head-mounted display unit with no physical control panel or input device; the only sensible and natural interfaces are eye tracking and voice input. Both have relatively low reliability compared with traditional input methods. However, given the low complexity of the decisions involved (switching assistant modes and browsing information), a well-designed control system can achieve the desired functionality. The key point is to improve accuracy, as eye tracking is a very high-speed input. Eye-tracking control combines two main modes: virtual buttons and task-related gesture control.

Virtual buttons

Virtual buttons consist of tabs representing the current mode (Navigation, Shopping Assistant, etc.) and controls for turning functionalities on and off (Messaging, Social Networking, Weather, etc.) (see Display for examples). These buttons are controlled by the user first looking at them: the system highlights the button immediately (fast feedback) but does not activate it yet; instead it flashes a select signal (e.g. a tick symbol) at the lower-left corner of the screen (response expectation), and activates the button only if the user immediately looks at that confirmation symbol (a two-step control to improve accuracy). This "double-checking" improves the system's accuracy; a sketch of the confirmation logic appears at the end of this section. For less important controls, for example checking the weather, simply looking at the icon shows the required information, but only in short form to avoid interference.

Figure 5 - Double-checking algorithm avoiding unintended actions

Task-related gesture control

When the user's current task is known, certain gestures can be recognised and trigger the appropriate action. For example, when Mary is shopping, the gesture of picking up an item can immediately trigger the display of a rating and brief review of the item, combined with a pricing recommendation. Or, when Barry is walking towards a table, chair-locating and sitting-down assistance is immediately turned on.
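The sketch below illustrates the two-step "double-checking" confirmation described under Virtual buttons. The gaze events and the confirmation time window are assumptions for illustration; a real eye-tracking pipeline would supply the fixation events.

```python
import time

# Sketch of the two-step ("double-checking") gaze confirmation for virtual
# buttons. The event methods and the confirmation window are illustrative
# assumptions, not a real eye-tracker API.
CONFIRM_WINDOW_S = 1.5  # how long the confirmation tick stays armed

class GazeButton:
    def __init__(self, name: str):
        self.name = name
        self._armed_at = None

    def on_gaze(self) -> str:
        """First look at the button: highlight it immediately (fast feedback)
        and show the tick symbol in the lower-left corner, but do not act."""
        self._armed_at = time.monotonic()
        return "highlighted, confirmation tick shown"

    def on_confirm_gaze(self) -> str:
        """Second look, at the tick symbol: activate only if it follows the
        highlight quickly; a late or stray glance is treated as accidental."""
        if self._armed_at is None:
            return "ignored"
        armed_recently = (time.monotonic() - self._armed_at) <= CONFIRM_WINDOW_S
        self._armed_at = None
        return f"activate {self.name}" if armed_recently else "expired, ignored"
```

Requiring the second, deliberate glance at a fixed confirmation point is what keeps ordinary scanning of the scene from triggering commands, at the cost of a slightly longer interaction.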
Task-related gesture functions can be disabled using the physical Automatic/Manual switch.

Voice Control Analysis

Voice input can be used as a redundant control interface to improve eye-tracking accuracy, or as the main interface for vision-impaired users like Barry. Its advantages are that it is natural and allows multitasking; its major issues are that it is generally highly subject to environmental noise, and has low reliability, flexibility, and response speed.

To ensure the usability of voice control, control complexity needs to be kept to a minimum. The system will attempt to interpret complex commands such as "take me to a good place to have a coffee nearby"; if it fails to do so, it will ask the user to enter simple commands in a menu-like style, for example "Find" – "Local" – "Café" – "By rating". If the number of options per step is kept low, the success rate should be sufficient.

Anthropometry and Workspace Design

Anthropometric factors

As iSunny is designed to work with a wide range of users of different ages, genders, ethnicities, and possible disabilities, its anthropometric design needs to be adjustable, accommodating the 5th to 95th percentiles of the population. This can be achieved within a reasonable budget, as the system is only concerned with three body dimensions in terms of shape design: head breadth, interpupillary breadth, and binocular breadth.

According to Table 10.2 of Wickens et al. (1998), the minimum and maximum adjustable ranges of the respective iSunny dimensions are:

iSunny dimension | Respective body dimension | Lower adjustable range | Upper adjustable range
Overall frame width | Head breadth | 5.4 | 6.3
Inter-centre distance of the two displays | Interpupillary breadth | 2.1 | 2.6
Usable display width | Binocular breadth | 3.3 | 3.9

Table 1 - iSunny dimensions versus anthropometric factors

Figure 6 - iSunny dimensions

Another important aspect to take into account is the normal line of sight (NLoS) (Wickens et al., 1998): all information displayed by iSunny needs to fit into the cone of easy eye rotation (about +/-15 degrees around the normal line of sight), with the main task-related information (navigation, item ratings, etc.) on the NLoS itself.

Component Layout

The following analysis examines the display and control components shown in Figures 2 and 3 against satisfactory anthropometric design principles; the two figures are referred to again below.

Frequency of use

Spatial organisation analysis: the physical buttons are used less frequently than the virtual ones. They are therefore placed rather far from the main interface and require relatively significant travel time (i.e. the user needs to raise a hand to the frame to use them). This design helps avoid unintended control inputs.

For the virtual interface, tab switching is less frequent than other common tasks such as checking messages or local offers; the tabs are therefore placed outside the cone of easy eye rotation (20 degrees above the normal line of sight), again to avoid unintended actions by requiring a longer and more effortful eye movement to switch tabs. As will be shown under Sequence of use, the confirmation point for tab switching is also placed relatively far away, which increases the total travel time needed to complete a control action.

Importance of use

The physical switches act as fail-safe functions, allowing users to switch off Auto mode or the whole system immediately. Because of their important functions, they are placed in fixed and easy-to-reach positions. It might be questioned whether such important switches are placed too far from the primary display (the glasses' surface), but the low reliability of eye tracking forces the important controls to be physical. The virtual buttons are arranged according to the guideline in Figure 10.7 of Wickens et al. (1998), where the primary display (in this case the navigation arrow) is placed within 10 to 15 degrees of the normal line of sight.
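As a rough check of this placement rule, an element's angular offset from the normal line of sight can be compared against the cone of easy eye rotation. The small-angle combination used below, and the example offsets, are simplifying assumptions.

```python
import math

# Rough placement check against the cone of easy eye rotation (about
# +/-15 degrees around the normal line of sight). Combining horizontal and
# vertical offsets with hypot is a small-angle approximation.
EASY_ROTATION_DEG = 15.0

def placement_class(offset_x_deg: float, offset_y_deg: float) -> str:
    offset = math.hypot(offset_x_deg, offset_y_deg)
    return "primary region" if offset <= EASY_ROTATION_DEG else "secondary region"

print(placement_class(0, 5))   # navigation arrow near the line of sight -> primary region
print(placement_class(0, 20))  # mode tabs 20 degrees above the line of sight -> secondary region
```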
Secondary controls such as tabs and function buttons are placed outside this region.

Sequence of use

Link analysis: simpler functionalities, such as checking messages and mail, can be completed in one action, so there is no need for link analysis. Most other commands and controls are also activated in one action. However, the "double-checking" feature for eye-tracking control requires a sequence of controls that are intentionally separated from each other (against the usual principle); this trades speed for accuracy.

Consistency

This principle is applied throughout the design: all tabs are placed on the top bar, with the main current task always located at the top-left corner; all functionalities are placed on the bottom bar; and the physical buttons are placed at the same spatial location on opposite arms of the frame.

Control-display compatibility

For our system the display is also the control, and labels are placed on or very close to the buttons. The physical buttons are labelled with conventional symbols and text to avoid confusion.

Figure 4 - Physical button labelling

Clutter avoidance

The physical buttons on iSunny are placed on opposite sides of the frame to avoid unintended activation, which is necessary because of their important functions. Accidental activation is also a major problem for an eye-tracking-based control system of virtual buttons, so apart from leaving enough space between control buttons and ensuring the buttons are sufficiently large, iSunny employs the "double-checking" method for important commands such as mode switching, as shown in the Controls section.

Functional grouping

As mentioned under Consistency, all tabs are grouped together along the top, with inactive tasks grouped together on the right. The current mode's functionalities are grouped along the bottom bar, with social and communication functions grouped to the left and task-related functions to the right.

References

Wickens, C., Lee, J., Liu, Y., & Gordon-Becker, S. (1998). Introduction to Human Factors Engineering. Addison-Wesley.