


Pre-print Version: Danger, Danger! Evaluating the Accessibility of Web-based Emergency Alert Sign-Ups in the Northeastern United States

Brian Wentz (corresponding author)

Department of Management Information Systems, Shippensburg University

1871 Old Main Drive, Shippensburg, PA 17257

bwentz@ship.edu, 717-477-1601

Jonathan Lazar

Radcliffe Institute for Advanced Study, Harvard University and

Department of Computer and Information Sciences, Towson University

8000 York Road, Towson, MD 21252

jlazar@towson.edu, 410-704-2255

Michael Stein

Executive Director, Harvard Law School Project on Disability (HPOD), and Visiting Professor, Harvard Law School

Cabell Professor, William & Mary Law School

1515 Massachusetts Avenue, Cambridge, MA 02138

mastein@law.harvard.edu, 617-495-1726

And

Oluwadamilola Gbenro

Edwin Holandez

Andrew Ramsey

(students from Frostburg State University,

101 Braddock Rd, Frostburg, MD 21532)

Abstract

People with disabilities need access to emergency-related information at the same time that the general public receives that information. Many county and municipal-level governments suggest that citizens sign up on a web page to receive emergency alert information. While the messages being sent out via e-mail or text message might be accessible, the sign-up processes are often inaccessible, preventing people with disabilities from signing up for these important information services. In this paper, all of the county-level emergency alert sign-ups in Massachusetts, New York, and Maryland were evaluated for accessibility. A total of 156 evaluations took place (six evaluations for each of the 26 counties evaluated). Of the 26 counties evaluated, 21 had accessibility violations. Legal, policy, and design-related implications are presented in the following discussion.

Keywords

Web accessibility, policy, disability, compliance, Section 508, Section 504, WCAG, emergency alerts

1. Introduction

Many municipal, county, and state governments offer emergency alert services, through which citizens can sign up to receive an e-mail or text message with information about weather, flooding, or other emergency events. Access to this emergency information is vital for public safety. The most convenient way to register for emergency alerts is through a web-based registration form. Some local and state governments use third-party web interfaces to manage the registration and account information for their citizens. It is important to evaluate the accessibility of these web-based registration processes for people with disabilities, since historically, people with disabilities have often not been considered when emergency information is communicated electronically to the public (Waterstone and Stein, 2006). When planning for emergencies, the technologies used by government and emergency responders (such as GIS mapping) often do not include any information about the location of people with disabilities, disability-related barriers, or organizations that serve people with disabilities (Enders and Brandt, 2007).

It has been informally reported that many emergency alert systems have sign-up processes that are inaccessible to many people with disabilities. In one high-profile instance, the home page of FEMA, the Federal Emergency Management Agency, had been inaccessible to blind people who use screen reader technology (Olalere and Lazar, 2011), but it has since been fixed. The purpose of this paper is to discuss the legal status of accessibility of web-based emergency alert sign-ups, and then evaluate all of the county-level emergency alert sign-ups of three states in the Northeastern US.

1.1 Background Literature on Web Accessibility

People with various disabilities often use different types of assistive technology to access web-based information. For instance, blind users may utilize a screen reader, which takes what appears on the computer screen and provides computer-synthesized speech output. Deaf or hard of hearing users may utilize captioning or transcripts instead of audio. People with motor impairments that limit use of their hands may use a keyboard but no pointing device (such as a mouse), may use an adaptive keyboard, or may use no keyboard at all, instead using speech recognition or head tracking to control their computer (Lazar, 2007). Web site designers are not expected to design different web site versions for each disability population, nor are they expected to add different features for each disability group. A set of international technical standards for making web sites accessible for people with disabilities, called the Web Content Accessibility Guidelines (WCAG), has been in existence since 1999. These technical standards cover all perceptual and motor impairments as well as some cognitive impairments, and are internationally considered the “gold standard” for making web sites accessible (Loiacono, Romano, and McCoy, 2009). Most countries have laws or regulations related to disability access to Internet content, which either are technically identical to the WCAG or are derived from the WCAG with only minor differences (Lazar and Wentz, 2012). For instance, in the United States, the US Access Board defines engineering specifications (and the related regulations) for disability access, including both physical architectural access and access to web content. The first versions of the technical specifications for US federal government web content, which are the regulations for Section 508 of the Rehabilitation Act, were derived from WCAG 1.0. A new version, WCAG 2.0, was officially released in 2008, and the US federal government is currently going through a rulemaking process to update the Section 508 regulations (Olalere and Lazar, 2011). In the most recent draft, the US Access Board has indicated that the new version of Section 508 (known as the “508 Refresh”) will refer directly to the international standard WCAG 2.0.

Technical guidelines for web accessibility have existed for over a decade, and there is currently a wealth of information available that explains to web developers how to make their web sites accessible. Furthermore, making web sites accessible is not technically hard to do, especially for the simple web-based forms typically used for emergency alert sign-ups. However, numerous studies have reported that U.S. government web sites, at state and national levels, are inaccessible (Fagen and Fagen, 2004; Jackson-Sanborn, Odess-Harnish, and Warren, 2002; Jaeger, 2006; Lazar et al., 2010; Loiacono, McCoy, and Chin, 2005; Olalere and Lazar, 2011; Yu and Parmanto, 2001). Governments around the world have had varying levels of success with web accessibility (Goodwin et al., 2011), and the approaches that seem to lead to higher levels of compliance include either massive automated monitoring of government web site accessibility (Mirri, Muratori, and Salomoni, 2011) or regular public posting of accessibility results (Gulliksen et al., 2010).

Given the gap between existing knowledge and technical ability on the one hand, and actual practice on the other, Vint Cerf, the president of the Association for Computing Machinery, even wrote an article asking “Why is Accessibility So Hard?” (Cerf, 2012). Numerous reasons have been presented as possible explanations for such a low level of government web accessibility. These include: a gap of almost 10 years in compliance activities at the federal level; no requirement to document activities related to accessibility compliance; clear technical guidelines but no guidelines related to process or procedures; and accessibility compliance responsibilities being added to the duties of government employees who already have full-time jobs, with no resources or time provided for compliance activities (Olalere and Lazar, 2011). Often, there is more expertise about IT accessibility at the federal and state levels than at local levels of government, such as towns, cities, and counties. Yet the average citizen interacts more often with their local government (for water bills, fire and ambulance service, public schools, public libraries, trash collection, etc.) than with their state or federal government (Lazar and Wentz, 2012). Although state IT accessibility can sometimes rival federal IT accessibility (Yu and Parmanto, 2011), at no point has there been any documentation of local (city or county) IT being superior in accessibility. So, in some ways, it is not surprising that local governments may have challenges in IT accessibility.

While there is a substantial body of published research about inaccessible web sites in general, there are no published studies about the accessibility of emergency-related information delivered electronically. The following section therefore draws on research relating to the more general topic of legal issues in providing emergency-related information to people with disabilities.

1.2 Background literature on emergency information access for people with disabilities

Both the Rehabilitation Act and the Americans with Disabilities Act (ADA) prohibit state and local governments in the US from discriminating against individuals with disabilities. Section 504 of the Rehabilitation Act bans “any program or activity receiving Federal financial assistance” from excluding equal participation by people with disabilities in funded programming (US Department of Justice, 2012). Title II of the ADA declares that “public services and programs must be accessible to people with disabilities” (US Department of Justice, 2008).

Consequently, courts have found that not including people with disabilities in disaster preparation and evacuation plans violates both those federal laws. For example, in two recent federal court cases, a California district court held that the City of Los Angeles violated both federal laws by failing to adequately serve the needs of some 800,000 individuals with disabilities through its emergency preparedness program (US Department of Justice, 2011), and a New York district court certified a class action against the City of New York on behalf of some 900,000 people with disabilities who were not sufficiently accommodated within disaster plans (US District Court, 2012).

Nevertheless, state and local governments have been sorely remiss in including people with disabilities in disaster preparedness. In April 2005, for instance, before Hurricanes Katrina and Rita, the National Council on Disability released a report that examined the disaster experiences of people with disabilities and concluded that access to emergency public warnings did not satisfactorily include individuals with visual or hearing impairments. The report noted specific examples of such failures, including the lack of closed captioning during the September 11 attacks, and underscored that, although emergency e-mail and wireless network alerts can be helpful, they were not being used (Frieden, 2005). This situation existed despite an Executive Order issued by then-President Bush requiring state and local governments to design and implement emergency evacuation plans for persons with disabilities (Lord and Stein, 2010).

The glaring gap in inclusive preparedness for the disability sector tragically manifested in grievous harm following Hurricanes Katrina and Rita. People with disabilities were not adequately warned of the impending disasters, were not taken sufficiently into account as part of emergency evacuation plans, and were not accommodated post-disaster in government-sponsored relief efforts (Waterstone and Stein, 2006). The federal government has subsequently responded to this egregious oversight by establishing an Interagency Coordinating Council on Emergency Preparedness and Individuals with Disabilities, periodic reviews of emergency preparedness by the Department of Homeland Security, and the creation by President Obama of a disability focal point position at FEMA (the Federal Emergency Management Agency). The change at FEMA is evident in the fact that, since 2010, the Office of Disability Integration and Coordination has grown from one disability coordinator to a staff of over 70 disability integration advisors working to improve coordination and communication for people with disabilities before, during, and after emergencies (Federal Emergency Management Agency, 2013). While FEMA is a U.S. federal-level agency, it is important to clarify that the emergency alerts referred to in this report are implemented at the local government level, not the federal level.

2. Research Methods

There are typically two different approaches for accurately evaluating the accessibility of web sites: expert inspections and user testing. Expert inspections involve accessibility experts using a structured method to inspect a series of web pages against guidelines. Usability (user) testing involves people with disabilities attempting to complete representative tasks. User testing is generally more effective for assessing accessibility, especially when the site focuses on performing transactions involving a series of steps, such as signing up for an e-mail account, submitting an employment application, purchasing from an e-commerce site, or signing up for a service. Expert reviews are more effective for assessing the compliance of individual web pages with laws, regulations, or guidelines. User tests involve attempting to complete tasks, while expert reviews involve checking code against every specific interface guideline.

Because signing up for emergency alerts involves a series of steps to complete a transaction, user testing would typically be the most appropriate evaluation method. However, there were a number of complications with using traditional usability testing for this evaluation project. In usability testing, users with disabilities would attempt to perform tasks, and it is always important to protect the anonymity of the participants: no one who participated in usability testing should have their participation publicly known. In most types of usability testing, the identity of participants is protected by using fake names, postal addresses, phone numbers, and e-mail addresses anytime that personal information is requested (Lazar, Olalere, and Wentz, 2012). However, most usability testing does not involve signing up for services on government web sites, where it could be legally problematic to provide false information. Furthermore, many of these emergency alert sign-ups are limited to residents of the respective counties or states, which means that someone who is not a county resident and does not have a work address in the county may not sign up for emergency alerts.

Therefore, this accessibility evaluation used a hybrid approach, combining aspects of a usability test and an expert inspection. Most emergency sign-up pages have multiple areas of content. Only the steps specifically involved in signing up for emergency alerts were evaluated in this project; the other content on each web site was not evaluated. The interface experts (researchers) could not actually sign up for the services using false identification, since that would potentially be against the law, so the evaluation process stopped short of submitting the registration information. A task-focused inspection (often known as a cognitive walkthrough) was conducted (Brajnik et al., 2012; Blackmon et al., 2002; Wharton et al., 1994), in which only the aspects related to the task were inspected, except that in this case, the researchers did not complete the task of actually submitting information.

A team of evaluators, trained in how to conduct accessibility evaluations, inspected the steps involved in signing up for an emergency alert to determine if those steps were compliant with Section 508 of the Rehabilitation Act. The researchers did not, however, conduct a full Section 508 inspection of the web pages. The focus of this project was identifying violations specific to the sign-up process, so the data collected in this project cannot determine whether the web pages as a whole are compliant with Section 508. The task-based inspection focused on the following research questions:

1. Is the link to the sign-up page accessible?

2. Are the descriptions accessible?

3. Are all of the form fields marked-up properly?

4. Are all buttons labeled/marked-up properly?

5. Is the notification for which fields are required accessible?

6. Is there a CAPTCHA? If so, is there an audio version?

7. Is any progress indicator accessible?

8. Are there any alternative means to register for the alert service?

For the inspections, the four most popular web browsers (IE, Chrome, Firefox, and Safari) were used. The following combinations of operating systems, web browsers, and screen readers were used for the evaluations:

• Windows 7 with Firefox 17 and JAWS 13 screen reader

• Windows 8 with Chrome 23 and JAWS 14 screen reader

• Windows 7 with Internet Explorer 9 and JAWS 13 screen reader

• Windows 7 with Internet Explorer 9 and JAWS 14 screen reader

• Mac OS 10.7.5 with Safari and VoiceOver screen reader

Blind users typically access web sites through the use of text-to-speech (or screen reader) software, which reads the content of a web page aloud to the user in a linear fashion. Examples of screen reader software include JAWS, Window-Eyes, NVDA, and VoiceOver. Screen reader software is the dominant method of access because Braille literacy is extremely low among blind people (Schroeder, 2006).

The researchers chose to evaluate the emergency alert sign-ups in three states: Maryland, Massachusetts, and New York. This was done because the researchers were based in Massachusetts and Maryland and also have partnerships in New York. The intent was to evaluate a sample of states in the northeastern part of the US and to be able to use the data to influence improvements in accessibility in those specific states. Furthermore, all three states are high-value targets for terrorist attacks (considering the April 2013 Boston Marathon bombing, as well as previous attacks in New York City and Washington, DC). There are a total of 100 counties (or county-equivalents) within these three states (24 in Maryland, 14 in Massachusetts, and 62 in New York). The process for selecting which counties should be included in the evaluations included the following steps:

1. Exclusion of any counties within the three states that did not have county-level sign-ups available to the public (as of October 2012)

2. For the remaining counties that did have county-level sign-ups, exclusion of any counties where the sign-up processes were not web-based (such as PDF or e-mail only)

3. For the counties that did have web-based sign-ups, exclusion of any counties where the web site sign-up process (or its links) was not functional (as of October 2012)

This selection process resulted in 26 emergency alert sign-up processes being evaluated, each by six evaluators (for a total of 156 individual evaluations).

In contrast to the human evaluations that took place in this study, evaluations conducted with automated software tools (such as Deque WorldSpace, Odellus ComplyFirst, or SSB Technologies InFocus) might be able to point out where potential violations exist. However, automated evaluations are often not as accurate as human evaluations, primarily because the automated tools cannot determine whether the context is appropriate (e.g., that “button one” is not descriptive alternative text for a graphic). Multi-stage human inspections of web pages, involving screen readers (such as JAWS or VoiceOver), are considered to be the most accurate form of accessibility evaluation (Mankoff et al., 2005), and that accuracy further increases when multiple individuals evaluate the same interfaces and then combine their results into a meta-evaluation (Lazar et al., 2010). Screen readers, while designed primarily for blind users, are very helpful in identifying accessibility violations because they help point out where web page components are not accessible for keyboard-only use, which also impacts how people with motor impairments utilize web pages. Furthermore, in these evaluations, code inspections of the web pages were performed whenever clarification was needed.
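To illustrate the context problem, both of the following image buttons would pass a typical automated check, because alternate text is present on each; only a human evaluator would flag the first as meaningless in context (the markup below is illustrative, not taken from any of the evaluated sites):

<!-- Passes an automated alt-text check, but is useless to a screen reader user -->
<input type="image" src="btn1.png" alt="button one">

<!-- Passes the same automated check and is genuinely descriptive -->
<input type="image" src="btn1.png" alt="Submit emergency alert registration">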

When each evaluator completed their individual evaluation of the emergency sign-up processes, the evaluators met as a group and compared their results. The approach of using multiple human evaluators to produce one meta-evaluation increases the reliability and accuracy of the inspection and has been previously utilized in many web accessibility evaluations (Lazar et al., 2010; Lazar et al., 2012; Lazar et al., 2011; Wentz et al., 2012). If there was a disagreement in the results comparison, the group of evaluators re-visited the sign-up process in question and formed a consensus on the presence or nature of a violation. Due to the dynamic nature of web-based content, it is important to note that these evaluations took place in November and December 2012. It is possible that the accessibility of the web-based content and sign-ups has changed since December 2012.

The highlighted violations that resulted from the evaluations are based on the 16 guidelines set forth in Section 508 of the US Rehabilitation Act (1194.22), identified as paragraphs “A” through “P”, that focus on web site accessibility. Most US states have technical guidelines that are identical to the technical guidelines in Section 508 (although the legal requirements and remedies may differ). Table 1 lists only the four “paragraphs” of the Section 508 guidelines that were violated by the interfaces evaluated and provides a short description of each guideline (note that the descriptions are from Lazar et al., 2010, not from the law).

Table 1. Description of the Four Paragraphs of the Section 508 Web Accessibility Guidelines that were violated by the Emergency Sign-up Forms

|(A) Text Equivalent (have a text equivalent for any graphical elements) |

|(C) Use of Color (color should not be used as the only method for identifying elements of the web page or any data) |

|(L) Scripting Languages (make sure that equivalents for any non-accessible scripting are included, e.g., for those who are not using pointing devices) |

|(N) Online Electronic Forms (all forms must be properly labeled and accessible) |
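As a brief illustration of paragraph (N), a form field is properly labeled when its visible prompt is programmatically associated with the input, so that a screen reader announces the prompt when the field receives focus. The markup below is illustrative, not taken from any of the evaluated sites:

<!-- Announced as "Mobile phone number (required), edit" by screen readers -->
<label for="phone">Mobile phone number (required):</label>
<input type="text" id="phone" name="phone">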

It is important to note that a number of the web sites use an external service provider or software vendor for their emergency alert sign-ups. That is, the emergency alert web pages were not developed in-house by the county government but rather were acquired through a government procurement process. Examples of common interfaces/systems that were used (along with the number of counties included in this study that utilized them) include:

• Cassidian Communications (2)

• CODERED (7)

• Connect-City (Blackboard product) (1)

• CooperNotification (3)

• Everbridge (4)

• Hyper-reach (3)

• Nixle (1)

3. Results

Of the 26 emergency notification sign-up forms that were evaluated, 21 had one or more accessibility violations during the sign-up process. The aspect of the emergency notification sign-up process with accessibility violations on the most sites was the indication of which form fields are required (14 sites, as illustrated by the fifth column in Table 2). The use of color alone (par. C), inaccessible scripts (par. L), and poorly labeled form components (par. N) were the primary problems related to the required sign-up fields. Inaccessible progress indicators were the second most common violation (13 sites, the sixth column in Table 2). The lack of adequate alternate text (par. A) was the primary problem with the graphics-based progress indicators. There were also problems with general form field accessibility (11 sites, as illustrated by the fourth column in Table 2). A lack of alternate text for graphical form components (par. A) and poorly labeled form fields (par. N) were the primary problems with general form field accessibility.

Table 2 provides a matrix of which sites had Section 508 paragraph violations in the various aspects of the emergency alert sign-up process. Allegany County (MD), Charles County (MD), Westchester County (NY), Wicomico County (MD), and the state-wide New York site were the only sites that had no accessibility violations related to their emergency alert sign-up process.

Table 2. Matrix of Section 508 Paragraphs Violated by Sign-up Process Components (26 counties evaluated; population = 100 counties)

|County Name: |Sign-up Link |Directions |Form Fields |Required Fields |Progress Indicator |Buttons |CAPTCHA |
|Dukes Co., MA | |N |A |L |A |A | |
|Plymouth Co., MA | |N |A |L |A |A | |
|Worcester Co., MA | | | | |A | | |
|Allegany Co., MD |No violations |
|Anne Arundel Co., MD | |N |A |L |A |A | |
|Baltimore Co., MD | | |L | | | | |
|Calvert Co., MD | |N |A |L |A |A | |
|Cecil Co., MD | |N |A |L |A |A | |
|Charles Co., MD |No violations |
|Dorchester Co., MD |A | | | | | | |
|Frederick Co., MD | | | | |A | | |
|Harford Co., MD |A | | |C |A | | |
|Howard Co., MD | | |N |N | | | |
|Montgomery Co., MD | | |N |N | | | |
|Prince George’s Co., MD | | |N |N | | | |
|Washington Co., MD | | | | |A | | |
|Wicomico Co., MD |No violations |
|Franklin Co., NY | | | |C | | | |
|Greene Co., NY |A |N |A |L |A |A | |
|Suffolk Co., NY | |N |A |L |A |A | |
|Tioga Co., NY |A | | |C | | | |
|Wayne Co., NY |A | | |C | | | |
|Westchester Co., NY |No violations |
|NYC Area | | | | |A | | |
|State-wide NY site (used by other NY counties) |No violations |

Cassidian Communications designed the interface for Baltimore County (MD) and Wicomico County (MD). The CODERED system was used by Anne Arundel County (MD), Calvert County (MD), Cecil County (MD), Dukes County (MA), Plymouth County (MA), Greene County (NY), and Suffolk County (NY). Connect-City from Blackboard was used only by Harford County (MD). CooperNotification was used by Howard County (MD), Montgomery County (MD), and Prince George’s County (MD). Everbridge was used by Frederick County (MD), Washington County (MD), Worcester County (MA), and the Boston area. Hyper-reach was used by Franklin County (NY), Tioga County (NY), and Wayne County (NY). Nixle was used only by Allegany County (MD). Charles County (MD), Dorchester County (MD), the New York City area, Westchester County (NY), and the state-wide site (used by other counties in NY) all appeared to use some type of proprietary interface design. The most widely used product (CODERED) was also the interface with the most accessibility violations, and its vendor claims it is in use by thousands of users in all 50 US states (Emergency Communications Network, 2012).

One example of an accessibility violation that impacted all seven sites that used the CODERED interface was the “continue” and “edit” buttons used during the registration process, which lacked alternate text (the continue button would be read as “continue-ns.png” to screen reader users). Another example of a violation on the CODERED interface was the progress indicator at the top of the screen, which lacked alternate text to indicate the progress status to non-sighted users. Figure 1 shows a screenshot of this progress indicator.


Figure 1. The Inaccessible Progress Indicator on the CODERED Interface
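Assuming the CODERED buttons are graphical submit buttons (the exact vendor markup may differ), the fix for the button problem described above is simply to supply meaningful alternate text:

<!-- Without alt text, screen readers fall back to reading the file name, "continue-ns.png" -->
<input type="image" src="continue-ns.png">

<!-- With alt text, the button is announced as "Continue" -->
<input type="image" src="continue-ns.png" alt="Continue">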

One example of an accessibility violation that impacted the Baltimore County interface (designed by Cassidian Communications) was the inaccessible pop-up calendar script that is illustrated by Figure 2.


Figure 2. The Inaccessible Calendar Script from Cassidian Communications
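Paragraph (L) does not forbid scripting; it requires an accessible equivalent. A common remedy, sketched below (the openCalendar function stands in for whatever script the vendor actually uses), is to accept typed dates so that the pop-up calendar becomes a convenience rather than the only input method:

<label for="eventDate">Date (MM/DD/YYYY):</label>
<input type="text" id="eventDate" name="eventDate">
<!-- The scripted calendar is now optional; keyboard-only users can type the date -->
<button type="button" onclick="openCalendar('eventDate')">Choose from calendar</button>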

The CooperNotification system violations included text fields on the second page of the sign-up process that were not properly labeled and read as “secondary text field.” Also, when a required field was missed, the form reloaded, but only a sighted user would readily notice the notification that appears at the top of the page. The notification used to indicate which fields are required was inaccessible, since the focus for screen reader users was once again at the beginning of the form (just below the critical information regarding required fields). Figure 3 shows a screenshot of this accessibility violation.


Figure 3. Required Form Field Violation on CooperNotification Interface
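One way to repair this pattern, assuming the vendor can adjust the reloaded page, is to mark the notification as an alert and move keyboard focus to it, so that screen reader users encounter the message first. The markup and script below are an illustrative sketch, not the vendor's actual code:

<div id="form-errors" role="alert" tabindex="-1">
  Please complete the required fields highlighted below.
</div>
<script>
  // Move keyboard and screen reader focus to the error summary when the page reloads with errors
  document.getElementById("form-errors").focus();
</script>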

The proprietary sign-up form for the New York City area had a violation relating to an inaccessible progress indicator (missing alternate text). On the interface for Hyper-reach (used by three counties in New York), the only indication that fields were required was through the use of red text, as indicated by Figure 4.


Figure 4. Required Form Field Violation on Hyper-reach Interface
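The remedy for a color-only indication is to pair the color with explicit text, which is available to screen reader users and colorblind users alike (illustrative markup, not the vendor's actual code):

<!-- Color alone: a violation of paragraph (C) -->
<label for="email" style="color: red;">E-mail address</label>

<!-- Color plus explicit text: accessible to all users -->
<label for="email" style="color: red;">E-mail address (required)</label>
<input type="text" id="email" name="email">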

There were several examples of overall poor design that would cause accessibility problems without constituting actual violations. For example, the Cassidian Communications sign-up forms for Baltimore and Wicomico Counties (MD) give no indication that an asterisk marks required form fields (other than a visible red color) until the end of the form is reached. Figure 5 shows a screenshot of the Baltimore County sign-up form.


Figure 5. Screenshot of the Baltimore County Sign-up Form

Washington County (MD) and Worcester County (MA) provide an audio CAPTCHA; however, the CAPTCHAs appeared to be proprietary (designed for use with the Everbridge system) and were extremely difficult to use. Tioga and Wayne Counties (NY) used the Hyper-reach system, which had a graphical form submit button with alternate text but no label, causing it to not work properly with some web browser/screen reader combinations (such as Internet Explorer and JAWS). The New York City area notification sign-up had a form with items (such as the combo box for SMS carrier) in the wrong tab order, which could cause problems for screen reader users. Suffolk County (NY) provided two sign-up links, but the first (graphical) link did not work. Table 3 shows a matrix of the violations that were consistently seen on third-party interfaces used by various counties/states. The interfaces implemented by Cassidian Communications and Nixle were the only ones that did not appear to have a consistent Section 508 violation.

Table 3. Matrix of Section 508 Paragraphs Consistently Violated

|System Name: |Sign-up Link |Directions |Form Fields |Required Fields |Progress Indicator |Buttons |CAPTCHA |
|CODERED | |N |A |L |A |A | |
|Connect-City from Blackboard | | | |C |A | | |
|CooperNotification | | |N |N | | | |
|Everbridge | | | | |A | | |
|Hyper-reach | | | |C | | | |
|Nixle | | | | | | | |

It should be noted that several sites provided users with an alternate means of signing up for emergency alerts. Howard County and Prince George’s County (MD) allowed users to sign up with a Google, Yahoo, or OpenID account, and Montgomery County (MD) provided users with an option to sign up via text message. The Boston area, the New York City area, Suffolk County (NY), and the state-wide New York site all provided a phone number for users who wish to sign up over the phone.

4. Discussion

4.1 Technical Implications

From a technical point of view, all of these emergency alert sign-ups are simple web-based forms. The coding is not complex, and the accessibility solutions are not complex. Unlike, for example, a complex web accessibility challenge such as creating accessible equivalents of geo-spatial interactive maps for blind users (Weir et al., 2012), accessibility solutions for web-based forms are relatively easy. For the progress indicator used by the web sites running the CODERED system (the screenshot presented earlier as Figure 1), the markup is essentially the following:

<img src="1.png" alt="1"> PROVIDE

<img src="2w.png" alt="2"> VERIFY

<img src="3w.png" alt="3"> SUBMIT

Note that there are really two problems with this code. The images are identified only with alternative text of “1”, “2”, or “3”, so there is no way for a screen reader user to determine which page they are on. The current page can be seen only in the names of the images: “1.png” is a darkened image, while “2w.png” and “3w.png” are lighter, and the fact that an image is darkened is the only representation that you are on that specific step. To make the code accessible, it could be changed to something like the following:

<img src="1.png" alt="Step 1 of 3, Provide (current step)"> PROVIDE

<img src="2w.png" alt="Step 2 of 3, Verify"> VERIFY

<img src="3w.png" alt="Step 3 of 3, Submit"> SUBMIT

To provide another example from the most commonly used interface evaluated, one of the first screens on the CODERED interface provides a section for users to select the types of alerts that they wish to receive. Under “Alert Types” there is a choice for “Emergency Notifications” and also “General Notifications.” “Emergency Notifications” is selected by default, and the user has the option of selecting or deselecting “General Notifications.” While this seems like an obvious and non-problematic design by all visual appearances (refer to Figure 6), a screen reader user only hears the option to select “General Notifications,” and only hears the alternate text “Receive community notifications” next to the text for “Emergency Notifications,” with no indication that the box in the graphic depicted is already checked. The current alternate text for that graphic, alt="Receive community notifications", could easily be modified to more accessible alternate text such as alt="Emergency Notifications (selected by default)" or similar. This is a good example of an instance where an automated accessibility evaluation tool would consider this code to be accessible (since the graphic DOES have associated alternate text), when it is really not accessible in its context.


Figure 6. Screenshot of the Alert Types Selection Problem on CODERED
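A more robust repair than rewording the alternate text would be to replace the graphic with a real, pre-checked form control, so that the checked state is conveyed by the control itself rather than by a picture of a checkbox (a sketch, assuming the vendor can substitute native controls here):

<!-- The checked state is now announced automatically by the screen reader -->
<input type="checkbox" id="emergency" name="alertTypes" value="emergency" checked disabled>
<label for="emergency">Emergency Notifications (always included)</label>

<input type="checkbox" id="general" name="alertTypes" value="general">
<label for="general">General Notifications</label>

Because disabled controls are not submitted with the form, the server would treat emergency notifications as always on in this design.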

As another example not previously mentioned in the paper, the form used to complete registration for the alerts in Howard County, Maryland (Figure 7) requires that you provide a home address.


Figure 7. Screenshot of the NotifyMeHoward Sign-up Form, Requiring a Home Address

However, when you are listening to the page using a screen reader and you reach the edit box for the form field (which requires a home address), the screen reader says “supplementary text field,” giving no indication of what the edit box is asking for (although an expert user might be able to navigate around the page and determine that the box is located near text for a home address). The page code amounts to a text input whose only exposed description is “supplementary text field,” along the lines of the following (the attribute and field names here are illustrative; the exact vendor markup may differ):

<input type="text" name="homeAddress" title="supplementary text field">

Furthermore, although the home address field is required, the fact that the field is announced as “supplementary” strongly hints that the field is not required. Yet both of these problems could be resolved quickly by changing that one line of code to something like:

<input type="text" name="homeAddress" title="Home Address (required)">
