Summer Internship @ Google, Inc.: Accessibility Experiences


By A. D. Shaikh, Wichita State University
P. Strain, Queens University Belfast, UK

Note: This article is based on a paper accepted for CHI 2007 (Strain, Shaikh, & Boardman).

Summary. This paper summarizes some of the major lessons learned about conducting usability tests with visually impaired participants while working as interns at Google, Inc. The lessons were in four major areas: (1) recruitment and scheduling, (2) preparing the usability lab for testing sessions, (3) using think-aloud protocol with screen readers, and (4) helping observers to get the most out of the test sessions.

The term "accessibility," when applied to websites, means ensuring that users with disabilities have a user experience equivalent to that of users without disabilities. A variety of guidelines have been published to help website developers achieve accessible websites, such as W3C WCAG 1.0 [8] and Section 508 [7]. However, studies have shown that a website can be compliant with such guidelines yet still have accessibility issues [2, 4]. For example, blind people using screen readers rely on image links having "alt text" to describe their function. This is a key web accessibility guideline. However, if the alt text does not clearly describe the target of the link, a blind user can still have difficulties (see examples in Table 1). Such cases have contributed to the recognition of the need to test websites with disabled populations, and the British Standards Institution (BSI) has released a Publicly Available Specification (BSI PAS 78) [1] that emphasizes the need for user testing.

Table 1. Examples of ALT text that could be more informative and therefore result in a better experience for screen reader users.

Image: Wichita State University
URL: http://www.wichita.edu/thisis/ (December 18, 2006)
Current ALT text: Wichita State University
Suggested ALT text: Bigger opportunities, bigger experiences, bigger rewards. At Wichita State, it's all within reach.

Image: the really big sale
URL: http://www.gap.com (December 18, 2006)
Current ALT text: the really big sale
Suggested ALT text: The really big sale. Up to 50% off select styles. In stores and online. Limited time only.
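The second example in Table 1 could be expressed in markup roughly as follows. This is an illustrative sketch: the image filename is hypothetical, and the actual Gap markup is not reproduced here.

```html
<!-- Illustrative sketch; the filename "sale-banner.gif" is hypothetical. -->

<!-- Current: the alt text names the image but omits the message it carries. -->
<a href="http://www.gap.com">
  <img src="sale-banner.gif" alt="the really big sale">
</a>

<!-- Suggested: the alt text conveys the full content of the image, so a
     screen reader user hears the same message a sighted user sees. -->
<a href="http://www.gap.com">
  <img src="sale-banner.gif"
       alt="The really big sale. Up to 50% off select styles. In stores and online. Limited time only.">
</a>
```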

This paper describes some of the authors’ experiences as interns at Google, Inc. We were both assigned to spend time on accessibility issues encountered by visually impaired users. In this paper, we discuss some of our experiences recruiting participants, steps taken to prepare the usability lab, challenges faced by the team, and steps taken to make the most of usability testing sessions.

Recruiting Participants

In order to recruit visually impaired participants, we initially made phone contact with local centers that offer support services for visually impaired people. After establishing an initial contact, we sent a screening questionnaire by email along with a brief introduction letter. The contact for each center then forwarded our email to potential participants. Within a few days of contacting the centers, we started receiving completed questionnaires. We found that many recipients of the initial email kindly forwarded it on to their contacts who met our criteria as well. Our use of snowball sampling resulted in 30+ completed questionnaires and provided us with a variety of participants to choose from. Although the pool of potential participants was skewed toward experienced computer users, respondents varied in terms of age, ethnicity, length of blindness, and time using screen readers.

We then personally scheduled each participant for their on-campus usability session. On-campus testing was conducted in order to allow team members from the various Google properties to attend the usability testing sessions. Once scheduled for a session, participants received a confirmation letter stating their scheduled testing time, compensation, and directions to the testing facility. The consent form was also included as an MS Word attachment.

Lessons Learned:

  1. Use the same person to schedule and greet all participants. This seemed to result in increased comfort level for the participants.
  2. Send out the screening questionnaire as text embedded in the initial email and as a MS Word document attachment.
  3. Briefly explain the goals of the study in the initial email.
  4. Make all equipment requirements clear from the beginning and include questions in the screener to verify use of required equipment (e.g., if you are looking for JAWS users, ask participants whether they use JAWS).
  5. Use the screening questionnaire to collect typical demographic information – this saves time during the actual test session.
  6. Many of the participants we reached through our email campaign were long-time computer users. If more novice computer users are required, an alternative recruitment method may be needed.
  7. Send information about the testing session as plain text or an MS Word file, and include text-based directions to the testing facility.
  8. If conducting on-site test sessions, provide participants with travel compensation.
  9. Allow ample time for each test session. Screen reader users often require more time than sighted users. Additionally, plan on providing breaks.

Preparing the Usability Lab for Test Sessions

All equipment was tested before each session. Once participants were in the testing room, we allowed them to customize the screen reader (JAWS), the keyboard, and other computer settings to mimic their typical set-up as closely as possible. Some users added tactile markers to specific keys on the keyboard. One user requested a split keyboard. The most common adjustment made by participants was to increase the voice rate of JAWS. Speaker volume was tested for each participant and adjusted as needed.

Besides computer-related adjustments, some participants asked the facilitator to sit on their left or right. These requests were typically related to the preferred location of their guide dog or to a hearing-related impairment. We also found that participants did not want us to talk to or pet their guide dog while he/she was "working".

Lessons Learned:

  1. When scheduling each participant, ask if he/she modifies the keyboard and software in any way and be prepared to make such modifications.
  2. Have a supply of removable, reusable adhesive putty on hand in the testing room. This works well as a tactile marker that is easy to remove at the end of the session.
  3. Ensure that the testing room has ample room for the guide dog to sit under the table or on one side of the participant.
  4. In the event a facilitator is allergic to dogs, ask participants at the time of scheduling if they will be accompanied by a guide dog.
  5. Be flexible in terms of seating arrangements for the facilitator and the participant. Keeping the computer table clutter-free makes rearranging equipment quick and easy.
  6. Ask first, pet later! Make sure to ask before you pet the guide dog.

Facilitating Effective Moderation through Think-Aloud Protocol

Screen readers "read out" the content and structure of a website. This leads to a significant challenge for the facilitator since the screen reader audio interferes with any dialogue between facilitator and participant. Perceptual studies have shown that it is possible for humans to deal with two voices at once (the so-called "cocktail party effect"); however, due to cognitive limitations people often have a difficult time talking and listening at the same time [5]. Figure 1a shows the audio flow for each member of the testing session when testing sighted participants; Figure 1b illustrates the different audio flows when testing visually impaired participants using screen readers. In our experiences, the participant and facilitator were in the same room while the observer was in a separate room. The only two-way audio flow occurred between the participant and the facilitator. As the diagrams show, the addition of the screen reader increased the overall "noise" experienced in the usability testing session.

To facilitate the interaction between the participant, facilitator, and screen reader, existing think-aloud strategies were modified. At the beginning of each session, the standard think-aloud protocol was explained to participants. The facilitator then acknowledged the possibility of conflicts between simultaneously thinking aloud and listening to the screen reader audio. We experimented with three variations of the think-aloud protocol during the sessions. These variations are discussed in detail in the CHI 2007 paper (Strain, Shaikh, & Boardman).

Lessons Learned:

  1. The screen reader dominates the audio output of test sessions.
  2. It is easy for participants to forget to think aloud when they are busy listening to the screen reader audio.
  3. The standard think-aloud protocol must be modified when testing visually impaired participants using screen readers.

Figure 1a. Audio flows commonly experienced in standard usability testing of sighted users.

Figure 1b. Audio flows experienced in usability testing of visually impaired users using a screen reader.

Helping Observers Observe!

In standard usability testing, observing is a fairly easy task. However, we found that team members who were unfamiliar with screen readers had a hard time following the screen reader terminology and audio output. In our studies, the participant’s screen was projected into the observation room along with the audio of the screen reader and conversations between the facilitator and participant.

Observers frequently commented that it was difficult to keep up with the participant because they could not follow the screen reader's synthesized speech and commands, which are often much faster than normal speech. By default, a screen reader will "read" at a rate of around 180 words per minute (wpm); however, some users increase this to as fast as 300 wpm [9]. In contrast, the normal rate of speech is approximately 150 wpm [6].

In addition, experienced screen reader users often rely on a complex series of keystrokes (‘H’ to move through headers; ‘Enter’ to enter forms mode; ‘Insert–F7’ to get a list of links) to navigate interfaces. However, observers (who were not screen reader users themselves) found such keystrokes very hard to keep up with during the study. Furthermore, if a shortcut command resulted in an error message such as "No Headers Found," observers unfamiliar with screen readers could not understand what was happening. Many software developers have had little exposure to screen reader usage; this is an ongoing education challenge for the accessibility team at Google.
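The 'H' shortcut, for instance, only works when a page uses real heading markup; a section title styled with a generic element and CSS alone produces the "No Headers Found" message. A minimal sketch (the class name is hypothetical):

```html
<!-- A visually styled title that the screen reader's 'H' key cannot find
     ("No Headers Found"); the class name is hypothetical. -->
<div class="section-title">Search Results</div>

<!-- A real heading that screen reader users can jump to with 'H'. -->
<h2>Search Results</h2>
```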

Two ways we found to help observers get the most out of each session were to (1) provide a "screen reader interpreter" and (2) highlight activity on the screen. The screen reader interpreter was an additional researcher, literate in screen readers, who stayed in the observation room to explain common shortcuts. The interpreter helped the observers (mostly programmers, designers, and project managers) understand the common usage patterns of the visually impaired participants. For example, when participants repeatedly searched for headers to navigate a page, the interpreter was able to explain what the screen reader shortcut achieved and how the site could be changed to make use of headers. Additionally, highlighting activity on the screen allowed observers to easily follow the tabbing pattern of participants as they moved through the page. To highlight screen activity, prototypes were modified so that the currently selected link was highlighted.

Lessons Learned:

  1. Observing screen reader users is cognitively taxing.
  2. When planning usability sessions for visually impaired users, add a screen reader interpreter to your regularly scheduled researchers. This person should stay in the observation room.
  3. When testing prototypes, highlight activity onscreen, for example by using the CSS a:active selector, which creates a highlighted box around active links.
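The highlighting in lesson 3 above might be sketched as follows. The exact rules used in our prototypes are not reproduced here, and the colors are arbitrary; adding a:focus alongside a:active also highlights links as keyboard users tab to them.

```html
<style>
  /* Hypothetical sketch: draw a visible box around the link the participant
     has reached, so observers can follow the tabbing pattern on the
     projected screen. */
  a:active,
  a:focus {
    outline: 3px solid #ffcc00;   /* thick outline around the link */
    background-color: #ffffcc;    /* light highlight behind the link text */
  }
</style>
```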

CONCLUSION

During our summer internships at Google, Inc., we worked on many Google properties with the accessibility team. We both had the opportunity to conduct several sessions of on-site usability testing with local visually impaired participants. Based on our experiences, we have offered tips for recruiting, scheduling, and interacting with participants. Time must be spent with each participant to prepare the usability lab to meet individual needs. During the testing sessions, we developed strategies for working the think-aloud protocol into the session while accommodating the additional noise of the screen reader. The traditionally "easy" role of the observer becomes complicated if he/she is not screen reader literate. We found that including a screen reader literate researcher in the observation room and highlighting on-screen activity were both beneficial. Our work led to improvements in several Google products and increased overall awareness among programmers and designers of the issues faced by this population of users.

REFERENCES

[1] BSI PAS 78, Guide to Good Practice in Commissioning Accessible Websites, http://www.bsi-global.com/ICT/PAS78/index.xalter

[2] Ivory, M. and Chevalier, A., "A Study of Automated Web Site Evaluation Tools," Technical Report UW-02-10-01, University of Washington, Department of Computer Science and Engineering, 2002.

[3] JAWS screen reader, http://www.freedomscientific.com

[4] Jim Thatcher, "What not to do", http://www.jimthatcher.com/whatnot.htm

[5] Kemper, S., Herman, R. and Lian, C., The Costs of Doing Two Things at Once for Young and Older Adults: Talking While Walking, Finger Tapping, and Ignoring Speech or Noise, In Psychology and Aging, 2003, Vol. 18, No. 2, 181-192.

[6] Pashek, G. and Brookshire, R., Effects of Rate of Speech and Linguistic Stress on Auditory Paragraph Comprehension of Aphasic Individuals, In Journal of Speech and Hearing Research, Vol.25 377-383, 1982.

[7] Section 508 of the U.S. Rehabilitation Act of 1973, http://www.section508.gov

[8] W3C Web Content Accessibility Guidelines 1.0, http://www.w3.org/TR/WAI-WEBCONTENT

[9] WebAIM: Designing for Screen Reader Compatibility, http://www.webaim.org/techniques/screenreader/

[10] Zhiwei Guan, Z., Lee, S., Cuddihy, E. and Ramey, J., The validity of the stimulated retrospective think-aloud method as measured by eye tracking, In Proc. of CHI’06, pp. 1253 – 1262, ACM Press, 2006.

Posted in Usability News