How could smartphones be easier to use when we can’t look at the screen?

Touchscreens are increasingly ubiquitous in our lives, not just in the smartphones and tablets that we use almost unthinkingly, but also when we pay for our shopping at a self-service checkout, when we buy a train ticket, and in any number of other situations.

For many people and in many situations touchscreens are the easiest way to interact with a device, but what about when we can't easily look at the device, or if the user is visually impaired? We are interested in exploring the ways in which auditory and tactile cues can help to solve these problems, and how they affect the user experience.

Our team at QMUL has previously developed non-visual access to diagrams, including tube maps, circuit diagrams and mind maps, allowing sighted and visually-impaired colleagues to work together more effectively. Our current research looks at developing audio production software that is accessible to visually-impaired audio production specialists and musicians, potentially opening up whole new careers to people previously denied the opportunity by the limits of technology.

In a recent preliminary study we wanted to find out how sighted users' experiences of navigating menus on a touchscreen they couldn't see were affected by audio-only versus audio-tactile feedback. Using a Samsung Galaxy Note device, participants were asked to navigate a series of menus using touch-based gestures and taps, much as they would when using a touch device in the real world. In the audio-only condition the device read out the menu labels to the user; in the audio-tactile condition there was an additional vibration whenever the user touched a menu item.
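For readers curious how this kind of feedback might be wired up on a modern Android device, here is a minimal sketch in Kotlin using the platform's standard TextToSpeech and Vibrator APIs. It is not the study's actual software: the class and method names are our own, and the vibration duration is an illustrative guess.

```kotlin
import android.content.Context
import android.os.Build
import android.os.VibrationEffect
import android.os.Vibrator
import android.speech.tts.TextToSpeech
import java.util.Locale

// Hypothetical helper: speaks a menu label when the user's finger lands on a
// menu item, and (in the audio-tactile condition) adds a short vibration pulse.
class NonVisualMenuFeedback(context: Context, private val tactileEnabled: Boolean) {

    private val tts = TextToSpeech(context) { status ->
        if (status == TextToSpeech.SUCCESS) {
            // Pick a language for the spoken menu labels.
            // (Locale.UK is just an example choice.)
        }
    }

    private val vibrator =
        context.getSystemService(Context.VIBRATOR_SERVICE) as Vibrator

    // Call this whenever the finger touches (or moves onto) a menu item.
    fun onItemTouched(label: String) {
        // Audio feedback: read out the menu label, interrupting any previous speech.
        tts.speak(label, TextToSpeech.QUEUE_FLUSH, null, label)

        // Tactile feedback: a brief vibration pulse (duration here is arbitrary).
        if (tactileEnabled) {
            if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
                vibrator.vibrate(
                    VibrationEffect.createOneShot(30, VibrationEffect.DEFAULT_AMPLITUDE)
                )
            } else {
                @Suppress("DEPRECATION")
                vibrator.vibrate(30)
            }
        }
    }

    // Release the speech engine when the menu screen is destroyed.
    fun shutdown() = tts.shutdown()
}
```

A caller would construct this once per screen, with `tactileEnabled` switching between the audio-only and audio-tactile conditions, and invoke `onItemTouched(label)` from the menu view's touch handling.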

Interestingly, and in contrast to similar tests done elsewhere, the addition of tactile feedback seems to have slowed users down in our test. We're not yet sure why this is, and the slower completion times weren't reflected in participants' own ratings, which judged the effort taken to be equal in both conditions, but we hope that further, wider studies might shed some light. This type of subjective 'cognitive load' measurement is often overlooked in other tests, and the contrast in usability between different methods would be an interesting topic for future study, particularly when users have other things, often other screens, competing for their attention.

This was a limited preliminary study, but it threw up some thought-provoking questions about when different types of feedback might be most useful to users. How can we improve the user experience of sighted people by making use of their other senses? How can we best configure touchscreens in public so that visually-impaired users can use them unhindered? And what applications can we find for touchscreens that allow greater collaboration between sighted and visually-impaired colleagues?

More details can be found in this paper, which won the Best Short Paper Award at the 2014 British Human-Computer Interaction Conference:

Metatla, O., Martin, F., Stockman, T., and Bryan-Kinns, N., Non-Visual Menu Navigation: The Effect of an Audio-Tactile Display. Proceedings of the BCS HCI Conference, Southport, UK, 2014.

[ This post was originally written by Nick Bryan-Kinns, co-author of the paper referred to in this post ]