Design Patterns for Inclusive Collaboration (DePIC) – 2012/2015

Sponsored by EPSRC / Role: Researcher Co-Investigator

DePIC aims to develop new ways for people to interact with each other using different senses, so reducing barriers caused by visual and other sensory impairments.

Our interaction with the world around us relies on perception, which exploits combinations of the senses we have available to us; for instance, when we both see and hear someone speaking, we associate the spoken words with the speaker. Enabling people to use combinations of senses becomes critical in situations where people who have different senses available to them interact with each other. These differences can arise because of temporary or permanent sensory impairment, or because of the technology being used. However, very little research has examined how people combine and map information from one sense to another, particularly for individuals with sensory impairments, and then used such mappings to inform the design of technology that makes collaboration easier.

The aim of this multi-disciplinary project is to develop new ways for people to interact with each other using different combinations of senses. This will reduce barriers to collaboration caused by sensory impairment, and improve social and workplace inclusion by optimising the use of available senses.

DePIC aims to combine empirical studies of mappings between senses with participatory design techniques to develop new ideas for inclusive design grounded in Cognitive Psychology. We will capture these design ideas and mappings in the form of Design Patterns and demonstrate their usefulness through the development of interactive systems to support assisted work, living, and leisure.

Collaborative Cross-modal interfaces (CCmI): Accessible Diagrams for Workplace Inclusion – 2010/2012

Sponsored by EPSRC Research in the Wild / Role: Researcher Co-Investigator


Collaboration is a fundamental form of human interaction. However, software tools which support collaboration assume that all collaborators have access to the same sets of modalities. For example, audio-video conferencing systems assume all participants can see and hear the outputs of the system, and shared whiteboards assume equal access to the visual modality. This disadvantages users with differing access to sensory channels due to their context and abilities. It is a critical problem for distributed team work where participants are likely to use different communication technologies to collaborate (e.g. mobile teamwork) as well as co-located work groups involving elderly participants or participants with perceptual impairments (e.g. visual impairments). 

The CCmI project will refine and adapt my PhD work to demonstrate its utility for improving the accessibility of collaboration in real-world scenarios. The challenge is to design support for collaboration where participants have differing access to modalities; we refer to these situations as cross-modal collaboration. We are working with the Royal National Institute of the Blind [RNIB] and the British Computer Association of the Blind [BCAB] to explore how to support visually impaired and sighted co-workers in the context of editing diagrams in the software engineering workplace. [MORE] on the CCmI Home page.

Collaborating Through Sounds: Audio-only Interaction with Diagrams – 2006/2010

Sponsored by an MESRS PhD Scholarship

The research that I conducted during my PhD investigated aspects of diagram accessibility when using sound as the only means of interaction. The research aimed to empirically explore the potential of using auditory display for accessing and manipulating graphically represented information, in both individual and collaborative usage scenarios.

In other (simpler) words, I looked into how diagrams can be transformed from visual to auditory artefacts, how audio can be used to access and manipulate diagrams, how interaction with diagrams (inspecting, constructing, editing) can be supported through audio-only interfaces, and how the dynamics of collaboration are affected when groups of people work together on auditory diagrams. I asked, and worked on answering, questions such as: What might a graph or a diagram sound like? How do we hear, and learn to listen to, a diagram? How do we work together on diagrams when we can only hear each other?
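
For a flavour of the first question, here is a minimal, purely illustrative sketch (not the mapping used in the thesis) that sonifies a series of values by mapping each one to a pitch, so an upward trend in the data is heard as a rising tone:

```python
import math, struct, wave

# Illustrative only: sonify a line graph by mapping y-values to pitch.
SAMPLE_RATE = 44100

def value_to_frequency(y, y_min, y_max, f_lo=220.0, f_hi=880.0):
    """Linearly map a data value onto a frequency range (A3 to A5)."""
    t = (y - y_min) / (y_max - y_min) if y_max > y_min else 0.5
    return f_lo + t * (f_hi - f_lo)

def sonify(values, note_seconds=0.25, path="graph.wav"):
    """Render one short sine tone per data point into a mono WAV file."""
    y_min, y_max = min(values), max(values)
    frames = bytearray()
    for y in values:
        freq = value_to_frequency(y, y_min, y_max)
        for n in range(int(SAMPLE_RATE * note_seconds)):
            sample = math.sin(2 * math.pi * freq * n / SAMPLE_RATE)
            frames += struct.pack("<h", int(sample * 32767 * 0.5))
    with wave.open(path, "wb") as wav:
        wav.setnchannels(1)       # mono
        wav.setsampwidth(2)       # 16-bit samples
        wav.setframerate(SAMPLE_RATE)
        wav.writeframes(bytes(frames))

sonify([1, 3, 2, 5, 4, 6])  # an upward-trending series heard as rising pitch
```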

This work was nominated for and received the prize for International Excellence in HCI Research from the British Computer Society Interaction Group. [MORE]

Accessible Music Studio (aMuST) – 2009

Sponsored by Innovate UK (formerly TSB) / Role: Named Researcher

Modern sound editors employ multi-function mixing and transport controls that rely heavily on visual displays for multi-track editing and production. This imposes severe limitations on the autonomy of visually impaired users within a field that could be a natural one for them: sound and music production. The aMuST project was a Technology Strategy Board (TSB) funded feasibility study which aimed to identify usability issues in current accessibility solutions, and ways in which they could be overcome through the use of novel interaction design techniques. This was achieved through an examination of state-of-the-art approaches to making mainstream sound editing tools accessible, as well as through close interaction with experienced visually impaired and blind audio engineering practitioners.

Web-based Audio-visual Collaboration – 2008

The mismatch between the spatial layout of Web pages and the temporal nature of speech implies a substantially increased cognitive load for Web interactions. This project addressed how Web cognition is experienced by blind users employing screen-readers for Web interaction. Many of the differences in Web interaction between sighted users and users of screen-readers arise from the serial way in which Web pages are rendered by screen-readers. We examined the ways in which these differences are brought about through the functionality of current screen-readers.
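
As a loose illustration of that seriality (using a toy tree rather than a real DOM or any actual screen-reader API), the sketch below shows how a nested page structure collapses into a single stream of utterances once it is read in document order:

```python
# Illustrative only: a nested page structure flattened depth-first into a
# serial stream of utterances, losing the page's 2-D spatial layout.
page = ("page", [
    ("nav",  [("link", "Home"), ("link", "Projects")]),
    ("main", [("heading", "Web-based Collaboration"),
              ("para", "The mismatch between spatial layout...")]),
])

def linearise(node):
    """Yield the text of each element in document order."""
    tag, content = node
    if isinstance(content, str):          # leaf: speakable text
        yield f"{tag}: {content}"
    else:                                 # branch: recurse into children
        for child in content:
            yield from linearise(child)

for utterance in linearise(page):
    print(utterance)   # one item at a time, as a screen-reader would speak it
```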

Interactive Audio Football – 2006

This project explored basic questions in the design and implementation of an audio-only computer-based football game, such as the size of the playing area, orientation, awareness of teammates and opponents, and basic navigation. The project also explored more advanced design issues not addressed by previous audio-only ball games, involving a multi-player perspective and thus requiring an intuitive means of supporting changes in the focus of the interaction in audio. In general, the dynamic, multi-player perspective poses interesting questions of how to provide real-time, interactive sonification of ball and player positions, and how these should be managed within the context of the changes in interaction focus mentioned above.

To assist with these and other design questions, advice was sought from past and present players of the British blind football squad. The information gathered ranged from basic facts about the rules and conditions under which games are played, through to discussions about the role of echolocation in providing an awareness of physical features of the pitch and the proximity of other players. This in turn led to the question of how realistically to present the information provided through echolocation in a virtual auditory display. The project therefore also explored the potential roles of audio-only game systems in team coaching, applying audio game representations to realistic coaching scenarios.
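
As a rough, hypothetical sketch of the position-sonification problem described above (none of these names or parameters come from the actual game), one simple approach maps the ball's bearing relative to the player onto stereo pan and its distance onto loudness:

```python
import math

# Hypothetical sketch: map the ball's position relative to the listening
# player onto stereo pan and loudness for a real-time position cue.
def ball_cue(player_xy, ball_xy, pitch_width=40.0):
    """Return (left_gain, right_gain) for a simple stereo position cue."""
    dx = ball_xy[0] - player_xy[0]
    dy = ball_xy[1] - player_xy[1]
    distance = math.hypot(dx, dy)
    pan = max(-1.0, min(1.0, dx / (pitch_width / 2)))  # -1 = hard left, +1 = hard right
    gain = 1.0 / (1.0 + distance * 0.2)                # quieter as the ball moves away
    # Equal-power panning keeps perceived loudness steady across the stereo field.
    left = gain * math.cos((pan + 1) * math.pi / 4)
    right = gain * math.sin((pan + 1) * math.pi / 4)
    return left, right

print(ball_cue(player_xy=(0.0, 0.0), ball_xy=(10.0, 5.0)))
```

A real implementation would of course update such cues continuously and interleave them with cues for teammates, opponents, and pitch features, which is exactly where the interaction-focus questions above become pressing.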