Oussama Metatla


News: I am excited to share that I’ve been awarded an ERC Consolidator grant. The grant will fund an interdisciplinary team for 5 years to research inclusive technologies for the early cognitive and social development of blind and sighted children. I will soon be recruiting for postdoctoral and PhD positions in the areas of fabrication, multisensory interaction, crossmodal cognition and design. So watch this space, or get in touch for a chat if this sounds interesting to you!


I am Associate Professor of Human-Computer Interaction at the University of Bristol. I'm Co-Head of the Bristol Interaction Group and Lead of the Diverse-Ability Interaction Lab (Dive Lab).

My field of research is Human-Computer Interaction (HCI). My team and I investigate how HCI, as an applied field of inquiry, can contribute to making human society more inclusive of disabled people. We start from the premises that a) an inclusive future is not only desirable but one that should be actively designed, b) disabled and non-disabled people must be actively involved in co-designing that future, and c) the bias toward particular forms and notions of technology and interaction in traditional HCI paradigms is a detriment to inclusion, because it overlooks the richness and diversity of human abilities and experiences.

We are particularly interested in exploring how insights and principles from multisensory interaction, crossmodal perception and embodied cognition could be used to design more inclusive interactions between disabled and non-disabled people. We use a mixed-methods approach in our research, combining theory with fieldwork, co-design, and controlled studies and evaluation.

Ongoing projects and research interests include:

  • Education technologies for blind and visually-impaired children
  • Hybrid technologies for people with and without dementia
  • Social play technologies for autistic and non-autistic children
  • Virtual & Augmented reality for people with limb differences
  • Olfactory displays in exergames for people with dementia
  • Crossmodal perception applications in HCI
  • Synaesthesia in interactive technology
  • Child-Computer Interaction
  • Co-design, particularly with diverse-ability groups

Key Projects

InclusiveXplay

I have been awarded a 5-year ERC Consolidator Grant (2023–2028) for the project Inclusive Cross-sensory Social Play: Towards a new paradigm of assistive technology for early development of blind and visually impaired children (InclusiveXplay). In this project, we will conduct interdisciplinary research that will radically change the way we design, engineer and evaluate assistive technologies for blind and visually impaired (BVI) children. The aim is to carve new directions in technological research for inclusion by focusing on how interactions between disabled and non-disabled children can be facilitated with and through inclusive cross-sensory assistive technologies.

CRITICAL

Between 2016 and 2021, I held a 5-year EPSRC Early Career Fellowship, which supported my research into inclusion, co-design, crossmodal perception and multisensory interaction. In this fellowship, I focused on ways of applying the above principles to design and research education technologies that can improve the inclusion of visually-impaired children when they learn alongside their sighted peers in mainstream schools.

Check out our growing consortium on inclusive education technologies here.

Prior to this, I was a postdoctoral researcher on two EPSRC projects at Queen Mary University, and I received my PhD in HCI from the University of London in 2011.