Collaborating senses
Some things just go together: keys and locks, hide and seek, Bert and Ernie…
Our brains work hard, at a level mostly below conscious awareness, to spot things from different senses that go together. It’s one of the ways we solve the binding problem: given that information about the world arrives through lots of different sensory organs, how do we stick it all back together into a coherent understanding of the world? Most of this happens with what we might think of as the basic sensory features of the world, like temperature, colour, size, pitch, and so on.
Lots of researchers have looked at how basic sensory features ‘go together’, but most of the time they’ve looked at only two features at a time - size and pitch, for example. This is interesting, but it doesn’t really resemble the real world, where things have lots and lots of sensory features. So together with Paul Hibbard and Mary Spiller, I showed people pictures of blobs that varied along four visual features: saturation (colour intensity), luminance (how light or dark they were), size, and vertical position on the screen. We asked people to decide whether each blob went better with a high-pitched noise or a low-pitched noise, then looked at how the different visual features interacted when people were making that decision.
We expected that people would pick the low-pitched noise when the blob was grey rather than brightly coloured, dark rather than light, large rather than small, and low rather than high on the screen, but we weren’t sure what would happen when the information conflicted - say the blob is large but also high up. We found that people were making a summative decision - that is, the more of the visual features ‘went with’ the low-pitched noise, the more likely people were to choose that noise. All the visual features seemed to contribute equally to this decision, rather than some being more important than others.
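To make the idea of a summative decision concrete, here is a minimal sketch of what such a pattern looks like as an additive model. The feature codings, weight, and bias here are purely illustrative assumptions, not the study’s actual fitted model: each feature is scored 1 if it ‘goes with’ the low pitch (grey, dark, large, low on screen) and 0 otherwise, and every feature contributes equally.

```python
import math

def p_choose_low(grey, dark, large, low_position, weight=1.0, bias=-2.0):
    """Probability of choosing the low-pitched noise under a simple
    additive (logistic) model where every feature contributes equally.
    All parameter values are invented for illustration."""
    score = weight * (grey + dark + large + low_position) + bias
    return 1 / (1 + math.exp(-score))

# The more features that 'go with' low pitch, the higher the probability
# of choosing the low-pitched noise - regardless of which features they are.
for n in range(5):
    features = [1] * n + [0] * (4 - n)
    print(n, round(p_choose_low(*features), 2))
```

The key point the sketch captures is that conflicting cues (a large blob that sits high on the screen) don’t break the decision; they simply pull the summed score in opposite directions.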
This information could be really useful in designing sensory substitution devices, which ‘translate’ from one sense to another - for example, a camera image gets translated into a soundscape. Sensory substitution devices can be helpful for people with sensory disabilities like blindness, but they can be hard to figure out. The more we know about how things ‘go together’ across the senses, the easier it will be to design sensory substitution devices with default settings that are intuitive for most users.
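As a toy illustration of how such correspondences might inform a sensory substitution default, here is a hypothetical mapping from visual features of an image region to a pitch, with darker, larger, lower regions translated into lower frequencies. The feature ranges, frequency bounds, and equal weighting are all assumptions made for the sketch, not taken from any real device.

```python
def features_to_pitch_hz(luminance, size, height, lo=110.0, hi=880.0):
    """Map three normalised visual features (each 0..1) to a frequency in Hz.
    Higher luminance, smaller size, and higher position all push towards a
    higher pitch, matching the intuitive 'goes together' defaults.
    Ranges and weights are illustrative only."""
    # Invert size so that larger regions score lower.
    score = (luminance + (1 - size) + height) / 3  # 0 = all 'low-pitch' cues
    return lo + score * (hi - lo)

# A large, dark region at the bottom of the image gets the lowest pitch;
# a small, bright region at the top gets the highest.
print(features_to_pitch_hz(luminance=0.0, size=1.0, height=0.0))
print(features_to_pitch_hz(luminance=1.0, size=0.0, height=1.0))
```

Because every feature contributes additively, conflicting cues again produce an intermediate pitch rather than an arbitrary one - which is exactly the kind of intuitive default the research is aiming to support.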