Sound moves Light
This new audiovisual illusion shows that what we see can sometimes depend on our ears as much as our eyes. In the new illusion, the perceived direction of motion of red bars across a screen depends on the timing of an accompanying sequence of beeps. When the auditory timing changes, the visual bars switch direction. By all appearances, sound moves light.
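One way to make the idea concrete is a toy ‘temporal ventriloquism’ model. This is a simplified sketch under stated assumptions, not the actual model behind the illusion: suppose each beep pulls the perceived time of its nearest flash towards it, and the pair of flashes perceived as closest in time groups into a single motion step. Shifting only the beep timing then flips which pairing wins, and with it the perceived direction.

```python
# Toy sketch of a 'temporal ventriloquism' account of the illusion.
# Assumptions (hypothetical, for illustration only): flashes alternate
# between a left and a right position at regular intervals, each beep
# shifts the perceived time of its paired flash towards the beep, and
# the flash pair with the smaller perceived gap groups into one motion step.

def perceived_time(flash_t, beep_t, pull=0.5):
    """Perceived flash time, pulled towards the paired beep (pull in [0, 1])."""
    return flash_t + pull * (beep_t - flash_t)

def motion_direction(flash_times, beep_times, pull=0.5):
    """Given three alternating flashes (left, right, left) and their paired
    beeps, return which pairing groups more tightly and hence the
    perceived direction of motion."""
    t = [perceived_time(ft, bt, pull) for ft, bt in zip(flash_times, beep_times)]
    gap_right = t[1] - t[0]  # left -> right pairing
    gap_left = t[2] - t[1]   # right -> left pairing
    return "rightwards" if gap_right < gap_left else "leftwards"

# Identical flash times (ms); only the beep timing differs between the two calls.
print(motion_direction([0, 100, 200], [40, 60, 240]))    # beeps group flashes 1-2
print(motion_direction([0, 100, 200], [-40, 140, 160]))  # beeps group flashes 2-3
```

In the first call the beeps draw the first two flashes together, so the left-to-right pairing wins and the model reports rightwards motion; in the second, the same flashes with different beep timing report leftwards. The point of the sketch is only that changing sound timing alone can flip the grouping, and so the perceived direction.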
In the well-known ‘Ventriloquist Effect’, vision can influence auditory perception. On television, for example, sound seems to come not from the loudspeakers but from the lips of the actors. The new audiovisual illusion works in the opposite direction (we could call it ‘Reverse Ventriloquism in Motion’): sound timing changes what we see. Television provides more everyday examples of this: try watching dancing with the volume off, and even the best dancers may start to look as if they have no rhythm. Sport provides another example: in tennis the ball often moves faster than the camera or the eye can track, but the sound of the ball hitting the racquet is a give-away cue to the direction the ball is now travelling (even though the sound itself doesn’t move).
The new phenomenon challenges the natural intuition that sound and light, being entirely different physical phenomena, are each processed independently in the brain via different sensory mechanisms. However, our ability to perform complex tasks, such as identifying who is saying what in a chattering crowd, suggests that at some stage information from our hearing and vision must be integrated. The question currently on the lips of many multisensory researchers is: which stage? The answer may have profound implications for understanding the neural processes underlying multisensory perception. Are sights and sounds processed completely independently first, and the final results then combined? Or are sensory areas effectively ‘crosswired’, sharing information even at early stages of processing? The present illusion of sound moving light supports early integration or crosswiring, suggesting that auditory signals can influence one of the primary functions of vision – tracking the motion of light over space and time.
A sceptic might still counter that the rhythm of the beeps only influences what you think you see, or what you say you see, rather than what you actually see. Such subtle arguments have dogged much past multisensory research, but here a further critical observation tips the balance of evidence in favour of an effect of sound on early visual processing rather than on interpretation alone. A tell-tale signature of primary visual functions is that they show illusory ‘negative aftereffects’. These may be easily observed with colour: for example, after gazing for a few seconds at a red mug on your desk, you may notice a bluish mug-shaped patch when you next shift your gaze to a sheet of white paper lying next to it. In the present case, sound can similarly induce an aftereffect for visual motion: if you see rightwards visual motion with a given sequence of beeps, you are likely to see leftwards motion for a few seconds when the soundtrack is suddenly silenced. As there is no longer any sound to influence interpretation, this aftereffect shows that the previous auditory stimulation must have had an enduring effect on visual motion processing itself.
On an applied level, the present illusion might translate into novel displays that change their appearance depending on sound, which may be of use in advertising or in providing an eye-catching multisensory warning or alert in safety-critical applications. It may also eventually be useful in detecting and diagnosing the subtle perceptual deficits thought to be characteristic of certain clinical populations (e.g. dyslexia and autistic-spectrum conditions).
Wednesday, 25 March 2009
Click above to download my Radio 4 interview on 30th Oct 2008 (.wma 3.57MB). Or see the programme summary on the BBC website