Activating the 'mind's eye' - sounds [with video]

A device that employs musical notes is helping blind individuals "see" using sound. This non-invasive sensory-substitution device (SSD), named the "EyeMusic", converts images into combinations of musical notes, or "soundscapes", that offer an alternative form of visual guidance to visually impaired or blind people.

HFSP Career Development Award holder Amir Amedi and colleagues
authored on Mon, 04 March 2013

A team of researchers, led by Amir Amedi from the Edmond and Lily Safra Center for Brain Sciences (ELSC) at the Hebrew University in Jerusalem, developed a device which employs musical notes to help blind individuals "see" using sound. The EyeMusic's algorithm uses a different musical instrument for each of the five colors: white (vocals), blue (trumpet), red (reggae organ), green (synthesized reed) and yellow (violin); black is represented by silence (sample sound recordings are available online). The ultimate goal of this, as well as other devices developed in Amedi's lab, is to assist blind, visually impaired and color-blind individuals in perceiving and interacting with their environment.
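The color-to-instrument scheme can be sketched in code. The following is a hypothetical illustration only: the instrument assignments follow the article, but the column-by-column scan, the pentatonic pitch scale, and the timing values are assumptions, not the EyeMusic's actual parameters.

```python
# Hypothetical sketch of an EyeMusic-style image-to-soundscape mapping.
# Instrument assignments are from the article; scan order, scale and
# timing are illustrative assumptions.

COLOR_TO_INSTRUMENT = {
    "white": "vocals",
    "blue": "trumpet",
    "red": "reggae organ",
    "green": "synthesized reed",
    "yellow": "violin",
    "black": None,  # black pixels are silent
}

PENTATONIC = [0, 2, 4, 7, 9]  # assumed scale degrees, in semitones


def image_to_soundscape(image, base_midi=60, column_ms=100):
    """Scan a small image of color names column by column, left to right.

    Each column becomes one time slot; higher rows map to higher pitches.
    Returns a list of (onset_ms, midi_note, instrument) events.
    """
    n_rows = len(image)
    n_scale = len(PENTATONIC)
    events = []
    for col in range(len(image[0])):
        onset = col * column_ms
        for row in range(n_rows):
            instrument = COLOR_TO_INSTRUMENT.get(image[row][col])
            if instrument is None:
                continue  # silence for black (or unknown) pixels
            degree = n_rows - 1 - row  # top row -> highest pitch
            note = (base_midi + PENTATONIC[degree % n_scale]
                    + 12 * (degree // n_scale))
            events.append((onset, note, instrument))
    return events
```

For a 2x2 image such as `[["red", "black"], ["black", "blue"]]`, the sketch yields one "reggae organ" note in the first time slot and one lower "trumpet" note in the second, mirroring the left-to-right sweep the device performs over an image.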

Even after only a short training session with the device (less than 5 min), 18 sighted participants were able to distinguish between two targets – on either the right or the left – sounded out by the EyeMusic. Following the training, they were asked to use a joystick to point at these targets. Their arm was placed under a cover, so that they could not see it moving. The targets were either seen on a computer screen or sounded out via headphones with the EyeMusic. Shortly after the start of the experiment, the relationship between their hand movements and the on-screen cursor movement was changed: for example, they had to move their hand to the left for the cursor to go up. While they could see the targets and the cursor, the participants learned this new relationship, or mapping, without being aware of it. They then used this new mapping to make movements to targets whose location they only heard via the EyeMusic. That is, they naturally and seamlessly transferred the novel, unconsciously learned mapping between their different senses.
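The altered hand-to-cursor relationship described above (hand left, cursor up) behaves like a rotation of the visuomotor map. A minimal sketch follows, assuming a 90-degree clockwise rotation with screen +y pointing up; the actual transformation used in the experiment may differ.

```python
import math


def rotated_cursor(dx, dy, angle_deg=90):
    """Apply an assumed clockwise visuomotor rotation to a hand movement.

    (dx, dy) is the hand displacement; screen convention: +x right, +y up.
    With the default 90-degree rotation, a leftward hand movement
    (dx = -1) produces an upward cursor movement, as in the article's
    example.
    """
    a = math.radians(angle_deg)
    cx = dx * math.cos(a) + dy * math.sin(a)
    cy = -dx * math.sin(a) + dy * math.cos(a)
    return cx, cy


# Moving the hand to the left now drives the cursor upward.
cursor = rotated_cursor(-1.0, 0.0)
```

Participants implicitly learn to invert such a transformation: to reach a target above them, they move the hand leftward, whether the target was seen or only heard.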

These findings hint at what appears to be a supra-modal representation of space: whether the spatial information comes from vision or from audition, it appears to be used interchangeably to create an inner representation of space that is then used to move within it. "These findings are very encouraging and pave the way for development of hybrid aids for the blind", comments Amir Amedi. The aim is to combine input from low-resolution visual prostheses (or residual vision) – used, for example, to locate a nearby tree – with input from the EyeMusic, used to perceive the luscious fall colors of the leaves.

First results show that blind people can actually "see" and describe objects, and even identify letters and words, using a unique training paradigm based on sensory-substitution devices. Using the device, the blind participants in this study, recently published in Neuron, acquired a level of visual acuity technically surpassing the World Health Organization (WHO) criterion for blindness. The study shows that following a dedicated (but relatively brief) 70 hours of a unique training paradigm developed in the Amedi lab, the blind participants could easily use SSDs to sort images into object categories, such as images of faces, houses, body shapes, everyday objects and textures. They could also perform even more complex tasks – locating people's positions, identifying facial expressions, and even reading letters and words.

But Amedi's team went one step further, to test what actually happens in the brain when the blind learn to see with sounds. Specifically, the group tested the ability of this high-acuity vision to activate the supposedly dormant visual cortex of the blind, even though these participants learned to process visual images through sounds only in adulthood. The researchers used functional magnetic resonance imaging (fMRI) to measure the neural activity of people blind from birth as they "saw" – using the SSD – high-resolution images of letters, faces, houses, everyday objects and body shapes. Surprisingly, not only was their visual cortex activated by the sounds, but their brains also showed selectivity for the visual categories that characterize the normally developing, sighted brain. A specific part of the brain, known as the Visual Word Form Area, or VWFA – first described in sighted people by French collaborators of the current study – is normally very selective. In sighted people, it has a role in reading, and is activated by seeing and reading letters more than by any other visual object category. Astonishingly, the same activity was found in this area in people deprived of vision. After only tens of hours of training in SSD use, their VWFA showed more activation for letters than for any of the other visual categories tested. In fact, the VWFA was so plastic that it showed increased activation for SSD letters after less than two hours of training in one of the study participants. "The adult brain is more flexible than we thought," says Amedi. Indeed, this and other recent research from various groups have demonstrated that multiple brain areas are not specific to their input sense (vision, audition or touch), but rather to the task, or computation, they perform, which may be carried out using various modalities.

All of this suggests that in the blind, brain areas might potentially be "awakened" to processing visual properties and tasks even after years of blindness, or perhaps even a lifetime of blindness, if the proper technologies and training approaches are used. "The findings also give hope that reintroduced input into the visual centers of the blind brain could potentially restore vision, and that SSDs might be useful for visual rehabilitation. SSDs might help blind or visually impaired individuals learn to process complex images, as done in this study, or they might be used as sensory interpreters that provide high-resolution, supportive, synchronous input to a visual signal arriving from an external device such as bionic eyes", says Amir Amedi.



Cross-sensory transfer of sensory-motor information: visuomotor learning affects performance on an audiomotor task, using sensory-substitution. Levy-Tzedek, S., Novick, T., Arbel, R., Abboud, S., Maidenbaum, S., Vaadia, E., Amedi, A. (2012). Scientific Reports 2: 949.


Other References

Reading with Sounds: Sensory Substitution Selectively Activates the Visual Word Form Area in the Blind. Striem-Amit, E., Cohen, L., Dehaene, S., Amedi, A. (2012). Neuron 70: 640-652.
