Tuesday, 20 January 2015

It's Time to Relabel the Brain

Another day, another study finds that 'visual' cortex is activated by something other than information from the eyes:
A research team from the Hebrew University of Jerusalem recently demonstrated that the same part of the visual cortex activated in sighted individuals when reading is also activated in blind patients who use sounds to “read”. The specific area of the brain in question is a patch of left ventral visual cortex located lateral to the mid-portion of the left fusiform gyrus, referred to as the “visual word form area” (VWFA). Significant prior research has shown the VWFA to be specialized for the visual representation of letters, in addition to demonstrating a selective preference for letters over other visual stimuli. The Israel-based research team showed that eight subjects, blind from birth, specifically and selectively activated the VWFA during the processing of letter “soundscapes” using a visual-to-auditory sensory substitution device (SSD) (see www.seeingwithsound.com for a description of the device).
There's lots of research like this. People are excited by mirror neurons because they are cells in premotor cortex that are activated both by motor activity and by the perception of that motor activity. It's incredible, people cry - cells in a part of the brain that, 30 years ago, we said does one thing also seem to do another thing. How could this be?

I would like to propose a simple hypothesis to explain these incredible results: we have been labeling the brain incorrectly for a long time. The data telling us this have been around for a long time too, and they continue to roll in, but for some reason we still think the old labels are important enough to hold onto. It's time to let go.

I'm going to go out on a limb and say it's a bit more complicated than this.

Thursday, 15 January 2015

The Size-Weight Illusion Induced Through Human Echolocation

Echolocation is the ability to use sound to perceive the spatial layout of your surroundings (the size and shape of objects, the distance to them, etc). Lots of animals use it, but humans can too, with training. Some blind people have taught themselves to echolocate using self-generated sounds (e.g. clicks of the tongue or fingers) and the results can be amazing (I show this video of Daniel Kish in class sometimes; see the website for the World Access for the Blind group too).
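The post doesn't get into the physics, but the basic distance cue in echolocation is simple: the click travels to a surface and back, so the delay of the echo encodes the distance. A minimal sketch (the function name and example delay are my own illustration, not from the paper):

```python
# Distance from echo delay: sound makes a round trip to the object
# and back, so one-way distance is half the round-trip time times
# the speed of sound.

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C

def distance_from_echo(delay_s: float) -> float:
    """Distance (in metres) to a reflecting surface, given the echo delay (in seconds)."""
    return SPEED_OF_SOUND * delay_s / 2.0

# An echo arriving 6 ms after the click implies a surface about 1 m away.
print(round(distance_from_echo(0.006), 2))
```

Delays this short are why echolocation can report on 'over there' at all: nearby objects return echoes within milliseconds of the click.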

In humans, this is considered an example of sensory substitution: using one modality to do what you would normally do with another. This ability is interesting to people because the world is full of people with damaged sensory systems (blind people, deaf people, etc), and being able to replace, say, vision with sound is one way to deal with the loss. Kish in particular is a strong advocate of echolocation over white canes for blind people, because canes have a limited range. Unlike vision and hearing, canes can only tell you about what they are in physical contact with, not what's 'over there'. 'Over there' is a very important place for an organism because it's where almost all of the world is, and if you can perceive it you give yourself more opportunities for activity and more time to make that activity work out. This is why Kish can ride a bike.

A recent paper (Buckingham, Milne, Byrne & Goodale, 2014; Gavin is on Twitter too) looked at whether information about object size gained via echolocation can create a size-weight illusion (SWI) - the illusion in which the smaller of two equal-mass objects feels heavier. I thought this was kind of a fun thing to do, so we read and critiqued this paper for lab meeting.