A research team from the Hebrew University of Jerusalem recently demonstrated that the same part of the visual cortex activated in sighted individuals when reading is also activated in blind patients who use sounds to “read”. The specific area of the brain in question is a patch of left ventral visual cortex located lateral to the mid-portion of the left fusiform gyrus, referred to as the “visual word form area” (VWFA). Significant prior research has shown the VWFA to be specialized for the visual representation of letters, and to have a selective preference for letters over other visual stimuli. The Israeli research team showed that eight subjects, blind from birth, specifically and selectively activated the VWFA while processing letter “soundscapes” using a visual-to-auditory sensory substitution device (SSD) (see www.seeingwithsound.com for a description of the device).

There's lots of research like this. People are excited by mirror neurons because they are cells in motor cortex that are activated both by motor activity and by the perception of that motor activity. It's incredible, people cry - cells in a part of the brain that we said 30 years ago does one thing seem to also do another thing. How could this be??
I would like to propose a simple hypothesis to explain these incredible results: we have been labelling the brain incorrectly for a long time. The data telling us this have been around for a long time too and continue to roll in, but for some reason we still think the old labels are important enough to hold onto. It's time to let go.
[I'm going to go out on a limb and say it's a bit more complicated than this]
An informational hypothesis
I don't know exactly how this would play out, but I'll start this where I start everything: by thinking about information. Sabrina has developed a draft taxonomy of information types, which we are currently elaborating for some upcoming papers. What drives the taxonomy is the kind of behaviour the information is capable of supporting. For the online control of action, the information probably needs to be specifying information of the kind Gibson identified and studied. Specifying information is information the organism uses to organise its behaviour with respect to the thing that created that information (so moving to catch a fly ball uses information specific to your ongoing attempt to intercept it). But we can also use information to organise our behaviour in ways that have nothing to do with what created the information. The motor act of speaking creates information, but we don't use that information to perceive the details of the motor act of someone speaking; we use it to engage a person in conversation.
Information is all the same kind of thing. It's all just structure in energy arrays. But these variables vary in their stability and what we use them for, and presumably the brain is a major player in connecting information to behaviour. One way to think about the organisation of the brain is therefore to consider the nature of the information that it is working with. Why is visual cortex organised the way it is? Not because it's visual cortex, but because the kind of information that comes via vision has certain properties and is therefore used to support some but not other kinds of behaviour. We are not our brains - our brains reflect what we do.
From this perspective, the fact that 'visual' cortex can be involved in a reading task even when the reading happens via sound is not magical or all that surprising. One of Gibson's insights is that modalities are irrelevant; information is primary, and you can often present information about the same thing through different systems (see my last post on the size-weight illusion induced via echolocation, or my paper on the haptic perception of the information for relative phase). If you present information that has certain characteristics, then brain regions organised to best handle those characteristics will presumably get involved. The 'visual word form area' is therefore nothing of the sort, and needs to be relabelled with respect to the form of the information involved in the tasks this area supports.
This is, of course, very speculative and very early days. One obvious counter is that the people in this study were blind from birth; why, then, did the occipital lobe still become specialised for handling this kind of information? I have no idea, although I would hazard a guess that there are genetic and developmental pressures that bias the development of the brain, and reconsidering those as biases for handling information rather than modalities is well worth doing.
Regardless, our ecological, embodied approach to cognition has many implications for neuroscience (hence our research topic at Frontiers) and for what we think the brain is doing with all that energy it's using. We've already proposed (with Eric Charles) that the next big job for embodied cognition is to provide a new language for talking about the brain so that neuroscience can start asking better questions. Results like this current study, and the many, many other studies like it, show that we need this new language soon, so we can all stop being amazed by the fact that nature doesn't know anything about our current labels for all the bits of the brain. Instead, we might end up with labels that are more closely aligned with how the system actually works.