Haptics (or proprioception) is the sensory modality built into our bodies; it provides constant information about the state of the body and anything it is in mechanical contact with, such as tools. Many ecological psychologists (myself included) have investigated haptic perception and its role in the control of action, but unlike the optic array, we have basically zero work identifying what the relevant information variables look like.
I first investigated haptic perception in the context of coordinated rhythmic movements (Wilson, Bingham & Craig, 2003). Geoff had run studies showing that visual judgements of different relative phases varied in stability in the same way that the production of those relative phases does. This suggested that the movement phenomena were being caused by the way relative phase is perceived. Those studies used vision, however, and the movement phenomena obviously involve motion of the body and the haptic system. This involvement was typically explained in terms of muscle homology and neural crosstalk effects. Our study had people track manipulanda that moved up and down at one of three mean relative phases, with various levels of phase variability added, and had them make judgements of that variability (replicating the visual studies). We found that haptic perception of relative phase, as measured by those judgements, behaved just like visual perception of relative phase. We inferred that the information, the relative direction of motion, can be detected by both systems and has the same effects.
I am moving back into the haptic information world for two related reasons.
First, I want to replace the muscle homology/neural crosstalk stories with a haptic perception story. The effects these theories account for are very large and reliable, and Geoff's perception-action model currently only applies to visual information. Specifically, muscle homology applies to relative phase defined in an egocentric (body-centred) frame of reference, while Geoff's model applies to relative phase defined in an allocentric (external) frame of reference. Relative phase is clearly detected in both frames of reference; when they are pitted against one another experimentally, both matter and the egocentric effects dominate (e.g. Pickavance, Azmoodeh & Wilson, 2018).
Second, I have become interested in individual variation in the variables used to perceive relative phase. Based on his data, Geoff's model predicts that relative phase is perceived via the information variable relative direction of motion, the detection of which is modified by the relative speed of the oscillators. In Wilson & Bingham (2008; blog post), we showed this was true in 7 out of 10 untrained participants judging 0° and 180°. The other three became unable to judge these phases when we perturbed another candidate variable, relative position. This experiment also showed that people trained to perceive 90° had improved because they had switched to this variable, but we were not expecting people at the other relative phases to be using it. I'm finally getting back into experiments probing the prevalence of this individual difference in visual information use and its consequences for perception-action stability (briefly: there's a lot of variation and it matters!). As part of the above project, I want to do the same kinds of studies on haptic perception too.
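To make the candidate variables concrete, here is a minimal sketch (my own illustration, not code from Geoff's model) of how relative direction of motion, relative position, and relative speed can be computed from two simulated oscillators at a mean relative phase of 90°; the variable names and the simple sign-based definition of relative direction are my assumptions for illustration only.

```python
import numpy as np

# Two 1 Hz oscillators at a mean relative phase of 90 degrees
t = np.linspace(0, 10, 1000)                  # time (s)
x1 = np.sin(2 * np.pi * t)                    # oscillator 1 position
x2 = np.sin(2 * np.pi * t + np.pi / 2)       # oscillator 2, 90 deg ahead

v1 = np.gradient(x1, t)                       # numerically estimated velocities
v2 = np.gradient(x2, t)

# Relative direction of motion: +1 when moving the same way, -1 when opposite
rel_direction = np.sign(v1) * np.sign(v2)

# Relative position: the positional difference between the oscillators
rel_position = x1 - x2

# Relative speed: ratio of speeds (in the model, this modifies detection
# of relative direction); small constant avoids division by zero
rel_speed = np.abs(v1) / (np.abs(v2) + 1e-9)

# At 90 deg the oscillators move in the same direction half the time,
# which is why this phase is ambiguous in relative-direction terms
frac_same = np.mean(rel_direction > 0)
```

At 0° `frac_same` would be near 1 and at 180° near 0, which is one way of seeing why those phases are easy to judge via relative direction while 90° is maximally ambiguous on that variable.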
My problem is that there is essentially nothing in the literature on the nature of haptic information variables. This Peril lays out my current hypothesis about where to look; please, dear God, come help me!