Thursday, 1 November 2018

Where is the Haptic Information? (A Purple Peril)

Haptics (or proprioception) is the sensory modality built into our bodies; it provides constant information about the state of the body and of anything it is in mechanical contact with, such as tools. Many ecological psychologists (myself included) have investigated haptic perception and its role in the control of action, but unlike the case of the optic array, we have basically zero work identifying what the relevant information variables look like. 

I first investigated haptic perception in the context of coordinated rhythmic movements (Wilson, Bingham & Craig, 2003). Geoff had run studies showing that visual judgements of different relative phases varied in stability in the same way that the production of those relative phases does. This suggested that the movement phenomena were being caused by the way relative phase is perceived. That work was visual, however, and the movement phenomena obviously involve motion of the body and the haptic system. This involvement was typically explained in terms of muscle homology and neural crosstalk effects. Our study had people track manipulanda that moved up and down at one of three mean relative phases with various levels of phase variability added, and had them make judgements of that variability (replicating the visual studies). We found that haptic perception of relative phase, as measured by those judgements, behaved just like visual perception of relative phase - we inferred that the information, the relative direction of motion, can be detected by both systems and has the same effects. 
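
For concreteness, here is a minimal sketch of the kind of stimulus this involves: two oscillators at a mean relative phase, with phase variability added to one of them. This is an illustration in Python with invented parameter values, not the actual stimulus generation method from the 2003 study.

```python
import numpy as np

def make_coordination_stimulus(mean_phase_deg, phase_sd_deg, freq_hz=1.0,
                               duration_s=10.0, fs=60.0, seed=0):
    """Two oscillator positions at a mean relative phase, with smoothed
    Gaussian phase noise added to the second (illustrative method only)."""
    rng = np.random.default_rng(seed)
    t = np.arange(0.0, duration_s, 1.0 / fs)
    noise = rng.standard_normal(t.size)
    width = int(fs / 4)                      # crude low-pass: moving average
    noise = np.convolve(noise, np.ones(width) / width, mode="same")
    noise *= np.deg2rad(phase_sd_deg) / noise.std()  # scale to the target SD
    phase1 = 2 * np.pi * freq_hz * t
    phase2 = phase1 + np.deg2rad(mean_phase_deg) + noise
    return np.sin(phase1), np.sin(phase2)

# e.g. 180 deg mean relative phase with 15 deg of phase variability
x1, x2 = make_coordination_stimulus(mean_phase_deg=180, phase_sd_deg=15)
```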

I am moving back into the haptic information world for two related reasons. 

First, I want to replace the muscle homology/neural crosstalk stories with a haptic perception story. The effects these theories account for are very large and reliable, and Geoff's perception-action model currently only applies to visual information. Specifically, muscle homology applies to relative phase defined in an egocentric (body-centred) frame of reference, while Geoff's model applies to relative phase defined in an allocentric (external) frame of reference. Relative phase is clearly detected in both frames of reference; when they are pitted against one another experimentally, both matter and the egocentric effects dominate (e.g. Pickavance, Azmoodah & Wilson, 2018).
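
The two frames of reference come apart in the maths as well as in the data. Here is a toy sketch, under my own simplifying conventions (sinusoidal hand movements, phase computed from position and velocity), showing that one and the same movement is in-phase allocentrically but anti-phase egocentrically:

```python
import numpy as np

def phase(x, v, omega):
    """Phase of an approximately sinusoidal signal x = A*sin(phi)."""
    return np.arctan2(x, v / omega)

omega = 2 * np.pi * 1.0            # 1 Hz movement
t = np.arange(0, 5, 0.01)

# Both hands moving in the SAME external direction at all times
d_left = 0.1 * np.sin(omega * t)   # displacement about each hand's own centre
d_right = 0.1 * np.sin(omega * t)
v_left, v_right = np.gradient(d_left, t), np.gradient(d_right, t)

# Allocentric relative phase: both displacements on the shared external axis
allo = phase(d_right, v_right, omega) - phase(d_left, v_left, omega)

# Egocentric relative phase: mirror the left hand about the body midline, so
# homologous muscle activity (both hands toward/away together) maps to 0 deg
ego = phase(d_right, v_right, omega) - phase(-d_left, -v_left, omega)

print(np.rad2deg(np.median(np.mod(allo, 2 * np.pi))))  # ~0: allocentric in-phase
print(np.rad2deg(np.median(np.mod(ego, 2 * np.pi))))   # ~180: egocentric anti-phase
```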

Second, I have become interested in individual variation in the variables used to perceive relative phase. Based on his data, Geoff's model predicts relative phase is perceived via the information variable relative direction of motion, the detection of which is modified by the relative speed of the oscillators. In Wilson & Bingham (2008; blog post), we showed this was true in 7 out of 10 untrained participants judging 0° and 180°. The other three became unable to judge these phases when we perturbed another candidate variable, relative position. This experiment also showed that people trained to perceive 90° had improved because they had switched to relative position, but we were not expecting people at the other relative phases to be using this variable. I'm finally getting back into experiments probing the prevalence of this individual difference in visual information use and the consequences for perception-action stability (briefly: there's a lot of variation and it matters!). As part of the above project, I want to do the same kinds of studies on haptic perception too. 
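
To make the candidates concrete, here is one way the three variables might be operationalised; these definitions are my illustrative choices for the sketch, not the formal definitions from Geoff's model:

```python
import numpy as np

def candidate_variables(x1, v1, x2, v2):
    """Three candidate information variables for relative phase."""
    same_direction = np.sign(v1) == np.sign(v2)  # relative direction of motion
    rel_position = x1 - x2                       # relative position
    rel_speed = np.abs(v1) - np.abs(v2)          # relative speed
    return same_direction.mean(), rel_position, rel_speed

t = np.arange(0, 10, 0.01)
omega = 2 * np.pi
for phi_deg in (0, 90, 180):
    x1 = np.sin(omega * t)
    x2 = np.sin(omega * t + np.deg2rad(phi_deg))
    v1, v2 = np.gradient(x1, t), np.gradient(x2, t)
    p_same, rel_pos, _ = candidate_variables(x1, v1, x2, v2)
    print(phi_deg, round(float(p_same), 2), round(float(np.std(rel_pos)), 2))
```

Note that relative direction of motion is constant at 0° and 180° (the oscillators always move the same way, or always move opposite ways) and maximally variable at 90°, which is part of why it is such a plausible information variable for the classic stability pattern.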

My problem is that there is essentially nothing in the literature on the nature of haptic information variables. This Peril lays out my current hypothesis about where to look; please, dear God, come help me!

Dynamic Touch

One huge literature that does investigate haptic perception is the dynamic touch literature. These are experiments that study the haptic perception of limb and object properties that is possible when we wield (move) objects. 

The basis of perception via dynamic touch is the inertia tensor. This is the mathematical description of the characteristic way an object resists being rotated about a point, in all three rotational degrees of freedom (pitch, roll and yaw). Judgements of object properties such as relative mass or length covary with various moments of inertia (the eigenvalues of this matrix; the eigenvectors are the principal axes of rotation). This is now well established. 
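
For the record, here is the object being talked about, as a short worked example (masses and positions are invented): the inertia tensor of a point-mass model of a hand-held rod, whose eigenvalues are the principal moments of inertia:

```python
import numpy as np

def inertia_tensor(masses, points):
    """3x3 inertia tensor about the origin: I = sum m(|r|^2 E - r r^T)."""
    I = np.zeros((3, 3))
    for m, r in zip(masses, points):
        I += m * (np.dot(r, r) * np.eye(3) - np.outer(r, r))
    return I

# Four point masses spaced along a rod held at the wrist (the origin)
masses = np.array([0.05, 0.05, 0.05, 0.05])                   # kg
points = np.array([[0, 0, z] for z in (0.1, 0.3, 0.5, 0.7)])  # m

I = inertia_tensor(masses, points)
moments, axes = np.linalg.eigh(I)  # eigenvalues = principal moments of inertia
print(moments)                     # dynamic touch relates judgements to these
```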

The problem is that everyone in this literature refers to the moments of inertia and properties such as mass as information variables. This is wrong; they are the dynamical characteristics of the object to be perceived, not the information. So to my knowledge, this literature has not characterised any haptic information variables. 

Limb Kinematics

Geoff has investigated dynamic touch in the context of hefting an object to perceive how throw-able it is (e.g. Bingham, Schmidt & Rosenblum, 1989; blog post; this was replicated and extended by Zhu & Bingham, 2008; blog post).

In the 1989 paper, Geoff ran extensive analyses on the kinematics of the limbs during hefting, looking for invariant features of that motion that might be serving as information about throwability. He identified a candidate, but the follow-up paper tested and ruled this particular one out. Across all the perceptual work he has done with Zhu, they have not been able to identify any patterns of limb motion that could be the information. They have shown that equally throwable objects that vary in size feel equally heavy, but they tested and ruled out a role for the inertia tensor in creating felt heaviness, and so right now there is no known information variable supporting the (very good) perception of the affordance. 

A Better Place to Look

The failures to find invariants in any kinematic features of limb motion have bugged me for a long time; where the hell could the information be, if not there? I've recently realised the answer: haptic information lives in the motions of the medium of the haptic array caused by the kinematics of the limbs during dynamic touch activities. Geoff was working one level too high up, and the dynamic touch people are working one level too low. 

The medium of the optic array is light; the medium of the acoustic array is the atmosphere. Turvey & Fonseca (2014) propose that the medium of the haptic array involves the muscles and the web of connective tissues that bind them to the skeleton (that paper was actually part of a special issue on tensegrity analyses of haptic perception). They then further propose that the most perceptually appropriate way to characterise the physical properties of this medium is as a tensegrity (specifically, a multifractal tensegrity, in which multiple tensegrities are nested within and coupled to each other). In the same way that visual information is implemented by the physical properties of light, haptic information must be implemented by the physical properties of the haptic tensegrity. 
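
I cannot yet write down the multifractal tensegrity mathematics, but the core intuition is easy to sketch. Below is a toy prestressed spring network; it is explicitly not a proper tensegrity (that would also need rigid struts in compression), and every parameter is invented. The point it illustrates is just that in a prestressed medium, a local mechanical perturbation redistributes tension globally, which is the sense in which such a medium could carry information about mechanical contact to the whole body:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20
pos = rng.uniform(0, 1, (n, 2))                      # node positions
edges = [(i, j) for i in range(n) for j in range(i + 1, n)
         if np.linalg.norm(pos[i] - pos[j]) < 0.4]   # local connectivity
k = 1.0                                              # spring stiffness
rest = np.array([0.8 * np.linalg.norm(pos[i] - pos[j]) for i, j in edges])
# rest lengths 20% shorter than the geometry, so the whole net is prestressed

def relax(pos, fixed, steps=2000, lr=0.01):
    """Gradient descent on elastic energy; 'fixed' nodes do not move."""
    pos = pos.copy()
    for _ in range(steps):
        grad = np.zeros_like(pos)
        for (i, j), r0 in zip(edges, rest):
            d = pos[i] - pos[j]
            length = np.linalg.norm(d)
            f = k * (length - r0) * d / length       # force along the member
            grad[i] += f
            grad[j] -= f
        grad[fixed] = 0.0
        pos -= lr * grad
    return pos

def tensions(pos):
    return np.array([k * (np.linalg.norm(pos[i] - pos[j]) - r0)
                     for (i, j), r0 in zip(edges, rest)])

fixed = [0, 1, 2, 3]                     # anchored nodes (the 'skeleton')
eq = relax(pos, fixed)
t0 = tensions(eq)

poked = eq.copy()
poked[0] += np.array([0.05, 0.0])        # 'mechanical contact' at one node
t1 = tensions(relax(poked, fixed))

print(np.abs(t1 - t0).round(3))          # tension changes spread net-wide
```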

In order to understand the haptic perception of dynamical properties, we need to characterise not just how wielding limbs and objects with those properties affects the kinematics of the limb, but how those altered limb kinematics in turn structure the kinematics of the haptic array. 

Where I'm At Now

I am in the early days of this. I have read Turvey & Fonseca a couple of times and, as with all of Turvey's papers, I think he's right but I don't yet understand the full awesomeness I'm sure is in there. So the tensegrity papers are at the top of my reading list for any haptic perception project. 

The basic idea of a tensegrity structure sounds like an ideal formalism. To quote Wikipedia, 
Tensegrity, tensional integrity or floating compression is a structural principle based on the use of isolated components in compression inside a net of continuous tension, in such a way that the compressed members (usually bars or struts) do not touch each other and the prestressed tensioned members (usually cables or tendons) delineate the system spatially
Here is a video of Buckminster Fuller explaining this idea, here is a video of Tom Myers explaining this idea in the context of the connective tissue of the body, and here is a video of Turvey explaining the idea in the context of haptic perception. These are the three stages of this hypothesis being developed: from a physical principle for constructing things in a certain way, to the hypothesis that the fascia of the body are constructed this way, to the hypothesis that the medium of haptic perception is therefore constructed this way. You can make tensegrity structures yourself pretty easily (e.g. see this video for instructions). 

Based on the egocentric constraint on coordination dynamics, my proposal of haptic perception of relative phase in place of muscle homology requires that haptic perception happen in an egocentric frame of reference, and so the medium of haptic perception must be egocentric. The tensegrity hypothesis fits the bill, as far as I can tell. I still have to run many experiments on the haptic perception of relative phase (I need to replicate the whole suite of visual judgement studies Geoff did, plus a series that pits haptic and visual perception against one another), but assuming haptic perception of coordination is egocentric, the tensegrity analysis will be the place to go to characterise the actual information variables being used. I'd like to write a grant for this project; it requires some equipment and a lot of time. This is me getting my head into the game to get ready for that.

Beyond this, I do not have any further specific details worked out. Haptic perception is going to be a damned tough problem, not least because of the intimidating mathematics of nested multifractal tensegrity systems (as opposed to the still hard but more obviously tractable geometry of the optic array). I'm going to need some heavy-hitting support on that part, so feel free to join in - please! :)

Summary

In order to properly characterise the nature of haptic information variables, we need to characterise what happens to the multifractal tensegrity haptic medium when we move and mechanically interact with dynamical properties of objects such as the inertia tensor. I want to get into this in the context of coordination dynamics because that's where my interest and expertise lie, but the other obvious domain is dynamic touch. I'd love to see work developing this hypothesis. It would not only be a rich investigation of haptic perception in its own right; it would help improve all the theory work that currently depends heavily (but inappropriately) on the dynamic touch literature. I have in mind here the direct learning model of Jacobs & Michaels (2007), which I like but which currently spends all its time talking about attuning to moments of inertia as information variables. The ecological approach has been heavily visual for lots of good pragmatic reasons for most of its life, but this is a rich and as yet untapped vein of perception-action research waiting to happen. 
