Thursday, 1 November 2018

Where is the Haptic Information? (A Purple Peril)

Haptics (or proprioception) is the sensory modality built into our bodies; it provides constant information about the state of the body and of anything it is in mechanical contact with, such as tools. Many ecological psychologists (myself included) have investigated haptic perception and its role in the control of action, but unlike the optic array, there is basically zero work identifying what the relevant information variables look like.

I first investigated haptic perception in the context of coordinated rhythmic movements (Wilson, Bingham & Craig, 2003). Geoff had run studies showing that visual judgements of different relative phases varied in stability in the same way that the production of those relative phases does. This suggested that the movement phenomena were being caused by the way relative phase is perceived. This was vision, however, and the movement phenomena obviously involve motion of the body and the haptic system. This involvement was typically explained in terms of muscle homology and neural crosstalk effects. Our study had people track manipulanda that moved up and down at one of three mean relative phases, with various levels of phase variability added, and had them make judgements of that variability (replicating the visual studies). We found that haptic perception of relative phase, as measured by those judgements, behaved just like visual perception of relative phase - we inferred that the information, the relative direction of motion, can be detected by both systems and has the same effects.
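To make the key quantities concrete, here is a minimal sketch (my own toy construction in Python, not the analysis code from any of these studies) that computes continuous relative phase between two simulated oscillating signals and summarises its mean and variability; the sampling rate, frequency, target phase and wobble are all made-up values.

```python
import numpy as np

def phase_angle(position, dt):
    """Phase of an oscillator from its normalised position and velocity."""
    velocity = np.gradient(position, dt)
    pos_n = position / np.max(np.abs(position))
    vel_n = velocity / np.max(np.abs(velocity))
    return np.arctan2(vel_n, pos_n)

dt = 0.01                                    # 100 Hz sampling (assumed)
t = np.arange(0, 10, dt)
freq = 1.0                                   # 1 Hz oscillations (assumed)
target_phase = np.deg2rad(90)                # a 90 degree mean relative phase
wobble = np.deg2rad(10) * np.sin(2 * np.pi * 0.3 * t)   # slow phase variability

x1 = np.cos(2 * np.pi * freq * t)
x2 = np.cos(2 * np.pi * freq * t - target_phase + wobble)

# Relative phase at each sample, wrapped into (-180, 180] degrees;
# the sign just reflects which oscillator is leading.
diff = phase_angle(x1, dt) - phase_angle(x2, dt)
rel_phase = np.rad2deg(np.angle(np.exp(1j * diff)))

print("mean relative phase (deg):", round(float(np.mean(rel_phase)), 1))
print("phase variability (SD, deg):", round(float(np.std(rel_phase)), 1))
```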

I am moving back into the haptic information world for two related reasons. 

First, I want to replace the muscle homology/neural crosstalk stories with a haptic perception story. The effects these theories account for are very large and reliable, and Geoff's perception-action model currently only applies to visual information. Specifically, muscle homology applies to relative phase defined in an egocentric (body-centred) frame of reference, while Geoff's model applies to relative phase defined in an allocentric (external) frame of reference. Relative phase is clearly detected in both frames of reference; when they are pitted against one another experimentally, both matter and the egocentric effects dominate (e.g. Pickavance, Azmoodah & Wilson, 2018).

Second, I have become interested in individual variation in the variables used to perceive relative phase. Based on his data, Geoff's model predicts that relative phase is perceived via the information variable relative direction of motion, the detection of which is modified by the relative speed of the oscillators. In Wilson & Bingham (2008; blog post), we showed this was true for 7 out of 10 untrained participants judging 0° and 180°. The other three became unable to judge these phases when we perturbed another candidate variable, relative position. This experiment also showed that people trained to perceive 90° had improved because they had switched to this variable, but we were not expecting people at the other relative phases to be using it. I'm finally getting back into experiments probing the prevalence of this individual difference in visual information use and its consequences for perception-action stability (briefly: there's a lot of variation and it matters!). As part of the above project, I want to do the same kinds of studies on haptic perception too.
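For readers who want the candidate variables laid out explicitly, here is a small sketch (again my own illustration in Python, with made-up signals x1 and x2) of the three variables mentioned above - relative direction of motion, relative speed, and relative position - computed sample by sample from two oscillating positions.

```python
import numpy as np

dt = 0.01
t = np.arange(0, 10, dt)
x1 = np.cos(2 * np.pi * 1.0 * t)                     # oscillator 1 (toy signal)
x2 = np.cos(2 * np.pi * 1.0 * t - np.deg2rad(180))   # oscillator 2 at 180 degrees

v1, v2 = np.gradient(x1, dt), np.gradient(x2, dt)

rel_direction = np.sign(v1) * np.sign(v2)   # +1 moving the same way, -1 opposite
rel_speed = np.abs(v1) - np.abs(v2)         # difference in speeds
rel_position = x1 - x2                      # difference in positions

# At 0 degrees relative direction is +1 essentially all the time, at 180 degrees
# it is -1, and at 90 degrees it flips back and forth - one way to see why 90
# degrees is the hard case to perceive and produce.
print("proportion of samples moving in the same direction:",
      round(float(np.mean(rel_direction > 0)), 2))
```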

My problem here is that there is essentially nothing in the literature on the nature of haptic information variables. This Peril lays out my current hypothesis about where to look; please, dear God, come help me!

Tuesday, 25 September 2018

Tolerance, Noise and Covariation in Skilled Action

The field of motor control has recently been moving steadily towards the idea that there is no such thing as an ideal movement. The system is not trying to reliably produce a single, stable, perfect form, and movement variability has gone from being treated as noise to being studied and analysed as a key feature of a flexible, adaptive control process. This formalises Bernstein's notion of 'repetition without repetition' in movement, and recognises that the redundancy in our behavioural capabilities relative to any given task means that multiple solutions to that task are all legitimate options.

There are many new analysis techniques within this 'motor abundance' framework, and I've reviewed most of them already; uncontrolled manifold analysis, stochastic optimal control theory and goal equivalent manifolds are the three big ones, as well as nonlinear covariation analysis. The essence of all these methods is that they take variability in the execution or outcome of a movement and decompose it into variability that does not interfere with achieving the outcome and variability that does.
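Here is a deliberately minimal sketch of that shared idea (a generic illustration of my own, not any one group's published implementation): a toy task in which two execution variables must sum to a target, so that variability along the solution line leaves the outcome alone while variability orthogonal to it does not.

```python
import numpy as np

rng = np.random.default_rng(1)
target = 10.0

# Simulated trials: the two execution variables covary negatively, so most of
# the variability lies along the goal-equivalent direction.
x1 = 5 + rng.normal(0, 1.0, 500)
x2 = target - x1 + rng.normal(0, 0.2, 500)
data = np.column_stack([x1, x2]) - np.array([5.0, 5.0])   # centred on one solution

# For result = x1 + x2 the task Jacobian is J = [1, 1]; its null space
# (the goal-equivalent direction) lies along [1, -1].
goal_equiv = np.array([1.0, -1.0]) / np.sqrt(2)
goal_relevant = np.array([1.0, 1.0]) / np.sqrt(2)

var_ge = np.var(data @ goal_equiv)       # variance that leaves the outcome alone
var_gr = np.var(data @ goal_relevant)    # variance that changes the outcome
print(f"goal-equivalent variance: {var_ge:.2f}")
print(f"goal-relevant variance:   {var_gr:.2f}")
```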

This post will explain the variability decomposition process in Sternad & Cohen's (2009) Tolerance, Noise and Covariation (TNC) analysis, which my students and I are busily applying to some new throwing data from the lab. I have talked a little about this analysis here, but I focused on the part that involves a task dynamical analysis identical to the one I did for my throwing paper in 2016. In this post, I want to explain the TNC analysis itself. I will be relying on Sternad et al (2010), which I've found to be a crystal clear explanation of the entire approach; you can also download Matlab code implementing the analysis from her website.
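As a taster of where that explanation is heading, here is a drastically simplified toy sketch of the TNC logic (my own construction with a made-up result function, not Sternad's algorithm or her Matlab code): each cost is estimated as the performance improvement available from fixing one aspect of a simulated data set.

```python
import numpy as np

rng = np.random.default_rng(0)

def error(x, y):
    """Toy result function: absolute miss from a target value of 10."""
    return np.abs(x * y - 10.0)

def mean_error(x, y):
    return error(x, y).mean()

# Simulated executions: noisy, uncorrelated, centred away from the best region.
x = rng.normal(2.0, 0.5, 300)
y = rng.normal(4.0, 0.5, 300)
actual = mean_error(x, y)

# T-cost: improvement from translating the whole data cloud to the best
# location in execution space (coarse grid search).
shifts = np.linspace(-3, 3, 61)
t_best = min(mean_error(x + dx, y + dy) for dx in shifts for dy in shifts)
t_cost = actual - t_best

# N-cost: improvement from shrinking dispersion around the mean execution.
scales = np.linspace(0, 1, 21)
n_best = min(mean_error(x.mean() + s * (x - x.mean()),
                        y.mean() + s * (y - y.mean())) for s in scales)
n_cost = actual - n_best

# C-cost: improvement from re-pairing x and y values across trials
# (approximated here by many random permutations of y).
c_best = min(mean_error(x, rng.permutation(y)) for _ in range(2000))
c_cost = actual - c_best

print(f"actual mean error {actual:.2f}; T-cost {t_cost:.2f}, "
      f"N-cost {n_cost:.2f}, C-cost {c_cost:.2f}")
```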

Sunday, 15 July 2018

You Cannot Perceive a Relational Affordance (A Purple Peril)

One of the more enduring arguments in ecological psychology is about the best way to formally describe affordances. The two basic approaches are that they are dispositions (Turvey, Scarantino, me) or that they are relations (Rietveld, Kiverstein, Chemero). The argument has mostly settled down into just agreeing to disagree, but I am still convinced that the relational analysis is critically flawed and I want to either get its proponents to solve the problem or end the debate once and for all. I've reviewed this in a bunch of places (e.g. here, here, and here), but this post simply sets out my challenge: you cannot perceive a relational affordance, and there is as yet no good story about how to learn new affordances.

My problem stems from this Gibson (1979) quote (we all have our favourite, but this one seems to cut to the heart of it):
The central question for the theory of affordances is not whether they exist and are real but whether information is available in ambient light for perceiving them.
Right now, the affordances-are-relations camp have no story for how these can structure light (or other energy media) and therefore create information about themselves. They are therefore, as currently formulated, not even in principle perceptible. This means affordances-as-relations is of zero use to the ecological approach. 

Bruineberg et al (2018) tried to address this problem, but as I blogged here, their solution is not ecological information, and it reveals that these authors do not as yet understand what information actually is. My challenge is therefore this: tell me a story in which affordances-as-relations are able to create ecological information in energy arrays, and might therefore be learned, and the debate will be back on. Until then, affordances-as-dispositions is the only account that formalises the right properties, and the debate is over.

Thursday, 31 May 2018

The Evolution of Sex Differences in Throwing

One of the most robust sex differences occurs in throwing. Men can throw (on average) much faster and therefore much farther than women, and this gap even exists at comparable levels of sports such as baseball and softball. The most common explanations are that a) men are, on average, larger and stronger than women, and b) most cultures gender throwing activities as male, leading to earlier acquisition and much more practice. YouTube has plenty of videos of men throwing with their off hand that point to the critical role of learning. 

However, Lombardo & Deaner (2018; L&D) have just published a hypothesis that while these factors are at play, they rest on top of an underlying biological advantage, and that 'throwing is a male adaptation'. Specifically, they claim that there has been greater selective evolutionary pressure on men (as compared to women) to develop the strength, skills and anatomy needed to throw over large distances and with great accuracy. Men, they argue, have evolved to be better throwers than women.

This post will briefly review the hypothesis and the evidence, and then come to two conclusions. First, many of the differences they discuss seem quite closely aligned to the cultural sex differences around throwing that we know exist, and so may not be biologically innate. Second, and more importantly, there may not even be a throwing-specific sex difference to explain. Right now, the only clear finding is that men throw faster, but they are also (on average) stronger and larger for non-throwing reasons. There is, as yet, no clear evidence that men are better throwers. I will then review some recent data of my own suggesting that when the full perception-action task dynamic is analysed in closer detail, trained women show every sign of being just as skilled at throwing as trained men.

Friday, 27 April 2018

The Ecological Approach to Virtual Reality

As virtual reality (VR) gear gets better, cheaper, and easier to use, there is renewed interest in trying to figure out how best to make a virtual environment feel real. The typical framing for VR is in the name: it's about creating the illusion of reality, a virtual world. Programmers even talk this way; they describe creating virtual (pretend) environments, objects, spaces, etc. From this point of view, VR is an attempt to create a stable illusory experience by faking a world that doesn't really exist.

Of course, VR programmers don't make worlds; they make information. This makes folding VR into the ecological approach a natural move, and I propose that, ecologically, VR development is actually an attempt to design an optic array containing information that can support certain behaviours. It's less virtual reality, and more virtual information. This is important because the nature of the information we are using explains the form of the behaviour we are controlling. Your goal as a developer is therefore not to create tricks and illusions, but to provide information capable of supporting the behaviours you want to be possible.

As a first step towards an ecological understanding of VR, I will follow the path Gibson laid down, taking the science of perception away from illusions and towards information. I'll then think about some of the implications of taking an ecological approach to VR design. Virtual reality needs our theory of perception to become the best it can possibly be, and I hope that this post serves as an entry point for designers to become aware of what we have to offer them.

Tuesday, 17 April 2018

Affordance Maps and the Geometry of Solution Spaces

I study throwing for two basic reasons. First, it is intrinsically fascinating and I want to know how it works. Second, it has become a rich domain in which to study affordances, and it is really forcing me to engage in great detail with the specifics of what affordances are.

My approach to affordances is that they are dynamical properties of tasks, which means that in order to study them, I need to be able to characterise my task dynamics in great detail. I developed an analysis (Wilson et al, 2016) to do this, and I also have a hunch this analysis will fit perfectly with motor abundance analyses such as UCM (Wilson, Zhu & Bingham, in press). I have recently discovered that another research group (led by Dagmar Sternad) has been doing this whole package for a few years, which is exciting news. Here I just want to briefly summarise the analysis and what the future might hold for this work.
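To give a concrete flavour of what such an analysis looks like, here is a toy sketch in that spirit (my own simplified Python illustration with made-up target geometry and no air resistance, not the simulation from the 2016 paper): it maps which combinations of release angle and speed hit a target, which is exactly the kind of solution space whose size and shape I want to treat as the task dynamics.

```python
import numpy as np

g = 9.81                       # gravity, m/s^2
release_height = 1.8           # release height in m (assumed)
target_distance = 10.0         # horizontal distance to target in m (assumed)
target_centre = 1.5            # target centre height in m (assumed)
target_radius = 0.2            # target radius in m (assumed)

def height_at_target(angle_deg, speed):
    """Projectile height when it crosses the target plane (no air resistance).
    A throw that drops below the ground first just shows up as a large
    negative height, and so counts as a miss."""
    theta = np.deg2rad(angle_deg)
    vx = speed * np.cos(theta)
    t = target_distance / vx
    return release_height + speed * np.sin(theta) * t - 0.5 * g * t**2

angles = np.linspace(0, 80, 161)     # release angles, degrees
speeds = np.linspace(5, 30, 126)     # release speeds, m/s

hits = np.zeros((angles.size, speeds.size), dtype=bool)
for i, a in enumerate(angles):
    for j, s in enumerate(speeds):
        hits[i, j] = abs(height_at_target(a, s) - target_centre) <= target_radius

print(f"{hits.mean():.1%} of sampled (angle, speed) combinations hit the target")
```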

Thursday, 1 March 2018

General Ecological Information Does Not Support the Perception of Anything

One common critique of the ecological approach asks how we can use perception to explain behaviour that is organised with respect to things in the world that aren't currently around us. How do we plan for future activities, or how do we know that the closed fridge has beer in it?

A recent attempt to get ecological about this comes from Rietveld & Kiverstein (2014), who propose a relational account of affordances that enables them to talk about opportunities for more complex behaviours. This account has developed into the Skilled Intentionality Framework (e.g. Bruineberg & Rietveld, 2014), where skill is an 'optimal grip' on a field of task-relevant, relational affordances.

I have always had one primary problem with this programme of work: I don't believe that they can show how these affordances create information and thus can be perceived. I discuss this here and here, and there are comments and replies from Rietveld and Kiverstein there too. You can indeed carve the world up into their kind of entities, but if those entities don't create information then they cannot be perceived and they are irrelevant to behaviour.

I was therefore excited to see a new paper from the group called 'General ecological information supports engagement with affordances for ‘higher’ cognition' (Bruineberg, Chemero & Rietveld, 2018; henceforth BC&R). There is a lot of excellent work in here, but their proposal for general ecological information is, in fact, neither ecological nor information. It is a good way of talking ecologically about conventional constraints on behaviour, but it doesn't make those constraints perceivable, and so the main thesis of the paper fails.