Monday 19 April 2010

What else could it be? The case of the centrifugal governor.

Previously, I’ve dismissed the idea of mental representation because 1) no one knows what a representation is and 2) the arguments for representation tend to be pretty weak. Now, I’d like to spend a bit of time discussing a possible alternative – a dynamical systems approach to cognition. To frame this discussion, I’m going to summarise a very handy philosophy paper by Van Gelder (1995) in which he distinguishes between a computational and a dynamical solution to a particular problem (see also Andrew's post on the polar planimeter). Van Gelder has clearly picked a side - that cognition emerges from dynamical systems and that cognitive processes are evolutions in the state-space within these systems. One of the main arguments for computation is that it’s difficult to imagine what else could be going on (see the footnotes on p. 346 for references). So, Van Gelder wrote this paper, not to decisively rule out computation, but to provide an answer to the question “what else could [cognition] be?”


A major 18th century engineering problem was reconciling the oscillation of pistons with the rotation of flywheels. Driving a flywheel lets you generate rotative motion, rather than just the pumping motion that comes directly from the pistons. In other words, figuring out how to power a flywheel with pistons lets you power a wide range of machines. The trick is getting the flywheel to turn at a uniform speed. This feature was particularly desirable for the cotton industry, which wanted to replace horse and water power to run its spinning and weaving. However, flywheel speed varies in response to the current steam pressure and to the overall engine workload. And both of these factors are themselves variable. A throttle valve allows one to change the pressure of the steam, and therefore control the speed of the flywheel. But this valve would have to be adjusted by just the right amount at just the right time to keep the speed uniform.

The computational solution

There are two kinds of solutions to this problem. One solution would require something or someone to measure the state of the system at various points in time and adjust the valve by a certain amount in response to those measurements. This is an algorithmic solution. If the steam pressure is x and the workload is y, then adjust the valve by z. The first characteristic of this type of solution is that it proceeds in stages. The first stage takes measurements. The second stage applies a rule based on those measurements. It’s easy to imagine a person doing this job. Someone is trained to measure pressure and workload and to adjust the valve in response to the state of those variables. However, people are slow. Because this solution relies on two stages, there is necessarily some time lag between measurement and correction. Depending on the duration of this lag, the correction might be inappropriate for the current state of the system. This will be a source of error in maintaining a constant flywheel speed. If this problem were being addressed today, rather than in the 18th century, then we could imagine using a computer to measure the steam pressure and workload. This computer could be attached to a mechanical arm that implemented the valve adjustments. So, the error caused by the lag between measurement and adjustment could be minimised, but as long as the solution involves two steps, this source of error would still exist. This type of solution necessitates an executive – someone or something to take account of the state of the system (e.g., “if x”) and then to carry out the appropriate action (e.g., “then y”). It also necessitates measuring the difference between things. For instance, the only reason to adjust the valve is if the current speed differs from the speed a second ago.
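To make the two-stage structure concrete, here is a minimal Python sketch of the modern, computerised version of this solution. Everything in it is a hypothetical stand-in – the sensor and actuator functions (read_steam_pressure, read_workload, set_valve) and the rule itself are not real APIs – but the shape is the point: measure, apply a rule, adjust, repeat, with an unavoidable gap between measurement and correction.

```python
import time

def control_loop(read_steam_pressure, read_workload, set_valve, rule):
    """Algorithmic governor: measure, apply a rule, adjust, repeat."""
    while True:
        # Stage 1: take measurements of the relevant variables.
        pressure = read_steam_pressure()
        workload = read_workload()

        # Stage 2: apply a rule to those measurements
        # ("if the steam pressure is x and the workload is y,
        #  then adjust the valve by z").
        adjustment = rule(pressure, workload)
        set_valve(adjustment)

        # However short this pause, the adjustment is always based on
        # measurements that are already slightly out of date.
        time.sleep(0.1)
```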

The dynamic solution


There is another, radically different way to solve this problem: couple the opening of the valve to something that necessarily varies in response to steam pressure and workload in a way that results in constant flywheel speed. By ‘necessarily’ I mean that the physical properties of this thing respond to changes in steam pressure and workload in a particular way. Such a solution responds in one step and does not require measurement. Thus, there is no time lag problem or concomitant source of error. Nor does it require an executive. In our current example, the thing that responds to both steam pressure and workload is the speed of the engine itself. Hitching the valve control to the flywheel couples the thing you want to control (the valve opening) directly to the thing that embodies the relevant sources of variance (the flywheel speed). So far, so good. The remaining problem is connecting these objects with the proper relation. In this case, we want a negative relation between flywheel speed and valve opening - increased flywheel speed should close the valve, while decreased speed should open it. The solution is to attach weighted arms to a flywheel-driven spindle and to link the valve control to these arms (see pic). As the spindle turns, the arms raise and lower according to the centrifugal force generated by the speed of rotation. Faster spinning raises the arms, which pulls the valve closed. Slower spinning lowers the arms, which allows the valve to open. The degree of valve opening / closing varies continuously in response to the height of the arms. The result is uniform flywheel speed linked to oscillating pistons. This beautiful solution is implemented in the Watt centrifugal governor, an 18th century piece of technology that still works brilliantly.

Consequences for representation

The computational solution relies on discrete computational symbolic representations (for more info about discrete computational representations see here and here). For instance, it has to measure and represent flywheel speed, steam pressure, and workload using abstract symbols. Then, it has to apply operations to these symbols in order to calculate how to adjust the valve. This output is a representation that causes the appropriate adjustment to be made (via computer or human worker). Clearly, this solution proceeds in an ordered sequence. First measure, then apply algorithm, then adjust. It is also cyclic – after running through this sequence, you must go back to the beginning and run through it again.

The second solution, the one that describes the actual centrifugal governor, is nonrepresentational. Some (e.g., Dietrich & Markman) would argue that the angle between the weighted arms and the spindle represents flywheel speed. Van Gelder is adamant that this is not, in fact, a representation. He argues that a representation has to “stand[] in for some...state of affairs thereby enabling the system to behave appropriately with respect to that state of affairs” (p. 351). Using this loose definition, a good rule of thumb for deciding whether a system contains representations is to ask yourself whether the system makes more sense if you think about it in representational terms. If representations don’t seem to explain anything for that system, then why assume that it uses representations? By this definition, it’s clear that the centrifugal governor does not contain representations. In fact, I just described the whole device without referring to the angle between the arm and the spindle – the supposed representation of speed. It doesn’t matter that the arm angle correlates with engine speed. Lots of things correlate without being representations. Ice cream consumption correlates with accidental deaths, but ice cream consumption doesn’t represent death. I’m pretty sure. Furthermore, for the centrifugal governor, the correlation between arm angle and flywheel speed is only part of the story. Importantly, the arm angle doesn’t change in a 1:1 fashion with flywheel speed. For instance, if the flywheel suddenly decelerates, the arms don’t instantly drop to the matching angle – they fall gradually, under gravity and their own momentum. This lagged, nonlinear relationship ensures that the valve is adjusted smoothly, even when speed changes abruptly.

Because the centrifugal governor is nonrepresentational it is also noncomputational. Although the relationship between the arms, the flywheel, and the valve opening can be described mathematically, describing it this way does not mean the device applies algorithms to discrete, symbolic representations. There are no steps during which one representation is transformed into another. Input and output are simultaneous.

Another difference between the computational solution and the centrifugal governor concerns their relationship to time. In the computational solution, practical considerations dictate that the time between measurement and adjustment should be relatively brief. However, there is no necessary temporal relationship between steps. The computation could take 10 years to complete as long as this was sufficient to keep the flywheel spinning at a constant speed. However, in the centrifugal governor, the motion of the arms depends, at every instant, on the current speed of the engine.

Because these two solutions are so very different, different conceptual frameworks are used to describe them. The first solution is well described by algorithms and the conceptual framework of computer science. The second solution is described by dynamics. In particular, we need to understand how the arm angle changes over time. This depends most directly on the speed of the flywheel. In other words, we need to define arm angle as a function of flywheel speed. More completely, this angle will depend on its current state, how it is currently changing, and the engine speed (see equation on p. 356). Certain properties of this function stay fixed, such as the length of the pendulum and the gravitational constant. If we know the current arm angle, how it is changing, and the engine speed, then we can solve for its instantaneous acceleration. But, we also want to know what the arm angle will be one second from now. This requires finding a general solution that describes arm angle with respect to time, as well. Now, we also want to account for feedback – that is, how the arm angle affects the engine speed. To do this, we need to think of the engine, itself, as a dynamical system that is governed by a differential equation. In this system, arm angle will be a parameter, because it controls the valve opening. So, we have a case where we’re describing arm angle in terms of engine speed and we’re describing engine speed in terms of arm angle (this means that arm angle and engine speed are coupled). Furthermore, we can describe arm angle as a function of time – that is, we can see how arm angle evolves over time.
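As a rough illustration of what that coupling looks like when you write it down, here is a small Python simulation. The arm equation follows the standard form of the Watt governor equation that Van Gelder reproduces (arm angle theta, engine speed omega, gearing n, arm length l, friction r); the valve linkage, the engine equation, and all the numerical values are made up purely for illustration, since the post (and the equation on p. 356) leave them unspecified.

```python
import math

# Illustrative constants (not taken from the paper)
g, l, n, r = 9.8, 1.0, 1.0, 2.0    # gravity, arm length, gearing, friction
k, load = 5.0, 2.0                 # made-up steam gain and engine workload
dt = 0.001

def arm_acceleration(theta, theta_dot, omega):
    # Watt governor equation in the form Van Gelder reproduces (p. 356):
    # theta'' = (n*omega)^2 * cos(theta)*sin(theta) - (g/l)*sin(theta) - r*theta'
    return ((n * omega) ** 2) * math.cos(theta) * math.sin(theta) \
           - (g / l) * math.sin(theta) - r * theta_dot

def valve_opening(theta):
    # Stand-in linkage: higher arms mean a more closed valve.
    return min(1.0, max(0.0, 1.0 - theta / (math.pi / 2)))

def engine_acceleration(theta, omega):
    # Placeholder engine model: steam torque scales with the valve opening,
    # and the workload drags the flywheel down.
    return k * valve_opening(theta) - load

# Euler integration of the coupled system: arm angle is a parameter in the
# engine equation, and engine speed is a parameter in the arm equation.
theta, theta_dot, omega = 0.3, 0.0, 1.0
for step in range(200_000):
    theta_ddot = arm_acceleration(theta, theta_dot, omega)
    omega += engine_acceleration(theta, omega) * dt
    theta_dot += theta_ddot * dt
    theta += theta_dot * dt

print(f"arm angle: {theta:.2f} rad, engine speed: {omega:.2f}")
```

Whether this settles smoothly or hunts around its equilibrium depends on the made-up parameters; the point is simply that there is no measurement stage and no rule anywhere in the loop, just the coupled evolution of arm angle and engine speed over time.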

Consequences for cognition

While cognitive psychologists are often happy to admit that dynamical systems do a good job describing some systems like the centrifugal governor, they are hesitant to admit that dynamics might also characterise complex cognitive behaviour. Van Gelder provides an example of how one cognitive phenomenon, decision making, can be modelled as a dynamical system. Decision making is a good example, because it is canonically described with a discrete computational model. According to prospect theory, for instance, we order possible outcomes of a decision - which are losses and which are gains - and then compute the utility of each outcome, selecting the one with the highest utility (Kahneman & Tversky, 1979). This is obviously a scaled-down version of the theory, but you can see that it clearly depends on discrete representations (e.g., of each option’s utility) and computation (e.g., calculating which option has the largest utility value). However, it is also possible to describe decisions in terms of state space evolution in a dynamical system. For example, motivational oscillatory theory (MOT; cf. Townsend) describes oscillations resulting from satiation of persisting desires. We approach food when we’re hungry, but not when we’ve just eaten and are temporarily satiated. It’s possible to interpret this behaviour as a decision – when I’m hungry, I decide to eat. But, in this model there are no discrete states and no algorithmic processes effecting transformations on these states. There is just the evolution of the system over time. Furthermore, peculiarities in human decisions that cannot be accounted for by utility theory (e.g., the common consequence effect: http://en.wikipedia.org/wiki/Allais_paradox) emerge naturally in the dynamic framework.
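Townsend’s actual MOT equations aren’t reproduced in the post, so the sketch below is only a toy in the same spirit, not the real model: two continuous variables, “hunger” and “approach”, coupled so that hunger builds and is eaten down, with a little self-excitation on the approach variable so the system cycles between feeding and satiation. Nowhere in the code is there a discrete “decide to eat” step; the apparent decision is just a region of the state space the system passes through.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# A toy two-variable system (NOT Townsend's actual MOT equations):
# hunger builds at a steady rate and is depleted while approaching food;
# approach relaxes quickly toward a drive that depends on hunger and,
# weakly, on itself, which gives the system hysteresis and hence
# sustained cycles of feeding and satiation.
dt = 0.001
hunger, approach = 0.5, 0.0

for step in range(8000):
    drive = sigmoid(10.0 * (hunger - 1.0 + approach))
    hunger += (0.5 - approach) * dt             # builds slowly, eaten down quickly
    approach += 20.0 * (drive - approach) * dt  # approach tracks the drive rapidly
    if step % 400 == 0:
        print(f"t={step * dt:4.1f}  hunger={hunger:4.2f}  approach={approach:4.2f}")
```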
The ability to describe the governing problem in two completely different ways is a good illustration of the choice facing cognitive psychology. The computer/information processing model is so embedded in most people’s conception of cognition that it can feel impossible to characterise the system any other way. However, Van Gelder outlines a clear and promising nonrepresentational alternative based on dynamical systems. In the future I’ll tie this discussion back to Gibson and describe why this is the type of solution animals, who have to solve real problems related to survival, are likely to have evolved (see also here).


Kahneman, D. & Tversky, A. (1979). Prospect theory: An analysis of decision under risk. Econometrica, 47, 263-291.

Van Gelder, T. (1995). What might cognition be, if not computation? Journal of Philosophy, 92(7), 345-381.

3 comments:

  1. Just came across this. It is a very good summary of the perspective laid out in that 1995 paper.

  2. Wow, thank you so much for this!
    I'm a molecular biologist, and we're currently having a revolution in the field in that we are starting to analyse the "behavior" of cells dynamically and in detail (starting to understand how what we, as humans, perceive as "behavior" arises from systemic interactions and fluctuations), and this is such a good article, you explain everything so clearly!

    I mean, it is, in principle, possible to describe the behavior of single cells computationally (algorithmically) if we desired to do so (like they do in psychology for single humans), but for psychologists, who, for a very long time, tried to postulate that only humans are capable of complex behavior and bla-bla-bla have a soul full of Plato's "ideas", this is ridiculous: bacteria? "Behave"? No no no, they are merely "responding to the environment".


    Nevertheless, what bacteria do is just as complex as what people do - psychologists just can't see them thinking (and so dismiss what is too small and too hard for them to study as "primitive"). But for biologists bacteria are not primitive, never were, never will be. Even viruses are so complex in their behavior that there isn't a single one we actually 100% understand.

    And recently there are so many papers (and more to come) stating that single cells possess very complex "behaviour" that can be described in the same terms we describe human behavior - cooperation, cheating, etc. (even memory, and even memory passed through generations; see for example http://www.pnas.org/content/113/15/4224.abstract)

    The difference is that historically bacteria have been described in terms of "what molecules do what so that it results in these events". So the sequence of events in the bacterial world (and in cell communities, including human cells in a dish) is described as fluctuations within systems rather than as computations carried out by some entity (an executive), or, in other words, as the result of some "computation between representations of the real world inside of a bacterium".

    For bacteria we understand the mechanisms clearly enough to say for sure there aren't any "representations", and they are not necessary for a bacterium to, say, move towards an object that it "likes" or move away from something it "dislikes", or express altruism, or commit suicide. The representations are simply not necessary. A cell does not need to have an internal "map" of the world, or, in the case of a whole organism, of the state of the whole organism, to make decisions to behave the way it behaves. And I don't think humans have these maps either.

    In short, again, thanks a lot for this article, for me this is just a hobby and of course I am not a professional and might not even understand properly what you want to say but I just had so much fun reading this that I really want to say thank you.
