

Phase-Transitions and Computation

The Phase-Transition

As discussed in the previous section, there are two primary regimes of rules, periodic (Class II) and chaotic (Class III), separated by a third, transition regime (Class IV). Schematically, this appears as Langton's famous egg diagram:

Figure: Schematic drawing of CA rule space indicating relative location of periodic, chaotic, and ``complex'' transition regimes.

The transition regime has proven to have a complicated structure. In some experiments with the $ \lambda $ parameter, the transition regime acts as nothing more than a boundary line between periodic and chaotic behavior. Crossing over this line (by increasing or decreasing $ \lambda $) produces a discrete jump between behaviors. Quantitative analysis shows a discrete jump in statistical measures9 of the kind usually associated with first-order transitions10. In other experiments with the $ \lambda $ parameter, the transition regime acts as a smooth transition between periodic and chaotic activity. This smooth change in dynamical behavior was described in the experiment discussed in section [*]. This type of transition is primarily second-order11, also called a critical transition.
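The $ \lambda $ parameter itself is simple to compute: it is the fraction of entries in a CA rule table that map to a non-quiescent state. A minimal Python sketch (function names are illustrative, not from the thesis), using Wolfram's elementary CAs as the rule family:

```python
# Langton's lambda for a one-dimensional CA: the fraction of rule-table
# entries that do NOT map to the quiescent state (here taken to be 0).

def langton_lambda(rule_table, quiescent=0):
    """Fraction of transitions leading away from the quiescent state."""
    non_quiescent = sum(1 for out in rule_table if out != quiescent)
    return non_quiescent / len(rule_table)

def elementary_rule_table(rule_number):
    """Rule table for a Wolfram elementary CA (K=2 states, r=1 neighborhood)."""
    return [(rule_number >> i) & 1 for i in range(8)]

# Rule 110 (Class IV) has 5 of 8 non-quiescent entries.
print(langton_lambda(elementary_rule_table(110)))  # -> 0.625
```

For elementary CAs, $ \lambda $ is just the popcount of the rule number divided by 8; for larger state sets and neighborhoods the same ratio applies over the full rule table.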

To summarize: a qualitative description of CA behavior as the $ \lambda $ parameter is varied produces the full spectrum of dynamical behaviors. Furthermore, the pattern of transient lengths (a complexity metric) forms a sort of butterfly pattern, with the transition regime acting as the butterfly's body. Thus, we can order the dynamical behaviors with respect to their $ \lambda $-distance from the critical transition regime. This is precisely what one expects to see as one approaches and passes through a second-order phase-transition. Typically, physical systems passing through this type of phase-transition have transients that diverge to infinity, and their behavior becomes maximally complex and unpredictable. Thus, Langton hypothesized that it is this phase-transition structure which ``is responsible for the existence of most of the surface-level features in the diverse phenomenology of CA behaviors.''

Using this notion of a phase-transition, we describe12 the existence of and relationship between the four Wolfram classes in Figure [*].

Figure: Schematic drawing of CA rule space showing the relationship between the Wolfram classes and the underlying phase-transition structure.

CA and Computation Space

A natural question to ask is whether or not this phase-transition structure underlying CA ``phenomenology'' bears any resemblance to the ``phenomenology'' of computations. In other words, we would like to find CA behaviors that parallel the behaviors we see in computation. We already know that CAs can form logical universes in which we may construct universal computers. Hence, we look for analogues of computability and complexity.


The phase-transition structure tells us that near a critical $ \lambda $ value, the complexity of CA behavior rapidly increases. What this means is that it becomes much more difficult to characterize the long-term behavior of CAs near this critical value. This is primarily due to the increased length of transients. For relatively13 low values of $ \lambda $, it quickly becomes clear that the system under observation is heading towards long-term periodic behavior. For relatively high values of $ \lambda $, it quickly becomes clear that the system under observation is heading towards long-term chaotic behavior. However, for $ \lambda $ near the critical value, either of these long-term outcomes is possible. Coupled with very long transient lengths, this makes it effectively undecidable whether a given rule set and starting configuration will tend towards periodic or chaotic behavior. A genesis of the Halting Problem14!
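This effective undecidability can be made concrete in a small experiment: evolve a finite 1D CA and count the steps before a configuration repeats. The sketch below (function and variable names are illustrative, not Langton's) uses the fact that on a finite lattice every orbit is eventually periodic, but near the critical $ \lambda $ the transient before the cycle can exceed any practical cutoff:

```python
# Estimate transient length: evolve a ring of cells under an elementary CA
# rule and return the number of pre-cycle steps, or None if the cutoff is hit.

def transient_length(rule_table, state, max_steps=10_000):
    """Steps before the orbit enters its cycle, or None if undecided by cutoff."""
    seen = {tuple(state): 0}
    n = len(state)
    for step in range(1, max_steps + 1):
        state = [rule_table[(state[(i - 1) % n] << 2)
                            | (state[i] << 1)
                            | state[(i + 1) % n]]
                 for i in range(n)]
        key = tuple(state)
        if key in seen:
            return seen[key]  # first occurrence index = transient length
        seen[key] = step
    return None               # undecided within the cutoff: the CA analogue of
                              # a computation we cannot tell will halt

rule_110 = [(110 >> i) & 1 for i in range(8)]
print(transient_length(rule_110, [0] * 20 + [1] + [0] * 20))
```

A rule deep in the periodic regime (e.g. rule 0) settles in one or two steps; near-critical rules routinely exhaust the cutoff, which is exactly the observational problem described above.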


Complexity classes in computation refer to the time it takes for a halting computation to complete with respect to the size of its input. Typically, these classes have names such as constant, polynomial, and exponential, referring to the class of algebraic formula representing the relation between input size and computation time. The time it takes for a computation to complete can be thought of as the transient time leading to the fixed-point state of the solution. Thus, the complexity classes of computation are analogous to the structures produced in answering the question: Given a rule set and starting configuration, how long does it take for the CA to exhibit ``typical'', long-term behavior?

The ``Deep-Structure'' of Computation

We have been using the notion of an underlying phase-transition structure as an analogy for discussing the relation between computational and dynamical behaviors. In fact, what we want to show is that by assuming this underlying phase-transition structure, we are provided with ``a simple and straightforward explanation for the existence of, and relationship between, many significant features of the phenomenology of computation''15. (See Figure [*]).

Figure: Schematic drawing of computation space, indicating relative location of halting, non-halting, and undecidable regimes.

Langton gave the following four arguments for making this assumption:

  1. The existence of computability classes is explained by the separation between ordered and disordered regimes being dominated by a phase-transition.
  2. The existence of undecidability is explained by transients diverging to infinity on approach to a second-order phase-transition. This gives rise to the Halting Problem.
  3. The existence of complexity classes is explained by the divergence of transient length on approach to a second-order phase-transition.
  4. The existence of universal computation is explained by the quality of ``susceptibility''. Langton describes this quality as self-sensitivity to minute details of internal structure and external perturbations. ``This self-sensitivity in the vicinity of a critical transition is manifested in universal computations as their 'programmability'.''16
Thus, by assuming computation space has an underlying phase-transition structure, we are able to produce exactly those behaviors expected in a study of computational behavior.

Some Implications

Arising out of the existence of a fundamental connection between computation and phase-transitions is a wide variety of beautiful and powerful implications. We briefly discuss the implications introduced at the end of Langton's thesis.

Phase-Transitions and Information Processing

There is a fundamental equivalence between the dynamics of phase-transitions and the dynamics of information processing. This means that computation can be thought of as a special case of phase-transition phenomena and that the theory of computation can be used to explain phenomena in the theory of phase-transitions. Thus, we may look to the natural systems around us for emergent, information processing capabilities.

Material Phases as Dynamics

We all know the three phases of matter: solid, liquid and gas. However, there exists a critical temperature at which the surface between the gas and liquid phases disappears. The resulting matter is known as a supercritical fluid [3]. Thus, we may say that there exist two phases of matter: solid and fluid. What is very interesting is that even though the CA experiments existed in a non-physical substrate, we still ended up with two primary regimes of behavior: the static, solid dynamics of fixed-point and periodic structures, and the non-static, fluid dynamics of chaos. Langton hypothesizes that the categories solid and fluid are not merely material qualities but are instead dynamical qualities separated by a critical transition.

If the phases of matter can indeed be generalized as dynamical qualities, then the difference between a natural and artificial system seems less real. For, if the behavior of matter can be reproduced in a computer world ``then it is only a matter of organization to turn 'hardware' into 'wetware' and, ultimately, for hardware to achieve everything that has been achieved by wetware, and more.''17

Dynamics and Life Support

If we look for the beginning of life, we are tempted to focus on the phase-transitions that were occurring on pre-biotic earth. From them would have emerged information processing abilities. Those abilities may have become coupled with one another to form complex, meta-stable information structures in the physical substrate of earth (the primordial ``soup''). Those structures which gained local control over their parameters survived, while others were pushed past the event horizon of an attractor. In this sense, the ``living'' systems are precisely those systems that are able to avoid attractors. Evolution is then a repeated iteration whereby a system climbs out of one attractor into a higher-dimensional phase-space, only to find itself heading for a higher-dimensional attractor; a Red Queen18.

Looking at the processes and structures of living cells, we see the same sort of phase-transition conditions. A particularly good example of this is the dynamics of the olfactory bulb which during rest takes on low-amplitude chaotic activity and during inhalation falls into the basin of a periodic attractor. Thus, the olfactory bulb spends a good portion of its time in the transition regime between chaotic and periodic behavior. (See Figure19 [*]).

Figure: Comparison between schematic of CA rule space and bifurcation diagram of the phase portrait of olfactory bulb states.


That a phase-transition structure is underlying CA state space is not just a convenient analogy for describing how behavior varies with the $ \lambda $ parameter. By assuming this structure we are provided with powerful means for explaining the existence of many of the fundamental features of computational theory by allowing them to naturally arise out of a dynamical systems context. Conversely, the dynamical behaviors that are found in dynamical systems near a critical transition can be explained in terms of the computational abilities they may harbor. Thus, we are presented with new ways in which we may analyze and reanalyze both the natural and artificial systems surrounding us.


... measures9
Langton utilized theories from thermodynamics and information theory to analyze the changes in dynamical activity of evolving CAs as the $ \lambda $-parameter varied. In particular, complexity was measured using Shannon entropy which Langton first derived from Boltzmann entropy.

Average mutual information was plotted against the single-cell entropy. This plot revealed that the transition regime optimized system entropy to handle both storage of information (which reduces entropy) and transmission of information (which increases entropy). The plot took on the same shape that is characteristic of thermodynamic phase-transition graphs; in particular, the $ \lambda $-transition [3].
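As an illustrative sketch of the statistics described here (names are assumed, not from the thesis), single-cell Shannon entropy and pairwise mutual information can be estimated from a CA's space-time history:

```python
# Empirical Shannon entropy and mutual information, estimated from the time
# series of individual cells in a CA space-time history.

from collections import Counter
from math import log2

def entropy(symbols):
    """Shannon entropy (bits) of an empirical symbol sequence."""
    counts = Counter(symbols)
    total = len(symbols)
    return -sum((c / total) * log2(c / total) for c in counts.values())

def mutual_information(xs, ys):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), estimated from paired samples."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

# Toy history: one configuration per time step; track a cell and its neighbor.
history = [[0, 1, 0, 1], [1, 1, 0, 0], [0, 1, 1, 0], [1, 0, 1, 0]]
cell = [row[0] for row in history]
neighbor = [row[1] for row in history]
print(entropy(cell), mutual_information(cell, neighbor))
```

In Langton's experiments these quantities were averaged over cells and plotted against each other; the sketch above only shows the per-cell estimators.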

... transitions10
A transition for which the first derivative is discontinuous with respect to some, usually physical, quantity, such as temperature.

... second-order11
A transition for which the second derivative is discontinuous with respect to some, usually physical, quantity, such as temperature.
... describe12
This figure emphasizes that any scheme which attempts to partition CA rule space into a small set of qualities will inevitably be only a low resolution rasterization of the underlying, continuous reality.
... relatively13
With respect to the critical $ \lambda $ value.
... Problem14
See section [*].
... computation''15
Chris Langton in his thesis, p. 130. [6]
... 'programmability'.''16
Chris Langton in his thesis, p. 129. [6]
... more.''17
Chris Langton in his thesis, p. 135. [6]
... Queen18
In Lewis Carroll's Through the Looking-Glass, the Red Queen is seen running as fast as she can, but does not get anywhere because the scenery moves with her. This notion is taken up later as an analogy for evolution in Matt Ridley's The Red Queen: Sex and the Evolution of Human Nature.
... Figure19
Image from Chaos and Neurodynamics [12].

Jeremy Avnet (brainsik); Senior Thesis, Mathematics; University of California, Santa Cruz; 6th June 2000; email: jeremy (at)