As discussed in the previous section, there are two primary regimes of rule behavior, periodic (Class II) and chaotic (Class III), separated by a third, transition regime (Class IV). Schematically, this appears as Langton's famous egg diagram:

The transition regime has proven to have a complicated structure. In some experiments with the λ parameter, the transition regime acts as nothing more than a boundary line between periodic and chaotic behavior. Crossing over this line (increasing or decreasing λ) produces a discrete jump between behaviors. Quantitative analysis shows a discrete jump in statistical measures^{9} of the kind usually associated with first-order transitions^{10}. In other experiments with the λ parameter, the transition regime acts as a smooth transition between periodic and chaotic activity. This smooth change in dynamical behavior was described in the experiment discussed in section . This type of transition is primarily second-order^{11}, also called a critical transition.
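The λ parameter itself is straightforward to compute: it is the fraction of entries in a rule table whose output is not the quiescent state. A minimal sketch, assuming a dictionary encoding of the rule table; the choice of elementary CA rule 110 here is illustrative, not taken from the experiments described above:

```python
import itertools

def langton_lambda(rule_table, quiescent_state=0):
    """Langton's lambda: the fraction of rule-table transitions whose
    output is NOT the quiescent state."""
    non_quiescent = sum(1 for out in rule_table.values() if out != quiescent_state)
    return non_quiescent / len(rule_table)

# Encode elementary CA rule 110 as a table: (left, center, right) -> next state.
RULE = 110
rule_table = {
    (l, c, r): (RULE >> (4 * l + 2 * c + r)) & 1
    for (l, c, r) in itertools.product((0, 1), repeat=3)
}

print(langton_lambda(rule_table))  # 5 of 8 transitions are non-quiescent: 0.625
```

Sweeping λ from 0 toward 1 over randomly generated rule tables is what produces the progression of behaviors described in this section.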
To summarize, a qualitative description of CA behavior as the λ parameter is varied produces the full spectrum of dynamical behaviors. Furthermore, the pattern of transient length (a complexity metric) forms a sort of butterfly pattern, with the transition regime acting as the butterfly's body. Thus, we can order the dynamical behaviors with respect to their distance from the critical transition regime. This is precisely what one expects to see as one approaches and passes through a second-order phase transition. Typically, physical systems passing through this type of phase transition have transients that diverge to infinity, and their behavior becomes maximally complex and unpredictable. Thus, Langton hypothesized that it is this phase-transition structure which ``is responsible for the existence of most of the surface-level features in the diverse phenomenology of CA behaviors.''
Using this notion of a phase transition, we describe^{12} the existence of and relationship between the four Wolfram classes in Figure .

A natural question to ask is whether or not this phase-transition structure underlying CA ``phenomenology'' bears any resemblance to the ``phenomenology'' of computation. In other words, we would like to find CA behaviors that parallel the behaviors we see in computation. We already know that CAs can form logical universes in which we may construct universal computers. Hence, we look for analogues of computability and complexity.
The phase-transition structure tells us that near a critical value, the complexity of CA behavior rapidly increases. This means that it becomes much more difficult to characterize the long-term behavior of CAs near this critical value, primarily due to the increased length of transients. For relatively^{13} low values of λ, it quickly becomes clear that the system under observation is heading towards long-term periodic behavior. For relatively high values of λ, it quickly becomes clear that the system is heading towards long-term chaotic behavior. However, for λ near the critical value, either of these long-term outcomes is possible. Coupled with very long transient lengths, it becomes effectively undecidable whether a given rule set and starting configuration will tend towards periodic or chaotic behavior. A genesis of the Halting Problem^{14}!
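The undecidability in practice can be made concrete by running a CA until its global configuration recurs and measuring how long that takes. A hedged sketch for elementary (two-state, radius-1) CAs; the lattice size, step budget, and rule encoding are illustrative choices, not Langton's experimental setup:

```python
def transient_length(rule_number, initial, max_steps=10_000):
    """Iterate a 1D elementary CA with periodic boundaries until the
    global configuration repeats. Returns (transient, found): the step
    at which the first repeated configuration was originally seen, and
    whether any repeat occurred within the step budget."""
    rule = [(rule_number >> i) & 1 for i in range(8)]  # outputs indexed by neighborhood
    n = len(initial)
    state = tuple(initial)
    seen = {state: 0}
    for t in range(1, max_steps + 1):
        state = tuple(
            rule[4 * state[(i - 1) % n] + 2 * state[i] + state[(i + 1) % n]]
            for i in range(n)
        )
        if state in seen:
            return seen[state], True
        seen[state] = t
    return max_steps, False  # budget exhausted: behavior effectively undecided

# A finite lattice must eventually cycle, but the transient before it does
# is exactly the quantity that blows up near the critical regime.
start = [0] * 7 + [1] + [0] * 8  # 16 cells, single seed
print(transient_length(30, start))
```

Rules deep in the periodic regime return almost immediately, while rules near the transition can exhaust any practical step budget, which is the operational form of the halting question posed above.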
Complexity classes in computation refer to the time it takes for a halting computation to complete with respect to the size of its input. Typically, these classes have names such as constant, polynomial, and exponential, referring to the class of algebraic formula representing the relation between input size and computation time. The time it takes for a computation to complete can be thought of as the transient time leading to the fixed-point state of the solution. Thus, the complexity classes of computation are analogous to the structures produced in answering the question: Given a rule set and starting configuration, how long does it take for the CA to exhibit ``typical'', long-term behavior?
We have been using the notion of an underlying phase-transition structure as an analogy for discussing the relation between computational and dynamical behaviors. In fact, what we want to show is that by assuming this underlying phase-transition structure, we are provided with ``a simple and straightforward explanation for the existence of, and relationship between, many significant features of the phenomenology of computation''^{15}. (See Figure ).

Langton gave the following four arguments for making this assumption:
Arising out of the existence of a fundamental connection between computation and phase transitions is a wide variety of beautiful and powerful implications. We briefly discuss the implications introduced at the end of Langton's thesis.
There is a fundamental equivalence between the dynamics of phase transitions and the dynamics of information processing. This means that computation can be thought of as a special case of phase-transition phenomena and that the theory of computation can be used to explain phenomena in the theory of phase transitions. Thus, we may look to the natural systems around us for emergent information-processing capabilities.
We all know the three phases of matter: solid, liquid, and gas. However, there exists a critical temperature at which the surface between the gas and liquid phases disappears; the resulting matter is known as a supercritical fluid [3]. Thus, we may say that there exist two phases of matter: solid and fluid. What is very interesting is that even though the CA experiments existed in a non-physical substrate, we still ended up with two primary regimes of behavior: the static, solid dynamics of fixed-point and periodic structures, and the non-static, fluid dynamics of chaos. Langton hypothesizes that the categories solid and fluid are not merely material qualities but are instead dynamical qualities separated by a critical transition.
If the phases of matter can indeed be generalized as dynamical qualities, then the difference between natural and artificial systems seems less real. For if the behavior of matter can be reproduced in a computer world, ``then it is only a matter of organization to turn 'hardware' into 'wetware' and, ultimately, for hardware to achieve everything that has been achieved by wetware, and more.''^{17}
If we were to look for a beginning of life, we would be tempted to focus on the phase transitions that were occurring on prebiotic earth. From them would have emerged information-processing abilities. Those abilities may have become coupled with one another to form complex, metastable information structures in the physical substrate of earth (the primordial ``soup''). Those structures which gained local control over their parameters survived, while others were pushed past the event horizon of an attractor. In this sense, the ``living'' systems are precisely those systems that are able to avoid attractors. Evolution is then a repeated iteration whereby a system climbs out of one attractor into a higher-dimensional phase space, only to find itself heading for a higher-dimensional attractor; a Red Queen^{18}.
Looking at the processes and structures of living cells, we see the same sort of phase-transition conditions. A particularly good example is the dynamics of the olfactory bulb, which during rest takes on low-amplitude chaotic activity and during inhalation falls into the basin of a periodic attractor. Thus, the olfactory bulb spends a good portion of its time in the transition regime between chaotic and periodic behavior. (See Figure^{19} ).

That a phase-transition structure underlies CA state space is not just a convenient analogy for describing how behavior varies with the λ parameter. By assuming this structure, we are provided with powerful means for explaining the existence of many of the fundamental features of computational theory by allowing them to arise naturally out of a dynamical systems context. Conversely, the dynamical behaviors found in dynamical systems near a critical transition can be explained in terms of the computational abilities they may harbor. Thus, we are presented with new ways in which we may analyze and reanalyze both the natural and artificial systems surrounding us.
Average mutual information was plotted against single-cell entropy. This plot revealed that the transition regime optimizes system entropy to handle both the storage of information (which reduces entropy) and the transmission of information (which increases entropy). The plot took on the same shape that is characteristic of thermodynamic phase-transition graphs; in particular, the λ transition [3].
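Both quantities in that plot are simple to compute from a CA run. A minimal sketch, assuming single-cell entropy is the Shannon entropy of one cell's time series and average mutual information is taken between two aligned time series (the example series below are illustrative, not data from the experiment):

```python
from collections import Counter
from math import log2

def entropy(seq):
    """Shannon entropy, in bits, of a sequence of symbols."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def mutual_information(xs, ys):
    """Average mutual information I(X;Y) = H(X) + H(Y) - H(X,Y)
    between two aligned time series."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

# Perfectly correlated series: one bit of entropy, fully shared.
a = [0, 1, 0, 1, 0, 1, 0, 1]
print(entropy(a), mutual_information(a, a))  # 1.0 1.0
```

Storage-dominated (frozen) regimes drive entropy toward zero, transmission-dominated (chaotic) regimes drive shared information toward zero, and plotting mutual information against entropy across λ traces out the curve described above.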