
book reviews



Jack Cohen & Ian Stewart: The Collapse of Chaos:
discovering simplicity in a complex world
(Viking: 1994)


“At the heart of this book lies a paradox. The more we learn about the universe, the more complicated it appears to be, but we have discovered that beneath those complexities lie deep simplicities, laws of nature. How can simple laws explain complex behavior? Where does the complexity ‘come from’? ...[As well,] the universe does not always seem complex. In our daily lives, we experience the world as a simple place - in fact, we would be unable to function if we had to grapple with the complexities as such. So, in order to comprehend our world and humanity’s place within it, we must do more than just explain higher-level complexities in terms of lower-level simplicities. We must explain...where the simplicities of nature come from. The conventional answer is that deep down inside, nature is simple: It functions on the basis of simple laws. Any large-scale simplicities we observe - such as the spiral form of galaxies, or the tendency of a flock of geese to string out in a V - are just the underlying simplicities becoming visible on a higher level. Unfortunately, this answer is no longer convincing. Chaos theory tells us that simple laws can have very complicated - indeed, unpredictable - consequences. Simple laws can produce complex effects. Complexity theory tells us the opposite: Complex causes can produce simple effects. And conventional reductionistic science tells us that inside the great simplicities of the universe, we find not simplicity but overwhelming complexity.”
(Cohen & Stewart, pp.1-2)

The development of science - its techniques, methodologies, and philosophies - has been a long and complex one, with no sign of slowing down to date. Indeed, the rise of approaches such as chaos and complexity theory in recent times has further broadened science...albeit at the cost of many foolish claims that traditional “reductionist” science is either dead or outmoded - particularly from ill-informed Humanities scholars, who really ought to know better. Of the works written by specialists treating science as a whole - and the proper place of approaches such as chaos and complexity within it - none to my mind surpasses this, despite the fact that it is over ten years old, and deals extensively with rapidly-changing areas.

The reason for this is simple. Cohen & Stewart, rather than attempting the usual breathless outline of cutting-edge discoveries, arguments (and personalities) beloved of science journalists, instead offer us something much more fundamental (and particularly useful to outsiders)...a tour through the basic hierarchy of explanation alert to exactly how scientific understanding emerges, precisely what it entails, and how it is likely to develop in the future, as it attempts to grasp more and more complex matters:


“The Collapse of Chaos shows how simplicity in nature is generated from chaos and complexity...[in] interaction. We show that the same simple, large-scale features occur in many different complex systems, because patterns of this kind do not depend upon detailed substructure. The book is in two parts. The first half is about what science knows - and what it doesn’t. The second half is about how to think about what science knows - and what it doesn’t. The first half is a guided tour of the Islands of Truth that have been mapped by conventional science; the second half is an adventurous and unorthodox dive into the Oceans of Ignorance that surround them.... [In this,] we argue that simplicities of form, function, or behavior emerge from complexities on lower levels because of the action of external constraints. The focus moves from things to rules that govern things.... The final chapter combines content and context into two new concepts: simplexity and complicity. Simplexity is the tendency of simple rules to emerge from underlying disorder and complexity, in systems whose large-scale structure is independent of the fine details of their substructure. Complicity is the tendency of interacting systems to coevolve in a manner that changes both, leading to a growth of complexity from simple beginnings - complexity that is unpredictable in detail, but whose general course is comprehensible and foreseeable.... If either of us were writing this on his own, he would be much surer he was right but (paradoxically) much more cautious in presenting his ideas. Instead, our joint voice knows that it is probably wrong all over the place, but puts its ideas forward with immense confidence...[as] we believe that, even when we’re wrong, we’re constructively wrong - wrong in a more informative way than the orthodox story is right.”
(Cohen & Stewart, pp.1-4)

“Reductionism seeks to explain all patterns in nature, obvious or hidden, as simplexities arising from underlying simplicities. We think that many patterns do not fit this description at all; they are complicities, arising from internal complexities and simplicities under the influence of external complexities and simplicities. Because our brains themselves evolved through complicity between their internal representation of reality and the external reality itself - between their content and their context - they can recognize features, analogies, and metaphors, and see patterns in them.”
(Cohen & Stewart, p.435)

“The reductionist strategy - take it apart, see what the pieces are, understand how they fit together - provides simple explanations for many puzzling complexities. The behavior of atomic nuclei is explained by the properties and interactions of protons and neutrons; and we could have told you how these in turn are explained by the interactions of particles such as quarks, photons, and gluons operating under more exotic rules, but we didn’t want to go further in that direction. The numerology of electron shells explains the chemistry of elements as expressed in Mendeleev’s periodic table. Complementary behavior of electron shells, where an electron ‘missing’ in one atom can be provided by one that is ‘spare’ in the other, explains how they combine together to form molecules; and the Lego-block flexibility of those particular rules makes it clear that chemistry will be a complex area containing enormous diversity. The long-chain chemistry of carbon is a predictable consequence of special features of its ‘self-complementary’ electronic structure - four pimples and four sockets. The resulting complexity of organic compounds opens up the way to the organized complexity of DNA, proteins, and other important biomolecules. The structure and processes of simpler living things are determined by their DNA genetics, and its ability to code for proteins. More complicated life-forms develop through a sequence of stages, each of which is read out from the genetic structure in more or less the same way as happens for simpler organisms; but there is a hierarchical structure of genes that make proteins, genes that regulate those, and so on. The origin of life itself, or of complex life from simple life, is seen to be the inevitable consequence not just of these processes, but also of their inherent imperfections. Mistakes in copying can occur; most of them are disastrous but short-lived, but the occasional improvement flourishes and reproduces.
This selection mechanism provides the asymmetry that (usually) drives the evolutionary process toward increasing complexity, both of organisms and of their development.”
(Cohen & Stewart, p.179)

As this virtuoso act of summary suggests, Cohen & Stewart have a genuinely sophisticated understanding of the value of reductionist approaches - and of the knowledge these have produced - which makes their critique of exclusively reductionist thinking both pointed and highly persuasive. Moreover, they have a way with aphorisms any writer might envy, as well as what I might venture to call a fundamentally humanist outlook...perhaps best summarized here, in words that all intellectuals would do well to remember.


“Reality may be a figment of our imagination, as some philosophers argue, but our imagination is definitely a figment of reality.... [And] romanticism alone can seriously damage your mind, but reductionism alone can seriously damage your soul.”
(Cohen & Stewart, pp.429-31)

The result is an extremely important book, which shows no sign at all of dating - even if its key coinages have not been commonly accepted. Yet, the reasoning behind them is sound, the writing both clear and entertaining, and the jokes (opening each chapter) marvellously apposite...with more wit scattered throughout, particularly at the expense of the overly reductionistic, but always pointing toward proper understandings:


“Proponents of a Theory of Everything effectively believe that when scientists play Twenty Questions with nature, nature has already chosen the answer. The job of science is to find the unique word in nature’s dictionary that fits every conceivable question we could ask. The actual state of science is quite different: More often than not we get ourselves into a conversation something like this:

SCIENTIST: Does it have three letters?
NATURE: Yes.
SCIENTIST: Is it a color?
NATURE: Yes.
SCIENTIST: Does it begin with ‘R’?
NATURE: Yes.
SCIENTIST (triumphantly): Is it ‘red’?
NATURE: No.

When this kind of thing happens, scientists first check the questions and answers again. If everything still holds up, they are forced to reinterpret some of the questions, or some of the answers.

SCIENTIST: Is it a wave?
NATURE: Yes.
SCIENTIST: Is it a particle?
NATURE: Yes.
SCIENTIST: I think I’d better go away and invent quantum mechanics.”
(Cohen & Stewart, p.275)

“A theory is like a net. It catches what it’s designed to catch.... If you fish nature with the theory of gravity, you catch elliptical orbits; if you fish with quantum electrodynamics you catch light and electrons; if you fish with crystallography, you catch crystals. That’s great, because you can catch one type of thing without wasting your time on all the others. But a Theory of Everything is like a Net for Everything, a net that...would have a mesh so fine that it catches every atom in the ocean, and every particle of light. It would be a vast sheet of black plastic.... But if anybody asks you what’s in the net, you have no idea. It’s black, you can’t see inside, and even if you could, you can’t pick out anything interesting.... A Theory of Everything would have the whole universe wrapped up. And that’s precisely what would make it useless.”
(Cohen & Stewart, p.365)

“Science has developed paradigms for the same reason that mammals developed warm mothers. That new trick allowed the mammals to throw away a lot of unnecessary DNA programming [for different temperature conditions]; paradigms allow science to throw away a lot of unnecessary facts, by deriving them from general, simple laws. This is Medawar’s point that ‘theories destroy facts.’ Within a framework like science, successive generations of children have to learn less to know more. The same is true for successive generations of scientists...[as] science uses the basic trick of data compression. Replace a product by a process that generates it. Replace a list of planetary data by a general law that implies it. Replace tables of chemical properties by Mendeleev’s periodic table. Replace measurements made on generations of pea plants by Mendel’s laws of heredity.... Data compression is very effective, but there is a price to pay...[as] the true information-theory cost of data is not just how many bits it contains, but how difficult the decoding procedure is.... Scientists often object to the concept of God, on the grounds that it explains the universe too easily: You can’t see how it ‘works.’ God is a contextual Theory of Everything. But a reductionist Theory of Everything suffers from the same problem. The physicist’s belief that the mathematical laws of a Theory of Everything really do govern every aspect of the universe is very like a priest’s belief that God’s laws do. The main difference is that the priest is looking outward while the physicist looks inward. Both are offering an interpretation of nature; neither can tell you how it works.”
(Cohen & Stewart, pp.363-5)
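The authors’ dictum - “Replace a list of planetary data by a general law that implies it” - is easy to make concrete. A minimal sketch (ours, not the book’s; the rounded orbital figures below are standard textbook values chosen for illustration) shows Kepler’s third law compressing a whole table of planetary measurements into one rule:

```python
# Kepler's third law as data compression: twelve measured numbers
# collapse into the single rule T^2 = a^3 (T in years, a in
# astronomical units), plus tiny residuals.
planets = {
    "Mercury": (0.387, 0.241),
    "Venus":   (0.723, 0.615),
    "Earth":   (1.000, 1.000),
    "Mars":    (1.524, 1.881),
    "Jupiter": (5.203, 11.862),
    "Saturn":  (9.537, 29.457),
}

def kepler_ratio(a, t):
    """Return T^2 / a^3, which the law predicts to be ~1 for every planet."""
    return t**2 / a**3

ratios = {name: kepler_ratio(a, t) for name, (a, t) in planets.items()}

# The whole table is recoverable (to within measurement error) from
# one law - the "process that generates the product":
assert all(abs(r - 1.0) < 0.01 for r in ratios.values())
```

The residuals are the “price to pay”: the compressed form is only as good as the decoding procedure that regenerates the data.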

As should be clear by this stage, Cohen & Stewart are exemplary guides to science - homing in on the most important questions with outstanding skill, and treating them with the clarity (and humour) they (and we) deserve. As I noted earlier, however, they are more concerned here with the implications of what we know and how we know it, than with the specific details of theories and evidence, lending a philosophical air to the book without, however, making much use of the specialized language of philosophy. And, as those familiar with this site should be all too aware by now, I always prefer to source my philosophy from those willing to get their hands dirty in the relevant areas...


“The universe...looks like a pretty complicated place if we remove our commonsense blinkers, and look beneath our comfortable, illusory simplicities. If we don’t want to be caught napping when that complexity decides to bite us, we must come to terms with it. There are two main approaches. Recondite professions (such as astrology or plumbing) claim to handle these hidden complexities in their own terms, but shy away from any attempt to explain their methods. The astrologer who casts your horoscope and predicts the approach of a tall, dark stranger, and the plumber who produces an odd-shaped wrench to unscrew a nut you didn’t even know your sink possessed, are both keeping a lot up their sleeves. Science adopts a radically different approach. It claims to see beyond the apparent complexities to the underlying simplicities, which it calls laws of nature. By working with these simple laws, rather than trying to handle the complexities as complexities, science claims to render the world once more accessible to common sense. It is common sense on a much more refined level, common sense with different intuitions; but when a physicist argues that perpetual motion machines are impossible because of the law of conservation of energy, the general line of thought is just as simple and transparent as the statement that the cat needs some milk because it’s thirsty. Common sense, in short, has a lot going for it...when it is congruent to reality. When it is not, it can go horribly wrong - like, for example, throwing water onto a gasoline fire.... The slogan “Water puts out fires” sounds like common sense, but it [depends on the context.] ...The trouble is, our brains mostly think in such slogans. The word ‘comprehend’ originally meant ‘grasp.’ To understand something is to grasp it with your mind, to make it into an object that you can hold as a unit....
When protohumanity learned how to generalize about the structure of the natural world, to classify similar objects under identical labels - in short, to exploit the power of metaphor - it latched onto a wonderful trick.... [For,] mental computations must be in real time, so something quick and dirty is the order of the day. We have to think in slogans, because a really high level of congruence with reality takes too long.  So, a flash of black and orange is labelled ‘tiger,’ when it might be just a funny-colored leaf - because tigers can bite. It’s better to be safe than sorry.”
(Cohen & Stewart, pp.9-11)

“The universe appears to simplify at nonhuman scales, because we possess a very limited set of techniques for converting its behavior into human-scale effects, in both space and time...[and] we probably miss a lot of the fun by peering through glasses darkly. [But] physics takes a pragmatic and severely critical stance. It concentrates on simple, highly controlled systems; in return, it expects impeccable agreement between experiment and theory.... Physicists would argue, quite properly, that in the absence of evidence that wild electrons behave differently from tame ones, the onus of proof is on the skeptic. Physics deals with an invented, simplified world. That is how it derives its strength; this is why it works so well: Its raw material is of a type that can be placed in simple settings. Sciences like biology are less fortunate.... [In such sciences], especially those where really accurate measurements or repeated experiments aren’t possible, people nowadays tend to speak of ‘models’ rather than ‘laws.’ They look for underlying rules and regularities that explain a limited range of phenomena, in simple, graspable terms. From that point of view, ‘laws’ may be just spectacularly successful, very simple, models. The important thing is that, even though we can’t be certain that what we think of as laws of nature are actually true, we do see a lot of patterns and regularities in the world, and we can use these.”
(Cohen & Stewart, pp.12-19)

“Reductionism equips us with a variety of mental funnels, with complexities at the top, deeper simplicities below.... This structure of nested funnels provides a chain of logical explanation that leads in the reverse direction, ‘upward’ from simple laws to complicated features of the natural world. The resulting insights tend to be presented in a deductive form: ‘These laws imply this phenomenon, which explains that observation.’ In contrast, the discovery of the structure tends to be inductive: ‘This observation would make sense if that phenomenon were taking place; and that would make sense if nature obeyed these laws.’ [And] when we look down our reductionist funnels at the deeper levels of chemistry and physics, what we find is mathematics: wholistic numerology (electron shells), geometry (buckyballs), equations (Einstein’s famous ‘E = mc²’ relating energy to mass). The logic of reductionism is most precise in the mathematical depths, and it becomes progressively more fuzzy as we ascend to the more complex levels of biology. By the time we reach Darwinian evolution, the model has become verbal, rather than mathematical. It is, however, cast in very precise and subtle language, and much of it is supported by mathematical submodels. The explanatory logic is still very precise, but its style has subtly changed. [But] the reductionist strategy seems far less successful when we think about still higher levels of organization than evolution...[due to] the sheer complexity of the system we are trying to reduce.... Interactions can have a very tiny effect compared to those of individuals, but if the number of individuals gets big enough, then it is the interactions that matter most. Unfortunately, if the effect of any particular interaction is tiny, we may not be able to work out what it is.
We can’t study it on its own, in a reductionist manner, because it’s too small; but we can’t study it as part of the overall system, because we can’t separate it from all the other interactions. This is one of the main reasons why we don’t have effective explanations in ecology, epidemiology, or economics.”
(Cohen & Stewart, pp.180-2)

By quoting out of context - and similarly venerable techniques - many in the postmodern Humanities have managed to convince themselves that scientists such as Cohen & Stewart are actually supportive of the woolly, jargon-ridden, and self-satisfied relativism that passes for thinking in their circles. The reality, thankfully, is considerably different. Instead of amplifying legitimate concerns about representation, say, beyond sense - or dismissing them outright, as many might prefer - they take the hard-nosed pragmatic line common to most thinking scientists, which demonstrates exactly why the relativistic excesses of postmodernism are so misguided:


“It is undeniable that the patterns we can make explicit are limited by the material available to our imaginations. In 1963, Benoit Mandelbrot introduced the new concept of a ‘fractal,’ a geometric form with [self-similar] fine structure on all scales of magnification. That concept has since become a remarkably pervasive influence in scientific thought. Before 1963, one of the simplicities that pervades the 1990s picture of the world was - missing. Just as the concept ‘sphere’ unites raindrops, planets and suns, so now we perceive a unity between such diverse objects as trees, clouds, and coastlines. They are irregular, but with the same kind of irregularities. Before the simplicity ‘fractal’ was introduced, it was not only impossible to express this unity, it was pretty much impossible to notice it. This same example, however, does make us ask whether our mental patterns are genuine reflections of reality.... [Nevertheless,] forms are the result of processes, and congruences of processes are metaphors with genuinely useful content.... In recent years, a fecund mathematics has generated innumerable ‘new’ mental images, such as catastrophes, chaos, fractals, that might be advance warning of new simplicities in the world. Each extends the list of patterns that we can name, recognize, and manipulate. [But] it is not clear that all such patterns must necessarily prove operationally congruent to reality...[for] the universe that we experience is in a very real sense a figment of our imagination. However, this does not in any way imply that the universe itself has no independent existence...[since] our brains are ‘figments of reality’...[which] have survived millions of years of natural selection for congruence with reality. [And,] what better way to build simplified models of the world than to exploit simplicities that are actually there?”
(Cohen & Stewart, pp.23-7)
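The “simplicity ‘fractal’” the authors describe does have a precise core: a shape assembled from N copies of itself, each scaled down by a factor s, has similarity dimension log N / log s. A minimal sketch (the examples are standard mathematics, chosen by me to illustrate the quote, not drawn from the book):

```python
from math import log

# Similarity dimension D = log(copies) / log(scale). Ordinary shapes
# give whole numbers; fractals such as the Koch curve fall in between,
# which is what lets "the same kind of irregularities" be named at all.
def similarity_dimension(copies, scale):
    return log(copies) / log(scale)

examples = {
    "line segment (2 halves at 1/2)":       (2, 2),  # D = 1
    "filled square (4 quarters at 1/2)":    (4, 2),  # D = 2
    "Koch curve (4 copies at 1/3)":         (4, 3),  # D ~ 1.262
    "Sierpinski triangle (3 copies at 1/2)": (3, 2), # D ~ 1.585
}
for name, (n, s) in examples.items():
    print(f"{name}: D = {similarity_dimension(n, s):.3f}")
```

Before the concept existed, there was simply no number to attach to the shared irregularity of trees, clouds, and coastlines; afterwards, it is one line of arithmetic.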

Another major strength of this work lies in its accessible approach to key aspects of scientific thinking - such as the rise of chaos theory and mathematical/computer modelling - treating not only their foundations, but also offering readers an informed understanding of their strengths...and weaknesses. And in a world increasingly dominated by opaque expertise, such understandings are crucially important.


“The common feature of...unpredictable yet deterministic systems is the process ‘stretch and fold.’ If the dynamics kneads the system like a lump of dough, stretching it out and folding it back on itself, then states that are close together always get pulled apart. On the other hand, states that are far apart may suddenly be folded together. The system can’t settle down to anything simple, because simple structures get pulled to bits, but it can’t escape altogether, because it’s perpetually folded back into the same space. Like a ball in a pinball machine, it is pushed away from all the pins - the simple types of behavior - but it can’t escape from the table. What do you do if you’re not allowed to behave simply, but you can’t get away? You are forced to do something complicated. The pinball bounces from pin to pin, never doing the same thing twice. This kind of complicated behavior, produced by simple, deterministic rules, is called chaos. Before computers became powerful enough, hardly anybody noticed it could occur; whenever they ran into it, the problem got too hard, so they gave up. They didn’t ask why it had gotten too hard; they just went off and worked on a different problem. Now that our computers are up to the task, the dreadful truth has become inescapable: Chaos is everywhere. It is just as common as the nice, simple behavior so valued by traditional physics.”
(Cohen & Stewart, p.190)
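The stretch-and-fold mechanism is easy to demonstrate numerically. A minimal sketch using the logistic map x → rx(1−x) at r = 4 - my choice of textbook example; the book’s pinball image is qualitative - shows two states that differ by one part in ten billion being driven apart:

```python
# Deterministic chaos from "stretch and fold": the logistic map at
# r = 4 repeatedly stretches [0,1] to double length and folds it back
# on itself, so nearby states are pulled apart.
def logistic(x, r=4.0):
    return r * x * (1.0 - x)

x, y = 0.2, 0.2 + 1e-10   # two states, identical to ten decimal places
widest = 0.0
for _ in range(100):
    x, y = logistic(x), logistic(y)
    widest = max(widest, abs(x - y))

# The tiny initial difference roughly doubles each step until it is
# of the same order as the states themselves:
assert widest > 0.1
```

The rule is as simple as rules get, yet after a few dozen steps the two trajectories are effectively unrelated - which is exactly why “hardly anybody noticed” before cheap computation made such experiments routine.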

“Mathematics wallows in emergent phenomena. It also came to terms, long ago, with something that often puzzles nonmathematicians. By definition, all mathematical statements are tautologies. Their conclusions are logical consequences of their hypotheses. The hypotheses already ‘contain’ the information in the conclusions. The conclusions add nothing to what was implicitly known already. Mathematics tells you nothing new. Except, of course, that it makes things explicit rather than implicit.... [But] it’s not enough for something to be true; you have to know it’s true, and be able to explain why. Otherwise, you don’t know whether it’s safe to use it. [And,] far from being unimportant or tautologous, higher-level simplicities are the bread and butter of science - the simple, recognizable features of otherwise complicated theories that we use to understand the natural world. The representation of a higher-level simplicity as an explicit consequence of lower-level complexities tells us interesting things about the connection between the levels. But often it adds no useful gadgets to a working scientist’s tool kit, because a high-level simplicity is much easier to think about than some chain of consequences that causes it. When you use a hammer, you don’t want to worry about its molecular structure. Mathematicians know this well. ‘A theorem,’ said Christopher Zeeman, ‘is an intellectual resting point’ - something you can stand on to proceed further. Something you can know, can encapsulate, grasp as a whole. Key scientific concepts have this same quality.”
(Cohen & Stewart, pp.234-5)

“Higher levels of the reductionist story use mathematics as a metaphor, not as a precise representation of nature.... [Yet] even though mathematical models do not correspond to the whole of reality - indeed, because they do not correspond to the whole of reality - they offer definite advantages. Because mathematics is more precise than words, it can handle more delicate distinctions. It can also direct attention to features that are not directly observable, such as average infection rates. And it can be used in thought experiments to show that many of our cherished beliefs - such as that in the impossibility of systems spontaneously becoming more complex - are false. Mathematical models have a disadvantage, too - a trap that has caught more than one top scientist.... The quality of a mathematical conclusion is determined by a lot more than just the accuracy of the calculations. There are three main types of mistake. Errors made within the model are the easy type to spot. Harder are errors made in the explicit assumptions that lie behind the model. The hardest of all to spot are the implicit assumptions in the worldview that suggested the model.... Impeccable mathematics can produce nonsense, if it is based on nonsensical assumptions. ‘Garbage in, garbage out,’ as the computer scientists say. You might expect a book by a mathematician and a biologist to praise the precision of mathematics as a tool for digging out surprising biological truths. On the contrary, we both warn you not to take mathematical models too seriously. Surprising consequences are fine, but consequences so surprising that they don’t make any sense are almost certainly based on false assumptions. Don’t be impressed by mathematics just because you can’t understand it.”
(Cohen & Stewart, pp.184-6)
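The quote’s point about models directing attention to unobservables such as “average infection rates” can be illustrated with the classic SIR epidemic model - my example, not one worked in the book, and the parameter values are invented for the sketch:

```python
# A minimal SIR epidemic model, Euler-stepped:
#   dS/dt = -beta*S*I,  dI/dt = beta*S*I - gamma*I,  dR/dt = gamma*I
# beta (infection rate) and gamma (recovery rate) are not directly
# observable; the model makes their ratio R0 = beta/gamma visible.
def sir(beta, gamma, s0=0.99, i0=0.01, dt=0.1, steps=2000):
    s, i, r = s0, i0, 0.0
    peak = i
    for _ in range(steps):
        new_inf = beta * s * i * dt
        new_rec = gamma * i * dt
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        peak = max(peak, i)
    return s, peak

s_epi, peak_epi = sir(beta=0.5, gamma=0.2)  # R0 = 2.5: real epidemic
s_fiz, peak_fiz = sir(beta=0.1, gamma=0.2)  # R0 = 0.5: outbreak fizzles
assert peak_epi > 0.1 > peak_fiz
```

It also illustrates the authors’ warning: the arithmetic above is impeccable whatever values of beta and gamma you feed it, so the conclusions are only as good as the assumptions - garbage in, garbage out.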

“Over the centuries, scientists have devised a working philosophy that places the emphasis upon simplicity. The principle known as Occam’s razor asserts that assumptions should not be made unnecessarily complicated.... So, when scientists select theories, they do not use just the criterion of agreement or disagreement with observations. They also have aesthetic principles in mind. They want the theory to be universal, not peculiar to some particular place and time. They want it to be elegant, not held together with chewing gum and string. They use these aesthetic principles to remove the cloud of ‘trivially’ competing theories that necessarily surrounds every theory. Paul Dirac took a rather extreme view, saying that he would prefer a false but beautiful theory to a correct but ugly one. [But] Occam’s razor isn’t a scientific theory; it’s a philosophical principle, a meta-theory, a theory about theories. And it has its problems. At any given instant in the development of science, Occam’s razor is great for chopping away unnecessary detail, and concentrating your mind on what currently seems to matter. However, as science develops, theories that started simple tend to get more complicated...[and] over time, we [also] revise our views of what is or is not simple.... The story of science is that of repeated revolutions in our conception of the simple.”
(Cohen & Stewart, pp.225-8)

Ian Stewart is a mathematician...who lends this work’s treatment of mathematics his flair and deep insight. Jack Cohen, on the other hand, is a developmental biologist - a very different realm, one would think - although it is the coming together of their thoughts in unexpected consilience that has delivered the key insights of this book. However, the most important sections of the work are, arguably, those centred around developmental questions...since these show most clearly how necessary pluralist perspectives are, and how essential it is to consider context:


“Evolution mostly leads to increased complexity in individual organisms. Complexity is downhill to evolution.... There are two rather difficult ideas to be grasped. The first is that it is ‘easier’ to add stages onto an already effective sequence, than it is to modify earlier steps in the sequence. So, most innovations that offer a competitive edge are refinements that complicate (and often enlarge) the adult stage of organisms. The second idea concerns the kind of innovation: It is more likely that competitive advantage will be gained by adding something than by removing it. Neither idea is universally valid, but both are true far more often than they are false.... The reason [for the first] is quite simple. When you add a new stage, you can build on what already exists.... Don’t forget, we’re talking of modifying a process here, not just changing the product...[and] if you tinker with an early stage, you may well mess up everything that happens afterwards.... The second idea is that improvements generally involve extra gadgetry rather than less. This is not a rigid rule either, but again it makes a lot of sense. Suppose you decide to remove something. It was presumably there for a reason, so you’ll lose whatever advantage it originally conveyed. The advantage you gain by simplification either has to be so good that you don’t mind, or (more likely) the thing you’ve removed has already become obsolete. Both of these methods for improvement are uncommon - though striking when they work.”
(Cohen & Stewart, pp.135-7)

“Even in an isolated ecology with limited raw materials, evolutionary pressures lead to diversity and the occupation of ever more specialized niches. But this process of continuing complication can’t go on forever.  Living creatures are forced by evolutionary pressure to operate right at the limits of what they are capable of...[and] there may come a time when the ‘style’ of an organism - its system of organization - starts to get top-heavy. Having chosen to specialize, all it can then do to improve is to become more specialized; it’s trapped in an evolutionary dead end.... In such circumstances, it is evolutionarily worthwhile for some of the competitors to cut out the later, complicated part of their life history; this excision results in neoteny (omitting the adult stage) or progenesis (breeding as a larva), depending on whether stability or exploitation is the background rhythm. If it is advantageous to stay the same, to continue to occupy a well-developed and canalized niche, then you get neoteny; if there is a new niche to be developed out of the old one and exploited to the full, the result is progenesis.... From this new basis, a new set of competing complications can be established. Advance, retreat-and-consolidate, advance again.”
(Cohen & Stewart, pp.140-1)

“The standard textbook story is of a random mutation in the DNA providing a range of developmental variants (many of which are lethal: The developmental system crashes). However, a single-gene mutation rarely has a predictable effect upon the development of an organism. Waddington showed this clearly: The extent to which development is affected by either an environmental kick or a gene difference depends upon what other genes are present...[and] when the mutation first happens there is always a normal version available on the equivalent chromosome from the other parent.... More subtly, in canalized development, where the entire system has stabilized itself against environmental and genetic changes, there are all kinds of alternative routes to each stage or function, often because the original gene sequence has been duplicated, or indeed multiplied.... [Moreover,] there is an even more subtle reason why most DNA changes (and most environmental differences) don’t affect the developmental program in any obvious way. This is because the systems concerned are versatile. Like the subprogram for skin, which permits any size of embryo to grow within it without bursting, most of the developmental programs of most animals and plants have lots of if-then contingency plans.... Such built-in versatility, in which the ends are predictable but the means can be varied, is characteristic of life.”
(Cohen & Stewart, pp.141-3)

“At the end of chapter 3, we likened the developing egg to a new computer that is provided with a start-up disc by Mother. We must now take a more contextual view, and amend that image.... Mother provides the ‘hardware,’ the cell that begins to develop; the new item is the software, the DNA inside the egg (provided by combining sequences from both father and mother and peculiar to the developing infant). That is, Mother provides the whole computer, not the start-up disk. Indeed, there is no start-up disk: The computer is up and running before the infant’s DNA software is inserted into it, and only then does it begin to obey the program on the infant DNA. Notice how different the roles of infant DNA (content) and mother (context) now become. In the reductionist image of chapter 3, all of the magic is in the infant DNA...[and] the maternal context is just a starter motor to get the process going. But in our new image, all of the magic lies in the maternal context, the fully functioning egg into which a naive nucleus containing infant DNA is inserted and run. [And] it is because the context is at least as important as the content that we can envisage a case in which the same DNA message has vastly different developmental meanings.... To twist one of [Richard] Dawkins’s own images to our own purposes: The Blind Watchmaker is indeed blind, so it can’t read its own blueprints. Biological development is a complicated transaction between the DNA ‘program’ and its host organism; neither alone can construct a creature, and neither alone holds all the secrets, not even implicitly.”
(Cohen & Stewart, pp.298-306)

Another key aspect of The Collapse of Chaos is Cohen & Stewart’s insightful critique of extensions of Information Theory - and other decontextualized approaches - into the realms of meaning. And, as usual, their analysis (and use of examples) makes clear just what a ridiculous stretch this is...and how grossly inappropriate such models are for dealing with anything we would consider complex enough to be called a “message”.


“The standard quantitative measure of information - the number of characters in the message - is well designed for its original purpose, that of informing the engineer who is required to build devices to transmit and receive the message. Those devices don’t care what the message means. Meaning is a quality, not a quantity, and it is highly dependent upon context.”
(Cohen & Stewart, p.289)
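The point is easily made concrete. The following toy sketch (my own illustration, not the book’s) shows that Shannon’s measure depends only on symbol frequencies: a meaningful message and a meaningless scramble of the very same characters score identically.

```python
import math
from collections import Counter

def shannon_entropy(msg: str) -> float:
    """Bits per character, computed from symbol frequencies alone."""
    n = len(msg)
    return -sum((k / n) * math.log2(k / n) for k in Counter(msg).values())

meaningful = "send help immediately"
scrambled = "".join(sorted(meaningful))  # same characters, meaning destroyed

# Both of the engineer's measures are blind to the difference:
print(len(meaningful) == len(scrambled))  # True: identical length
print(shannon_entropy(meaningful), shannon_entropy(scrambled))  # equal, to within rounding
```

Two strings of the same length and character counts are indistinguishable to this measure, whatever either of them means - which is precisely the authors’ point.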

“The meaning in a language does not reside in the code, the words, the grammar, the symbols. It stems from the shared interpretation of those symbols in the mind of sender and receiver. This in turn stems from the existence of a shared context. For language, the context is the culture shared by those who speak the language. For the DNA message, the context is biological development [and] if the manner by which DNA code is transformed into creatures is ignored, we have no idea whatever of the possible complexity of the creature that results from a given segment of DNA.... [Therefore,] prescription is closer to the mark than description, not just for DNA but for any message outside the abstract setting of information theory, which deliberately strips away context. A prescription from the doctor is not a cure in itself; it only becomes one when taken to a drugstore, received by a pharmacist, and acted upon. All messages in the real world that really are messages happen within a context. That context may be evolutionary, chemical, biological, neurological, linguistic, or technological, but it transforms the question of information content beyond measure. We understand this point for technology. We don’t usually try to play a compact disc on a telephone answering-machine. But when thinking about the natural world, we often forget that we don’t know how much contextual input there is into processes we like to model as ‘message sending.’”
(Cohen & Stewart, pp.354-5)
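This context-dependence has a familiar computing analogue (again my own illustration, not the book’s): the very same bytes “mean” entirely different things depending on the interpretive context applied to them.

```python
import struct

payload = b"\x00\x00\x80\x3f"  # one fixed four-byte "message"

# Read in an integer context...
as_int = struct.unpack("<i", payload)[0]    # 1065353216
# ...and in a floating-point context:
as_float = struct.unpack("<f", payload)[0]  # 1.0

print(as_int, as_float)
```

Nothing in the message itself selects between the two readings; the “meaning” is supplied wholly by the decoder - just as, the authors argue, the developmental context supplies the meaning of the DNA “message”.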

There is much of great merit in this book which I do not have space to consider here: on the proper care and housing of Schrödinger’s cat, why the entropy of the universe cannot be increasing, and perhaps the best short treatment of Gaia theory I have yet encountered.

However, as the authors note in their introduction, the most original portions of the book treat that which they refer to as simplexity and complicity...the latter, in particular, suggesting how future science may come to treat areas mainly seen now as intractably complex. And, whilst these coinages have not been adopted to date, I suspect the ideas behind them have been taken very seriously. Because they make compelling sense regarding questions which science has only just begun to ask...


“Emergent simplicities are the peaks in the landscapes of the possible...[and] the ‘big questions’ in science are about the big peaks. Reductionism tries to understand them by digging deep down inside the peak, to see what lies beneath it, what it’s built upon. But mountain peaks aren’t built by piling up long thin tubes of rock; they come from the overall folding of the total landscape...[and] nature’s geography is not the geography of laws, but of the landscape that emerges from those laws. We also need a description that makes sense on the level of the landscape itself.”
(Cohen & Stewart, p.395)

“We shall give the name ‘simplexity’ to the process whereby a system of rules can engender simple features. Simplexity is the emergence of large-scale simplicities as a direct consequence of rules. Newton’s laws - rules - of motion have direct mathematical implications about centers of mass, energy conservation, and so on. You can write them down in a few pages, grasp them in their entirety.... An important point about simplexities is that their presence is guaranteed, once you have the rules. Any system with the same rules will necessarily exhibit exactly the same simplexities.... Simplexity is, appropriately, a relatively simple concept. It is the easy way for different rules to generate similar or even identical features; it works because the rules themselves are very similar. Now we consider something much more subtle, in which totally different rules converge to produce similar features, and so exhibit the same large-scale structural patterns. We call it complicity. For example, let’s think about the transmission of malaria.... What is special about this kind of system is that the interaction of several subsystems enlarges the space of the possible. There’s nothing remotely like malaria in any of the component spaces on their own...but when those spaces interact, they open up entirely new possibilities.... Simplexity merely explores a fixed state of the possible. Complicity enlarges it. And both processes collapse the underlying chaos, producing stable features from a sea of complexity and randomness. [But] complicity, by its nature, is so intricate and convoluted that any attempt to dissect out its internal workings and past history just leads to the Reductionist Nightmare. Despite this, there are patterns to complicity - patterns that let us recognize its presence. They are meta-rules, large-scale universals. You couldn’t have predicted malaria, in all its gory detail, from the interaction of blood space and bloodsucker space. But with a bit of imagination, you could have predicted the universal pattern ‘parasite’ and guessed that the combined space [bloodsuckers, flight, multiple hosts] opened up new niches for parasitism.”
(Cohen & Stewart, pp.411-15)

“These meta-rules and meta-meta-rules work on the level of features, and only on that level. Their explanations do not lie inside the complexities of the component subsystems, which rapidly diverge as you progress to deeper reductionist levels. The meta-rules are emergent, not reductionist. Indeed, they are so nonreductionist that it doesn’t even matter whether a given feature arose through simplexity or complicity. As long as it looks the same to the outside world, that’s all that matters. This fungibility or universality is what makes the patterns universal, and it’s what lets them collapse chaos.... Complicity arises when simple systems interact in a way that changes both, and erases their dependence on initial conditions. The hallmark of complicity is the occurrence of the very same feature or features, in systems whose rules are either known to be very different, or are expected to be very different if only we could find out what they are. This carries an important consequence: Complicity is a convergent process; it homes in on the same features regardless of the fine detail in the rules. Another way to say this is that complicity leads to ‘replaceability’ of (some) components.... [Moreover,] because complicity, by its nature, is convergent...it cannot be reduced to a particular system of rules in any useful manner.... We can nevertheless explain roughly how it works, by appealing to the idea of spaces of the possible. Continuing to focus on the DNA/organism example...there is a feedback loop between the two spaces, which cause them to coevolve toward a common dynamic.... [And] because the two spaces have very different geography, their individual attractors don’t match up nicely, so the feedback between the spaces has a creative effect...[generating] a new, combined geography that in no sensible way can be thought of as a mixture of the two separate geographies.”
(Cohen & Stewart, pp.415-21)

Cohen and Stewart’s key example of complicity is evolution - the crucial subsystems being the chemistry of DNA and the systematic ways organisms interact with their environments - and the plethora of examples of convergent evolution provides a most compelling argument for the concept. As such, it is perhaps not so surprising that evolution is so widely misunderstood - even by many scientists - for proper understanding requires us to stretch our minds in several deeply unfamiliar ways...


“Few of our daily experiences equip us to think sensibly about evolutionary systems. We tend to act on simplified models of the world; we seldom think about the effects of small changes over huge periods of time; and we almost never try to tackle anything remotely as complex as the totality of life on earth. Many aspects of evolution run counter to our intuition.”
(Cohen & Stewart, p.98)

Jack Cohen & Ian Stewart’s The Collapse of Chaos is far more than an introduction to chaos and complexity theory...or, indeed, than a grand tour of the hierarchies of understanding which structure conventional science. It is both of those things, as well as a modest proposal to expand the nature of science to properly incorporate contextual factors, an insightful (and moderate, read: useful) critique of excessively reductionist approaches to complex areas, a meditation on the nature of natural laws, and a highly readable and entertaining book to boot.

This does not mean - of course! - that it is flawless...in particular, I found chapter eleven’s promotion of the Aquatic Ape and Meme theories to be markedly weak...with the latter even going against the contextual thrust of the work (when properly understood). Nevertheless, this is a work whose unique virtues more than amply outweigh its few faults. And, as an introduction to scientific thinking/understanding - both mainstream and emergent - it has not been bettered. Read it, and see...


“We began by asking if the universe is simple or complicated. The answer: It depends on the context you have in mind when asking the question, and the kind of answer you want.... Nothing is as simple - or as complex - as we thought when we began. Simple rules can breed simple behavior or complex; complex rules can breed simple behavior, or complex. Contrary to common belief, complexity is one of the least conserved quantities in the universe. So are those things that go with it, such as information, meaning, organization, awareness. You can sometimes get something for nothing - or nothing for something.... We think the key is to understand complicity, not as an incredibly complex reductionist network, but as the interaction of features within different spaces of the possible. That is, we must put the dynamics back into biological development, evolution, and brain function, with the emphasis being on qualitative forms and features. Think of DNA and organisms. We’ve already argued that the organism does not ‘see’ the DNA code, but only those features of it that produce particular effects that matter to the organism. Similarly, DNA does not ‘see’ organisms; all that matters to DNA is that the organism bearing it should survive to replicate it. This is the ‘selfish gene’ image, but as one aspect of a double-edged process, not as the sole factor. Each system reacts only to the features of the other. So what we need is a theory of features, an understanding of how the geographies of spaces of the possible conspire to create new patterns, and combined dynamics.... We’re not saying that the reductionist approach should be abandoned, and we’re certainly not advocating replacing it by Just So Stories. But we think that too much of the emphasis currently placed on reductionism stems from the Panda Principle: It was there first, and its devotees won’t let anything else displace it. And we think that a lot more effort should be put into questions such as meaning, structure, and development, so that science can combine internals and externals into a single, coherent scheme.”
(Cohen & Stewart, pp.441-3)


John Henry Calvinist