Simplexity: Why simple things become complex and how complex things can be made simple

Simplexity by Jeffrey Kluger is a very frustrating read.  It starts off with great promise; by page 28 and the “Complexity Arc” diagram, he has identified one of the biggest unsolved problems in complexity theory.

The problem is this: we would like a well-defined, mathematically precise definition of complexity, the kind of complexity that we see in biological life, but we don’t really have one.  The most widely used definition of information, negative Shannon entropy, is completely wrong for the job.  Systems with high entropy and low information, like a hot gas, have no complexity because they have no structure.  Systems with low entropy and near-perfect information, like a flawless crystal near absolute zero, have no complexity because they have no variety.  Complex systems exist in between these two extremes, and their complexity must be measured along a different axis.  But what axis?
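
To make the two dead extremes concrete (my own toy illustration, not anything from the book), here is a tiny Shannon-entropy calculation: a uniform distribution over microstates stands in for the hot gas, a single certain microstate for the perfect crystal.  The numbers land at opposite ends of the entropy scale, yet neither says anything about the structured middle where biological complexity lives.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum(p_i * log2 p_i), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# "Hot gas" stand-in: all eight microstates equally likely -- maximum entropy, no structure.
uniform = [1 / 8] * 8
# "Perfect crystal" stand-in: one microstate certain -- minimum entropy, no variety.
crystal = [1.0] + [0.0] * 7

print(shannon_entropy(uniform))  # 3.0 bits
print(shannon_entropy(crystal))  # 0.0 bits
# Entropy cleanly separates the two extremes, but a genome and a shuffled copy
# of the same genome would score alike on symbol frequencies -- wrong axis.
```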

With this setup, and given the book’s title, I was expecting Kluger to attempt to answer this question.  But instead, chapter after chapter, he just gives us a bunch of vaguely related anecdotes about systems he considers complex and why they are interesting.  Nowhere in the remainder of the book does he step back and try to draw any conclusions or general principles from these many examples.  Nearly 300 more pages go by with no reflection, no analysis, and no attempt to achieve any deeper level of understanding.  And then the book ends, with no summary or conclusions.

I fail to see the point.  If you want a bunch of informal descriptions of possibly complex systems, this book might be adequate. But if you’re looking for a deep understanding of complexity, check this book out of a library, read the first chapter, and then take it back.  What you seek is elsewhere.

Howard Bloom, The God Problem: How A Godless Cosmos Creates

I think it was Paul Halmos who said that, for a book to be about something, it must NOT be about a great many other things.  Howard Bloom seems unable to follow this prescription.  The God Problem contains many interesting facts, most of them completely unrelated to its main quest of explaining how a universe can create complexity.  Amid Bloom’s chatty, verbose, and peripatetic peregrinations, the very first sentence that sheds any light on the book’s nominal topic occurs on page 431.

Bloom’s choice to write the entire book in the second person as a substitute for the first person (“you are Jewish”) is odd: confusing at best, irritating at worst.  The book is also repetitive.  Like the movie The Hobbit, it could have been made half as long and twice as good with some decent editing.

What makes this especially frustrating is that Bloom does, occasionally, as if by accident, touch on some important and difficult topics, like the difference between “information” and “meaning”.  But he seems unable to stay focused on them long enough to make much progress.  He also seems unaware of much previous work.

I suppose I should mention that the book contains 4 false statements.  I leave it to the interested reader to find the 4.

Using the Mandelbrot set as a model for how complexity arises has some merits, but also some drawbacks.  Yes, it’s a simple iterated rule that creates immense amounts of detail.  But it doesn’t create any meaning, or even information: the Kolmogorov complexity of the whole thing is no greater than that of the equation generating it.  If you want to explain the complexity of, say, a eukaryotic genome, you have to look elsewhere.
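
To underline how small the generator really is (my sketch, not Bloom’s), here is a bare-bones membership test plus a crude ASCII rendering.  All of the set’s visual riches fall out of the single iterated rule in the loop, which is exactly why its Kolmogorov complexity is bounded by a program about this short.

```python
# Minimal Mandelbrot membership test: the whole intricate set is generated by
# this handful of lines, so its Kolmogorov complexity is bounded by the length
# of a short program like this one.
def in_mandelbrot(c, max_iter=100):
    z = 0j
    for _ in range(max_iter):
        z = z * z + c          # the single iterated rule: z -> z^2 + c
        if abs(z) > 2:         # escaped, so c is outside the set
            return False
    return True                # never escaped within max_iter: treat as inside

# Crude ASCII rendering over a coarse grid of sample points.
for im in [y / 10 for y in range(10, -11, -1)]:
    row = ""
    for re in [x / 20 for x in range(-40, 11)]:
        row += "#" if in_mandelbrot(complex(re, im)) else "."
    print(row)
```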

The kind of complexity we are interested in requires both nonlinearity (gain, chaos, solitons) and entropy creation (non-equilibrium thermodynamics, metabolism).  But Bloom is prevented from understanding any of this by his insistence that the law of entropy is simply wrong.  Life and evolution climb upstream against a constant flow of degradation; how they manage to do that is one of the key components of the answer Bloom purports to seek, but refuses to see.
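
For the nonlinearity half of that claim, the stock illustration (mine, not Bloom’s) is the logistic map: a one-line nonlinear rule under which two all-but-identical starting points diverge within a few dozen steps.  The other ingredient, entropy creation, needs an open, driven system and is not captured by anything this simple.

```python
# Logistic map x_{n+1} = r * x_n * (1 - x_n): a one-line nonlinear rule.
# At r = 3.9 it is chaotic, so nearby starting points separate rapidly
# (sensitive dependence on initial conditions).
def logistic_orbit(x0, r=3.9, steps=30):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_orbit(0.500000)
b = logistic_orbit(0.500001)   # perturbed by one part in a million
for n in (0, 10, 20, 30):
    print(n, round(a[n], 6), round(b[n], 6), round(abs(a[n] - b[n]), 6))
```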

Shannon entropy is not the best measure for attacking this problem; this has been well known for some time.  The state of maximum entropy, total randomness, is dead because it has no structure.  The state of minimum entropy, a perfect crystal close to absolute zero, is dead because it has no variety.  Life, and all complexity generation, has to exist in between order and chaos.  Bloom spends so much time flogging Shannon’s dead horse that he is unable to say much about what alternative he prefers.  He seems unaware of Fisher Information, and makes little or no use of Kolmogorov complexity.  We could use a workable theory of meaning.  Bloom is probably right that any such theory has to be receiver-dependent, but he fails to actually propose one.  This makes his contribution eerily parallel to, and about as useless as, the creationist information theory of Dr. Werner Gitt (In the Beginning was Information).
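
Since Fisher Information appears here only as a name, a toy example may help (again my own, not anything Bloom offers): for a coin with bias p it works out to 1 / (p(1 - p)), and it measures how much each observation tells you about the parameter, a genuinely different axis from the Shannon entropy of the outcomes themselves.

```python
import random

# Fisher information of a Bernoulli(p) coin: I(p) = 1 / (p * (1 - p)).
# It quantifies how sharply the log-likelihood curves around the true p,
# i.e. how informative each flip is about the bias.
def fisher_info_bernoulli(p):
    return 1.0 / (p * (1.0 - p))

# Monte Carlo check: Fisher information is the expected squared score,
# where the score is d/dp of the log-likelihood of a single flip.
def fisher_info_mc(p, n=200_000, seed=0):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = 1 if rng.random() < p else 0
        score = x / p - (1 - x) / (1 - p)
        total += score * score
    return total / n

for p in (0.1, 0.5, 0.9):
    print(p, fisher_info_bernoulli(p), round(fisher_info_mc(p), 2))
```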

This book bills itself as a rocket to new heights of understanding, but in the end it feels more like a bunch of firecrackers going off on the ground: lots of little pyrotechnics, but no real progress.