A brilliant and vastly irritating book. Brilliant because it covers
a wide range of fascinating subjects, with many intriguing new
insights. Irritating because it never seems to go into sufficient
depth about any of them.
The book covers two main topics, and tries to integrate them. The
section on quantum mechanics and fundamental particles -- Gell-Mann's
original background -- has some fascinating stuff on the modern "many
alternative histories" interpretation of QM, coarse-graining and
entropy, and the link to the quasi-classical domain. All gripping
stuff, and worth a whole (popular) book of their own. Then the section
on Complex Adaptive Systems -- based on Gell-Mann's more recent work
at the Santa Fe Institute -- has lots of equally fascinating insights
into evolution, information, complexity,
chaos, self-organisation, adaptation, and so on. But again, tons of
interesting ideas are each compressed into a sentence or two. The text
is split, under headings, into half-page or one-page fragments, each
of which could usefully be expanded almost to chapter length in order
to discuss fully the points Gell-Mann mentions in passing.
The Afterword, announced as a kind of "executive summary"
(so why wasn't it at the beginning?), further compresses the entire
book into nine breathlessly dense pages. However, I felt that the
entire book acted as an executive summary of the book(s) I really want
to read. Gell-Mann admits that the book "reaches into a large
number of areas it cannot explore thoroughly or in depth" --
indeed, many of those areas are still subjects of active research --
and that its purpose is "to stimulate thought and discussion"
-- it certainly does that, but a Further Reading section would have
been a welcome addition.
in general ...
the behaviour of highly complex nonlinear systems may exhibit
simplicity, but simplicity that is typically emergent and not obvious
at the outset
Contents include:
- Kinds of complexity
- computational complexity -- how the time taken for a
computer program to run varies with the size of the problem
- algorithmic information content -- (Greg
Chaitin) how much a bit stream can be compressed -- the
minimum length of a description of the string (so at a maximum
when the string is its own minimum description, that is,
incompressible, or random) -- see the compression sketch after
this list
- effective complexity -- (a system has a regular and a
random component) the length of a concise description of the
regularities -- hence completely regular and completely
random systems both have very small effective complexity
- depth -- (Charles Bennett) how laborious it is to go
from the highly compressed description to a fully-unwound
description of a system (which is why we tend to use less-deep,
partially-unwound, more efficient descriptions -- we don't
always go back to "first principles")
- crypticity -- (Charles Bennett) how laborious it is
to go from a fully-unwound description to a highly compressed
description of a system
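These distinctions are easier to see in a toy experiment. Here is a
minimal Python sketch (my own illustration, not from the book) using
zlib compression as a crude, computable stand-in for algorithmic
information content; the true measure is uncomputable, and the strings
and parameters here are arbitrary choices:

    import os
    import zlib

    def compressed_size(data):
        """Length of the zlib-compressed data: a rough proxy for
        algorithmic information content."""
        return len(zlib.compress(data, 9))

    regular = b"01" * 500                    # completely regular
    noise = os.urandom(1000)                 # effectively incompressible
    mixed = b"01" * 250 + os.urandom(500)    # regularity plus randomness

    for name, data in [("regular", regular), ("random", noise),
                       ("mixed", mixed)]:
        print(name, len(data), "->", compressed_size(data))

    # The regular string compresses to almost nothing; the random one
    # hardly at all, so it has near-maximal algorithmic information
    # content. Effective complexity counts only the description of the
    # regularities, so the regular and the random strings both score
    # very small, while the mixed string scores somewhere in between.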
- Kinds of randomness
- incompressible: a random bit string is incompressible
-- it has maximum algorithmic information content
- stochastic: produced by a chance process, such as
tossing a fair coin -- most stochastic streams will be random
but occasionally, by chance, some will not (a chance long run of
heads, for example)
- pseudo-random: produced by a deterministic
computational process, but so complicated as to be effectively
unpredictable, appearing to simulate a stochastic process --
see the sketch after this list
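The pseudo-random/stochastic distinction can be sketched the same way
(again my own illustration): a seeded generator is a deterministic
computational process, so its stream is exactly reproducible, while
os.urandom -- standing in here for a stochastic source, though strictly
it is an entropy-seeded generator -- is not; both streams look
incompressible:

    import os
    import random
    import zlib

    def pseudo_stream(seed, n):
        rng = random.Random(seed)    # deterministic computational process
        return bytes(rng.getrandbits(8) for _ in range(n))

    a = pseudo_stream(42, 1000)
    b = pseudo_stream(42, 1000)
    print(a == b)                    # True: same seed, same "random" stream
    print(os.urandom(1000) == os.urandom(1000))   # almost surely False
    print(len(zlib.compress(a, 9)))  # ~1000: incompressible, looks random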
- A more fundamental science has less "special
information" in it. Chemistry is less fundamental than physics,
because it applies only at temperatures where atoms exist, and you
have to ask "chemistry-type" questions of it. Biology is a
lot less fundamental than chemistry, because it has all the special
information of contingent evolution on this planet in it. Less
fundamental sciences are more complex sciences; many of their
regularities arise from the special information, as well as from
fundamental laws. Reductionism focuses on the fundamental laws, not
the special information.
- quantum mechanics
- bosons (photons, gravitons, etc) "condense" and
build up high densities, so can behave almost like classical
fields (electromagnetism, gravity, respectively) -- fermions
(electrons, etc) "exclude"; they can be described in
terms of fields, but such fields never behave classically
- modern view is based on Hugh Everett's "many worlds"
interpretation -- although better called "many alternative
histories" -- not on "observers" and "collapsing
wavefunctions"
- coarse-grained histories and decoherence; correlation
with the quasi-classical domain -- fine-grained histories and
interference, as in the two-slit experiment -- see the toy
calculation after these quantum-mechanics notes
- "quark" is pronounced "kwork" --
the line Three quarks for Muster Mark from Finnegans
Wake, that suggests the pronunciation "kwark", was
discovered later.
- pruning the history tree of probabilities when something is
correlated with the quasi-classical domain, in other words "observed"
to be a fact, is like pruning a probability tree when a coin is
"observed" to be heads or tails -- nothing to do with
collapsing wavefunctions
- Schrodinger's red herring -- the quantum event is amplified
and correlated with the quasi-classical domain, that is, it
becomes a fact -- so the cat really is either
alive or dead -- there is no quantum interference of two cat
states
- individuality perceived when each member of a set
needs more bits to describe it than are needed to enumerate the
set -- 6 billion people are individuals, each needing more than
32 bits to describe them (about 33 bits suffice to enumerate
6 billion, since 2^33 is about 8.6 billion) -- 100 billion stars
in the galaxy are not all that individual, to us, at the current
level of detail of our observation
- maximal quasi-classical domain -- the most detailed
description of the universe possible without quantum
interference -- such a description need not be unique -- if one
particular description gives rise to regularities, a complex
adaptive system can arise to "compress" those
regularities -- a CAS that arises to compress the regularities
of a different quasi-classical domain description might not be
able to perceive the regularities of the first: "goblin
worlds"
- time and entropy -- time's arrow from the very
special initial condition of the universe -- algorithmic information
content also contributes to entropy
- gateway event: an increase in complexity leads to the
possibility of further huge increases in kinds and levels of
complexity -- eukaryotes -- multi-celled organisms -- a new
technology can be an economic gateway event
- CASs recognise regularities
- superstition: recognising a regularity that is not
there -- denial: refusing to recognise one that is there
- art can recognise patterns
not of interest to science, such as metaphors
- how can we achieve the spiritual satisfaction, comfort,
social cohesion, great art, that accompany mythical beliefs,
without having to believe the myths? -- how can
we achieve cultural diversity without parochialism and
ethnic conflict?
- there is no such thing as the paranormal: if something weird
really does occur, the relevant scientific laws will get
changed or updated, as they were for continental drift, for
meteorites, ...
- reasons for maladaptation -- other selection pressures,
frozen accidents and windows of maturation, differing timescales
- computer simulation of CASs
- biologically inspired neural nets, genetic algorithms (a
minimal genetic-algorithm sketch follows these notes) -- what
is the computer equivalent of the CAS human immune system?
- emergence of complex behaviour from simple rules: the rules
imply general regularities, plus individual special regularities
-- Thomas Ray's TIERRA: complex
evolutionary behaviour from the beginning -- more realistic
economics: bounded rationality, learning, inclusion of
hard-to-quantify values
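As a hint of what these biologically inspired simulations look like,
here is a minimal genetic algorithm in Python (my sketch; the target
string, population size and rates are arbitrary): a population of bit
strings evolves, by selection, crossover and mutation, toward a target
pattern, the "regularity" being recognised:

    import random

    TARGET = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0, 0, 1, 0, 1]

    def fitness(genome):
        """Number of positions matching the target regularity."""
        return sum(g == t for g, t in zip(genome, TARGET))

    def mutate(genome, rate=0.05):
        return [1 - g if random.random() < rate else g for g in genome]

    def crossover(a, b):
        cut = random.randrange(1, len(a))
        return a[:cut] + b[cut:]

    population = [[random.randint(0, 1) for _ in TARGET]
                  for _ in range(30)]
    for generation in range(100):
        population.sort(key=fitness, reverse=True)
        if fitness(population[0]) == len(TARGET):
            break                            # regularity fully recognised
        parents = population[:10]            # selection pressure
        population = parents + [
            mutate(crossover(random.choice(parents),
                             random.choice(parents)))
            for _ in range(20)
        ]
    print(generation, population[0])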
- physics: 1st law of thermodynamics, conservation of energy;
economics: keeping track of money -- physics: 2nd law, increase
of entropy; what is the economic analogue of irreversibility?
- Synthesis, integration and review are skills as important as
finding new knowledge, but are not equally rewarded