Books : reviews

George Dyson.
Darwin among the Machines.
Penguin. 1997

rating : 3.5 : worth reading
review : 27 February 2003

This is a leisurely, meandering, insightful and deeply fascinating exploration of the relationship between machines and minds, from an historical perspective. It covers a lot of ground: the key themes are information, evolution, communications, emergence, and the growing complexity of machines leading to them having (probably incomprehensible) minds of their own. It is all shot through with a deep love of machines.

Something about abandoned machines---the suspension of life without immediate decay---evokes a mix of fear and hope. When the machine stops, we face whatever it is that separates death from life.

Each chapter is titled after a particular work about machines, life, and the mind. This is used as the focus (or sometimes merely the excuse) for a discussion of the theories and thoughts, often in their own words, of a wide range of thinkers. We get all the usual suspects --- Gottfried Leibniz, Charles Babbage, Charles Darwin, Kurt Gödel, Alan Turing, John von Neumann --- but also some rather less usual ones --- Thomas Hobbes, Robert Hooke, Erasmus Darwin, André-Marie Ampère, Samuel Butler, Nils Barricelli, W. Ross Ashby, Olaf Stapledon --- and many, many others. Dyson also weaves in anecdotes of his own childhood, growing up surrounded by famous scientists and their computers.

This work weaves a tapestry of interrelated concepts and ideas, rather than making a single sharp point or giving a single crisp answer. The main theme is the possibility of machine intelligence, and the suggestion that, given the current complexity of our machines, we may be on the cusp of it (or may already have achieved it).

As we develop digital models of all things great and small, our models are faced with the puzzle of modeling themselves. As far as we know, this is how consciousness evolves.

This main theme is augmented by other discussions, such as the currently fashionable ideas of complexity and emergence (although Dyson puts these in their historical context, showing that they are not such modern subjects as we might be led to believe).

Von Neumann believed that a complex network formed its own simplest behavioral description; to attempt to describe its behavior using formal logic might be an intractable problem, no matter how much computational horsepower was available for the job.

Emergent behavior is that which cannot be predicted through analysis at any level simpler than that of the system as a whole. Explanations of emergence, like simplifications of complexity, are inherently illusory and can only be achieved by sleight of hand. This does not mean that emergence is not real. Emergent behavior, by definition, is what's left after everything else has been explained.

Along the way, there are many intriguing diversions that at first seem irrelevant, and are then woven skillfully into the growing whole. One fascinating apparent sideline is a discussion of the complexity and creativity of economic systems, in terms of Gödel's theorem.

Money is a recursive function, defined, layer upon layer, in terms of itself. The era when you could peel away the layers to reveal a basis in precious metals ended long ago. There's nothing wrong with recursive definitions. (... Gregory Bateson's definition of information as "any difference that makes a difference"---the point being that information and meaning are self-referential, not absolute.) But formal systems based on recursive functions ... have certain peculiar properties. Gödel's incompleteness theorems have analogies in the financial universe, where liquidity and value are subject to varying degrees of definability, provability, and truth. Within a given financial system ... it is possible to construct financial instruments whose value can be defined and trusted but cannot be proved without assuming new axioms that extend the system's reach. ... No financial system can ever be completely secure and closed. On the other hand, like mathematics or any other sufficiently powerful system of languages, there is no limit to the level of concepts that an economy is able to comprehend.
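
The financial analogy leans on Gödel's first incompleteness theorem, which is worth stating compactly; this is my paraphrase of the standard result, not a quotation from the book:

\[
T \ \text{consistent, effectively axiomatised, strong enough for arithmetic}
\;\Longrightarrow\;
\exists\, G_T:\ T \nvdash G_T \ \text{ and } \ T \nvdash \lnot G_T
\]
% Any such theory T leaves some sentence G_T undecided. Adding G_T (or its
% negation) as a new axiom yields a strictly stronger theory, which in turn
% has its own undecided sentences: the open-ended layering that Dyson maps
% onto financial instruments whose value outruns what the current "axioms"
% can prove.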

That this is only an apparent sideline becomes clear when the importance of economics for the study of complex systems in general is described.

economic principles are the only known way to evolve intelligent systems from primitive components that are not intelligent themselves.

In the universe according to von Neumann, life and nature are playing a zero-sum game. Physics is the rules. Economics ... is the study of how organisms and organizations develop strategies that increase their chances for reward. ... the formation of coalitions holds the key... These coalitions are forged on many levels---between molecules, between cells, between groups of neurons, between individual organisms, between languages, and between ideas. The badge of success is worn most visibly by the members of a species, who constitute an enduring coalition over distance and over time. Species may in turn form coalitions, and, perhaps, biology may form coalitions with geological and atmospheric processes otherwise viewed as being on the side of nature, not on the side of life.

There are many such diversions, forming interesting snippets themselves, but eventually adding to the overall picture.

It is surprising that noncomputable functions, which outnumber computable ones, are so hard to find. It is not just that noncomputable functions are difficult to recognize or awkward to define. We either inhabit a largely computable world or have gravitated toward a computable frame of mind. The big questions---Is human intelligence a computable function? Are there algorithms for life?---may never be answered. But computable functions appear to be doing most of the work.
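
The opening claim, that noncomputable functions outnumber computable ones, rests on a standard counting argument the book does not spell out; a quick sketch (my addition, not Dyson's):

\[
\#\{\, f:\mathbb{N}\to\{0,1\} \mid f \ \text{computable} \,\} = \aleph_0
\qquad
\#\{\, f:\mathbb{N}\to\{0,1\} \,\} = 2^{\aleph_0}
\]
% Programs are finite strings over a finite alphabet, so there are only
% countably many of them, and hence countably many computable functions.
% Cantor's diagonal argument shows the full function space is uncountable:
% given any list f_1, f_2, ..., the function g(n) = 1 - f_n(n) appears
% nowhere on it. So almost all functions are noncomputable, even though
% explicit examples (such as the halting problem) take some work to exhibit.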

The link between Troy and Mycenae was a one-way, one-time, and one-bit channel, encoded as follows: no signal meant Troy belonged to the Trojans; a visible signal meant Troy belonged to the Greeks. Communications engineers have been improving the bandwidth ever since. Suffering a fate that still afflicts brief messages after three thousand years, Clytaemnestra's message acquired a header---a cumulative listing of gateways that handled the message along the way---longer than the message she received.

Ashby's law of requisite variety demands this level of detail in a system that can learn to control ... rather than simply identify ... The tendency of representative models ... is to translate increasingly detailed knowledge into decision making and control. ... The smallest transactions count. Centrally planned economies, violating Ashby's law of requisite variety, have failed not by way of ideology but by lack of attention to detail.
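
Ashby's law of requisite variety, invoked here, has a compact quantitative form; the following is my gloss of the standard entropy statement, not a quotation from the book:

\[
H(O) \;\ge\; H(D) - H(R)
\]
% D is the variety (entropy) of disturbances hitting the system, R the
% variety of responses available to the regulator, and O the residual
% variety in the outcomes. Only variety in R can absorb variety in D: a
% controller or planner with too few distinct responses cannot hold
% outcomes steady, which is the sense in which a centrally planned economy
% fails through "lack of attention to detail".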

The main theme, of machine intelligence, drives the whole argument. Thinking about the concept of machine intelligence leads to thinking about many different levels of intelligence. Each level of organisation has its own kind of intelligence, incomprehensible to the components that make up that level.

As information is distributed, it tends to be represented (encoded) by increasingly economical (meaningful) forms. This evolutionary process, whereby the most economical or meaningful representation wins, leads to a hierarchy of languages, encoding meaning on levels that transcend comprehension by the system's individual components---whether genes, insects, microprocessors, or human minds.

Also, even at a given level, there may be different kinds of intelligence. At our own level, there may well be completely alien intelligences (and machine intelligences), incomprehensible to us, or even unnoticed by us.

[Samuel] Butler knew ... that our definition of intelligence is so anthropocentric as to be next to useless for anything else. "Nothing, we say to ourselves, can have intelligence unless we understand all about it---as though intelligence in all except ourselves meant the power of being understood rather than of understanding," he wrote. "We are intelligent, and no intelligence so different from our own as to baffle our powers of comprehension deserves to be called intelligence at all. The more a thing resembles ourselves, the more it thinks as we do---and thus by implication tells us that we are right, the more intelligent we think it; and the less it thinks as we do, the greater fool it must be; if a substance does not succeed in making it clear that it understands our business, we conclude that it cannot have any business of its own." [quote from Butler's Luck, or Cunning?, 1887]

Machine intelligences in particular might be very different, because their hardware runs so much faster, because they can be so much larger, and because they can live so much longer. They potentially form a different kind of life from biological life.

Yes, there is plenty of room at the bottom---but nature got there first. Life began at the bottom. Microorganisms have had time to settle in; most available ecological niches have long been filled. Many steps higher on the scale, insects have been exploring millimeter-scale engineering and socially distributed intelligence for so long that it would take a concerted effort to catch up. ... Things are cheaper and faster at the bottom, but it is much less crowded at the top. The size of living organisms has been limited by gravity, chemistry, and the inability to keep anything much larger than a dinosaur under central-nervous-system control. ... Large systems, in biology as in bureaucracy, are relatively slow. ... Life now faces opportunities of unprecedented scale. Microprocessors divide time into imperceptibly fine increments, releasing signals that span distance at the speed of light. Systems communicate globally and endure indefinitely over time. Large, long-lived, yet very fast composite organisms are free from the constraints that have limited biology in the past.

Non-biological life and intelligence can take advantage not only of fast non-biological substrates for their existence, but also of fast non-biological evolutionary mechanisms for their development.

Under the neo-Darwinian regime---not so much a consequence of the origins of life as a consequence of the origins of death---replicators will, in the long run, win. But there is no law against changing the rules. Intelligence and technology are bringing Lamarckian mechanisms into play, with results that may leave the slow pace of Darwinian trial and error behind.

But non-biological complexity could also arise in ways analogous to biological development.

organization is arrived at as much by chance as by design. Most of the connections [of the Web] make no sense, and few make any money except in circuitous ways. Critics say that the World Wide Web is a passing stage (right) and doomed to failure because it is so pervaded by junk (wrong). Yes, most links will be relegated to oblivion, but this wasteful process, like many of nature's profligacies, will leave behind a structure that could not otherwise have taken form.

Dyson concludes with the contention that, just as we are achieving mastery over the natural world, we are building an artificial world of even greater, unbounded, complexity and wildness. As with many such discussions and explorations, it is the entire journey, not the particular conclusion, that is of most interest.

George Dyson.
Project Orion: the atomic spaceship 1957-1965.
Penguin. 2002

George Dyson.
Turing's Cathedral: the origins of the digital universe.
Penguin. 2012

George Dyson.
Analogia: the entangled destinies of nature, human beings and machines.
Penguin. 2020