CHAPTER ONE
The Age of Spiritual Machines
When Computers Exceed Human Intelligence
By RAY KURZWEIL
Viking
The Law Of Time And Chaos
A (Very Brief) History of the Universe:
Time Slowing Down
The universe is made of stories, not of atoms.
-- Muriel Rukeyser
Is the universe a great mechanism, a great computation, a great symmetry, a great accident or a great thought?
-- John D. Barrow
As we start at the beginning, we will notice an unusual attribute of the nature of time, one that is critical to our passage to the twenty-first century. Our story begins perhaps 15 billion years ago. No conscious life existed to appreciate the birth of our Universe at the time, but we appreciate it now, so retroactively it did happen. (In retrospect -- from one perspective of quantum mechanics -- we could say that any Universe that fails to evolve conscious life to apprehend its existence never existed in the first place.)
It was not until 10⁻⁴³ seconds (a tenth of a millionth of a trillionth of a trillionth of a trillionth of a second) after the birth of the Universe that the situation had cooled off sufficiently (to 100 million trillion trillion degrees) that a distinct force -- gravity -- evolved.
Not much happened for another 10⁻³⁴ seconds (this is also a very tiny fraction of a second, but it is a billion times longer than 10⁻⁴³ seconds), at which point an even cooler Universe (now only a billion billion billion degrees) allowed the emergence of matter in the form of electrons and quarks. To keep things balanced, antimatter appeared as well. It was an eventful time, as new forces evolved at a rapid rate. We were now up to three: gravity, the strong force, and the electroweak force. After another 10⁻¹⁰ seconds (a tenth of a billionth of a second), the electroweak force split into the electromagnetic and weak forces we know so well today.
Things got complicated after another 10⁻⁵ seconds (ten millionths of a second). With the temperature now down to a relatively balmy trillion degrees, the quarks came together to form protons and neutrons. The antiquarks did the same, forming antiprotons.
Somehow, the matter particles achieved a slight edge. How this happened is not entirely clear. Up until then, everything had seemed so, well, even. But had everything stayed evenly balanced, it would have been a rather boring Universe. For one thing, life never would have evolved, and thus we could conclude that the Universe would never have existed in the first place.
For every 10 billion antiprotons, the Universe contained 10 billion and 1 protons. The protons and antiprotons collided, causing the emergence of another important phenomenon: light (photons). Thus, almost all of the antimatter was destroyed, leaving matter as dominant. (This shows you the danger of allowing a competitor to achieve even a slight advantage.)
Of course, had antimatter won, its descendants would have called it matter and would have called matter antimatter, so we would be back where we started (perhaps that is what happened).
After another second (a second is a very long time compared to some of the earlier chapters in the Universe's history, so notice how the time frames are growing exponentially larger), the electrons and antielectrons (called positrons) followed the lead of the protons and antiprotons and similarly annihilated each other, leaving mostly the electrons.
After another minute, the neutrons and protons began coalescing into heavier nuclei, such as helium, lithium, and heavy forms of hydrogen. The temperature was now only a billion degrees.
About 300,000 years later (things are slowing down now rather quickly), with the average temperature now only 3,000 degrees, the first atoms were created as the nuclei took control of nearby electrons.
After a billion years, these atoms formed large clouds that gradually swirled into galaxies.
After another two billion years, the matter within the galaxies coalesced further into distinct stars, many with their own solar systems.
Three billion years later, circling an unexceptional star on the arm of a common galaxy, an unremarkable planet we call the Earth was born.
Now before we go any further, let's notice a striking feature of the passage of time. Events moved quickly at the beginning of the Universe's history. We had three paradigm shifts in just the first billionth of a second. Later on, events of cosmological significance took billions of years. The nature of time is that it inherently moves in an exponential fashion -- either geometrically gaining in speed, or, as in the history of our Universe, geometrically slowing down. Time only seems to be linear during those eons in which not much happens. Thus most of the time, the linear passage of time is a reasonable approximation of its passage. But that's not the inherent nature of time.
Why is this significant? It's not when you're stuck in the eons in which not much happens. But it is of great significance when you find yourself in the "knee of the curve," those periods in which the exponential nature of the curve of time explodes either inwardly or outwardly. It's like falling into a black hole (in that case, time accelerates exponentially faster as one falls in).
The Speed of Time
But wait a second, how can we say that time is changing its "speed"? We can talk about the rate of a process, in terms of its progress per second, but can we say that time is changing its rate? Can time start moving at, say, two seconds per second?
Einstein said exactly this -- time is relative to the entities experiencing it. One man's second can be another woman's forty years. Einstein gives the example of a man who travels at very close to the speed of light to a star -- say, twenty light-years away. From our Earth-bound perspective, the trip takes slightly more than twenty years in each direction. When the man gets back, his wife has aged forty years. For him, however, the trip was rather brief. If he travels at close enough to the speed of light, it may have only taken a second or less (from a practical perspective we would have to consider some limitations, such as the time to accelerate and decelerate without crushing his body). Whose time frame is the correct one? Einstein says they are both correct, and exist only relative to each other.
Certain species of birds have a life span of only several years. If you observe their rapid movements, it appears that they are experiencing the passage of time on a different scale. We experience this in our own lives. A young child's rate of change and experience of time is different from that of an adult. Of particular note, we will see that the acceleration in the passage of time for evolution is moving in a different direction than that for the Universe from which it emerges.
It is in the nature of exponential growth that events develop extremely slowly for extremely long periods of time, but as one glides through the knee of the curve, events erupt at an increasingly furious pace. And that is what we will experience as we enter the twenty-first century.
EVOLUTION: TIME SPEEDING UP
In the beginning was the word. . . . And the word became flesh.
-- John 1:1,14
A great deal of the universe does not need any explanation. Elephants, for instance. Once molecules have learnt to compete and create other molecules in their own image, elephants, and things resembling elephants, will in due course be found roaming through the countryside.
-- Peter Atkins
The further backward you look, the further forward you can see.
-- Winston Churchill
We'll come back to the knee of the curve, but let's delve further into the exponential nature of time. In the nineteenth century, a set of unifying principles called the laws of thermodynamics was postulated. As the name implies, they deal with the dynamic nature of heat and were the first major refinement of the laws of classical mechanics perfected by Isaac Newton a century earlier. Whereas Newton had described a world of clockwork perfection in which particles and objects of all sizes followed highly disciplined, predictable patterns, the laws of thermodynamics describe a world of chaos. Indeed, that is what heat is.
Heat is the chaotic -- unpredictable -- movement of the particles that make up the world. A corollary of the second law of thermodynamics is that in a closed system (interacting entities and forces not subject to outside influence; for example, the Universe), disorder (called "entropy") increases. Thus, left to its own devices, a system such as the world we live in becomes increasingly chaotic. Many people find this describes their lives rather well. But in the nineteenth century, the laws of thermodynamics were considered a disturbing discovery. At the beginning of that century, it appeared that the basic principles governing the world were both understood and orderly. There were a few details left to be filled in, but the basic picture was under control. Thermodynamics was the first contradiction to this complacent picture. It would not be the last.
The second law of thermodynamics, sometimes called the Law of Increasing Entropy, would seem to imply that the natural emergence of intelligence is impossible. Intelligent behavior is the opposite of random behavior, and any system capable of intelligent responses to its environment needs to be highly ordered. The chemistry of life, particularly of intelligent life, is comprised of exceptionally intricate designs. Out of the increasingly chaotic swirl of particles and energy in the world, extraordinary designs somehow emerged. How do we reconcile the emergence of intelligent life with the Law of Increasing Entropy?
There are two answers here. First, while the Law of Increasing Entropy would appear to contradict the thrust of evolution, which is toward increasingly elaborate order, the two phenomena are not inherently contradictory. The order of life takes place amid great chaos, and the existence of life-forms does not appreciably affect the measure of entropy in the larger system in which life has evolved. An organism is not a closed system. It is part of a larger system we call the environment, which remains high in entropy. In other words, the order represented by the existence of life-forms is insignificant in terms of measuring overall entropy.
Thus, while chaos increases in the Universe, it is possible for evolutionary processes that create increasingly intricate, ordered patterns to exist simultaneously. Evolution is a process, but it is not a closed system. It is subject to outside influence, and indeed draws upon the chaos in which it is embedded. So the Law of Increasing Entropy does not rule out the emergence of life and intelligence.
For the second answer, we need to take a closer look at evolution, as it was the original creator of intelligence.
The Exponentially Quickening Pace of Evolution
As you will recall, after billions of years, the unremarkable planet called Earth was formed. Churned by the energy of the sun, the elements formed more and more complex molecules. From physics, chemistry was born.
Two billion years later, life began. That is to say, patterns of matter and energy that could perpetuate themselves and survive perpetuated themselves and survived. That this apparent tautology went unnoticed until a couple of centuries ago is itself remarkable.
Over time, the patterns became more complicated than mere chains of molecules. Structures of molecules performing distinct functions organized themselves into little societies of molecules. From chemistry, biology was born.
Thus, about 3.4 billion years ago, the first earthly organisms emerged: anaerobic (not requiring oxygen) prokaryotes (single-celled creatures) with a rudimentary method for perpetuating their own designs. Early innovations that followed included a simple genetic system, the ability to swim, and photosynthesis, which set the stage for more advanced, oxygen-consuming organisms. The most important development for the next couple of billion years was the DNA-based genetics that would henceforth guide and record evolutionary development.
A key requirement for an evolutionary process is a "written" record of achievement, for otherwise the process would be doomed to repeat finding solutions to problems already solved. For the earliest organisms, the record was written (embodied) in their bodies, coded directly into the chemistry of their primitive cellular structures. With the invention of DNA-based genetics, evolution had designed a digital computer to record its handiwork. This design permitted more complex experiments. The aggregations of molecules called cells organized themselves into societies of cells with the appearance of the first multicellular plants and animals about 700 million years ago. For the next 130 million years, the basic body plans of modern animals were designed, including a spinal cord-based skeleton that provided early fish with an efficient swimming style.
So while evolution took billions of years to design the first primitive cells, salient events then began occurring in hundreds of millions of years, a distinct quickening of the pace. When some calamity finished off the dinosaurs 65 million years ago, mammals inherited the Earth (although the insects might disagree). With the emergence of the primates, progress was then measured in mere tens of millions of years. Humanoids emerged 15 million years ago, distinguished by walking on their hind legs, and now we're down to millions of years.
With larger brains, particularly in the area of the highly convoluted cortex responsible for rational thought, our own species, Homo sapiens, emerged perhaps 500,000 years ago. Homo sapiens are not very different from other advanced primates in terms of their genetic heritage. Their DNA is 98.6 percent the same as the lowland gorilla, and 97.8 percent the same as the orangutan. The story of evolution since that time now focuses in on a human-sponsored variant of evolution: technology.
TECHNOLOGY: EVOLUTION BY OTHER MEANS
When a scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is very probably wrong. The only way of discovering the limits of the possible is to venture a little way past them into the impossible. Any sufficiently advanced technology is indistinguishable from magic.
-- Arthur C. Clarke's three laws of technology
A machine is as distinctively and brilliantly and expressively human as a violin sonata or a theorem in Euclid.
-- Gregory Vlastos
Technology picks right up with the exponentially quickening pace of evolution. Although not the only tool-using animal, Homo sapiens are distinguished by their creation of technology. Technology goes beyond the mere fashioning and use of tools. It involves a record of tool making and a progression in the sophistication of tools. It requires invention and is itself a continuation of evolution by other means. The "genetic code" of the evolutionary process of technology is the record maintained by the tool-making species. Just as the genetic code of the early life-forms was simply the chemical composition of the organisms themselves, the written record of early tools consisted of the tools themselves. Later on, the "genes" of technological evolution evolved into records using written language and are now often stored in computer databases. Ultimately, the technology itself will create new technology. But we are getting ahead of ourselves.
Our story is now marked in tens of thousands of years. There were multiple subspecies of Homo sapiens. Homo sapiens neanderthalensis emerged about 100,000 years ago in Europe and the Middle East and then disappeared mysteriously about 35,000 to 40,000 years ago. Despite their brutish image, Neanderthals cultivated an involved culture that included elaborate funeral rituals -- burying their dead with ornaments, including flowers. We're not entirely sure what happened to our Homo sapiens cousins, but they apparently got into conflict with our own immediate ancestors Homo sapiens sapiens, who emerged about 90,000 years ago. Several species and subspecies of humanoids initiated the creation of technology. The most clever and aggressive of these subspecies was the only one to survive. This established a pattern that would repeat itself throughout human history, in that the technologically more advanced group ends up becoming dominant. This trend may not bode well as intelligent machines themselves surpass us in intelligence and technological sophistication in the twenty-first century.
Our Homo sapiens sapiens subspecies was thus left alone among humanoids about 40,000 years ago.
Our forebears had already inherited from earlier hominid species and subspecies such innovations as the recording of events on cave walls, pictorial art, music, dance, religion, advanced language, fire, and weapons. For tens of thousands of years, humans had created tools by sharpening one side of a stone. It took our species tens of thousands of years to figure out that by sharpening both sides, the resultant sharp edge provided a far more useful tool. One significant point, however, is that these innovations did occur, and they endured. No other tool-using animal on Earth has demonstrated the ability to create and retain innovations in their use of tools.
The other significant point is that technology, like the evolution of life-forms that spawned it, is inherently an accelerating process. The foundations of technology -- such as creating a sharp edge from a stone -- took eons to perfect, although for human-created technology, eons means thousands of years rather than the billions of years that the evolution of life-forms required to get started.
Like the evolution of life-forms, the pace of technology has greatly accelerated over time. The progress of technology in the nineteenth century, for example, greatly exceeded that of earlier centuries, with the building of canals and great ships, the advent of paved roads, the spread of the railroad, the development of the telegraph, and the invention of photography, the bicycle, sewing machine, typewriter, telephone, phonograph, motion picture, automobile, and of course Thomas Edison's light bulb. The continued exponential growth of technology in the first two decades of the twentieth century matched that of the entire nineteenth century. Today, we have major transformations in just a few years' time. As one of many examples, the latest revolution in communications -- the World Wide Web -- didn't exist just a few years ago.
WHAT IS TECHNOLOGY?
As technology is the continuation of evolution by other means, it shares the phenomenon of an exponentially quickening pace. The word is derived from the Greek tekhnē, which means "craft" or "art," and logia, which means "the study of." Thus one interpretation of technology is the study of crafting, in which crafting refers to the shaping of resources for a practical purpose. I use the term resources rather than materials because technology extends to the shaping of nonmaterial resources such as information.
Technology is often defined as the creation of tools to gain control over the environment. However, this definition is not entirely sufficient. Humans are not alone in their use or even creation of tools. Orangutans in Sumatra's Suaq Balimbing swamp make tools out of long sticks to break open termite nests. Crows fashion tools from sticks and leaves. The leaf-cutter ant mixes dry leaves with its saliva to create a paste. Crocodiles use tree roots to anchor dead prey.
What is uniquely human is the application of knowledge -- recorded knowledge -- to the fashioning of tools. The knowledge base represents the genetic code for the evolving technology. And as technology has evolved, the means for recording this knowledge base has also evolved, from the oral traditions of antiquity to the written design logs of nineteenth-century craftsmen to the computer-assisted design databases of the 1990s.
Technology also implies a transcendence of the materials used to comprise it. When the elements of an invention are assembled in just the right way, they produce an enchanting effect that goes beyond the mere parts. When Alexander Graham Bell accidentally wire-connected two moving drums and solenoids (metal cores wrapped in wire) in 1875, the result transcended the materials he was working with. For the first time, a human voice was transported, magically it seemed, to a remote location. Most assemblages are just that: random assemblies. But when materials -- and in the case of modern technology, information -- are assembled in just the right way, transcendence occurs. The assembled object becomes far greater than the sum of its parts.
The same phenomenon of transcendence occurs in art, which may properly be regarded as another form of human technology. When wood, varnishes, and strings are assembled in just the right way, the result is wondrous: a violin, a piano. When such a device is manipulated in just the right way, there is magic of another sort: music. Music goes beyond mere sound. It evokes a response -- cognitive, emotional, perhaps spiritual -- in the listener, another form of transcendence. All of the arts share the same goal: communicating from artist to audience. The communication is not of unadorned data, but of the more important items in the phenomenological garden: feelings, ideas, experiences, longings. The Greek meaning of tekhnē logia includes art as a key manifestation of technology.
Language is another form of human-created technology. One of the primary applications of technology is communication, and language provides the foundation for Homo sapiens communication. Communication is a critical survival skill. It enabled human families and tribes to develop cooperative strategies to overcome obstacles and adversaries. Other animals communicate. Monkeys and apes use elaborate gestures and grunts to communicate a variety of messages. Bees perform intricate dances in a figure-eight pattern to communicate where caches of nectar may be found. Female tree frogs in Malaysia do tap dances to signal their availability. Crabs wave their claws in one way to warn adversaries but use a different rhythm for courtship. But these methods do not appear to evolve, other than through the usual DNA-based evolution. These species lack a way to record their means of communication, so the methods remain static from one generation to the next. In contrast, human language does evolve, as do all forms of technology. Along with the evolving forms of language itself, technology has provided ever-improving means for recording and distributing human language.
Homo sapiens are unique in their use and fostering of all forms of what I regard as technology: art, language, and machines, all representing evolution by other means. In the 1960s through 1990s, several well-publicized primates were said to have mastered at least childlike language skills. Chimpanzees Lana and Kanzi pressed sequences of buttons with symbols on them. The chimpanzee Washoe and the gorilla Koko were said to be using American Sign Language. Many linguists are skeptical, noting that many primate "sentences" were jumbles, such as "Nim eat, Nim eat, drink eat me Nim, me gum me gum, tickle me, Nim play, you me banana me banana you." Even if we view this phenomenon more generously, it would be the exception that proves the rule. These primates did not evolve the languages they are credited with using, they do not appear to develop these skills spontaneously, and their use of these skills is very limited. They are at best participating peripherally in what is still a uniquely human invention -- communicating using the recursive (self-referencing), symbolic, evolving means called language.
The Inevitability of Technology
Once life takes hold on a planet, we can consider the emergence of technology as inevitable. The ability to expand the reach of one's physical capabilities, not to mention mental facilities, through technology is clearly useful for survival. Technology has enabled our subspecies to dominate its ecological niche. Technology requires two attributes of its creator: intelligence and the physical ability to manipulate the environment. We'll talk more in chapter 4, "A New Form of Intelligence on Earth," about the nature of intelligence, but it clearly represents an ability to use limited resources optimally, including time. This ability is inherently useful for survival, so it is favored. The ability to manipulate the environment is also useful; otherwise an organism is at the mercy of its environment for safety, food, and the satisfaction of its other needs. Sooner or later, an organism is bound to emerge with both attributes.
THE INEVITABILITY OF COMPUTATION
It is not a bad definition of man to describe him as a tool-making animal. His earliest contrivances to support uncivilized life were tools of the simplest and rudest construction. His latest achievements in the substitution of machinery, not merely for the skill of the human hand, but for the relief of the human intellect, are founded on the use of tools of a still higher order.
-- Charles Babbage
All of the fundamental processes we have examined -- the development of the Universe, the evolution of life-forms, the subsequent evolution of technology -- have all progressed in an exponential fashion, some slowing down, some speeding up. What is the common thread here? Why did cosmology exponentially slow down while evolution accelerated? The answers are surprising, and fundamental to understanding the twenty-first century.
But before I attempt to answer these questions, let's examine one other very relevant example of acceleration: the exponential growth of computation.
Early in the evolution of life-forms, specialized organs developed the ability to maintain internal states and respond differentially to external stimuli. The trend ever since has been toward more complex and capable nervous systems with the ability to store extensive memories; recognize patterns in visual, auditory, and tactile stimuli; and engage in increasingly sophisticated levels of reasoning. The ability to remember and to solve problems -- computation -- has constituted the cutting edge in the evolution of multicellular organisms.
The same value of computation holds true in the evolution of human-created technology. Products are more useful if they can maintain internal states and respond differentially to varying conditions and situations. As machines moved beyond mere implements to extend human reach and strength, they also began to accumulate the ability to remember and perform logical manipulations. The simple cams, gears, and levers of the Middle Ages were assembled into the elaborate automata of the European Renaissance. Mechanical calculators, which first emerged in the seventeenth century, became increasingly complex, culminating in the first automated U.S. census in 1890. Computers played a crucial role in at least one theater of the Second World War, and have developed in an accelerating spiral ever since.
THE LIFE CYCLE OF A TECHNOLOGY
Technologies fight for survival, evolve, and undergo their own characteristic life cycle. We can identify seven distinct stages. During the precursor stage, the prerequisites of a technology exist, and dreamers may contemplate these elements coming together. We do not, however, regard dreaming to be the same as inventing, even if the dreams are written down. Leonardo da Vinci drew convincing pictures of airplanes and automobiles, but he is not considered to have invented either.
The next stage, one highly celebrated in our culture, is invention, a very brief stage, not dissimilar in some respects to the process of birth after an extended period of labor. Here the inventor blends curiosity, scientific skills, determination, and usually a measure of showmanship to combine methods in a new way to bring a new technology to life.
The next stage is development, during which the invention is protected and supported by doting guardians (which may include the original inventor). Often this stage is more crucial than invention and may involve additional creation that can have greater significance than the original invention. Many tinkerers had constructed finely hand-tuned horseless carriages, but it was Henry Ford's innovation of mass production that enabled the automobile to take root and flourish.
The fourth stage is maturity. Although continuing to evolve, the technology now has a life of its own and has become an independent and established part of the community. It may become so interwoven in the fabric of life that it appears to many observers that it will last forever. This creates an interesting drama when the next stage arrives, which I call the stage of the pretenders. Here an upstart threatens to eclipse the older technology. Its enthusiasts prematurely predict victory. While providing some distinct benefits, the newer technology is found on reflection to be missing some key element of functionality or quality. When it indeed fails to dislodge the established order, the technology conservatives take this as evidence that the original approach will indeed live forever.
This is usually a short-lived victory for the aging technology. Shortly thereafter, another new technology typically does succeed in rendering the original technology into the stage of obsolescence. In this part of the life cycle, the technology lives out its senior years in gradual decline, its original purpose and functionality now subsumed by a more spry competitor. This stage, which may comprise 5 to 10 percent of the life cycle, finally yields to antiquity (examples today: the horse and buggy, the harpsichord, the manual typewriter, and the electromechanical calculator).
To illustrate this, consider the phonograph record. In the mid-nineteenth century, there were several precursors, including Édouard-Léon Scott de Martinville's phonautograph, a device that recorded sound vibrations as a printed pattern. It was Thomas Edison, however, who in 1877 brought all of the elements together and invented the first device that could record and reproduce sound. Further refinements were necessary for the phonograph to become commercially viable. It became a fully mature technology in 1948 when Columbia introduced the 33 revolutions-per-minute (rpm) long-playing record (LP) and RCA Victor introduced the 45-rpm small disc. The pretender was the cassette tape, introduced in the 1960s and popularized during the 1970s. Early enthusiasts predicted that its small size and ability to be rerecorded would make the relatively bulky and scratchable record obsolete.
Despite these obvious benefits, cassettes lack random access (the ability to play selections in a desired order) and are prone to their own forms of distortion and lack of fidelity. In the late 1980s and early 1990s, the digital compact disc (CD) did deliver the mortal blow. With the CD providing both random access and a level of quality close to the limits of the human auditory system, the phonograph record entered the stage of obsolescence in the first half of the 1990s. Although still produced in small quantities, the technology that Edison gave birth to more than a century ago is now approaching antiquity.
Another example is the print book, a rather mature technology today. It is now in the stage of the pretenders, with the software-based "virtual" book as the pretender. Lacking the resolution, contrast, freedom from flicker, and other visual qualities of paper and ink, the current generation of virtual books does not have the capability of displacing paper-based publications. Yet this victory of the paper-based book will be short-lived as future generations of computer displays succeed in providing a fully satisfactory alternative to paper.
The Emergence of Moore's Law
Gordon Moore, a pioneer of integrated circuit technology and later chairman of Intel, noted in 1965 that the surface area of a transistor (as etched on an integrated circuit) was being reduced by approximately 50 percent every twelve months. In 1975, he was widely reported to have revised this observation to eighteen months. Moore claims that his 1975 update was to twenty-four months, and that does appear to be a better fit to the data.
MOORE'S LAW AT WORK
Year    Transistors in Intel's Latest Computer Chip*
1972         3,500
1974         6,000
1978        29,000
1982       134,000
1985       275,000
1989     1,200,000
1993     3,100,000
1995     5,500,000
1997     7,500,000
*Consumer Electronics Manufacturers Association
The result is that every two years, you can pack twice as many transistors on an integrated circuit. This doubles both the number of components on a chip and its speed. Since the cost of an integrated circuit is fairly constant, the implication is that every two years you can get twice as much circuitry running at twice the speed for the same price. For many applications, that's an effective quadrupling of the value. The observation holds true for every type of circuit, from memory chips to computer processors.
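As a rough check on this doubling claim, the following sketch (my own illustration, not part of the original text) fits a straight line to the base-2 logarithm of the transistor counts in the table above; the reciprocal of the slope is the implied doubling time.

```python
import numpy as np

# Transistor counts for Intel's latest chips, taken from the table above.
years = np.array([1972, 1974, 1978, 1982, 1985, 1989, 1993, 1995, 1997])
transistors = np.array([3_500, 6_000, 29_000, 134_000, 275_000,
                        1_200_000, 3_100_000, 5_500_000, 7_500_000])

# Fit a straight line to log2(transistors) versus year; the slope is
# doublings per year, so its reciprocal is the doubling time.
slope, intercept = np.polyfit(years, np.log2(transistors), 1)
print(f"Implied doubling time: {1 / slope:.1f} years")  # about 2.3 years for this data
```

For this particular series the fit lands close to the twenty-four-month figure Moore himself settled on.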
This insightful observation has become known as Moore's Law on Integrated Circuits, and the remarkable phenomenon of the law has been driving the acceleration of computing for the past forty years. But how much longer can this go on? The chip companies have expressed confidence in another fifteen to twenty years of Moore's Law by continuing their practice of using increasingly higher resolutions of optical lithography (an electronic process similar to photographic printing) to reduce the feature size -- measured today in millionths of a meter -- of transistors and other key components. But then -- after almost sixty years -- this paradigm will break down. The transistor insulators will then be just a few atoms thick, and the conventional approach of shrinking them won't work.
What then?
We first note that the exponential growth of computing did not start with Moore's Law on Integrated Circuits. In the accompanying figure, "The Exponential Growth of Computing, 1900-1998," I plotted forty-nine notable computing machines spanning the twentieth century on an exponential chart, in which the vertical axis represents powers of ten in computer speed per unit cost (as measured in the number of "calculations per second" that can be purchased for $1,000). Each point on the graph represents one of the machines. The first five machines used mechanical technology, followed by three electromechanical (relay based) computers, followed by eleven vacuum-tube machines, followed by twelve machines using discrete transistors. Only the last eighteen computers used integrated circuits.
I then fit a curve called a fourth-order polynomial, which allows for up to three bends, to the points. In other words, I did not try to fit a straight line to the points, just the closest fourth-order curve. Yet a straight line is close to what I got. A straight line on an exponential graph means exponential growth. A careful examination of the trend shows that the curve is actually bending slightly upward, indicating a small exponential growth in the rate of exponential growth. This may result from the interaction of two different exponential trends, as I will discuss in chapter 6, "Building New Brains." Or there may indeed be two levels of exponential growth. Yet even if we take the more conservative view that there is just one level of acceleration, we can see that the exponential growth of computing did not start with Moore's Law on Integrated Circuits, but dates back to the advent of electrical computing at the beginning of the twentieth century.
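The forty-nine data points themselves are not reproduced in this excerpt, so the sketch below uses invented placeholder values purely to show the mechanics being described: put calculations per second per $1,000 on a logarithmic scale, fit a fourth-order polynomial, and see whether the result comes out nearly straight.

```python
import numpy as np

# Placeholder values standing in for the 49 machines (year, calculations
# per second per $1,000); the real figures are in Kurzweil's chart and
# are not reproduced in this excerpt.
years = np.array([1900, 1920, 1940, 1950, 1960, 1970, 1980, 1990, 1998])
cps_per_1000_dollars = np.array([1e-5, 1e-4, 1e-2, 1e0, 1e2, 1e4, 1e6, 1e8, 1e9])

log_cps = np.log10(cps_per_1000_dollars)
x = years - years.mean()  # center the years to keep the fit well conditioned

# Fit a fourth-order polynomial (allowing up to three bends) to the
# log-scale data, rather than forcing a straight line.
coeffs = np.polyfit(x, log_cps, 4)
fitted = np.polyval(coeffs, x)

# If the fitted curve is nearly straight on this logarithmic scale, the
# underlying growth is exponential; a slight upward bend would suggest
# the rate of exponential growth is itself growing.
print(np.round(coeffs, 6))
print(np.round(fitted, 2))
```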
Mechanical Computing Devices
1900 Analytical Engine
1908 Hollerith Tabulator
1911 Monroe Calculator
1919 IBM Tabulator
1928 National Ellis 3000
Electromechanical (Relay Based) Computers
1939 Zuse 2
1940 Bell Calculator Model 1
1941 Zuse 3
Vacuum-Tube Computers
1943 Colossus
1946 ENIAC
1948 IBM SSEC
1949 BINAC
1949 EDSAC
1951 Univac I
1953 Univac 1103
1953 IBM 701
1954 EDVAC
1955 Whirlwind
1955 IBM 704
Discrete Transistor Computers
1958 Datamatic 1000
1958 Univac II
1959 Mobidic
1959 IBM 7090
1960 IBM 1620
1960 DEC PDP-1
1961 DEC PDP-4
1962 Univac III
1964 CDC 6600
1965 IBM 1130
1965 DEC PDP-8
1966 IBM 360 Model 75
Integrated Circuit Computers
1968 DEC PDP-10
1973 Intellec-8
1973 Data General Nova
1975 Altair 8800
1976 DEC PDP-11 Model 70
1977 Cray 1
1977 Apple II
1979 DEC VAX 11 Model 780
1980 Sun-1
1982 IBM PC
1982 Compaq Portable
1983 IBM AT-80286
1984 Apple Macintosh
1986 Compaq Deskpro 386
1987 Apple Mac II
1993 Pentium PC
1996 Pentium PC
1998 Pentium II PC
In the 1980s, a number of observers, including Carnegie Mellon University professor Hans Moravec, Nippon Electric Company's David Waltz, and myself, noticed that computers have been growing exponentially in power, long before the invention of the integrated circuit in 1958 or even the transistor in 1947. The doubling time for the speed and density of computation has shrunk from three years at the beginning of the twentieth century to one year at its end, regardless of the type of hardware used. Remarkably, this "Exponential Law of Computing" has held true for at least a century, from the mechanical card-based electrical computing technology used in the 1890 U.S. census, to the relay-based computers that cracked the Nazi Enigma code, to the vacuum-tube-based computers of the 1950s, to the transistor-based machines of the 1960s, and to all of the generations of integrated circuits of the past four decades. Computers are about one hundred million times more powerful for the same unit cost than they were a half century ago. If the automobile industry had made as much progress in the past fifty years, a car today would cost a hundredth of a cent and go faster than the speed of light.
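A quick arithmetic check of the hundred-million-fold figure (my own back-of-the-envelope calculation, using only the numbers quoted above): a factor of 10^8 corresponds to roughly 27 doublings, which over fifty years implies an average doubling time of just under two years.

```python
import math

# The text puts the gain in computing power per unit cost over the past
# half century at roughly one hundred million times. How many doublings
# is that, and what average doubling time over fifty years does it imply?
growth_factor = 1e8
doublings = math.log2(growth_factor)
print(f"Doublings required: {doublings:.1f}")                # about 26.6
print(f"Average doubling time: {50 / doublings:.2f} years")  # about 1.9 years
```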
As with any phenomenon of exponential growth, the increases are so slow at first as to be virtually unnoticeable. Despite many decades of progress since the first electrical calculating equipment was used in the 1890 census, it was not until the mid-1960s that this phenomenon was even noticed (although Alan Turing had an inkling of it in 1950). Even then, it was appreciated only by a small community of computer engineers and scientists. Today, you have only to scan the personal computer ads -- or the toy ads -- in your local newspaper to see the dramatic improvements in the price performance of computation that now arrive on a monthly basis.
So Moore's Law on Integrated Circuits was not the first, but the fifth paradigm to continue the now one-century-long exponential growth of computing. Each new paradigm came along just when needed. This suggests that exponential growth won't stop with the end of Moore's Law. But the answer to our question on the continuation of the exponential growth of computing is critical to our understanding of the twenty-first century. So to gain a deeper understanding of the true nature of this trend, we need to go back to our earlier questions on the exponential nature of time.
THE LAW OF TIME AND CHAOS
Is the flow of time something real, or might our sense of time passing be just an illusion that hides the fact that what is real is only a vast collection of moments?
-- Lee Smolin
Time is nature's way of preventing everything from happening at once.
-- Graffito
Things are more like they are now than they ever were before.
-- Dwight Eisenhower
Consider these diverse exponential trends:
The exponentially slowing pace that the Universe followed, with three epochs in the first billionth of a second, with later salient events taking billions of years.
The exponentially slowing pace in the development of an organism. In the first month after conception, we grow a body, a head, even a tail. We grow a brain in the first couple of months. After leaving our maternal confines, our maturation both physically and mentally is rapid at first. In the first year, we learn basic forms of mobility and communication. We experience milestones every month or so. Later on, key events march ever more slowly, taking years and then decades.
The exponentially quickening pace of the evolution of life-forms on Earth.
The exponentially quickening pace of the evolution of human-created technology, which picked up the pace from the evolution of life-forms.
The exponential growth of computing. Note that exponential growth of a process over time is just another way of expressing an exponentially quickening pace. For example, it took about ninety years to achieve the first MIP (Million Instructions per Second) for a thousand dollars. Now we add an additional MIP per thousand dollars every day. The overall innovation rate is clearly accelerating as well.
Moore's Law on Integrated Circuits. As I noted, this was the fifth paradigm to achieve the exponential growth of computing.
Many questions come to mind:
What is the common thread between these varied exponential trends?
Why do some of these processes speed up while others slow down?
And what does this tell us about the continuation of the exponential growth of computing when Moore's Law dies?
Is Moore's Law just a set of industry expectations and goals, as Randy Isaac, head of basic science at IBM, contends? Or is it part of a deeper phenomenon that goes far beyond the photolithography of integrated circuits?
After thinking about the relationship between these apparently diverse trends for several years, the surprising common theme became apparent to me.
What determines whether time speeds up or slows down? The consistent answer is that time moves in relation to the amount of chaos. We can state the Law of Time and Chaos as follows:
The Law of Time and Chaos: In a process, the time interval between salient events (that is, events that change the nature of the process, or significantly affect the future of the process) expands or contracts along with the amount of chaos.
When there is a lot of chaos in a process, it takes more time for significant events to occur. Conversely, as order increases, the time periods between salient events decrease.
We have to be careful here in our definition of chaos. It refers to the quantity of disordered (that is, random) events that are relevant to the process. If we're dealing with the random movement of atoms and molecules in a gas or liquid, then heat is an appropriate measure. If we're dealing with the process of evolution of life-forms, then chaos represents the unpredictable events encountered by organisms, and the random mutations that are introduced in the genetic code.
Let's see how the Law of Time and Chaos applies to our examples. If chaos is increasing, the Law of Time and Chaos implies the following sublaw:
The Law of Increasing Chaos: As chaos exponentially increases, time exponentially slows down (that is, the time interval between salient events grows longer as time passes).
This fits the Universe rather well. When the entire Universe was just a "naked" singularity -- a perfectly orderly single point in space and time -- there was no chaos and conspicuous events took almost no time at all. As the Universe grew in size, chaos increased exponentially, and so did the timescale for epochal changes. Now, with billions of galaxies sprawled out over trillions of light-years of space, the Universe contains vast reaches of chaos, and indeed requires billions of years to get everything organized for a paradigm shift to take place.
We see a similar phenomenon in the progression of an organism's life. We start out as a single fertilized cell, so there's only rather limited chaos there. Ending up with trillions of cells, chaos greatly expands. Finally, at the end of our lives, our designs deteriorate, engendering even greater randomness. So the time period between salient biological events grows longer as we grow older. And that is indeed what we experience.
But it is the opposite spiral of the Law of Time and Chaos that is the most important and relevant for our purposes. Consider the inverse sublaw, which I call the Law of Accelerating Returns:
The Law of Accelerating Returns: As order exponentially increases, time exponentially speeds up (that is, the time interval between salient events grows shorter as time passes).
The Law of Accelerating Returns (to distinguish it from a better-known law in which returns diminish) applies specifically to evolutionary processes. In an evolutionary process, it is order -- the opposite of chaos -- that is increasing. And, as we have seen, time speeds up.
Disdisorder
I noted above that the concept of chaos in the Law of Time and Chaos is tricky. Chaos alone is not sufficient -- disorder for our purposes requires randomness that is relevant to the process we are concerned with. The opposite of disorder -- which I called "order" in the above Law of Accelerating Returns -- is even trickier.
Let's start with our definition of disorder and work backward. If disorder represents a random sequence of events, then the opposite of disorder should imply "not random." And if random means unpredictable, then we might conclude that order means predictable. But that would be wrong.
Borrowing a page from information theory, consider the difference between information and noise. Information is a sequence of data that is meaningful in a process, such as the DNA code of an organism, or the bits in a computer program. Noise, on the other hand, is a random sequence. Neither noise nor information is predictable. Noise is inherently unpredictable, but carries no information. Information, however, is also unpredictable. If we can predict future data from past data, then that future data stops being information. For example, consider a sequence which simply alternates between zero and one (01010101 . . .). Such a sequence is certainly orderly, and very predictable. Specifically because it is so predictable, we do not consider it information bearing, beyond the first couple of bits.
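One informal way to see this distinction (my own illustration, using a general-purpose compressor as a rough stand-in for predictability) is that a perfectly regular alternating sequence compresses to almost nothing, while a random sequence barely compresses at all. A compressor cannot tell meaningful information from noise, so this only illustrates the "predictable means little new information" half of the argument.

```python
import os
import zlib

n = 10_000
alternating = bytes(i % 2 for i in range(n))  # 010101...: orderly and fully predictable
noise = os.urandom(n)                         # random bytes: unpredictable, no meaning

# A predictable sequence compresses to a tiny fraction of its length,
# reflecting how little it tells us beyond its first couple of symbols;
# random noise stays roughly the same size.
print(len(zlib.compress(alternating)))  # a few dozen bytes
print(len(zlib.compress(noise)))        # close to 10,000 bytes
```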
Thus orderliness does not constitute order because order requires information. So, perhaps I should use the word information instead of order. However, information alone is not sufficient for our purposes either. Consider a phone book. It certainly represents a lot of information, and some order as well. Yet if we double the size of the phone book, we have increased the amount of data, but we have not achieved a deeper level of order.
Order, then, is information that fits a purpose. The measure of order is the measure of how well the information fits the purpose. In the evolution of life-forms, the purpose is to survive. In an evolutionary algorithm (a computer program that simulates evolution to solve a problem) applied to, say, investing in the stock market, the purpose is to make money. Simply having more information does not necessarily result in a better fit. A superior solution for a purpose may very well involve less data.
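Since the evolutionary algorithm is defined here only in passing, the following is a minimal, generic sketch (a toy of my own, not an example from the book): candidate solutions are scored against a purpose, the fittest survive, and mutated copies refill the population, so information that fits the purpose accumulates over generations. The fitness function and parameters are arbitrary placeholders.

```python
import random

GENOME_LENGTH = 32
POPULATION_SIZE = 20
GENERATIONS = 100

def fitness(candidate):
    # "Order is information that fits a purpose": here the toy purpose is
    # simply to maximize the number of 1 bits in a candidate bit string.
    return sum(candidate)

def mutate(candidate, rate=0.05):
    # Random bit flips supply the disorder from which selection draws its options.
    return [bit ^ 1 if random.random() < rate else bit for bit in candidate]

population = [[random.randint(0, 1) for _ in range(GENOME_LENGTH)]
              for _ in range(POPULATION_SIZE)]

for _ in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    survivors = population[:POPULATION_SIZE // 2]  # keep the fitter half
    offspring = [mutate(random.choice(survivors))
                 for _ in range(POPULATION_SIZE - len(survivors))]
    population = survivors + offspring

print("Best fitness:", fitness(max(population, key=fitness)), "of", GENOME_LENGTH)
```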
The concept of "complexity" has been used recently to describe the nature of the information created by an evolutionary process. Complexity is a reasonably close fit to the concept of order that I am describing. After all, the designs created by the evolution of life-forms on Earth appear to have become more complex over time. However, complexity is not a perfect fit, either. Sometimes, a deeper order -- a better fit to a purpose -- is achieved through simplification rather than further increases in complexity. As Einstein said, "Everything should be made as simple as possible, but no simpler." For example, a new theory that ties together apparently disparate ideas into one broader, more coherent theory reduces complexity but nonetheless may increase the "order for a purpose" that I am describing. Evolution has shown, however, that the general trend toward greater order does generally result in greater complexity.
Thus improving a solution to a problem -- which may increase or decrease complexity -- increases order. Now that just leaves the issue of defining the problem. And as we will see, defining a problem well is often the key to finding its solution.
The Law of Increasing Entropy Versus the Growth of Order
Another consideration is how the Law of Time and Chaos relates to the second law of thermodynamics. Unlike the second law, the Law of Time and Chaos is not necessarily concerned with a closed system. It deals instead with a process. The Universe is a closed system (not subject to outside influence, since there is nothing outside the Universe), so in accordance with the second law of thermodynamics, chaos increases and time slows down. In contrast, evolution is precisely not a closed system. It takes place amid great chaos, and indeed depends on the disorder in its midst, from which it draws its options for diversity. And from these options, an evolutionary process continually prunes its choices to create ever greater order. Even a crisis that appears to introduce a significant new source of chaos is likely to end up increasing -- deepening -- the order created by an evolutionary process. For example, consider the asteroid that is thought to have killed off big organisms such as the dinosaurs 65 million years ago. The crash of that asteroid suddenly created a vast increase in chaos (and lots of dust, too). Yet it appears to have hastened the rise of mammals in the niche previously dominated by large reptiles and ultimately led to the emergence of a technology-creating species. When the dust settled (literally), the crisis of the asteroid had increased order.
As I pointed out earlier, only a tiny fraction of the stuff in the Universe, or even on a life- and technology-bearing planet such as Earth, can be considered to be part of evolution's inventions. Thus evolution does not contradict the Law of Increasing Entropy. Indeed, it depends on it to provide a never-ending supply of options.
As I noted, given the emergence of life, the emergence of a technology-creating species -- and of technology -- is inevitable. Technology is the continuation of evolution by other means, and is itself an evolutionary process. So it, too, speeds up.
A primary reason that evolution -- of life-forms or of technology -- speeds up is that it builds on its own increasing order. Innovations created by evolution encourage and enable faster evolution. In the case of the evolution of life-forms, the most notable example is DNA, which provides a recorded and protected transcription of life's design from which to launch further experiments.
In the case of the evolution of technology, ever improving human methods of recording information have fostered further technology. The first computers were designed on paper and assembled by hand. Today, they are designed on computer workstations with the computers themselves working out many details of the next generation's design, and are then produced in fully automated factories with human guidance but limited direct intervention.
The evolutionary process of technology seeks to improve capabilities in an exponential fashion. Innovators seek to improve things by multiples. Innovation is multiplicative, not additive. Technology, like any evolutionary process, builds on itself. This aspect will continue to accelerate when the technology itself takes full control of its own progression.
We can thus conclude the following with regard to the evolution of life-forms, and of technology:
The Law of Accelerating Returns as Applied to an Evolutionary Process:
An evolutionary process is not a closed system; therefore, evolution draws upon the chaos in the larger system in which it takes place for its options for diversity; and
Evolution builds on its own increasing order.
Therefore:
In an evolutionary process, order increases exponentially.
Therefore:
Time exponentially speeds up.
Therefore:
The returns (that is, the valuable products of the process) accelerate.
The phenomenon of time slowing down and speeding up is occurring simultaneously. Cosmologically speaking, the Universe continues to slow down. Evolution, now most noticeably in the form of human-created technology, continues to speed up. These are the two sides -- two interleaved spirals -- of the Law of Time and Chaos.
The spiral we are most interested in -- the Law of Accelerating Returns -- gives us ever greater order in technology, which inevitably leads to the emergence of computation. Computation is the essence of order. It provides the ability for a technology to respond in a variable and appropriate manner to its environment to carry out its mission. Thus computational technology is also an evolutionary process, and also builds on its own progress. The time to accomplish a fixed objective gets exponentially shorter over time (for example, ninety years for the first MIP per thousand dollars versus one day for an additional MIP today). That the power of computing grows exponentially over time is just another way to say the same thing.
So Where Does That Leave Moore's Law?
Well, it still leaves it dead by the year 2020. Moore's Law came along in 1958 just when it was needed and will have done its sixty years of service by 2018, a rather long period of time for a paradigm nowadays. Unlike Moore's Law, however, the Law of Accelerating Returns is not a temporary methodology. It is a basic attribute of the nature of time and chaos -- a sublaw of the Law of Time and Chaos -- and describes a wide range of apparently divergent phenomena and trends. In accordance with the Law of Accelerating Returns, another computational technology will pick up where Moore's Law will have left off, without missing a beat.
Most Exponential Trends Hit a Wall . . . but Not This One
A frequent criticism of predictions of the future is that they rely on mindless extrapolation of current trends without consideration of forces that may terminate or alter that trend. This criticism is particularly relevant in the case of exponential trends. A classic example is a species happening upon a hospitable new habitat, perhaps transplanted there by human intervention (rabbits in Australia, say). Its numbers multiply exponentially for a while, but this phenomenon is quickly terminated when the exploding population runs into a new predator or the limits of its environment. Similarly, the geometric population growth of our own species has been a source of anxiety, but changing social and economic factors, including growing prosperity, have greatly slowed this expansion in recent years, even in developing countries.
Based on this, some observers are quick to predict the demise of the exponential growth of computing.
But the growth predicted by the Law of Accelerating Returns is an exception to the frequently cited limitations to exponential growth. Even a catastrophe, as apparently befell our reptilian cohabitants in the late Cretaceous period, only sidesteps an evolutionary process, which then picks up the pieces and continues unabated (unless the entire process is wiped out). An evolutionary process accelerates because it builds on its past achievements, which includes improvements in its own means for further evolution. In the evolution of life-forms, in addition to DNA-based genetic coding, the innovation of sexual reproduction provided for improved means of experimenting with diverse characteristics within an otherwise homogenous population. The establishment of basic body plans of modern animals in the "Cambrian explosion," about 570 million years ago, allowed evolution to concentrate on higher-level features such as expanded brain function. The inventions of evolution in one era provide the means, and often the intelligence, for innovation in the next.
The Law of Accelerating Returns applies equally to the evolutionary process of computation, which inherently will grow exponentially and essentially without limit. The two resources it needs -- the growing order of the evolving technology itself and the chaos from which an evolutionary process draws its options for further diversity -- are unbounded. Ultimately, the innovation needed for further turns of the screw will come from the machines themselves.
How will the power of computing continue to accelerate after Moore's Law dies? We are just beginning to explore the third dimension in chip design. The vast majority of today's chips are flat, whereas our brain is organized in three dimensions. We live in a three-dimensional world, so why not use the third dimension? Improvements in semiconductor materials, including superconducting circuits that don't generate heat, will enable us to develop chips -- that is, cubes -- with thousands of layers of circuitry that, combined with far smaller component geometries, will improve computing power by a factor of many millions. And there are more than enough other new computing technologies waiting in the wings -- nanotube, optical, crystalline, DNA, and quantum (which we'll visit in chapter 6, "Building New Brains") -- to keep the Law of Accelerating Returns going in the world of computation for a very long time.
THE LEARNING CURVE: SLUG VERSUS HUMAN
The "learning curve" describes the mastery of a skill over time. As an entity -- slug or human -- learns a new skill, the newly acquired ability builds on itself, and so the learning curve starts out looking like the exponential growth we see in the Law of Accelerating Returns. Skills tend to be bounded, so as the new expertise is mastered, the law of diminishing returns sets in, and growth in mastery levels off. So the learning curve is what we call an S curve because exponential growth followed by a leveling off looks like an S leaning slightly to the right: S.
The learning curve is remarkably universal: most multicellular creatures follow it. Slugs, for example, trace this curve when learning how to ascend a new tree in search of leaves. Humans, of course, are always learning something new.
But there's a salient difference between humans and slugs. Humans are capable of innovation, which is the creation and retention of new skills and knowledge. Innovation is the driving force in the Law of Accelerating Returns, and eliminates the leveling-off part of the S curve. So innovation turns the S curve into indefinite exponential expansion.
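To make the contrast concrete, here is a minimal numerical sketch in Python, offered only as an illustration and not taken from the text: a logistic function stands in for the bounded S curve, a pure exponential stands in for growth once innovation keeps lifting the ceiling, and the names s_curve and compounding_growth, along with the rate r, ceiling K, and midpoint t_mid, are arbitrary choices made for this sketch.

import math

def s_curve(t, K=100.0, r=0.5, t_mid=10.0):
    """Logistic learning curve: near-exponential at first, then levels off at the ceiling K."""
    return K / (1.0 + math.exp(-r * (t - t_mid)))

def compounding_growth(t, x0=1.0, r=0.5):
    """Pure exponential growth: the shape that remains if innovation keeps removing the ceiling."""
    return x0 * math.exp(r * t)

for t in range(0, 31, 5):
    print(f"t={t:2d}  S curve={s_curve(t):8.2f}  exponential={compounding_growth(t):12.2f}")

The printed table shows the S curve flattening against its ceiling while the exponential keeps compounding, which is the difference between mastering one bounded skill and the indefinite expansion described above.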
Overcoming the S curve is another way to express the unique status of the human species. No other species appears to do this. Why are we unique in this way, given that other primates are so close to us in terms of genetic similarity?
The reason is that the ability to overcome the S curve defines a new ecological niche. As I pointed out, there were indeed other humanoid species and subspecies capable of innovation, but the niche seems to have tolerated only one surviving competitor. But we will have company in the twenty-first century as our machines join us in this exclusive niche.
A Planetary Affair
The introduction of technology on Earth is not merely the private affair of one of the Earth's innumerable species. It is a pivotal event in the history of the planet. Evolution's grandest creation -- human intelligence -- is providing the means for the next stage of evolution, which is technology. The emergence of technology is predicted by the Law of Accelerating Returns. The Homo sapiens sapiens subspecies emerged only tens of thousands of years after its human forebears. According to the Law of Accelerating Returns, the next stage of evolution should measure its salient events in mere thousands of years, too quick for DNA-based evolution. This next stage of evolution was necessarily created by human intelligence itself, another example of the exponential engine of evolution using its innovations from one period (human beings) to create the next (intelligent machines).
Evolution draws upon the great chaos in its midst -- the ever increasing entropy governed by the flip side of the Law of Time and Chaos -- for its options for innovation. These two strands of the Law of Time and Chaos -- time exponentially slowing down due to the increasing chaos predicted by the second law of thermodynamics; and time exponentially speeding up due to the increasing order created by evolution -- coexist and progress without limit. In particular, the resources of evolution, order and chaos, are unbounded. I stress this point because it is crucial to understanding the evolutionary -- and revolutionary -- nature of computer technology.
The emergence of technology was a milestone in the evolution of intelligence on Earth because it represented a new means for evolution to record its designs. The next milestone will be technology creating its own next generation without human intervention. That only tens of thousands of years separate these two milestones is another example of the exponentially quickening pace of evolution.
The Inventor of Chess and the Emperor of China
To appreciate the implications of this (or any) geometric trend, it is useful to recall the legend of the inventor of chess and his patron, the emperor of China. The emperor had so fallen in love with his new game that he offered the inventor a reward of anything he wanted in the kingdom.
"Just one grain of rice on the first square, Your Majesty."
"Just one grain of rice?"
"Yes, Your Majesty, just one grain of rice on the first square, and two grains of rice on the second square."
"That's it -- one and two grains of rice?"
"Well, okay, and four grains on the third square, and so on."
The emperor immediately granted the inventor's seemingly humble request. One version of the story has the emperor going bankrupt because the doubling of grains of rice for each square ultimately equaled 18 million trillion grains of rice. At ten grains of rice per square inch, this requires rice fields covering twice the surface area of the Earth, oceans included.
The other version of the story has the inventor losing his head. It's not yet clear which outcome we're headed for.
But there is one thing we should note: It was fairly uneventful as the emperor and the inventor went through the first half of the chessboard. After thirty-two squares, the emperor had given the inventor about 4 billion grains of rice. That's a reasonable quantity -- about one large field's worth -- and the emperor did start to take notice.
But the emperor could still remain an emperor. And the inventor could still retain his head. It was as they headed into the second half of the chessboard that at least one of them got into trouble.
So where do we stand now? There have been about thirty-two doublings of speed and capacity since the first operating computers were built in the 1940s. Where we stand right now is that we have finished the first half of the chessboard. And, indeed, people are starting to take notice.
Now, as we head into the next century, we are heading into the second half of the chessboard. And this is where things start to get interesting.
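For readers who want to check the arithmetic, here is a quick back-of-the-envelope sketch in Python. The density of ten grains per square inch comes from the story above; Earth's surface area of roughly 510 million square kilometers is a standard figure supplied here as an assumption, not something stated in the text.

# Grains of rice on a doubling chessboard, and a rough check of the figures above.
first_half = 2**32 - 1    # total grains after the first 32 squares: about 4.3 billion
full_board = 2**64 - 1    # total grains after all 64 squares: about 1.8 x 10^19

SQ_IN_PER_SQ_KM = (1000 * 100 / 2.54) ** 2              # square inches in one square kilometer
earth_surface_sq_in = 510_000_000 * SQ_IN_PER_SQ_KM     # ~510 million sq km, assumed standard value
rice_field_sq_in = full_board / 10                      # at ten grains per square inch

print(f"First half of the board: {first_half:,} grains")
print(f"Whole board:             {full_board:,} grains")
print(f"Rice field area:         {rice_field_sq_in / earth_surface_sq_in:.1f} times Earth's surface")

Run as is, the sketch gives about 4.3 billion grains at the halfway point and a whole-board total whose rice field, at that density, covers a bit more than twice the Earth's surface, which matches the figures in the legend.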
OKAY, LET ME GET THIS STRAIGHT, MY CONCEPTION AS A FERTILIZED EGG WAS LIKE THE UNIVERSE'S BIG BANG -- UH, NO PUN INTENDED -- THAT IS, THINGS STARTED OUT HAPPENING VERY FAST, THEN KIND OF SLOWED DOWN, AND NOW THEY'RE REAL SLOW?
That's a reasonable way to put it; the time interval between milestones is now a lot longer than it was when you were an infant, let alone a fetus.
YOU MENTIONED THE UNIVERSE HAD THREE PARADIGM SHIFTS IN THE FIRST BILLIONTH OF A SECOND. WERE THINGS THAT FAST WHEN I GOT STARTED?
Not quite that fast. The Universe started as a singularity, a single point taking up no space and comprising, therefore, no chaos. So the first major event, the creation of the Universe itself, took no time at all. With the Universe still very small, events unfolded extremely quickly. We don't start out as a single point, but as a rather complex cell, which has order but also a great deal of random activity compared to a single point in space. So our first major event as an organism, the first mitosis of our fertilized egg, is measured in hours, not trillionths of a second. Things slow down from there.
BUT I FEEL LIKE TIME IS SPEEDING UP. THE YEARS JUST GO BY SO MUCH FASTER NOW THAN THEY DID WHEN I WAS A KID. DON'T YOU HAVE IT BACKWARD?
Yes, well, the subjective experience is the opposite of the objective reality.
OF COURSE. WHY DIDN'T I THINK OF THAT?
Let me clarify what I mean. The objective reality is the reality of the outside observer observing the process. If we observe the development of an individual, salient events happen very quickly at first, but later on milestones are more spread out, so we say time is slowing down. The subjective experience, however, is the experience of the process itself, assuming, of course, that the process is conscious. Which in your case, it is. At least, I assume that's the case.
THANK YOU.
Subjectively, our perception of time is affected by the spacing of milestones.
MILESTONES?
Yeah, like growing a body and a brain.
AND BEING BORN?
Sure, that's a milestone. Then learning to sit up, walking, talking . . .
OKAY.
We can consider each subjective unit of time to be equivalent to one milestone spacing. Since our milestones are spaced further apart as we grow older, a subjective unit of time represents a longer span of time for an adult than for a child, and so time feels as if it is passing more quickly as we grow older. An interval of a few years to an adult may be perceived as comparable to a few months to a young child: a long interval to the adult and a short interval to the child represent the same subjective time in terms of the passage of salient events. Of course, the long and short intervals also represent comparable fractions of their respective past lives.
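One simple way to formalize this claim (a toy model offered here for illustration, not a formula from the text) is to assume that the perceived length of an interval scales with the fraction of one's past life it spans, which makes subjective time grow roughly logarithmically with age. The function name perceived_length and the ages below are arbitrary choices for this sketch.

import math

def perceived_length(age_start, age_end):
    """Relative subjective length of the interval from age_start to age_end (log of the ratio)."""
    return math.log(age_end / age_start)

child = perceived_length(5, 5.25)   # three months to a five-year-old
adult = perceived_length(40, 42)    # two years to a forty-year-old
print(f"child, 3 months: {child:.3f}    adult, 2 years: {adult:.3f}")

Both intervals are the same fraction of the age at which they begin (5 percent), so under this toy model they register as the same amount of subjective time, in line with the comparison above.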
SO DOES THAT EXPLAIN WHY TIME PASSES MORE QUICKLY WHEN I'M HAVING A GOOD TIME?
Well, it may be relevant to one phenomenon. If someone goes through an experience in which a lot of significant events occur, that experience may feel like a much longer period of time than a calmer period. Again, we measure subjective time in terms of salient experiences.
NOW IF I FIND TIME SPEEDING UP WHEN OBJECTIVELY IT IS SLOWING DOWN, THEN EVOLUTION WOULD SUBJECTIVELY FIND TIME SLOWING DOWN AS IT OBJECTIVELY SPEEDS UP, DO I HAVE THAT STRAIGHT?
Yes, if evolution were conscious.
WELL, IS IT?
There's no way to really tell, but evolution has its time spiral going in the opposite direction from entities we generally consider to be conscious, such as humans. In other words, evolution starts out slow and speeds up over time, whereas the development of a person starts out fast and then slows down. The Universe, however, does have its time spiral going in the same direction as us organisms, so it would make more sense to say that the Universe is conscious. And come to think of it, that does shed some light on what happened before the big bang.
I WAS JUST WONDERING ABOUT THAT.
As we look back in time and get closer to the event of the big bang, chaos is shrinking to zero. Thus from the subjective perspective, time is stretching out. Indeed, as we go back in time and approach the big bang, subjective time approaches infinity. Thus it is not possible to go back past a subjective infinity of time.
THAT'S A LOAD OFF MY MIND. NOW YOU SAID THAT THE EXPONENTIAL PROGRESS OF AN EVOLUTIONARY PROCESS GOES ON FOREVER. IS THERE ANYTHING THAT CAN STOP IT?
Only a catastrophe that wipes out the entire process.
SUCH AS AN ALL-OUT NUCLEAR WAR?
That's one scenario, but in the next century, we will encounter a plethora of other "failure modes." We'll talk about this in later chapters.
I CAN'T WAIT. NOW TELL ME THIS, WHAT DOES THE LAW OF ACCELERATING RETURNS HAVE TO DO WITH THE TWENTY-FIRST CENTURY?
Exponential trends are immensely powerful but deceptive. They linger for eons with very little effect. But once they reach the "knee of the curve," they explode with unrelenting fury. With regard to computer technology and its impact on human society, that knee is approaching with the new millennium. Now I have a question for you.
SHOOT.
Just who are you anyway?
WHY, I'M THE READER.
Of course. Well, it's good to have you contributing to the book while there's still time to do something about it.
GLAD TO. NOW, YOU NEVER DID GIVE THE ENDING TO THE EMPEROR STORY. SO DOES THE EMPEROR LOSE HIS EMPIRE, OR DOES THE INVENTOR LOSE HIS HEAD?
I have two endings, so I just can't say.