In the Winter 1998/99 FREE INQUIRY, Richard Dawkins published Part 1 of his assessment of the state of science as we end the millennium. He traced scientific progress over the last 1,000 years to today's digital revolution, and lamented the deficiencies that prevent modern times from being the golden age of science.
He continues his analysis in Part 2 below. "Science and Sensibility" is adapted from a lecture series entitled "Sounding the Century: What Will the Twentieth Century Leave to its Heirs?" and expanded upon in his new book Unweaving the Rainbow.
Personal uncertainty about the uncertainty principle reminds me of another hallmark that will be alleged for twentieth-century science. This is the century, it will be claimed, in which the deterministic confidence of the previous one was shattered. Partly by quantum theory. Partly by chaos (in the trendy, not the ordinary language meaning). And partly by relativism (cultural relativism, not the sensible, Einsteinian meaning).
TRENDS OF ANTI-SCIENCE
Quantum uncertainty and chaos theory have had deplorable effects upon popular culture, much to the annoyance of genuine aficionados. Both are regularly exploited by obscurantists, ranging from professional quacks to daffy New-Agers. In America, the self-help "healing" industry coins millions, and it has not been slow to cash in on quantum theory's formidable talent to bewilder. This has been documented by the American physicist Victor Stenger. One well-heeled healer wrote a string of best-selling books on what he calls "Quantum Healing." Another book in my possession has sections on quantum psychology, quantum responsibility, quantum morality, quantum aesthetics, quantum immortality, and quantum theology.
Chaos theory, a more recent invention, is equally fertile ground for those with a bent for abusing sense. It is unfortunately named, for "chaos" implies randomness. Chaos in the technical sense is not random at all. It is completely determined, but it depends hugely, in strangely hard-to-predict ways, on tiny differences in initial conditions. Undoubtedly it is mathematically interesting. If it impinges on the real world, it would rule out ultimate prediction. If the weather is technically chaotic, weather forecasting in detail becomes impossible. Major events like hurricanes might be determined by tiny causes in the past, such as the now proverbial flap of a butterfly's wing. This does not mean that you can flap the equivalent of a wing and hope to generate a hurricane. As the physicist Robert Park says, this is "a total misunderstanding of what chaos is about. . . while the flapping of a butterfly's wings might conceivably trigger a hurricane, killing butterflies is unlikely to reduce the incidence of hurricanes."
Quantum theory and chaos theory, each in its own peculiar way, may call into question the predictability of the universe, in deep principle. This could be seen as a retreat from nineteenth-century confidence. But nobody really thought that such fine details would ever be predicted in practice, anyway. The most confident determinist would always have admitted that, in practice, sheer complexity of interacting causes would defeat accurate prediction of weather or turbulence. So chaos doesn't make a lot of difference in practice. Conversely, quantum events are statistically smothered, and massively so, in most realms that impinge on us. So the possibility of prediction is, for practical purposes, restored.
In the late twentieth century, prediction of future events in practice has never been more confident or more accurate. This is dramatic in the feats of space engineers. Previous centuries could predict the return of Halley's Comet. Twentieth-century science can hurl a projectile along the right trajectory to intercept it, precisely computing and exploiting the gravitational slings of the solar system. Quantum theory itself, whatever the indeterminacy at its heart, is spectacularly accurate in the experimental accuracy of its predictions. The late Richard Feynman assessed this accuracy as equivalent to knowing the distance between New York and Los Angeles to the width of one human hair. Here is no licence for anything-goes, intellectual flappers, with their quantum theology and quantum you-name-it.
Cultural relativism is the most pernicious of these myths of twentieth-century retreat from Victorian certainty. A modish fad sees science as only one of many cultural myths, no more true nor valid than the myths of any other culture. In the United States it is fed by justified guilt over the appalling treatment of Native Americans. But the consequences can be laughable, as in the case of Kennewick Man.
Kennewick Man is a skeleton discovered in Washington State in 1996, carbon-dated to older than 9,000 years. Anthropologists were intrigued by anatomical suggestions that he might be unrelated to typical Native Americans, and might represent a separate early migration across what is now the Bering Strait, or even from Iceland. They were about to do all-important DNA tests when the legal authorities seized the skeleton, intending to hand it over to representatives of local Indian tribes, who proposed to bury it and forbid all further study. Naturally there was widespread opposition from the scientific and archaeological community. Even if Kennewick Man is an American Indian of some kind, it is highly unlikely that his affinities lie with whichever particular tribe happens to live in the same area 9,000 years later.
Native Americans have impressive legal muscle, and "The Ancient One" might have been handed over to the tribes, but for a bizarre twist. The Asatru Folk Assembly, a group of worshipers of the Norse Gods Thor and Odin, filed an independent legal claim that Kennewick Man was actually a Viking. This Nordic sect, whose case you may read in your copy of The Runestone, were actually allowed to hold a religious service over the bones. This upset the Yakama Indian community, whose spokesman feared that the Viking ceremony could be "keeping Kennewick Man's spirit from finding his body." The dispute between Indians and Norsemen might be settled by DNA comparison with Kennewick Man, and the Norsemen are quite keen to be put to this test. More probably, DNA would decide the case in favor of neither side. Further scientific study would certainly cast fascinating light on the question of when humans first arrived in America. But Indian leaders resent the very idea of studying this question, because they believe their ancestors have been in America since the creation. As Armand Minthorn, religious leader of the Umatilla tribe, puts it: " From our oral histories, we know that our people have been part of this land since the beginning of time. We do not believe our people migrated here from another continent, as the scientists do."
Perhaps the best policy for the archaeologists would be to declare themselves a religion, with DNA fingerprints their sacramental totem. Facetious, but, such is the climate in the United States at the end of the twentieth century, it is possibly the only recourse that would work. If you say, "Look, here is overwhelming evidence from carbon dating, from mitochondrial DNA, and from archaeological analyses of pottery, that X is the case" you will get nowhere. But if you say, "It is a fundamental and unquestioned belief of my culture that X is the case" you will immediately hold a judge's attention.
Also the attention of many in the academic community who, in the late twentieth century, have discovered a new form of anti-scientific rhetoric, sometimes called the "postmodern critique" of science. The most thorough whistleblowing on this kind of thing is Paul Gross and Norman Levitt's splendid book, Higher Superstition: The Academic Left and Its Quarrels with Science. The American anthropologist Matt Cartmill sums up the basic credo:
Anybody who claims to have objective knowledge about anything is trying to control and dominate the rest of us.... There are no objective facts. All supposed "facts" are contaminated with theories, and all theories are infested with moral and political doctrines. Therefore, when some guy in a lab coat tells you that such and such is an objective fact . . . he must have a political agenda up his starched white sleeve. There are even a few, but very vocal, fifth columnists within science itself who hold exactly these views, and use them to waste the time of the rest of us.
Cartmill's thesis is that there is an unexpected and pernicious alliance between the know-nothing fundamentalist religious Right, and the sophisticated academic Left. A bizarre manifestation of the alliance is joint opposition to the theory of evolution. The opposition of the fundamentalists is obvious. That of the left is a compound of hostility to science in general, of "respect" for tribal creation myths, and various political agendas. Both these strange bedfellows share a concern for "human dignity" and take offense at treating humans as "animals." Moreover, in Cartmill's words, "Both camps believe that the big truths about the world are moral truths. They view the universe in terms of good and evil, not truth and falsehood. The first question they ask about any supposed fact is whether it serves the cause of righteousness."
And there is a feminist angle, which saddens me, for I am sympathetic to true feminism. Instead of exhorting young women to prepare for a variety of technical subjects by studying science, logic, and mathematics, Women's Studies students are now being taught that logic is a tool of domination. . . the standard norms and methods of scientific inquiry are sexist because they are incompatible with "women's ways of knowing." The authors of the prizewinning book with this title report that the majority of the women they interviewed fell into the category of "subjective knowers," characterized by a "passionate rejection of science and scientists." These "subjectivist" women see the methods of logic, analysis, and abstraction as "alien territory belonging to men" and "value intuition as a safer and more fruitful approach to truth."
That was a quotation from the historian and philosopher of science Noretta Koertge, who is understandably worried about a subversion of feminism which could have a malign influence upon women's education. Indeed, there is an ugly, hectoring streak in this kind of thinking. Barbara Ehrenreich and Janet McIntosh witnessed a woman psychologist speaking at an interdisciplinary conference. Various members of the audience attacked her use of the oppressive, sexist, imperialist, and capitalist scientific method. The psychologist tried to defend science by pointing to its great discoveries, for example DNA. The retort came back: "You believe in DNA?" Fortunately, there are still many intelligent young women prepared to enter a scientific career, and I should like to pay tribute to their courage in the face of such bullying intimidation.
I have come so far with scarcely a mention of Charles Darwin. His life spanned most of the nineteenth century, and he died with every right to be satisfied that he had cured humanity of its greatest and grandest illusion. Darwin brought life itself within the pale of the explicable. No longer a baffling mystery demanding supernatural explanation, life, with the complexity and elegance that defines it, grows and gradually emerges, by easily understood rules, from simple beginnings. Darwin's legacy to the twentieth century was to demystify the greatest mystery of all.
LEGACY AND OUTLOOK
Would Darwin be pleased with our stewardship of that legacy, and with what we are now in a position to pass to the twenty-first century? I think he would feel an odd mixture of exhilaration and exasperation. Exhilaration at the detailed knowledge, the comprehensiveness of understanding, that science can now offer, and the polish with which his own theory is being brought to fulfillment. Exasperation at the ignorant suspicion of science, and the air-headed superstition, that still persist.
Exasperation is too weak a word. Darwin might justifiably be saddened, given our huge advantages over himself and his contemporaries, at how little we seem to have done to deploy our superior knowledge in our culture. Late twentieth-century civilization, Darwin would be dismayed to note, though imbued and surrounded by the products and advantages of science, has yet to draw science into its sensibility. Is there even a sense in which we have slipped backwards since Darwin's co-discoverer, Alfred Russel Wallace, wrote The Wonderful Century, a glowing scientific retrospective on his era?
Perhaps there was undue complacency in turn-of-the-century science about how much had been achieved and how little more advancement could be expected. William Thomson, First Lord Kelvin, president of the Royal Society, pioneered the transatlantic cable (symbol of Victorian progress) and also the second law of thermodynamics (C. P. Snow's litmus of scientific literacy). Kelvin is credited with the following three confident predictions: "Radio has no future"; "Heavier-than-air flying machines are impossible"; "X-rays will prove to be a hoax."
Kelvin also gave Darwin a lot of grief by proving, with all the prestige of the senior science of physics, that the sun was too young to have allowed time for evolution. Kelvin, in effect, said, "Physics argues against evolution, so your biology must be wrong." Darwin could have retorted: "Biology shows that evolution is a fact, so your physics must be wrong." Instead, he bowed to the prevailing assumption that physics automatically trumps biology, and fretted. Twentieth-century physics, of course, showed Kelvin to be wrong by powers of ten. But Darwin did not live to see his vindication, and he never had the confidence to tell the senior physicist of his day where to get off.
In my attacks on millenarian superstition, I must beware of Kelvinian over-confidence. Undoubtedly there is much that we still don't know. Part of our legacy to the twenty-first century must be unanswered questions, and some of them are big ones. The science of any age must prepare to be superseded. It would be arrogant and rash to claim our present knowledge as all there is to know. Today's commonplaces, such as mobile telephones, would have seemed to previous ages pure magic. And that should be our warning. Arthur C. Clarke, distinguished novelist and evangelist for the limitless power of science, has said, "Any sufficiently advanced technology is indistinguishable from magic." This is Clarke's Third Law. Maybe, some day in the future, physicists will fully understand gravity, and build an antigravity machine. Levitating people may one day become as commonplace to our descendants as jet planes are to us. So, if someone claims to have witnessed a magic carpet zooming over the minarets, should we believe him, on the grounds that those of our ancestors who doubted the possibility of radio turned out to be wrong? No, of course not. But why not?
Clarke's Third Law doesn't work in reverse. Given that "Any sufficiently advanced technology is indistinguishable from magic," it does not follow that, "Any magical claim that anybody may make at any time is indistinguishable from a technological advance that will come some time in the future."
Yes, there have been occasions when authoritative skeptics have come away with egg on their pontificating faces. But a far greater number of magical claims have been made and never vindicated. A few things that would surprise us today will come true in the future. But lots and lots of things will not come true in the future. History suggests that the very surprising things that do come true are in a minority. The trick is to sort them out from the rubbish, from claims that will forever remain in the realm of fiction and magic.
It is right that, at the end of our century, we should show the humility that Kelvin, at the end of his, did not. But it is also right to acknowledge all that we have learned during the past hundred years. The digital century was the best I could come up with, as a single theme. But it covers only a fraction of what twentieth-century science will bequeath. We now know, as Darwin and Kelvin did not, how old the world is. About 4.6 billion years. We understand (what Alfred Wegener was ridiculed for suggesting) that the shape of geography has not always been the same. South America not only looks as if it might jigsaw neatly under the bulge of Africa. It once did exactly that, until they split apart some 125 million years ago. Madagascar once touched Africa on one side and India on the other. That was before India set off across the widening ocean and crashed into Asia to raise the Himalayas. The map of the world's continents has a time dimension, and we who are privileged to live in the Plate Tectonic Age know exactly how it has changed, when, and why.
We know roughly how old the universe is, and, indeed, that it has an age, which is the same as the age of time itself, and less than 20 billion years. Having begun as a singularity with huge mass and temperature and very small volume, the universe has been expanding ever since. The twenty-first century will probably settle the question whether the expansion is to go on forever, or go into reverse. The matter in the cosmos is not homogeneous, but is gathered into some hundred billion galaxies, each averaging a hundred billion stars. We can read the composition of any star in some detail, by spreading its light in a glorified rainbow. Among the stars, our sun is generally unremarkable. It is unremarkable, too, in having planets in orbit, as we know from detecting tiny rhythmic shifts in the spectrums of stars. There is no direct evidence that any other planets house life. If they do, such inhabited islands may be so scattered as to make it unlikely that one will ever encounter another.
We know in some detail the principles governing the evolution of our own island of life. It is a fair bet that the most fundamental principle, Darwinian natural selection, underlies, in some form, other islands of life, if any there be. We know that our kind of life is built of cells, where a cell is either a bacterium or a colony of bacteria. The detailed mechanics of our kind of life depend upon the near-infinite variety of shapes assumed by a special class of molecules called proteins. We know that those all-important three-dimensional shapes are exactly specified by a one-dimensional code, the genetic code, carried by DNA molecules that are replicated through geological time. We understand why there are so many different species, although we don't know how many. We cannot predict in detail how evolution will go in the future, but we can predict the general patterns that are to be expected.
Among the unsolved problems we shall bequeath to our successors, physicists such as Steven Weinberg will point to their Dreams of a Final Theory, otherwise known as the Grand Unified Theory, or Theory of Everything. Theorists differ about whether it will ever be attained. Those who think it will would probably date this scientific epiphany somewhere in the twenty-first century. Physicists famously resort to religious language when discussing such deep matters. Some of them really mean it. The others are at risk of being taken literally, when really they intend no more than I do when I say "God knows" to mean that I don't.
Biologists will reach their grail of writing down the human genome, early in the next century. They will then discover that it is not so final as some once hoped. The human embryo project (working out how the genes interact with their environments, including each other, to build a body) may take at least as long to complete. But it too will probably be finished during the twenty-first century, and artificial wombs built, if these should be thought desirable.
I am less confident about what is for me, as for most biologists, the outstanding scientific problem that remains: the question of how the human brain works, especially the nature of subjective consciousness. The last decade of this century has seen a flurry of big guns take aim at it, including Francis Crick no less, and Daniel Dennett, Steven Pinker, and Sir Roger Penrose. It is a big, profound problem, worthy of minds like these. Obviously I have no solution. If I had, I'd deserve a Nobel Prize. It isn't even clear what kind of a problem it is, and therefore what kind of a brilliant idea would constitute a solution. Some people think the problem of consciousness an illusion: there's nobody home, and no problem to be solved. But before Darwin solved the riddle of life's provenance, in the last century, I don't think anybody had clearly posed what sort of a problem it was. It was only after Darwin had solved it that most people realized what it had been in the first place. I do not know whether consciousness will prove to be a big problem solved by a genius; or will fritter unsatisfactorily away into a series of small problems and nonproblems.
I am by no means confident that the twenty-first century will solve the human mind. But if it does, there may be an additional by-product. Our successors may even be in a position to understand the paradox of twentieth-century science: On the one hand our century arguably added as much new knowledge to the human store as all previous centuries put together, while on the other hand the twentieth century ended with approximately the same level of supernatural credulity as the nineteenth, and rather more outright hostility to science. With hope, if not with confidence, I look forward to the twenty-first century and what it may teach us.