Technological Nightmares (Lecture): Terror and Error

Series: Pardee Center Distinguished Lecture Series
Date: October 2003
Location: Frederick S. Pardee Center for the Study of the Longer-Range Future, Boston University, Boston, MA

Terror and Error

Once I organized a conference in Sri Lanka for the Society for International Development. I had invited Ivan Illich and Arthur C. Clarke. After Clarke had described in glowing terms his technical utopia in which everything is done by robots, Illich, who was then an ordained priest of the Catholic Church, said, “I would not like to make love to a robot.” Since then, I have read a chapter by Howard Rheingold on “virtual reality and teledildonics” predicting that we shall be literally “embracing technology.”

Martin Rees, Britain’s Astronomer Royal and a professor at Cambridge University, has bet $1,000 that an instance of bioterror or bioerror will take a million lives before the year 2020. He gives humanity a 50:50 chance of surviving the 21st century. His book is called Our Final Century in Britain and Our Final Hour in the USA. He wanted a question mark after the title, but his publishers ruled it out in order to reach a wider audience.[1]

I do not want to scare anybody. It is important to distinguish between the role of a prophet and that of a forecaster. The prophet Jonah did not know the difference. He predicted the destruction of Nineveh because its people did not behave themselves. They took his warnings seriously and mended their ways. God spared them. Jonah was furious with God. He mistook the function of a prophet for that of a forecaster. He mistook a warning for a prediction. On the other hand, prophets of doom cannot be wrong. If their forecasts turn out to be true, they can always say, “I told you so”; if not, they can say that people heeded their warnings and mended their ways.

There are technological accidents (errors, not terror), from the Titanic in 1912 to the Challenger in 1986 to the shuttle Columbia in 2003. The nuclear accident at Three Mile Island in 1979, the fire at the Chernobyl nuclear plant in Ukraine in 1986, the explosion of a chemical factory in Toulouse in 2001, and the sinking of the tanker Prestige off Spain in 2002, with the resulting oil spill, are all examples of errors. In the past, disasters inflicted by environmental forces (floods, earthquakes, volcanoes, and hurricanes) were not uncommon; now we also witness those inflicted by human agency. The obverse of technology’s immense prospects is an escalating variety of potential disasters, arising not just from malevolent intent but also from innocent inadvertence.

We are often blind to the manifestations of technological discoveries. Ernest Rutherford, the greatest nuclear physicist of his time, dismissed as “moonshine” the practical relevance of nuclear energy. The pioneers of radio regarded wireless transmission as a substitute for the telegraph (used mainly for ship-to-shore communication) rather than as a means for broadcasting entertainment to a wide public. According to Rees, neither the great mathematician John von Neumann nor the IBM founder Thomas J. Watson envisaged the need for more than a few computers in the entire country.


  1. ^ Martin Rees, Our Final Hour: A Scientist’s Warning: How Terror, Error and Environmental Disaster Threaten Humankind’s Future in This Century—on Earth and Beyond (New York: Basic Books, 2003).
