Controls on Technology
Instapundit Glenn Reynolds' latest TechCentralStation column is up, and it concerns Michael Crichton's new novel, Prey. This novel -- which I'm hoping to read before too long, like Crichton's other novels -- deals with nanotechnology gone bad.
Reynolds' column focuses on the "gone bad" part of the plot. He defends Crichton against some critics by pointing out that the "gone bad" aspect must be present in the novel; otherwise there would be no reason for the novel!
Reynolds goes on to point out, though, that what went bad in the novel could only happen "if the researchers in question were (1) stupid; (2) criminally negligent; and (3) willing to violate the consensus ideas about nanotechnology safety." Specifically, he points out that "Crichton's nanobots are capable of evolution (at least in programming) and of surviving in the 'wild' - that is, of making more copies of themselves from ordinary material found in nature," and that these two factors "are two big no-nos of nanotechnology. In fact, they're the first two no-no's of the Foresight Guidelines for Molecular Nanotechnology: 1. Artificial replicators must not be capable of replication in a natural, uncontrolled environment. 2. Evolution within the context of a self-replicating manufacturing system is discouraged."
The problem I see is this: how are guidelines in and of themselves going to prevent this sort of abuse from happening? Scientists have, unfortunately, acted in ways that 20/20 hindsight shows to have been foolish -- and that were sometimes seen as foolish even by their contemporaries. Note that I'm not saying we should not explore nanotechnology; I'm not "antitechnology." All I ask and hope for is that as scientists continue to bring more and more aspects of our world under our control, they remember that just because we can do something doesn't mean that we should.
Scientific formation involves (or should involve) more than technical know-how; it includes a moral awareness of how best to use the awesome power science presents us with.