"The question is, how do you prepare to be wrong? If you know you can't walk away from the consequences of what you do, how do you not screw it up?" said Todd La Porte, sitting across from me in the dappled light of the faculty dining room at the University of California, Berkeley. La Porte, a former Marine, is also a veteran political scientist who is internationally known for his thoughtful study of "long-term stewardship" of man-made hazards; that is, how a society prepares to take care of the messes it has made that it can't get rid of, generations into the future.
La Porte has spent many years studying how nuclear engineers and scientists go about the business of containing radioactive waste, to date the most persistent toxic substance known to (and created by) man. I had contacted him when I first started my research into risk and genetic engineering. It occurred to me that if something bad happened as a result of our self-assured release of transgenic organisms throughout the world, we might eventually need a more intimate understanding of his work. For starters, as La Porte noted, "nuclear waste doesn't reproduce." A population of living, multiplying transgenic organisms gone awry could prove far more difficult to contain than radioactive sludge. Should such a thing happen, it would create stewardship challenges, generations into the future, that lie well beyond our present scientific knowledge and capabilities.
And while the thought of being wrong about having stocked the entire planet with self-replicating hazards was sobering enough, La Porte posed yet another, equally troubling question about the topic of my inquiry: "How are you going to get the scientists to listen to you?"
After decades of study, La Porte himself had no answer. "My experience with technical people, with scientists, is that they're utopians and they see us as the problem," he said. "This was a tragedy in the nuclear industry."
Nuclear scientists, said La Porte, entered their profession believing they were doing something good for the world by developing what was then called "atomic energy." Many of us remember this era, when nuclear energy was pervasively (and now infamously) touted by the nuclear industry as 100 percent safe, clean and "too cheap to meter." The scientific basis for those claims was accurate as far as it went, but clearly it didn't go far enough. When the industry's claims of safety blew up, all too literally, with operator and engineering errors triggering the partial meltdown at Three Mile Island in 1979 and the explosion and meltdown at Chernobyl in 1986, the public rejected the technology as too risky for the benefits promised by its government and industry champions.
A new focus on our global dependence on fossil fuels has some people trying to salvage nuclear energy's reputation. But nuclear power plants are still considered too vulnerable to human error to be reliable, and the issue of how to safely store and safeguard radioactive waste remains a weighty and so far intractable problem, for both public health and global security.
"To think that other people might suffer as a result of their actions is not part of the expert's world, or it gets pushed away in the drive to deploy the technology," said La Porte. "But what are the consequences if it turns out that all the things they believed in are wrong? That's really hard. And most technical people can't talk about this. What they do is theology to them, not science.
"This attitude used to be less of a problem because we couldn't destroy the earth," La Porte continued. "But now we can. The consequences of error are likely to be greater than they were. The power of technology is much greater. The capacity for untoward error is much greater. And the effects of technology at scale have not been tested in the area of biology."