

The Future and Its Enemies (book excerpts)

Ubiquity, Volume 2000 Issue February, February 1 - February 28, 2000 | BY Virginia Postrel 



The following brief excerpt from the book THE FUTURE AND ITS ENEMIES by Virginia Postrel is copyrighted (c) 1998 by Virginia Postrel and reprinted by permission of The Free Press, an imprint of Simon & Schuster, Inc. Virginia Postrel is editor-at-large of Reason magazine, a columnist for Forbes and its companion technology magazine Forbes ASAP, and a contributing editor for an online political magazine. Her work appears frequently in the Los Angeles Times, the Wall Street Journal, Wired, and other major publications. She lives in Los Angeles.

We hate not knowing the future. Soothsayers are as old as history. But the kind of soothsaying that runs on giant computers, that fills the pages of business publications and informs the decisions of legislators and regulators, is different from old-time magic. Rather than tap omniscient forces operating outside time, it claims scientific knowledge of the present, or at least of everything important about the present. Drawing on that information, it then predicts what people will do and how their actions will shape the world. Or it tells them how they must act and assumes it can foresee the results.

Sometimes this soothsaying is limited and relatively harmless, just one more factor in the trials and errors that compete to shape a more pleasing future. In the late 1980s and early 1990s, for instance, many retailers turned to consultants to predict which women's fashions to stock. Using impeccable demographic data, the consultants homed in on a central fact: Consumers were getting older and fatter. But the inferences they drew (forget youth, novelty, or sex appeal, and go for the basics) could not have been more wrong. What actually sold were slinky slip dresses and curvy, miniskirted business suits. Retailers who followed the reductionist consultants' advice got stuck with unwanted inventory, and the entire industry slumped.

"Never have so many people been employed in analyzing fashion, and never has fashion business been so dismal," commented New York Times fashion critic Amy M. Spindler. "As in any design field, fashion sells when something innovative is presented, something no consumer could have anticipated. . . . But most consultants, even if they are sharply tuned to changes in the demographics of the world, know little about fashion's X-factor, the unknown quantity that makes an item seem hot to a consumer."

The world is full of X-factors, the unarticulated and unrealized knowledge that can be elicited only by experience and experiment. Informed by younger friends that the latest Washington hot spots were cigar-and-martini bars, an out-of-town visitor figured the young folks must be slumming in fusty old K Street steakhouses. "I was as usual totally wrong. As [a hypothetical 1978] planner would have been totally wrong," he later told a conference on industrial policy. "Because this was not a steakhouse that had somehow acquired a second clientele. This was built from the ground up for 22-year-olds with so much facial jewelry that they would set off airport metal detectors." Moral of the story: "It's extraordinarily difficult to tell which products will be the successful ones."

It is possible to discern patterns and to test that analysis against others'. But the competition-the test-is crucial. Most predictions are wrong, and the more specific the claim, the more likely the error. When in 1983 Forbes confidently ran a list of "names you are not likely ever to see in The Forbes Four Hundred," the story had a perfectly good theory: that inventors rarely get rich off their creations. But the author got too specific. Third on the list of people unlikely ever "to transcend the $125 million mark in net worth" was none other than Bill Gates, who went on to become the richest man in the world.

In his bet with Paul Ehrlich, by contrast, Julian Simon was able to predict confidently that the prices of five metals would decline from 1980 to 1990. His prediction was based on a dynamic understanding of resource use; his mental model assumed increasing knowledge about alternative sources and applications, feedback from prices, and competitive pressures to do more with less. Simon bet only on the general trend, however, not on specifics. He did not try to say in advance which innovations would lead to the price declines, nor did he project the exact magnitude of the drops. Of those things, he admitted ignorance.

Like the Forbes list, politically imposed stasist plans often get very specific. They admit no X-factors and no learning. They know that high-definition television will take off (and will do so in the form pioneered by the oh-so-scary Japanese) and that cigar-and-martini bars will not. Stasist plans do not consider how people might adjust to new circumstances, and they don't factor in new inventions.

"Most experts believe that without deep changes in both industry behavior and government policy, U.S. microelectronics will be reduced to permanent, decisive inferiority within ten years," wrote MIT's Charles Ferguson in a famous 1988 Harvard Business Review article. He called for a government-directed policy to help U.S. chip companies threatened by foreign competition and denounced the "fragmented, 'chronically entrepreneurial' industry" of Silicon Valley. As authorities to back up his prescriptions, he cited "a wide number of university researchers and senior personnel of my acquaintance in the U.S. Defense Department, the CIA, the National Security Agency, the National Science Foundation, and most major U.S. semiconductor, computer, and electronic capital equipment producers. My conclusion, after meetings with groups in the U.S. Defense Science Board, the White House Science Council, and others, is that only economists moved by the invisible hand have failed to apprehend the problem."

Ferguson and his mandarin contacts just couldn't envision an industry driven by microprocessors, software, and networks rather than memory-chip manufacturing. Instead, they assumed an essentially static world, anticipated disaster, and demanded industrial policy.

Ferguson's ideas were not adopted by either businesses or government. Yet ten years after he predicted an industry "reduced to permanent, decisive inferiority," American information technology companies lead the world. Had chip makers followed his advice, clinging to commodity technologies and stifling entrepreneurship in an effort to build larger firms, the industry would have indeed gone down the drain. "Economists moved by the invisible hand," who understood the dynamic patterns of the industry but did not try to predict its exact evolution, knew more than Ferguson's "experts," for the very reason that they recognized the limits of their knowledge.

Technocratic plans assume the very things they try to enforce: that the world is simple and easily controlled, that it changes only in predictable ways, that it can be mastered. They suppose that the planners have all the relevant information and know exactly how the world works. The urban renewal programs of the 1950s and 1960s were neat, logical expressions of a certain understanding of city life, as neat and logical as the fashion consultants' projections. But the planners recognized neither the bustling vitality that appeals to city dwellers nor the personal space that draws people to the suburbs. They thought that plazas surrounding high-rise apartment buildings (which looked great in architectural drawings) would somehow duplicate the open space of suburban lawns; instead, such projects lacked both the urban convenience of nearby places to mingle and shop and the suburban attractions of privacy and green space. These technocrats scorned the critical information embedded in the lives of both city dwellers and suburbanites: the "tacit knowledge" expressed in relationships and habits and conveyed through webs of economic and social connections.

More recently, the Environmental Protection Agency has evaluated California's smog-reduction regulations not by measuring actual pollution levels but by cranking computer models. The models neither permit radically new ideas for cutting pollution nor incorporate unexpected changes in the human environment. They cram any new technology or information into the same old framework. By 1996, when California developed a plan to comply with the 1990 Clean Air Act amendments, the state had ample data indicating that a small percentage of "gross polluters" contribute the majority of vehicle pollution-and that the most effective way to spot such cars is through roadside "remote sensing," analogous to radar guns for catching speeders. Under EPA rules, however, officials could not fully adapt their smog-reduction program to this new information and technology. Instead, they had to create an awkward hybrid that sticks remote sensing onto established programs of periodic smog checks and trip reductions. "The public cares about results-cleaner air," says Lynn Scarlett, who chaired the California Inspection and Maintenance Review Committee, which was responsible for developing a plan to meet EPA requirements. "EPA cares more about whether folks are complying with permit procedures and technology mandates."

EPA predictions also take a simplistic view of human behavior. The agency's rigid models make room for scheduled inspections, but not random smog checks or their deterrent effects. And the models assume that population will grow, never that it will shrink or change in composition. Projections made in the late 1980s thus missed southern California's post-Cold War economic downturn, which reduced growth rates and traffic; yet those projections remain, feeding regulations. The agency's predictions presume that both behavior and knowledge are essentially fixed. And they force 17 million motorists to live accordingly.

Predictions go wrong because there are many possible sources of error: environmental shocks, bad or incomplete models, bad or incomplete data, sensitivity to initial conditions, the ever-branching results of action and reaction. Writing of technology, the physicist Freeman Dyson notes that its inherent unpredictability makes centralized decision making hazardous:

Whenever things seem to be moving smoothly along a predictable path, some unexpected twist changes the rules of the game and makes the old predictions irrelevant. . . . A nineteenth-century development program aimed at the mechanical reproduction of music might have produced a superbly engineered music box or Pianola, but it would never have imagined a transistor radio or subsidized the work of Maxwell on the physics of the electromagnetic field which made the transistor radio possible. . . . Yet human legislators act as if the future were predictable. They legislate solutions to technological problems, and they make choices between technological alternatives before the evidence upon which a rational choice might be based is available.
Many important developments take place out of view of the pundits. What business analyst in the 1970s would have looked to rural Arkansas to find the future of retailing? Yet that's where Wal-Mart emerged. It took Jimmy Carter, a born-again Southern Baptist immersed in Bible Belt culture, to recognize the political potential of evangelical voters, who were there all along. In retrospect, fashion consultants could trace those miniskirted business suits to the characters of Melrose Place. Not so surprising after all. The critical "local knowledge" is out there, but it's hard to collect.

Unexpected events or patterns often make perfect sense in hindsight. But the very difficulty of predicting the future points up how little we know, or can know, about the present. The present is, after all, the basis of all prediction. Management guru Peter Drucker, among the most perceptive of trend spotters, declares emphatically: "I don't speculate about the future. It's not given to mortals to see the future. All one can do is analyze the present, especially those parts that do not fit what everybody knows and takes for granted. Then one can apply to this analysis the lessons of history and come out with a few possible scenarios. . . . Even then there are always surprises."

Knowledge is at the heart of a dynamic civilization, but so is surprise. A dynamic civilization maximizes the production and use of knowledge by accepting widespread ignorance. At the simplest level, only people who know they do not know everything will be curious enough to find things out. To celebrate the pursuit of knowledge, we must confess our ignorance; both that celebration and that confession are central to dynamic culture. Dynamism gives individuals both the freedom to learn and the incentives to share what they discover. It not only permits but encourages decentralized experiments and competitive trial and error, the infinite series by which new knowledge is created. And, just as important, a dynamic civilization allows its members to gain from the things they themselves do not know but other people do. Its systems and institutions evolve to let people develop, extend, and act on their particular knowledge without asking permission of a higher, but less informed, authority. A dynamic civilization appreciates, protects, and nurtures specialized, dispersed, and often unarticulated knowledge.

Not surprisingly, how we think about knowledge, like how we think about progress, is one of the questions over which dynamists and stasists clash. These competing visions simply do not imagine knowledge in the same way. To dynamists, knowledge is like an ancient, spreading elm tree in full leaf: a broad trunk of shared experience and general facts, splitting into finer and finer limbs, branches, twigs, and leaves. The surface area is enormous, the twigs and leaves often distant from each other. Knowledge is dispersed, shared through a complex system of connections. We benefit from much that we do not ourselves know; the tree of knowledge is too vast. For stasists, by contrast, the tree is a royal palm: one long, spindly trunk topped with a few fronds, a simple, limited structure.

NOTE: Virginia Postrel uses the word "stasists" to contrast with "dynamists." Stasists are people who seek specific rules to govern each new situation and keep things under control, whereas dynamists appreciate dispersed, often tacit knowledge. Stasists are focused on a controlled, uniform society that changes only with permission from some central authority. Dynamists want to create an open-ended society where creativity and enterprise, operating under predictable rules, generate progress in unpredictable ways.


In addition to being an author and columnist, Virginia Postrel is the founder of the Dynamic Visions Conference: Exploring Creativity, Enterprise, and Progress.

