
The social life of information (book excerpt)

Ubiquity, Volume 2000, Issue March, March 1 - March 31, 2000 | BY Paul Duguid, John Seely Brown



The following excerpt is from Chapter 1 of the new book "The Social Life of Information" by John Seely Brown and Paul Duguid. John Seely Brown is Chief Scientist at Xerox Corporation and Director of the Xerox Palo Alto Research Center (PARC). Paul Duguid is a research specialist in Social and Cultural Studies in Education at the University of California at Berkeley.

 

"The Social Life of Information"

 

On an average weekday the New York Times contains more information than any contemporary of Shakespeare's would have acquired in a lifetime.

-Anonymous (and ubiquitous)

Every year, better methods are being devised to quantify information and distill it into quadrillions of atomistic packets of data.

-Bill Gates

By 2047 . . . all information about physical objects, including humans, buildings, processes and organizations, will be online. This is both desirable and inevitable.

-Gordon Bell and Jim Gray

This is the datafication of shared knowledge.

-Tom Phillips, Deja News

 

It now seems a curiously innocent time, though not that long ago, when the lack of information appeared to be one of society's fundamental problems. Theorists talked about humanity's "bounded rationality" and the difficulty of making decisions in conditions of limited or imperfect information. Chronic information shortages threatened work, education, research, innovation, and economic decision making-whether at the level of government policy, business strategy, or household shopping. The one thing we all apparently needed was more information.

So it's not surprising that infoenthusiasts exult in the simple volume of information that technology now makes available. They count the bits, bytes, and packets enthusiastically. They cheer the disaggregation of knowledge into data (and provide a new word-datafication-to describe it). As the lumps break down and the bits pile up, words like quadrillion, terabyte, and megaflop have become the measure of value.

Despite the cheers, however, for many people famine has quickly turned to glut. Concern about access to information has given way to concern about coping with the amounts to which we do have access. The Internet is rightly championed as a major information resource. Yet a little time in the nether regions of the Web can make you feel like the SETI researchers at the University of California, Berkeley, searching through an unstoppable flood of meaningless information from outer space for signs of intelligent life.

With the information spigot barely turned on-the effect has seemed more like breaching a dam than turning a tap-controlling the flow has quickly become the critical issue. Where once there seemed too little to swim in, now it's hard to stay afloat. The "third wave" has rapidly grown into a tsunami. Faced with cheery enthusiasts, many less optimistic people resemble the poor swimmer in Stevie Smith's poem, lamenting that

I was much too far out all my life

And not waving, but drowning.

Yet still raw information by the quadrillion seems to fascinate.

 

COULD LESS BE MORE?

Of course, it's easy to get foolishly romantic about the pleasures of "simpler" times. Few people really want to abandon information technology. Hours spent in a bank line, when the ATM in the supermarket can do the job in seconds, have little charm. Lose your papers in a less-developed country and trudge, as locals must do all the time, from line to line, from form to form, from office to office, and you quickly realize that life without information technology, like life without modern sanitation, may seem simpler and even more "authentic," but for those who have to live it, it is not necessarily easier or more pleasant.

Even those people who continue to resist computers, faxes, e-mail, personal digital assistants, let alone the Internet and the World Wide Web, can hardly avoid taking advantage of the embedded microchips and invisible processors that make phones easier to use, cars safer to drive, appliances more reliable, utilities more predictable, toys and games more enjoyable, and the trains run on time. Though any of these technologies can undoubtedly be infuriating, most people who complain want improvements, not to go back to life without them.

Nonetheless, there is little reason for complacency. Information technology has been wonderfully successful in many ways. But those successes have extended its ambition without necessarily broadening its outlook. Information is still the tool for all tasks. Consequently, living and working in the midst of information resources like the Internet and the World Wide Web can resemble watching a firefighter attempt to extinguish a fire with napalm. If your Web page is hard to understand, link to another. If a "help" system gets overburdened, add a "help on using help." If your answer isn't here, then click on through another 1,000 pages. Problems with information? Add more.

Life at Xerox has made us sensitive to this sort of trap. As the old flip cards that provided instructions on copiers became increasingly difficult to navigate, it was once suggested that a second set be added to explain the first set. No doubt, had this happened, there would have been a third a few years later, then a fourth, and soon a whole laundry line of cards explaining other cards.

The power and speed of information technology can make this trap both hard to see and hard to escape. When information burdens start to loom, many of the standard responses fall into a category we call "Moore's Law" solutions. The law, an important one, is named after Gordon Moore, one of the founders of the chip maker Intel. He predicted that the computer power available on a chip would approximately double every eighteen months. This law has held up for the past decade and looks like it will continue to do so for the next. (It's this law that can make it hard to buy a computer. Whenever you buy, you always know that within eighteen months the same capabilities will be available at half the price.)
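
To see that arithmetic concretely, here is a minimal sketch in Python (our illustration, assuming nothing beyond the eighteen-month doubling the law names):

    # Illustrative sketch of Moore's Law arithmetic: capability doubles
    # roughly every 18 months, so a fixed capability costs half as much
    # on the same schedule.
    def capability(years, doubling_months=18.0):
        """Relative capability after `years`, normalized to 1.0 today."""
        return 2 ** (years * 12 / doubling_months)

    for years in (1.5, 3, 6, 10):
        c = capability(years)
        print(f"after {years:>4} years: {c:7.1f}x the power, "
              f"or today's power at roughly 1/{c:.0f} the price")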

But while the law is insightful, Moore's Law solutions are usually less so. They take it on faith that more power will somehow solve the very problems that they have helped to create. Time alone, such solutions seem to say, with the inevitable cycles of the Law, will solve the problem. More information, better processing, improved data mining, faster connections, wider bandwidth, stronger cryptography-these are the answers. Instead of thinking hard, we are encouraged simply to "embrace dumb power."

More power may be helpful. To the same degree, it is likely to be more problematic, too. So as information technology tunnels deeper and deeper into everyday life, it's time to think not simply in terms of the next quadrillion packets or the next megaflop of processing power, but to look instead to things that lie beyond information.

 

DROWNING AND DIDN'T KNOW IT

If, as one of our opening quotations suggests, "all information about physical objects, including humans, buildings, processes and organizations, will be online," it's sometimes hard to fathom what there is beyond information to talk about.

Let us begin by taking a cue from MIT's Nicholas Negroponte. His handbook for the information age, Being Digital, encouraged everyone to think about the differences between atoms, the fundamental unit of matter, and bits, the fundamental unit of information. Here was a provocative and useful thought experiment in contrasts. Yet it can be useful to consider possible similarities between the two as well.

Consider, for example, the industrial revolution, the information revolution's role model. It was a period in which society learned how to process, sort, rearrange, recombine, and transport atoms in unprecedented fashion. Yet people didn't complain that they were drowning in atoms. They didn't worry about "atom overload," because, of course, while the world may be composed of atoms, people don't perceive it that way. They perceive it as buses and books and tables and chairs, buildings and coffee mugs, laptops and cell phones, and so forth. Similarly, while information may come to us in quadrillions of bits, we don't consider it that way. The information reflected in bits comes to us, for example, as stories, documents, diagrams, pictures, or narratives, as knowledge and meaning, and in communities, organizations, and institutions.

The difficulty of looking to these various forms through which information has conventionally come to us, however, is that infocentric visions tend to dismiss them as irrelevant. Infoenthusiasts insist, for example, not only that information technology will see the end of documents, break narratives into hypertext, and reduce knowledge to data, but that such things as organizations and institutions are little more than relics of a discredited old regime.

Indeed, the rise of the information age has brought about a good deal of "endism." New technology is widely predicted to bring about, among other things,

the end of the press, television, and mass media

the end of brokers and other intermediaries

the end of firms, bureaucracies, and similar organizations

the end of universities

the end of politics

the end of government

the end of cities and regions

the end of the nation-state

There's no doubt that in all these categories particular institutions and particular organizations are under pressure and many will not survive long. There's nothing sacred here. But it's one thing to argue that many "second wave" tools, institutions, and organizations will not survive the onset of the "third wave." It's another to argue that in the "third wave" there is no need for social institutions and organizations at all.

The strong claim seems to be that in the new world individuals can hack it alone with only information by their side. Everyone will return to frontier life, living in the undifferentiated global village. Here such things as organizations and institutions are only in the way. Consequently, where we see solutions to information's burdens, others see only burdens on information.

 

ORIGIN MYTHS

From all the talk about electronic frontiers, global villages, and such things as electronic cottages, it's clear that the romanticism about the past we talked about earlier is not limited to technophobes. Villages and cottages, after all, are curious survivors from the old world applied to the conditions of the new. They remind us that the information age, highly rationalist though it seems, is easily trapped by its own myths. One of the most interesting may be its origin myth, which is a myth of separation.

Historians frequently trace the beginnings of the information age not to the Internet, the computer, or even the telephone, but to the telegraph. With the telegraph, the speed of information essentially separated itself from the speed of human travel. People traveled at the speed of the train. Information began to travel at the speed of light. In some versions of this origin story (which tends to forget that fire and smoke had long been used to convey messages over a distance at the speed of light), information takes on not only a speed of its own, but a life of its own. (It is even capable, in some formulations, of "wanting" to be free.) And some scholars contend that with the computer, this decisive separation entered a second phase. Information technologies became capable not simply of transmitting and storing information, but of producing information independent of human intervention.

No one doubts the importance of Samuel Morse's invention. But with the all-but-death of the telegraph and the final laying to rest in 1999 of Morse code, it might be time to celebrate less speed and separation and more the ways information and society intertwine. Similarly, it's important not to overlook the significance of information's power to breed upon itself. But it might be time to retreat from exuberance (or depression) at the volume of information and to consider its value more carefully. The ends of information, after all, are human ends. The logic of information must ultimately be the logic of humanity. For all information's independence and extent, it is people, in their communities, organizations, and institutions, who ultimately decide what it all means and why it matters.

Yet it can be easy for a logic of information to push aside the more practical logic of humanity. For example, by focusing on a logic of information, it was easy for Business Week in 1975 to predict that the "paperless office" was close. Five years later, one futurist was firmly insisting that "making paper copies of anything" was "primitive." Yet printers and copiers were running faster and faster for longer and longer periods over the following decade. Moreover, in the middle of the decade, the fax rose to become an essential paper-based piece of office equipment. Inevitably, this too was seen as a breach of good taste. Another analyst snorted that the merely useful fax "is a serious blemish on the information landscape, a step backward, whose ramifications will be felt for a long time."

But the fax holds on. Rather like the pencil-whose departure was predicted in 1938 by the New York Times in the face of ever more sophisticated typewriters-the fax, the copier, and paper documents refuse to be dismissed. People find them useful. Paper, as we argue in chapter 7, has wonderful properties-properties that lie beyond information, helping people work, communicate, and think together.

If only a logic of information, rather than the logic of humanity, is taken into account, then all these other aspects remain invisible. And futurists, while raging against the illogic of humankind and the primitive preferences that lead it astray, will continue to tell us where we ought to go. By taking more account of people and a little less of information, they might instead tell us where we are going, which would be more difficult but also more helpful.

 

HAMMERING INFORMATION

Caught in the headlights of infologic, it occasionally feels as though we have met the man with the proverbial hammer to whom everything looks like a nail. If you have a problem, define it in terms of information and you have the answer. This redefining is a critical strategy not only for futurology, but also for design. In particular, it allows people to slip quickly from questions to answers.

If indeed Morse did launch the information age, he at least had the modesty to do it with a famously open-ended question. "What," he piously asked in his first message, "hath God wrought?" Now, "we have answers," or "solutions" or "all the answers you need" (11,000 according to Oracle's Web site). Similarly, IBM claims that a single computer can contain "answers to all the questions you ever had." So if Morse were to ask his question again today, he would no doubt be offered an answer beginning "http://www. . . . ."

True, Microsoft advertises itself with a question: "Where do you want to go today?" But that is itself a revealing question. It suggests that Microsoft has the answers. Further, Microsoft's pictures of people sitting eagerly at computers also suggest that whatever the question, the answer lies in digital, computer-ready information. For though it asks where you want to go, Microsoft isn't offering to take you anywhere. (The question, after all, would be quite different if Microsoft's Washington neighbor Boeing had asked it.) Atoms are not expected to move, only bits. No doubt to the regret of the airlines, the ad curiously redefines "go" as "stay." Stay where you are, it suggests, and technology will bring virtually anything you want to you in the comfort of your own home. (Bill Gates himself intriguingly refers to the computer as a "passport.") Information offers to satisfy your wanderlust without the need to wander from the keyboard.

 

REFINING, OR MERELY REDEFINING?

In the end, Microsoft's view of your wants is plausible so long as whatever you do and whatever you want translates into information-and whatever gets left behind doesn't matter. From this viewpoint, value lies in information, which technology can refine away from the raw and uninteresting husk of the physical world.

Thus you don't need to look far these days to find much that is familiar in the world redefined as information. Books are portrayed as information containers, libraries as information warehouses, universities as information providers, and learning as information absorption. Organizations are depicted as information coordinators, meetings as information consolidators, talk as information exchange, markets as information-driven stimulus and response.

This desire to see things in information's light no doubt drives what we think of as "infoprefixation." Info gives new life to a lot of old words in compounds such as infotainment, infomatics, infomating, and infomediary. It also gives new promise to a lot of new companies, from InfoAmerica to InfoUSA, hoping to indicate that their business is information. Adding info or something similar to your name doesn't simply add to but multiplies your market value.

Undoubtedly, information is critical to every part of life. Nevertheless, some of the attempts to squeeze everything into an information perspective recall the work of the Greek mythological bandit Procrustes. He stretched travelers who were too short and cut off the legs of those who were too long until all fitted his bed. And we suspect that the stretching and cutting done to meet the requirements of the infobed distorts much that is critically human. Can it really be useful, after all, to address people as information processors or to redefine complex human issues such as trust as "simply information?"

 

6-D VISION

Overreliance on information leads to what we think of as "6-D vision." Unfortunately, this is not necessarily twice as good as the ordinary 3-D kind. Indeed, in many cases it is not as good, relying as it does on a one-dimensional, infocentric view.

The D in our 6-D notion stands for the de- or dis- in such futurist-favored words as

demassification

decentralization

denationalization

despacialization

disintermediation

disaggregation

These are said to represent forces that, unleashed by information technology, will break society down into its fundamental constituents, principally individuals and information. (As we scan the Ds, it sometimes feels as though the only things that will hold up against this irresistible decomposition are the futurists' increasingly long words.)

We should say at once that none of these D-visions is inherently mistaken or uninteresting. Each provides a powerful lens on an increasingly complicated world. They help expose and explain important trends and pressures in society. Nonetheless, the Ds too easily suggest a linear direction to society-parallel movements from complex to simple, from group to individual, from personal knowledge to ubiquitous information, or more generally from composite to unit.

Yet it does not feel that modern life is moving in one direction, particularly in the direction from complex to simple. To most of us, society seems capable of moving in almost any direction, and often in the direction of chaos rather than simplicity. Indeed, many shifts that the 6-Ds reveal are not the first step in an unresisting downward spiral from complex to simple. Rather, they are parts of profound and often dramatic shifts in society's dynamic equilibrium, taking society from one kind of complex arrangement to another, as a quick review of a few Ds will suggest.

Dimensions of the Ds

Much talk about disaggregation and demassification readily assumes that the new economy will be a place of ever-smaller firms, light, agile, and unencumbered. It was once commonplace, for example, to compare the old Goliath, GM, against the new David, Microsoft. As Microsoft's market capitalization passed GM's, the latter had some 600,000 employees and the former barely 25,000. The difference is stark. Not, though, stark enough to step from here to what the business writers Larry Downes and Chunka Mui call the "Law of Diminishing Firms." After all, it's GM that's shrinking. Microsoft continues to grow while other high-tech start-ups compete for the title of "fastest growing ever."

Downes and Mui draw on the theory of the firm proposed by the Nobel Prize-winning economist Ronald Coase. Coase developed the notion of transaction costs. These are the costs of using the marketplace, of searching, evaluating, contracting, and enforcing. When it is cheaper to do these as an organization than as an individual, organizations will form. Conversely, as transaction costs fall, this glue dissolves and firms and organizations break apart. Ultimately, the theory suggests, if transaction costs become low enough, there will be no formal organizations, but only individuals in market relations. And, Downes and Mui argue, information technology is relentlessly driving down these costs.
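
Coase's logic reduces to a bare comparison. A toy sketch (ours, with hypothetical numbers, not Coase's own formulation) shows the mechanism Downes and Mui extrapolate from:

    # Toy sketch of Coase's transaction-cost logic (hypothetical numbers):
    # work is organized inside a firm when the market's costs of searching,
    # evaluating, contracting, and enforcing exceed the cost of coordinating
    # the same work internally.
    INTERNAL_COORDINATION_COST = 5.0  # assumed constant for the firm

    def organized_in_firm(market_transaction_cost):
        return market_transaction_cost > INTERNAL_COORDINATION_COST

    # As information technology drives market costs down, the glue dissolves:
    for market_cost in (9.0, 6.0, 4.0, 2.0):
        form = "the firm" if organized_in_firm(market_cost) else "the market"
        print(f"market transaction cost {market_cost}: organized by {form}")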

Though he produced elegant economic theory, Coase had strong empirical leanings. He developed his theory of transaction costs in the 1930s to bridge the gap between theoretical accounts of the marketplace and what he saw in the actual marketplace-particularly when he traveled in the United States. There, business was dominated by huge and still-growing firms. These defied the purity and simplicity of the theoretical predictions, which envisaged markets comprising primarily individual entrepreneurs.

In honor of Coase's empiricism, it's important to look around now. When we began work on this book, Justice Department lawyers opened their case against Microsoft, accusing it of monopolistic practices. David now resembles Goliath. At the same time, other Justice Department lawyers were testifying that 1998 would be the first two-trillion-dollar year for mergers. Seven of the ten largest mergers in history had occurred in the first six months alone. We began keeping a list of firms involved. These included Amoco, AT&T, Bankers Trust, BMW, British Petroleum, Chrysler, Citibank, Deutsche Bank, Exxon, Ford, IBM, MCI, Mercedes, Mobil, Travelers, and many more.

Nor were these large firms buying up minnows. They were buying up each other. Ninety years after the era of trust busting, oil, banking, and tobacco, the initial targets, were all consolidating again. As the Economist put it, after Exxon's merger with Mobil followed British Petroleum's purchase of Amoco: "Big Oil is Dead. Long Live Enormous Oil."

Whatever else was apparent, we soon realized that whenever the book came out, any list of ours would be profoundly out of date. The only successful strategy in such conditions would be to imitate the great comic novelist of the eighteenth century, Laurence Sterne, who, faced with an impossible description, inserted a blank page into his manuscript and told the readers to take up their own pen and do it for themselves. As we were revising the manuscript, the two behemoths of the information age, AT&T and Microsoft, began their own extraordinary mating dance. That we found well beyond the reach of our pens.

Undoubtedly, several of the mergers we mentioned may represent the last struggles of dinosaurs to protect their ecological niche before other forces destroy it. Investment and even retail banking, for example, may have particularly precarious futures.

But massification is not occurring in dying "second wave" sectors alone. Many mergers have involved firms based in the "third wave" information sectors. Here mergers often involve not so much dinosaurs as phoenixes rising from the ashes of old business models. These might include AT&T's absorption of TCI and Time-Warner's of Turner Broadcasting. They surely do include Internet-driven combinations such as MCI's merger with WorldCom, IBM's takeover of Lotus, and AT&T's purchase of IBM's Global Network. Meanwhile, firms wholly within the new economy, such as AOL, Microsoft, Amazon, and eBay, go on regular shopping sprees for other companies.

Elsewhere in the information sector, Sir John Daniel, vice-chancellor of Britain's Open University, points to the rise of the "mega-university." Daniel presides over some 160,000 students, but his school hardly qualifies as "mega" in a field in which the largest-China's TV University System-has 580,000 students in degree programs. According to Daniel's figures, two universities break the half-million mark, one exceeds one-third of a million, and three are approaching a quarter million. These are all "distance" universities, using a variety of information technologies to reach their students. So no simple demassification here either. Similarly, the concentration of the media in recent years challenges any simple idea of media demassification.

It doesn't feel then as if firms are shrinking under an iron law. Rather, it feels more as if, as the economist Paul Krugman puts it, "We've gone from an economy where most people worked in manufacturing-in fairly large companies that were producing manufactured goods and engaged in things like transportation-to an economy where most people work for fairly large companies producing services."

The resilience of the large organization is not all that surprising. Given that information technologies are particularly good at taking advantage of large networks, the information economy in certain circumstances actually favors the aggregated, massified firm. These are firms that can or have knit diverse networks together, as AOL hopes to do with its purchase of Netscape or as Microsoft hopes to do with the insertion of Windows into television set-top boxes. Consequently, the small, agile firm with big ideas and little money is less likely to be the viable start-up of legend. (As a recent article in Red Herring put it, referring to the famous garage-based start-ups of Silicon Valley, the "garage door slams shut.") And any that do start up in the traditional way are likely to be snatched up by the giants of the industry.

So, while stories abound about the new "niche" markets exploited through the Internet, the examples often come not from niche firms, but from large ones with well-established networks. The paradoxical phrase "mass customizing" suggests that fortune favors the latter. It is possible, for example, to have jeans cut to your personal dimensions. But it is quite probably Levi's that will do it for you. Here the strategy for customized goods relies on a large firm with a large market and a highly standardized product. So the demassification of production relies on the massification of markets and consumption. The Henry Ford of the new economy would tell us that we can all have jeans made to measure, so long as they are Levi's.

Finally, firms are not merely taking power from one another. They are accumulating power that once lay elsewhere. The political scientist Saskia Sassen traces the decline of the nation-state not to the sweeping effects of demassification and disaggregation, but to the rise of powerful, concentrated transnational corporations. The new economic citizen of the world, in her view, is not the individual in the global village but the transnational corporation, often so formidable that it has "power over individual governments." The state and the firm, then, are not falling together along a single trajectory. At least in some areas, one is rising at the other's expense.

In sum, as people try to plot the effects of technology, it's important to understand that information technologies represent powerful forces at work in society. These forces are also remarkably complex. Consequently, while some sectors show disaggregation and demassification, others show the opposite. On the evidence of the 6-Ds, attempts to explain outcomes in terms of information alone miss the way these forces combine and conflict.

So while it might seem reasonable to propose a law of increasing, not diminishing, firms, that too would be a mistake. It would merely replace one linear argument with another. It's not so much the actual direction that worries us about infocentrism and the 6-Ds as the assumption of a single direction. The landscape is more complex. Infocentricity represents it as disarmingly simple. The direction of organizational change is especially hard to discern. The 6-Ds present it as a foregone conclusion.

More Dimensions

Similarly, despite talk of disintermediation and decentralization, the forces involved are less predictable and unidirectional than a quick glance might suggest. First, the evidence for disintermediation is far from clear. Organizations, as we shall see, are not necessarily becoming flatter. And second, where it does occur, disintermediation doesn't necessarily do away with intermediaries. Often it merely puts intermediation into fewer hands with a larger grasp. The struggle to be one of those few explains several of the takeovers that we mentioned above. It also explains the "browser wars" between Netscape and Microsoft, the courtship of AT&T and Microsoft, and the continuing struggle for dominance between Internet Service Providers (ISPs). Each of these examples points not to the dwindling significance but to the continuing importance of mediation on the 'Net (as does the new term infomediary, another case of infoprefixation). Moreover this kind of limited disintermediation often leads to a centralization of control. These two Ds, then, are often pulling not together, but against one another.

NOT FLATTER. In 1997, Francis Fukuyama and Abram Shulsky conducted a RAND study for the army into the relationship between disintermediation, flat organizations, and centralization. They began by studying the private sector. Here they give little hope for any direct link between information technology and flatter organizations. Indeed, like us, they believe that the conventional argument that information technology (IT) will lead to flatter organizations is an infocentric one

[that] focuses on a single, if very important, function of middle management: the aggregation, filtering, and transmission of information. It is of course precisely with respect to this function that the advances in IT suggest that flattening is desirable, since IT facilitates the automation of much of this work. On the other hand, middle management serves other functions as well.

If managers are primarily information processors, then information-processing equipment might replace them, and organizations will be flatter. If, on the other hand, there is more to management than information processing, then linear predictions about disintermediation within firms are too simple.

Empirical evidence suggests such predictions are indeed oversimplified. Despite the talk of increasingly flatter and leaner organizations, Paul Attewell, a workplace sociologist, argues that "administrative overhead, far from being curtailed by the introduction of office automation and subsequent information technologies, has increased steadily across a broad range of industries." Attewell's data from the U.S. Bureau of Labor Statistics show that both nonproduction employment in manufacturing and managerial employment as a percentage of the nation's workforce have risen steadily as the workplace has been infomated.

NOR MORE EGALITARIAN. Fukuyama and Shulsky also argue that in instances where information technology has led to disintermediation, this has not necessarily produced decentralization. "Despite talk about modern computer technology being necessarily democratizing," they argue, "a number of important productivity-enhancing applications of information technology over the past decade or two have involved highly centralized data systems that are successful because all their parts conform to a single architecture dictated from the top." Among the successful examples they give are Wal-Mart and FedEx, both of which have famously centralized decision making.

These two are merely recent examples of a clear historical trend whereby information technology centralizes authority. Harold Innis, an early communications theorist, noted how the international telegraph and telephone lines linking European capitals to their overseas colonies radically reduced the independence of overseas administrators. Previously, messages took so long to travel that most decisions had to be made locally. With rapid communication, they could be centralized. Similarly, histories of transnational firms suggest that with the appearance of the telegraph, overseas partners, once both financially and executively autonomous, were quickly absorbed by the "home" office.

Less innocent than infoenthusiasts, commanders in the U.S. Navy understood the potential of information technology to disempower when they resisted the introduction of Marconi's ship-to-shore radio. They realized that, once orders could be sent to them on-board ship, they would lose their independence of action. (Their resistance recalls a story of the famous British admiral Lord Nelson, who "turned a blind eye" to his telescope at the Battle of Copenhagen to avoid seeing his commander's signal to disengage.)

In contemplating assumptions about the decentralizing role of information technology, Shoshana Zuboff, a professor at Harvard Business School, confessed to becoming much more pessimistic in the decade since she wrote her pathbreaking book on the infomated workplace, In the Age of the Smart Machine: "The paradise of shared knowledge and a more egalitarian working environment," she notes, "just isn't happening. Knowledge isn't really shared because management doesn't want to share authority and power."

Of course this need not be the outcome. As Zuboff argues, it's a problem of management, not technology. Smaller organizations, less management, greater individual freedom, less centralization, more autonomy, better organization, and similar desirable goals-these arguments suggest-will not emerge spontaneously from information's abundance and the relentless power of the 6-Ds. Rather, that abundance is presenting us with new and complex problems that another few cycles of Moore's Law or "a few strokes of the keyboard" will not magically overcome. The tight focus on information, with the implicit assumption that if we look after information everything else will fall into place, is ultimately a sort of social and moral blindness.

 

THE MYTH OF INFORMATION

6-D vision, while giving a clear and compelling view of the influence of the 'Net and its effects on everything from the firm to the nation, achieves its clarity by oversimplifying the forces at work. First, it isolates information and the informational aspects of life and discounts all else. This makes it blind to other forces at work in society. Second, as our colleague Geoffrey Nunberg has argued, such predictions tend to take the most rapid point of change and to extrapolate from there into the future, without noticing other forces that may be regrouping.

This sort of reductive focus is a common feature of futurology. It accounts, for example, for all those confident predictions of the 1950s that by the turn of the century local and even domestic nuclear power stations would provide all the electricity needed at no cost. Not only did such predictions overlook some of the technological problems ahead, they also overlooked the social forces that confronted nuclear power with the rise of environmentalism. (Fifties futurism also managed to miss the significance of feminism, civil rights, and student protest while continually pointing to the imminence of the videophone and the jet pack.)

We began this chapter with a brief look back to the industrial revolution. In many ways the train epitomized that earlier revolution. Its development was an extraordinary phenomenon, spreading from a 12-mile line in the 1830s to a network of nearly 25,000 miles in little more than a decade. The railway captured the imagination not only of Britain, where it began, but of the world. Every society that could afford a railway, and some that couldn't, quickly built one. Standards were developed to allow for interconnections. Information brokers emerged to deal with the multiple systems involved. The train also sparked an extraordinarily speculative bubble, with experienced and first-time investors putting millions of pounds and dollars into companies with literally no track record, no income, and little sign of profitability. Unsurprisingly, in popular imagination, both at the time and since, the train has presented itself as a driving force of social and economic revolution.

Economic and social historians have long argued, however, that the story of the industrial revolution cannot be told by looking at the train alone. Historians might as well whistle for all the effect they have had. The myth of the train is far more powerful.

Today, it's the myth of information that is overpowering richer explanations. To say this is not to belittle information and its technologies. These are making critical and unprecedented contributions to the changes society is experiencing. But it is clear that the causes of those changes include much more than information itself. So the myth significantly blinds society to the character of and forces behind those changes.

In particular, the myth tends to wage a continual war against aspects of society that play a critical role in shaping not only society, but information itself, making information useful and giving it value and meaning. It's hard to see what there is other than information when identity is reduced to "life on the screen," community thought of as the users of eBay.com, organization envisaged only as self-organization, and institutions merely demonized as "second wave."

We do not believe that society is relentlessly demassifying and disaggregating, though we admit it would be much easier to understand if it were. The social forces that resist these decompositions, like them or not, are both robust and resourceful. They shaped the development of the railroad, determining where it ran, how it ran, and who ran it. And they will continue to shape the development of information networks. As we hope to show in the course of this book, to participate in that shaping and not merely to be shaped requires understanding such social organization, not just counting on (or counting up) information.

 

 

[This excerpt has been reproduced by permission of Harvard Business School Press. Excerpt of The Social Life of Information by John Seely Brown and Paul Duguid. Copyright © 2000 by the President and Fellows of Harvard College; All Rights Reserved.]

 

