Does IT Matter? Information Technology and the Corrosion of Competitive Advantage, Nicholas G. Carr, Harvard Business School Press, 2004, Boston, Massachusetts, 193 pages
Look at the dust jacket. My edition shows rows of video monitors, none bought within the last few years, reminiscent, no doubt intentionally, of those cartoon images of endless rows of office drones, although human beings are conspicuously missing from this image. Don't blame the author for his publisher's marketing choices, you say? Well, bear with me. The photo illustrates the key thesis of Nicholas Carr's argument, both here in the Harvard Business School Publishing Corporation's publication of the current book and in the May 2003 Harvard Business Review article it expands upon: "IT Doesn't Matter."
The title changed to a question in the transition, but the message remained the same: IT can't/doesn't guarantee business success. If that seems underwhelming, you and I have something in common. Here, though, from his preface, are the details that justify his argument and frustrate practitioners like me. Pay attention! Nothing later is reasonable if you don't accept these arguments and premises. The numbering is mine. Don't overlook #4.
- To date, despite the fact that IT investment has become the largest of all corporate capital expenditures, it has not led to any essential change in corporate structure.
- IT has become a simple factor of production, a commodity that is necessary for competitiveness but insufficient for advantage.
- IT's strategic importance is not growing but diminishing.
- ". . . I am talking about the technology itself. The meaning of 'IT' does not encompass the information that flows through the technology or the talent of the people using the technology. . . . [I]nformation and talent often form the basis of business advantage. That has always been true, and it will continue to be so. Indeed, as the strategic value of the technology fades, the skill with which it is used on a day-to-day basis may well become even more important to a company's success. [Emphasis mine]
- There is a "common and universal IT infrastructure" that influences/constrains "the way the underlying technology and the information it carries are used."
- Because of the "continued evolution of IT architecture . . . at this point most innovations will tend to enhance the reliability and efficiency of the shared infrastructure rather than enable proprietary uses of that infrastructure."
While I'm in a numbering frame of mind, let me enumerate three conclusions about this book:
1. It is full of useful and interesting information and based on impressive, documented scholarship, mainly from the business journalism press where the author has spent most of his career.
Seventeen pages of citations and a bibliography of 139 sources, plus a further list of 36 responses and reactions to his "IT Doesn't Matter" article, buttress his analysis.
2. If you want your journal article or book to be read, an inflammatory title is a good first step.
The title here is so successfully controversial that many of the negative reactions to the book seem based on the title rather than on the book's specific content and arguments. The same was true of the original article, which, according to the dust jacket, was called "dangerously wrong" in Fortune, "hogwash" by Microsoft's Steve Ballmer, and "dead wrong" by HP's Carly Fiorina. Those are reactions to the title, and perhaps to what Carr wants readers to believe he is saying. His actual argument is so much more limited, however, that these reactions were unnecessarily alarmist. Most of the laudatory reactions seem based, as well, on what some readers want to read into the book: that enterprises need not spend so much on IT. As I'll explain below, this may or may not be true, depending on situations and strategies.
3. Because Carr defines IT so narrowly (remember item 4 above: the technology itself and not how it is used), he argues against straw men.
As was the case with such earlier enabling technologies as railroads and electrical power, Carr argues, competitive advantage exists for a while for early adopters and for those close followers who learned from the often primitive and sometimes painful experiences of the earliest adopters, but once the technology is in widespread use, its application in and of itself yields no competitive advantage. Sure enough, the first enterprises to position themselves where rail service was convenient enjoyed a distinct competitive advantage. Similarly, those who switched to electricity while their competitors were still using, say, water wheels for power had an advantage, especially once they realized the advantages of delivering power to diverse locations within a plant, instead of powering only the original spindle that drove all mechanized functions.
By analogy, Carr reasons, the advantages that the earliest adopters of technology (he points to an English chain of tea shops in 1951 as the first commercial deployment of computers) enjoyed processing their payrolls and orders and inventories by computer instead of by hand rapidly shrank as others followed suit. Now, in fact, all businesses have rapid access to the same technology: the same office-productivity tools, e-mail, Web services, databases, design and control utilities, and the like. So the technology itself cannot give advantage, nor can increased investment in technology guarantee greater advantage.
At first glance, those points seem unarguable. In one way, if time and progress stand still, they are. They sound at first somewhat surprising, but consider how much software in use today is supplied by vendors who are eager to sell it to anyone and everyone. Carr distinguishes between proprietary and infrastructural technology, and his point is indisputable in the case of the latter. Of course Microsoft wants us to believe that adopting its software will guarantee our success ("your potential; our passion"), but if that were true then every user of Microsoft Office would be successful, and there really isn't room for all of us millions at the top.
On the other hand, the argument rests on the assumption that the phenomenon studied (the railroad, the telephone, electricity) rapidly rises to a level of maturity and saturation, and that's that. If, however, the railroad is seen not as a discrete technology but as a step in the technologies that led from horses, barges, and sails through railroads and automobiles and trucks and airplanes, if the technology arena is not "railroads" but "transportation," are not the implications more fluid? Weren't advantages still to be reaped by UPS, FedEx, and others? The telegraph preceded the telephone, whose impact amazed the world. Yet radio and television, not to mention the Internet, were still ahead, in a technology field properly called not "telephones" but "communication." Electricity seems the exception, but its price fluctuation and potential for generation alternatives (from hydro and fossil-fuel sources to wind, solar, and fission reactors, and perhaps in the future to safe neighborhood-level hydrogen or fusion sources) could revolutionize a technology we might more appropriately and generally call "power" than "electricity."
Carr does not say that technological innovation is finished. In fact, he acknowledges that it will continue, though with rapidly diminishing proprietary advantage. Still, if IT, like the earlier infrastructures of railroads, telephones and electricity, is now mature and stable, we ought to be singing, "Ev'rythin's up to date in Kansas City / They've gone about as fur as they c'n go!"
Carr insists that much of business investment in IT, especially during the boom years of the late 1990s, was wasteful. "Businesses purchase more than one hundred million PCs every year, most of which replace older models. Yet the vast majority of workers who use PCs rely on only a few simple applications: word processing, spreadsheets, e-mail, and Web browsing. These applications have been technologically mature for years; they require only a fraction of the computing power provided by today's microprocessors. Nevertheless, companies have continued to roll out across-the-board hardware and software upgrades, often every two or three years." Ignore for the moment the two-or-three-year assertion, which is faster than any general replacement cycle I know of. Many in positions like mine did argue in the mid-90s for a four-year cycle, justified at the time by the quality of the hardware and the pace of change in the software. Today, when components are more reliable and software more stable, we are seeing that cycle lengthen, often at the reasonable request of CEOs and CFOs. Eight years ago we were waiting until a computer proved inadequate for its use, until it was actually hampering its operator's productivity, before replacing it. Eight years ago not replacing a four-year-old machine meant having to crack open the case to add memory or replace hard disks, as well as upgrading operating systems or supporting a plethora of platforms, and anyone in the business of PC support can testify that the purchase price of a system is only a down payment on what can be an expensive ride.
"Wasteful IT spending has long been endemic in corporations, and it reached plague-like proportions during the Internet boom of the late 1990s, when, as one computer industry executive put it, 'servers were growing like bacteria.' Today, in the wake of all the excess spending, 'considerably less than half of [installed] IT capacity is actually used,' reports the Financial Times. Needless to say, much of the superfluous hardware will never be used it's already out of date. But the lesson is clear: Companies need to make sure they get the value out of past investments before making new ones." So much for the sunk-costs doctrine!
I can't imagine that anyone would argue that there is not waste in IT spending (or in any other area of business purchasing, for that matter), but no enterprise I know is using "considerably less than half" of its installed capacity, except in one important sense. Server capacity, in terms of processing power, memory, and storage, is dramatically variable, and operating too close to the literal limits of capacity will ensure operational disaster, as systems become unresponsive due to overload, grow memory-bound, or frustrate users by being unable to save needed files. A similar argument governs network capacity, both in local-area networks and in Internet-connection speeds. Unless the steady-state use of the resource is low, a system will not be capable of handling spikes in usage, not to mention predictable growth over time. IT capacity management is an exercise in prediction, as much as anything else. Buy enough to meet only current demand and the users (including the CEO and CFO) will demand your head in a few months.
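The capacity-management reasoning above can be made concrete with a small sizing calculation. The sketch below uses entirely hypothetical numbers (none come from Carr or the Financial Times) to show why a rationally provisioned server fleet can appear to be running at "considerably less than half" of its installed capacity:

```python
# Illustrative capacity-planning arithmetic (all figures hypothetical).
# A system sized only for today's average load fails at peak; a system
# sized for peaks plus predictable growth looks "underused" at steady state.

def required_capacity(avg_load, peak_to_avg, annual_growth, years, safety=1.2):
    """Capacity needed to survive peaks and predictable growth.

    avg_load:      today's steady-state load (arbitrary units)
    peak_to_avg:   ratio of peak demand to average demand
    annual_growth: expected yearly growth rate (0.25 means 25 percent)
    years:         planning horizon before the next purchase cycle
    safety:        extra margin so demand spikes do not cause outages
    """
    future_avg = avg_load * (1 + annual_growth) ** years
    return future_avg * peak_to_avg * safety

cap = required_capacity(avg_load=100, peak_to_avg=3.0, annual_growth=0.25, years=3)
print(f"capacity purchased: {cap:.0f} units")
print(f"steady-state utilization today: {100 / cap:.0%}")
```

With these assumed figures, a prudently sized system runs at roughly 14 percent of its capacity on an average day, exactly the sort of number that looks like waste to an outside observer but is, in fact, the price of absorbing spikes and growth.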
Two related points will wind up my comments: (1) Money can't buy happiness, and (2) Even straw men can take only so much before they fight back.
1. Don't spend without a reason.
Carr says, "Studies of corporate IT spending consistently show that greater expenditures rarely translate into superior financial results." He cites a study by Paul Strassman, who has served as CIO of Kraft, Xerox and NASA, revealing "absolutely no correlation between how much a company spends on IT and how well it performs." The quote is from Carr; I did not look up the Strassman original to check on the "absolute." But this Carr cites from Strassman: "The relationship between profits and IT is random," "From now on, it's economics and the role of the CIO is to make money. Technology has to be taken for granted." I don't know what that last sentence means, or, rather, how it relates to the earlier statement, but can anyone disagree with that earlier statement? Was there a time when the CIO's job IT's job was to do anything other than support the mission of the organization? Is there anyone (not counting external hardware and software vendors) who argues that more money invested in IT will, willy-nilly, achieve enterprise goals?
2. Of straw men and reality
The primary conundrum that bothers me about Carr's technically excellent work is that he ropes off as inadmissible what I consider the crucial questions about IT in contemporary use: How is it used, to what purpose, and advancing what goal?
If we are limited to discussing technology qua technology, he can win his argument, but it is an empty victory. It is not (ever, I imagine) the technology itself that accounts for success. The mail-order business that led Sears, Roebuck to success in the late 19th century and well into the 20th was based not in technology alone but in the match between available technology and a business strategy. Carr studies Wal-Mart and Dell, going to lengths to demonstrate that IT alone is not responsible for their success and growth. Neither, he points out, uses particularly cutting-edge technology, but although he admits that both have successfully matched technology and strategy, he seems to have a blind spot for the significance of that observation. Or, rather, he can incorporate that admission only by reverting to the extreme narrowness of his prefatory focus: information technology only as technology, and not its uses. He quotes Joan Magretta's book: "Michael Dell's really powerful insights haven't been technological ones. They've been business insights." Wal-Mart and Dell both succeed in no small measure because they match technology to their business insights. Wal-Mart gains awesome control over inventory and supplier relations. Dell, like Amazon.com, not only achieves internal efficiency but empowers the consumer, making him or her do the work order clerks formerly had to handle while making the experience seem enjoyable.
A few pages later, he notes of JetBlue that, "Although JetBlue's IT initiatives support its competitive advantage, the source of its advantage lies not in the technology but in the business model. . . ."
Ultimately, this is what matters. No single element in a business strategy makes or breaks an enterprise. It can contribute or drain. Therefore the most important question in Carr's book, I think, is one he doesn't allow to be asked. IT is an important element in any enterprise's arsenal. Its contribution must be justified by strategy and incorporated by tactics. It must be understood and managed like any other resource. It must advance the goals of the organization, and it must be appropriate in scale and fit. Carr is probably right that if IT is treated as the railroad, telephone, or electricity of an earlier technology generation, its contribution to corporate advantage is limited. If, however, times change and progress marches on, all bets are off.
Does IT matter? One might as well ask, "Do people matter?" Business today can't exist without either resource, but spending more on either of them cannot guarantee success unless it contributes to a rational strategy. IT itself is no answer to anything except IT sales. Apply IT innovatively, creatively, and appropriately to a sound organizational purpose, though, and stand back. It's the application that makes the difference, not merely the technology. So in the end Carr and I agree: IT is indispensable, a necessary but not sufficient ingredient of advantage. In my book that amounts to "IT Does Matter," but no one would buy that book.
About the Author
John Stuckey is the Director of University Computing at Washington and Lee University, Lexington, VA.