Nicholas Carr, of IT Doesn't Matter fame, spoke with Computerworld's Joyce Carpenter about his new book, The Big Switch: Rewiring the World, from Edison to Google.
What is this big switch you see coming?
I think we're at the early stages of a fundamental shift in the nature of computing, which is going from something that people and businesses had to supply locally, through their own machines and their own installed software, to much more of a utility model where a lot of the computing functions we depend on are supplied by big, central utilities over the Internet.
It's analogous to what happened to mechanical power 100 years ago, when the electric utilities changed the nature of that resource and how businesses and people used it and received it.
How did you come upon the electricity analogy for computing?
It was pretty clear to me that we were in this kind of shift. I'm interested in the history of technology and had been reading a lot about the technologies of the Industrial Revolution. It struck me that businesses, in particular, went through a radical shift when they decided to shut down the waterwheels or steam engines at their factories and trust an outside supplier to provide that essential resource. That upheaval reflects what people in computing feel when they begin to rely on and trust outside suppliers for another essential resource.
You refer to both electricity and information technologies as general-purpose technologies. Has computing gained that status?
Yes. General-purpose technology is a term that economists use to describe any technology that can be used for many, many different purposes. They're very rare, and they're very important in economic and business history for the simple reason that you can use them so broadly. I think it's widely acknowledged now that the two most important general-purpose technologies in history really are electricity and computing.
Processing is done in so many different ways and for so many different purposes. Is it really as general as you make out?
The analogy between electricity and information technology works at an economic level, pretty well, I think. When you start looking at a technological level, you see that there are, of course, major differences, and I'm not arguing that IT is like electricity in some fundamental technological way.
The main difference is that IT is extremely modular in a way that electricity wasn't. The electric utility produced the power and transmitted the power, and then everything on your side of the electric socket was your responsibility. With IT, all of the functions can be considered as individual modules. Raw processing can be done either locally or over the net; data storage, same thing; and all the applications -- unlike with electricity -- can also be supplied either locally or over the grid.
But I do think that, if you break computing down to its essence -- which is data processing, data storage, data transmission -- that it is very much a general-purpose technology. Similar to the electric grid, you can build all sorts of applications or appliances on top of it to do all sorts of things.
Which will be more significant in the near future: scientific and engineering breakthroughs or economic forces?
Ultimately, it's the economics that really determine what people and companies do. It's easy to lose sight of that, because it's exciting to see technological breakthroughs and progress. But businesses are completely economic beasts. It's going to be the economics of IT, and of the central or local supply of IT, that determines how companies think about information technology in the future, how this new utility industry matures and grows, and the ultimate structure it takes.