At least, this is the simple version of the tale we tell. A closer look at what happened next—and in the decades following—complicates the narrative. We’re used to the idea that new office technologies make us strictly more productive, but the history of workplace tools teaches us that the quest to make common activities more efficient can yield unexpected side effects. This was true of the first PCs, and it likely explains the uneasy relationship we have with a more recent office innovation: email.
Not long after the arrival of the PC, experts began to question the miraculous nature of this suddenly ubiquitous device. In 1991, an article in The New York Times quoted an economist who pointed out that although companies continued to spend heavily on technology, “white-collar productivity has stagnated.” He concluded: “No longer are chief executives confident that throwing computers at their office staffs will result in greater efficiency.”

The data supported these concerns. A study of the years 1987 to 1993, conducted by economists Daniel Sichel and Stephen Oliner, estimated that computer technology contributed at most 0.2 percentage points a year to business output growth, after adjusting for inflation, during a period in which overall output grew by 1.9 percent a year. A contemporaneous article summarized these findings bluntly: “The impact of computers on recent productivity growth has been vastly overstated.”
In his 1997 book Why Things Bite Back, Edward Tenner tackles the “productivity paradox” that surrounded the initial introduction of the PC to the office. He offers several explanations, but perhaps the most interesting concerns the disconnect between what is easy and what is effective. The computer made certain common activities more efficient, but it also created more overall work to be done. Instead of tasking an accountant with updating a paper accounting ledger, business owners might now do it themselves using a digital spreadsheet. In isolation, the spreadsheet is easier than the ledger book, but in practice the business owners now have less time available for other, potentially more valuable activities. “If computers really made it possible for a smaller number of people to accomplish the same amount of work,” Tenner notes, “there would be little outcry about the longer hours for middle managers and professionals.” But of course, this is the opposite of what happened.
Tenner supports the claim that PCs can increase workloads by citing fascinating research from Georgia Tech economist Peter G. Sassone. In a 1992 paper, Sassone reports on what he found studying the impact of new technology on twenty departments in five major corporations. As he documents, many of these departments fired support staff after the arrival of time-saving computer software made them seem unnecessary. (There’s no need to maintain a typing pool once you have word processors.) The obvious problem is that the work once conducted by this staff then shifted to the workers it used to support. Though these support staff reductions saved salary costs in the short term, they required hiring more higher-level—and therefore higher-salary—employees in the long term to maintain similar levels of output. After crunching the numbers, Sassone concluded that the introduction of supposedly productivity-boosting technology ended up costing these companies 15 percent more in overall salary expenses.
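The mechanism Sassone identified can be made concrete with a bit of arithmetic. The sketch below uses entirely invented salary figures and headcounts (none of these numbers appear in Sassone’s study); it only illustrates how cutting cheap support staff can raise total payroll once higher-paid professionals must absorb the support work themselves.

```python
# Hypothetical illustration of Sassone's finding. All numbers are invented
# for this sketch, not drawn from his 1992 paper.

support_salary = 35_000       # assumed annual salary of a support worker
professional_salary = 90_000  # assumed annual salary of a professional

# Before: 10 professionals, supported by 6 typists/assistants.
before = 10 * professional_salary + 6 * support_salary

# After: the support pool is eliminated, but each professional now spends
# part of the day on former support tasks, so the department needs 14
# professionals to maintain the same output.
after = 14 * professional_salary

increase = (after - before) / before
print(f"Total salary cost rose by {increase:.1%}")
```

With these made-up inputs the department’s payroll rises by roughly 13.5 percent, even though six salaries were “saved,” which is the same direction and rough magnitude as the 15 percent increase Sassone reported.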