The Word Processing of Watergate and the Metaphysics of Information
Chelsea Spencer
By the middle of March 1973, the research staff of the US Senate’s Select Committee on Presidential Campaign Activities had processed 10,000 records, each one a three-by-five index card storing particulars about a single person, date, location, or other discrete datum relevant to potential wrongdoing committed during the 1972 presidential election between George McGovern and the incumbent president, Richard M. Nixon. Yet despite the staff’s industriousness, “the backlog of information available had scarcely been processed,” and the researchers soon concluded that the volume of documentary evidence they had collected far outstripped the capacity of conventional methods of discovery processing. After a fortuitous call from the Information Systems Office at the Library of Congress, they began to explore “the possibility of using automation.” Thus was inaugurated, according to the Select Committee’s final, 1,250-page report published in 1974, “the first time a congressional investigating committee employed a computer” for “analytical purposes.” If the Select Committee staff viewed the volume of evidence available for their investigation as having reached a fundamentally new order of magnitude, the proximate cause of that condition was the Nixon White House’s bureaucratic practices of documentation and communication, practices only slightly more intense and frenzied than those found in a typical American office of the time.
Though the political scandal of Watergate is best remembered for the questions of truth and mens rea it litigated (what the president knew and when he knew it), the scandal was produced by a crisis of information management, one in which a regime of rapid documentary reproduction and paper-based communications had reached a moment of saturation and was beginning to leak.