Illustration by Stephen Sweny

With the aggressive pace of technological change and the onslaught of news regarding data breaches, cyber-attacks, and technological threats to privacy and security, it is easy to assume these are fundamentally new threats. In fact, the pace of technological change is slower than it feels, and many seemingly new categories of threats have been with us longer than we remember. Nervous System is a monthly blog that approaches issues of data privacy and cybersecurity from the context of history—to look to the past for clues about how to interpret the present and prepare for the future.

In the waning days of World War II, Vannevar Bush faced a dilemma. He was by that point already a legendary figure of extraordinary accomplishments. In 1922, he had co-founded the defense contractor Raytheon, and his company went on to manufacture the bulk of the magnetron tubes at the heart of Allied radar during the war. By 1931, Bush had built a working “differential analyzer,” an analog computer that handled complex calculations by grinding away with physical gears. He was appointed president of the Carnegie Institution in 1939 and would hold that post until 1955. Bush created and led the Office of Scientific Research and Development to provide scientific guidance to President Roosevelt. In that role, Bush helped oversee the Manhattan Project's development of the atomic bomb. All in all, Bush had played a role in many of the most world-shaking technological developments of the previous three decades.

Bush knew there were technological revolutions yet to come, though. His favorite saying was, “It is earlier than we think.” He knew a secret about that future, but he was not allowed to tell. Therein lay his dilemma.

During the war, Bush had been studying the problem of search. As an information scientist, Bush understood that as technology allowed more and more information to be created and stored, the existing mechanisms for managing that information would soon be overwhelmed.

Historically, the primary concern in organizing information for human use was optimizing physical space. Put simply, a library's worth of books has to be put somewhere to be of use, and how the books are stored directly affects how useful they are. To that end, library scientists like Melvil Dewey devised schemes for classifying and cataloging information. Without that organizational aid, a reader would find it challenging to navigate even a modestly stocked library to locate a specific text; the cataloging system ensures that items can be found and are likely to be shelved alongside related works.

By contrast, if the Internet is a library, it is the most massive library ever assembled, yet humans navigate it without regard to its internal organization. Online data storage is driven by technical considerations, not organizational ones, and the contents of a single web page may be scattered across networked computer systems around the world. The physical location of the binary data corresponding to a piece of the Internet is almost irrelevant; what matters is how the pieces are connected.

The shift to storing information in digital form fundamentally changed priorities. The main concern of the new Information Age would no longer be optimizing physical space but finding information in an exponentially growing inventory of almost limitless capacity.

Bush applied himself to this challenge. This was the pre-digital age, so his concern was how to find data on microfilm, but the principles he developed would be applicable to digital technologies.

Bush developed a process for imprinting index codes onto the microfilm, embedding the cataloging scheme into the documents themselves. Paired with a mechanism that could automatically scan those codes, the system gave him a way to speed through reels of microfilm to find what he wanted. A given document was integrally linked to related documents via those index codes, and a reader who started at one document could follow the web of links across the wider library of microfilm to explore a train of thought.

While the specific mechanics of this idea are rooted in quaint analog technologies, a user of the World Wide Web should feel a twinge of recognition. Before digital computers existed, Vannevar Bush had invented hyperlinks.
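To make the analogy concrete, here is a minimal sketch, in modern Python, of the underlying idea: documents carry index codes that point onward to related documents, and a reader follows those links from record to record. The document names, the codes, and the follow_trail helper are invented for illustration; they are assumptions, not details of Bush's actual design.

```python
# Hypothetical illustration: records joined by index codes, with a reader
# following the chain of links from one record to the next.
documents = {
    "turbine-report": {"codes": ["ENG-041"], "links": ["gear-calculations"]},
    "gear-calculations": {"codes": ["ENG-042"], "links": ["materials-survey"]},
    "materials-survey": {"codes": ["MAT-007"], "links": []},
}

def follow_trail(start: str) -> list[str]:
    """Walk the chain of linked records, following the first link at each step."""
    trail, seen, current = [], set(), start
    while current and current not in seen:
        seen.add(current)
        trail.append(current)
        links = documents[current]["links"]
        current = links[0] if links else None
    return trail

print(follow_trail("turbine-report"))
# ['turbine-report', 'gear-calculations', 'materials-survey']
```

The point of the sketch is simply that once links are embedded in the records themselves, traversal replaces lookup: the reader moves by association rather than by consulting a central catalog.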

Bush built a prototype version of this device, but the world was not allowed to know about it. The Allies used his “Rapid Selector Machine” in their codebreaking efforts, and that program was perhaps the most closely guarded secret in the world. Bush was unable to share his invention—at least not openly or directly. Nevertheless, he knew that World War II had ushered in a new age of information, and the citizens of the future would need a way to cope with its exponential growth.

Bush found a way to honor his moral obligation to the citizens of the future while still meeting his ethical obligation to the government to safeguard the secrets of the codebreaking program. He did it by pretending his invention did not exist.

In July 1945, Bush published a landmark article in the Atlantic Monthly, “As We May Think.” It was reprinted in condensed form in Life, summarized in Time, and has been cited ever since as one of the foundational texts of information science.

In it, Bush described plans for an imaginary machine he called the Memex. The device was shaped like a desk and contained a storehouse of spools of microfilm, with a projector that could throw the contents of the microfilmed pages onto a screen. Central to the Memex's utility, however, was its internal organizational scheme—a set of index codes imprinted on the film to create conceptual links among different records.

For years, it was assumed that Bush's Memex was a fantasy—a thought-piece to inspire other engineers to try their hands at making one. Indeed, generations of computer scientists took their lead from this article.

Bush passed away in 1974, long before the creation of the World Wide Web (the term “World Wide Web” was coined in 1990 by Tim Berners-Lee and Robert Cailliau; Berners-Lee also wrote the first web browser for exploring those links). The computer pioneers who helped build that new realm of cyberspace, however, were following a path blazed decades before them. Ted Nelson, inventor of digital hyperlinks, happily tipped his hat to the legendary Bush for showing the way. Fellow pioneers like J.C.R. Licklider and Douglas Engelbart likewise acknowledged their debt to Bush.

Bush was reluctant to see himself as a pioneer. He pointedly declined credit for helping develop the computer, because his version was only analog. When opportunities arose to claim credit for his inventions, he would wave off the attention and emphasize how he relied on the genius of others who worked for or alongside him. He had no interest in looking backward, only forward.

“It is earlier than we think.”

David Kalat is Director, Global Investigations + Strategic Intelligence at Berkeley Research Group. David is a computer forensic investigator and e-discovery project manager. Disclaimer for commentary: The views and opinions expressed in this article are those of the author and do not necessarily reflect the opinions, position, or policy of Berkeley Research Group, LLC or its other employees and affiliates.