From World Brain to the World Wide Web
The World Wide Web has evolved into a universe of information at our fingertips. But this was not an idea born with the Internet. This lecture recounts earlier attempts to disseminate information that influenced the Web, such as the French Encyclopédistes in the 18th century, H. G. Wells' World Brain in the 1930s, and Vannevar Bush's Memex in the 1940s.
FROM THE WORLD BRAIN TO THE WORLD WIDE WEB
Martin Campbell-Kelly, Warwick University
Annual Gresham College BSHM Lecture
Introduction

There are quite a number of published histories of the Internet and the World Wide Web. Typically these histories portray the Internet as a revolutionary development of the late 20th century—perhaps with distant roots that date back to the early 1960s. In one sense this is an apt characterization: the Internet is absolutely a creation of the computer age. But we should not think of the Internet just as a revolutionary development. It is also an evolutionary development in the dissemination of information. In that sense the Internet is simply the latest chapter of a history that can be traced back to the Library of Alexandria or the printing press of William Caxton.

In this lecture I will not be going back to the Library of Alexandria or the printing press of William Caxton. Instead I will focus on the contributions of three individuals who envisioned something very like the Internet and the World Wide Web long before the Internet became a technical possibility. These three individuals each set an agenda: they put forward a vision of what the dissemination of information might become once the world had developed the technology and was willing to pay for it. Since the World Wide Web became established in 1991, thousands of inventors and entrepreneurs have changed the way in which many of us conduct our daily lives. Today, most of the colonists of the Web are unaware of their debt to the past. I think Sir Isaac Newton put it best: “If [they] have seen further, it is by standing on the shoulders of giants.” This lecture is about three of those giants: H.G. Wells, Vannevar Bush, and J.C.R. Licklider.
H. G. Wells and the World Brain

Today, H. G. Wells is still a well-known figure, much better known than Bush or Licklider. But at the peak of his celebrity in the 1930s, Wells was one of the most famous people in the world. He was celebrated both as a novelist and as a pundit. Several of his books had been made into Hollywood movies, such as The Invisible Man, Things to Come, and The Man Who Could Work Miracles. In New York in 1938 a radio broadcast of The War of the Worlds, produced by Orson Welles, had caused a mass panic. Politically Wells was a prominent socialist, and he was a popular lecturer and broadcaster.

Herbert George Wells was born in Bromley, Kent, in 1866, the youngest son of a shopkeeper. At the age of 15 he became an apprentice in a draper’s shop—the background to his novel Kipps (1905). By dint of night-time study he won a scholarship to study biology under T.H. Huxley at the Royal College of Science (now Imperial College). After this he became a teacher and science writer, and then a novelist.

Wells wrote scores of books. Although the idea of a World Brain crops up in several of them, two are of most relevance here. In 1920 he published his most successful non-fiction work, the monumental Outline of History, which told the story of civilization from antiquity up to the end of the Great War. It was an international best-seller, running into several editions in several languages. In 1932 he wrote another encyclopedic book, The Work, Wealth and Happiness of Mankind, an economic, educational, and cultural study of human beings and their institutions. It will simplify a complex narrative if I assert that writing the Outline of History exposed Wells to the nature of research using primary and secondary sources, and that in The Work, Wealth and Happiness of Mankind he first wrote about the technologies of information management. Wells described the process of researching and writing the Outline of History as follows:
Before the present writer lie half a dozen books, and there are good indexes to three of them. He can pick up any one of these six books, refer quickly to a statement, verify a quotation, and go on writing. ... Close at hand are two encyclopedias, a biographical dictionary, and other books of reference.
Wells was acutely aware of the value of having these books on his desktop, and of owning his own books so that he could make marginal notes. He contrasted his lot with that of scholars in the time of the Library of Alexandria. Those scholars could only work in the Library itself, they could make no marginalia on manuscripts that consisted of rolled papyri, and there were no indexes or other finding aids. Taking information to the people, rather than the other way about, and creating adequate finding aids were two of the key ideas of the World Brain.

He first elaborated his ideas for a World Brain in The Work, Wealth and Happiness of Mankind. The book contains a suggestive image: a photograph of the Reading Room of the British Museum Library, which he characterized as a “cell of the world’s brain.” His idea was that the World Brain would be an amalgam of the knowledge contained in the world’s great libraries, museums, and universities. He devised a complex taxonomy of how such knowledge should be organized and disseminated through education. This taxonomy clearly owed much to the French encyclopédistes, of whom Wells had made quite a study.

Around 1937, Wells perceived that the world was drifting into war. He believed this was because of the sheer ignorance of ordinary people, which allowed them to be duped into voting for fascist governments. He believed that the World Brain could be a force in conquering this ignorance, and he set about trying to raise the half a million pounds a year that he estimated would be needed to run the project. He lectured and wrote articles, which were later published as a book called World Brain (1938). He made an American lecture tour, hoping it would raise interest in his grand project. One lecture, in New York, was broadcast and relayed across the nation. He dined with President Roosevelt, and if Wells raised the issue of the World Brain with him—which seems more than likely—it did not have the effect of loosening American purse-strings. Sadly, Wells never succeeded in establishing his program before World War II broke out, and then of course such a cultural project would have been unthinkable in the exigencies of war.

Wells was never very explicit about the technology of delivering the World Brain, although, for sure, it would not be printed books. During his American visit in 1937, he visited the Kodak research laboratories in Rochester, New York, where he spent time with a scientist by the name of Kenneth Mees, an expert on the emerging technology of microfilm. Wells wrote:
American microfilm experts, even now, are making facsimiles of the rarest books, manuscripts, pictures and specimens, which can then be made easily accessible upon the library screen. By means of the microfilm, the rarest and most intricate documents can be studied now at first hand, simultaneously in a score of projection rooms.
In his most prescient passage in World Brain, Wells wrote:
The general public has still to realize how much has been done in this field and how many competent and disinterested men and women are giving themselves to this task. The time is close at hand when any student, in any part of the world, will be able to sit with his projector in his own study at his or her own convenience to examine any book, any document, in an exact replica.

This passage is very suggestive of the World Wide Web, although we have some way to go before all the world’s literature is available on-line to scholars. Wells was conscious that simply putting a microfilm projector onto people’s desks would not be sufficient. In addition, he proposed creating a universal index, and envisioned that “A great number of workers would be engaged perpetually in perfecting this index of knowledge.” No doubt this would have accounted for a good part of the £500,000 annual cost of the project. (A small sketch of what such an index amounts to in modern terms follows at the end of this section.)

As World War II dragged on, Wells became increasingly depressed, and the war reawakened melancholic thoughts about the end of civilization and whether the world’s store of knowledge would survive. This doomsday scenario had first appeared in The Time Machine (1895), his first scientific romance. In that book he described a remote future in which civilized society has decayed: the time traveller stumbles across the Palace of Green Porcelain, a building in ruins, full of broken and disorganized books and artifacts. The passage was probably inspired by the burning of the Library of Alexandria. Wells’ last book was titled Mind at the End of Its Tether (1945), which perhaps says all that needs to be said about the extent of his depression. He died in 1946 at the age of 79.
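In modern terms, Wells’ universal index is essentially what computer scientists now call an inverted index: a mapping from each term to the documents that mention it, the same structure that underpins the search engines discussed at the end of this lecture. The following minimal sketch, in Python, is purely illustrative; nothing in it comes from Wells, and the sample library and the build_index helper are invented for the example.

    from collections import defaultdict

    def build_index(documents):
        """Build an inverted index: map each term to the set of
        document titles in which it appears."""
        index = defaultdict(set)
        for title, text in documents.items():
            for term in text.lower().split():
                index[term].add(title)
        return index

    # An invented two-book "library", for illustration only.
    library = {
        "Outline of History": "civilization from antiquity to the great war",
        "World Brain": "a universal index of the knowledge of mankind",
    }

    index = build_index(library)
    print(sorted(index["index"]))  # ['World Brain']
    print(sorted(index["the"]))    # ['Outline of History', 'World Brain']

Wells’ army of index workers would, in effect, have been building and maintaining such a structure by hand.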
Vannevar Bush and the Memex

Vannevar Bush was the most important scientific administrator of the twentieth century. In the 1940s he was a well-known public figure in the United States, although today he is little remembered. He was also an outstanding intuitive engineer and inventor. In 1945 he produced a design concept that is a remarkable foreshadowing of the World Wide Web. Even more remarkable is that it predated the invention of the computer.

Vannevar Bush was born in 1890 in Everett, Massachusetts, the youngest of three children of a preacher and his wife. He grew up to be an inveterate tinkerer in his father’s basement, dabbling in mechanics, electricity, and photography. In 1909 he enrolled at Tufts College, Massachusetts, for an engineering degree. There he developed an interest in mechanical computing systems, and made a significant invention while still an undergraduate: his “profile tracer,” a computing machine for measuring the undulation of terrain. He went to the expense of securing a patent, though he never made any money from it. After graduating, he worked as an electrical test engineer for General Electric in Schenectady, New York, and then became an instructor at Tufts. In 1919, having saved enough money to support himself for a few years, he enrolled for a PhD in electrical engineering at the Massachusetts Institute of Technology (MIT). Afterwards he stayed on as an instructor and undertook research on the mathematics of electrical power networks. This was a very important problem, because America was in the process of establishing its electricity supply networks, and determining the electrical characteristics of power networks involved the solution of ordinary differential equations. After designing a number of experimental machines, he completed his Differential Analyser in 1931. This was the most important computing instrument of the inter-war period, and a dozen copies were made at other institutions in the United States and Britain.

At MIT, Bush emerged as an outstanding administrator. He became Dean of Engineering in 1932, and in 1939 he was appointed head of the Carnegie Institution, Washington, America’s most prestigious scientific funding agency. After the bombing of Pearl Harbor in December 1941, America joined World War II. Bush was appointed head of the Office of Scientific Research and Development, the organization that co-ordinated all military and civilian research during the war. He became a confidant of President Roosevelt and was his chief scientific advisor. Thus Bush had a unique, commanding overview of wartime scientific research—of which the A-bomb, radar, and code-breaking were just the best-known developments.

In late 1943, when victory was in sight, Bush began to reflect on the role of science and technology in the post-war world. He came to the view that the most pressing problem would be the dissemination of information—getting new scientific and engineering knowledge into the hands of researchers and practitioners. Information was so difficult to find that it was often quicker to re-invent than to search the literature. Moreover, much wartime research was still secret, so that when the war was over there would be a flood of new knowledge spilling over research workers. Like Wells, Bush thought that microfilm would be the most practical way to convey large volumes of information to its potential users.
For example, he estimated that using microfilm it would be possible to store the contents of the Encyclopedia Britannica in the volume of a matchbox, and the contents of a working library in one end of a desk. He designed—or rather, envisioned—a machine he called the memex, a contraction of “memory extender.” In July 1945 he published an account of the machine, “As We May Think,” in the Atlantic Monthly, a liberal arts magazine. The article caused quite a flurry of interest, and it was condensed and republished in Life magazine the following September. The editors of Life commissioned some superb illustrations that brought the whole idea to life; these images have become iconic in the history of multimedia computing.

The memex consisted of a wooden desk equipped with an automatic microfilm reader and two page-sized projection screens. Thus far the memex would not be very different to the microfilm readers still used in many newspaper libraries. However, the memex was also equipped with an indexing system and a means of book-marking interesting items so that they could be referred to again subsequently. The memex was a stunning invention—a web browser rendered in 1945 technology. It almost ranks alongside Leonardo da Vinci’s flying machine as a piece of imagineering. An illustration in Life shows a close-up of the memex. Bush explained:
If the user wishes to consult a certain book, he taps its code on the keyboard, and the title page of the book promptly appears before him, projected onto one of his viewing positions. Frequently-used codes are mnemonic, so that he seldom consults his code book; but when he does, a single tap of a key projects it for his use. Moreover, he has supplemental levers. On deflecting one of these levers to the right he runs through the book before him, each page in turn being projected at a speed which just allows a recognizing glance at each. If he deflects it further to the right, he steps through the book 10 pages at a time; still further at 100 pages at a time. Deflection to the left gives him the same control backwards. A special button transfers him immediately to the first page of the index. Any given book of his library can thus be called up and consulted with far greater facility than if it were taken from a shelf.
In the Life illustration the screens were shown displaying an article about longbows. Bush explained:
The owner of the memex, let us say, is interested in the origin and properties of the bow and arrow. Specifically he is studying why the short Turkish bow was apparently superior to the English long bow in the skirmishes of the Crusades. He has dozens of possibly pertinent books and articles in his memex. First he runs through an encyclopedia, finds an interesting but sketchy article, leaves it projected. Next, in a history, he finds another pertinent item, and ties the two together. Thus he goes, building a trail of many items. Occasionally he inserts a comment of his own, either linking it into the main trail or joining it by a side trail to a particular item. ... Thus he builds a trail of his interest through the maze of materials available to him.
The curious choice of subject matter, incidentally, can be explained by the fact that Bush was a collector of, and an authority on, longbows. The ability to link across different documents was subsequently taken up by computer scientists in the 1960s and given the name hypertext. (A small sketch of Bush’s trail structure in modern terms follows at the end of this section.)

The memex was never built as such; it needed a cheap control technology to make it economically feasible as a consumer item. Bush did, however, manage to interest the US Navy, which built a machine called the Bush Rapid Selector. It was entirely mechanical and was used to store navy inventories. It cost a hefty $85,000, and not many were built before it faded into obscurity. In his memoir Pieces of the Action (1970), Bush reflected that the memex was “still in the future, but not so far.” He realized that a practical memex would involve the use of inexpensive digital computers. When he died in 1974, it was still another decade before computers were cheap enough to revisit his ideas.
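Bush’s trails map naturally onto a simple linked data structure. The sketch below, in Python, is a modern illustration rather than anything Bush specified: a trail is an ordered list of items, each one either a document reference or the reader’s own comment, with side trails branching off any item. All the class names and titles are invented for the example.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class TrailItem:
        """One stop on a trail: a document reference or the
        reader's own comment, with optional side trails."""
        content: str
        is_comment: bool = False
        side_trails: List["Trail"] = field(default_factory=list)

    @dataclass
    class Trail:
        """An ordered sequence of linked items."""
        name: str
        items: List[TrailItem] = field(default_factory=list)

        def add(self, content: str, is_comment: bool = False) -> TrailItem:
            item = TrailItem(content, is_comment)
            self.items.append(item)
            return item

    # Bush's longbow scenario rendered as data (titles invented).
    main = Trail("Why the Turkish bow outperformed the English longbow")
    main.add("Encyclopedia article on the bow and arrow")
    note = main.add("My comment: compare construction materials",
                    is_comment=True)
    side = Trail("Materials of bow construction")
    side.add("Article on the elasticity of available materials")
    note.side_trails.append(side)

The hyperlinks of the Web generalize exactly this structure: documents joined by links that a reader can follow from one item to the next.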
J.C.R. Licklider and Man-Computer Symbiosis

J.C.R. Licklider (who, like H.G. Wells, is best known by his initials rather than his first name, Joseph) is often described as the father of the Internet. That is a strong claim, but it is certainly fair to say that the personal computing and information environment of today is very much the vision he first described in a seminal 1960 paper, “Man-Computer Symbiosis.”

Licklider was born in St Louis, Missouri, in 1915, the only child of an insurance salesman and his wife. Like Bush he was an intuitive engineer and an inveterate tinkerer—he made model aeroplanes as a boy and graduated to fixing up motor cars in his youth. Fixing up cars was a lifelong passion; it is said that all his life he never paid more than $500 for a car. He went to Washington University in St Louis, where he studied the unusual combination of maths, physics, and psychology, and he then studied for a PhD in acoustic psychology. During World War II he worked in the acoustics laboratory at Harvard University, and he became a lecturer there when the war was over.

In 1950 Licklider became an associate professor at MIT, where he inaugurated a psychology program for engineers. Teaching engineers to think about human beings when they design artefacts is still something that engineering schools do very badly; in 1950 it was a revolutionary idea. There is no doubt that psychology was to play a hugely important role in designing personal computers. For example, the success of the Apple Computer Corporation was as much due to the psychologists it hired as to the software programmers.

At MIT, Licklider got involved in the SAGE program, a computerized national defence network. SAGE was a multibillion-dollar project that was to accelerate the technical development of computers by several years, particularly the technologies for communication between humans and computers. Licklider led the program on human-computer interaction, the outcome of which was the SAGE console. There was an important separation of tasks in SAGE: the computer supplied raw data and humans interpreted it. For example, the computer would display the speed and direction of a missile or aeroplane, but the human being would decide what to do with this information. Today this division of skills seems rather obvious—as we will see, in the 1950s it was not. Licklider was always rather vague about his specific technical contributions, and he probably did not think it mattered very much. His great contribution was his vision for human-computer interaction, and his genius for communicating that vision to people and organizations so that his dreams were eventually fulfilled.

In 1960 Licklider drew on all his experience with SAGE and other interaction studies in his most famous paper, “Man-Computer Symbiosis.” It was a manifesto for his life’s work. The key idea lies in the word “symbiosis.” This was a biological metaphor, and he gave the example of the fig tree and an insect, Blastophaga grossorum: the tree could not reproduce without the insect, and the insect could not live without the tree. This was symbiosis—a mutually productive interdependence. He described a form of computing which was a kind of symbiosis between a person and a machine. An individual would sit at a computer screen, accessing information through a network, and then use computer programs to manipulate that information.
The terms word processing and spreadsheet had not yet been coined, but these were just the kind of computer tools Licklider had in mind. In short, Licklider envisioned almost exactly the personal computing environment of today. His vision turned out to be so close to today’s reality that one is inclined to think it must have been a rather obvious extrapolation of contemporary technology. But this is exactly the point—in 1960 Licklider’s ideas were truly revolutionary. It is difficult to imagine today, but in 1960 most people thought that the future of computers lay in artificial intelligence. For example, it was expected that it would soon be possible to translate languages automatically: one would feed a Russian text into one end of a computer, and an English translation would emerge from the other. Licklider did not think this would happen in the short or medium term. However, he saw an enormous potential for augmenting humans with computers. In Licklider’s scenario, a human translator would sit at a computer, consulting on-line dictionaries and reference materials, and typing in a translation, very much as a translator works today.

Licklider was shortly to get a chance to change the world. But first we need to be reminded of some political context. In 1957 Russia launched Sputnik, the first space satellite, and America was completely wrong-footed. To try to regain American initiative and prestige in science and technology, President Eisenhower established the Advanced Research Projects Agency (ARPA). The Agency had the remit to develop future technologies that might have a military significance. In 1962 the Agency established an Information Processing Techniques Office (IPTO) to advance computer technology, and Licklider was offered the position of program manager. Licklider leapt at the opportunity. It would be the end of his personal research, of course; but instead he would be able to put the pieces of a great jig-saw puzzle into place.

Over his two-year term as program manager, Licklider funded projects that would change the whole trajectory of computer development towards personal computing. First he funded MIT to develop Project MAC. This was a large mainframe computer that was shared by about 30 simultaneous users sitting at typewriter terminals. Known as a “time-sharing system,” this was the only way to achieve personal computing in the 1960s and 1970s (the time-slicing idea is sketched at the end of this section). He funded similar time-sharing projects at Stanford University, the University of California at Berkeley, and the System Development Corporation.

Perhaps the most important bet he placed was on Douglas Engelbart’s Knowledge Augmentation Laboratory at the Stanford Research Institute. Both Licklider and Engelbart were familiar with Bush’s memex, and the Laboratory set out to turn such ideas into a reality. The Laboratory invented the computer mouse and the ability to interact with documents displayed on a computer screen; this was where the idea of “cutting-and-pasting” originated. In 1968 Engelbart gave a demonstration of his system at the Fall Joint Computer Conference in San Francisco. The demonstration was decidedly clunky—black-and-white computer screens, the strange new mouse device, and a million-dollar mainframe computer on the end of a telephone line. But it was the first demonstration of personal computing as we now know it. It bears the same relation to a modern PC as the Wright Brothers’ first flight at Kitty Hawk does to a Lear Jet.

In 1964 Licklider finished his two-year term at ARPA, but he had a powerful say in appointing his successor.
As a result ARPA appointed someone who shared the Licklider vision. And so it went on: through successive changes of program manager, the baton of personal computing was carried forward. Meanwhile, Licklider returned to his somewhat low-key career as a research manager at IBM and MIT. The building of the Internet was left to others, who were carried forward by the momentum he had established. In 1970 the four time-sharing computers that Licklider had funded were connected together over a network, which linked the three West Coast machines to the East Coast machine at MIT. This was the ARPA network, or Arpanet for short. It turned out that electronic mail was the glue that held these four communities together. Others wanted to join in, so that by 1975 there were more than a hundred interconnected computers. By 1990 there were over 300,000, and the network had been renamed the Internet. By 2000, there were over 100 million computers attached to the Internet.
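Before leaving Licklider, it is worth sketching, as promised, how a time-sharing system gave each of Project MAC’s thirty users the illusion of a personal machine: the processor cycled rapidly round the active users, giving each a short slice of time in turn. The round-robin sketch below, in Python, is a modern illustration only; the job names and the one-unit quantum are invented for the example.

    from collections import deque

    def time_share(jobs, quantum=1):
        """Round-robin scheduling: each job is (name, work_units);
        every job receives quantum units of processor time per turn."""
        queue = deque(jobs)
        while queue:
            name, remaining = queue.popleft()
            work = min(quantum, remaining)
            print(f"{name} runs for {work} unit(s)")
            if remaining > work:
                queue.append((name, remaining - work))  # back of the queue

    # Three (invented) users sharing one mainframe.
    time_share([("alice", 2), ("bob", 1), ("carol", 3)])

Each job returns to the back of the queue after its slice, so no user waits long; provided the slices are short enough, every terminal appears to have the machine to itself.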
Conclusion

There were really two outcomes of the imagineering of Wells, Bush, and Licklider. First, they created a latent demand for information to be brought out of academic libraries and onto the desktops of researchers and scholars. Second, they inspired the new technologies of hypertext and multimedia computing. Once the PC became affordable in the early 1980s, these technologies were used to create computer games and interactive learning in schools. They led to CD-ROM encyclopedias, which were not only easier to use than traditional encyclopedias, but also replaced an entire shelf of books at a vastly reduced cost. Generally, personal computers of that era were not connected to an external network; they were islands of information unto themselves. Indeed, the CD-ROM encyclopedia came surprisingly close to realizing Wells’ World Brain or Bush’s memex. The main difference was that the CD-ROM encyclopedia stored its information in binary form on a disk, whereas Wells and Bush had envisaged using microfilm.

The rapid growth of the Internet in the 1990s was primarily due to the World Wide Web. The web browser made using the Internet easy for ordinary people, and also worth doing and worth investing in. The World Wide Web was invented by Sir Tim Berners-Lee, working at CERN, the European particle physics laboratory in Geneva, in 1991. As Berners-Lee put it himself, the World Wide Web was “the marriage of hypertext and the Internet.” The ideas were in the air; he just put the pieces together. And in so doing, he set in train a chain of events that has changed the world.

Would Wells, Bush, and Licklider have been surprised by the success of the Internet and the World Wide Web? I think somewhat. In part, however, the Web is very much as they anticipated. Of course, the particular enabling technology—computers—could not have been envisioned by Wells or Bush, who both wrote before the computer was invented in 1945. But that difference is superficial. We “surf the Web,” hopping from one web page to another at the touch of a mouse, just as Bush had imagined in his longbow scenario. And Yahoo and Google are just the kind of global indexes that Wells anticipated.

However, none of Wells, Bush, or Licklider anticipated the huge economic significance of the Web as an enabler of electronic commerce. The systems they envisaged were primarily a new form of library and a personal workspace. They were also largely one-directional: the student at his or her desk would be the recipient of information created by experts. There was no expectation that this might evolve into a two-way exchange of information. In none of their writings are notions such as e-mail, video-on-demand, holiday and travel booking, or trading in stocks and shares so much as hinted at. I think there is a very good reason for this apparent limitation in their visions. When the railway was being invented, to have projected further ahead to aeroplanes and air transport systems would have been pure speculation; it was only when the railways were in place that one could, in any practical sense, begin to anticipate air transport. Thus what Wells, Bush, and Licklider envisaged was something like the World Wide Web. Only when that was in place could one begin to envisage what it could be used for. That is the job for our generation, and we have a long way to go before we exhaust the possibilities of the World Wide Web.
© Professor Martin Campbell-Kelly, Gresham College, 9 November 2006