On this day: Tim Berners-Lee invents the World Wide Web

On this day in 1990, physicist Tim Berners-Lee circulated a memo for a relatively modest information-sharing proposal that would go on to revolutionise commerce, writes Eliot Wilson

At 35, Tim Berners-Lee was a fellow at the particle physics laboratory CERN. A physicist by training, he had become interested in computer networking and how organisations shared information. He understood the implication of Sir Isaac Newton’s dictum, “If I have seen further, it is by standing on the shoulders of giants”, and realised that access to data for scientists from different departments, institutions and even countries was the cornerstone of successful research.

In March 1989, Berners-Lee wrote a memorandum entitled “Information Management: A Proposal”. It addressed the challenge of keeping track of the vast amount of data a leading laboratory like CERN produced; his solution was “a ‘web’ of notes with links (like references) between them” which could accommodate any kind of information and required no major technological advances. As he observed later: “Most of the technology involved in the web, like the hypertext, like the Internet, multifont text objects, had all been designed already.”

Berners-Lee gave the proposal to his boss, Mike Sendall, head of CERN’s On-line Computing Group. A modest and witty man and a supportive manager, he too was interested in the problem Berners-Lee was trying to solve, and after reading the memorandum he wrote a judgement-cum-aide-memoire on the front which will stand as one of history’s great understatements: “Vague but exciting…”

It was a neat summation. Sendall sensed that Berners-Lee had identified something which could be revolutionary, but that it needed refinement and maturity. His imprimatur nevertheless let Berners-Lee continue, and he began collaborating with a Belgian-born systems engineer, Robert Cailliau, head of the Office Computing Systems in CERN’s Data Handling Division, who was looking at the same issue.

On this day in 1990, more than 18 months after the initial framework, Berners-Lee and Cailliau circulated “WorldWideWeb: Proposal for a HyperText Project”. It was a relatively modest plan: for an initial three months, a team of four software engineers and a programmer would create “simple browsers for the user’s workstations”; phase two would allow users not just to access data but to add to the existing information. The proposed overall budget was around CHF 80,000, at a time when the average Swiss salary was perhaps two-thirds of that amount.

Emerging significance

It might seem inexplicable, on the face of it, that the creation of an information-sharing system at a physics laboratory just outside Geneva should have been chosen by a British Council survey in 2016 as the most significant moment of the preceding 80 years: ahead of the development of the atomic bomb, the discovery of penicillin, the Holocaust and space exploration. It was an internal project using an existing technology, hypertext (a word coined in 1963), to create “a single user-interface to many large classes of stored information such as reports, notes, data-bases, computer documentation and on-line systems help”; it specifically excluded “research into fancy multimedia facilities such as sound and video”.

Its significance became apparent only when you thought about it conceptually and considered the implications of Berners-Lee’s brilliant choice of name: the World Wide Web. The development of the browser provided a way to exploit what was already there, the internet, described by Berners-Lee as “a network of networks. Basically it is made from computers and cables.”

By contrast, the Web was “an abstract (imaginary) space of information… [it] made the net useful because people are really interested in information (not to mention knowledge and wisdom!) and don’t really want to have to know about computers and cables”.

The Web was released to other research institutions and then to the internet at large the following year. Crucially, CERN made the code and protocol royalty-free, and it grew at an astonishing rate. Mosaic, one of the first integrated browsers, was released in 1993, quickly followed by Netscape’s Navigator and Microsoft’s Internet Explorer, and it became clear that money was to be made – enormous sums of it.

Netscape’s IPO in 1995 gave it a market value of $2.9bn and helped start the dot-com bubble. When AOL acquired Netscape four years later, it was worth $10bn; the same year, Yahoo paid $3.6bn for GeoCities. Google, created as a search engine in 1998, released its own browser, Chrome, in 2008, and it now accounts for three-quarters of browser use.

The dot-com bubble may have burst in 2000, but in retrospect it seems like a blip. The Web became an increasingly indispensable part of work and leisure, and technology companies are now the biggest players in the corporate world. On current market capitalisation, the most valuable companies are all in the tech sector: Nvidia, Apple, Microsoft, Alphabet, Amazon, Meta, Broadcom. Between them they are worth more than $20 trillion.

Sir Tim Berners-Lee, as he now is, set out to make information more readily available. He certainly achieved that, and a by-product was enabling the growth of mighty corporations which make America’s Gilded Age look modest and thrifty. He wrote recently that “we have the chance to restore the web as a tool for collaboration, creativity and compassion across cultural borders… it’s not too late”. Our post-modern Prometheus might find that a greater challenge than even he can conquer.

Eliot Wilson is a writer and historian; senior fellow for National Security, Coalition for Global Prosperity; contributing editor, Defence on the Brink
