In the late 1980s, Tim Berners-Lee was working at CERN, the European particle physics lab near Geneva, and he was frustrated by how difficult it was for the many researchers and scholars working there to share data, information and progress reports. "In those days, there was different information on different computers, but you had to log on to different computers to get at it," he has written. "Also, sometimes you had to learn a different program on each computer. Often it was just easier to go and ask people when they were having coffee."
The London-born son of two early computer scientists, Berners-Lee studied physics at Oxford before coming to the lab. He started toying with a more efficient way to share information among the scientists affiliated with CERN, many of them based in universities and research centers around the world, and in March of 1989 he presented his first proposal for a way to take better advantage of networked computers. It was labeled "vague but exciting" by his boss. Another proposal in November 1990 didn't get a green light, either. But he kept working on his own. By the end of that year, he'd developed the underlying technologies for what he'd decided to call the World Wide Web.
And on August 6, 1991 -- 32 years ago this month -- Tim Berners-Lee launched the world's first web site. Appropriately enough, it was a guide on how to use this new technology.
On one hand, it's remarkable how little the foundations of Berners-Lee's invention have changed since he created them more than three decades ago. Before the end of 1990, he'd developed the three technologies that still power the web today: HTML, or hypertext markup language, the markup language in which web pages are written; the URL, or uniform resource locator, which gives every web page and file a unique address; and HTTP, the hypertext transfer protocol, the system that allows linked resources to be requested and shared across the web.
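To make those three pieces concrete, here is a minimal sketch (not Berners-Lee's original code) using only Python's standard library: it takes a URL, sends an HTTP request for it, and prints the beginning of the HTML the server sends back. The address shown is assumed to be CERN's restored copy of that first page; any web address would do.

```python
# A minimal sketch of the web's three building blocks, using only the
# Python standard library. The URL below is assumed to point to CERN's
# restored copy of the first website; substitute any address you like.
from urllib.request import urlopen

# The URL: a unique address for one page on one server.
url = "http://info.cern.ch/hypertext/WWW/TheProject.html"

# HTTP: urlopen sends a GET request for that address and returns the response.
with urlopen(url) as response:
    print(response.status)  # 200 means the server found and returned the page
    html = response.read().decode("utf-8", errors="replace")

# HTML: the markup a browser would render as a readable page.
print(html[:300])
```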
On the other hand, it's remarkable how much the technology has changed -- and how much it has changed the world around us.
The early web was used almost exclusively by academics and researchers; the first web server in the U.S. went live later that same year, in December 1991, at the Stanford Linear Accelerator Center in California. Early web sites existed mostly for information and databases -- research data, phone directories -- and most were text only, as only users running hugely expensive NeXT computers could use Berners-Lee's original, full-featured browser. The first major browser we'd recognize today, Mosaic, was released by the National Center for Supercomputing Applications at the University of Illinois in 1993. By the end of 1993, there were reportedly 500 known web servers in the world, which seemed like a lot. By the end of 1994, there were 10,000.
From the perspective of today's totally web-reliant world, it's astonishing to think about all the everyday services and habits that didn't exist not that long ago.
In 1990, according to a Pew Research Center history, only 42 percent of American adults had used a computer. In 1994, what may well have been the first-ever online purchase was made: a Pizza Hut pizza with pepperoni, mushrooms and extra cheese. It wasn't until the next year, 1995, that Amazon launched, as did Craigslist, eBay and -- in a move that ultimately proved fatal to Mosaic -- Microsoft's Internet Explorer. 1996 brought us the first viral video, the famous, unnerving Dancing Baby.
In the 2000s we got social media and iTunes, the iPhone and at-home broadband. And in 2008 a majority of the U.S. adult population said they consumed news about the presidential election campaign online.
By the 2020s, the web has become the place where we watch TV and stream movies, where we hold our work meetings on Zoom and our family catchups via FaceTime. Web3, with its promise of blockchain-enabled financial something, may be fizzling out, but now we're waiting to see how websites powered by artificial intelligence will change how we live, work, shop and interact.
Berners-Lee -- or Sir Tim, as he's been known since Queen Elizabeth II knighted him in 2004 -- is determined to keep the modern web true to its early, idealistic roots. He's the founder and emeritus director of the World Wide Web Consortium, which sets standards for the web, and he cofounded the World Wide Web Foundation "to advance the open web as a public good and a basic right." His latest cause is helping individual web users reclaim control of their personal data.
Meanwhile, that very first web page he built lives on, more or less. The original was never archived, but in 2013 a team at CERN worked to restore it.