Web 2.0
Web 2.0, term devised to distinguish the post-dot-com-bubble World Wide Web, with its emphasis on social networking, user-generated content, and cloud computing, from the Web that came before. The 2.0 appellation is used in analogy with common computer software naming conventions to indicate a new, improved version. The term had its origin in the name given to a series of Web conferences, first organized by publisher Tim O’Reilly in 2004. The term’s popularity waned in the 2010s as the features of Web 2.0 became ubiquitous and lost their novelty.
At the first conference in 2004, the term was defined as “the Web as platform.” The following year, however, this was augmented with a still more nebulous formulation incorporating the ideas of democracy and user-driven content, especially as mediated by the Internet. In particular, many of the most vocal advocates of the Web 2.0 concept had an almost messianic view of harnessing social networking for business goals.
One of the most influential concepts of this democratization came from Chris Anderson, editor in chief of Wired. In “The Long Tail,” an article in the October 2004 issue of Wired, Anderson expounded on the new economics of marketing to the periphery rather than to the median. In the past, viable business models required marketing to the largest possible demographic. For example, when there were only a few television networks, none could afford to run programs that appealed to a limited audience, which led to the characteristic phenomenon of programming aimed at the lowest common denominator. With the proliferation of satellite and cable networks, however, mass marketing began to splinter into highly refined submarkets that catered better to individual tastes.
Similarly, where traditional brick-and-mortar bookstores could afford to stock and display only a limited range of titles, Internet bookstores such as Amazon discovered that total sales of niche titles actually exceeded those of mass-market best sellers. The vast quantity of niche books makes up for the greater sales of a few popular titles—makes up, that is, in the new digital environment of e-commerce, where counter space is no longer limited.
Amazon.com was also a leader in adopting user-created content. One of the appeals of shopping at Amazon’s site was the inclusion of amateur book reviews, with users able to leave personal perspectives and to interact with other reviewers. An even more successful business example of user-created content came from electronic games. Many companies found that, by including simple programming tools with their games, they enabled ordinary gamers to create modifications, or mods, and new scenarios that generated as much interest as, or more than, the original game, thereby extending its lifetime sales. This strategy proved especially effective in conjunction with Web sites that hosted players’ games and forums for exchanging ideas and files.
An exact definition of Web 2.0 proved elusive, in part because the concept encompassed differing goals and expectations for the future of the Internet and of electronic publishing in general. A leading critic of the Web 2.0 concept was Web inventor Tim Berners-Lee, who pointed out that
Web 1.0 was all about connecting people. It was an interactive space, and I think Web 2.0 is of course a piece of jargon, nobody even knows what it means. If Web 2.0 for you is blogs and wikis, then that is people to people. But that was what the Web was supposed to be all along.
That is, social networking had always been central to the Web, for, according to Berners-Lee,
Web 2.0…means using the standards which have been produced by all these people working on Web 1.0. So Web 2.0…means moving some of the thinking client side so making it more immediate, but the idea of the Web as interaction between people is really what the Web is. That was what it was designed to be as a collaborative space where people can interact.
In contrast, Berners-Lee advocated the development of the Semantic Web, which some visionaries call part of Web 3.0.