Web 2.0

Internet and the Web

The first electronic digital computer was developed during World War II by the British to break the Germans' secret codes. The first full-service electronic computer, introduced in 1946, was ENIAC (Electronic Numerical Integrator and Computer), built by scientists John Mauchly and J. Presper Eckert of the Moore School of Electrical Engineering at the University of Pennsylvania. Commercial computers followed, largely through the efforts of IBM.

The Internet is in part the product of the military's desire to maintain US defenses after a nuclear attack. It came as a result of the Soviet Union's 1957 launch of Sputnik, Earth's first human-constructed satellite, which challenged the supremacy of the United States in science and technology. US researchers found the answer in decentralization, the key to enabling communication to continue no matter where an attack occurred, and the solution was a network of computer networks: the Internet. In 1969 ARPANET went online, and it became fully operational and reliable within one year. Other developments soon followed. In 1972 Ray Tomlinson created the first e-mail program and gave us the now ubiquitous @ symbol. The term Internet was coined in 1974 by Stanford University's Vinton Cerf and Robert Kahn of the U.S. military. In 1979 Steve Bellovin, a graduate student at the University of North Carolina, created Usenet, and IBM created BITNET. With the development of personal computers, or microcomputers, the Internet became accessible to millions of non-institutional users. Its capabilities include e-mail, mailing lists, Usenet, FTP, and the World Wide Web (WWW).

Web 2.0, term devised to differentiate the post-dot-com-bubble World Wide Web, with its emphasis on social networking, user-generated content, and cloud computing, from that which came before. The 2.0 appellation is used in analogy with common computer software naming conventions to indicate a new, improved version. The term had its origin in the name given to a series of Web conferences, first organized by publisher Tim O'Reilly in 2004. The term's popularity waned in the 2010s as the features of Web 2.0 became ubiquitous and lost their novelty.

At the first conference in 2004, the term was defined by the phrase "the Web as platform." The following year, however, this was augmented with a still more nebulous formulation incorporating ideas of democracy and user-driven content, especially as mediated by the Internet. In particular, many of the most vocal advocates of the Web 2.0 concept had an almost messianic view of harnessing social networking for business goals.

One of the most influential ideas about this democratization came from Chris Anderson, editor in chief of Wired. In "The Long Tail," an article in the October 2004 issue of Wired, Anderson expounded on the new economics of marketing to the periphery rather than to the median. In the past, viable business models required marketing to the largest possible demographic. For example, when there were few television networks, none could afford to run programs that appealed to a limited audience, which led to the characteristic phenomenon of programming aimed at the lowest common denominator. With the proliferation of satellite and cable networks, however, mass marketing began to splinter into highly refined submarkets that catered to more individual tastes.
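The arithmetic behind the long-tail argument can be made concrete. The sketch below is a minimal Python illustration, not anything from Anderson's article: it assumes product sales follow a Zipf-like power law, and the catalog size, exponent, and head/tail cutoff are all illustrative assumptions. Under those assumptions, the many niche products beyond the top sellers collectively account for nearly half of total sales.

    # Minimal sketch (illustrative assumptions, not Anderson's data):
    # if sales follow a Zipf-like power law, the many niche products in
    # the "tail" can collectively rival the few hits in the "head."

    def zipf_sales(n_products, exponent=1.0):
        """Relative sales for products ranked 1..n under a power law."""
        return [1.0 / rank ** exponent for rank in range(1, n_products + 1)]

    sales = zipf_sales(10_000)        # assumed catalog of 10,000 products
    total = sum(sales)
    head = sum(sales[:100])           # the 100 best-sellers
    tail = sum(sales[100:])           # everything else: the long tail

    print(f"head share: {head / total:.1%}")   # ~53% with these parameters
    print(f"tail share: {tail / total:.1%}")   # ~47%: the tail rivals the head

Raising the exponent concentrates sales in the hits; lowering it fattens the tail, which is the scenario Anderson argued that new, low-cost distribution channels make commercially viable.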