3
The Internet
After you read this chapter you'll know what networks and internetworks are, how today's Internet grew out of the ARPANET, NSFNET, and their kin, who keeps the Internet running, and what problems face it tomorrow.
Networks
In the beginning there were computers. Well, not that early on, but some time thereafter. Some people thought that computers were good things, and did fairly cool things with them. Then they did really cool things with them. Then they decided that the whole would be greater than the sum of its parts, and if I could have access to your data and you to mine, this would be another good thing. Networking was born.
Today, almost every personal computer comes ready-to-network, right out of the box. Every Macintosh since day one (in 1984) has been sold with AppleTalk software installed and LocalTalk hardware built in. The latest version of Windows, 3.11, has slightly quixotic networking in its standard configuration. This has turned out to be a good thing, as it happens.
Local Area Networks
There are several different kinds of networks. The most basic network is two or more computers in the same physical location, physically connected with short runs of wire. This, basically, is a Local Area Network, a LAN.
Wide Area Networks
When several buildings are networked together with a bit of extra hardware, perhaps in a campus or business park, a Wide Area Network, a WAN, is created. In most cases this is a semantic distinction of interest only to the system and network administrators - users of the network usually can't tell whether they're on a LAN or a WAN (unless they really care and dig a bit).
Internets
Sometimes it becomes useful to connect two different networks together. Perhaps Engineering and Accounting started out with two different networks (after all, Engineering was using screaming workstations and Accounting was using little Macintoshes) but now, in this age of Just-in-Time manufacturing and profit center mentality, it's deemed proper to marry, to internetwork, the two. An internet is the conglomeration of two or more networks.
Rather than forcing one group to abandon their machines and adopt the hardware of the other group, connecting the two networks with specialized hardware and software allows everyone to work as before (but now they have access to more computers). The equipment required to internetwork depends on the specific type of networks to be connected, the networking languages (networking protocols) they speak, and the type of hardware used to connect computers on the networks.
Internet with a capital 'I'
The Internet, with an uppercase 'I', refers to the worldwide conglomeration of internets. The Internet is growing in leaps and bounds, and any numbers that I included today would be somewhat out of date by the time this book goes to the publisher, a few weeks hence, and hopelessly out of date a few months thereafter. It doesn't really matter, though. The Internet is big, really big. Just as I can never remember if the total number of grains of sand on Earth (or was it stars in the Universe?) is a tillion gajillion bazillion or a bazillion gajillion tillion, the total number of computers and people connected to the Internet is only useful at very boring (or very technical) parties. Suffice it to say that over a thousand computers are being connected to the Internet each day, and that many millions of people in almost every country on the planet have access to the Internet. Even people in the Canary Islands off the African Sahara - where I spent many a happy childhood summer contemplating grains of sand (or stars).
The story of the blind men and the elephant is a telling one. There are many facets of the Internet, and people think of the ones they know when they describe the Internet to others. Few people have a good working understanding of the breadth and depth of the information available on the Internet. We know about some of the different kinds of programs we use to access the different kinds of information available on the Internet. We use one kind of program for reading and writing email, another for retrieving files from other computers to our own, a third for meeting people and chatting with them, and yet another to play Go, chess, or role-playing games with them. And, of course, we use CU-SeeMe to exchange audio and video with others.
Most interestingly, the word Internet refers simultaneously to the physical hardware that makes up the net, the information on the individual computers that populate the net, the software used to get at the different kinds of information, and to the vast multitudes of people who give the Internet its feel.
You say potatoe...
CU-SeeMe works over networks that use the TCP/IP (Transmission Control Protocol/Internet Protocol) method of moving information from one place to another. Many private networks use TCP/IP because it's a method that guarantees data is delivered intact to the intended recipient. Not surprisingly, the largest network that "speaks" Internet Protocol is the Internet itself. It's because of this that we, as CU-SeeMe users, are interested in the Internet.
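If you're curious what that guarantee looks like from a program's point of view, here's a minimal sketch in Python (chosen for clarity; CU-SeeMe itself isn't written this way, and the host name below is invented). The program pushes a few bytes through a TCP connection; TCP either delivers them intact and in order, or the program hears about the failure.

```python
import socket

# A toy demonstration of TCP's delivery guarantee. The host is a made-up
# placeholder; point it at any machine running the classic "echo" service
# (TCP port 7), which simply sends back whatever it receives.
HOST, PORT = "echo.example.com", 7

with socket.create_connection((HOST, PORT), timeout=5) as conn:
    conn.sendall(b"hello, Internet")    # TCP retransmits any lost pieces
    reply = conn.recv(1024)             # bytes arrive intact and in order
    print(reply.decode("ascii"))        # ...or an exception tells us why not
```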
The Internet of Yesterday
The Internet (much like humans) appears far too complicated to have arisen from an evolutionary series of changes. Appearances deceive, though: evolution is most likely true of humans, and definitely true of the Internet. The Internet started out as a small military project designed to increase the survivability of American defense capability. It then metamorphosed into a venue for academic institutions to share research results, and later into a mixed environment of personal, academic, and commercial uses. Today it's being used to explore audio and video, anonymous electronic cash, and the feasibility of distributed information storage (the World Wide Web). Tomorrow's uses should be quite interesting.
Sneakernet
From the first days of modern computers in World War Two to the early 1970s, computer networking ranged from nonexistent at worst to primitive at best. The only way to move information between two computers was manual transfer by humans via punched paper tape, punch cards, and magnetic tape. Sneakernet, the tongue-in-cheek name given to people walking between the computers, served early computer users well, as long as the computers in question accepted the same physical media. If you only did tapes and I only did cards, we had troubles.
Always up
The next step in computer networking was to wire computers (in those days only "big iron" mainframe computers existed) together so they could communicate. One requirement of early networking was that every computer be up and running whenever any computers on the net were communicating. If one went down (whether for maintenance or because something caused it to crash), the entire net went down. This made networking unreliable and annoying.
The ARPANET
After World War II the United States found itself in an escalating "cold war" - partially of its own creation. Tensions were high, and enemies were everywhere. Joe Rinaldi, a regular at the café where I wrote this book, recalls "Russia was the enemy. General Douglas MacArthur was a hero to my family - he wanted to cross the 38th parallel and use atom bombs to make a 50-mile wide swath of radioactive cobalt so the North Koreans could never come down the peninsula again. General George Patton was a folk hero - he'd asked President Eisenhower, at the end of WWII, to sanction an invasion of Moscow to clean up the communists once and for all. Senator McCarthy was a patriot to my family - I was only a 7-year-old and didn't have much of my own mind then - he was hunting for communists, and to be a communist - to us - was a very bad thing indeed."
School-kids were being taught that a nuclear attack was a conceivable event, and that, by using the school's hallways as a shelter and doing a "duck and cover" when a nuclear flash was seen, it could be survived. Towns had Civil Defense volunteers, and "Emergency Fallout Shelter" signs appeared on many public buildings. Supplies were stocked in the public shelters and in the private ones being built in the backyards of many families. For many, the 1950s were a process of "learning to love the bomb".
The Russians launched Sputnik, and attack from the skies was added to our worries. America was humiliated, and in perhaps the best-remembered response, President John F. Kennedy announced that the race would be joined, and won, by landing a man on the moon by the end of the decade. (Neil Armstrong stepped on the lunar surface on 20 July 1969, an event that JFK sadly did not live to see.) Offensive capability had to be augmented by defensive strategies. The national highways were built up to serve as a transportation mechanism for tanks and troop carriers, for when (it was a certainty) the American mainland was invaded by the "reds". And so it's not surprising that computers, in use by the military since WWII, and the recently-created networks, would be "hardened" to survive the inevitable enemy onslaught.
By the early and middle 1960s the U.S. Department of Defense (DoD) was a great consumer of computer technology. Because (relatively) high-speed data processing was such an advance over the manually calculated bombing tables of a few years earlier, computers represented a critically important resource to the armed forces. A network that could be disabled by the malfunction of a single computer was clearly a major vulnerability - clearly inferior to a network that would survive even if some (or most) of the computers on it didn't (an eventuality that military planners considered a distinct possibility).
In 1963 the Advanced Research Projects Agency (ARPA), the branch of the DoD responsible for handing out grant monies, funded the Information Processing Techniques Office. At that time, ARPA-funded research didn't need to be directly related to military applications, a state of affairs that allowed ARPA to support basic research in novel areas. By 1969, Congress had second thoughts about allowing basic research to be supported by the defense budget, and required that ARPA show that its programs could be directly applied to the problems of military science. (Senator Edward Kennedy was one of the legislators responsible for the new requirement.) In response, ARPA eventually became DARPA (Defense ARPA). Also in 1969, goals for a reliable network that could be used to link the DoD, military research contractors, and the universities doing military-funded research were published.
Those goals were implemented in the early 1970s as the ARPANET, and have been inherited by us, the users of its descendant, the Internet.
The ARPANET, connecting several computers in California and one in Utah, was born. The inclusion of military contractors and universities allowed Bolt, Beranek and Newman (BBN), the maintainers of the ARPANET, to learn from the problems the expanding network was having. New computers and more users changed the load on the ARPANET, and stressed it in unforeseen ways. Maintaining the speed of traffic on the network turned out to be far less troublesome than keeping the constituent computers speaking the early Packet Switch Node (PSN) language of the ARPANET.
Stewart Brand, better known as the founder of the Whole Earth Catalog, wrote of the young network in the early 1970s:
At present some 20 major computer centers are linked on the two-year-old ARPA Net. Traffic on the Net has been very slow, due to delays and difficulties of translation between different computers and divergent projects. Use has recently begun to increase as researchers travel from center to center and want to keep in touch with home base, and as more tantalizing sharable resources come available. How Net usage will evolve is uncertain.
Increased usage and a first-attempt programming solution were at the root of the growing pains the ARPANET was having in the late 1970s. PSN was a technology insufficient to support such a rapidly growing network. Engineers call this a "scaling" problem; what works in a small system may not work when the size of the system is scaled by 10, 100, or a thousand. Deficiencies in PSN prompted research that resulted in the creation and adoption of TCP/IP as the network's standard way of moving data (a changeover completed in 1983).
A celebrated feature of IP is its resilient delivery. On an IP network the machines that direct traffic can determine the quickest available route to a destination computer. This routing is done dynamically, so portions of a network that have been bombed "back into the Stone Age", cut by a backhoe, or inadvertently disconnected by a telephone technician are taken into account, and routed around. This makes for a flexible and robust network.
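Here's a toy Python sketch of that idea, with a four-node network invented for the occasion. We find the best path from A to D, then cut a link (the backhoe strikes) and ask again; the traffic simply takes the other way around. Real Internet routers exchange routing information with protocols far more elaborate than this, but the principle is the same.

```python
from heapq import heappush, heappop

def shortest_path(links, src, dst):
    """Dijkstra's algorithm over {node: {neighbor: cost}} - a stand-in
    for the route computation real routers perform."""
    queue, visited = [(0, src, [src])], set()
    while queue:
        cost, node, path = heappop(queue)
        if node == dst:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, step in links.get(node, {}).items():
            heappush(queue, (cost + step, neighbor, path + [neighbor]))
    return None  # no route at all: this part of the net is unreachable

net = {"A": {"B": 1, "C": 4}, "B": {"D": 1}, "C": {"D": 1}, "D": {}}
print(shortest_path(net, "A", "D"))  # (2, ['A', 'B', 'D'])
del net["A"]["B"]                    # the backhoe cuts the A-B line...
print(shortest_path(net, "A", "D"))  # (5, ['A', 'C', 'D']) - routed around
```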
Stewart Brand also said
There's a curious mix of theoretical fascination and operational resistance around the scheme. The resistance may have something to do with reluctance about equipping a future Big Brother and his Central Computer. The fascination resides in the thorough rightness of computers as communications instruments, which implies some revolutions.
Computing and networking technologies are a two-edged sword. Our rights to privacy may be irretrievably lost if care isn't taken, but IP has been a help, in a strange way. It's been noted that the Internet routes around censorship in the same way it routes around physical damage. The efficacy of the original design goals (and their subsequent implementation) has been proved again and again.
Begin Note
Can the Internet survive an enemy attack?
During the 1991 Gulf War the U.S. military targeted the Iraqi command, control, communications, and information networks (often abbreviated C³I). Because the Iraqis used commercially available network routers running standard TCP/IP routing and recovery protocols - protocols that turned out to really work under the extreme stress of war - their network was able to withstand the punishment inflicted by the multi-national force.
End Note
In the mid- and late 1980s companies like Sun Microsystems made the engineering workstation popular. These powerful desktop computers, usually with a large monitor, made it possible for scientists to model complex systems at will, without needing to schedule time on a large mainframe computer. Most of these workstations ran some "flavor" of UNIX, an inexpensive and popular operating system in academic and scientific environments. UNIX was created at Bell Labs and enhanced at the University of California at Berkeley. While Bell Labs, part of AT&T, was the home of telephonic networking, it took the folks at Berkeley to provide comprehensive networking capabilities for UNIX.
Two situations were coming to a head: engineering workstations were being attached by the hundreds (and then thousands) to networks never designed for such loads, and each workstation, because of its speed, could generate more network traffic by itself than could the entire ARPANET population of a decade earlier. The sagging ARPANET couldn't survive this onslaught of popularity.
MILNET
Because much of the ARPANET was being used for non-military academic purposes, the DoD created a more secure military-only network creatively called MILNET.
NSFNET
It was in 1986 that the National Science Foundation (NSF) made a decision that was to shape the future look and feel of the Internet. NSF wanted to purchase a few very expensive supercomputers, set them up into computing centers dedicated to research use, and provide them to researchers across the USA, who would submit programs and data across the network and who would (a very short time later) get back the results. The prohibitive cost of these supercomputers limited the plan to five machines at as many sites.
Their original plan to use the ARPANET as the connecting medium having fallen through, the NSF created small regional networks to connect researchers in the same geographic area, and its own network to connect the regional networks to the computing centers. NSFNET was born.
The ARPANET had been used as a practical model for NSFNET. Since many of the companies and academic institutions on NSFNET were also on the ARPANET, and because both used TCP/IP as a communications standard, the two networks began a synergetic growth. The network managers began to cooperate on technology advances. Most notably, NSF pushed forward the research into higher-speed links. In many ways the NSF mirrored the shepherding of network technologies that ARPA had done years earlier. DARPA was eclipsed by NSF in its commitment to the advancement of the network.
At the request of the NSF, universities encouraged both staff and students to have access to the NSFNET, resulting in a much larger user population, which in turn resulted in increased network traffic. Coupled with the faster network links, the NSFNET was a great success, and eventually (in 1990) came to absorb the ARPANET. (MILNET, however, continues to this day.)
The cornerstone of the original NSFNET plan, sharing supercomputers, never lived up to its expectations. They were too expensive to purchase and maintain, they were difficult to use, and with the rapid evolution of the engineering workstation, not as attractive as they had once been. Luckily for us, the network itself was enough to keep the NSF involved in the project, and things survived without the computing centers. Most of what you know as the Internet is the NSFNET of recent years.
USENET
Despite its name, the USENET (the User's Network) is not a network in its own right in the same way the ARPANET or Internet are. It's a method of exchanging information, based upon the bathroom graffiti model (explained below), that is available wherever cooperating machines agree to pass the news along.
Areas of interest are broken up into newsgroups (known in other systems as "forums"). Newsgroups operate with the same conversational dynamic as bathroom graffiti: someone scribbles a message (called a post), at a later time someone responds (with another post), and at yet a later time someone rebuts or confirms the previous contribution (with yet another post). Follow-up posts create a thread, a chain of related posts that newsreading programs can display together.
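For the technically inclined, a thread is nothing more than a tree: each follow-up names the post it answers. A short Python sketch, with posts invented for illustration, shows how a newsreader turns those references into the indented view you see on screen.

```python
from collections import defaultdict

# Invented posts; "refs" names the post being answered (None = a new topic).
posts = [
    {"id": 1, "refs": None, "subject": "Good reflector sites for CU-SeeMe?"},
    {"id": 2, "refs": 1,    "subject": "Re: Good reflector sites..."},
    {"id": 3, "refs": 2,    "subject": "Re: Re: Good reflector sites..."},
    {"id": 4, "refs": 1,    "subject": "Re: Good reflector sites..."},
]

replies = defaultdict(list)
for post in posts:
    replies[post["refs"]].append(post)   # group follow-ups under their parent

def show_thread(parent=None, depth=0):
    """Print the thread as the indented tree a newsreader displays."""
    for post in replies[parent]:
        print("  " * depth + post["subject"])
        show_thread(post["id"], depth + 1)

show_thread()
```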
Depending upon their administrative setup, USENET newsgroups may be propagated around the planet. Some newsgroups, such as ba.singles (events for single people in the San Francisco Bay Area), are available (but usually not read) outside of the Bay Area. Other newsgroups, such as comp.sys.mac.apps (applications that run on Macintosh computer systems), are available and read all over the planet. While English is the dominant language of USENET, newsgroups flourish in many other languages as well.
There is no central administrative authority for USENET, a by-product of its growth. Each site administrator selects the newsgroups to be provided at that site (or lets them all pass) but can't select the newsgroups that are available elsewhere on the Internet. If enough site administrators don't approve of an action (such as the creation or deletion of a particular newsgroup) then it doesn't happen. (Of course, most site administrators have better things to do with their time than micromanage USENET.) Major events on USENET require the support of a significant part of the community.
Early on, you got a USENET feed by arranging for a machine that already carried the news to exchange it with yours over a modem connection.
Just as there are organizations that steer the Internet (described later on), the USENET has its own bodies. The USENET group moderator spearheads the process. The USENET Group Mentors provide an advisory body to assist in the feasibility evaluation of a particular newsgroup proposal and in the drafting of Requests for Discussion (RFDs) and Calls for Votes (CFVs). (Newsgroups are created for the population at large, even though some of them are quite technical. Proposed newsgroups with undetermined interest levels are usually run as mailing lists to gauge interest and participation.) The USENET Volunteer Votetakers provide an independent body to run the votes.
I've digressed a bit to show the difference between USENET and the true networks we've been following, primarily because USENET is often mistakenly lumped in with the ARPANET, NSFNET, and the Internet. (The USENET newsgroups are just one of the many resources available to users of the Internet.) But there's another reason: the existence of USENET today is a direct result of the growth of IP networks.
The telephone charges incurred by using modems to connect the USENET machines for the transfer of news were substantial, especially in the case of long-distance connections. Bean-counters, pens poised to slash any frivolous expenses from "their" budgets, were a very real threat to the continued survival of the USENET (and to any resource-draining program). The Network News Transfer Protocol (NNTP), released in 1986, implemented news transmission, posting, and reading using TCP/IP connections rather than the traditional UUCP (UNIX-to-UNIX Copy). This allowed the news to travel over network connections that were already in place as a result of the growth of the ARPANET, NSFNET, and finally, the Internet. The switch from modem-based to network-based transfer cut costs and ensured the survival of USENET. Two additional software enhancements to USENET, InterNetNews and News Overview (NOV), increased the efficiency of maintaining and serving news to the user community, further helping USENET's survival chances.
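An NNTP conversation is plain text over a TCP connection, simple enough to sketch in a dozen lines of Python. The server name below is a placeholder (most news servers only talk to their own users), and a real newsreader would read responses more carefully, but this is the essence of what happens each time you open a newsgroup.

```python
import socket

HOST, PORT = "news.example.com", 119   # 119 is the standard NNTP port

def command(conn, line):
    """Send one NNTP command and return the server's (partial) reply."""
    conn.sendall(line.encode("ascii") + b"\r\n")
    return conn.recv(4096).decode("ascii", "replace")

with socket.create_connection((HOST, PORT), timeout=10) as conn:
    print(conn.recv(4096).decode("ascii", "replace"))  # greeting, e.g. "200 ..."
    print(command(conn, "GROUP comp.sys.mac.apps"))    # select a newsgroup
    print(command(conn, "QUIT"))                       # say goodbye politely
```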
BITNET
About two years after USENET started its journey a similar project began several hundred miles to the North. People at Yale University and the City University of New York started networking their IBM mainframes and exchanging information. The BITNET (Because It's Time NETwork) was born.
Aside from a bit of playing around on the ARPANET during visits to BBN and the Massachusetts Institute of Technology's Artificial Intelligence Laboratory, BITNET was the first network with which I had solid experience. During my time as an undergraduate student at Boston University I was able to send email and participate in mailing lists over BITNET.
Unlike USENET's model of decentralized cooperative anarchy, BITNET has a hierarchical organization, run by an Executive Committee. This gives BITNET a very different feel (and size) from USENET. IBM provided much of the money, expertise, and technical support for BITNET. (I remember that IBM had six of its then-largest mainframes connected to BITNET to, we were told, monitor network traffic patterns for research purposes. Information flowed in, and despite our best efforts, nothing ever flowed out.) In 1984, IBM provided funds for centralizing network services, something inconceivable in the USENET world. BITNET became a not-for-profit endeavor in 1987; two years later it merged its bureaucracy with that of the Computer Science Network, CSNET, and changed its name to the Corporation for Research and Educational Networking, or CREN.
BITNET has become increasingly irrelevant in a world where the USENET is spreading in leaps and bounds, due in great part to USENET's use of the TCP/IP communication standard. (BITNET still uses an outdated IBM networking standard and is available to end-users through less compelling means than the graphical clients that Macintosh, Windows, and UNIX users can use to read USENET newsgroups via NNTP over TCP/IP.)
BITNET and FidoNet (yet another network) were unaffiliated with the ARPANET and NSFNET, but as all these networks grew their users wanted to share information, and so gateways (computers that straddled two or more networks) were put on-line. My early years of being on the Internet at Boston University resulted from being able to pass messages from the BITNET (which our IBM mainframe was on) to the Internet through ucbvax, a DEC VAX computer at the University of California at Berkeley.
The Internet of Today
The tasks of managing and upgrading the NSFNET were contracted out in 1987 to a group that included MCI Telecommunications, IBM, and Merit Networks. Merit is known for its development of MacPPP and its management of educational networking in the state of Michigan. MCI and IBM need no introduction. This contract is important to us because the experience these companies developed in running the NSFNET became the bedrock of the Commercial Internet Exchange, or CIX, described shortly.
National Research and Education Network
The High Performance Computing Act of 1991, sponsored by then-Senator Al Gore, was born of his conviction that America, to remain competitive in the world market, must have better and faster computing and network resources, available to all citizens, especially school-children. NSFNET benefited "higher education" in the USA, leaving others out in the cold. The act mandated extending the "information superhighway" by combining kindergarten through high schools, two-year and community colleges, public libraries, academic institutions, researchers, and governmental agencies into one very fast network called the National Research and Education Network, or NREN (pronounced "ehn rehn").
NREN will allow teachers to collaborate on courses of learning and special projects, it will allow students to share the learning experiences, and it will allow businesses to assist in the process. We don't have to wait for NREN to do any of this, though. Organizations like the Global School Net (discussed in Chapter 8, Usage) have been bringing innovative and entertaining educational programs from all over the world to school-children in participating schools.
The current expansion of the Internet to the poorer and more remote parts of the American school system (by special grants and cooperation between business and schools and entities such as GSN) is generating a lot of excitement. I'll provide some pointers to World Wide Web pages that contain articles by "Internet ambassadors", school-children who are involved in the on-going connectivity of the educational system.
The Internet is, de facto, the NREN until the political, commercial, and engineering struggles are resolved and the new network can be built (or more likely, evolved with the help of grant money).
President Bill Clinton's National Information Infrastructure (NII) proposal for expanding the Internet within the USA will provide both resources for the establishment of a far-reaching NREN tomorrow and a faster and more robust Internet today.
Commercial Internet Exchange
The NSF's Acceptable Use Policies forbade commercial activities on the NSFNET. These policies were vague and confusing to individuals (can I let you know I'm trying to sell my stereo via email?) and to businesses on the Internet (can employees in far-flung offices discuss business via email?). These policies also made it difficult for those without any connection to DoD-related research to gain access to the Internet at all. For many years people would maintain a presence in an institute of higher learning to keep access to the network; others would go through even stranger machinations. Many of these people were the hackers who torture-tested the network technologies, making things safer and more reliable for everyone else. When the existence of these folks was finally acknowledged by the NSF, an avenue for granting them access couldn't be far behind.
It wasn't. Several private companies banded together to provide a for-profit alternative connection between the regional networks (in parallel with the NSFNET). The Commercial Internet Exchange, or CIX, includes well-known names such as IBM and Sprint and some names lesser-known to the public, such as Performance Systems International and Alternet. CIX has been such a success that the NSFNET was put out to pasture in the middle of 1995, its traffic completely taken over by the CIX.
The Internet of Tomorrow
The Internet exists today as a symbiotic relationship between many self-preserving organisms. All must strike a gentle balance between exerting their will and killing their host. Several volunteer groups help regulate the wheels of Internet progress, increasing its survival chances.
The Internet Engineering Task Force, the IETF, is a public forum dedicated to discussing and handling technical problems facing the Internet. (The IETF is committed to doing its business in a manner accessible to all; some of its meetings have been held via Internet videoconferencing.) Problems deemed worthy of effort result in the creation of "working groups", assemblages of computer scientists who craft recommendations for solving the problem and report back to the IETF. The system works because people interested in a problem (and willing to contribute time and effort to solving it) volunteer to do the work.
A good example of an IETF solution to an Internet-at-large problem is its handling of the inherent limits of the current IP addressing scheme. The explosion of machines connecting to the Internet was rapidly exhausting all the possible addresses; clearly a catastrophe in the making. Several ideas were evaluated by the working group, and IPng (IP: The Next Generation) was the result. IP addresses will be made longer, more systems can be added, and peace and contentment exist in this little corner of the kingdom.
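The arithmetic behind the fix is easy to check. Today's IP addresses are 32 bits long; IPng stretches them to 128 bits, multiplying the address space by 2 to the 96th power:

```python
# Back-of-the-envelope arithmetic for the IPng address expansion.
ipv4_addresses = 2 ** 32     # today's 32-bit addresses
ipng_addresses = 2 ** 128    # IPng's 128-bit addresses

print(f"{ipv4_addresses:,}")                      # 4,294,967,296
print(f"{ipng_addresses:.2e}")                    # about 3.40e+38
print(f"{ipng_addresses // ipv4_addresses:.2e}")  # growth factor: 2**96
```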
The Internet Architecture Board, the IAB, sets the communication standards for the many differing software and hardware systems that populate the Internet. A very important facet of the survival of the Internet today and tomorrow, the IAB is not open to the public at large, but has "invitation only" attendance. This hasn't proved to be a problem.
The Internet Research Task Force, the IRTF, handles the long-term issues, those that will affect the Internet in the next decade.
The Internet Society, the ISOC, is the membership organization that provides an organizational home for bodies like the IETF and the IAB, and promotes the growth and use of the Internet worldwide.
Technological Problems
As the Internet grows in leaps and bounds, "information overload" becomes a very real problem. Subscribe to several mailing lists, and you'll be getting over a hundred email messages daily. Use email to stay in touch with friends and coworkers, and the amount of email you'll be forced to contend with can become problematic. A mail-reader that sorts incoming email by user or content helps with part of the problem, but what do you do when you need to find something from days past?
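The sorting itself is simple enough to sketch in a few lines of Python; the addresses and folder names here are invented, and real mail-readers such as Eudora offer this sort of filtering through menus rather than code.

```python
# A toy mail filter: file each message by sender, so list traffic
# doesn't bury the personal mail. Addresses and rules are invented.
inbox = [
    {"from": "big-list@example.edu", "subject": "digest #142"},
    {"from": "mom@example.com",      "subject": "call your mother"},
    {"from": "big-list@example.edu", "subject": "digest #143"},
]
rules = {"big-list@example.edu": "Lists/Big List"}

folders = {}
for message in inbox:
    folder = rules.get(message["from"], "In")   # unmatched mail stays in "In"
    folders.setdefault(folder, []).append(message)

for folder, messages in folders.items():
    print(f"{folder}: {len(messages)} message(s)")
```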
Similarly, as more and more information is tossed onto the World Wide Web it becomes more difficult to find sources of information to satisfy your research needs (be it for work or pleasure). The Cypherpunks speak of "security by obscurity", an unflattering description of keeping things secret by hiding the method of encrypting the information (a method that rarely works). We are facing a very real "security by obscurity" in an area where we don't want security; being flooded by an ever-increasing number of web sites, mailing lists, and email is, I believe, the defining problem facing the Internet today and tomorrow.
Hundreds of years ago, librarians at the great library at Alexandria were faced with the same issues of cataloging and indexing information that we're faced with today. We, however, have machines that may allow us some measure of mastery over the information flow. Today, "search engines" such as those found at Yahoo and Lycos help us plow through piles of data on the Web and well-written mailers (such as Eudora, the one I'll be using to respond to your email) help us search through the email we've received and saved. Tomorrow, "intelligent agents" may do our work for us, learning from our past work habits and interests.
Societal Problems
If only it were just technical problems that faced us. Douglas Adams, in