The emerging model of the Internet
In the first of a series on the Internet's infrastructure, we outline the players and hear about the problems straight from the mouths of users
What are currently the biggest problems and issues surrounding the Internet and its rapid evolution? Who are the major players and what are their roles? In this first article of our series on Internet infrastructure, we answer these questions and consider its complex structure. (2,000 words)
Networld+Interop Spring 97, in Las Vegas last month, offered a great chance to talk with users and vendors about the state of the Internet and where they see it going. The first part of this series on the Internet is intended to frame the players, issues, and model of the Internet, a complex task in such a small space.
Just as neurologists are beginning to map and decipher the most basic workings of the brain, engineers, content developers, and users are beginning to do the same for the Internet. It began as a tool for sharing knowledge in a simpler environment than today's, but its dense intertwining of knowledge has since revealed a series of issues and challenges perhaps equal to the mysteries of the human brain.
Examining the structure
To put this in some perspective, the Microsoft Web site currently experiences 45 million hits from 450,000 unique IP addresses each day. That translates into 100 hits per address. If a hit is defined as a request for an object (Web page, graphic, Java or ActiveX control, etc.), you can think of the connection model as looking like Figure 1. It resembles a string-art project where each nail on the board is an IP address or Internet object, and each string between an IP address and an Internet object is a packet path.
Now we need to improve the structure of the model to better represent the true nature of the Internet. It is rare for any packet to travel only a single hop (straight path) to its destination. Instead, you need to add new nails to the board, defining points that the string needs to pass through to get to the final destination; each nail therefore represents a router or switch on the Internet. In theory this looks much simpler than the previous model, but it isn't.
As you add each string, or packet path, the router at each intermediary point requires enough capacity to let each packet pass through. Think of the router as a nail a few inches in height. Each packet is like a string: when it passes through the router, it takes up a little of the vertical space on the nail, and at some point the space is filled. Once this congestion point is reached, incoming packets simply disappear from the network. As you force more packets through a given router, the probability of overflowing that router's capacity increases, just as forcing more strings onto the same nail would eventually fill it.
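The nail-and-string analogy can be sketched as a toy queue model. The Router class, its capacity, and the arrival rates below are all illustrative assumptions, not measurements of any real device; the point is only that when arrivals outpace departures, a fixed buffer eventually fills and every further packet is lost:

```python
# Toy model of the nail-and-string analogy: a router with a fixed
# buffer (the "nail height"); packets that arrive when the buffer
# is full are dropped. All numbers here are illustrative.

class Router:
    def __init__(self, capacity):
        self.capacity = capacity   # how many packets fit on the "nail"
        self.buffer = []
        self.dropped = 0

    def arrive(self, packet):
        if len(self.buffer) < self.capacity:
            self.buffer.append(packet)
            return True
        self.dropped += 1          # congestion: the packet disappears
        return False

    def forward(self):
        # one packet leaves per tick, freeing a little space on the nail
        if self.buffer:
            self.buffer.pop(0)

# Push packets through faster than the router can forward them:
router = Router(capacity=8)
for tick in range(100):
    for _ in range(2):             # two arrivals per tick...
        router.arrive(f"pkt-{tick}")
    router.forward()               # ...but only one departure

print(router.dropped)  # prints 93: of 200 offered packets, 93 are lost
```

Once the buffer fills (after a handful of ticks), every tick thereafter loses one packet, no matter how patient the senders are; only slowing the arrival rate or enlarging the buffer changes the outcome.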
Since the Internet is actually a dynamic space, each packet occupies space in the router only for an instant, but the sheer number of packets passing through the Internet makes this problem very tangible and very like the string and nail of the art project. Now we have a basic definition of the topology and throughput problem of the Internet.
In a televised ABC News segment on American Airlines, it was estimated that 266 million people will need to be tied together in a groupware-type environment by 1999. Since this was an infomercial for Novell's GroupWise, the numbers may be overly optimistic, but even at 50 percent, no single LAN -- or even a reasonable number of LANs -- will serve the 133 million people looking for this functionality. It will require an Internet connection. If you then assume that each person will exchange rich content with at least several people and organizations every day, the minimum number of connections blossoms into the billions, with perhaps hundreds or more packets per transaction. That moves the packet count into the trillions, a number that is probably still grossly underestimated.
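The arithmetic above can be checked with a quick back-of-envelope calculation. Only the 266 million figure and the 50 percent haircut come from the segment; every per-user number below is an assumption chosen purely to illustrate the scale:

```python
# Back-of-envelope check of the article's numbers. The 266 million
# users and the 50 percent discount come from the segment; the
# per-user figures below are illustrative assumptions only.

users = 266_000_000 // 2            # 50 percent haircut -> 133 million
partners_per_user = 10              # assumed daily correspondents
exchanges_per_day = 3               # assumed rich-content exchanges
packets_per_exchange = 400          # assumed "hundreds" of packets

connections = users * partners_per_user                      # ~1.3 billion
packets = connections * exchanges_per_day * packets_per_exchange

print(f"{connections:,} connections, {packets:,} packets per day")
```

Even with these modest assumptions the daily packet count lands around 1.6 trillion, which is why the "trillions" estimate in the text is, if anything, conservative.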
At the Mid-Atlantic Regional Users Group for Interex, the International Association of Hewlett-Packard Computing Professionals, a straw poll revealed that virtually everyone was dissatisfied with the performance of the Internet. Whether their connections were to Sun or HP Unix servers or to Windows NT, the experience was the same -- inconsistent throughput and quality of service from any site over any period of time. All of the big players suffer from problems once thought to plague only the smaller Internet service providers (ISPs) or individuals directly connected to the Internet.
With a promise of anonymity, people from Cisco to Earthlink and Sun to Microsoft all agree that the current state of the Internet is insufficient to support all of the marketing and media promises made to date. In fact, the consumer end of the market, heralded as the financial foundation of the future Internet, is now seen as a loss leader for access providers like AT&T and Pacific Bell. Rumors abound in the industry that such players will leave the dial-up consumer market to focus on the emerging business-to-business market -- again, the most economically viable customer base to grow.
For a more specific spin on the problem set, here is a list of issues summarized from the numerous conversations held at Networld+Interop Spring 97 -- at the airport, in taxicabs, and even in line at a movie theater.
If this list seems daunting, keep in mind that each item actually expands to a much larger problem set. Just summarizing the problems leads you to understand why the Internet issues will not be solved soon or easily. The good news is that the Internet has grown so big that it shows sufficient potential to now draw the needed investment to solve these problems and fulfill the early promises.
When the Internet began, it was a solution to research organizations' and the government's need to communicate vast amounts of data across large distances. It was easy to implement because a limited number of players were connecting a relatively finite number of points, numbering in the hundreds and then thousands. Today that picture looks significantly different: the number of players has grown into the millions.
Consider just the number of categories of players. In the following table, each player is defined. As you can see, it is a large list, often with a large membership base. To add to the problem, not everyone is marching to the same drummer. Each group has its own agenda for participating in this market, and at times -- perhaps more often than not -- that agenda conflicts with another player's. Consumers are lured by low access costs coupled with free content -- the model derived from television and cable services. Smaller ISPs are grabbing niche markets through creative telephone tariffs and access solutions, reacting much faster to the market than larger ISPs, which are often hamstrung by the corporate bureaucracies that spawned them.
| Player | Role | Approximate number |
| --- | --- | --- |
| Telephone companies | Provide the transport medium for the Internet; entering the market as Internet service providers | Dozens |
| Large Internet service providers | Provide access services, private networks, Web hosting | Dozens |
| Small Internet service providers | Provide access services, Web hosting | Thousands |
| Businesses | Web hosting, private networks, remote access, Web surfers | Hundreds of thousands |
| Consumers | Web surfers, Web hosts | Tens to hundreds of millions |
| Hardware manufacturers | Support Internet service providers, consumers, businesses, telephone companies | Hundreds to thousands |
| Software manufacturers | Support Internet service providers, consumers, businesses, telephone companies | Hundreds to thousands |
| Regulatory agencies | Monitor and shape the telecommunications infrastructure supporting all users | Dozens |
As deregulation of the telephone industry continues, the government keeps some regulation in place, like the proposed second-line tax to fund Internet dial tone for our schools. Somewhere in the process of evolving further into an information society, the idea of basic telephone dial tone as a right has extended into a right to IP dial tone. Regulators in every state and at the federal level face the obvious discontinuity that these extensive computer networks represent.
Since the Internet is an expensive technology, requiring a higher level of expertise, resources, and capital than is available to the majority of this country, a natural separation of classes is appearing -- the information haves versus the have-nots. The regulatory agencies face this dilemma with no tangible solution and stand on the edge of imposing regulations on this complex marketplace, which may further exacerbate the problem of uniting its numerous players to solve the issues.
What does it all mean?
Will the Internet survive? Absolutely yes! Will the Internet be as easy and reliable to use as your telephone or car? Someday, but certainly not now.
The technology is still rapidly evolving, with little time for the innovations and standards to emerge and be absorbed by the implementers in the market, let alone the final end users. Not a day goes by that doesn't show us examples of what's broken.
On a personal note, I recently used a search engine to look up two people I had met the week before to get their home addresses. Both had been divorced three or more years ago, yet when I found them in the various phone-directory look-ups, each was still listed with the ex-spouse -- information at least three years out of date. When will the promise of real-time information, current within a more reasonable timeframe (like months or weeks for phone number changes), become a reality?
The upcoming articles in this series are going to open a window on the future that is striving to answer these questions. No one really knows the "when" of it -- that is too big to determine -- but a number of fairly concrete trends and milestones are emerging that everyone needs to understand. The future of commerce over the Internet, along with widespread consumer usage, hangs in the balance of these answers, and with it, the real source of revenue to keep the entire process alive.
About the author
Robert E. Lee is a technology consultant, speaker, columnist, and author who has been in the computer industry for 20 years. He specializes in networking, Internet strategies, systems analysis and design activities, and has participated in the Windows NT and Internet Information Server betas since the start of those products. Reach Robert at Rob.Lee@sunworld.com.