
Internet by the numbers

Benchmarking firm proposes performance measure for Internet access providers.

By Neal Nelson, Barry D. Bowen

SunWorld
July  1995

Abstract
Buyers of Internet access today can compare service providers' prices and little else. A benchmarking firm proposes a provider performance measure using technology and methods developed for testing databases and servers. Providers shown the methodology and results are cautiously optimistic about the benchmark's validity.



The explosive growth of the Internet, especially among commercial users, is beginning to raise many questions. Will we run out of IP addresses? Can the Internet backbone continue to expand rapidly enough to handle expected demand? And now that most businesses and individuals can choose from local and national Internet access providers, how can consumers evaluate the relative strengths of each?

Neal Nelson & Associates, a Chicago-based firm that conducts benchmark tests on servers and databases, recently completed a proof-of-concept study to determine if Internet access providers can be benchmarked in a meaningful fashion. The initial study attempted to answer three questions:

Preliminary research indicates the answer to all three questions is a definite yes.

Who cares?
The anecdotal experience of a Michigan-based VAR illustrates some of the issues. Realizing that one of his customers was a perfect candidate to use the Internet in his business, the VAR spent the weekend preparing a tour of interesting pages on the World Wide Web, and other Internet hot spots.

Pleased with what he had put together, and sure the customer would sign up to do business on the Net after seeing the demo, the VAR set up his computer in the client's office the following week and dialed up his Internet access provider.

Something was very different, however. Unlike the snappy and responsive performance he saw on the weekend, the VAR and his prospective customer waited and waited. Frustrated and bored, the customer walked out of the demonstration, concluding that the Internet might be fine for hobbyists, but was simply too slow for business users.

What went wrong?

It is reasonable to assume that if it takes much longer than expected to transfer data from a variety of end-point Internet nodes, via a variety of Internet access providers, then there is a problem with the core Internet backbone. If some end-point nodes deliver up data promptly while others are slow, and this performance is consistent over several Internet access providers, then the problem is the end-point Internet node being accessed. And if a given end-point node appears responsive via some Internet access providers, and slow when accessed via other service providers, the differences are likely to be the responsibility of the Internet access providers.
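
The elimination logic above can be made concrete with a small sketch. The fragment below (modern Python, purely illustrative and not part of the published methodology) classifies a batch of timed transfers, keyed by provider and end-point node, against an assumed baseline transfer time; the threshold values and data layout are assumptions made for this example.

    # Illustrative sketch: guess where a slowdown lies, given transfer times
    # keyed by (provider, node). Threshold values are assumptions, not study figures.
    def classify(times, expected_seconds=30.0, factor=2.0):
        """times: dict mapping (provider, node) -> observed transfer time in seconds."""
        slow = {pair for pair, t in times.items() if t > factor * expected_seconds}
        providers = {p for p, _ in times}
        nodes = {n for _, n in times}
        if len(slow) == len(times):
            return "backbone suspected: every node is slow via every provider"
        slow_nodes = [n for n in nodes if all((p, n) in slow for p in providers)]
        if slow_nodes:
            return f"end-point node(s) {sorted(slow_nodes)} suspected: slow via every provider"
        slow_providers = [p for p in providers if all((p, n) in slow for n in nodes)]
        if slow_providers:
            return f"access provider(s) {sorted(slow_providers)} suspected: slow for every node"
        return "no single component stands out"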

Moreover, if there are measurable differences between Internet access providers, those differences may not be consistent throughout the day. If, for instance, a given Internet access provider has a disproportionate population of individual dial-up users, it may experience peak demand during evening, early morning, and weekends. If the bulk of subscribers are business clients, peak demand is likely to be weekdays between 9 am and 5 pm. If a small, local Internet access provider has one or two large business users, the idiosyncratic demands of those users can radically alter the performance for everyone else.

The Internet Access Provider Benchmark is an attempt to measure fairly an Internet access provider's ability to deliver data throughput to its customers. To do this, a remote terminal emulator (RTE) automates and controls four Unix personal computers, which simultaneously access a given Internet end-point node via four different Internet access providers (see the Internet access provider benchmark methodology sidebar for details).
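
The core measurement itself is simple: time how long a fixed file takes to arrive over each provider's link. The fragment below is a minimal sketch of one such timed ftp retrieval in modern Python; the host name and file path are hypothetical stand-ins, and it assumes the test machine's dial-up PPP link is already up (the actual RTE software handles the dialing, scheduling, and logging).

    # Minimal sketch of one timed ftp retrieval (host and file path are hypothetical).
    import time
    from ftplib import FTP

    TEST_HOST = "ftp.example.com"         # hypothetical end-point node
    TEST_FILE = "pub/benchmark/100k.dat"  # hypothetical fixed-size test file

    def timed_ftp_transfer(host=TEST_HOST, path=TEST_FILE):
        """Retrieve one file by anonymous ftp and return (seconds, bytes received)."""
        received = bytearray()
        start = time.time()
        with FTP(host) as ftp:
            ftp.login()                                  # anonymous login
            ftp.retrbinary("RETR " + path, received.extend)
        return time.time() - start, len(received)

    if __name__ == "__main__":
        seconds, size = timed_ftp_transfer()
        print(f"{size} bytes in {seconds:.1f} s ({size / seconds:.0f} bytes per second)")

Run once an hour from each of the four test machines, the elapsed times become the hourly sample points charted below.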



Results
Proof-of-concept tests were run once each hour, 24 hours a day, for six consecutive days between February 3rd and February 8th. Although it was reasonable to expect some variation in Internet access provider performance, the results were startling.

[Benchmark Chart]

One premium-priced national Internet access provider (Provider D) consistently performed worse than the others tested, while a moderately priced regional access provider (Provider A) led the pack in average performance.

Among the four Internet access providers selected for anonymous testing, two were regional service providers in the Chicago area, and two were national service providers with 800-number service. Dial-up PPP service from the regional providers cost $20 to $30 a month, while the national providers charged nearly $10 an hour.

The performance of all providers was best on the two weekend days included in the sample. Theoretically, business data traffic would be much lighter on those days, and this is consistent with the anecdotal evidence of the Michigan VAR.

Contrary to the axiom that you get what you pay for, one lower-cost regional provider (Internet access provider A) performed consistently faster than the other three, and one of the premium national providers (Internet access provider D) performed consistently worse than the other three providers tested.

On four of the six days, afternoon and evening traffic on Internet access provider D was dramatically slower than on all other providers. One data point revealed performance 10 times worse than the top performer's.

[Benchmark Chart]
Some sample points showed great variability in access provider performance.

The two days on which Internet access provider D was only marginally slower than the other providers were Saturday and Sunday, when its performance was 1.1 and 1.25 times slower than the averaged performance of the other three access providers. On the four weekday afternoons, Internet access provider D's performance ranged from 1.71 to 3.83 times slower than the combined averaged performance of providers A through C.
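
For readers who want to reproduce the arithmetic, a slowdown factor is simply provider D's transfer time divided by the average of providers A through C for the same sample period. The figures below are invented for illustration and are not data points from the study.

    # Illustrative only: transfer times (seconds) for one afternoon sample period.
    times = {"A": 42.0, "B": 55.0, "C": 48.0, "D": 185.0}

    baseline = sum(times[p] for p in "ABC") / 3    # averaged performance of A through C
    slowdown = times["D"] / baseline               # provider D relative to that average
    print(f"baseline {baseline:.1f} s; provider D is {slowdown:.2f} times slower")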

[Benchmark Chart]
Internet access provider benchmark: afternoon performance.

While some might think these results simply reflect an anomaly, remember that afternoon scores average 4 hourly samples -- 3 pm through 6 pm -- over six days. And for the comparison above, providers A through C were also averaged. The performance profile for Internet access provider D is completely consistent with a provider carrying large amounts of business traffic on weekdays.

Putting aside the issue of cost for a moment, this preliminary study also indicates that the fastest access provider is not necessarily the best choice for all users. Internet access provider A provided the fastest data transfer in each time slot, with B and C not far behind (based upon averaging four to six hours in each time slot over six days).

However, in this study modems failed to connect to Internet access provider A 11 times over the 144 sample points. Although this may not seem excessive, the test computers made five attempts to connect to each provider for the hourly sample before recording a failure, so a failed connection indicates a solid bank of busy modems. Given this, Internet access provider A might be a poor candidate for dial-up customers who want instant Internet access and have a low tolerance for busy signals.
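
The sampling rule matters for how those failures should be read, so here it is spelled out as a sketch, assuming a hypothetical dial_and_connect() helper that wraps the modem and PPP dialing step.

    # Sketch of the hourly sampling rule: up to five dial attempts per provider;
    # only if all five fail is a connection failure recorded for that sample.
    MAX_ATTEMPTS = 5

    def sample_provider(dial_and_connect, run_transfer):
        """Return a transfer time in seconds, or None if every dial attempt hit busy modems."""
        for attempt in range(MAX_ATTEMPTS):
            link = dial_and_connect()    # hypothetical helper: returns None on busy or no answer
            if link is not None:
                return run_transfer(link)
        return None                      # recorded as a failed connection for this hourly sample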

[Benchmark Chart]
The best-performing access provider, in terms of data transfer performance, also stood out as the most difficult connection for a dial-up user to make.

Of course, dial-up access restrictions could be interpreted as good news for business customers wanting dedicated 56 Kb connections to the access provider. They may indicate the access provider has imposed limits on the amount of its bandwidth -- often a T1 connection -- that it will allow dial-up users to tap.

There were also sample points when all four Internet access providers showed significantly slower performance, which may have been due to the state of the end-point node or the level of traffic on the Internet backbone.

Implications
Benchmarking Internet access providers requires isolating, as far as possible, the variables attributable to the service provider itself. These proof-of-concept tests did not attempt to collect all the information necessary to explain performance differences. It was considered satisfactory to establish the presumption that whatever caused a performance difference was either closely tied to the Internet access provider or directly under the Internet access provider's control. With up to 144 samples generated over the six-day period, a truly anomalous result should average out.

If an Internet access provider delivers degraded throughput relative to its competitors, further research is necessary to diagnose what steps need to be taken to correct the deficiency. Some options can be ruled out intuitively. If an Internet access provider connected to the Internet via a T1 link services a relatively large number of business customers with 56 Kb connections, and also maintains a large number of high-speed modem connections, it doesn't take much to figure out that the available bandwidth has been oversold.
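
The arithmetic behind that oversubscription argument is short enough to show. The figures below describe a hypothetical provider, not any of the four tested here.

    # Back-of-the-envelope oversubscription check (all figures are hypothetical).
    T1_BPS = 1_544_000            # provider's uplink to the Internet
    DEDICATED_56K_LINES = 20      # business customers on dedicated 56 Kb circuits
    DIALUP_MODEMS = 60            # high-speed modem lines at 28.8 Kb each

    committed = DEDICATED_56K_LINES * 56_000 + DIALUP_MODEMS * 28_800
    print(f"committed {committed / 1e6:.2f} Mb against a {T1_BPS / 1e6:.2f} Mb uplink "
          f"({committed / T1_BPS:.1f}x oversubscribed)")

Some oversubscription is normal for bursty dial-up traffic; the trouble starts when the committed figure is several times the uplink and the customer mix keeps those lines busy all day.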

If saturated bandwidth is not the problem, it may be that an Internet access provider's own connection to the Internet backbone traverses an inefficient or unreliable backbone access provider, and the Internet access provider may want to supplement or change its backbone access provider.

If bandwidth is used heavily during the business day, but is readily available on evenings and weekends, the Internet access provider may wish to increase the discount offered to non-prime time dial-up accounts. This might move more demand off of peak usage hours, or generate an incremental revenue stream from new subscribers in a manner that does not degrade access for current customers.

Futures
Where the industry should go from here is an interesting question. Neal Nelson & Associates is eager to explore opportunities to advance this early benchmarking effort. One obvious possibility is quality assurance research conducted on behalf of the Internet access providers. (See the Access providers react to the benchmark sidebar for their comments.)

Plans are in the works to conduct quality assurance benchmarking for providers that maintain multiple points of presence, to assess whether the provider has oversold connections to one or more specific POPs (points of presence). It is reasonable to envision a situation where the overall load on an Internet access provider does not exceed its capacity, but the proportion of that load carried by one POP does.

When, for instance, should an Internet access provider add modems? Or when should a POP served by a 56 Kb connection to the Internet access provider's core T1 service be upgraded to fractional T1 service by adding additional 56 Kb permanent virtual circuits over its Frame Relay link?

The proof-of-concept research done here can assess both issues. The automated remote terminal emulation software can exercise a provider's modem bank to determine the average number of open connections available to dial-up users. In effect, the RTE software is well suited to perform a busy signal study.
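
A busy signal study is essentially the connection-attempt half of the benchmark run on its own. A minimal sketch, again assuming a hypothetical dial_and_connect() helper:

    # Sketch of a busy signal study: dial at regular intervals over a busy period
    # and record what fraction of attempts finds a free modem.
    import time

    def busy_signal_study(dial_and_connect, attempts=30, interval_seconds=120):
        connected = 0
        for _ in range(attempts):
            link = dial_and_connect()
            if link is not None:
                connected += 1
                link.hangup()            # hypothetical: free the line for the next attempt
            time.sleep(interval_seconds)
        return connected / attempts      # e.g. 0.8 means 80 percent of attempts got through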

Additionally, if ftp tests are run against the same end-point node, via multiple POPs maintained by one Internet access provider, the benchmark will assess the access provider's ability to deliver throughput equally to dial-up users, regardless of which POP they use.

The latest rage on the Net these days is the World Wide Web. Many current Web users have experienced long delays when attempting to view pages. These delays can be caused by the Web site itself, by the user's Internet access provider, or by congestion on the Internet backbone.

Benchmarking Web sites is another innovation to which this proof-of-concept research points. If interaction with a Web server is scripted to replace the ftp server used in these tests, we can gather a great deal of data.

Accessing a Web site from multiple Internet access providers can discriminate between performance issues at the Web site versus those that are the responsibility of the access provider. If simultaneous access produces significantly different performance figures, then the Internet access provider will be identified as a bottleneck. If the Web site produces substantially similar performance numbers via several access providers, then performance is likely to be the Web site's responsibility.
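
Mechanically, the Web variant only swaps the ftp retrieval for a timed page fetch. A minimal sketch follows, with a hypothetical target URL; as with the ftp tests, each fetch would run on a machine dialed into a different access provider at the same time.

    # Sketch of a timed Web page fetch and a comparison across providers.
    import time
    from urllib.request import urlopen

    TEST_URL = "http://www.example.com/index.html"   # hypothetical target page

    def timed_fetch(url=TEST_URL):
        """Fetch one page and return (seconds, bytes received)."""
        start = time.time()
        with urlopen(url) as response:
            body = response.read()
        return time.time() - start, len(body)

    def compare(results):
        """results: dict mapping provider -> seconds for the same page, fetched simultaneously."""
        fastest = min(results.values())
        for provider, seconds in sorted(results.items(), key=lambda item: item[1]):
            print(f"{provider}: {seconds:.1f} s ({seconds / fastest:.2f}x the fastest)")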

Web publishers and volume users may wish to track key Web sites to determine which sites are heavily loaded, and whether performance characteristics vary at different times of the day. The Internet Access Provider Benchmark is well suited to do that.

Broad-scale competitive benchmarking of Internet access providers will probably have to await more extensive input from Internet access providers and key corporate customers in order to formulate a consensus on methodology issues. This might be advanced by informal cooperation among interested parties, or might require the formal organizational structure of a consortium.

Regardless of how these issues shape up in the coming months, it is clear that, increasingly, the volumes of digital information required for day-to-day computing will come over the Internet via access providers. Establishing a methodology to measure the quality of those services objectively is a step the industry must take.



About the author
Neal Nelson (neal.nelson@sunworld.com) is the owner of Neal Nelson & Associates, a Chicago-based firm specializing in hardware and software benchmarking. Barry D. Bowen (barry.bowen@sunworld.com) is an industry analyst and writer with the Bowen Group Inc., based in Bellingham, WA.


(c) Copyright Web Publishing Inc., an IDG Communications company


URL: http://www.sunworld.com/swol-07-1995/swol-07-benchmark.html