
What will the corporate computing infrastructure look like in the year 2000?

We hear from Sun executives and industry pundits about their expectations for desktops, servers, networks, and more

By Barry Bowen

SunWorld
May  1997

Abstract
We consider potential developments in corporate computing environments as we inch closer to the year 2000. If you don't think radical changes can occur in such a short period of time, think about the fact that no one had even heard of Java just two years ago. (3,500 words)


In 1970 Intel had not yet introduced its first microprocessor. Microsoft and Apple were not even on the drawing board. In 1980 ARPANET connected about 200 host computers together, but the term "Internet" had not yet been coined, and Sun Microsystems did not exist. In 1990 the World Wide Web and Web browsers were still unknown.

As we look towards the year 2000 there is no doubt that it has been quite a ride so far. Sun Microsystems is now a dominant force in computing with $7-plus billion in revenue, and the idea that "the network is the computer" is taking on a rich new meaning.

It is hard to imagine that the rate of fundamental change can keep going at this pace, concedes Jean S. Bozman, director of research in the Unix and client/server operating environments program at International Data Corp., but then no one can predict the future with certainty.

Most of the radical changes of the last two years can be chalked up to Internet technologies, Bozman and others agree. "When you see Microsoft responding to the Internet then you know it was an awfully big wave that hit us," she says, "and I think it will take some time to digest everything that has happened."

In assessing corporate computing networks in the year 2000, it is important to realize that while new technologies may develop, there is little time for them to be widely adopted by 2000, says Brian Croll, director of product marketing for Solaris servers with Sun Microsystems.

"To state the obvious, we are already into 1997, so the year 2000 is not that far away," says John Shoemaker, vice president and general manager of Sun's enterprise server and storage group. "If you want to figure out what is going to happen you need to look at what products are a reality today and then figure out what problems companies are trying to solve."

Major new architecture projects take about 18 months to plan and deploy, Shoemaker says, so products that do exist today have plenty of time to make it into mainstream use.

Shoemaker's short list of problems to be solved includes exploding corporate networks, huge legacy databases with hard-to-access data, and the ever-present problem of multiple languages and platforms making architectures, networks, and applications far more complex. Add to this the time-honored goals of boosting productivity and decreasing long-term IT costs.

It's that Java thing again
With the recent release of version 1.1 of the Java Development Kit (JDK), Java is well positioned for enterprise computing. Version 1.1 provides an enhanced security model, the JavaBeans architecture for building reusable software components, the JDBC API for legacy database connectivity, and a completely rewritten Abstract Window Toolkit (AWT).
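As a concrete illustration of the JDBC piece, here is a minimal sketch of JDK 1.1-style database access. The driver class, connection URL, and table names are hypothetical placeholders, not details taken from the article or tied to any particular product.

    import java.sql.*;

    public class LegacyQuery {
        public static void main(String[] args) throws Exception {
            // Load a JDBC driver by name (the class below is a hypothetical placeholder)
            Class.forName("com.example.LegacyDriver");

            // Open a connection to the legacy data store and run a simple query
            Connection con = DriverManager.getConnection(
                    "jdbc:legacy://mainframe.example.com/orders", "user", "password");
            Statement stmt = con.createStatement();
            ResultSet rs = stmt.executeQuery("SELECT customer, total FROM orders");

            while (rs.next()) {
                System.out.println(rs.getString("customer") + "\t" + rs.getLong("total"));
            }

            rs.close();
            stmt.close();
            con.close();
        }
    }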

"I've been in this business for thirty years," Shoemaker says, "and Java is the most unbelievable phenomenon I have ever seen. I think one of the biggest surprises will be how important a role Java and the Java computing paradigm plays by the year 2000."

Evidence that Java is on a roll starts with the tremendous amount of development activity already under way. Forrester Research figures that 62 percent of Fortune 1000 firms are using Java for some development, while 42 percent claim Java will play a strategic role at their firms within a year.

IBM, for instance, has repeatedly asserted its strong commitment to Java, including global Java development teams working 24 hours a day. Its Lotus subsidiary alone claims more than 1,000 full-time Java developers. Much of IBM's Java development work is aimed at the enterprise, says Scott Hebner, IBM's manager of application development marketing. Projects include DB2 interfaces, enterprise messaging, team development environments, version control, and many other value-added components. "For new programming projects, I think Java, and JavaBeans, is the preferred choice," Hebner says.

Internet technologies, such as Java, help to "paper over" many of the incompatibilities between various hardware platforms and operating systems, says IDC's Bozman. "It is not a cure-all, but it will help a lot. Corporate networks have mainframes running MVS, perhaps some old VAXes running VMS, the Sun stuff runs Solaris, you have the other Unix flavors, and a lot of Windows desktops. Java's write once, run anywhere promise directly responds to the need to solve cross-platform incompatibilities," she says.

Bozman is quick to add that ActiveX is something the industry must deal with because "clearly Microsoft is pushing it as hard as it can," and Microsoft independent software vendors (ISVs) are out there writing ActiveX programs.

How compatibility, interoperability, and market forces will shape the ActiveX versus Java struggle by the year 2000 is unclear.

Tim Sloane, director of Internet Research for Aberdeen Group, sees Digital very closely aligned with Microsoft's strategy, while Hewlett-Packard is staking out some middle ground, and most other large firms that are not PC-specific are moving towards Java.

The likely scenario, says Sloane, is that Java will play a significant role on the enterprise network, but ActiveX technologies will also play a role in desktop applications. ActiveX adherents will be drawn largely from firms now heavily using Visual Basic. Sloane also says (only half facetiously) that if market momentum behind Java continues to build at its current frantic pace, he would not be surprised to see a press release from Microsoft announcing that Java was really its idea from the very beginning.


What will be on the desktop?
Closely related to the impact Java will have on corporate networks by the year 2000 is a new set of desktop systems -- network computers. NCs, sometimes called "Internet appliances," are intranet/Internet-ready terminals optimized to run Java applications, Web browsers, and often Windows applications from NT servers. Forrester Research projects the NC market in 1998 will hit $1.5 billion. By 2001 worldwide NC sales will reach $12.5 billion, according to Input, a California-based research and consulting firm.

"Java and the network computer bring together the two most successful paradigms in the history of computing. The mainframe offered the efficiency and reliability of central administration. The PC brought local processing power to the desktop. NCs do both," says Steve Tirado, director of product marketing for Sun's Java systems group.

This combination offers tremendous cost savings by lowering administration and maintenance costs. Most NCs will not have local disks, so administration is focused on the server, providing far more control. Gartner Group estimates the cost of ownership for NCs is about 40 percent lower, a savings of nearly $20,000 per seat over five years. Multiply that across a 1,000-unit deployment and, at least on paper, that's a $20 million savings.
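Taking those figures at face value, the arithmetic is straightforward. The sketch below assumes a five-year PC cost of ownership of roughly $50,000 per seat, a baseline the article does not actually state, so treat the numbers as illustrative only.

    public class NcSavings {
        public static void main(String[] args) {
            double pcFiveYearCost = 50000.0;               // assumed per-seat PC cost over five years
            double savingsPerSeat = 0.40 * pcFiveYearCost; // ~40 percent lower for an NC, about $20,000
            int seats = 1000;                              // size of the hypothetical deployment
            double totalSavings = savingsPerSeat * seats;  // roughly $20 million on paper
            System.out.println("Per seat: $" + savingsPerSeat + "   Deployment: $" + totalSavings);
        }
    }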

"NCs will be very successful, as replacements for the millions of existing dumb terminals. Companies are scrambling to get all their users on the intranet, and NCs offer IT a less expensive way of upgrading terminal users to the Internet and Java capabilities than complex and hard-to-manage PCs," says Tom Rhinelander with Forrester's computing strategies service.

While most analysts agree that NCs will be selected over PCs to replace aging mainframe terminals, Sun's Shoemaker claims that more than 10 large corporations are already working with Sun to replace existing PCs with JavaStations. Two examples he cites are hotel check-in stations and healthcare data entry terminals. In both cases PCs are generally used for a single task, and it simply makes no sense, he says, for businesses to pay more to have a PC in that role.

Sun Microsystems will put more than 3,000 of its own employees on JavaStations over the next few months, says Shoemaker, in order to exercise this new computing paradigm and see whether there is anything the company is missing. He says his biggest question about corporate computing in the year 2000 is not whether NCs will make significant inroads, but how fast those adoptions will occur, and whether some unexpected glitch may pop up after there are widespread deployments.

"It is a lot like a new operating system release, only bigger. There are so many things that you just don't know," Shoemaker says. "I don't worry about all the things people talk about. I wonder whether there are one or two or three things that nobody ever thought about that will hit us in the face all of a sudden. That is one of the reasons we're being very aggressive in our own deployments."

While a bit too futuristic for widespread deployments by the year 2000, a whole new class of computing devices is entering the market as a natural continuation of the thin client revolution. Computing chips will be embedded in cell phones and probably combined with pagers, browsers, and Java capabilities, says Andy Charmas, a project manager in the Sun Microelectronics (SME) division of Sun Microsystems. Some folks are even talking about assigning IP addresses to automobiles, he says.

The ever-crucial server side
Over the last decade Unix servers have continually pushed the performance and reliability envelope, pressuring the role of the mainframe in all but core data center functions. The continued evolution of distributed computing, fed in part by Java and network computers, will push servers to be bigger, faster, and more reliable. Mainframes will hold their own, but new applications will continue to be deployed on servers instead.

By the year 2000, a 64-bit operating system will be a given for Unix servers and workstation desktops. Microsoft at least will have announced plans for a 64-bit OS for Intel-based servers, IDC's Bozman says.

This has profound implications for database and data warehousing applications (see our cover story this month) because extremely large databases and data sets can be read into main memory and scanned much more quickly. Numerical computing -- statistical and visual modeling -- benefits similarly from the ability to handle huge files.
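A rough back-of-the-envelope sketch, not from the article, shows why the address space matters: a 32-bit system can directly address at most 4 gigabytes, so a large warehouse table cannot even be mapped into memory, while 64-bit addressing removes that ceiling for all practical purposes.

    import java.math.BigInteger;

    public class AddressSpace {
        public static void main(String[] args) {
            long limit32 = 1L << 32;                            // 4,294,967,296 bytes (4 GB)
            BigInteger limit64 = BigInteger.ONE.shiftLeft(64);  // about 1.8 x 10^19 bytes

            System.out.println("32-bit ceiling: " + limit32 + " bytes");
            System.out.println("64-bit ceiling: " + limit64 + " bytes");

            // A hypothetical 100-gigabyte warehouse table is far beyond a 32-bit address space
            long tableBytes = 100L * 1024 * 1024 * 1024;
            System.out.println("Fits under the 32-bit ceiling? " + (tableBytes < limit32)); // false
        }
    }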

Solaris is already halfway into its 64-bit roadmap and will complete the journey in 1998, says Sun's Brian Croll. He says Sun learned a great lesson in the transition from SunOS to Solaris and has thus upgraded its operating system incrementally to ensure complete compatibility with 32-bit applications.

"That transition was a lot of fun for us and for our customers," Croll says sarcastically. "We learned that lesson at our core. That was sort of a pit stop where we had to stop and revamp everything. We've done that, and we don't have to do it again. The 64-bit transition will maintain complete compatibility. There will be no forced transitions for our customers. Other vendors still have a pit stop to make and that will force their customers to go through some hard choices."

Performance is always a top issue for server vendors, who face constant demand to serve more users and support higher-bandwidth applications. The core process technology for high-performance CPUs makes a new generation of chips possible every three years, meaning that a new round of chips for workstations and servers may hit in 1999 and generate a lot of attention. But those chips will have a hard time achieving widespread deployment by 2000 unless they make it to market earlier than expected. Within the current families of server-class CPUs, performance is increasing 50 to 60 percent each year, says SME's Charmas. As better system buses boost the internal speed of multiprocessor systems, the performance equation will improve even further, he says.

Another dramatic difference for servers will be an absolute demand for reliability and ease of use, even for lower-end servers. "Until now vendors have been speaking to a relatively small crowd in the MIS department about advanced server features -- clustering with a single system image, failover, sophisticated mirroring, etc.," Croll says. "Internet-based computing opens a whole new class of applications that must run 24-by-7" so there will be a far wider audience demanding high-end reliability even from low-end servers.

Sun has also coined the phrase "zero admin servers" for a new class of servers it likens to information refrigerators. "You plug it in, and it just runs all the time. I think these kinds of devices will be prevalent by the year 2000," Croll predicts.

What role will big iron play?
While big iron vendors are recasting the mainframe with more of a server focus, most analysts agree that new application development will still be targeted away from the mainframe. One of the big questions for IDC's Bozman is whether the mainframe will suffer from legacy applications that fail as a result of year 2000 date problems. If mainframes weather those problems well, that will be a big surprise, she says.

Although the need to go in and fix the way legacy applications and data stores treat dates -- so that automated systems know the difference between the 1900s and dates beginning with the year 2000 -- has been recognized for many years, many businesses have been slow to act. Some CEOs have been in denial, refusing to authorize spending millions of dollars to fix the problem. The expense provides no added benefit, says Mike O'Connell, research analyst with the Gartner Group's application development and management service.

"Our estimate is that 70 percent of the applications will fail in one form or another. And 30 percent of mission critical applications will fail. We are projecting that 10 percent of organizations will actually go out of business as a result," O'Connell says.

Even firms that invest early and aggressively enough to solve the date problem before crunch time face a hidden danger: key customers or suppliers may fail or be crippled for a period of time. Gartner already sees firms touting year 2000 compliance to gain market advantage, says O'Connell, and he expects to see that more frequently.

Tom Berghoff, director of support services marketing for the SunService division of Sun Microsystems, doesn't expect the installed base of mainframes to shrink, but does expect application off-loading to cut into its growth. "I think corporations will use new technology to embrace and extend mainframes to make it easier to get at legacy data. I think people focus too much on client/server computing being cheaper when the real problem businesses are trying to solve is data access," he says.

Sun's Shoemaker agrees, noting that half the data in the corporate world is stored on IBM mainframe legacy databases. "Businesses are saying it is hard to get to that stuff, analyze it, and use it for decision support purposes," and that is where Java, Internet/intranet applications, and servers will play a significant role in the year 2000.

It all boils down to the network
All the changes working their way into corporate computing dictate that the network -- both internal and external to the firm -- will have to deliver far more, far faster, to a significantly larger user base.

After working at Sun eleven years, Tirado says he sees a big change in the way CIOs treat the network. It used to be that few CIOs would begin a conversation about their computing strategy by using their network as the focal point. "Today there is not one CIO that doesn't start by drawing a line on a page and talking about what their network will look like and how they will structure computing around the network," he says. Computing strategies now make the network fundamental to everything that happens -- which computing activities are planned and how much bandwidth is required to pull it off effectively. "That is a very big change," he says.

For the last five years network discussion generally focused on the LAN. Between now and the year 2000 more attention will be given to the WAN and seamlessly integrating the two, Tirado says.

The business drivers here are very significant. For the last 30 years companies have been automating the same processes over and over again, says Croll. The first time paid huge dividends. The second time produced a smaller gain, and they have experienced ever-diminishing returns.

Looking to the year 2000 there are two fundamentally new opportunities -- automating the communication between the company and its customers, and automating the relationship between the company and its supply chain and business partners.

"Rather than simply reautomating the same processes again, firms now have the means to do something for the very first time. To push out the walls blocking communication with partners, suppliers, and customers," Croll says.

Internet standards and Java aim to solve the communication and platform incompatibility issues. "Write once, run anywhere" means a company can push software out to these audiences with far fewer headaches.

Real-world client/server computing that transcends corporate boundaries demands firewalls that are more reliable and invisible to users. Market drivers will push security and encryption technology forward to be easier to configure and more widely available, Croll says. Consistent with Croll's vision, Dataquest projects spending on security products and technology will double between 1995 and the year 2000, reaching $13.1 billion.

"I think the technology is already available to enable firms to open up communication with those they want to interact with while still protecting information assets," Croll says. "It is simply a matter of packaging it, popularizing it, and deploying it."

The demand for more bandwidth to support applications like 3D graphics and video conferencing, as well as distributed client/server applications, means companies will have to deliver higher speeds to the desktop. Increased dependence on the server to support network computers likewise means bigger pipes to the servers are essential.

Internet/intranet compatibility means TCP/IP will play an even larger role in corporate networking. Forrester Research has concluded that IP will account for the majority of traffic on corporate LANs at over two-thirds of Fortune 1000 firms by mid-1998. That growth should continue into the year 2000.

Roll all that together with the natural reluctance of firms to swap out their physical wiring, says Tirado, and a few logical conclusions fall out. Computers are fast enough to compress, decompress, and display video, but 10-megabit Ethernet will not get the job done. While 100-megabit Ethernet will gain some adherents, the most likely scenario is for firms to swap out traditional hubs for Ethernet switches. Rather than having an entire LAN segment share 10 megabits of bandwidth, this kind of swap dedicates a theoretical 10 megabits to each desktop. The connection between the servers and the wiring closets is where 100-megabit Ethernet will see widespread deployments.
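The shared-versus-switched difference is easy to see with a little arithmetic; the station count in the sketch below is a hypothetical example, not a figure from the article.

    public class SegmentBandwidth {
        public static void main(String[] args) {
            double segmentMbps = 10.0;  // a traditional 10-megabit Ethernet segment
            int stations = 24;          // hypothetical number of desktops on the segment

            // On a shared hub every station divides the segment's bandwidth;
            // on a switch each port gets the full 10 megabits to itself.
            System.out.println("Shared hub, per desktop:    " + (segmentMbps / stations) + " Mbit/s");
            System.out.println("Switched port, per desktop: " + segmentMbps + " Mbit/s");
        }
    }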

If asynchronous transfer mode (ATM) is to find success in corporate networks it will be on the WAN and on the corporate backbone. Even though Sun and other firms make ATM to the desktop possible, it would be very expensive to do the physical rewiring.

The growing dominance of IP will lead to a "sea change in corporate networking," says Forrester's Blane Erwin, a senior analyst in the firm's network strategy service. Newly debuted IP switches can replace slower and much more expensive multiprotocol routers -- long the mainstay of corporate networks -- to deliver a lot of cheap speed. Erwin suggests that the new rage from network device vendors will be IP-optimized hybrid devices that couple the dumb-but-fast IP switch with just enough multiprotocol routing smarts to get by.

IP-optimized devices may also be dropped into network "hot zones" to selectively boost performance where it is most needed, Erwin says.

So...now what?
Taking a comprehensive crystal-ball look at corporate computing three years down the road is an intriguing endeavor, but one fraught with difficulty. You don't know what it is you don't know -- whether that is a new technology, a major glitch that has yet to crop up, or another overnight wonder like Netscape Communications about to be born.

That said, the next few years ought to see the Internet and distributed client/server computing continue to evolve and mature at a rapid pace, bringing significant changes across enterprise networks.

It may also bring significant changes to the mission of IT departments. With far more platform independence and easy-to-use tools, departments will be free to mix and match platforms for a particular application set. Technologies like Java that let objects interact easily will erode the centralized authority of MIS. IT management will take on more strategic planning and more firefighting when departments start to experience difficulties. In between those two missions, the user community will have more autonomy.

"The magnitude of the change IT must make is akin to a totalitarian dictatorship becoming a modern 20th century democracy," says Waverly Deutsch, director of Forrester's computing strategies service. "IT will create guidelines, back up everything, and provide the infrastructure and tools users need. The business units in turn will assume responsibility for the data they use and share and inform IT about their technology projects." To stay in the loop, IT must be proactive with information and tools -- readying technology and creating guidelines before user demands.


About the author
Barry D. Bowen is an industry analyst and writer with the Bowen Group Inc., based in Bellingham, WA. Reach Barry at barry.bowen@sunworld.com.

