CREATIVE COMPUTING VOL. 11, NO. 10 / OCTOBER 1985 / PAGE 20

Megabucks for megaflops; Uncle Sam boosts the supercomputer market. David Lytel.

"Forget miniaturization," says the cartoon on Kenneth G. Wilson's door, "I want to build a really big computer." Wilson is professor of physics at Cornell University and winner of the 1982 Nobel Prize in physics. And thanks to a National Science Foundation award announced recently, Wilson and Cornell will soon be building a really big computer--one that will be 40 times faster than anything available today.

Cornell is one of four universities designated to share the $200 million NSF grant. The others are the University of Illinois at Urbana-Champaign, Princeton, and the University of California at San Diego.

The supercomputers created as a result of the grant will be used for modeling complex systems--everything from studying black holes to forecasting the effects of numerous variables on the world economy. Simultaneous equations with thousands of dynamic variables can be created, so processes that are too elaborate to reproduce in a lab or too complicated to describe on paper can be studied. Atmospheric models built with data from Voyager missions to Jupiter and Saturn will be explored to enable scientists to learn more about the surface and environment of these planets. Geologists will be able to build a comprehensive model of the "earth engine" that moves the continents and produces mineral deposits.
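
Scaled down to something that fits on a page, the kind of model Wilson describes amounts to stepping a set of coupled equations forward through time. Here is a minimal sketch, shown in Python purely as illustration; the equations and coefficients are invented and have nothing to do with any particular Cornell model:

    # Toy illustration: step a small system of coupled equations through time.
    # A real model would couple thousands of variables and run on a supercomputer.

    def step(state, dt):
        x, y, z = state
        # Each variable's rate of change depends on the others (coupled equations).
        dx = 0.5 * y - 0.1 * x
        dy = -0.3 * x + 0.2 * z
        dz = 0.1 * x * y - 0.05 * z
        return (x + dx * dt, y + dy * dt, z + dz * dt)

    state = (1.0, 0.0, 0.5)          # initial conditions
    for _ in range(1000):            # advance the model 1000 time steps
        state = step(state, dt=0.01)
    print(state)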

How fast does a computer have to be to qualify as a supercomputer? Like other performance standards, this one changes frequently. For a long time the industry standard was the Cray-1, but the current top-of-the-line supercomputer is the Cray X-MP/48. In the past, a supercomputer performed in the range of a few hundred megaflops (millions of floating point operations per second). The Cray X-MP/48 is capable of close to one gigaflop--a billion floating point operations per second. Wilson's goal for the Cornell computer is 40 gigaflops.
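
Those speed figures translate directly into turnaround time. The following back-of-the-envelope sketch (in Python, purely for illustration; the workload of 10^15 floating point operations is an invented figure, and the speeds are the approximate ones quoted above) makes the point:

    # Rough arithmetic: time to perform a fixed number of floating point
    # operations at the speeds mentioned in the article (figures illustrative).
    WORKLOAD = 1e15  # floating point operations (hypothetical problem size)

    for name, flops in [("older supercomputer (~300 megaflops)", 300e6),
                        ("Cray X-MP/48 (~1 gigaflop)", 1e9),
                        ("Cornell goal (40 gigaflops)", 40e9)]:
        seconds = WORKLOAD / flops
        print(f"{name}: {seconds / 3600:.1f} hours")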

The first system that will be installed at Cornell is an IBM 3084-QX mainframe connected to four Floating Point Systems scientific processors; its performance is in the range of the current Crays. A second system is expected to be installed within the next year or two. "We can't discuss it in detail, because the information is proprietary, and it is all based on very high risk development projects, so it is difficult to predict the timing," says Wilson. "But we expect it to be highly parallel with lots of processors operating simultaneously."

The Importance of Being Parallel

Parallel processing, the solution of several pieces of a problem at one time, is more than just another hardware consideration. According to Wilson, the experiments in parallel architecture will be critical in lowering the price of supercomputers and increasing their availability. "What we are trying to do," says Wilson, "is get a new generation of machines out and on the market. We are putting pressure on industry to lower entry level prices on the next generation of supercomputers to less than $100,000. Now that doesn't mean that you will get a lot for $100,000; the important thing is that people will be able to get started for that sum and then increase their computing power through upgrades rather than having to start over with a totally incompatible system."

Parallel processing plays an important role in this concept, because it allows the user to upgrade simply by adding processors. There is always a more powerful machine on the horizon, but at any step along the way, the user has a reasonable computer.
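
A minimal sketch of the idea, written in Python purely for illustration (it is not the Cornell architecture), shows how one job can be cut into pieces that several processors handle at once; adding processors simply means cutting the job into more pieces:

    # Minimal illustration of parallel processing: several workers each
    # handle one piece of the problem at the same time.
    from multiprocessing import Pool

    def partial_sum(bounds):
        lo, hi = bounds
        return sum(i * i for i in range(lo, hi))

    if __name__ == "__main__":
        n, workers = 10_000_000, 4
        chunk = n // workers
        pieces = [(i * chunk, (i + 1) * chunk) for i in range(workers)]
        with Pool(workers) as pool:      # more processors -> more pieces at once
            total = sum(pool.map(partial_sum, pieces))
        print(total)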

The two aspects of the project arousing the most interest in the computer industry are this attempt to extend parallel processing and the collaboration between Cornell and IBM, which has pledged $30 million in equipment and staff time. Supercomputers are virtually the only computers that IBM does not currently manufacture. According to Wilson, "IBM is clearly becoming very concerned about the needs of the scientific and engineering market."

Jack Kuehler, who heads IBM's large computer development efforts, says that the Cornell approach is just one option the company is exploring in supercomputer design: "Through this joint research with Cornell, we hope to gain experience with parallel processors in large scale scientific operations."

Replace Fortran?

Alongside the parallel processing work at Cornell, researchers are attempting to build a language to replace Fortran as the language of scientific computing. The problem with Fortran, according to Wilson, "is that the logical ideas that a scientist or engineer wants to express get all scrambled up in the computer program. You have to weave back and forth through the listing to figure out what is going on."

Wilson's team hopes to build a language called Gibbs that will allow scientists to express their ideas in a coherent fashion; through programs written in such a language, scientists could communicate with each other as they currently communicate through scientific papers and textbooks.
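
The article gives no sample of what a Gibbs program would look like, so the following sketch is hypothetical; it only illustrates the gap Wilson is pointing at, by writing the same center-of-mass formula (in Python, as a stand-in) first in a bookkeeping-heavy style and then in a form that reads more like the textbook equation:

    # Bookkeeping-heavy style: the physics is buried in loop indices and temporaries.
    def center_of_mass_scrambled(masses, positions):
        total_m = 0.0
        weighted = 0.0
        for i in range(len(masses)):
            total_m += masses[i]
            weighted += masses[i] * positions[i]
        return weighted / total_m

    # Style closer to the textbook formula: x_cm = sum(m_i * x_i) / sum(m_i)
    def center_of_mass(masses, positions):
        return sum(m * x for m, x in zip(masses, positions)) / sum(masses)

    print(center_of_mass([1.0, 2.0, 3.0], [0.0, 1.0, 2.0]))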

Spinoffs

Wilson expects the supercomputer grants to produce many opportunities for researchers to spin off new businesses. He cites as an example his brother, who was assigned the problem of designing a data acquisition system for an Apple computer while at Harvard. "A person working in biochemistry bought an early Apple and wanted to use it in his lab; my brother was given the task of building a device to connect the Apple to the apparatus." Having designed and built the system, Wilson's brother and some friends left Harvard and set up a company. "What is important," says Wilson, "is that they had a head start. When Electronics magazine did its first survey of data acquisition systems for personal computers, there were two companies at the top of the list, and my brother's was one of them."

That process will be repeated with the supercomputer market, Wilson thinks. "People will get involved in solving a specific problem as part of making the system work. They will then have to have the guts to use their knowledge to make a marketable product. They will have had an early look at some of the problems presented by the new technology, and they will be able to build a small company to serve a growing market. Timing is everything."

Whether the universities that have received these powerful new machines will serve as incubators for ideas that become commercially viable remains to be seen. "There is a certain infrastructure that exists around Boston and Silicon Valley that must be developed," says Wilson. In New York, William Stern, former chairman of the State Urban Development Corporation, expressed some skepticism about the ability of the new supercomputer centers to become focal points for coordinated economic development efforts. "New York has more than its share of important companies and universities," says Stern. "But we have failed to bridge the gap between research in universities and commercialization in companies. Maybe Cornell will change that."