Computing history: a personal and industry view. C. Gordon Bell.
Each time I invest "in the past," it has a future payoff.
The first time that I invested in learning was in 1966. I went on leave from DEC to become a professor at Carnegie Tech to learn about computer science. Even though I had already helped develop the first minicomputer (the PDP-8) and the first commercial timesharing system (the PDP-6), industry was unconcerned about the "science" of computing.
In 1967, Allen Newell, Alan Perlis, and Herbert Simon wrote a letter to Science that identified computer science. The next steps, what we now call "the third and fourth generations," weren't at all clear. At Carnegie, Allen Newell and I collected materials and objects from past machines to build theories. This resulted in a book entitled Computer Structures that influenced at least two generations of computer architects. The concepts of the DEC Unibus and general registers came from this work. Other Carnegie alumni extended and implemented these ideas in subsequent (and future) DEC computers. These developments came from a deep knowledge of past computers and how they were used, and gave some insight into the future trajectory of computer evolution.
In 1972, when I returned to Digital, the third generation of computing, based on the integrated circuit, was in full swing. While DEC had been almost alone in building minis in the second generation, the IC had lowered the technological barriers to building computers. A company had only to understand packaging, logic design, peripheral interface design, and construction of software components to start in the business. By 1970, about 100 companies had formed or attempted designs; seven really succeeded, about 20 are still trying, and a whole flock are no longer with us, including American Computer, Atron, BIT, and Viatron. With some understanding of the historic generational patterns, my goal was to get DEC into large-scale integrated circuits and to establish the VAX line as a new standard. And ironically, this year, 20 years after the first PDP-8 was built, sales of the machine are higher than ever, since it is now implemented on a single chip and embedded in a word processor.
Now one of my goals is to consider not just the development of a single company, but of the entire industry--and not just architecture but programmers and users. To this end, I have been part of establishing The Computer Museum for everyone. The Museum came about through the generous sponsorship of DEC and Ken Olsen.

The Role of the Computer Museum
Opened in Boston in 1984, The Computer Museum has on display the first interactive space game, Spacewar!; the first personal computer, the LINC; and the first mail-order home-built machine, the Altair. The Computer Museum is designed to help visitors understand the evolution of computing. Computer generations, marking technological time, are the main organizing principle. The new technologies, startup companies, and new products of each generation are listed and displayed.
Every time I visit the museum, I get insight relevant to a current problem. A month ago, while looking at the Honeywell 116, a very early IC minicomputer, and comparing it with Data General's first Nova, ideas about board size, pins, and function jelled. I also observed that nearly all of the micros repeated, for the third time, the time-worn memory-management evolution path that began with the Manchester University Atlas in the early 60's, which we followed with the DECsystem-10 in the late 60's, and then again with minis in the mid-70's. IBM's path was about the same with the 360/370 evolution and its minis.
The Computer Museum is not just for me and my engineer friends; a dozen high school students came to an esoteric lecture on coding in the 1930's given by Donald Davies of England's National Physical Laboratory. Asked if they got anything from it, they replied that they were going to use some of the ideas in setting secure codes for their school computer.
When I toured the Science Museum in London with my British friends, they often recounted anecdotes of how the exhibitions turned people on to science and technology. Now I see the same thing at The Computer Museum: bright kids and curious adults have a place where they can learn how computers got to be the way they are today.
Until The Computer Museum was established, there was no place where the objects, films, and programs of the past were collected. The Computer Museum provides this for the present and future generations of engineers, programmers, artists, and hackers who will make history.

A View to the Future
When I use the Museum to review the past, just as I did at Carnegie in the 60's, a view of the future evolutionary path of computing emerges.
The current computer industry is stratified by level of integration and completely fragmented by product, offering the ultimate in entrepreneurism. Dozens of complete industries have been formed within a half dozen strata:
* Chips: microcomputers, peripherals, memories.
* Electromechanicals: power supplies, disks, I/O, enclosures.
* Operating systems: communications, database access, human I/O.
* Languages (e.g., dozens of assemblers for a given micro), fourth-generation languages.
* Generic applications: word processing, spreadsheets.
* Professional/Discipline applications: general business.
This new technology permits many more new computer structures than ever before, including:
* All types of desktop terminals and phones.
* Portable and desktop personal computers, workstations and shared computers.
* Supermicros, which replace minis and mainframes while providing increased reliability and performance through replication.
* Hybrid computer-telephony-based computers and switches.
With the vast supply of venture capital, all you need to establish a company is a computer with a word processor and spreadsheet. A perpetual motion machine for creating companies can be expressed in a Pascal-like way:

    procedure VENTURE_ENTREPRENEUR_CYCLE;
    begin
      while greed and not fear do
      begin
        write business plan;
        get venture funds;
        exit job;
        start new company;
        build product;
        sell product;
        sell company;  {for 100 times sales}
        venture_funds := liquidity
      end
    end
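As a runnable aside, the cycle can be sketched in Python. The 100-times-sales multiple comes from the pseudocode; the starting funds, the per-company sales figure, and the fixed round count are made-up parameters for illustration only:

```python
def venture_entrepreneur_cycle(venture_funds, rounds, sales_per_company=10):
    """Run the perpetual-motion company machine for a fixed number of rounds.

    Returns the final pool of venture funds and the number of companies
    started. 'greed' and 'fear' are modeled as constants, so only the
    round limit stops the loop.
    """
    greed, fear = True, False
    companies_started = 0
    while greed and not fear and rounds > 0:
        # write business plan; get venture funds; exit job; start new company
        companies_started += 1
        # build product; sell product; sell company (for 100 times sales)
        liquidity = 100 * sales_per_company
        venture_funds = liquidity  # venture_funds := liquidity
        rounds -= 1
    return venture_funds, companies_started

funds, started = venture_entrepreneur_cycle(venture_funds=1, rounds=3)
```

The point the pseudocode makes survives the translation: nothing in the loop body ever sets "fear," so the machine runs until something outside the model stops it.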
The restructuring of the industry is good for individuals who both take the risks and create new products. But on a national scale I have four concerns about this restructuring.
First, the value of the companies appears to be far larger than any potential market. At the beginning of 1984, 123 workstation companies had a combined valuation of tens of billions of dollars against a total market of less than $10 billion. At most, there may be room for a dozen first-rate companies.
Second, while the cycle creates some innovations in computing, most of the products do not improve productivity. The "me too," less costly solutions really cost the user when the company fails and the user is forced to convert the software to that of a reliable supplier.
Third, the U.S. industry is robbed of a critical engineering resource by constant churning. At a time when we need massive resources to compete with the invasion of Japanese computers of every size and type, most energy is going into replicating trivial products.
And fourth, successful software companies all appear to dissipate their own entrepreneurial energy. They amass vast programmer staffs merely to evolve their single founding product. Having a large staff of programmers to push a spreadsheet through new versions is like having Ernest Hemingway hire a team of writers to write Hemingway novels. Much programming may be best done as a cottage industry.
The answer to my original question about investing in the future versus the past now becomes evident. Quality investments that benefit both individuals and society need a long-term vision based firmly on knowledge of our past.
The Computer Museum is more than the industry's attic: it provides a resource for observing major patterns and a forum for learning from the all-time great designs and people. I'm sure that you, like me, were strongly influenced by your first computer and have a story about how it affected your future choices. Now, with The Computer Museum, there is a new opportunity to amass these "stories" into the history of computing.