I was unable to attend this summer's Consumer Electronics Show, and in deference to its importance, asked Selby Bateman, our Features Editor, to contribute a guest editorial.
Robert C. Lock
Editor In Chief
The old Chinese curse "May you live in interesting times" often seems to have been aimed directly at our present high-tech, microprocessor-based era.
At least that may have been the feeling for many of the 98,271 attendees who shuffled and stared their way through June's four-day Consumer Electronics Show in Chicago. More than 50,000 electronics retailers and over 2000 members of the press were among that number, each of them trying to comprehend the overwhelming quantity of new products being offered to the American and world markets.
Almost 1400 different exhibitors filled 811,000 square feet of space, displaying the latest stereo TV receivers, new-generation digital audio disc players, cellular telephones, color televisions that fit in the palm of your hand, videocassettes, car stereos, and—of course—computers, software, and hardware peripherals.
One of the clearest trends evident at CES was that computers are becoming linked more closely with almost every other consumer electronics product exhibited. In the not too distant future, the fairly clear-cut lines between computers, stereos, telephones, video systems, and many other products will disappear. This will become even more apparent by the beginning of 1985, with the arrival in quantity here of new MSX operating system micros from Japan.
One example of this trend: Atari chairman James Morgan, in his efforts to bring his company into the future with brighter prospects, emphasizes that Atari's goal isn't just to produce computers, but to "enhance consumers' lives through interactive electronics." That sentiment is being echoed in different words by many other electronics manufacturers. They see their products getting "smarter," as everything from washing machines to automobiles begins to carry microprocessors.
Interesting changes in microcomputer hardware and software were everywhere at CES. While the great majority of the public attempts to understand microcomputer developments that are essentially several years old, the industry charges forward at a gallop. Even for those who stay abreast of the latest news from the high-tech front lines, the power and the pace of change in this industry are often bewildering.
How can an individual learn about and digest all of the innovations, new products, changing technologies, and scattered trends that take place in the computer and electronics field on a daily basis? More importantly, how can those changes be understood, wisely interpreted, and selectively used?
Although we're biased on the subject, it seems obvious that those who have found an interest in—sometimes a passion for—our remarkable computer revolution may be in a better position to understand and take advantage of what Eric Hoffer called the wrenching "ordeal of change."
One model for us is the subject of this month's COMPUTE! Interview, physicist Gerard O'Neill of Princeton. Throughout his career as a scientist, writer, lecturer, and entrepreneur, O'Neill has consistently blended an ability to understand society's changes with a clear vision of how things can and should work. His books and his interests reflect a mix of the hard sciences, human values, visionary ideas, and an unquenchable, optimistic curiosity.
His interests are eclectic—from developing colonies in space to piloting glider planes to researching high-energy physics to working with his Apple II+ computer. Perhaps it is O'Neill's curiosity and his practical optimism which are fundamental to his highly successful approach to the whirlwind of technological change. Importantly, those seem to be characteristics which our readers and many of those who are intrigued by computing appear to have in abundance.
Is it really a curse or a blessing to live in interesting times? Samuel Clemens once remarked that anyone who has held a bull by the tail knows five or six things more than someone who hasn't. So enjoy the mixed blessings of the microcomputer revolution, and the fact that you know five or six things more than you did before.