COMPUTE! ISSUE 72 / MAY 1986 / PAGE 6

Editor's Notes

A recent book, Alan Turing: The Enigma, is a fascinating study of the life of a brilliant scientist and of the development of early "computing engines" which helped decode secret German messages and significantly contributed to the Allied victory in World War II.
    Turing worked with primitive decoders. Called Bombes because they ticked loudly, they were something like old-style adding machines, computing with gears and wheels, all mechanical in the days before the electronic revolution. In fact, much of the computation was done by hundreds of women on an assembly line:
    ...the Bombes ticked away, getting on with the work ... while the Wrens [Women's Royal Naval Service] did their appointed tasks, without knowing what any of it was for. He [Turing] was fascinated by the fact that people could be taking part in something clever, in a quite mindless way.
    Machines, and people acting like machines, had replaced a good deal of human thought, judgment, and recognition. Few knew how the system worked, and for anyone else, it was a mystic oracle, producing an unpredictable judgment. Mechanical, determinate processes were producing clever, astonishing decisions.
    Indeed, this large room of workers surrounding the Bombe suggested nothing to Turing so much as a giant machine. Here a group was mechanically adding results; over there was another crew responsible for feeding information back into the Bombe. Some people had to file information, some had to compare a template against each new pattern as it was passed down the line. We can now easily recognize that these activities are the elements of computers and software: RAM, ROM, masking, CPU, feedback loops, branching, and so forth. In those days, however, it took genius to see that the Bombe could be expanded to take over and speed up the functions of the hundreds of clerks working around it.
    U-boats were sinking ships all over the Atlantic. Turing and his associates were always working against time, trying to decode messages faster. Eventually, they began to experiment with ways to store information electronically. It's intriguing to read of their efforts to hold onto a few bits of information for a brief time. One of the best solutions they came up with was to store the bits in a cathode ray tube, an early TV screen. This had the advantage that you could amuse yourself by watching the bits flickering while they briefly rested until needed again by the central processor.
    But Turing's most famous contributions to computing are the related concepts we now call the Turing machine and the Turing test. His idea of the machine shows that he was the first to comprehensively grasp the possibility of artificial intelligence. He imagined a universal machine, one that could perform the job of all the other, more specialized, machines. Adding machines operated according to fixed rules which were reflected in their metal cogs and gears. The Bombe, too, performed its job because its mechanism was physically shaped in certain ways.
    Turing thought of "tapes" which could contain instructions describing the "state of mind" of the adding machine, the Bombe, or any other calculating engine, including human "computers." A tape could be fed into a supermachine, which would then adapt to the state of mind, the description of some other machine, contained on the tape. In this way, the supermachine could "perform the equivalent of human mental activity. A single machine to replace the human computer! An electric brain!"
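Turing's scheme translates directly into modern terms: a generic interpreter plus a rule table that describes some particular machine. Here is a minimal sketch in Python (the rule table, state names, and the binary-increment example are illustrative, not drawn from the book):

```python
def run_turing_machine(rules, tape, state="start", head=0, halt="halt",
                       max_steps=10_000):
    """Generic interpreter: the 'supermachine' reads a rule table
    describing some other machine, then behaves as that machine."""
    cells = dict(enumerate(tape))          # sparse tape; blank cells read " "
    for _ in range(max_steps):
        if state == halt:
            break
        symbol = cells.get(head, " ")
        write, move, state = rules[(state, symbol)]
        cells[head] = write
        head += {"L": -1, "R": 1}[move]
    lo, hi = min(cells), max(cells)
    return "".join(cells.get(i, " ") for i in range(lo, hi + 1)).strip()

# Rule table for one specialized machine: add 1 to a binary number.
# (state, symbol read) -> (symbol to write, head move, next state)
INCREMENT = {
    ("start", "0"): ("0", "R", "start"),   # scan right to the end
    ("start", "1"): ("1", "R", "start"),
    ("start", " "): (" ", "L", "carry"),   # past the last digit; turn back
    ("carry", "1"): ("0", "L", "carry"),   # 1 + 1 = 0, carry the 1
    ("carry", "0"): ("1", "L", "halt"),    # absorb the carry
    ("carry", " "): ("1", "L", "halt"),    # overflow into a new digit
}

print(run_turing_machine(INCREMENT, "1011"))   # 1011 + 1 -> prints 1100
```

Swap in a different rule table and the same interpreter becomes a different machine, which is exactly the point of the universal machine.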
    By the 1950s, Turing had fully formulated another startling concept: How can you tell if a machine is truly thinking? The Turing test is deceptively simple: If a questioner cannot tell the difference between written answers from two intelligences, then, for any practical purpose, there is no difference between the intelligences.
    He imagined a game in which an interrogator would have to decide, on the basis of written replies alone, which of two people in another room was a man and which a woman.... They would alike be making claims such as "I am the woman, don't listen to him!" ... A successful imitation of a woman's responses by a man would not prove anything. Gender depended on facts which were not reducible to sequences of symbols. In contrast, he wished to argue that such an imitation principle did apply to "thinking" or "intelligence." If a computer, on the basis of its written replies to questions, could not be distinguished from a human respondent, then "fair play" would oblige one to say that it must be "thinking."
    ... he produced an argument in favor of adopting the imitation principle as a criterion. This was that there was no way of telling that other people were "thinking" or "conscious" except by a process of comparison with oneself, and he saw no reason to treat computers any differently.
    Turing expected a machine to pass his test around the end of this century:
    I believe that in about fifty years' time it will be possible to programme computers, with a storage capacity of about 10⁹, to make them play the imitation game so well that an average interrogator will not have more than 70 per cent chance of making the right identification after five minutes of questioning. The original question, "can machines think?" I believe to be too meaningless to deserve discussion. Nevertheless I believe that at the end of the century the use of words and general educated opinion will have altered so much that one will be able to speak of machines thinking without expecting to be contradicted.
    Ten to the ninth power is 122,070K, or 119 megabytes. (Turing's 10⁹ represents bits.) One hundred and nineteen megabytes is not an uncommon storage capacity nowadays; we've got more than that here at COMPUTE! in the hard disks servicing our minicomputer editing system. Yet our system would never pass the Turing test.
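The conversion behind those figures is easy to check, assuming 8 bits per byte and 1024-based kilobytes and megabytes:

```python
bits = 10 ** 9            # Turing's storage estimate, in bits
bytes_ = bits // 8        # 8 bits per byte
kilobytes = bytes_ / 1024
megabytes = kilobytes / 1024
print(f"{kilobytes:,.0f}K = {megabytes:.0f} megabytes")
# -> 122,070K = 119 megabytes
```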
    Nonetheless, Turing's machine, his test, and his other ideas continue to have enormous impact, and Alan Turing: The Enigma is a lively, understandable portrait of a major thinker's life and ideas. If you're curious about where computers came from and where they're likely to go from here, you'll enjoy this book very much indeed.

Richard Mansfield
Senior Editor