Classic Computer Magazine Archive COMPUTE! ISSUE 55 / DECEMBER 1984 / PAGE 110


Tom R. Halfhill, Editor

Learning To Program

Too many people who first begin using a computer are overwhelmed at the idea of learning how to program. It's hard to blame them. For years people have been led to believe that programming is an obscure and extremely difficult task, something best left to scientists, mathematicians, and technicians. Like nuclear physics, it was supposed to be far beyond the reach (and interests) of ordinary people.

By now we should know better. Not only have thousands of everyday people learned how to program, but some of the best programmers have turned out to be people who are too young to vote or even drive a car. Millions of grade-school children are pecking away at computer keyboards and programming while they're still learning the traditional three R's.

So if little kids can program, what's to stop anyone else?

Some people fear they can't learn to program because they've always been bad at math. But actually, programming has little to do with higher mathematics—unless, of course, you want to write programs that employ higher mathematics. For the most part, plain old addition, subtraction, multiplication, and division are all you'll need to know. You can write a program which calculates mortgage payments even if you can't tell trigonometry from a tyrannosaur.
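To make that concrete, here is a minimal sketch of such a mortgage calculation in BASIC. (The standard amortization formula and the sample figures are our own illustration, not part of the article; the ^ symbol raises a number to a power, and on some machines it appears as an up arrow.)

10 P = 60000: REM LOAN PRINCIPAL IN DOLLARS
20 R = .13 / 12: REM MONTHLY INTEREST RATE (13 PERCENT A YEAR)
30 N = 360: REM NUMBER OF MONTHLY PAYMENTS (30 YEARS)
40 M = P * R / (1 - (1 + R) ^ (-N))
50 PRINT "MONTHLY PAYMENT IS "; M

Nothing here goes beyond the four arithmetic operations plus exponentiation; the formula does all the heavy lifting.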

Other people are discouraged by the complexity of learning a computer programming language. Yet, computer languages—such as BASIC, Logo, Pascal, FORTRAN, or even machine language—are far easier to tackle than human languages. All human languages have vocabularies consisting of tens of thousands of words, plus thousands more variations of words. And the grammatical rules for putting those words together into meaningful phrases are tricky and complicated. But practically all computer languages have vocabularies of fewer than 100 words, often closer to 50. Only about half of those words are used in everyday programming, and the rules of syntax are more rigidly defined. What's more, if you inadvertently break the rules, the computer tells you so and even gives you a clue about the nature of your error. (If only it were that easy to learn how to conjugate irregular verbs in French!)

Still, many people have a hard time with programming. Part of the problem may be that they're spending too much time learning all the commands and syntax rules instead of figuring out how to solve the problem they're working on. This is like learning by rote the vocabulary words of a foreign language without actually linking them together into sentences to express your thoughts. It's fairly easy to learn what the GOTO command does in BASIC, for example, but figuring out when to use it may be less obvious.

That's why many programming instructors favor a different approach to learning how to program—a problem-solving or algorithm-based approach rather than a language-based approach. In other words, once you learn the basic ways of solving problems on a computer, you just apply the vocabulary and syntactical rules of whatever language you're using and write your program.

In practice, it's a little more difficult than that—some languages are structured quite differently than others in order to make them more suitable for certain tasks, or to reflect a certain philosophy (the nearly GOTO-less structure of Pascal, for instance). But the basic approach holds true. Once you know how to solve problems in one computer language, it's relatively easy to apply your knowledge to other languages. The key is to learn the basics of problem-solving on a computer.

A Computer In Your Mind

To a large degree, your skill at programming depends on how well you can learn to think like a computer yourself. This might sound strange, but there's nothing hard about it at all. At their present state of technology, computers are rather simple "thinkers." They only seem so smart sometimes because they perform their simple thinking so rapidly—much faster than we mere humans.

However, any computer program—no matter how sophisticated it appears when it's running—is essentially just a list of instructions. The computer follows the instructions one at a time, in the order specified by the programmer. If you, a human, performed these same instructions in the same order, your results would be the same as the computer's (although it would probably take you longer, of course). There's nothing theoretical about this, because that's exactly how the programmer wrote the program. The programmer started out by defining the problem, conceiving a way of solving the problem, and then giving the computer a list of step-by-step instructions so it could find the solution.

Notice that only the third step involves actually programming the computer. Although many people think it's the major step, it might actually be a minor part of the process. The first two steps often demand the most skill and creativity. In fact, major software developers these days often employ teams of "programmers." The senior members of the team concentrate on defining the problem and constructing a method of finding the solution. Then they assign the task of coding the instructions in a computer language to the junior programmers. The senior programmers, or program designers, may never touch a computer keyboard.

Whether a team is involved or only one programmer, the process is the same. You can't program a computer to solve a problem until you first know how to solve it yourself. Not that you have to actually arrive at the solution—that's the computer's job. Your job is to encode the method of finding the solution into instructions the computer can understand and carry out. And to do that, you have to comprehend how the computer will interpret each instruction you give it before going on to the next instruction. You have to learn how to think like the computer.

How Computers Think

As we said above, learning to think like a computer isn't really very hard because computers right now are pretty simple-minded thinkers. They always think logically and sequentially. On their own, they aren't capable of illogical thinking, emotion, or leaps of insight. The fact is, they're utterly predictable. Even their randomness is the product of carefully simulated disorder. Their behavior is a lot easier to figure out than that of most people, which is why some obsessive programmers withdraw from the world and spend all their time programming.

Let's try an example. Assume you're a schoolteacher who wants to calculate a student's grade based on five test scores.

The first step is to define the problem. That seems easy: You just want to figure out a letter grade based on five numeric scores. But do all the scores carry the same weight? Were some tests more important than others? And how many points will it take to earn an A instead of a B?

To keep things simple for this example, let's say all the scores carry the same weight. Therefore, you need to calculate the mean average of the five scores. To translate the result into a letter grade, you'll use the following scale: 95–100 points is an A, 85–94 points is a B, 75–84 points is a C, 65–74 points is a D, and 0–64 points is an F.

Now that you've defined the problem, the second step is to figure out how to find the solution. Some people, especially when first learning how to program, work this out on paper before sitting down at the computer. There's even a formal way of doing this, called flow charting. It's similar to diagramming a sentence in English, except the object of flow charting is to figure out how to construct the program in the first place rather than analyzing the structure of an existing program.

We won't get into formal flow charting here, but we can do the same thing by drawing up a simple outline. Here's how we might tackle our sample problem:

  1. Calculate the mean average of the five test scores.
    1. Add the five scores together and remember the sum.
      1. Add the first test score to the second test score.
      2. Add the result of the previous calculation to the third score.
      3. Add the result of the previous calculation to the fourth score.
      4. Add the result of the previous calculation to the fifth score.
      5. Store the final sum for later use.
    2. Divide the sum by the number of test scores.
      1. Take the sum of the scores as calculated above and divide it by five.
      2. Store this result, the mean average, for later use.
  2. Translate the average score into a letter grade.
    1. Take the average score as calculated above and compare it to the grading scale.
      1. Is the score somewhere between 95 and 100? If so, then the grade is an A.
      2. Is the score between 85 and 94? If so, then the grade is a B.
      3. Is the score between 75 and 84? If so, then the grade is a C.
      4. Is the score between 65 and 74? If so, then the grade is a D.
      5. Is the score less than 65? If so, then the grade is an F.
    2. Give the result of the calculations by revealing the final letter grade.

Writing The Code

Whether you realize it or not, we've actually written a program. We've compiled a list of step-by-step instructions which, if followed exactly, will yield the solution to our problem. You could take this list and solve the problem yourself, right now, with pencil and paper or a pocket calculator. The only thing that's required besides the list is some knowledge of simple addition and division, plus the actual data (the test scores). You've already done the hard part; you've concocted the recipe. Now the problem can be solved by anyone who's capable of following instructions and handling sixth-grade arithmetic, whether he's a genius or an idiot.

In this case we'll submit the problem to an idiot—the computer. You don't have to worry about the computer jumping to an illogical conclusion or arriving at a wrong answer. As long as you do your job—give the right instructions to the computer in the proper order and in a language it can understand—the computer will do exactly what you say. It's not smart enough to disobey or come up with its own solution to the problem. It can't appear to be any more intelligent than its programmer.

At this point you could encode the instructions—that is, write the actual program—in any one of dozens of computer languages. BASIC, Pascal, PILOT, Logo, FORTRAN, machine language—the results will be the same. Which one should you choose? The decision is based on a number of factors: which language is best-suited to this type of problem; which language will give the fastest results; which language is easier to use; which language is readily available for your computer; and so on.

Since virtually all personal computers have some form of BASIC built in, we'll write the sample code in BASIC. But it's important to realize that the program could be written just about as well in any other computer language.

Now let's see how the program might look. Keep in mind that this is a generalized example; because of variations between the BASICs built into various computers, it may require modifications to run on your particular computer (see the notes following the listing). Also, we'll explain the meaning of some special symbols and terms at the end of the listing. Comments explaining sections of the program appear in brackets.

[Store the five test scores in variables.]

10 TEST1 = 84: TEST2 = 76: TEST3 = 92: TEST4 = 88: TEST5 = 68

[Add the test scores together and store the sum in a variable.]

60 TESTSUM = TEST1 + TEST2 + TEST3 + TEST4 + TEST5

[Find the mean average by dividing the sum by the number of test scores.]

70 AVERAGE = TESTSUM/5

[Compare the average score to the grading scale to translate it into a letter grade.]

80 IF AVERAGE >= 95 AND AVERAGE <= 100 THEN GRADE$ = "A"
90 IF AVERAGE >= 85 AND AVERAGE <= 94 THEN GRADE$ = "B"
100 IF AVERAGE >= 75 AND AVERAGE <= 84 THEN GRADE$ = "C"
110 IF AVERAGE >= 65 AND AVERAGE <= 74 THEN GRADE$ = "D"
120 IF AVERAGE < 65 THEN GRADE$ = "F"

[Tell the result of running the program—the student's final letter grade.]

130 PRINT "THE STUDENT'S GRADE IS "; GRADE$
Analyzing The Program

If you compare the outline we prepared with the program listing, you'll see how closely they correspond. They're both linear and logical. The hard work, indeed, was in defining the problem and designing the method of solution. The actual coding or programming was almost an anticlimax. Even if you've never programmed in BASIC, you should be able to deduce what the program is doing by consulting a BASIC programming manual. To save you some time, here's what some of the special symbols and terms mean:

A variable is a way of storing a number in a program. The statement TEST1 = 84 assigns the number 84 to the variable TEST1. In effect, the variable becomes the number. The rules for using variables differ on various computers; on Commodore and Apple computers, for example, only the first two letters of a variable matter, so the computer couldn't distinguish TEST1 from TEST2. (Try T1 and T2 instead.)
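To see why this matters, try a short experiment like the following on one of those machines (the variable names here are our own example):

10 TESTSUM = 408
20 TESTAVG = 81.6
30 PRINT TESTSUM

Because only the first two characters count, TESTSUM and TESTAVG are both stored under the name TE; line 20 quietly overwrites the value from line 10, so on those machines line 30 prints 81.6 instead of 408. Renaming the variables TS and TA keeps them separate.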

Variables that end with a dollar sign ($) are string variables. Instead of storing numbers, they store strings of characters. In this program, we used GRADE$ to store the character of the letter grade (A, B, C, D, or F). Some forms of BASIC, such as Atari BASIC, require you to define the maximum number of characters a string variable will hold before using the string variable, so you'd need to add a statement like 15 DIM GRADE$(1).

In BASIC, the arithmetic operators are + for addition, - for subtraction, * for multiplication, and / for division. Thus, the statement AVERAGE = TESTSUM/5 in line 70 divides the variable TESTSUM by 5 and assigns the answer to the variable AVERAGE.

In BASIC, the symbol <= means less than or equal to and the symbol >= means greater than or equal to. Therefore, a statement like IF AVERAGE >= 75 AND AVERAGE <= 84 THEN GRADE$ = "C" in line 100 means, "If the average test score is between 75 and 84, then the letter grade is a C." In line 120, rather than checking to see if the average score falls between 0 and 64, the program just assigns an F if the number is anything less than 65.

Line 130 tells us the result by printing the answer on the screen. With the sample scores in line 10, the average works out to 408/5, or 81.6, so the program prints THE STUDENT'S GRADE IS C.

As you can see, the program structure is pretty straightforward. Certainly more complex problems demand more complex programming. But trying to learn how to program just by memorizing all the commands in a language is like learning how to speak French just by memorizing vocabulary words. You won't become fluent until you actually begin linking the words together to express thoughts—the very purpose of a human language. And you won't become a fluent programmer until you start designing solutions to problems and expressing the solutions in programming commands—the purpose of a computer language.

Your programming manual is just a dictionary of instructions, and your computer is just a machine which can execute those instructions faster than you can. The real computer is in your brain.

Questions Beginners Ask

Q I've seen the phrase "full-screen editing" in advertisements, but I'm not sure what it means. Does it have something to do with word processing? Is this considered a valuable feature?

A Full-screen editing is indeed a valuable feature, and it's becoming standard on virtually all computers designed within the last few years. Although it applies to word processing, the term "full-screen editing" as used in advertisements usually refers to the editing features available in BASIC.

Very simply, full-screen editing means you can move a cursor anywhere on the screen with four directional cursor keys, make a change to a line of BASIC with insert and delete/backspace keys, and press the RETURN or ENTER key to register your change with the computer. This is an easy and fast way to edit BASIC programs. Computers which have full-screen editing include all Commodores, Ataris, and IBM Personal Computers.

Although computers which lack full-screen editing usually let you make changes to BASIC lines without retyping them entirely, the process is a little more tedious. Often you have to memorize special editing commands and key sequences. Sometimes, however, utility programs are available which enhance the computer's built-in editing capabilities.