Classic Computer Magazine Archive START VOL. 4 NO. 7 / FEBRUARY 1990




In the last two issues, we presented the first installments of Dave Small's UNIX tour. Part I covered the basics of UNIX file structure and several important UNIX concepts. Part II covered the concept of the Root and some oddly-named UNIX commands. In this issue, Dave looks at the concept of pipes in UNIX and its wonderful ability to easily redirect output.

One of the nicest and most powerful features of UNIX is redirection or piping--where you take the output of a command, such as a directory listing from ls, and feed it someplace other than your terminal.

Say you're doing a BASIC compile for GFA BASIC on the TT (of course, this is all future tense) and you want error messages to go to a file instead of to the screen where you'd just have to write them down.

You'd do: GFACOMPILE > gfaoutput and GFACOMPILE would prompt you for the file name to be compiled, since it doesn't know. Instead of the listing going to your screen, it's written to the file "gfaoutput", which is created on the spot.
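Since GFACOMPILE is still future tense, here's a minimal sketch of the same trick using the real ls command (the file and directory names are my own invention):

```shell
# Make a scratch directory with a few files in it.
mkdir -p /tmp/redir_demo
cd /tmp/redir_demo
touch alpha beta gamma

# Instead of the listing going to your terminal, it's written to
# the file "listing", which is created on the spot.
ls alpha beta gamma > listing

# Prove it landed there.
cat listing
```

The `>` works the same way for any command; the command itself never knows (or cares) that its output went to a file instead of the screen.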

But still better, we can redirect input, too: GFACOMPILE < MYPROGRAM > gfaoutput takes MYPROGRAM as the "standard input" and writes the output to gfaoutput.

If you get lost on the < and >, remember that they point in the direction of data flow. Easy enough? In the above example, it flows out of MYPROGRAM into GFACOMPILE and out of GFACOMPILE into gfaoutput.
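Here's the same in-one-side, out-the-other flow with a real command standing in for the hypothetical GFACOMPILE (wc -w counts words; the file names are just examples):

```shell
# Build an input file to feed the command.
printf 'one two three\nfour five\n' > /tmp/words.in

# Data flows out of words.in, through wc, and into words.out.
wc -w < /tmp/words.in > /tmp/words.out

cat /tmp/words.out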

(Admittedly, it would be more clear if it looked like this: MYPROGRAM > GFACOMPILE > gfaoutput, but UNIX needs the command name first, so we're stuck with it this way.)

Okay, let's RUN your program, say it's a word counter for your word-processor files. You need to give it an input file and tell it where you'd like the output displayed. Furthermore, let's say your program encounters a Major Error and needs to generate an error message (such as, say, you try to run a word count on a nonexistent or a binary file).

Ordinarily, you would just tell WORDCOUNT the name of your file and it would display the word count, like this:

#WORDCOUNT < mytext
2333 words. (generated by the program)

But since we can redirect standard input, standard output and error output, we can do this, too:

WORDCOUNT < inputfile > outputfile2 > errorfile will READ from inputfile, write the total number of words to "outputfile" and send any error messages (the "2 >") to "errorfile."
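WORDCOUNT is hypothetical, but you can watch all three streams with a real command: ask ls about one file that exists and one that doesn't, and it produces both normal output and an error message.

```shell
cd /tmp
touch exists.txt
rm -f missing.txt

# Standard output goes to out.log; error output (the "2>") goes to
# err.log. The "|| true" just keeps the script going, since ls
# exits with an error code when a file is missing.
ls exists.txt missing.txt > out.log 2> err.log || true

cat out.log    # the good news about exists.txt
cat err.log    # the complaint about missing.txt
```

Notice that nothing hit the terminal directly; each stream went exactly where we pointed it.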

Now, let's say you have a file on which you want to perform several operations. As an example, we want to take the file, use STRIPWS to strip out any "high parity" characters that WordStar might have put into it, feed it into SUPERWRITE, take the PostScript output from that and laser print it using MajorScript (these are all future-tense programs, of course).

We could do things one step at a time:

#STRIPWS < myfile.txt > myfile.stripped
#SUPERWRITE < myfile.stripped > myfile.postscript
#MAJORSCRIPT < myfile.postscript > laserfile
#lp < laserfile (print the laser file)

Or, we could just "pipe" them together. We pipe with the " | " character (not a colon!), that character on your ST keyboard that you've been wondering about. This automatically takes the output from one thing and feeds it into another. So it becomes:

#STRIPWS < myfile.txt | SUPERWRITE | MAJORSCRIPT | lp


This does it all in one step. Hence, when you're running UNIX, you have incredible power over redirecting where everything goes. You can hook together many, many different operations to get your particular job finished.
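STRIPWS, SUPERWRITE and MajorScript are future-tense, but here's a real pipeline in exactly the same spirit--each program's output feeds the next one's input:

```shell
printf 'pear\napple\npear\nplum\n' > /tmp/fruit

# Three programs glued together in one step: sort the lines,
# squeeze out duplicate neighbors with uniq, then count what's left.
sort < /tmp/fruit | uniq | wc -l
```

No intermediate files to name, create and clean up--the pipe handles all the plumbing.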

UNIX Philosophy

This leads us to what has become the UNIX Philosophy, the way things are done on UNIX--and the way you'll pretty much have to do them, because that's the way the tools you'll be using are laid out.

There aren't any 500-function programs in UNIX. Forget the dropdown menus and whatnot.

There are lots and lots of tiny little programs that are really good and efficient at doing one thing. It's up to you to redirect, pipe, script, and otherwise "glue" them together to get what you want done.

This has mixed results. On the one hand, it's really powerful. You can dream up mixtures of commands to do nearly anything, particularly to text, since UNIX is so text-oriented. (Hopefully, graphics utilities for UNIX will start to catch up soon.)

For example, I can take a list of my files, prune out the ones that don't matter, mail the list to Amy so that she sees them next time she's on, go through the list, spellcheck and print each one out for final hand-editing, and so on--all in one command.

Remember how ls gave us only a pathetic list of file names, with nothing else? That's because ls is designed to feed other programs with just that list--and other programs don't want that extraneous junk like lengths, file type and so forth in the listing they receive. Just the names, ma'am.
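You can see why bare names are a feature, not a bug, by feeding ls straight into other programs (the directory and file names here are made up for the demonstration):

```shell
mkdir -p /tmp/lsdemo
cd /tmp/lsdemo
touch notes.txt memo.txt chart.dat

# ls emits nothing but names, so grep can filter the list and
# wc can count the survivors--no extraneous junk to strip out first.
ls | grep '\.txt$' | wc -l
```

If ls insisted on printing lengths and dates, every program downstream would have to know how to peel them off again.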

On the down side of this, you have a million little utilities to learn. All of them have options you must learn, too. The ls utility, for example, lists files in the current subdirectory; mv moves files; cp copies a file; cpio moves a whole bunch of files, subdirectories included; and so forth.

It's with some regret that I have to tell you that you're never going to get away from the manuals for these commands. There are simply too many for any human to learn all of them--and all their options. UNIX programmers themselves got so sick of leafing through books that they added the man command (manual lookup); this prints out the official AT&T documentation page for a given command onscreen, right there at your terminal. For example, if you've forgotten how to make ls list out file lengths, do

#man ls

and you'll soon learn. And, of course, you can take that output, save it, maybe send a copy to the printer.

Now, of course, I'm assuming that Atari will make a real UNIX system and put MAN and the manuals on the disk. The UNIX PC I'm using didn't do that; the designers figured they didn't have room. (They figured right, back in the era when 10MB hard disks were expensive instead of being doorstops, like now.)

Even with 40MBs, there's barely room to maneuver, and I still don't have the manual pages loaded, a year later--and do I have some tattered manuals by now!

Which brings me to something you've felt creeping up on you. Hundreds of commands and subdirectories, huh?

Just How Much Space Does All This Need?

At this point, I'll tell you: 40MBs at least--and believe me, as a 40MB UNIX box owner, you'll want more. Apple ships their A/UX on an 80MB hard disk! 100MBs is at least fairly okay and might work well enough on a system with just a few users; it's comfortable for me as the only user. I'm not kidding! Sure is a good thing hard-drive prices have nosedived, isn't it? Only recently has the kind of size required by UNIX to work well become affordable.

It takes around 20MBs to store a reasonable UNIX system and some of the many UNIX utilities. Add another ten for extensions you want. X-windows (which we'll get to next issue) takes up megabytes of space, particularly the source code. And then add the space you want for your programs, over and above the operating system--you can see why I say 40 is a minimum.

As of this point, Atari has not revealed what they think is a minimum, nor what drive options will be supported. In terms of main memory, you want all you can get. UNIX gets more efficient when it can keep things in memory, instead of "swapping" them to disk from time to time. Probably 2MBs of RAM is an absolute minimum. I have an 8MB UNIX machine that uses up four megs just idling and another 3MBs when I bring up X-windows! 7MBs gone--and I haven't even started up an application!

Atari lists the TT as beginning with 2MBs, but expandable. Let's hope so!

Contributing Editor Dave Small is one of a small circle of ST gurus who have helped to make the ST as popular as it is. Dave has been a pioneer in developing Macintosh emulation on the ST, culminating with the release of his latest triumph, Spectre GCR.