Part I: Atari Video Graphics And The New GTIA
In this, the first of a two-part series on the inner workings of Atari Graphics, the author reviews the computer's system of screen management and defines several important terms including color clock, playfield, mode line, and display list. Next month, the article concludes with techniques for using color indirection, a powerful graphics tool, and explores the new GTIA chip in detail. This new chip costs nothing if your Atari is still under warranty. If you have an older machine, the nearest authorized service center should be stocking it by now and will install it for about $60.
The GTIA is an exciting new graphics chip now being shipped in Atari 400/800 computers. Among its special features are a sixteen-color mode with a resolution eight times better than the Apple's, and the capability of generating 256 color variations. The GTIA provides three new graphics modes in addition to the fourteen normal, totally different, full-screen modes. This article defines a few terms relating to graphics, explains the normal graphics modes, then introduces the new modes provided by the GTIA.
ANTIC Is A Busy Chip
We all know that the Atari 400 and 800 have superior graphics capabilities. This has been achieved by designing special chips to handle video display tasks, taking that burden off the main microprocessor. In Atari computers these special chips are known as ANTIC and CTIA.
The ANTIC chip is actually an advanced DMA (direct memory access) controller that qualifies as a true microprocessor. It has an instruction set (mode lines and "load memory scan" operation), a program (the good 'ole display list), and data (display memory and character sets).
This special chip is a rather busy fellow. Its responsibilities include doing DMA for the display list, the display data (playfields), the character set, and player/missile buffers. Besides that, it sets the playfield width, controls horizontal and vertical fine scrolling, keeps track of the vertical position of the scan beam, and handles NMI interrupts. It also supports a light pen.
The GTIA: Three New Modes
The other chip is the CTIA, or Color Television Interface Adaptor, integrated circuit. This is the chip which handles all color and luminance (brightness) information sent to the television screen. This is a complicated process, but the chip designers at Atari got carried away and created whole new functions which we know as the player/missile graphics system. It is the CTIA which processes the horizontal position, size, priority, and color of the players. The CTIA also watches for player/playfield collisions, joystick triggers, and console keys. Like the ANTIC, it is a busy chip.
The new GTIA chip replaces the CTIA. Rumor has it that the "G" stands for George. Apparently some fellow named George was still not satisfied with all the special functions of the CTIA, and gave it the ability to generate three totally new graphics modes. When you find out what these new modes can do, I think you will appreciate "George" and his GTIA.
The three new modes are 9, 10 and 11. The operating system and, therefore, Atari BASIC, supports these new modes. But before describing all the features of these new modes, I want to define a few essential terms and review the normal graphics modes 0 through 8.
In order to fully understand Atari graphics, one must have a solid concept of how a television display is generated. And no discussion of "television theory" would be complete without a definition of the "color clock." The term color clock derives from the fact that there is a problem in measuring distances on a television screen. Different television sets have different screen sizes, with 9", 13" and 19" being common diagonal measurements. All television sets, however, have a scanning beam which translates a signal from the computer into a picture on the screen.
The signal coming from the computer contains two characteristics. It has a frequency, which defines a color, and it has an amplitude, which defines the luminance of that color, often referred to as the brightness or intensity. These qualities of the computer signal affect the way in which the scanning beam shoots electrons at the phosphors on a television screen. This electron shooting process is done horizontally, one line at a time, but it is done so quickly that it is not noticeable to the human eye.
When drawing a line, the scanning beam starts at the left edge of the screen and proceeds to the right edge, shooting electrons the whole time. Since the beam has a finite amount of time it can spend drawing one line, the beam will seemingly have to move faster to cover more area on a larger screen. Thus the problem of trying to measure horizontal distances is further complicated by the fact that different scanning beams not only travel different amounts, but also at different rates. Our unit of measurement cannot really be a distance; it must be a unit of time. The hint I gave a moment ago was that the scanning beam has a certain amount of time it can spend on one scan line. How fast or how far the beam travels is insignificant.
Understanding Color Clock
The fact that our unit of measurement is based on time explains the word clock in the term color clock. A color clock is the amount of time the computer needs in order to sufficiently change the frequency of the signal it generates so as to produce a different color. What a mouthful! This is my own personal definition; it has worked for me, but some people may not agree with it. Here's another definition. A scan line is the horizontal path of the scanning beam from the left edge of the screen to the right edge.
Scan lines extend horizontally across the screen, but it takes a lot of them stacked vertically to fill up the screen from top to bottom. Therefore, horizontal resolution is usually expressed in terms of color clocks while vertical resolution is expressed in scan lines. Of course, on different television sets the actual lengths will differ, but the ratio of horizontal to vertical resolution is always the same. It turns out that, on any screen, one color clock appears to be equal in length to two scan lines.
Now we have to get even more technical for a moment. The scanning beam starts at the upper left corner of the screen and travels horizontally to the right. By the time it hits the right edge it has drawn one scan line that is 228 color clocks wide. The beam then shuts off for a short period while it returns to the left edge, only one scan line lower. This period is called the "horizontal blank" for obvious reasons. The beam then turns on again and starts drawing the next scan line. This sequence of drawing scan lines continues 262 times. At that point, the scanning beam, at the lower right corner of the screen, shuts off and returns to the upper left corner of the screen during a period known as the (guess what!) "vertical blank."
This whole process of drawing 262 scan lines, each of 228 color clocks, plus the blanking periods, constitutes one "frame." The television draws sixty of these frames every second, because your home power line is 60 Hz (cycles). The name given to this display method is "raster scan." The fact that your Atari follows a broadcast standard referred to as "NTSC" makes it one of the few home computers that can be video-taped without special equipment.
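Those frame numbers imply how fast the color clocks tick by. Here is a quick back-of-the-envelope check (Python is used here purely as a calculator; the 3.58 MHz NTSC color subcarrier figure is standard television knowledge, not something from this article):

```python
# One NTSC frame on the Atari: 262 scan lines of 228 color clocks each,
# redrawn 60 times per second.
COLOR_CLOCKS_PER_LINE = 228
LINES_PER_FRAME = 262
FRAMES_PER_SECOND = 60

clocks_per_second = COLOR_CLOCKS_PER_LINE * LINES_PER_FRAME * FRAMES_PER_SECOND
print(clocks_per_second)  # 3584160 -- about 3.58 MHz
# That is essentially the NTSC color subcarrier frequency:
# the "color clock" really is the tick of the color signal.
```

In other words, the unit of time we settled on is not arbitrary; it is the period of the very signal that carries the color information.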
Just because the scanning beam generates all those scan lines and color clocks doesn't mean that the computer is generating that much display data. Even if the computer did, you wouldn't see the whole image since most television sets display a little less than 200 scan lines of about 170 color clocks. The part where the true picture exists is called the playfield, and now it's time for another definition.
Playfields And Mode Lines
The playfield is the portion of each scan line for which data read from memory can produce colors and luminances. The background exists at the ends of each scan line; the playfield is in the middle. From the viewpoint of one frame, the playfield appears as a rectangular region which does not quite extend to the sides of the screen.
Two things control the size of this playfield area. The height in scan lines is controlled by the display list, as you will see in a moment. The width in color clocks is set by the DMA control register of the ANTIC:
SDMCTL ($022F, 559) is the shadow of the hardware register DMACTL ($D400, 54272):
|BIT(S)||VALUE||FUNCTION|
|D5||1||display list DMA enable|
|D5||0||display list DMA disable|
|D1,D0||00||playfield DMA disable (no playfield)|
|D1,D0||01||narrow playfield (128 color clocks)|
|D1,D0||10||standard playfield (160 color clocks)|
|D1,D0||11||wide playfield (192 color clocks)|
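As a sketch of how these bit fields combine into one register value, here is the arithmetic worked out in Python (the function and its names are my own illustration, not an OS routine; 34 is the familiar value the OS normally keeps in location 559):

```python
# Compose a DMACTL/SDMCTL value from the bit fields described above.
PLAYFIELD_WIDTHS = {
    "none": 0b00,      # playfield DMA disabled
    "narrow": 0b01,    # 128 color clocks
    "standard": 0b10,  # 160 color clocks
    "wide": 0b11,      # 192 color clocks
}

def dmactl(width="standard", display_list=True):
    """Return the value you would POKE into 559 (SDMCTL)."""
    value = PLAYFIELD_WIDTHS[width]
    if display_list:
        value |= 0b100000  # D5: display list DMA enable
    return value

print(dmactl("standard"))                  # 34 -- the normal value of 559
print(dmactl("none", display_list=False))  # 0  -- no DMA; screen goes blank
```

POKEing 559 with 0 and then with 34 is the classic way to switch the display off and back on, which is exactly what these two computed values correspond to.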
The OS screen handler always uses a standard width playfield. The advantage of the narrow playfield is that less DMA is required, so programs execute faster. Unfortunately, the screen handler routines do not work properly when the playfield width is other than standard. The wide playfield generates more data than the television can display; its uses are rather limited. It is even possible to turn off the playfield completely, in which case ANTIC fills the screen with scan lines of the background color. As will be shown in a moment, the playfield also requires a "display list," so bit five must be set for any playfield type to be generated.
Remember that a byte is made up of eight binary "bits." If playfield and display list DMA are enabled, bits are read from the computer's memory during the course of each scan line. The bit pattern determines the frequency and intensity changes of the scanning beam, with the result being different color/luminances. The same bit pattern may be repeated for several scan lines, and the bit pattern can be interpreted in different ways. This leads us to yet another definition.
A mode line is a contiguous group of scan lines for which display memory is read only once.
There are two main types of mode lines. In direct memory map modes, the bit pattern produces the same image on each scan line. Text modes are more complicated; they interpret the bit pattern through a character set.
The ANTIC knows how to handle fourteen different kinds of mode lines. Each mode line corresponds to a different method for interpreting a bit pattern. A full screen graphics mode is actually just a series of identical mode lines.
The display list is merely a sequence of bytes in memory that, among other things, tells ANTIC the proper sequence of mode lines for one screen.
Whenever the screen is opened (accomplished in Atari BASIC with the GRAPHICS statement), the screen handler establishes a display list of many mode lines to produce a screen of the desired mode. Modes can be mixed by manually changing the display list. Display lists produced by the screen handler always contain the proper number of mode lines for exactly 192 scan lines of playfield. Altering the display list can affect the total number of scan lines, which is how the vertical size of the playfield is controlled.
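To make the idea concrete, here is a sketch in Python of the kind of display list the screen handler builds for a GRAPHICS 0 screen. The byte values are standard ANTIC opcodes ($70 = 8 blank scan lines, $42 = text mode line with a "load memory scan" address, $02 = plain text mode line, $41 = jump and wait for vertical blank); the screen memory address is just an example, and the jump address is left as a placeholder:

```python
# Sketch of an OS-style GRAPHICS 0 display list.
screen = 0x9C40  # example screen memory address, not a fixed location

display_list = (
    [0x70, 0x70, 0x70]                    # 24 blank scan lines at the top
    + [0x42, screen & 0xFF, screen >> 8]  # first mode line + LMS address
    + [0x02] * 23                         # 23 more identical mode lines
    + [0x41, 0x00, 0x00]                  # jump back (address of the list itself)
)

# 24 mode lines of 8 scan lines each = 192 scan lines of playfield
print(len(display_list))  # 32 bytes for the whole "program"
```

Notice how little memory the "program" itself takes: the 24 identical mode-line instructions are what give the screen its 192 scan lines of playfield.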
The display list also has other functions, such as control of fine scrolling, horizontal blank interrupts, and loading the memory scan counter of the ANTIC so it knows where to start reading memory.
A mode line divided into several parts forms pixels, which are single plotting points somewhere within the playfield area. A pixel's vertical resolution is the same as the mode line in which it is displayed, so there can be just as many pixels vertically as mode lines in the display list. The number of color clocks over which one pixel is spread is also determined by the mode line. Here is a little chart to show you the pixel size for the primary mapping modes:
|MODE||COLOR CLOCKS||SCAN LINES||RESOLUTION (full/split screens)|
|3||4||8||40 by 24/20|
|4,5||2||4||80 by 48/40|
|6,7||1||2||160 by 96/80|
Note that each time the width of a pixel is reduced, its height also decreases, so a single pixel appears to be square in shape regardless of the graphics mode.
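The resolutions in the chart follow directly from the standard playfield dimensions. Here is the arithmetic sketched out (pixel sizes are taken from the chart; the 160-color-clock by 192-scan-line playfield comes from the earlier discussion):

```python
PLAYFIELD_CLOCKS = 160  # standard playfield width in color clocks
PLAYFIELD_LINES = 192   # full-screen playfield height in scan lines

# (color clocks per pixel, scan lines per pixel) for the mapping modes
PIXEL_SIZE = {3: (4, 8), 4: (2, 4), 5: (2, 4), 6: (1, 2), 7: (1, 2)}

for mode, (clocks, lines) in PIXEL_SIZE.items():
    across = PLAYFIELD_CLOCKS // clocks
    down = PLAYFIELD_LINES // lines
    print(f"mode {mode}: {across} by {down}")
# mode 3: 40 by 24 ... mode 7: 160 by 96, matching the chart
```

And since one color clock looks as long as two scan lines, each (clocks, lines) pair in the table keeps the same 1:2 ratio, which is why the pixels all come out square.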
Some Observations About Memory
Now to talk about memory. In the one-color modes, one pixel is represented in memory by one bit. If the bit is on, playfield zero shows. If the bit is off, the background shows. Modes 4 and 6 are the one-color modes. For more color, modes 3, 5 and 7 allow three colors. The tradeoff is that a single bit is no longer sufficient. Two bits, a pair, are required. The total value of the two bits selects either one of the three playfields or the background:
|BIT PATTERN||COLOR||PLAYFIELD TYPE|
|00||COLOR 0||background|
|01||COLOR 1||playfield zero|
|10||COLOR 2||playfield one|
|11||COLOR 3||playfield two|
Playfield zero is the same thing as COLOR 1 in Atari BASIC. Playfield one is really COLOR 2, and so on, with COLOR 0 being the background.
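To see how those bit pairs sit in memory, here is a small sketch of unpacking one display byte into its four pixels (the function is my own illustration, not an OS routine; in a three-color mapping mode each byte holds exactly four two-bit pixels, leftmost pair first):

```python
def unpack_pixels(byte):
    """Split one display byte into four 2-bit pixel values (leftmost first)."""
    return [(byte >> shift) & 0b11 for shift in (6, 4, 2, 0)]

# bit pairs 01 10 11 00 -> COLOR 1, COLOR 2, COLOR 3, background
print(unpack_pixels(0b01101100))  # [1, 2, 3, 0]
```

Each value selects a playfield (or the background), not a color directly; what color actually appears is decided elsewhere, as we will see.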
Although modes 4 and 5 both have the same resolution, or pixel size on the screen, mode 5 requires twice as much memory. In the lower resolution modes, which require little memory in the first place, the additional memory needed is rather insignificant. You might have noticed that mode 3 has no single-color counterpart. Consider that in a 48K system it is possible to have about 150 different mode 3 screens in memory simultaneously. The chip designers probably decided it wasn't worth the effort or memory savings to provide a one-color mode with such low resolution.
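The memory comparison can be sketched with a little arithmetic. The figures below count raw display data only; the OS actually reserves somewhat more per screen (the display list, for one), which is why its count of mode 3 screens in 48K comes out lower than this raw figure:

```python
def display_bytes(pixels_across, pixels_down, bits_per_pixel):
    """Raw display memory for one full screen, ignoring the display list."""
    return pixels_across * pixels_down * bits_per_pixel // 8

mode4 = display_bytes(80, 48, 1)  # one-color mode
mode5 = display_bytes(80, 48, 2)  # three-color mode, same resolution
print(mode4, mode5)  # 480 960 -- mode 5 needs exactly twice the memory

mode3 = display_bytes(40, 24, 2)
print(mode3)                # 240 bytes of raw data per mode 3 screen
print(48 * 1024 // mode3)   # about 200 raw screens' worth of data in 48K
```

The doubling from one bit to two bits per pixel is the whole story of the mode 4 versus mode 5 tradeoff.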
Therefore, the size of a pixel on the screen is determined by two things: how many scan lines high, and how many color clocks wide. The amount of memory required for a mode is also determined by two things: how many separate pixels to one mode line, and how many color possibilities per pixel. The only real connection between pixel size on the screen and size in memory is that bigger pixels fill up a screen faster, so there are fewer of them, and less memory is needed.
Now, three colors means two bits must be used. Does that mean we are always stuck with only three colors which can't be changed? No. The CTIA is capable of generating 128 color/luminance variations: sixteen different colors, each in eight degrees of luminance. But 128 possibilities means seven bits would be required, and, in most cases, seven bits per pixel is simply not feasible; there is a limit to how much memory can be devoted to a screen. The solution to this problem is a sort of compromise, but it also offers some powerful and flexible advantages. The solution is to use color indirection.