Posted Fri Jan 28, 2011 10:46 PM
Artifacting is color graphics produced when high-frequency luma (brightness) detail is interpreted as chroma (color). It happens because luma and chroma are incompletely separated, which is made harder by the video this style of computer produces: the lack of interlace and slightly different horizontal timing mean that the chroma subcarrier doesn't invert between scanlines or between frames. On the Apple II, artifacting was the only way to produce color, and when such games were ported to the Atari they sometimes just used Graphics 8 and kept the artifacting. It was also occasionally used to draw color bars next to 40-column text, such as in Pitstop II.
With NTSC, the dot clock used by the Atari in hi-res mode is exactly twice the color subcarrier -- 7.16MHz vs. 3.58MHz. This means that an alternating pattern of on and off dots produces a color, and the opposite pattern produces the opposite color. Depending on the hardware, doing this with black and white pixels produces green/purple or blue/red. Other colors can be used for the alternation, in which case the artifact color includes the source colors by vector sum. The interaction between luma and chroma also causes color fringing on sharp edges, and intermediate colors appear where sharp color contrasts occur; these are usually unwanted in any case.
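To make the phase relationship concrete, here's a toy demodulation in Python -- my own illustration, not emulator code. At two samples per subcarrier cycle the carrier reduces to (-1)^n, so correlating the pixel stream against it shows why shifting an on/off pattern by one pixel flips the artifact hue:

```python
# Hypothetical demonstration: with the hi-res dot clock at exactly twice the
# subcarrier, cos(pi*n) = (-1)**n, so the subcarrier alternates sign every
# pixel. Demodulating an alternating luma pattern yields a constant chroma
# value whose sign (hue) flips when the pattern shifts by one pixel.

def demodulate_chroma(pixels):
    """Correlate a luma pixel stream against the (-1)^n carrier.

    This is the in-phase chroma component a TV would misread
    from pure luma detail at two samples per subcarrier cycle.
    """
    return sum(p * (-1) ** n for n, p in enumerate(pixels)) / len(pixels)

print(demodulate_chroma([1, 0] * 4))  # 0.5  -> one artifact hue
print(demodulate_chroma([0, 1] * 4))  # -0.5 -> the opposite hue
print(demodulate_chroma([1, 1] * 4))  # 0.0  -> no high-freq detail, no color
```

A solid run of pixels has no energy at the subcarrier frequency, which is why only the alternating patterns pick up a color.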
Artifacting isn't as useful in PAL, for two reasons. First, the PAL chroma subcarrier is faster than the NTSC subcarrier, 4.43MHz instead of 3.58MHz. A PAL color cycle spans 1.6 (8/5) hi-res pixels instead of exactly two as on NTSC, which means that more luma detail can be encoded and preserved. It also means that you can't get a stable color out of pixel patterns, because the Atari's dot clock doesn't line up with the subcarrier, so the best you can get is some ugly rainbows. Second, the PAL color subcarrier reverses direction on alternating scanlines and the receiver averages chroma components from adjacent lines, partially canceling some of the chroma artifacts. From what I've seen in pictures, you can get some colored vertical lines with PAL artifacting, and that's about it -- nothing like the solid colors you can get on NTSC.
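The clock arithmetic is easy to check. These are the standard published rates; treat the exact PAL machine-clock figure as approximate:

```python
# Pixels-per-color-cycle arithmetic behind the NTSC/PAL difference.
ntsc_subcarrier = 3.579545e6
ntsc_dot_clock  = 7.159090e6        # hi-res dot clock, exactly 2x subcarrier

pal_subcarrier  = 4.43361875e6
pal_dot_clock   = 4 * 1.7734470e6   # 4x the PAL machine clock, ~7.0938 MHz

# NTSC: exactly 2.0 pixels per color cycle, so a 2-pixel on/off pattern
# stays locked to one subcarrier phase -> a stable artifact color.
print(ntsc_dot_clock / ntsc_subcarrier)

# PAL: ~1.6 (8/5) pixels per cycle -- the pattern and subcarrier only
# realign every 8 pixels / 5 cycles, so the phase drifts: rainbows.
print(pal_dot_clock / pal_subcarrier)
```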
My emulator Altirra has four different modes for emulating artifacting, in increasing order of cost. The cheapest is simple NTSC artifacting, which just consists of a 3-tap horizontal filter on luma to detect alternation patterns and kick in the artifacting color. This doesn't give fringing artifacts, but it's simple and fast. The second cheapest is simple PAL artifacting, which emulates the chroma averaging in the TV receiver. The algorithm is trivial -- just a 50/50 blend on the RGB chroma -- but it requires a scanline's worth of storage, which I presume is at a premium on an FPGA.
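A minimal sketch of the simple-NTSC idea as described above -- this is my own reconstruction for illustration, not Altirra's actual code:

```python
# Sketch (assumed, not Altirra's implementation): a 3-tap horizontal
# high-pass on luma measures the on/off alternation at each pixel, and its
# sign folded with pixel parity selects which artifact hue to kick in.

def detect_artifact_chroma(luma):
    """Return a per-pixel signed chroma estimate; + and - map to the two
    artifact hues (e.g. green/purple), 0 means no artifact color."""
    chroma = [0.0] * len(luma)
    for i in range(1, len(luma) - 1):
        hp = luma[i] - 0.5 * (luma[i - 1] + luma[i + 1])  # 3-tap high-pass
        chroma[i] = hp if i % 2 == 0 else -hp  # fold in subcarrier parity
    return chroma

print(detect_artifact_chroma([1, 0, 1, 0, 1, 0]))  # interior taps all +1.0
print(detect_artifact_chroma([0, 1, 0, 1, 0, 1]))  # interior taps all -1.0
print(detect_artifact_chroma([1, 1, 1, 1, 1, 1]))  # all 0.0 -> no color
```

The simple-PAL mode's 50/50 blend is even less code: average each chroma channel with the same channel from the previous scanline, which is where the one-scanline buffer requirement comes from.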
The other two modes Altirra supports are the high-artifacting modes, which give fringing artifacts but are a lot more expensive. Both work by using digital filters that emulate the process of producing a 14MHz NTSC or 35.5MHz PAL signal and then separating it out again to produce 14MHz pixels (double hi-res). The NTSC algorithm is the cheaper of the two and uses three 24-tap filter banks with two phases each; the PAL algorithm, which I just added, is more costly because it requires 16 phases and a YCbCr-to-RGB conversion. The filter banks are large in order to avoid intermediate lookups and multiplies entirely (960K for the SSE2 specialization), but in a hardware implementation you'd probably want more arithmetic units and less ROM. In both cases, though, the output isn't as good as what you'd get from a good TV, partially because the algorithms could probably use some tuning and partially because the filter design prohibits non-linear elements.
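The encode-then-decode principle behind these modes can be sketched in its naive multiply-and-average form. Altirra folds all of this into precomputed filter banks; this toy version ignores filtering entirely and assumes a constant-color signal, so it's only the core quadrature math:

```python
# Toy model of the high-artifacting pipeline's core idea: modulate Y/I/Q
# onto a composite signal sampled at 4x the subcarrier (~14 MHz for NTSC),
# then demodulate it again. Real decoders low-pass filter; here the signal
# is constant, so plain averaging over whole cycles suffices.
import math

SAMPLES_PER_CYCLE = 4  # ~14.3 MHz sampling / 3.58 MHz subcarrier

def encode(y, i, q, n_cycles=8):
    """Composite = luma + chroma modulated in quadrature on the subcarrier."""
    out = []
    for n in range(n_cycles * SAMPLES_PER_CYCLE):
        ph = 2 * math.pi * n / SAMPLES_PER_CYCLE
        out.append(y + i * math.cos(ph) + q * math.sin(ph))
    return out

def decode(composite):
    """Recover Y/I/Q by correlating against the carriers and averaging."""
    n = len(composite)
    y = sum(composite) / n
    i = 2 * sum(s * math.cos(2 * math.pi * k / SAMPLES_PER_CYCLE)
                for k, s in enumerate(composite)) / n
    q = 2 * sum(s * math.sin(2 * math.pi * k / SAMPLES_PER_CYCLE)
                for k, s in enumerate(composite)) / n
    return y, i, q

y, i, q = decode(encode(0.5, 0.2, -0.1))
print(round(y, 6), round(i, 6), round(q, 6))  # 0.5 0.2 -0.1
```

With a real, varying picture the averaging becomes proper band-splitting filters, and that's exactly where the luma/chroma crosstalk -- and hence the fringing -- comes from.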