dmlloyd

Artifacting - isn't it weird?


I was reading flashjazzcat's thread about his GUI project and saw his screenshots, and it got me thinking about artifacting (again). Artifacting, for those who aren't familiar with the term, is the effect on NTSC systems with a composite monitor where the LUM signal interferes with the COLOR signal, which causes half-size pixels to display with a colored "fringe". If you draw a whole row of even-numbered or odd-numbered pixels, you get one of two "fake" colors. By setting the background color to black and the foreground to white (or vice-versa), the effect is most pronounced.
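To make the mechanism concrete, here's a minimal sketch (my own illustration, not from the thread; the constants are standard NTSC values) of why alternating hi-res pixels act like a chroma signal: the pixel clock is exactly twice the NTSC subcarrier, so a 1010... pattern is a square wave at the subcarrier frequency, and shifting the pattern by one pixel flips its phase by 180 degrees, which is why even and odd pixel columns give complementary fringe colors.

```python
# Rough model of NTSC hi-res artifacting; constants are standard NTSC
# values, the framing is my own illustration.
FSC_HZ = 3_579_545            # NTSC color subcarrier frequency
PIXEL_HZ = 2 * FSC_HZ         # hi-res pixel clock: two pixels per color clock

def artifact_phase_deg(first_lit_column: int) -> float:
    """Phase (in degrees) of the fake chroma produced by lighting every
    other hi-res pixel, starting at the given column. Alternating pixels
    at twice the subcarrier rate form a square wave at exactly the
    subcarrier frequency, so the TV's color decoder sees it as chroma."""
    pixel_period_ns = 1e9 / PIXEL_HZ          # ~139.7 ns per pixel
    subcarrier_period_ns = 1e9 / FSC_HZ       # ~279.4 ns per color cycle
    delay_ns = first_lit_column * pixel_period_ns
    return (360.0 * delay_ns / subcarrier_period_ns) % 360.0

# Even vs. odd columns land 180 degrees apart: two complementary hues.
assert abs(artifact_phase_deg(1) - artifact_phase_deg(0) - 180.0) < 1e-6
```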

 

Anyway what got me thinking about it is, FJC showed (I believe) an 800XL with purple/green artifacting, and a 65XE with blue/orange. My XEGS does purple/green, and my old (long-dead) 800XL did blue/orange. I've also heard of yellow/blue though I've never seen it.

 

So I'm wondering: what actually determines the color of the artifact? I doubt it's (solely) the IC, because swapping GTIAs doesn't seem to affect it (with my limited sample set, anyway). To answer this mystery I dug into the GTIA (and CTIA) data sheets to figure out exactly how this color timing works.

 

Hope this comes out right:

 

GTIA output timing (key figures from the data sheet diagram):

- OSC (color oscillator): ~140 ns low, ~140 ns high per cycle
- LUM 0-F: luma output delay T(lum1) from the OSC edge, max 450 ns
- COL: color output delay T(inv) from the OSC edge, max 190 ns (no color output when the color register = 0)
- Hues 1-F: each hue's output starts Δt = 16-25 ns after the previous one

 

You can see from the diagram the same thing I realized: the luma output for a given color clock comes much later than the color output! This apparent discrepancy is possibly explained by the delay introduced by capacitance in the color output circuitry, which can act as an AC delay (technically, a perfect capacitor turns the signal into the derivative of its input, for any calculus people out there).
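As a back-of-the-envelope check on that idea (my own sketch with made-up component values, not from any Atari schematic), the phase lag of even a simple first-order RC stage at the subcarrier frequency already amounts to tens of nanoseconds of effective delay:

```python
import math

FSC_HZ = 3_579_545   # NTSC color subcarrier

def rc_delay_ns(r_ohm: float, c_farad: float, f_hz: float = FSC_HZ) -> float:
    """Effective time delay of a first-order RC low-pass at frequency f.
    The filter's phase lag is atan(2*pi*f*R*C); dividing by the angular
    frequency converts that lag into nanoseconds of delay."""
    w = 2.0 * math.pi * f_hz
    lag_rad = math.atan(w * r_ohm * c_farad)
    return lag_rad / w * 1e9

# Hypothetical values: 220 ohms against 47 pF of stray capacitance
# already delays the chroma by roughly 10 ns.
print(round(rc_delay_ns(220, 47e-12), 1))
```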

 

Now one thing you may be thinking is "What about the color adjustment pot?", which is a valid question. The CADJ signal is a voltage input to C/GTIA which adjusts the delta between different colors. Because this affects the delta, there is only one setting that is "right": too high or too low and your spectrum goes either above or below 360°. So given that only one setting is "right", it follows that you cannot change artifacting using this input without actually making "normal" colors incorrect.
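A quick sanity check on the "only one right setting" point (my own arithmetic, not from the data sheet): the 15 hue steps only close the color wheel when each step is one fifteenth of a subcarrier period, about 18.6 ns; any other per-hue step size over- or under-shoots 360 degrees.

```python
FSC_HZ = 3_579_545
PERIOD_NS = 1e9 / FSC_HZ   # one subcarrier cycle = the full 360-degree hue wheel

def spectrum_span_deg(step_ns: float) -> float:
    """Total hue spread across GTIA's 15 hues for a given per-hue delay step."""
    return 15 * step_ns * 360.0 / PERIOD_NS

ideal_step_ns = PERIOD_NS / 15          # ~18.6 ns closes the wheel exactly
assert abs(spectrum_span_deg(ideal_step_ns) - 360.0) < 1e-9
assert spectrum_span_deg(25.0) > 360.0  # step too wide: spectrum overshoots
assert spectrum_span_deg(16.0) < 360.0  # step too narrow: spectrum undershoots
```

Note that the ideal ~18.6 ns step falls inside the 16-25 ns range the data sheet allows.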

 

As you can see, each hue value corresponds to a 16-25 ns delay; thus a difference of as little as tens of nanoseconds in either the luma or color output can dramatically change the artifact colors observed, without impacting normal color output. Combine that with the fact that just about every Atari model (and, thanks to widespread modification, often multiple systems of the same model) has a different color output circuit, and the conclusion I come to is that the video output circuit is a primary (if not THE primary) influence on the artifacting output.
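Putting numbers on that (my own arithmetic): one subcarrier period of ~279 ns spans the whole 360-degree hue wheel, so a skew of only 10 ns in one path rotates every artifact color by about half a hue step.

```python
FSC_HZ = 3_579_545
PERIOD_NS = 1e9 / FSC_HZ            # ~279.4 ns: 360 degrees of hue

def delay_to_hue_steps(delay_ns: float) -> float:
    """Convert a chroma-vs-luma delay into GTIA hue steps
    (360 degrees / 15 hues = 24 degrees per step)."""
    degrees = 360.0 * delay_ns / PERIOD_NS
    return degrees / 24.0

shift = delay_to_hue_steps(10.0)    # a 10 ns skew in either path...
assert 0.5 < shift < 0.6            # ...moves the fringes about half a hue
```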

 

Interestingly when I added color output to my 600XL, I used the simple XE color circuit, and as a test I created a simple "composite" output by bridging the chroma to the luma via a capacitor, and observed the output - and the output was purple/green just like my XEGS with the identical color circuit.

 

The only problem with my theory is the observations of FJC's demo output. He showed his 65XE with blue/orange and his XL with purple/green! This would seem to poke a big hole in my theory that I cannot explain - so FJC, did you happen to mix up the labels on those two screen grabs?



When I tried FJC's demo on my 130XE, I definitely got blue/orange artifacting. In fact, I don't think any of my computers that have games/programs using artifacting have put out anything other than blue/orange for black and white pixels when I tried them. Admittedly, I'm mostly used to output from my CoCo 2 in that regard, though.


Yeah, I think blue/orange is the "standard", such as it is. It's certainly more attractive-looking than green/purple, anyway. It's interesting that your 130XE yields blue/orange, though. Perhaps the GTIA production run has more to do with it than I thought, or maybe there are other factors in play...

Admittedly, I'm mostly used to output from my CoCo 2 in that regard, though.

 

Yeah, most of the hi-res games on the CoCo 2 relied on the two-color mode producing four colors with artifacts: orange and blue.

With the CoCo 3 and an RGB monitor these games are black and white.


The dominant factor affecting artifacting is the chroma/luma buffering circuitry outside the GTIA chip, which is why each new generation of machines produced different colors. The colorburst (chroma) provides the reference timing for the color signal, but the artifact color is being produced by a waveform on the luma signal, so any difference in phase/delay between the two circuits will skew the colors.

Most Ataris have a similar luma circuit, but the chroma circuit varies wildly. The 1200XL added extra stages to try to boost the color signal while keeping the burst at a lower level, the 800XL removed the boost sections, and the XEs went to a much simpler scheme using an inductor.

The colors may vary a little due to manufacturing variances in GTIA, but only CTIA has a major swap, due to the 1/2-clock shift of its luma signal. For the most part, moving a GTIA from machine to machine will consistently produce the correct colors for that machine type.


This presents the opportunity for an interesting experiment, though. Anyone who is willing and able: run the attached XEX and post a screen grab of the output along with your machine type and CTIA/GTIA information (including, if possible, the part number and suffix, manufacturer location, and date code).

 

This XEX should work on any Atari with at least 32K of RAM (it loads at $2000 and uses $4000 for graphics).

artif.xex


I ran that in Altirra, went to System -> Video -> Adjust Colors -> Artifacting Phase, and was able to sweep through purple/green, blue/orange, and yellow/blue. Phaeron explains the same thing you described in your original post, that the relative delay between the chroma and luma paths determines the color combination, in section 6.5 of his Altirra Hardware Reference Manual.


Yes, I did explain in my original post that the artifacting color is caused by the delta between the color signal and the luma signal. What I'd like to gather data on though, is what actual hardware outputs what artifact colors. An emulator will of course display any color you tell it to. :)


AspeQt is telling me that the XEX is broken (it seems to think it should be one byte longer). It also doesn't load in MyDOS.


OK, got it to run from MyPicoDos. I have magenta/green (as I thought), but I don't currently have any info on the GTIA in this 400. I don't like opening it either; I have some dead bugs in there I don't want to disturb. :) All I know for now is that it has a GTIA, though.


The other interesting fact: what happens to the colors if you change COLPF2 to a different hue? On a real Atari a different background hue affects the artifact colors, but on most emulators (Atari800Win excepted) the colors don't change at all.
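One way to see why the background hue should matter (my own sketch of the general principle, not GTIA-specific): the TV decodes the vector sum of the real background chroma and the fake chroma created by the luma transitions, so the perceived fringe hue rotates as the background hue changes.

```python
import cmath
import math

def combined_hue_deg(artifact_amp: float, artifact_deg: float,
                     bg_amp: float, bg_deg: float) -> float:
    """Hue angle the TV decodes when the fake chroma from luma
    transitions adds (as a phasor) to the real background chroma."""
    total = (artifact_amp * cmath.exp(1j * math.radians(artifact_deg)) +
             bg_amp * cmath.exp(1j * math.radians(bg_deg)))
    return math.degrees(cmath.phase(total)) % 360.0

# Equal-strength components at 0 and 90 degrees decode halfway between.
assert abs(combined_hue_deg(1.0, 0.0, 1.0, 90.0) - 45.0) < 1e-9
```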

Edited by Synthpopalooza


OK, got it to run from MyPicoDos. I have magenta/green (as I thought), but I don't currently have any info on the GTIA in this 400. I don't like opening it either; I have some dead bugs in there I don't want to disturb. :) All I know for now is that it has a GTIA, though.

This will run from DOS. It lacked a RUNAD vector.

ARTIF2.zip


An interesting hardware note: the CADJ input can be driven at color clock frequencies, giving you independent control of each pixel's color. Just lift the pin and supply a voltage level.

 

Bob


This will run from DOS. It lacked runad.

Well, it runs from SpartaDOS, but you have to 'M' run it at $2000 to get it to work with MyDOS 4.53, probably DOS 2/2.5 also. It won't run from DOS 2.5.

Edited by russg


If there had been a register to pick the "color" of the color burst (to shift its phase) then we could have had a wide range of artifact colors available to the programmer. If I ever get a time machine, it'll be one of my little suggestions to George McLeod.


Slight problem there - changing the colourburst means every colour on that scanline will be affected, so you'd need to adjust your programming accordingly.


This topic brings up a feature idea for Altirra: the ability to have NTSC artifacting on when in NTSC mode, but with a separate profile for PAL, as opposed to it still being marked as on when you switch back to PAL.

 

I play things like DROL in NTSC mode with artifacting on, as that's how it should be played if it's a US game; when it's a UK game I play in PAL, as that's what it was designed for (well, rarely designed specifically for PAL, but you know what I mean).

 

As the colour profiles for PAL and NTSC are managed separately, it shouldn't be a big bit of work?


Was there a reason why Atari implemented 'artifacting' on non-NTSC computers, if it only works properly on NTSC computers (or at least systems designed for use with NTSC TVs or monitors)?

Or, if they hadn't implemented 'artifacting' on non-NTSC computers, would that have meant that Gr.8 (and compatible modes) wouldn't have been available on non-NTSC computers? Or would they have had to design a high-res mode that didn't use artifacting?

Edited by carmel_andrews


They didn't implement it - it's a side effect of how RF and composite TV signals work.

 

You can get the same effects from a modern device like a DVD player - although when graphics are produced destined for TV, they usually take interlace flicker and artifacting into consideration, and take measures to avoid it.


They didn't implement it - it's a side effect of how RF and composite TV signals work.

 

You can get the same effects from a modern device like a DVD player - although when graphics are produced destined for TV, they usually take interlace flicker and artifacting into consideration, and take measures to avoid it.


So I guess something like flicker-fixer hardware (similar to what the Amiga had) might sort out artifacting on non-NTSC Ataris.

 

Then again, didn't the early Amigas (pre flicker-fixing hardware) have artifacting as well? And what of the Commodore 64 or 128? Or did those systems use a slightly different way of dealing with high-res modes? (I guess artifacting only comes into play when using higher resolutions.)


Flicker fixer outputs RGB, so it becomes irrelevant anyhow. Similar deal if you use VBXE, or the stock S-Video output.

 

The C64 uses a different pixel clock, so its results are somewhat different.

