phaeron Posted February 23, 2018
Thanks to a generous contribution, I now have an 800 for testing and have finally been able to investigate firsthand the elusive mystery of why the 800 produces such odd artifacting colors. It's been known for a long time that this model is odd in being able to produce green/blue artifacting, unlike other models, which produce the opposite colors. I'd determined a while ago that simulating this with linear filters wasn't possible and had been wanting to track down the cause.

However, when I hooked up the 800 to a Commodore 1702, I was a bit disappointed: brown/blue artifacting, the opposite colors. But then on a whim I hauled the machine into the living room, hooked it up to a Samsung TV, and was shocked to see green/blue. A Dell U2711 monitor shows similar. Capturing with an All-in-Wonder USB device, though, produces brown/blue again, similar to what I see on an 800XL. Mind you, this is the exact same machine, just swapping the composite output over to each monitor. I've never seen anything as crazy as this. To double-check, I repeated the same test with a 130XE, and all four displays showed similar brown/blue artifacting colors.

Anyone familiar with composite video or the output circuit who might have insight into why the displays vary so much in decoding artifacted colors from the 800?

The one thing I've been able to determine so far is that the 800 has greater skew between the luminance bits, which manifests as pretty dark bars between luminances 3-4, 7-8, and 11-12 in a color ramp. This causes the artifacting colors to vary slightly at different luma levels. That makes me suspect that there might be something odd about the color burst, like an uneven duty cycle. Still, I can't think of what would cause the artifacting colors to vary by as much as 90 degrees between displays.
_The Doctor__ Posted February 23, 2018
I ran down how the colors cycled a time or two before; pretty certain I explained at some point that different CRT masks have the RGB group reversed as BGR. Since the artifacts are technically a half pixel and contain no color data, they are processed in timing order and not decoded to match the mask; normal decoding is matched to the mask and the corresponding phosphor. When looking at a CRT you can physically see this. If memory still serves me, when differences between artifacts on different CRTs and the chip revisions / video circuits became apparent, some software would ask you what colors you saw and set the artifacting accordingly to pick what was closest to the desired output. Toshiba vs Goldstar vs Sharp vs Sony vs Thompson (RCA).

Bonus for those that want to have a CRT repaired (I'm talking about the display tube itself): most people don't realize a display tube can be rebuilt. At least two U.S. companies rework cathode ray tubes, Video Display Corp. and Quest International; last I remember they are the only two left that can take a worn-out CRT with a failed electron gun, open it, replace it with a new electron gun, and seal the cathode ray tube again. It's been a number of years but they should still be around.
phaeron Posted February 23, 2018
Uh, artifacting has nothing to do with the phosphor/mask pattern. If it did, you would see shifts in artifact colors when changing the horizontal position control, and they wouldn't be consistent anyway due to pincushioning.
_The Doctor__ Posted February 23, 2018
Artifacting has everything to do with grouping order and timing with the mask. This is an analogue process. You most certainly can see changes in artifacts messing about with geometry, but not position adjustment. I am trying to explain the interaction between how the decoder is processing and passing this along, but also the grouping order on the tube; the decoder and the grouping order affect the artifact. Position does not matter the way you think it does. RGB vs BGR.
foft Posted February 23, 2018
Weird. I guess the actual phase gives blue/brown but the digital decoders are thrown off by something. Time to hook up a scope and show the difference between the 800XL and 800 signals.
phaeron Posted February 23, 2018
Actually, looking at the images again, I just realized that the blue and green being picked up by the TV and monitor look suspiciously like the green that the capture device is seeing and the blue that the 1702 is seeing. That leads me to believe that the vidcap device and 1702 are locking onto two distinct phases and the TV and monitor are somehow locking onto a mix of the two. But I can't see that coming from the color burst itself, or the non-artifacted colors would be badly distorted.
_The Doctor__ Posted February 23, 2018
Sorry to drag things slightly away from your exact problem; too much time working with generating video displays has a drifting effect, and I thought it was something of interest to people perusing the thread. I'll just leave you to it. I removed the post and put this in its place.
phaeron Posted February 23, 2018
_The Doctor__ wrote: "another thing wrong with modern lcd displays is the interaction with adjacent scan lines, it causes color bleed, mixing, and makes graphics look more pixelated and blocky, though not a true scanline problem; this device helps and the video shows the difference and why"

Do you mind? This has nothing to do with what is being discussed here. Where do you see any opportunity for blending between scanlines being a factor in those giant blocks of solid color produced by the test program? What does this have to do at all with figuring out why the 800's NTSC artifacted colors are inconsistent?
_The Doctor__ Posted February 23, 2018
YES! You are on the trail to understanding this... if only I could locate the notes and references on this for you. You are a very precise person, and the laid-back explanations combined with the analog, almost organic nature of the beast are not compatible with the thought process of most people with such ordered minds. Okay, deleted the scan line link and video; sorry if it was a distraction. Old muddled minds are of no use here. I probably won't find the scribblings of interest anyway; good luck.
Rybags Posted February 23, 2018
Could the answer be found by comparing the phase difference between colourburst and the master clock? Like maybe there's a difference among machines. When trying to work out how to do software interlace by the Antic bug exploit, I used a CRO (purchased just for that reason). The trick I used is to hook up a probe to a joystick port and use a 1-0 transition of a PORTA bit as the trigger. It's pretty simple to do in software - I was interested in stuff happening around VSync, so I used a VBI with waits on VCOUNT then STA WSYNC to get the exact cycle lock. For this sort of thing, probably use a DLI. My only authentic NTSC machine is a 1200XL whose keyboard I suspect isn't working, so I might be limited in what I can do.
_The Doctor__ Posted February 23, 2018
I'm sure you know this already, but it can't hurt... color is encoded as the phase and amplitude of the colorburst frequency; the phase changes. Not that it will matter, but there is more than one way to get artifacts:

1) Primarily it is alternating light and dark pixels, which can create a 3.579545 MHz signal either in phase with the colorburst or out of phase (the Atari hi-res pixel clock is this frequency times two).
2) Each pixel is half the wavelength of the colorburst clock; some sets have difficulty decoding abrupt phase shifts.
3) The interaction of the two.

Some reasons why you can't see the light-dark luma patterns themselves: the decoder/filters in the set try to filter out that frequency so you don't see the color carrier as a dot pattern, and the set detects the luma pixels as a color signal with a high saturation value, adding brightness to the overall signal and obscuring or masking darker pixels. While the artifacts may not always appear when you use separate luma/chroma lines, some sets will still show some color because of all of the above; it is not necessarily a result of composite bleed-through and so on that is often discussed in such video mod threads.

For color artifacts the color of the pixel isn't important, though background color can affect it; it's the repeating on/off pixels which cause the artifacts. The artifacts may allow colors to be shown which aren't in the current palette; artifacting can be used to display more colors than a mode or mixed mode is capable of.

I'm sure if I made any mistakes or remembered things wrongly, it will stand out like a sore thumb; please forgive it and correct it accordingly, it's been a long night. Please combine this with my other rambling; when put all together and mixed with Rybags' method of discovery, I think it will all make sense and you might just have some answers. May the phosphor be with you. And one last note, I said it before: the pixel output could be a half clock off (shifted), so it does not have to fall on rrg and gbb.
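Point 1 above can be sketched numerically: an on/off hi-res pixel pattern is pure luma, yet it carries energy exactly at the subcarrier frequency, and the two possible alignments (1010 vs 0101) demodulate to opposite hues. This is a toy model of my own (the 8-samples-per-cycle grid and the idealized quadrature demodulator are illustrative assumptions, not how any real set is built):

```python
import numpy as np

FSC = 3.579545e6           # NTSC color subcarrier, Hz
FS = 8 * FSC               # toy sample rate: 8 samples per subcarrier cycle
n = np.arange(1024)        # exactly 128 subcarrier cycles
t = n / FS

# Hi-res pixels toggle at 2*FSC, so an on-off-on pattern repeats once per
# subcarrier cycle: pure luma that lands exactly on the color frequency.
pix_1010 = (n // 4 % 2 == 0).astype(float)
pix_0101 = 1.0 - pix_1010

def apparent_hue(luma):
    """Quadrature-demodulate at FSC; return the hue angle a TV would infer."""
    i = np.mean(luma * np.cos(2 * np.pi * FSC * t))
    q = np.mean(luma * np.sin(2 * np.pi * FSC * t))
    return np.degrees(np.arctan2(q, i))

hue_a = apparent_hue(pix_1010)
hue_b = apparent_hue(pix_0101)
print(hue_a, hue_b)        # the two patterns decode 180 degrees apart
```

Shifting the pattern by one hi-res pixel flips which color you get, which is exactly the 1010-vs-0101 pair of artifact colors.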
Rybags Posted February 23, 2018
Supposedly some sets will artifact with separate C/L inputs because they mix the signals before separating them again. You seem to have contradicted your earlier post by putting a more correct description in - it's like Avery said, the shadow mask is pretty much irrelevant, though of course overall definition of a CRT is usually better on bigger sets. I've got a ~4 inch portable which is pretty poor in that department.
dmsc Posted February 23, 2018
Hi! I agree with you, this must be some non-linear effect in the analog path. Seeing your captures, my theory is that the 1-0 transition produces too much undershoot, and that the affected TV sets "invert" the voltage below some level, effectively enlarging the 1 and shifting the phase. Also, to make the theory work, only the transitions in phase with the colorburst produce under/overshoot, not the other half. So, your monitors are:
- 1702: doesn't saturate; colors are 180° shifted and luminance is 50%.
- Dell and Samsung TV: saturate on low voltage only; luma is increased to 70% and phase is shifted to green.
- AIW: saturates on low and high voltage; luma is nearer 50% and both phases are shifted, making the display the same as the XL.
Well, only a theory; please put a scope on the signals, alone and when connected to each device!
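dmsc's fold-the-undershoot theory can be sketched numerically. Everything here is a toy model: the 8-samples-per-cycle waveform, the -0.5 undershoot magnitude, and modeling the "inversion" as an absolute value are all my own illustrative assumptions, not measurements of any real set:

```python
import numpy as np

FSC = 3.579545e6           # NTSC color subcarrier, Hz
FS = 8 * FSC               # 8 samples per subcarrier cycle
n = np.arange(1024)
t = n / FS

def apparent_hue(v):
    """Quadrature-demodulate at FSC; return the decoded hue angle in degrees."""
    i = np.mean(v * np.cos(2 * np.pi * FSC * t))
    q = np.mean(v * np.sin(2 * np.pi * FSC * t))
    return np.degrees(np.arctan2(q, i))

# Artifact stripe pattern with a made-up undershoot spike right after each
# 1->0 transition (the -0.5 magnitude is purely illustrative).
period = np.array([1, 1, 1, 1, -0.5, 0, 0, 0])
signal = np.tile(period, len(n) // 8)

# A set that "inverts" voltage below black folds the undershoot upward,
# effectively widening the 1 and dragging the decoded phase with it.
folded = np.abs(signal)

shift = apparent_hue(folded) - apparent_hue(signal)
print(shift)               # roughly a 20-degree hue shift from the fold alone
```

Even this crude version shows how a nonlinearity in the receiver, not the encoded burst, can move the artifact hue by tens of degrees while leaving properly encoded chroma (which never swings below black like this) mostly alone.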
Kr0tki Posted February 23, 2018
Phaeron: I think you should add standard solid color bars to your testing program as a reference. E.g. a bar for each of the 15 hues, or at the very least a bar for hue $1 (i.e. the colorburst phase). This is to rule out the possibility of the receivers interpreting hue/tint differently. EDIT: and yeah, you'll definitely need a scope to get to the bottom of this.
_The Doctor__ Posted February 23, 2018
To be sure, it's not an either/or; I haven't contradicted myself, it's ALL part of it. He wanted to know why it ramps up also. It has to do with the beams hitting part of the mask and part of the RGB group: as the intensity value is raised in the Atari, the beams are driven harder, which causes them to brighten and widen slightly, allowing more to pass directly through the mask as well as slightly reflect; that apparent ramping is a product of this. It can also slightly change the color - not much, but it can be noticed on some monitors depending on dot pitch and mask type.

There were hundreds of CRT-based television sets and monitors, each with their own unique characteristics. Some produced 80-column text in a perfectly legible fashion off of composite video (NEC was awesome at it, and it artifacted perfectly also); some were a mess, artifacting the crap out of nearly everything; some had horrible dot pitch and terrible moire patterns ensued; some didn't show the digital jail bars, others were so bad you thought you were in jail; and on and on. You don't want to emulate all of that stuff, for sure! The only thing common to all is slight scan lines for clarity, but that's a whole 'nother can of worms.
Wilheim Posted February 23, 2018
Just a random thought... have you tried swapping the GTIA chip with an XL one? Maybe it has nothing to do with it. I'm not familiar with CRT TV screening technology...
+DrVenkman Posted February 23, 2018
This thread is right up Bryan's alley ... hope he gets a chance to see and comment.
_The Doctor__ Posted February 23, 2018
The point of the thread as I understood it is to understand the 800's artifacting with its chipset. But hey, I love experiments.
Mclaneinc Posted February 23, 2018
Woohoo... leading down the path of even greater emulation accuracy. Props to the donor of the computer and to all trying to find the cause.
zzip Posted February 23, 2018
I'm confused why modern high-def flatscreens show artifacting colors at all? Do they have some kind of CRT emulation built in or something?
foft Posted February 23, 2018
zzip wrote: "I'm confused why modern high-def flatscreens show artifacting colors at all? Do they have some kind of CRT emulation built in or something?"

Artifacting comes from sending luma + chroma down a single wire. Nothing to do with CRTs. The filter that separates colour and brightness cannot tell the difference between brightness stripes and the colour portion of the signal (where the frequency matches...).
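The point above can be demonstrated with a toy chroma separator. The mix-and-average stand-in below is my own simplification, not any particular set's notch or comb filter, but it fails the same way: fed a pure-luma stripe pattern, it reports chroma that was never encoded.

```python
import numpy as np

FSC = 3.579545e6           # NTSC color subcarrier, Hz
FS = 8 * FSC               # 8 samples per subcarrier cycle
n = np.arange(1024)
t = n / FS

# Pure-luma stripe pattern from hi-res pixels; no chroma was ever added.
stripes = (n // 4 % 2).astype(float)

# Crude chroma separator: mix down by the subcarrier and average, which acts
# like a filter centered on FSC. A real separator differs in detail but still
# cannot distinguish luma stripes at FSC from genuine chroma.
mixed = stripes * np.exp(-2j * np.pi * FSC * t)
chroma_amplitude = 2 * abs(mixed.mean())

print(chroma_amplitude)    # ~0.65: color reported where none was encoded
```

With separate luma and chroma wires (S-Video) the separator never runs on the luma, which is why those monitors show the stripes instead of a false color.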
zzip Posted February 23, 2018
foft wrote: "Artifacting comes from sending luma + chroma down a single wire. Nothing to do with CRTs."

OK, but then why did the higher-res composite monitors BITD show the actual bitmap patterns and not the artifact color? I guess I don't understand it as much as I thought. It was always explained to me that artifacting was a result of the limited horizontal resolution of the screen.
Bryan Posted February 23, 2018
Okay, I'll go back and read this after I post the lowdown on artifacting.

1. It has nothing to do with the display output device (the CRT or the LCD panel). The artifact colors are in the signal before it reaches the tube. The alignment of the image to the phosphors is completely arbitrary (and will actually vary quite a bit as the TV warms up). The size of the phosphors relative to the resolution of the video source is different from set to set. Phosphor triads are NOT related to pixels. The shadow mask blocks the guns from hitting anything except their designated phosphors, but this is done after the continuous analog image has left the guns, giving us sort of a Venetian-blind view of each of the 3 individual signals. If R, G and B were run to 3 discrete B&W tubes, you'd still see the artifact colors happening.**

2. Artifacting is generated in the video decoding circuits. The NTSC color carrier is centered around 3.58MHz. The A8's pixel clock is based on this same frequency, so drawing on-off-on pixels in 320 mode actually puts a 3.58MHz waveform in the picture. The TV sees this and screams, "COLOR!!!1!omg". Actually, the TV only screams 2 colors: one if the pattern is 1010 and another if it's 0101. The colors produced depend on the phase relationship between the on-off waveform and the colorburst wave extrapolated out to the same point in time (which the TV does via a PLL). Artifacting is putting a color-generating waveform into the picture using luma signals instead of the automatically generated chroma.

3. S-Video (separated video) won't produce the colors because the monitor knows that color is only contained in the chroma signal. Those 3.58MHz waves in the luma don't fool it. With a composite signal, there's no way for the monitor to know the difference. A 3.58MHz wave in the image = color, no matter what.

4. Different A8's have different video buffer circuits, and these introduce differing amounts of skew: the delay relationship between chroma and luma. This skew changes the colors the luma appears to be generating because it skews the timing relationship to the colorburst, which is our reference for color.

5. I suspect that the 800's strange colors are the result of a sub-optimal clock circuit causing an odd duty cycle in high resolution. High resolution is created by showing 2 pixels per 3.58MHz clock cycle: one LUM value is shown when the clock is high, and another when it's low. If the clock isn't a true 50/50 waveform, then even and odd pixels will have different widths, and will create artifact patterns that aren't ideal. This probably explains the odd color shifting and why some monitors may interpret them differently. EDIT: I also noticed a reduced adjustment range with the UAV pot on the 800, which points to a clock issue. These are things on my laundry list for Rev E.

** Not as color, of course, but in the relative strengths of the images.
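Point 5 can be illustrated numerically: with the rising edge held fixed, stretching the "on" half of the cycle drags the phase of the fundamental, which is the artifact hue. The sample grid and the one-sample skew below are toy values of my own choosing, not measurements of the 800's clock:

```python
import numpy as np

FSC = 3.579545e6           # NTSC color subcarrier, Hz
FS = 8 * FSC               # 8 samples per subcarrier cycle
n = np.arange(1024)
t = n / FS

def apparent_hue(luma):
    """Quadrature-demodulate at FSC; return the decoded hue angle in degrees."""
    i = np.mean(luma * np.cos(2 * np.pi * FSC * t))
    q = np.mean(luma * np.sin(2 * np.pi * FSC * t))
    return np.degrees(np.arctan2(q, i))

# 50/50 clock: each hi-res pixel is exactly half a subcarrier cycle wide.
even_duty = np.tile([1.0, 1, 1, 1, 0, 0, 0, 0], len(n) // 8)

# Skewed clock (toy numbers): the falling edge lands one sample late, so
# "on" pixels occupy 5/8 of a cycle and "off" pixels 3/8.
skewed_duty = np.tile([1.0, 1, 1, 1, 1, 0, 0, 0], len(n) // 8)

shift = apparent_hue(skewed_duty) - apparent_hue(even_duty)
print(shift)               # the artifact hue moves by 22.5 degrees
```

Because the shift depends on where each receiver's own filtering and clamping put the effective edge positions, a skewed duty cycle is one plausible way the same signal could decode to different hues on different sets.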
Bryan Posted February 23, 2018
zzip wrote: "I'm confused why modern high-def flatscreens show artifacting colors at all? Do they have some kind of CRT emulation built in or something?"

Because anything that decodes NTSC video from a composite source will do it. The signal contains frequencies that must be decoded as color.

zzip wrote: "ok but then why did the higher-res composite monitors BITD show the actual bitmap patterns and not the artifact color?"

They only failed to display color if they were using separate chroma/luma mode. Hopefully my post above helps.
zzip Posted February 23, 2018
Bryan wrote: "They only failed to display color if they were using separate chroma/luma mode. Hopefully my post above helps."

I guess that explains it. I've never had a composite monitor myself, so I didn't know much about the different modes - I just knew that when I'd see these monitors on other people's Apple II or Atari, I would see the pixel pattern as rendered, with no artifacting. I assumed it was the resolution of those monitors that caused the difference. The book I learned Atari graphics from all those years ago explained artifacts as caused by limited horizontal res: the TV received the pixel data but could not display it at that resolution, resulting in false colors. So that's what I always believed until now! (Of course, I never could understand why some other computers didn't seem to be affected as much by false colors when they operated at the same res.)