Mr SQL #1 Posted March 6, 2016

The Defender-style game StarBlitz caused significant controversy with a few programmers for running virtually flicker-free rendering of 30 fps of animation on stock hardware. Breaking the Stella emulator on some Windows 10 systems in a way that looked like flicker, and not being optimized for PAL, added to the confusion. International reviewers find StarBlitz's flicker well within limits for an Atari game, and they should, because the design closely follows the guidelines from the Atari programming manual:

This thread topic is to discuss the Atari manual; new ideas outside the scope of the manual are encouraged (rods and cones are welcome), as is any illustration of where StarBlitz deviates from the flicker optimization techniques. Flicker optimization is a very manual process; RTM and discuss!
alex_79 #2 Posted March 6, 2016 (edited)

virtually flicker-free rendering 30 fps of animation on stock hardware

How could you call it "virtually flicker free" when every other frame is completely black? You are flickering the entire screen at 30Hz here. The 2600 standard display is 262 lines at 60 fps (312 at 50Hz for PAL). Since it doesn't generate the extra half scanline which in standard TV video causes the interlacing (though it has been proven that it can do that), the scanlines of each frame are drawn in the same position as those of the preceding one, which makes a 60Hz progressive display. Simple as that: for the purposes of this discussion, an object is said to flicker if it isn't displayed on every one of those 60 frames per second. Frequency of animation is another thing. You can animate at 15Hz and still display at 60Hz without flicker (each frame of animation would be repeated 4 times in that case).

International reviewers find StarBlitz flicker well within limits for an Atari game and should because the design closely follows the guidelines from the Atari programming manual:

Everyone is entitled to their own opinions. What I think, and - by looking at past discussions on this matter - what most people here seem to think, is that flickering is annoying. The fact that something is written in a book doesn't make it absolute truth, whoever the author is, and especially if it's in contrast with direct experience (I live in Italy, I'm quite used to fighting similar statements...). I'm fine if you think that full-screen 30Hz flicker looks good, but I'm not if your intent is to affirm that it must be good for everyone because it's written in a manual. Flicker is a technique used in 2600 games to overcome the hardware limitations of the machine. It was used back in the day and it's still used on modern homebrews. It is annoying anyway, and it's common opinion that it should be avoided whenever possible.
From the excerpt you posted: "The human eye has a time resolution of about 1/16 of a second, so a program can cycle between four images, one every 1/60 of a second, so that each repeats every 1/15 of a second." That frequency is indeed considered the limit for the illusion of motion when seeing a series of still images. But projecting those images as brief flashes at that frequency is extremely unpleasant to the eyes, and in fact even in movie projection each frame on the film is displayed twice or three times, depending on the frame rate used during filming (24 or 16/18 fps). This is also stated later in that doc: "Furthermore, there will be some unpleasant screen flicker when this technique is used." The fact is that frequency is not the only factor that influences how annoying the flicker is. Flickering only small objects on the screen is much better than flickering the entire display (e.g. the ghosts in PAC-MAN flicker at 15 Hz. It's annoying but still acceptable for most people, and it also gives them a sort of transparency effect which is appropriate for ghosts. But flickering the entire display at 15Hz would cause an epileptic seizure even in the least sensitive person). Flicker looks worse on solid objects than on objects made up of thin lines (e.g. the original Asteroids vs. those hacks which simulate vector graphics). Flicker looks worse on bright objects; the color/intensity of the flickering object vs. the background also influences the result. Looking at flickering images for long periods causes strain to the eyes, so intelligent flicker, which is only applied when it's strictly necessary (e.g. when more than the max number of sprites allowed by the hardware needs to be displayed on the same horizontal line), is better than constant flicker.
Interlacing hugely reduces flicker perception because the brain tends to fill the gaps between the lines (compare the text display in Stellar Track with the one I used here). Ambient light, TV settings, individual sensitivity, age and fatigue are other factors as well. Edited March 6, 2016 by alex_79
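alex_79's distinction between animation rate and display rate can be sketched in a few lines. This is a toy illustration in Python, not 2600 code; the "each animation frame repeated 4 times" figure comes straight from the post above, and the function name is made up for the example:

```python
DISPLAY_HZ = 60  # a 2600 normally sends a full frame to the TV 60 times a second

def animation_schedule(animation_hz, display_frames=12):
    """Which animation frame is shown on each 60 Hz display refresh."""
    repeat = DISPLAY_HZ // animation_hz   # display refreshes per animation frame
    return [i // repeat for i in range(display_frames)]

# 15 Hz animation on a 60 Hz display: every refresh shows an image; each
# animation frame is simply held for 4 refreshes, so nothing flickers.
print(animation_schedule(15))   # [0, 0, 0, 0, 1, 1, 1, 1, 2, 2, 2, 2]

# 30 Hz full-screen flicker: every other refresh shows nothing at all.
print(["image" if i % 2 == 0 else "blank" for i in range(6)])
```

The point of the sketch: the first schedule never leaves a refresh empty, while the second blanks half of them, which is what the thread is arguing about.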
Mr SQL #3 Posted March 6, 2016

How could you call it "virtually flicker free" when every other frame is completely black? You are flickering the entire screen at 30Hz here. The 2600 standard display is 262 lines at 60 fps (312 at 50Hz for PAL). Since it doesn't generate the extra half scanline which in standard TV video causes the interlacing (but it has been proven that it can do that), the scanlines of each frame are drawn in the same position as those of the preceding one which makes a 60Hz progressive display.

There is no "black frame"; black doesn't oscillate on a CRT, only on a display where you see a dark gray. On a CRT, instead of 60 hz, the timing change divides the signal to 30 hz. 30 hz is considered virtually flicker-free when colors are carefully chosen for their impact on phosphor persistence, so that it is enough to maintain the screen when it's updated 30x a second (NTSC television only draws the screen 30x a second; the phosphor does the rest). Read the description from the manual; how does your view differ from the author's? And no, the 2600's standard display is 60 hz, not 60 FPS; it can't ever do 60 FPS unless the dumb terminal technique that I pioneered to render 30 FPS is leveraged with the ARM chip, as SpiceWare was inspired to do. What I got from the discussion is that a couple of junior programmers followed me onto the emulator thread throwing insults and calling names when they couldn't reproduce the problem, and became angry, calling the game "the worst display ever" while simultaneously lamenting repeatedly that their own video of the StarBlitz display running on broken hardware looked "too good". They did claim to speak for everyone but only managed to get a third programmer to join in for victory laps, potty talk and hate mail; perspectives vary, but no other programmers did that, so it looked a lot like jealousy of the smooth scroller.
alex_79 #4 Posted March 6, 2016

There is no "black frame"; black doesn't oscillate on CRT, only on a display where you see a dark gray

Well, using terminology different from the common one doesn't make a difference (apart from making it harder to understand). Anyway, if you don't like "black frame" I can rephrase my sentence: in Starblitz you draw a frame, then, during the time in which another frame could be drawn, you don't display anything. This "lack of frame" causes the phosphors that you turned on in the "active" frames to decay in brightness for much longer than they would if you drew the display 60 times per second. This causes most people to perceive an evident flicker. I'm referring only to hardware running on a CRT; I don't use modern displays with the 2600 and I'm not talking about emulation here.

30 hz is considered virtually flicker free

Who says that? I disagree. Even if you choose the colors which cause more phosphor persistence, brightness decays for much longer between two "active" frames than it would at 60Hz, and I can clearly see the difference. And I'm not particularly sensitive to flicker.

NTSC Television only draws the screen 30x a second, the phosphor does the rest

That's a bit of a biased way to describe it. Each scanline is updated 30 times per second, but the screen is redrawn from top to bottom 60 times per second, alternating odd and even scanlines. That considerably reduces the flicker perception (your brain "fills the gaps" between scanlines). And it is much less evident when displaying photographic images like movies and TV shows, where it's rare to have perfect horizontal lines which exactly correspond to one scanline on the screen. (If you display a computer-generated image without some smoothing algorithm, like the OSD menu from a digital TV receiver, on a CRT TV, you can see a bit of flicker on the horizontal lines, and that's annoying if you have to look at it for extended periods of time.)
Also, for content recorded directly on video (not telecined from film), motion is at 60Hz, because each field has been captured at a different time. In your statement, by ignoring interlacing, you seem to imply that the way you're drawing the display in Starblitz is the same as standard NTSC transmissions, which is not true.

Read the description from the manual; how does your view differ from the authors?

It doesn't, actually, because the author is just saying that flicker can be used to display more objects on the screen. I know that, and the technique was and continues to be used. He also says that the flicker can be annoying, and that's the point. How much flicker is too much for most people? What I'm saying is that flicker perception doesn't just depend on frequency; there are many other factors, some of which I mentioned in my previous post. Flickering the entire display is one factor that makes it more annoying.

And no the 2600's standard display is 60 hz not 60 FPS

Well, I don't really know what you're talking about here and what exactly you mean by a 60Hz display versus 60 frames per second. I'm not a native English speaker and my English is not that good, so I cannot go into those semantic intricacies. The fact is, 2600 games can display 60 distinct images per second and animate them at the same rate without additional hardware. They have since 1977. Additional hardware surely allows for more complex images to be generated and animated, which I think is not relevant in discussing how much flicker is acceptable in a 2600 game.
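The drawing order alex_79 describes can be illustrated with a toy model. This is a sketch using a made-up 10-line "screen" instead of the real 262.5 lines per field, purely to show the interlaced-vs-progressive line ordering:

```python
LINES = 10   # toy screen height; a real NTSC field has 262.5 lines

# Interlaced broadcast: the beam sweeps top to bottom 60 times a second,
# but alternates between even and odd line positions, so any single
# line position is only refreshed 30 times a second.
even_field = list(range(0, LINES, 2))   # first field:  lines 0, 2, 4, 6, 8
odd_field  = list(range(1, LINES, 2))   # second field: lines 1, 3, 5, 7, 9

# The two fields together cover every line position exactly once.
assert sorted(even_field + odd_field) == list(range(LINES))

# 2600-style progressive: the SAME line positions are redrawn on every
# one of the 60 sweeps, so each displayed line refreshes at 60 Hz and
# the in-between positions are simply never drawn.
progressive_field = even_field   # identical positions every field
print(progressive_field)
```

This is the whole difference being argued about: same sweep speed, same line width, only the set of line positions visited per field changes.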
+SpiceWare #5 Posted March 6, 2016

Don't bother, Alex; a number of us (including Thomas Jentzsch) have already tried (3 different links) to explain this to him. I made a video for him at one point to help him find a screen-roll problem, and he keeps spinning that video as support for his flicker method even though I explained (#33) that camera recordings can hide/suppress flicker. That video looked far better than what I saw in real life. I gave up trying and added him to my ignore list.
Mr SQL #6 Posted March 6, 2016

That's a bit of a biased way to describe it. Each scanline is updated 30 times per second, but the screen is redrawn from top to bottom 60 times per second alternating odd and even scanlines.

No, this is a common misconception, Alex; under NTSC the screen is still only redrawn 30 times per second, just like the individual scanlines. The Atari really does draw the screen 60 times per second, by drawing wider bands to fill the screen than NTSC, but it's the phosphor persistence that keeps the scanlines from flickering perceptibly in the interim. To avoid confusion over the claim of NTSC drawing the screen 60 times a second, we should stay on topic with the Atari programming manual; they're not talking about an alternating field when they reference 30 hz, but rather the most conservative use of flicker, hz-wise.
And they already make it clear that updating the image 60x a second requires no flicker optimizations (that's the max). You had a great point about the terminology of "black frame", because the emulator really does draw a black frame, which is just bad emulation, since even a CRT connected to an Atari is likely to store the frames and combine them, so it never shows an extra frame and avoids doubling up the image; I don't want to get into that - this thread is about optimizing flicker based on the Atari programming manual, which was specifically designed for old-school unmodified equipment: a real Atari and a real television. If you know of an Atari game that is showing 60 distinct images per second and animating them at the same rate, I'd like to see it. StarBlitz is written in BASIC and uses the camera object to pan the display just like a television image at 30 fps; that's why it scrolls super smooth. I've seen demos do this but no other games so far besides KC and 9lineBlitz. Instead of letting SpiceWare think for you, why not play around with Virtual World BASIC and try for yourself? BASIC is high level and it only takes a few lines of code to get up and running. Spice, it looks like you want to participate - that's fine; take me off ignore and don't talk potty mouth and you can discuss directly, otherwise please stay off the thread.
alex_79 #7 Posted March 6, 2016 (edited)

The Atari really does draw the screen 60 times per second by drawing wider bands to fill the screen than NTSC...

Don't know where you got that information, but that's just wrong. Scanlines generated by the Atari aren't "wider" than those of a standard NTSC signal. There's nothing in a video signal that controls how "wide" scanlines are. And yes, NTSC effectively draws the screen at 60Hz. The only difference is in the timing of the vertical sync pulse, which on standard NTSC video causes each field to be vertically offset so that the scanlines are drawn in between those of the previous one, while Atari games draw each field (which we could call a frame in this case) exactly over the previous one. That's it. 60 frames per second with 262 scanlines each, all in the same vertical position, on the Atari; 60 fields per second with 262.5 scanlines each, where every other field is offset by 1/2 scanline vertically, on standard NTSC. The time required to draw a scanline and the vertical sweep speed on the CRT are the same, else the TV wouldn't sync to the video signal.

...even a CRT connected to an Atari is likely to store the frames and combine them so never shows an extra frame and avoids doubling up the image

On standard TV sets, the electron beam inside the CRT is piloted in real time by the video signal; the TV doesn't "store" anything.

If you know of an Atari game that is showing 60 distinct images per second and animating them at the same rate I'd like to see it.

Almost every one does that... just pick one!

StarBlitz is written in BASIC and uses the camera object to pan the display just like a Television image at 30 fps, that's why it scrolls super smooth.
As I said before, in TV transmissions recorded on video, motion is at 60fps. That's twice as smooth... Each field is captured at a different time, and in fact a deinterlacing filter is needed to see those videos on a progressive display, else they show a "combing" effect on moving objects. I expressed my opinion about the document you posted in my first reply and I don't have anything to add on this subject. You have some serious misconceptions about how a CRT works, and I suggest researching the topic a bit to clear things up. Edited March 6, 2016 by alex_79
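The "262 vs 262.5" point above works out numerically as follows. A back-of-the-envelope sketch with nominal NTSC figures; the 63.556 µs line time is a standard approximation, not a number from the thread:

```python
LINE_TIME_US = 63.556   # nominal NTSC scanline duration, microseconds (~15734 lines/s)

ntsc_field_us  = 262.5 * LINE_TIME_US   # broadcast interlaced field: 262.5 lines
atari_frame_us = 262.0 * LINE_TIME_US   # 2600 frame: 262 whole lines

print(f"NTSC field: {ntsc_field_us / 1000:.2f} ms")   # ~16.68 ms -> ~59.94 fields/s
print(f"2600 frame: {atari_frame_us / 1000:.2f} ms")  # ~16.65 ms -> ~60.05 frames/s

# The per-line sweep speed is identical in both cases; the entire timing
# difference is the half scanline, which is what offsets alternate
# broadcast fields vertically and which the 2600 simply omits.
assert abs(ntsc_field_us - atari_frame_us - 0.5 * LINE_TIME_US) < 1e-9
```

So a 2600 "frame" and a broadcast "field" are nearly the same duration, which is why an ordinary TV locks onto the 2600's signal at all.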
Mr SQL #8 Posted March 6, 2016

On standard TV sets, the electron beam inside the CRT is piloted in real time by the video signal; the TV doesn't "store" anything. [...] a deinterlacing filter is needed to see those videos on a progressive display, else they show a "combing" effect on moving objects.

Excellent point Alex, I meant to say LCD, not CRT; interlacing doubles up the frames without the deinterlacing filter as you just explained, and that's what I was referring to. I don't believe film was shot at 60 FPS in the 80's; a lot of film was lower than 30 FPS.
The "field" is only half the screen; the Atari signal and the Atari programming manual refer only to a full-screen update. The rich Atari video signal indeed outputs wider scanlines, otherwise 200 of them couldn't fill the visible screen and we'd need two passes to build them, each 2 lines wide, if there was no beam spread! Do you really think every game does 60 FPS of animation? I think this is a miscommunication unless you can illustrate a single game that is rendering even 30 FPS of animation; there are smooth-scrolling demos like this one that was posted in response to StarBlitz, but I haven't seen any other games that scroll more smoothly than Super Cobra from the 80's. Perhaps you mean something else besides scrolling/panning the camera?
roland p #9 Posted March 6, 2016 (edited)

The rich Atari video signal indeed outputs wider scanlines otherwise 200 of them couldn't fill the visible screen and we'd need two passes to build them each 2 lines wide if there was no beam spread!

Could you give a technical explanation? Alex's explanation is spot-on and I have nothing to add; the Atari just has to conform to the standards of a video signal (it's probably a little bit off, but close enough and within the tolerance of CRT TVs). Edited March 6, 2016 by roland p
Joe Musashi #10 Posted March 6, 2016

Really think every game does 60 FPS of animation? I think this is miscommunication unless you can illustrate a single game that is rendering even 30 FPS of animation; there are smooth scrolling demo's like this one that was posted in response to StarBlitz, but I haven't seen any other games that scroll more smoothly than Super Cobra from the 80's. Perhaps you mean something else besides scrolling/panning the camera?

It seems that with "60 FPS of animation" you mean shifting the playfield horizontally at 60Hz. This is indeed not very common, but it has been done before. For example, Empire Strikes Back does this.
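For scale, the playfield shift Joe Musashi describes works out like this. A rough sketch: the 40-pixel playfield width is standard 2600 background resolution, and the two speeds are hypothetical examples, not measurements of any particular game:

```python
PF_PIXELS_ACROSS = 40   # the 2600 playfield spans 40 PF pixels (160 color clocks)
FRAMES_PER_SEC = 60

def seconds_to_scroll_one_screen(pf_pixels_per_frame):
    """Time for the whole playfield width to scroll past at a given rate."""
    return PF_PIXELS_ACROSS / (pf_pixels_per_frame * FRAMES_PER_SEC)

# Shifting 1 PF pixel on every 60 Hz frame (the Empire Strikes Back case):
print(f"{seconds_to_scroll_one_screen(1.0):.2f} s")   # 0.67 s per screen width
# Shifting 1 PF pixel every other frame (an effective 30 Hz update):
print(f"{seconds_to_scroll_one_screen(0.5):.2f} s")   # 1.33 s, half the speed
```

Which is why "1 PF pixel per frame at 60 Hz" is described below as a very fast scroll that only suits certain games.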
Mr SQL #11 Posted March 7, 2016

Could you give a technical explanation? Alex's explanation is spot-on and I have nothing to add; the atari just has to conform to the standards of a video signal (probably it is a little bit off, but it's close enough and within the tolerance of crt-tv's).

I like Alex's explanations, Roland, but I'm also interested in hearing your thoughts about this: "The rich Atari video signal indeed outputs wider scanlines otherwise 200 of them couldn't fill the visible screen and we'd need two passes to build them each 2 lines wide if there was no beam spread!" How would you explain it?

It seems with "60 FPS of animation" you mean shifting the playfield horizontally at 60Hz. This is indeed not very common, but it has been done before. For example, Empire Strikes Back does this.

Good example Joe, but the scrolling jerks a bit in Empire Strikes Back; it may be smoother in spots than Super Cobra though. The demo linked above and StarBlitz both scroll super smooth by comparison. The runtime does vertical, horizontal or diagonal scrolling and surfaces a camera object to the BASIC. Here are some earlier versions of the engine doing diagonal scrolling; 30 fps animation, but not as optimized at 30 hz as StarBlitz (too much on screen, not enough black). The horizontal scroller shifts back and forth between 60 and 30 hz depending on whether the camera is being panned.
+Random Terrain #12 Posted March 7, 2016

What I got from the discussion is that a couple of junior programmers followed me onto the emulator thread throwing insults and calling names when they couldn't reproduce the problem and became angry calling the game "the worst display ever" while simultaneously lamenting repeatedly that their own video of StarBlitz display running on broken hardware looked "too good".

When I search for "the worst display ever" all I get is this thread. Do you have a link?
roland p #13 Posted March 7, 2016

I like Alex's explanations Roland, but I'm also interested in hearing your thoughts about this: The rich Atari video signal indeed outputs wider scanlines otherwise 200 of them couldn't fill the visible screen and we'd need two passes to build them each 2 lines wide if there was no beam spread! How would you explain it?

1. How do you experience 'wider' scanlines? Is there less overscan area? Are there fewer black borders left/right? 2. What does the horizontal width have to do with the number '200', since the number 200 indicates a vertical space? 3. '2 lines wide' - I still don't think you can make the lines wider (widescreen displays can, though). Can you describe this process? I've looked at your game, StarBlitz, and on my laptop, with the brightness turned down, the scrolling looked nice. There is indeed a black frame every other frame, so that causes flicker, but it reduces the motion blur otherwise caused by a 'sample and hold' effect. The effect is nice, but I could imagine that it's unbearable on a CRT with the brightness turned up. My thoughts on 'Manual Flicker Optimization': you should only use it if there is absolutely no other way. If you use it, you should use muted colors (no max brightness), so the flicker is less visible.
Mr SQL #14 Posted March 7, 2016

1. How do you experience 'wider' scanlines? 2. What has the horizontal width to do with the number '200'? 3. '2 lines wide', I still don't think you can make the lines wider. Can you describe this process?

Great questions Roland; here's how I perceive it: I think "wider" means vertical width when talking about scanlines. Only half as many scanlines are required to paint the entire screen - the higher-intensity signal from the Atari saturates all of the phosphor with a single pass that would otherwise paint only one field. I.e., 200 scanlines can paint the entire visible screen instead of the 400 scanlines normally required. How could this be possible if the 200 scanlines are not each twice as wide vertically? The richer signal creates spread in the electron beam. The scrolling effect is even nicer on CRT because there is phosphor persistence in lieu of a black frame, augmented greatly by artifacting with specific Atari models like the Vader, provided there is no composite mod; the combination creates a plasma glow and other cool effects. I don't know what happens when you change the brightness though; I leave the factory settings at the middle defaults.
Agreed on optimizing the colors; they are optimized for NTSC, but I don't have a PAL-optimized version yet, so the game will not look as good on PAL, and even when I've released a PAL version it can't ever match NTSC because of the artifacting.
alex_79 #15 Posted March 7, 2016 (edited)

I don't believe film was shot at 60 FPS in the 80's, a lot of film was lower than 30 FPS.

In fact I specified twice that I'm talking about content shot on "video", not film. Films are converted into video suitable for broadcasting through a technique called "telecine"; that's another story and not related to the topic, and there are many places where you can find info about it if you're interested.

The "field" is only half the screen, the Atari signal and the Atari programming manual refer only to a full screen update.

What the manual calls a "frame" (correctly, because since there's no interlacing it doesn't make much sense to use the term "field") happens to be, from the point of view of the actual video signal and the actual display shown on the TV, the exact same thing as a single "field" in interlaced broadcast video, with just the vertical pulse triggered at a slightly different time. It has been explained several times here and elsewhere in the forums.

Really think every game does 60 FPS of animation? [...] Perhaps you mean something else besides scrolling/panning the camera?

You keep mixing terms. "Animation" is not a synonym for "scrolling". If you meant specifically scrolling the playfield, intended as the graphic object tied to registers PF0, PF1 and PF2 of the TIA, and by 60fps you mean that you have to scroll 1 PF "pixel" (that is, 4 TIA color clocks) on each of the 60 frames in a second, then "The Empire Strikes Back", as mentioned by Joe Musashi, does exactly that when you're moving at max speed (and without flicker).
The fact that other games don't scroll that fast isn't because they can't without a coprocessor, but rather because that's a very fast scrolling that only fits certain types of games. The 2600 doesn't use a bitmapped display, and there are almost infinite ways to mix the available graphics objects, as well as advanced tricks like changing the registers for color and/or graphics on the fly to achieve complex displays. The static playfield is mixed with moveable objects that can also be duplicated, and optimizations in the way the graphic data is stored (or generated by specific algorithms) are also used by skilled programmers, so it doesn't make any sense, in my opinion, to affirm that there's a well-defined limit on how fast the display can be updated on the 2600, or on the max number of pixels you can update in a frame, as each game is a case of its own.

...the higher intensity signal from the Atari...

There are full schematics available for the 2600; could you explain how exactly it generates a signal which is "higher intensity", and compared to what? Is the Atari output not within the analog broadcast signal specs? If so, how can the TV display it? A progressive display can look brighter than an interlaced one, because each scanline is updated at twice the frequency.

...saturates all of the phosphor with a single pass that would otherwise paint only one field. ie, 200 scanlines can paint the entire visible screen instead of the 400 scanlines normally required. How could this be possible if the 200 scanlines are not each twice as wide vertically? The richer signal creates spread in the electron beam.

That reasoning, if we can call it that, is backwards. First you have to demonstrate scientifically that the signal from the Atari "creates spread in the electron beam" (whatever that means), and only then can you conclude that the scanlines are "fatter".
As better explained by others before, the scanlines generated by a progressive 60Hz signal are exactly as wide as those in each field of a 60Hz interlaced signal. The gaps between the scanlines are smaller than the lines themselves, so they would partially "overlap" if you could take still pictures of each field and superimpose them. The fact is that by the time the second field is drawn, the phosphors activated by the previous one have already decayed; the persistence is mostly in our brain (and each individual has different visual persistence, which is why some are more sensitive to flicker). When the fields alternate at 60Hz you really can't spot the overlapping area, and the scanlines appear thinner than on a progressive 60Hz display. They are not, though. A single field looks exactly like a single "frame" in an Atari game. This old demo by Billy Eno (http://atariage.com/forums/topic/249102-calling-all-harmony-cart-users-for-science/page-5?do=findComment&comment=3442797) produces on real hardware a 60Hz interlaced display like broadcast NTSC. In the upper part of the screen you can see some thin diagonal lines on a black background. If you look closely at the pixels (each one is on a different scanline) you'll see how they would overlap if vertically aligned. The scanlines "look" thinner in the solid areas at the bottom of the screen, where they do in fact overlap. So, how is that? Is there something inside the console that detects if you're sending VSYNC pulses with the correct timing to generate an interlaced signal, and turns down the "fat scanline generator" in that case? Can you point me to the part of the circuit that does that in the TIA schematic? The scrolling effect is even nicer on CRT because there is phosphor persistence in lieu of a black frame and augmented greatly by artifacting with specific Atari models like the Vader provided there is no composite mod; the combination creates a plasma glow and other cool effects.
Claiming that your display is triggering "artifacting" requires a technical explanation if you want to be taken seriously. You must explain, on a scientific basis, what triggers the artifact. Anyone with the necessary technical skills should be able to build a circuit that generates a video signal which shows the artifacting, based on your explanation. Just to be clear, what you experience on your specific TV set with your specific Atari is not proof that you're triggering some sort of artifacting. For all we know, what you see could be caused by a TV which is out of convergence or which needs new capacitors, or it might be the console itself that needs servicing. For example, the game "Tower Toppler" on the 7800 creates a solid color by showing alternating on and off pixels at double the NTSC colorburst clock. You can find that "trick" documented in many places (here, for example: https://sites.google.com/site/atari7800wiki/graphics-modes/color-artifacts). It is known why this happens, why it will show on stock RF and composite-modded consoles, and why it won't work on S-video-modded ones. You're saying that this "artifacting" will show on some Atari models and only on RF. What triggers the artifacting in the video signal? What circuitry, present in "some Vader models" and not in others, causes this difference in the video signal? You're saying that it won't work on video-modded consoles. Why? Is the artifacting the result of the RF modulation? How exactly? Does it happen on both channel 2 and 3? Suppose I have an NTSC Vader console which shows this artifact, and I disconnect the RF modulator and install a simple transistor amplifier to get composite. If I then send the composite output through an external RF modulator so that it outputs on channel 2 or 3 like the stock Atari, will the artifact be there? Whatever the answer is, why? Edited March 7, 2016 by alex_79 1 Quote Share this post Link to post Share on other sites
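For what it's worth, the arithmetic behind that Tower Toppler trick is easy to check; here is a quick Python sketch (the one assumption, taken from the linked wiki page, is that the 7800's 320-pixel mode clocks pixels at twice the colorburst rate):

```python
# An on/off/on/off pixel pattern at the 320-mode pixel rate has a fundamental
# frequency equal to the NTSC color subcarrier, so a composite or RF decoder
# reads the pattern as chroma (a solid color) instead of fine luma detail.
# An S-video mod carries luma on a separate wire, which is why the artifact
# disappears there.

NTSC_COLORBURST_HZ = 3_579_545            # NTSC color subcarrier
pixel_rate_hz = 2 * NTSC_COLORBURST_HZ    # 320-mode pixel clock, ~7.16 MHz
pattern_period_pixels = 2                 # on, off, repeat
pattern_fundamental_hz = pixel_rate_hz / pattern_period_pixels

print(pattern_fundamental_hz == NTSC_COLORBURST_HZ)  # -> True: frequencies match
```

That frequency match is the whole mechanism: it lives in the composite signal itself, so it doesn't depend on which console revision produced it.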
Mr SQL #16 Posted March 7, 2016 When I search for "the worst display ever" all I get is this thread. Do you have a link? Sure RT, http://atariage.com/forums/topic/249102-calling-all-harmony-cart-users-for-science/?p=3442039 Jealous of the BASIC smooth scroller! Initially they volunteered to help improve the emulator when StarBlitz broke it 5 different ways: http://atariage.com/forums/topic/248509-starblitz/?p=3438889 Alex, you explained "your brain fills the gaps between scanlines" previously when describing a field; now you say the Atari signal is "the exact same thing as a field" from the point of view of the display. I agree with your first statement, since there are no gaps between scanlines in the Atari signal. Here's a picture of one of my games from the 80's; it shows up in black and white if you do a composite mod. Can you tell me why? Hypothetical discussion is interesting too, but my game designs from then and now are tangible technical examples. If you know of another game smoothly animated at 30 fps, please share it in this thread. The hypothetical game that runs entirely at 60 Hz wouldn't need any flicker optimizations. That seems very hypothetical, since 60 Hz is more stable than NTSC broadcast while Atari games are associated with some degree of flicker - I tend to divide the signal just as the manual describes so I can do more with it.
Quote Share this post Link to post Share on other sites
Trinity #17 Posted March 7, 2016 Round two,,, Fight!!! 1 Quote Share this post Link to post Share on other sites
+SpiceWare #18 Posted March 7, 2016 I knew I shouldn't have clicked on "view it anyway?" Spice looks like you want to participate - that's fine, take me off ignore and don't talk potty mouth and you can discuss directly otherwise please stay off the thread So warning people that you have a proven track record of ignoring reality and twisting what other people have said in order to support your ignorance is now considered "potty mouth"? Well, at least that was a good laugh. Initially they volunteered to help improve the emulator when StarBlitz broke it 5 different ways: http://atariage.com/forums/topic/248509-starblitz/?p=3438889 The issue is that jitter/roll support in Stella is new, and it doesn't correctly emulate StarBlitz's incorrectly written VSYNC routine. This is my todo list for the next time I work on Stella: Screen roll in StarBlitz - VSYNCs shorter than 3 scanlines. Could be the source of roll; need to write a test program to test the results of different VSYNC lengths. http://atariage.com/forums/topic/248950-new-code-making-stella-47-display-inconsistant/?p=3437474 Received a PM: StarBlitz does not have VBLANK turned on during VSYNC. Non-black scanlines during vertical sync can cause the display to be unable to "see" the sync signal, and since the screen rolls occur during background flashes, this is most likely the source of the screen roll rather than the short VSYNC duration. Noticed that the debugger triggers the screen jitter/roll logic when it shouldn't: http://atariage.com/forums/topic/249107-stella-471-released/?p=3445547 The test program would be like the one I wrote to test screen rolls due to variations in scanline counts. The new test ROM will let me change the duration of the VSYNC signal, as well as whether or not VBLANK is on or off during VSYNC. Being able to replicate the problem goes a long way towards emulating it correctly. Don't know when I'll have time to write it, but it won't be until next month at the earliest.
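The two failure conditions in that list boil down to a very small check; here's a toy model of the logic (plain Python, not actual Stella code - the function name and structure are mine, purely to restate the idea):

```python
# A frame is suspect if VSYNC is held for fewer than the recommended 3
# scanlines, or if VBLANK is left off during VSYNC (non-black lines there
# can keep a TV from "seeing" the sync pulse).

def frame_may_roll(vsync_scanlines: int, vblank_on_during_vsync: bool) -> bool:
    too_short = vsync_scanlines < 3
    visible_during_sync = not vblank_on_during_vsync
    return too_short or visible_during_sync

print(frame_may_roll(3, True))   # well-formed frame -> False
print(frame_may_roll(2, True))   # short VSYNC -> True
print(frame_may_roll(3, False))  # VBLANK off during VSYNC -> True
```

A real TV's behavior here varies from set to set, which is exactly why a test ROM on hardware is needed before deciding what the emulator should do.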
So in spite of your implication, I do plan to fix it so that your poorly written code will trigger screen rolls in Stella. Anyhow lesson learned - "view it anyway?" shall remain unclicked. 2 Quote Share this post Link to post Share on other sites
Mr SQL #19 Posted March 7, 2016 Potty mouth is throwing insults and calling names in lieu of exchanging ideas; close your mind, miss the fun. This seems like more miscommunication; I'm open to your idea about exact VSYNC timing and wrote the test code for our experiments, but I also implemented code to test the red shift. Turns out the timing change made no difference (2.9 was close enough), but your setup is overly sensitive to red. One reason could be adjusting the vertical hold to play PAL games but not turning it back precisely. Another could be your composite mod; I've put the red flash back in when meteors disintegrate against the ion shield or are taken out by the drones, because in reality meteors get vaporized and disperse rapidly into the ether when that happens, causing a red flash. imo there's no credible reason so far to make the emulator break more with StarBlitz.
Keatah identified a big list of emulator bugs with StarBlitz and solved the shaking and jittering issue, where Windows 10 had simply changed the hooks for Direct3D and OpenGL; that's an easy solution to implement. If you can find bugs in StarBlitz, I'll help you test for them too to improve the emu. It's more fun when programmers disagree and have different ideas and perspectives to exchange, and besides, your games are inspiring - I want to make a 60 Hz version of Virtual World BASIC with ZackAttack's engine when he finishes it, and after I've finished porting the runtime to the 8-bit/5200. You should take me off ignore and add me to the friend list, inspire the other developers! :) Quote Share this post Link to post Share on other sites
+Random Terrain #20 Posted March 7, 2016 Sure RT, http://atariage.com/forums/topic/249102-calling-all-harmony-cart-users-for-science/?p=3442039 Jealous of the BASIC smooth scroller! Initially they volunteered to help improve the emulator when StarBlitz broke it 5 different ways: http://atariage.com/forums/topic/248509-starblitz/?p=3438889 OK, thanks. So you were paraphrasing. That's why I couldn't find anything. 1 Quote Share this post Link to post Share on other sites
roland p #21 Posted March 7, 2016 The Ballblazer kernel uses a two-line kernel which is just blank every other line. I once had the idea to 'interlace' it, so in one frame the odd lines were used, and in the other frame the even lines were used. This way, I could effectively increase the vertical resolution. It also resulted in smoother vertical motion (forward/backwards). Well, the audience didn't like it, so I dropped it. But it was an interesting effect anyway. It appears that people prefer things that don't flicker. ballblazer_sort_of_interlaced.bin 3 Quote Share this post Link to post Share on other sites
alex_79 #22 Posted March 7, 2016 (edited) Alex, You explained "your brain fills the gaps between scanlines" previously when describing a field; now you say the Atari signal is "the exact same thing as a field" from the point of view of the display. I agree with your first statement since there are no gaps between scanlines in the Atari signal. Sure. Here's a picture of one of my games from the 80's; It shows up in black and white if you do a composite mod. Can you tell me why? Nope, and it would be hard to say anyway, based only on a tiny image without any information about what hardware is supposed to generate that picture. But I could understand a well-documented technical explanation (I'd surely find it interesting). But again, back on the VCS, you didn't answer my question. How can you claim that you're using your programming skills to trigger video artifacts that enhance the picture if you have no idea what is causing those supposed artifacts? Do you know anyone with the technical skills to confirm that those aren't just the result of your specific setup, and who can explain how they work? It's a reasonable question, since you clearly have no clue about how an analog TV works. Hypothetical discussion is interesting too, but my game designs from then and now are tangible technical examples. If you know of another game smoothly animated at 30 fps please share it on this thread. The hypothetical game that runs entirely at 60 hz wouldn't need any flicker optimizations. That seems very hypothetical since 60 hz is more stable than NTSC broadcast while Atari games are associated with some degree of flicker "The Empire Strikes Back" scrolls the playfield at 60fps without flicker. It has been mentioned twice already in this thread, and you just ignored it. Anyone can verify that by simply running it frame by frame in the Stella debugger. "Thrust" scrolls the playfield at 30fps, again without flickering the entire display.
Many other good games with smooth scrolling (also using other graphic elements to draw the screen, not just the PF) are mentioned in one of the threads linked by Spiceware in a previous post. There's a nice collection of roms on Atarimania. In that zip archive you can find over 1800 "hypothetical" 60Hz games. StarBlitz is the only game entirely flickering at 30Hz that I'm aware of. I tend to divide the signal just as the manual describes so I can do more with it. I didn't read the entire "De Re Atari" manual, as I'm not interested in Atari 8-bit computers, but I'm pretty sure it doesn't say anywhere that flickering the entire display is considered "standard" or "good practice". Take care! Edited March 7, 2016 by alex_79 1 Quote Share this post Link to post Share on other sites
Mr SQL #23 Posted March 8, 2016 I didn't ignore Empire Strikes Back, Alex, but you'll have to reread my reply to Joe to make sure; I pointed out the movement was jerky and similar to Super Cobra by comparison to both StarBlitz and the demo posted on pouet. Does it seem super smooth to you?
Ditto for Thrust, and the objects double up on the bottom like double vision - that's not even as smooth as Super Cobra. If you can take a smooth scrolling demo and find enough cycles to turn it into a game, you can go up against this BASIC program with your Assembly skills. The exact page from the manual I posted describes this design for very conservative use of flicker - you can't get more conservative than 30 Hz; good point about not filling the entire display, you can see the issues that causes in my diagonal scroller. That's also perspective; does it appear to you that I've filled the entire display in StarBlitz, or is it mostly 0 Hz (black)? Please post a link to a super smooth scroller that's a game and not a demo if you want to keep insisting they exist and claiming I ignore them. I'd really like to see one; I don't think it's been done before. Quote Share this post Link to post Share on other sites
Mr SQL #24 Posted March 8, 2016 BallBlazer has awesome effects; your latest release looks very solid. The 30 Hz in StarBlitz only seems outrageous to a couple of programmers; most folks like the display, as you did. It's also an automatic deinterlacing filter if you have the Atari connected to an LCD; you won't see the black frames like on your laptop. 1 Quote Share this post Link to post Share on other sites
+Random Terrain #25 Posted March 8, 2016 I just loaded The Empire Strikes Back in Stella, and if you start the game, then tap the left arrow key 4 or 5 times, the playfield at the bottom seems to scroll pretty smoothly. Go full speed and the mountains at the top scroll smoothly. I turned on debug colors mode in Stella and they don't seem to be using the ball or anything else, so I don't know why the playfield seems to scroll so smoothly at certain speeds. 1 Quote Share this post Link to post Share on other sites