
Atari 8bit is superior to the ST


Marius

Atari 8bit is superior to the ST  

210 members have voted

  1. Do you agree?

    • Yes; Atari 8bit is superior to ST in all ways
    • Yes; Atari 8bit is superior to ST in most ways
    • NO; Atari ST is superior to 8bit in all ways
    • NO; Atari ST is superior to 8bit in most ways
    • NO; Both systems are cool on their own.


Recommended Posts

The crappy YM chip provides bitwise I/O similar to a PIA. So maybe they went that route rather than the alternative of Pokey, which would have meant they'd also need a PIA/CIA chip.

 

The other possibility is that using something other than Pokey differentiates the machine from the XE.

 

Yet another one is that maybe POKEY wouldn't work at 2 MHz. They could use a divide-by-8 and run it at 1 MHz, but that affects the tone range. Most POKEY applications (i.e. arcade boards) ran it at speeds between 1.2 and 1.79 MHz.
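For a rough sense of what that clock change would do, here's a back-of-the-envelope sketch (a minimal Python calculation; it glosses over POKEY's exact divider formula and just assumes output pitch scales linearly with the chip clock):

```python
# Rough effect of running POKEY at 1 MHz (8 MHz / 8) instead of ~1.79 MHz.
# Assumption: for a given divider setting, output frequency scales linearly
# with the input clock, so the whole tone range shifts by the same factor.
import math

ntsc_clock = 1.7897725e6      # ~1.79 MHz, what POKEY sees in an NTSC A8
st_divided = 8.0e6 / 8        # 1 MHz from an 8 MHz ST master clock

ratio = st_divided / ntsc_clock
print(f"pitch scale factor: {ratio:.3f}")              # ~0.56
print(f"shift in octaves:   {math.log2(ratio):+.2f}")  # ~-0.84 (about 10 semitones lower)
```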

 

I seem to remember it being mentioned in a previous discussion that POKEY would have had trouble interfacing with the ST architecture (or a 68k system in general) for some reason, making its I/O capabilities useless (otherwise it could have been used for keyboard reading as well). Even if that had been possible, it wouldn't have replaced the YM2149's I/O functionality, so the two would have needed to be used together (although that wouldn't make too bad a combo for sound; dual POKEYs would probably be better, let alone quad). I still think the YM2203 is the simplest/cleanest option: available in 1985, with the same I/O (and sound) functionality as the 2149, but adding three 4-operator FM synthesis voices. (4-op is significantly more capable than the 2-op OPL series used in the AdLib/Sound Blaster, and is the type used in the popular YM2151 in arcade boards and the X68000, plus the chips used in the Genesis and Neo Geo and various other arcade machines (often the 2151, but sometimes others).)


The very first example I gave was the Amiga monitor, which does RGB at 640+ resolution but cannot do NTSC composite at 640+ resolution. In fact, plugging in other S-Video/composite sources also fails to do 640+ resolution. I don't know where you get your information, but there are articles that state clearly that the horizontal resolution limiting factor isn't just the target TV set. Here's one.

 

http://www.hometheaterhifi.com/volume_6_3/essay-video-resolution-july-99.html

 

Of course, you can prove it for yourself given the bandwidth used to encode the luminance. Of course, the color signal is much lower resolution. And for your information, the VGA to NTSC encoder was plugged into a TV capable of HDTV.

Umm, if it was an HDTV, that's even less relevant, especially if it was an LCD/plasma, but even CRTs would have similar problems: SDTV tends to look worse on HDTVs than on native CRT SDTVs (or even LCD SDTVs). If that was your setup, the encoder would be scaling your VGA signal down to standard-definition resolution (probably 640x480 for the simplicity of square pixels) and outputting a 480i signal; the HDTV can't use that for display directly (even CRT models), so it deinterlaces and filters it, then scales it to the TV's native resolution. So you're losing a ton of detail (the best case is an HD CRT, but even then you lose a lot). Which is also why it's really silly for people who mostly watch SD content to get HDTVs, or to use cheaper cable/satellite decoder boxes with only composite/RF output.

 

As I understand it, the only hard standard for NTSC (or analog broadcast in general) is the number of scanlines used, with any interpretation of pixels made as an approximation from the quality of the analog signal used (unless the output device is digital itself, converting to analog, in which case there is a finite number of pixels output)...

The bandwidth plays a role in determining what resolutions you can encode. For DVDs and other things, they usually round to 640*480 just to keep the aspect ratio, not because NTSC can actually resolve 640 discrete pixels, although each one would still have some fractional effect.

 

There are bandwidth limitations in the method of transmission (RGB~YPbPr > Y/C > composite), and that's certainly part of it, but just how blurry a composite signal looks is not determined only by the bandwidth limitations (also by things like comb filters and the source's encoder). That's also why composite and S-Video aren't (supposed to be) used for anything higher than 480i, not ED/HD content. (You technically can display them, and some devices will, but not all of the pixels will be discernible; well, except without color, since luma would have the same bandwidth as with YPbPr.)

 

Also, digital video formats used for NTSC tend to use either 704x480i or 720x480i (and the same for PAL, except 576i); in the case of the DV NTSC format, 720x480i. And that can be used either for a 4:3 display or for 16:9 (anamorphic), with the pixel aspect ratio varying (neither aspect ratio using square pixels). So convenience of square pixels is definitely not the reason for using such high horizontal resolutions. (And in the case of DV NTSC, it's definitely 720x480i: if you set it to square pixels it will look off, and it is most definitely interlaced, recorded as such; combing is quite prominent with fast panning.)

 

Interesting article too, but didn't really tell me anything specific I didn't already know.

 

Why? Some games/applications work fine at 160*240 on A8 with or without DLIs. That's an advantage of A8-- that it can deal with these modes at a faster throughput than Atari ST. That's why some of the games on A8 are better than their ports on ST.

 

OK, scratch the 192 part, but the 160-pixel width still stands, and the lower color count as well. (Since the A8 can't be directly compared to the ST at 320x200x16 colors, the only way to go is having the ST drop to something comparable to the A8.) If not clipping it to 160 pixels, maybe have each pixel repeated (like the low-detail mode in Doom), thus keeping the full window size.

Edited by kool kitty89

 

Yes, ST can do software sprites, but A8 wins in games where hardware sprites are predominant (like where 4 p/m per scanline are multiplexed).

 

 

And this would be which game(s) exactly?

 

Are you seriously trying to convince me, as an ex 8- and 16-bit developer, that the A8's paltry sprite hardware, which doesn't even come close to the C64's sprite ability, is now also better than having an 8 MHz CPU chucking software sprites at the screen?

 

So let's recap, you have:

 

Four tiny mono player sprites and four even tinier mono missile sprites at 1.8 MHz, versus any size of software sprite in 16-colour 4-bitplane or 8-colour 3-bitplane being whacked on the screen at 8 MHz.

 

Which of these do you think a developer would rather have, I wonder?

 

And just how many 16-pixel-wide, A8-sized sprites do you think an ST could whack onto the screen inside one frame? Tell you what, why don't you go and code it to find out?

 

And lastly, if this A8 super-multiplexing is in reality any use, why does everything after 1984 have to use software sprites? Funny that, isn't it?

 

Steve


 

Yes, ST can do software sprites, but A8 wins in games where hardware sprites are predominant (like where 4 p/m per scanline are multiplexed).

 

 

And this would be which game(s) exactly?

 

Are you seriously trying to convince me, as an ex 8- and 16-bit developer, that the A8's paltry sprite hardware, which doesn't even come close to the C64's sprite ability, is now also better than having an 8 MHz CPU chucking software sprites at the screen?

...

Let's not talk C64 again-- it has other problems when compared with A8 hardware, just not sprites. I stated the following:

 

(1) Atari 8-bit has sprites and the ST does not, nor can it make up for them with its faster CPU.

(2) Atari 8-bit has more shades, and the ST cannot make up for it even with more colors.

(3) Atari 8-bit has faster joystick I/O than the ST.

(4) Atari 8-bit can update a scrolling display faster, although at lower resolutions, thanks to ANTIC's various modes and features.

(5) Atari 8-bit has easier overscanning capability.

I haven't tested all aspects of Atari ST hardware to draw any conclusions or give more points, but the above should be obvious.

 

The SPRITES cannot be made up for by the faster ST CPU; i.e., the time it takes me to write to the HPOS registers is much less than the time to erase and redraw on the screen. In fact, using the cycle-free mode of A8 sprites, updating HPOS is faster than on most modern video cards, let alone a simple 8 MHz 68K CPU with its inefficient memory cycles.

 

So let's recap, you have:

...

I just recapped it above. I am NOT comparing the entire graphics system, just the sprites in (1).

 

Which of these do you think a developer would rather have, I wonder?

...

Both are better than one. I would rather have software and hardware sprites than just software. Some games like River Raid, Joust, and others I mentioned get away with just hardware sprites.

 

And just how many 16-pixel-wide, A8-sized sprites do you think an ST could whack onto the screen inside one frame? Tell you what, why don't you go and code it to find out?

...

I'll tell you what-- use logic and you won't have to waste time coding it. Compute the pixels rendered by A8 hardware sprites per second, and then compute how many you can do on the ST with the CPU at full throttle.
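For what it's worth, a very rough version of that calculation might look like the sketch below. The A8 side just counts the pixels the player/missile hardware puts up each frame with no drawing cost; the ST figure leans on an assumed cycles-per-pixel cost for a masked multi-bitplane software sprite, which is only a ballpark guess, and it pretends the CPU does nothing else all frame:

```python
# Back-of-the-envelope sprite "throughput" comparison.
# The ST cycles-per-pixel figure is an assumption for illustration only;
# real blitting routines vary widely depending on unrolling and masking.

FPS = 60

# Atari 8-bit P/M graphics: 4 players x 8 px + 4 missiles x 2 px per scanline,
# generated by the hardware with no per-pixel CPU work.
a8_px_per_line = 4 * 8 + 4 * 2
a8_px_per_second = a8_px_per_line * 240 * FPS           # 240-line-tall display

# Atari ST: 8 MHz 68000 drawing everything in software.
st_cycles_per_frame = 8_000_000 // FPS
assumed_cycles_per_px = 8                               # rough guess, masked 4-bitplane blit
st_px_per_frame = st_cycles_per_frame // assumed_cycles_per_px

print(f"A8 hardware sprite pixels per second: {a8_px_per_second:,}")
print(f"ST software sprite pixels per frame:  {st_px_per_frame:,} "
      f"(if the CPU did nothing but draw)")
```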

 

And lastly, if this A8 super-multiplexing is in reality any use, why does everything after 1984 have to use software sprites? Funny that, isn't it?

 

Steve

 

The Amiga, C64, A8, A7800, A2600, etc. (all those with some gaming in mind) used hardware sprites. The PC/Apple had gaming as an afterthought, so they relied on inferior software sprites. And I don't see it as a valid argument that sprites aren't useful just because people stopped using them.


The very first example I gave was the Amiga monitor, which does RGB at 640+ resolution but cannot do NTSC composite at 640+ resolution. In fact, plugging in other S-Video/composite sources also fails to do 640+ resolution. I don't know where you get your information, but there are articles that state clearly that the horizontal resolution limiting factor isn't just the target TV set. Here's one.

 

http://www.hometheaterhifi.com/volume_6_3/essay-video-resolution-july-99.html

 

Of course, you can prove it for yourself given the bandwidth used to encode the luminance. Of course, the color signal is much lower resolution. And for your information, the VGA to NTSC encoder was plugged into a TV capable of HDTV.

Umm, if it was an HDTV, that's even less relevant, especially if it was an LCD/plasma, but even CRTs would have similar problems: SDTV tends to look worse on HDTVs than on native CRT SDTVs (or even LCD SDTVs). If that was your setup, the encoder would be scaling your VGA signal down to standard-definition resolution (probably 640x480 for the simplicity of square pixels) and outputting a 480i signal; the HDTV can't use that for display directly (even CRT models), so it deinterlaces and filters it, then scales it to the TV's native resolution. So you're losing a ton of detail (the best case is an HD CRT, but even then you lose a lot). Which is also why it's really silly for people who mostly watch SD content to get HDTVs, or to use cheaper cable/satellite decoder boxes with only composite/RF output.

 

As I understand it, the only hard standard for NTSC (or analog broadcast in general) is the number of scanlines used, with any interpretation of pixels made as an approximation from the quality of the analog signal used (unless the output device is digital itself, converting to analog, in which case there is a finite number of pixels output)...

The bandwidth plays a role in determining what resolutions you can encode. For DVDs and other things, they usually round to 640*480 just to keep the aspect ratio, not because NTSC can actually resolve 640 discrete pixels, although each one would still have some fractional effect.

 

There are bandwidth limitations in the method of transmission (RGB~YPbPr > Y/C > composite), and that's certainly part of it, but just how blurry a composite signal looks is not determined only by the bandwidth limitations (also by things like comb filters and the source's encoder). That's also why composite and S-Video aren't (supposed to be) used for anything higher than 480i, not ED/HD content. (You technically can display them, and some devices will, but not all of the pixels will be discernible; well, except without color, since luma would have the same bandwidth as with YPbPr.)

...

So you agree that both chroma and luma have limits.

 

Also, digital video formats used for NTSC tend to use either 704x480i or 720x480i (and the same for PAL, except 576i); in the case of the DV NTSC format, 720x480i. And that can be used either for a 4:3 display or for 16:9 (anamorphic), with the pixel aspect ratio varying (neither aspect ratio using square pixels). So convenience of square pixels is definitely not the reason for using such high horizontal resolutions. (And in the case of DV NTSC, it's definitely 720x480i: if you set it to square pixels it will look off, and it is most definitely interlaced, recorded as such; combing is quite prominent with fast panning.)

...

You can encode to whatever resolution you want but what shows depends on many factors including NTSC bandwidth and what your TV can display. And they don't choose 640*480, 704*480, etc. because NTSC can show them clearly at pixel accuracy.

 

Interesting article too, but didn't really tell me anything specific I didn't already know.

...

It made a claim that the limitation is not due to the TV set alone: "In reality, there are a number of other very technical factors that limit the actual resolution you physically get, but these are beyond the scope of this introductory paper." And it specifically states that broadcast TV has a 330-line horizontal resolution limit. That contradicts your absurd notion of unlimited horizontal resolution. QED.

 

Why? Some games/applications work fine at 160*240 on A8 with or without DLIs. That's an advantage of A8-- that it can deal with these modes at a faster throughput than Atari ST. That's why some of the games on A8 are better than their ports on ST.

 

OK, scratch the 192 part, but the 160-pixel width still stands, and the lower color count as well. (Since the A8 can't be directly compared to the ST at 320x200x16 colors, the only way to go is having the ST drop to something comparable to the A8.) If not clipping it to 160 pixels, maybe have each pixel repeated (like the low-detail mode in Doom), thus keeping the full window size.

 

Come on, a superior machine should be able to do anything an inferior machine can, plus MORE. And 160 isn't a limit either; you can do a 384*240 setup on the Atari, although colors are more restricted, but the scrolling will be smoother. I picked the 160 mode since many games use that mode, but even in this mode you can do overscan and get 192*240, which is sufficient for many games. I am not comparing it to 320*200*16 but to full-screen graphics modes. It has to handle all Atari full-screen graphics modes at a faster or equal speed. It can't do the GTIA Gr.9 mode due to shading, it can't do scrolling of any modes at even close to A8 speeds, and it can't do text modes at the same or faster speed than the A8 either.


Yes, very good atariksi, you do love your figures, don't you? Never done it practically, but you do like figures because they don't show up failing on screen, I suppose?

 

And you're saying that the A8's hardware sprite system is better for games, and you quote River Raid and Joust? When the ST was being tasked with stuff like R-Type, which the A8 would have no chance of doing in hardware? (Even though certain other "unmentionable" 8-bits did.)

 

Let's be frank, the ST could and did knock the A8's hardware sprite capability into a cocked hat.

 

And yes, as 16-bit developers we would have traded much for decent hardware sprites on both the ST AND the Amiga. But then, 10 years apart, the paltry A8 hardware sprites are hardly what you could call "decent", are they?

 

Steve

 

edit:

 

Actually, your comment about the Amiga routinely using hardware sprites is not really true. We considered them fairly useless, as their size, number, and colour disadvantages made them good for little more than bullets, and why write another routine just to display bullets? However, if they had had the size and capabilities of the sprites from the "unmentionable" machine, it would have been quite different.

Edited by STE'86

Yes, very good atariksi, you do love your figures, don't you? Never done it practically, but you do like figures because they don't show up failing on screen, I suppose?

...

Oh come on, using LDA #/STA HPOS gives you plenty of time on the A8 to do other things. On the ST and other non-sprite-based machines, you have to erase the previous figure and redraw the new one. So every little helps, and these processors aren't that fast, so the more support hardware the better.

 

Let's be frank, the ST could and did knock the A8's hardware sprite capability into a cocked hat.

...

I would say that the ST allowed other types of games, like King's Quest, that don't involve much sprite motion, to be better than on the A8.

 

And yes, as 16-bit developers we would have traded much for decent hardware sprites on both the ST AND the Amiga. But then, 10 years apart, the paltry A8 hardware sprites are hardly what you could call "decent", are they?

...

Creativity helps, as can be seen in many A8 games that use sprites.

 

Actually, your comment about the Amiga routinely using hardware sprites is not really true. We considered them fairly useless, as their size, number, and colour disadvantages made them good for little more than bullets, and why write another routine just to display bullets? However, if they had had the size and capabilities of the sprites from the "unmentionable" machine, it would have been quite different.

 

I am not making a claim as to how routinely they were used, but they were an option on the Amiga. And you had the blitter to help you along as well.


I'd agree with the LDA/STA HPOS statement if that was all that was involved, but you know as well as anyone else that when you move vertically you've got a load of CPU-intensive clearing/drawing to do into the PMGs, and that is a real disadvantage, because then you're back in the realm of something close to software sprites again. Of course, if you want to do games where your sprites never move in Y, you're sorted.
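To put a rough number on that point, here's a small sketch (assuming double-line-resolution P/M graphics, where each player is a 128-byte vertical stripe, and an assumed per-byte cost for a plain indexed copy loop; the cycle figures are estimates, not measurements):

```python
# Rough cost of horizontal vs vertical player movement on the A8.

# Horizontal: one immediate load plus one store to an HPOSPx register.
hpos_move_cycles = 2 + 4                     # LDA #imm + STA abs

# Vertical: the player image lives in a vertical stripe of P/M memory,
# so moving in Y means clearing the old bytes and rewriting the shape
# at the new offset within the stripe.
shape_height = 16                            # a 16-line-tall player image
assumed_cycles_per_byte = 12                 # indexed load/store plus loop overhead (estimate)
vertical_move_cycles = 2 * shape_height * assumed_cycles_per_byte   # erase + redraw

print(f"horizontal reposition: ~{hpos_move_cycles} cycles")
print(f"vertical reposition:   ~{vertical_move_cycles} cycles")
```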

 

I'd say, from what I've seen/coded, both machines can probably move around the same number of same-sized objects. The thing is, the ST ones would be 16-colour square-pixel while the A8 ones would be mono at 2:1. Unless of course you do special cases for the A8 where nothing moves.

 

 

Pete


The very first example I gave was the Amiga monitor, which does RGB at 640+ resolution but cannot do NTSC composite at 640+ resolution. In fact, plugging in other S-Video/composite sources also fails to do 640+ resolution. I don't know where you get your information, but there are articles that state clearly that the horizontal resolution limiting factor isn't just the target TV set. Here's one.

 

http://www.hometheaterhifi.com/volume_6_3/essay-video-resolution-july-99.html

 

Of course, you can prove it for yourself given the bandwidth used to encode the luminance. Of course, the color signal is much lower resolution. And for your information, the VGA to NTSC encoder was plugged into a TV capable of HDTV.

Umm, if it was an HDTV, that's even less relevant,...

One more thing regarding this: so far I can't show 640+ pixels on an Amiga monitor capable of 640+ pixels horizontally, and I can't do it on an HDTV. So it's like some Pegasus (flying horse) exists but you can never see it. A VCR, Amiga video out, Hi8 video cameras, etc. can't show it either, but supposedly it's doable. Right, nice try.


Your response was mostly answered, but I just want to address/reinforce a few things.

...

It's not that narrow of an area. Given that A8 programmers will target the hardware (unless a game is being ported from some other platform), it can be problematic for the ST. So games like Joust, River Raid, Boulder Dash, Hero, etc., which employ some or most of the A8 hardware features, cannot be outdone by the ST. Of course, ST applications targeting the higher resolution with higher color depth would be problematic for the A8.

 

The ST had at least one decent port of Joust. It had all the major features and animations of the arcade game, decent sound effects, and the movement of the objects and the response to player input were good. I spent many hours playing it. Joust at least was not as problematic as you say. Perhaps the developers had to expend some effort, but my non-STE 520ST had no difficulties with that one. I couldn't locate video of it, but here are some screenies:

 

http://www.mobygames...ust/screenshots

 

The site with the Spectrum 512 pix had the STE pix clearly labeled, though I see little reason to exclude any example of the ST line; it seems yet another example of the arbitrary rules that crop up in these threads. Be that as it may, the ST was generally more powerful than an A8, even the lowly 512K 520ST. I'll also note that same lowly example unit will display Spectrum 512 pix. These feature 48 colors out of 512 per scanline. The Spectrum 512 drawing program could rip and retry those colors so that an artist didn't have to program as well as draw the pix. A few extra lumas here and there still don't get anywhere near that degree of freedom.

 

Like others, I'd rather mess around with an A8 than an ST these days, because the ST seems much more like an early modern machine. There are more old A8 games I'd rather play, but I don't see any need for rose-colored glasses when directly comparing the two. Overall, there is much more the ST can do, raw-capability-wise, that is out of the A8's reach than vice versa. It was obviously more powerful to me as a teen, and it remains so, even though I have fonder memories of the A8.


Your response was mostly answered, but I just want to address/reinforce a few things.

...

It's not that narrow of an area. Given that A8 programmers will target the hardware (unless a game is being ported from some other platform), it can be problematic for the ST. So games like Joust, River Raid, Boulder Dash, Hero, etc., which employ some or most of the A8 hardware features, cannot be outdone by the ST. Of course, ST applications targeting the higher resolution with higher color depth would be problematic for the A8.

 

The ST had at least one decent port of Joust. It had all the major features and animations of the arcade game, decent sound effects, and the movement of the objects and the response to player input were good. I spent many hours playing it. Joust at least was not as problematic as you say. Perhaps the developers had to expend some effort, but my non-STE 520ST had no difficulties with that one. I couldn't locate video of it, but here are some screenies:

 

http://www.mobygames...ust/screenshots

...

The Joust I played on the ST had shearing problems in its graphics updates. Nonetheless, it's logical to see how such low-res, sprite-oriented games would be hard to do on the ST (especially ones with overscan and scrolling).

 

The site with the Spectrum 512 pix had the STE pix clearly labeled, though I see little reason to exclude any example of the ST line; it seems yet another example of the arbitrary rules that crop up in these threads. Be that as it may, the ST was generally more powerful than an A8, even the lowly 512K 520ST. I'll also note that same lowly example unit will display Spectrum 512 pix. These feature 48 colors out of 512 per scanline. The Spectrum 512 drawing program could rip and retry those colors so that an artist didn't have to program as well as draw the pix. A few extra lumas here and there still don't get anywhere near that degree of freedom.

...

The STE wasn't as prevalent; probably more Video Toaster Amigas were sold than STEs, but nonetheless Video Toasters are not a standard component of Amigas. For gray-scale imagery, including interlaced images, the A8 wins. I think ComputerEyes used the 16-gray mode. Yeah, the 68000 is overall superior to the 6502, but the ST (not the STE) overburdens the CPU to do many of its audio-visual tasks. I am not saying the ST is inferior, just making some points where the A8 wins, and those points are pretty common in A8 software.


Come on, a superior machine should be able to do anything an inferior machine can, plus MORE. And 160 isn't a limit either; you can do a 384*240 setup on the Atari, although colors are more restricted, but the scrolling will be smoother. I picked the 160 mode since many games use that mode, but even in this mode you can do overscan and get 192*240, which is sufficient for many games. I am not comparing it to 320*200*16 but to full-screen graphics modes. It has to handle all Atari full-screen graphics modes at a faster or equal speed. It can't do the GTIA Gr.9 mode due to shading, it can't do scrolling of any modes at even close to A8 speeds, and it can't do text modes at the same or faster speed than the A8 either.

 

Yes, a new machine should be more capable, and I certainly agree there, but I was commenting on the shift in discussion to ST vs A8 capabilities regardless of the age difference. At their respective times of release, the A8 was far more innovative and advanced for sure.

As has been mentioned before, the ST might be more analogous to something like a 16-bit counterpart to the ZX Spectrum or Amstrad CPC as far as 8-bit contemporaries are concerned. (The Speccy is probably the better example given how late the CPC came out; perhaps the MSX would be comparable as well.)

 

Of course, you can prove it for yourself given the bandwidth used to encode the luminance. Of course, the color signal is much lower resolution. And for your information, the VGA to NTSC encoder was plugged into a TV capable of HDTV.

Umm, if it was an HDTV, that's even less relevant,...

One more thing regarding this: so far I can't show 640+ pixels on an Amiga monitor capable of 640+ pixels horizontally, and I can't do it on an HDTV. So it's like some Pegasus (flying horse) exists but you can never see it. A VCR, Amiga video out, Hi8 video cameras, etc. can't show it either, but supposedly it's doable. Right, nice try.

 

No, an HDTV can certainly display that resolution; connecting via DVI/HDMI, VGA, or YPbPr will work fine with higher resolutions (best at the native resolution, which oddly enough HDMI doesn't always support properly, which is why we went with VGA to hook up our PC to an HDTV with a 1360x768 native display; HDMI wanted to use 1280x720, which was scaled and thus blurry, while VGA was much sharper because it used the native resolution).

Anyway, VGA and HDMI are one thing, but another issue is YPbPr: it can carry HD signals, but it can also carry standard-def NTSC/PAL-type signals (480i/576i). No HDTVs natively support these resolutions (not even CRTs); they need to be deinterlaced, are usually filtered after that, and are then scaled to a compatible native resolution (with 1080i CRTs, it would be displayed the same way 540/480p is, so no actual scaling, I believe; with plasmas and LCDs it would be fixed to the physical number of pixels in the display). Even worse is 240p, which tends to get deinterlaced and line-doubled, adding significant artifacts, and won't be accepted by some TVs at all.

 

Still, the luma signal of S-Video and YPbPr should be identical for displaying the same image, so luma would have similar limitations in both (color bandwidth is obviously more limited with Y/C, and with composite, luma can be limited by degradation during encoding). I think RGB might be limited in a similar manner (at NTSC/PAL frequencies, which is what SCART uses and is standard definition only), with 15 kHz horizontal sync.

Edited by kool kitty89

It made a claim that the limitation is not due to the TV set alone: "In reality, there are a number of other very technical factors that limit the actual resolution you physically get, but these are beyond the scope of this introductory paper." And it specifically states that broadcast TV has a 330-line horizontal resolution limit.

Uh-uh. That article clearly states that the resolution of 330 pixels is the "number drawn within the distance of the height of the picture" - that is, if your screen has a height of Y, then 330 is the number of pixels visible horizontally over a distance of Y. Since the width of the screen is 1.33 times Y, you have to multiply 330 by 1.33 to get the full horizontal resolution.
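That reading can be sanity-checked with the standard back-of-the-envelope relation between luma bandwidth and horizontal resolution (one cycle of bandwidth resolves two alternating vertical lines; dividing by the 4:3 aspect ratio converts full-width resolution to the per-picture-height figure the article quotes). A minimal Python check, using the usual ~4.2 MHz NTSC broadcast luma limit and ~52.6 µs of active line time:

```python
# TV lines per picture height ~= 2 * bandwidth * active_line_time / aspect_ratio
def tv_lines_per_picture_height(bandwidth_hz, active_line_s=52.6e-6, aspect=4/3):
    return 2 * bandwidth_hz * active_line_s / aspect

tvl = tv_lines_per_picture_height(4.2e6)         # NTSC broadcast luma bandwidth
print(f"~{tvl:.0f} lines per picture height")    # ~331, matching the article's 330
print(f"~{tvl * 4/3:.0f} across the full width") # ~441
```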

 

However, that value is only for broadcast TV - the bandwidth of signals in terrestrial television is limited by law. When sending a signal between devices through an S-Video or composite cable you are free of that limitation - the link you've provided clearly states that a DVD player can have a horizontal resolution of 720. Since you are not receiving an Amiga signal through an RF antenna, the number 330 does not really apply to our case.

 

Whoops, I too am able to find a link on teh interwebs to support my claim; see Table 1.

 

One more thing regarding this: so far I can't show 640+ pixels on an Amiga monitor capable of 640+ pixels horizontally, and I can't do it on an HDTV. So it's like some Pegasus (flying horse) exists but you can never see it. A VCR, Amiga video out, Hi8 video cameras, etc. can't show it either, but supposedly it's doable. Right, nice try.

In the case of the Amiga the reason is simple: the Amiga's pixel clock has a frequency of 14.318 MHz, which is precisely 4x the NTSC colour clock frequency. This produces artifacts (by the same mechanism as in Atari 8-bit computers - their pixel clock is 2x the colour clock frequency). However, in general, artifacts are avoidable by having a pixel clock frequency that is not an integer multiple of the colour clock frequency. For example, the pixel clock in the Commodore 64 has a frequency a bit higher than 2x the NTSC colour clock - that gives slightly narrower pixels, but you then get a horizontal resolution of 320 without artifacts.
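The ratios are easy to check against the NTSC colour subcarrier; the clock values below are the commonly quoted ones, so treat them as approximate:

```python
# Pixel clock expressed as a multiple of the NTSC colour clock (3.579545 MHz).
# A ratio at an exact integer means pixel edges stay locked to the colour
# carrier, which is what makes composite artifact colours so pronounced.
colorburst = 3.579545e6

pixel_clocks = {
    "Amiga hi-res": 14.31818e6,   # 4x colorburst
    "A8 hi-res":     7.15909e6,   # 2x colorburst
    "C64 (NTSC)":    8.18182e6,   # commonly quoted dot clock (= colorburst * 16/7)
}

for name, f in pixel_clocks.items():
    print(f"{name:12s} {f/1e6:7.3f} MHz = {f/colorburst:.3f} x colour clock")
```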

Edited by Kr0tki

In the case of the Amiga the reason is simple: the Amiga's pixel clock has a frequency of 14.318 MHz, which is precisely 4x the NTSC colour clock frequency. This produces artifacts (by the same mechanism as in Atari 8-bit computers - their pixel clock is 2x the colour clock frequency). However, in general, artifacts are avoidable by having a pixel clock frequency that is not an integer multiple of the colour clock frequency. For example, the pixel clock in the Commodore 64 has a frequency a bit higher than 2x the NTSC colour clock - that gives slightly narrower pixels, but you then get a horizontal resolution of 320 without artifacts.

The non-integer multiplier doesn't prevent artifacting. The reason why the C64 shows far less artifacting is simple: the luminance is low-pass filtered.


Come on, a superior machine should be able to do anything an inferior machine can, plus MORE. And 160 isn't a limit either; you can do a 384*240 setup on the Atari, although colors are more restricted, but the scrolling will be smoother. I picked the 160 mode since many games use that mode, but even in this mode you can do overscan and get 192*240, which is sufficient for many games. I am not comparing it to 320*200*16 but to full-screen graphics modes. It has to handle all Atari full-screen graphics modes at a faster or equal speed. It can't do the GTIA Gr.9 mode due to shading, it can't do scrolling of any modes at even close to A8 speeds, and it can't do text modes at the same or faster speed than the A8 either.

 

Yes, a new machine should be more capable, and I certainly agree there, but I was commenting on the shift in discussion to ST vs A8 capabilities regardless of the age difference. At their respective times of release, the A8 was far more innovative and advanced for sure....

If someone who owns an 8-bit is going to go for the next-generation Atari and realizes that the new ST is incapable of, or less capable at, most of the ANTIC modes, scrolling, overscan, shades, joystick I/O, etc., why in the world would he consider it an "upgrade"? I forgot to mention collision detection in my list.

 

Of course, you can prove it for yourself given the bandwidth used to encode the luminance. Of course, the color signal is much lower resolution. And for your information, the VGA to NTSC encoder was plugged into a TV capable of HDTV.

Umm, if it was an HDTV, that's even less relevant,...

One more thing regarding this: so far I can't show 640+ pixels on an Amiga monitor capable of 640+ pixels horizontally, and I can't do it on an HDTV. So it's like some Pegasus (flying horse) exists but you can never see it. A VCR, Amiga video out, Hi8 video cameras, etc. can't show it either, but supposedly it's doable. Right, nice try.

 

No, an HDTV can certainly display that resolution; connecting via DVI/HDMI, VGA, or YPbPr will work fine with higher resolutions (best at the native resolution, which oddly enough HDMI doesn't always support properly, which is why we went with VGA to hook up our PC to an HDTV with a 1360x768 native display; HDMI wanted to use 1280x720, which was scaled and thus blurry, while VGA was much sharper because it used the native resolution)....

That's sort of circular reasoning. Of course an HDTV can display it, but the point was that I am hooking up the composite output from a VGA-to-NTSC encoder to the HDTV and it can't show it. If I hook up VGA directly, or use the HDTV in its native format, it can display it, because then it is subject to a different bandwidth limitation.

 

Still, the luma signal of S-Video and YPbPr should be identical for displaying the same image, so luma would have similar limitations in both (color bandwidth is obviously more limited with Y/C, and with composite, luma can be limited by degradation during encoding). I think RGB might be limited in a similar manner (at NTSC/PAL frequencies, which is what SCART uses and is standard definition only), with 15 kHz horizontal sync.

 

Yep, RGB also has to be encoded at a certain frequency, so that frequency limits its resolution.


If someone who owns an 8-bit is going to go for the next-generation Atari and realizes that the new ST is incapable of, or less capable at, most of the ANTIC modes, scrolling, overscan, shades, joystick I/O, etc., why in the world would he consider it an "upgrade"? I forgot to mention collision detection in my list.

 

For purposes other than gaming perhaps?

 

One more thing regarding this: so far I can't show 640+ pixels on an Amiga monitor capable of 640+ pixels horizontally, and I can't do it on an HDTV. So it's like some Pegasus (flying horse) exists but you can never see it. A VCR, Amiga video out, Hi8 video cameras, etc. can't show it either, but supposedly it's doable. Right, nice try.

No, an HDTV can certainly display that resolution; connecting via DVI/HDMI, VGA, or YPbPr will work fine with higher resolutions (best at the native resolution, which oddly enough HDMI doesn't always support properly, which is why we went with VGA to hook up our PC to an HDTV with a 1360x768 native display; HDMI wanted to use 1280x720, which was scaled and thus blurry, while VGA was much sharper because it used the native resolution)....

That's sort of circular reasoning. Of course an HDTV can display it, but the point was that I am hooking up the composite output from a VGA-to-NTSC encoder to the HDTV and it can't show it. If I hook up VGA directly, or use the HDTV in its native format, it can display it, because then it is subject to a different bandwidth limitation.

 

No, not really circular reasoning. You mentioned an NTSC encoder: not a transcoder to a transmission format (composite/S-Video/YPbPr), but to a specific video format, NTSC (standard definition), which has a fixed vertical resolution no matter what, and will necessarily be scaled down from a high resolution (i.e. your 1024x768), and interlaced in addition to that (assuming you're using normal 480i and not 240p). You'd have the same issues using S-Video, component, or even RGB, as all are still limited to the standard-definition format. So the VGA-to-NTSC adaptor isn't a valid example demonstrating the limits of composite video. (Note that in the case of YPbPr, while it is an effective HD transmission method as well as SD, some HDTVs won't even accept SD signals through those connectors, though others definitely will; not really relevant to this discussion though.)

 

Such a transcoder should theoretically be capable of retaining the horizontal resolution and only scaling the vertical resolution to match standard definition, since only the vertical resolution is fixed, but you'd easily exceed the practical resolution of the phosphor dots on an ST screen. (Plus the transcoder would need to work with non-square pixels.) On top of that there's the general blurring problem with the colors in composite video (and possible degradation of luma); however, you should at least be able to get a clean monochrome image with no color data (also avoiding degradation of the luma signal).

 

 

 

Still, the luma signal of S-Video and YPbPr should be identical for displaying the same image, so luma would have similar limitations in both (color bandwidth is obviously more limited with Y/C, and with composite, luma can be limited by degradation during encoding). I think RGB might be limited in a similar manner (at NTSC/PAL frequencies, which is what SCART uses and is standard definition only), with 15 kHz horizontal sync.

 

Yep, RGB also has to be encoded at a certain frequency, so that frequency limits its resolution.

 

Yes, but not horizontal resolution. (as vsync is variable)

Edited by kool kitty89

RGB isn't encoded at any "frequency" - it's just a variable voltage level on each component as the screen is drawn. If a scanline is constant white, then the signal for that period should be at a non-varying level for each component.

 

Anything you hear relating to frequency in RGB discussion will relate to the h-sync rate. Monitors tend to have limitations in that regard which in turn can limit their resolution and vertical sync rates for a given resolution.

 

A problem RGB can encounter, especially at very high resolutions or over longer distances, is the signals getting out of sync and wrong colours appearing, hence the move to digital standards. Throw in interference as another reason.


RGB isn't encoded at any "frequency" - it's just a variable voltage level on each component as the screen is drawn. If a scanline is constant white, then the signal for that period should be at a non-varying level for each component.

 

Anything you hear relating to frequency in RGB discussion will relate to the h-sync rate. Monitors tend to have limitations in that regard which in turn can limit their resolution and vertical sync rates for a given resolution.

 

A problem RGB can encounter, especially at very high resolutions or over longer distances, is the signals getting out of sync and wrong colours appearing, hence the move to digital standards. Throw in interference as another reason.

 

Monitors have limitations as to the frequencies they can handle, but the transmitter is also sending the RGB signals at a certain frequency according to the display adapter. I remember there were some CGA programs that reprogrammed the clocking registers and destroyed monitors. Back then, monitors weren't multisync.
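The "frequency" in question here is essentially the dot clock, which is tied to the horizontal sync rate by the total number of pixel periods per scanline. A quick sketch using the standard published VGA 640x480@60 timing:

```python
# dot_clock = total_pixel_periods_per_line * hsync_rate
dot_clock = 25.175e6        # standard VGA 640x480@60 pixel clock
total_px_per_line = 800     # 640 active + blanking/sync
total_lines = 525           # 480 active + blanking/sync

hsync = dot_clock / total_px_per_line
vsync = hsync / total_lines
print(f"hsync ~ {hsync/1e3:.2f} kHz")   # ~31.47 kHz
print(f"vsync ~ {vsync:.2f} Hz")        # ~59.94 Hz

# A fixed-frequency (pre-multisync) monitor fed an out-of-range sync, as with
# those reprogrammed CGA register tricks, simply can't lock to it, which is
# how hardware damage could happen.
```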


If someone who owns an 8-bit is going to go for the next-generation Atari and realizes that the new ST is incapable of, or less capable at, most of the ANTIC modes, scrolling, overscan, shades, joystick I/O, etc., why in the world would he consider it an "upgrade"? I forgot to mention collision detection in my list.

 

For purposes other than gaming perhaps?

...

I'm sure you can use your Atari 8-bit for other things besides gaming. I certainly used my Atari for many things, like science projects (controlling stuff through the joystick ports), video titling, writing BASIC programs (for learning), etc. I'm sure the ST can do better at word processing (given its higher resolution) and paint programs (with the GUI/mouse), but it's a DIFFERENT machine. You lose the capabilities of the 8-bit with it. They should have left the A8 chips in the ST.

 

That's sort of circular reasoning. Of course an HDTV can display it, but the point was that I am hooking up the composite output from a VGA-to-NTSC encoder to the HDTV and it can't show it. If I hook up VGA directly, or use the HDTV in its native format, it can display it, because then it is subject to a different bandwidth limitation.

 

No, not really circular reasoning. You mentioned an NTSC encoder: not a transcoder to a transmission format (composite/S-Video/YPbPr), but to a specific video format, NTSC (standard definition), which has a fixed vertical resolution no matter what, and will necessarily be scaled down from a high resolution (i.e. your 1024x768), and interlaced in addition to that (assuming you're using normal 480i and not 240p). You'd have the same issues using S-Video, component, or even RGB, as all are still limited to the standard-definition format. So the VGA-to-NTSC adaptor isn't a valid example demonstrating the limits of composite video. (Note that in the case of YPbPr, while it is an effective HD transmission method as well as SD, some HDTVs won't even accept SD signals through those connectors, though others definitely will; not really relevant to this discussion though.)

...

Actually, you can try VGA to NTSC at 640*480, 800*600, and 1024*768, and you will see blurring of text or dropped pixels. And this is being output on an Amiga monitor capable of 640+ resolution and on an HDTV which is capable of 640+ pixels. Same when trying Amiga output, Hi8 camcorders, etc. So, the Pegasus example: where is the flying horse that does 640+ resolution using the NTSC standard?

 

Yep, RGB also has to be encoded at a certain frequency, so that frequency limits its resolution.

 

Yes, but not horizontal resolution. (as vsync is variable)

 

You try to show 1280*1024 on an LCD with 1024*768 resolution and you'll see that both axes are limited even in RGB mode.


They should have left the A8 chips in the ST.

Perhaps they would have if the machine had been designed as a successor to the Atari 8-bit line, but it was in development prior to having any relation to Atari whatsoever. Still, even if Atari Inc had continued to exist and eventually came out with a successor to the 8-bit line, I'm not sure backwards compatibility would have been all that likely; the Amiga didn't carry the C64 chipset either (not that it wouldn't necessarily have been a good idea). I think we already discussed this before, but the best case for integrating the 8-bit hardware with the ST would have been adding the chipset onboard, relegating the 6502 to the role of a coprocessor and perhaps making some use of the other components. (The system would likely have been clocked differently too, to match the A8 chipset, so the 68k might have ended up running at 7.16 MHz.) POKEY would have been a nice addition, and if the 8-bit I/O hardware could be used as well (for more than compatibility), you might start to get a cost-effective machine (probably more capable in some ways than the ST, but neither as simple nor as cheap).

 

 

Actually, you can try VGA to NTSC at 640*480, 800*600, and 1024*768, and you will see blurring of text or dropped pixels. And this is being output on an Amiga monitor capable of 640+ resolution and on an HDTV which is capable of 640+ pixels. Same when trying Amiga output, Hi8 camcorders, etc. So, the Pegasus example: where is the flying horse that does 640+ resolution using the NTSC standard?

The Amiga monitor is completely valid, as it can display standard definition natively; the HDTV is going to lose something in the transition, and its higher-resolution capabilities are fairly meaningless in that context, as it will simply be displaying a lower-resolution image scaled to the screen's native resolution. You're going to lose quality from converting the VGA signal regardless, due to the scaling process; plus, an old composite monitor like the Amiga's would likely not have as good a comb filter as some newer TVs.

Regardless, my major point was that the luma signal carried by composite (non-degraded at least), S-Video, and YPbPr component is identical and interchangeable when displaying the same content. (Composite/S-Video aren't supposed to carry anything other than standard definition, though that rule sometimes gets broken.)

 

Yep, RGB also has to be encoded at a certain frequency, so that frequency limits its resolution.

 

Yes, but not horizontal resolution. (as vsync is variable)

 

You try to show 1280*1024 on an LCD with 1024*768 resolution and you'll see that both axes are limited even in RGB mode.

 

I meant a CRT... of course a digital display will have a finite (in fact fixed) resolution; you'd need to downscale the image to fit it onto the smaller screen (like 1080i/p HD content displayed on a 1360x768 "1080i/p compatible" LCD HDTV; 1080i will be worse as it's deinterlaced). Upscaling also results in a poorer image though; ideally, on such a TV as in my example, you'd display 1280x720 video in a window with a border so that the native resolution is used, not upscaled to fit the larger screen.

Edited by kool kitty89

They should have left the A8 chips in the ST.

Perhaps they would have if the machine had been designed as a successor to the Atari 8-bit line, but it was in development prior to having any relation to Atari whatsoever. Still, even if Atari Inc had continued to exist and eventually came out with a successor to the 8-bit line, I'm not sure backwards compatibility would have been all that likely; the Amiga didn't carry the C64 chipset either (not that it wouldn't necessarily have been a good idea). I think we already discussed this before, but the best case for integrating the 8-bit hardware with the ST would have been adding the chipset onboard, relegating the 6502 to the role of a coprocessor and perhaps making some use of the other components. (The system would likely have been clocked differently too, to match the A8 chipset, so the 68k might have ended up running at 7.16 MHz.) POKEY would have been a nice addition, and if the 8-bit I/O hardware could be used as well (for more than compatibility), you might start to get a cost-effective machine (probably more capable in some ways than the ST, but neither as simple nor as cheap).

 

 

Would Atari have even had this option after the Amiga brain-drain? They might not have even had the expertise in house any longer.


They should have left the A8 chips in the ST.

Perhaps they would have if the machine had been designed as a successor to the Atari 8-bit line, but it was in development prior to having any relation to Atari whatsoever. Still, even if Atari Inc had continued to exist and eventually came out with a successor to the 8-bit line, I'm not sure backwards compatibility would have been all that likely; the Amiga didn't carry the C64 chipset either (not that it wouldn't necessarily have been a good idea). I think we already discussed this before, but the best case for integrating the 8-bit hardware with the ST would have been adding the chipset onboard, relegating the 6502 to the role of a coprocessor and perhaps making some use of the other components. (The system would likely have been clocked differently too, to match the A8 chipset, so the 68k might have ended up running at 7.16 MHz.) POKEY would have been a nice addition, and if the 8-bit I/O hardware could be used as well (for more than compatibility), you might start to get a cost-effective machine (probably more capable in some ways than the ST, but neither as simple nor as cheap).

 

 

Would Atari have even had this option after the Amiga brain-drain? They might not have even had the expertise in house any longer.

 

I wasn't talking about that complex a thing. A co-op worker could be hired from college (perhaps coolkitty89) and get the work done within a few weeks, although perhaps he'd have to change his name to Cool K. Itty so it sounds like a real name.

