
5200 vs. 7800


jbanes


It's entirely possible to simulate vector graphics as long as all your graphics fit in pre-generated sprites. Even arcade machines like Asteroids were programmed to control everything as a bunch of moving sprites, even though the hardware was vector based and the sprites themselves were made up of line segments.

 

About the NES... Though not as flexible as the 7800, the NES was a decent system based on actual arcade hardware, which made it easier to develop for. I think the NES version of the Famicom was poorly designed (with the VCR loading idea and all), but it was good enough. Of course, what really matters is software, not hardware, and Nintendo actually developed stuff for their system.

 

-Bry


I never liked the NES. Didn't like it when I was 13, don't like it now at 33. I'm not going to start liking it simply because you brag about its technological abilities (or your misperceived vision of them).

 

The color palette of the NES was suitable for quilting, or making fake throw-up. I don't particularly like my videogames to be consistently forest green, vomit fuchsia, and shit brown, thanks. The "cheap trick" you refer to had one majorly disappointing side effect - graphics that flicker so badly, most games were rendered unplayable due to their ability to trigger epileptic seizures in certain players. The artistry you claim is "superior" I always found to be childish and lacking in originality. If I want to save the world as a "Kiss-Kiss Mushroom," a "Fuzzy Teddy Bear," or a smiling Pokemon-inspired stuffed animal - I'd be shopping at Babies R Us.

 

I like my opinion. I have nothing to be "ashamed" of for stating it, and frankly find your inability to separate a stated personal preference from a comparison rather laughable. Note - I never claimed the 7800's superiority - I simply said that the NES is a steaming pile of crap.

 

And to me - it is a steaming pile of crap.

 

get it?

 

If I wanted to discuss the merits of the NES over the 7800, I wouldn't be posting in the 7800 forum of a website called "Atariage".


 

I totally agree with you. I never liked the colours on the NES; I always found them to be very bland and dull. I also hated the way they took games and did "Nintendo" versions of them, adding pointless sub-games/levels, and most games suffered from bad flicker too. One springs to mind that combined all of these deficiencies: Double Dragon. It had awful flicker, incredibly bland backdrops (heck, the 2600 version had nicer backgrounds and even my NES mate agreed!), stupid sub-levels, and just to ruin it even more it was only one player. I too hated the kids' approach to everything. It never did that well in the U.K., especially compared to the Master System and 2600. I guess that's because we have always had a more adult-orientated look at things, no offence to the Americans.


The NES is a giant steaming pile of cutesy mushroom and fat plumber infested crap.


 

You're right...that's why no one bought one :roll:


 

Britney Spears sell millions of records too.


 

Which means that a lot of people like her music... just not me... doesn't mean it's crap. Your statement about the NES is your opinion, yet you state it as if it is fact. Seems to me that you think that the NES is completely worthless, which simply isn't the case. Surely there are 1 or 2 games on it worth your time. Even the Odyssey 2 has KC Munchkin :)


 

 

Actually - it means a lot of people buy her music (that doesn't necessarily mean they like it), because they have it drilled into their heads that they should like it. It's less a conscious choice and more like a Pavlovian response. Same thing with the NES. Clever and persistent marketing could make a Tyson beef-product patty be remembered as a 16-ounce Porterhouse, if applied correctly and repetitively.

 

Sure, there are a game or two that the NES deserves merit for. But I just don't like it overall. I think the graphics are overrated by many fans, the color palette is putrid, the characters are (as with all Nintendo systems) too cutesy and kidsy, and the flickering is distracting enough that I could not retain interest in playing 99% of the games.

 

And it just bugs me when someone comes into the 7800 forum to praise the NES and pan the 7800. Do that in the other systems' forums, or on another Nintendo board, ya know?


 

that sounds a bit more reasonable :)


I try to get out, and they pull me back in!

 

jbanes, seriously, YOU HAVE NO CLUE WHAT YOU'RE TALKING ABOUT. You think you do, but you don't. You really, really don't.

 

Dude, chill out. Again, we have a failure to communicate here.

 

Not the Apple routines. Those are too primitive for Bresenham's algorithm. They draw only horizontal or vertical lines. I'm not saying that Choplifter is using them, but it probably uses something similar, perhaps custom.

 

Do you realize that the "Apple routines" are just software? There's nothing magical about them. They're just 6502 code.

 

Yes, I do realize this. I was stating that the standard Apple line-drawing routines (at least the ones I mentioned, see the paragraph below) are too primitive to implement Bresenham's. They draw a line either vertically or horizontally. As I said, Choplifter probably uses custom drawing routines for speed. It's one of the first things that game programmers did back in the day.

 

Do you realize that in HIRES mode, there's a command that can draw lines between any two arbitrary points? Not just horizontal and vertical?

 

I was digging a bit more through the reference material, and it looks like you're right. There's an HPLOT command in the Applesoft command set for hi-res mode. I was looking at the Integer BASIC command set as an example. Thanks for pointing that out. :)
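
(For reference, a general line routine of the kind being argued about isn't much code. Here's a minimal Bresenham sketch in C; the plot() helper and the screen array are hypothetical stand-ins, not anything from the Apple ROM.)

#include <stdlib.h>

#define W 280
#define H 192
static unsigned char screen[H][W];              /* stand-in for a hi-res bitmap */

static void plot(int x, int y)
{
    if (x >= 0 && x < W && y >= 0 && y < H)
        screen[y][x] = 1;
}

/* Bresenham: step one pixel at a time along the line, using an integer
   error term to decide when to advance on each axis. Handles any slope. */
void draw_line(int x0, int y0, int x1, int y1)
{
    int dx = abs(x1 - x0), sx = x0 < x1 ? 1 : -1;
    int dy = -abs(y1 - y0), sy = y0 < y1 ? 1 : -1;
    int err = dx + dy;
    for (;;) {
        plot(x0, y0);
        if (x0 == x1 && y0 == y1) break;
        int e2 = 2 * err;
        if (e2 >= dy) { err += dy; x0 += sx; }  /* step in x */
        if (e2 <= dx) { err += dx; y0 += sy; }  /* step in y */
    }
}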

 

Do you realize that a 1-MHz system simply IS NOT FAST ENOUGH to draw and fill a complex vector object in realtime? For a real example of a game that uses vector objects, look at Out of This World/Another World. That ran sluggishly on systems vastly more powerful than the Apple II.

 

Two points:

 

1. Define "real-time". Like I said, with a framebuffer you can drop your framerate and still produce a highly playable game. 1 MHz is definitely not a lot, but for rotating a single sprite it isn't that bad. If you're running at 10 FPS, you have about 100,000 cycles per frame (see the quick check after point 2). Since you're not forced to spend the majority of that updating the screen, you're left with a LOT of time to do your work in comparison to a 2600/5200/7800. So rotating a single line-art sprite *is* possible.

 

2. Out of This World was an all-polygon game. Without some sincerely bizarre magic, you can't render that many polys on a 1 MHz system. Hell, most Apogee and id games could barely run on a 10 MHz PC with full 2D graphics. And those guys were wizards at this stuff. That doesn't mean that you can't compromise. Might I remind you of... (wait for it) Ballblazer?
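
(Quick check on the cycle budget from point 1, assuming a 1 MHz CPU and nothing else stealing cycles:)

#include <stdio.h>

int main(void)
{
    const long cpu_hz = 1000000;   /* ~1 MHz 6502, as assumed above      */
    const int  fps    = 10;        /* deliberately low target frame rate */
    printf("cycles per frame: %ld\n", cpu_hz / fps);   /* prints 100000  */
    return 0;
}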

 

Do you realize that Choplifter is not tilting (rotating) the chopper sprite data in realtime, but in fact uses dozens of predrawn bitmaps to create the ILLUSION that it's tilting it?


 

Do you realize that I pointed to a hypothetical version of Gravitar? Of course I know that Choplifter used a set of bitmaps! I chose a different situation because I was trying not to give the impression that I was saying that Choplifter used line art for the chopper.
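
(For what it's worth, the "set of bitmaps" approach looks roughly like this sketch in C - the names and the frame count are hypothetical. Every tilt angle is drawn ahead of time; nothing is rotated at run time.)

#define FRAME_W_BYTES 2                    /* 16 pixels wide, 1 bit per pixel        */
#define FRAME_H       16
#define TILT_FRAMES   16                   /* hypothetical number of pre-drawn tilts */

/* Pre-drawn artwork, one bitmap per tilt angle (filled in elsewhere). */
static unsigned char chopper_frames[TILT_FRAMES][FRAME_H][FRAME_W_BYTES];

/* Map a tilt value (say -8..+7) to a frame index and return that bitmap. */
const unsigned char *chopper_bitmap(int tilt)
{
    int index = tilt + TILT_FRAMES / 2;
    if (index < 0) index = 0;
    if (index >= TILT_FRAMES) index = TILT_FRAMES - 1;
    return &chopper_frames[index][0][0];
}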

 

I've already named the primary fill areas on the Choplifter screen: The scoreboard, the ground, the stars, etc. I have questions about the base itself, and about the rotor on the chopper. I have no doubt that the Chopper, the People, the Tanks, and the Planes are buffered sprites.

 

The general topic here is the advantages of the 5200 vs. the 7800. The 5200 gained a lot of Apple ports because it was able to emulate a framebuffer. The 5200 does NOT have a true framebuffer, as the main processor has to set up a blit of each partial line of data via a display list at the time of the screen rendering. (Thank you to those who pointed me toward the ANTIC docs. They were very informative.) A true framebuffer generates a signal off of video memory, and will do so independent of instructions from the processor. This is probably why the 5200 Choplifter flickers while the Apple II version doesn't. (At least in the emulators. I have no reason to believe that the real hardware is any different, though.)

 

Now, will you please calm down?


The 5200 does NOT have a true framebuffer

Wrong.

 

as the main processor has to setup a blit of each partial line of data via a display list at the time of the screen rendering.

Wrong.

 

I don't know what screwball definition of "framebuffer" you've internalized, but on the Atari 8-bit computers and 5200, you can set up a contiguous block of RAM which will automatically display anything you stick into it without further CPU intervention. If that's not a frame buffer, I don't know what is.

Edited by ZylonBane

I don't know what screwball definition of "framebuffer" you've internalized, but on the Atari 8-bit computers and 5200, you can set up a contiguous block of RAM which will automatically display anything you stick into it without further CPU intervention. If that's not a frame buffer, I don't know what is.


 

Ok, fine. How? I'm looking at the ANTIC docs right now. Taking Mode 8 (mentioned by Jaybird previously) as an example, "This mode, as other non-character graphics modes do, uses data in the display buffer as a bit map to be displayed. A command to display in mode 8 will cause the ANTIC chip to read the next 10 bytes in the display buffer." Each mode after that specifies the same thing, only with different pixel sizes and color counts.

 

Now I have no doubt that you can configure a display list list to emulate the framebuffer. That's still not a true framebuffer, though it's pretty darn close. The biggest problem with this scheme is that you're locking out your CPU every time the graphics hardware needs to be fed data. (Most true framebuffers have their own memory to generate the signal off of. This frees the processor to do other things. The IBM PCjr differed in that it used a Shared Memory Architecture, but as far as I know this didn't impact the main processor.) As Bryan stated, this scheme results in a significant drop in processing power. I originally thought that the GTIA must be fed on a different bus, but I was corrected on that one. Which means that there are distinct disadvantages to the ANTIC emulation of a framebuffer.
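
(To make that concrete: here's a sketch of what such a display list looks like, built as a plain byte array in C. The addresses and the helper function are made up; the opcodes follow the ANTIC display-list format - $70 = 8 blank scan lines, +$40 = LMS "load memory scan", $41 = jump and wait for vertical blank. Once a list like this is installed, anything written into the 240 bytes at screen_addr shows up on screen with no further CPU involvement beyond ANTIC's own DMA.)

#include <stdint.h>
#include <stddef.h>

/* Build an ANTIC mode-8 display list into dl[], pointing the whole screen
   at one contiguous 240-byte block of RAM (24 rows * 10 bytes per row). */
size_t build_mode8_display_list(uint8_t *dl, uint16_t dl_addr, uint16_t screen_addr)
{
    size_t n = 0;
    int row;

    dl[n++] = 0x70; dl[n++] = 0x70; dl[n++] = 0x70;   /* 3 x 8 blank lines at the top   */

    dl[n++] = 0x48;                                   /* mode 8 + LMS                    */
    dl[n++] = (uint8_t)(screen_addr & 0xFF);          /* screen RAM address, low byte    */
    dl[n++] = (uint8_t)(screen_addr >> 8);            /* screen RAM address, high byte   */

    for (row = 1; row < 24; row++)
        dl[n++] = 0x08;                               /* 23 more mode-8 rows, no new LMS */

    dl[n++] = 0x41;                                   /* JVB: jump, wait for VBLANK...   */
    dl[n++] = (uint8_t)(dl_addr & 0xFF);              /* ...back to the start of this    */
    dl[n++] = (uint8_t)(dl_addr >> 8);                /*    display list                 */

    return n;
}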


ANTIC is able to read the gfx data from different memory locations, so you can write into one memory area while displaying a different one.

If you are finished writing to memory, you just switch the pointer to that location.

The next operations can work on the memory displayed before, and so on.

This technique was called "page flipping", and I would say it's a kind of framebuffer.
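
(Conceptually, a page flip is just this - a sketch with hypothetical names; on the Atari the "pointer" is the LMS address in the display list, rewritten during vertical blank:)

#define SCREEN_BYTES 240                 /* one ANTIC mode-8 screen, for example */

static unsigned char page_a[SCREEN_BYTES];
static unsigned char page_b[SCREEN_BYTES];

static unsigned char *visible = page_a;  /* what the display hardware scans */
static unsigned char *hidden  = page_b;  /* what the program draws into     */

/* Called once per frame during vertical blank: swap the roles of the pages.
   No pixel data is copied; only the scan pointer changes. On real hardware,
   rewrite the display list's LMS address bytes to point at 'visible'. */
void flip_pages(void)
{
    unsigned char *t = visible;
    visible = hidden;
    hidden  = t;
}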


"(...) This mode, as other non-character graphics modes do, uses data in the display buffer as a bit map to be displayed. A command to display in mode 8 will cause the ANTIC chip to read the next 10 bytes in the display buffer." Each mode after that specifies the same thing, only with different pixel sizes and color counts.

 

Now I have no doubt that you can configure a display list list to emulate the framebuffer. That's still not a true framebuffer, though it's pretty darn close. The biggest problem with this scheme is that you're locking out your CPU every time the graphics hardware needs to be fed data. (...)


You are right if this is your definition of a frame buffer.

The CPU will be halted during fetch.


ANTIC is able to read the gfx data from different memory locations, so you can write into one memory area while displaying a different one.

If you are finished writing to memory, you just switch the pointer to that location.

The next operations can work on the memory displayed before, and so on.

This technique was called "page flipping", and I would say it's a kind of framebuffer.


 

Indeed. Page flipping is also often used as a feature in framebuffer hardware that's got way more onboard memory than it needs for a single frame. Obviously it's a lot faster than double buffering, which requires that the full screen be dumped to vidmem after being constructed in main memory.
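
(The cost difference is easy to see in a sketch: double buffering pays for a full-screen copy every frame, where a page flip pays only for a pointer change. A minimal illustration, with the buffer size picked arbitrarily:)

#include <string.h>

#define SCREEN_BYTES 240

/* Double buffering: compose the next frame in work[], then copy the whole
   thing into the memory the hardware is scanning. This memcpy is the
   per-frame cost that a page flip avoids. */
void present_double_buffered(unsigned char *video_ram, const unsigned char *work)
{
    memcpy(video_ram, work, SCREEN_BYTES);
}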

 

The cool part about the ANTIC's approach, however, is that it can not only perform a page flip, but a complex display list list (I love that name :D ) could actually interlace various line buffers to produce different neat effects on a shoestring memory AND CPU budget. That's a rare thing in my experience. Usually you trade one for the other.

 

Of course, after all the *#$@ "fun" I had with SuperVGA bank switching, I never thought I'd actually be praising the ability to have a disjointed video memory map. ;) :P


The cool part about the ANTIC's approach, however, is that it can not only perform a page flip, but a complex display list list (I love that name :D ) could actually interlace various line buffers to produce different neat effects on a shoestring memory AND CPU budget. That's a rare thing in my experience. Usually you trade one for the other.

You mixed something up. ANTIC has only a display list, no matter how complex it is.

But MARIA has a display list list ;)

 

Of course, after all the *#$@ "fun" I had with SuperVGA bank switching, I never thought I'd actually be praising the ability to have a disjointed video memory map. ;)  :P


:)

And it just bugs me when someone comes into the 7800 forum to praise the NES and pan the 7800. Do that in the other systems' forums, or on another Nintendo board, ya know?


 

I don't mind an honest discussion as there are strengths and weaknesses of each.

 

But I firmly believe that regardless of what each system could and could not do technically, it was moot because the playing field was never level.

 

It's really simple:

 

Nintendo spent money, Atari didn't. The 7800 suffered far more from that than ANY technical strength, weakness or difference that the GCC folks had in their design.

 

Nintendo hired GOOD developers. The Tramiels hired the likes of IBID INC.

 

Nintendo paid for ongoing hardware development, cartridges with lots of storage, saves and more. The Tramiels gave developers shit for including RAM or POKEYs in their games.

 

Nintendo gave people time to program well-thought-out games like Super Mario 3. The Tramiels hired clowns like IBID INC to do a half-assed job porting stuff like Karateka.

 

And:

 

Nintendo ADVERTISED AGGRESSIVELY. The best developers WANTED to develop for the NES because there was tons of money to be made. They weren't enticed to defy the Nintendo regime for Atari's "we have a $300,000 marketing budget while our competitors spend a hundred times that a year" operation.

 

At the end of the day, that is what killed the 7800, not the TIA sound or the line-by-line display.

 

Good developers find their way to push systems. Bad developers worked on 7800 games because the good developers wouldn't hire them.

Edited by DracIsBack

Indeed. Page flipping is also often used as a feature in framebuffer hardware that's got way more onboard memory than it needs for a single frame. Obviously it's a lot faster than double buffering, which requires that the full screen be dumped to vidmem after being constructed in main memory.


 

Defender uses a nice approach. It just has one frame buffer, but an interrupt is triggered at both the top and middle of the frame. When the top-of-frame interrupt happens, the system erases and redraws everything that needs to change in the bottom half. When the middle-of-frame interrupt occurs, it erases and redraws anything that needs to change in the top half.
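
(A sketch of that scheme - interrupt wiring and names are hypothetical; the point is that the CPU only ever touches the half of the buffer the beam is not currently displaying:)

#define W 320
#define H 256
static unsigned char fb[H][W];             /* the single frame buffer */

/* Erase and redraw every object whose pixels fall in rows y0..y1-1. */
static void redraw_region(int y0, int y1)
{
    for (int y = y0; y < y1; y++)
        for (int x = 0; x < W; x++)
            fb[y][x] = 0;                  /* erase shown; object redraw omitted */
}

void irq_top_of_frame(void)    { redraw_region(H / 2, H); }   /* beam at top: update bottom half */
void irq_middle_of_frame(void) { redraw_region(0, H / 2); }   /* beam at middle: update top half */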

 

As for the question of whether something is a "framebuffer": as far as I'm concerned, any area of RAM which is organized as a two-dimensional array of data and read out in real time for display at a fixed part of the screen qualifies. The term is most commonly applicable to systems which use dedicated hardware for the readout, but can also be applied to certain kernel designs on the 2600. The data in a frame buffer will typically be either a bitmap or a tilemap, though other arrangements may be possible as well. A key feature of a framebuffer is that objects within it are generally moved around the screen by removing them from one part of the buffer and placing them in another, rather than by moving the frame buffer as a whole (though sometimes scrolling may be done via pointer magic).
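
(That "remove it from one place, put it in another" point, as a minimal C sketch with made-up dimensions:)

#define FB_W 160
#define FB_H 100
static unsigned char fb[FB_H][FB_W];               /* the framebuffer */

/* Move an 8x8 object within the buffer: erase it at the old position,
   then redraw it at the new one. The buffer itself never moves. */
void move_object(const unsigned char sprite[8][8],
                 int old_x, int old_y, int new_x, int new_y)
{
    int x, y;
    for (y = 0; y < 8; y++)                        /* 1) erase old position */
        for (x = 0; x < 8; x++)
            fb[old_y + y][old_x + x] = 0;
    for (y = 0; y < 8; y++)                        /* 2) draw new position  */
        for (x = 0; x < 8; x++)
            fb[new_y + y][new_x + x] = sprite[y][x];
}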


Now I have no doubt that you can configure a display list list to emulate the framebuffer. That's still not a true framebuffer, though it's pretty darn close. The biggest problem with this scheme is that you're locking out your CPU every time the graphics hardware needs to be fed data. (Most true framebuffers have their own memory to generate the signal off of.

 

jbanes, we're getting several different things mixed up in this discussion. A framebuffer is the concept that there is RAM which holds the entire screen. It can occupy the same RAM as other things or it can have its own RAM. Some low-end PCs use a few megs of your main memory for video using 'shared memory architecture', which also locks out the CPU, but they still have a framebuffer. The 5200 has a very flexible framebuffer system, but it does not have a dedicated framebuffer.

 

-Bry

 

quick edit: Having a dedicated framebuffer didn't always help the PC that much. Games didn't start becoming fluid until VESA local bus replaced ISA for video cards.

Edited by Bryan

Ok, fine. How? I'm looking at the ANTIC docs right now. Taking Mode 8 (mentioned by Jaybird previously) as an example, "This mode, as other non-character graphics modes do, uses data in the display buffer as a bit map to be displayed. A command to display in mode 8 will cause the ANTIC chip to read the next 10 bytes in the display buffer." Each mode after that specifies the same thing, only with different pixel sizes and color counts.

Holy crap. That quoted bit even uses the phrase "display buffer"... TWICE! And that's still not clueing you in.

 

I think this is where you're tripping up-- the "command" referred to is not a CPU command, it's a display-list command to the video coprocessor-- ANTIC. Every video-capable computer has, somewhere in its guts, a circuit that scans a chunk of memory and spews it out to the video generator. This is what you call a "true" frame buffer, yes?

 

Well on the Atari, this circuit just happens to be programmable instead of hardwired. Almost all computers have programmable displays to some extent. Even the Apple II -- your gold standard of frame buffer goodness -- lets you switch between several display modes. But ANTIC, instead of querying a single memory location to set its display mode (as the Apple II does), uses a list of display mode commands. Do you understand now? The Apple and the Atari are doing the EXACT SAME THING, only the Atari gives the user more control over the process.

 

Ultimately, "frame buffer" is a conceptual term, not a technical one. If I have a block of RAM, and anything I write to it appears on-screen automatically, then it's a frame buffer. It doesn't matter what magical yellow brick road lies between the RAM and the screen, all that matters is that it works.


Wow.. 2 posts that say pretty much the same thing. I agree about the "conceptual" aspect. The thing you have to do before you start comparing systems is qualify the type of framebuffer. The Apple has a fixed one, the Atari and Commodore have relocatable ones. The Atari even allows segmented/fragmented ones. The PC generally has a dedicated one. If you're going to take supercat's argument that the 2600 can have one (which I personally wouldn't do), then you'd have to call it a simulated or software framebuffer since there's no hardware support for one.

 

Since the term 'framebuffer' more or less implies that the screen is a single object, the 7800 isn't locked into this category. Maria doesn't see the screen as a 'frame' but rather as a series of combined objects. You could create a display list to treat the screen as a frame, but you don't have to.

 

-Bry

Edited by Bryan

jbanes, we're getting several different things mixed up in this discussion. A framebuffer is the concept that there is RAM which holds the entire screen. It can occupy the same RAM as other things or it can have its own RAM. Some low-end PCs use a few megs of your main memory for video using 'shared memory architecture', which also locks out the CPU, but they still have a framebuffer. The 5200 has a very flexible framebuffer system, but it does not have a dedicated framebuffer.

 

To be clear, "framebuffer" can be correctly used to mean a RAM location holding screen data. More correctly, however, it refers to a hardware device that drives a monitor using pixel data from RAM. Such devices were once valued for their simplicity, despite their expense. As in:

 

"This Sun workstation is equipped with a simple framebuffer capable of driving the monitor at 800x600x256."

 

A RAM location that stores a screenful of data is properly referred to as a "backbuffer" or "frontbuffer". As in:

 

"I wrote the screen to the backbuffer, then flipped it to be the frontbuffer during the Vertical Sync."

 

This is how I'm referring to it. The ANTIC is not a true framebuffer device. It actually has more in common with modern 3D Vector hardware in that it is a separate processor that queues up commands for just in time processing. Probably the biggest difference I see is that the ANTIC doesn't have its own bus. Performance-wise, you just can't do much worse than that.

 

quick edit: Having a dedicated framebuffer didn't always help the PC that much. Games didn't start becoming fluid until VESA local bus replaced ISA for video cards.


 

Agreed. :) The PC had an accelerated framebuffer because it was designed for "serious" home and business applications. The use of a framebuffer and character ROM greatly simplified the programming, making it much more useful in business and BASIC programming. IBM made a half-hearted attempt to customize the PC for "family use" (i.e. games) with the PCjr, but the implementation sucked so bad that it was never taken seriously. (Ah, I still remember that stupid "grab the key" game that could be found by hitting the right function key when the built-in BASIC ROM started. And let's not forget how it would freeze if you hit a key sometime within the minute and a half while the memory was being checked! Stupid PCjr.)

 

It's worth noting, however, that the PC didn't have a true framebuffer device either. The framebuffer had a character blitter added to it that allowed the hardware to produce text. This stands in direct contrast to the Unix-machine framebuffers, which drove everything through software blitting. The PC had different graphics modes, similar to the ANTIC, to help programs produce text screens. Many Unix machines, OTOH, had to draw everything in software.

 

The advantage of the full framebuffer was that there was no such thing as graphics modes. "Text mode" was just an operating system driver that converted ASCII codes to bitmaps on the screen. You could call a graphics library at any time to produce pretty pictures right on top of the console. If X-Windows took over, all it did was disable the console driver and begin writing to the frontbuffer. (I'm somewhat simplifying this. X-Windows often switches a few registers to change the parameters of the video signal so that a higher resolution can be used.) That's why if you've ever seen a Sun machine, the console looks so damn beautiful. Pure white background with black anti-aliased text. *hugs Ultra 10 box* :lust: So beautiful. :_( *sniff*
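
(In other words, "text mode" on such a machine is just a loop like this over a font table - a sketch with hypothetical names and an 8x16 font:)

#define COLS 80
#define ROWS 25
#define FB_W (COLS * 8)
#define FB_H (ROWS * 16)

static unsigned char fb[FB_H][FB_W];      /* one byte per pixel                   */
static unsigned char font[256][16];       /* 8x16 glyphs, filled from a font file */

/* The "console driver" is nothing more than this kind of glyph copy,
   done entirely in software, into the same buffer graphics code uses. */
void draw_char(int col, int row, unsigned char c, unsigned char fg, unsigned char bg)
{
    for (int y = 0; y < 16; y++) {
        unsigned char bits = font[c][y];
        for (int x = 0; x < 8; x++)
            fb[row * 16 + y][col * 8 + x] = (bits & (0x80 >> x)) ? fg : bg;
    }
}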

 

Of course, for video games the differences don't really matter. Save for the fact that the PC is a pain in the ass to code for.

 

Holy crap. That quoted bit even uses the phrase "display buffer"... TWICE! And that's still not clueing you in.

 

I think this is where you're tripping up-- the "command" referred to is not a CPU command, it's a display-list command to the video coprocessor-- ANTIC. Every video-capable computer has, somewhere in its guts, a circuit that scans a chunk of memory and spews it out to the video generator. This is what you call a "true" frame buffer, yes?

 

Oh, I love it. "You're wrong because you're right! See?"

 

Let me ask you this: If you have a port that allows you to use the CPU to write a value to the monitor cable, is it a framebuffer device? No? What if you implement a program that produces a video signal with the correct timing. Is it a framebuffer then?

 

Sort of. Your program is emulating (or simulating if you prefer) the actions of a common piece of hardware. But it wasn't designed to perform this role, nor is it the default state of the hardware. That's why I refer to it as an "emulated framebuffer". And just like with the ANTIC, such a device would tie up the system's resources as the picture is being drawn.

 

The Apple and the Atari are doing the EXACT SAME THING, only the Atari gives the user more control over the process.

 

No, the Atari can be programmed to do the same thing. That's software my friend, like it or not.

 

Ultimately, "frame buffer" is a conceptual term, not a technical one. If I have a block of RAM, and anything I write to it appears on-screen automatically, then it's a frame buffer. It doesn't matter what magical yellow brick road lies between the RAM and the screen, all that matters is that it works.


 

Ultimately you're wrong. There are frame buffers, then there are graphics accelerators, and then there are video overlay generators. A modern graphics card tends to incorporate all three. The frame buffer is the destination for the commands given to the graphics accelerator, which is overridden by the video overlay device to produce effects like a mouse cursor. In engineering these terms have very specific meanings. They used to have different meanings to consumers as well. Today they tend to be used interchangeably outside of engineering, but we're all engineers here, right? Right?

 

*crickets chirp*

 

Hmm. From the Sun framebuffer FAQ:

 

In its simplest meaning, and as far as the Graphics hardware engineers are concerned, a frame buffer is simply that; the video memory that holds the pixels from which the video display (frame) is refreshed.

 

In the early days, Sun provided seperate "Graphics Processor" or "Graphics Accelerator" and "Graphics Buffer" boards - the processor/accelerator was an add-on (sometimes optional) that provided efficient, tuned hardware pipelines to support graphics functions normally done in software, for example shading, Z-buffering, picking and hidden surface removal.

 

Nowadays with ever-improving manufacturing techniques, the two are tightly linked on the same board. Since every graphics device incorporates a frame buffer, but not all have graphics processors, the term "frame buffer" has become synonymous (outside Engineering at least) for a graphics device of any type.

 

<southern-lawyer>Your Honor, Ah rest mah case.</southern-lawyer> :D ;)

 

(For what it's worth, I also used to think that "framebuffer" was a loose term for "the thing that displays the graphics." It was a real eye opener to learn that specifying "framebuffer" meant I might be getting very different hardware than what I wanted.)


To be clear, "framebuffer" can be correctly used to mean a RAM location holding screen data. More correctly, however, it refers to a hardware device that drives a monitor using pixel data from RAM. Such devices were once valued for their simplicity, despite their expense.

 

I am a hardware/software engineer who has designed and implemented a 320x200x4gray LCD controller for an 80186-clone single-board computer (which had a DMA controller, but no built-in video of any sort). I know what's entailed in display generation. I would refer to the area of memory that I had the DMA controller clock out to the display as a framebuffer, even though the designers of the SBC probably never imagined it as such.

 

A RAM location that stores a screenful of data is properly referred to as a "backbuffer" or "frontbuffer". As in:

 

"I wrote the screen to the backbuffer, then flipped it to be the frontbuffer during the Vertical Sync."

 

Frontbuffer and backbuffer are appropriate terms for systems that have more than one framebuffer and provide the ability to switch among them.

 

This is how I'm referring to it. The ANTIC is not a true framebuffer device. It actually has more in common with modern 3D Vector hardware in that it is a separate processor that queues up commands for just in time processing.

 

Can the ANTIC write to RAM at all? It might be useful in some cases if it could but I know of no such ability.

 

Probably the biggest difference I see is that the ANTIC doesn't have its own bus. Performance-wise, you just can't do much worse than that.

 

Sure you can. How about a display controller that can't fetch anything from memory at all and must be spoon-fed by the processor?

 

In 40-column text mode, the Atari runs a dot clock of 7.16 MHz (chroma*2), a character clock of 0.84 MHz, a memory clock of 1.79 MHz, and the CPU also at 1.79 MHz. Every frame will represent 29,344 memory/CPU cycles, of which something over 9,000 are consumed by the display, leaving about 20,000. The Apple II also runs a dot clock of 7.16 MHz, but the character clock is 1.02 MHz. The CPU clock is also 1.02 MHz, and the CPU alternates memory fetches with the display hardware. Each frame represents 16,768 cycles, all of which are available to the CPU.

 

Seems the Atari doesn't do too badly. There are some real advantages to having a fixed alternation of display/processor memory access so I can't fault the Apple for its approach, but I wouldn't condemn the performance of the Atari.

 

Agreed. :) The PC had an accelerated framebuffer because it was designed for "serious" home and business applications. The use of a framebuffer and character ROM greatly simplified the programming, making it much more useful in business and BASIC programming.

 

The CGA framebuffer could only be written during hblank or vblank when using 80-column text mode. This severely limited screen update speed. Graphics mode did not have this limitation. Performance of CGA games was often in fact somewhat better than Apple II counterparts. To the extent that performance wasn't wonderful, it's because frame buffers aren't great at handling moving objects in the absence of hardware blitters.

 

It's worth noting, however, that the PC didn't have a true framebuffer device either. The framebuffer had a character blitter added to it that allowed the hardware to produce text. This stands in direct contrast to the Unix-machine framebuffers, which drove everything through software blitting. The PC had different graphics modes, similar to the ANTIC, to help programs produce text screens. Many Unix machines, OTOH, had to draw everything in software.

 

The CGA had five modes worth mentioning (with BIOS mode numbers)

 

0 and 1 - 40-column character mode

 

2 and 3 - 80-column character mode

 

4 and 5 - 320x200x4 color bitmap

 

6 - 640x200 black+1 color bitmap

 

unnumbered - Hack 80-column character mode so characters are two scanlines high, yielding a pseudo 160x100 16-color mode.
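
(For the curious, that hack works roughly like this sketch; outb() is a stand-in for whatever port-write primitive the environment provides, and the remaining CRTC vertical-timing registers, which also need rescaling for 2-scan-line rows, are omitted. CP437 character 0xDE is the right-half block, so the attribute byte's two colour nibbles become two independent "pixels" per cell.)

#include <stdint.h>

void outb(uint16_t port, uint8_t value);   /* stand-in for a real port write    */

#define CRTC_INDEX 0x3D4                   /* CGA 6845 CRTC index/data ports    */
#define CRTC_DATA  0x3D5

/* Pseudo 160x100x16: start from 80x25 colour text mode, shrink each character
   row to 2 scan lines, and fill text RAM with the half-block character 0xDE. */
void set_160x100(volatile uint16_t *text_ram)      /* B800:0000 on a real CGA   */
{
    int i;
    outb(CRTC_INDEX, 0x09);                /* register 9: maximum scan line     */
    outb(CRTC_DATA,  0x01);                /* -> 2 scan lines per character row */
    for (i = 0; i < 80 * 100; i++)
        text_ram[i] = 0x00DE;              /* low byte = char, high byte = attribute (colours set later) */
}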

 

Even in character mode, I see no reason not to refer to it as a framebuffer device since the display was generated by clocking data out of RAM in real time with a 1:1 correspondence between RAM and screen locations.


The advantage of the full framebuffer was that there was no such thing as graphics modes. "Text mode" was just an operating system driver that converted ASCII codes to bitmaps on the screen. You could call a graphics library at any time to produce pretty pictures right on top of the console. If X-Windows took over, all it did was disable the console driver and begin writing to the frontbuffer. (I'm somewhat simplifying this. X-Windows often switches a few registers to change the parameters of the video signal so that a higher resolution can be used.) That's why if you've ever seen a Sun machine, the console looks so damn beautiful. Pure white background with black anti-aliased text. *hugs Ultra 10 box* :lust: So beautiful. :_( *sniff*

 

Of course, graphics mode is only wonderful if you have enough RAM to accommodate your display and enough processor/memory bandwidth to fill up the RAM quickly. Even with the restricted writing, 80x25x16-color text mode was faster than 640x200 black and white graphics and in many cases looked better too.

 

Of course, for video games the differences don't really matter. Save for the fact that the PC is a pain in the ass to code for.

 

No worse than the Apple II.

 

Let me ask you this: If you have a port that allows you to use the CPU to write a value to the monitor cable, is it a framebuffer device? No? What if you implement a program that produces a video signal with the correct timing. Is it a framebuffer then?

 

Sort of. Your program is emulating (or simulating if you prefer) the actions of a common piece of hardware. But it wasn't designed to perform this role, nor is it the default state of the hardware. That's why I refer to it as an "emulated framebuffer". And just like with the ANTIC, such a device would tie up the system's resources as the picture is being drawn.

 

What matters is the result, not the means. Would you object to saying that Adventure uses a frame counter at addresses $AF and $A0 to control color flashing when the game is idle? The 2600, after all, has no hardware frame counter unlike some other computers (the CGA has a hardware frame counter that blinks the cursor, e.g.) So is it wrong to refer to $AF/$A0 as a frame counter?

 

Ultimately you're wrong. There are frame buffers, then there are graphics accelerators, and then there are video overlay generators. A modern graphics card tends to incorporate all three. The frame buffer is the destination for the commands given to the graphics accelerator, which is overridden by the video overlay device to produce effects like a mouse cursor. In engineering these terms have very specific meanings. They used to have different meanings to consumers as well. Today they tend to be used interchangeably outside of engineering, but we're all engineers here, right? Right?

 

A frame buffer is an area of memory that is clocked out to the display in such fashion as to produce a generally-unchanging correspondence between memory locations and screen locations. A graphics accelerator is a device for shovelling large amounts of data into and within a frame buffer. A video overlay generator is a device to inject an alternate source of video data for part of the frame, often obscuring data read from a framebuffer.

 

In its simplest meaning, and as far as the Graphics hardware engineers are concerned, a frame buffer is simply that; the video memory that holds the pixels from which the video display (frame) is refreshed.

 

It is the memory. Not the memory CHIP, and not the device that reads data from the memory, but the memory itself. In its default text mode, the Apple II uses RAM in the region $400-$7F7 as a framebuffer. This isn't a special chip; that RAM is stored in the same chips as everything else from $0000-$3FFF.

 

(For what it's worth, I also used to think that "framebuffer" was a loose term for "the thing that displays the graphics." It was a real eye opener to learn that specifying "framebuffer" meant I might be getting very different hardware than what I wanted.)


 

A framebuffer is an area of RAM. It may exist within dedicated chips, or it may be borrowed from other chips. In some cases, hardware will force a particular area of memory to be used as a framebuffer, but in many cases software can control how memory is used. An area of RAM is a framebuffer when it's being used as one. When it isn't, it isn't.


Of course, graphics mode is only wonderful if you have enough RAM to accommodate your display and enough processor/memory bandwidth to fill up the RAM quickly.

 

A framebuffer device always has the RAM onboard unless the system uses a Shared Memory Architecture. Agreed on the memory bandwidth.

 

Even with the restricted writing, 80x25x16-color text mode was faster than 640x200 black and white graphics and in many cases looked better too.

 

Definitely faster since you have hardware acceleration. Technically you're still in a graphics mode, but the blitting is being done automatically by the accelerator.

 

As for whether it looks better or not, that's a non-argument. A text display is a form of graphics mode. Whether you're blitting, using sprite overlays, or allowing the hardware to blit to a framebuffer or not is irrelevant. The results will generally look the same at the same resolution. Sun systems tend to look much better because they have high quality software that can produce spectacular results on rather "dumb" hardware. Contrast this with the Apple Macintosh which produces a greater number of character glyphs at high resolution rather than using the resolution to produce nicer looking text. (You can see this mode by activating the OpenFirmware prompt.)

 

What matters is the result, not the means.  Would you object to saying that Adventure uses a frame counter at addresses $AF and $A0 to control color flashing when the game is idle?  The 2600, after all, has no hardware frame counter unlike some other computers (the CGA has a hardware frame counter that blinks the cursor, e.g.)  So is it wrong to refer to $AF/$A0 as a frame counter?

 

Yes, it's a frame counter done in software. But since the processor wasn't purpose-built for this specific task, you can't call the 6502 a frame counter. Just as you can't call the ANTIC a framebuffer. The ANTIC is a graphics co-processor (generally referred to as a graphics accelerator) that drives the GTIA to produce a video signal. You can write software that makes it act like a framebuffer, but it's still a "soft" implementation.

 

A frame buffer is an area of memory that is clocked out to the display in such fashion as to produce a generally-unchanging correspondence between memory locations and screen locations.

 

Close. A framebuffer is a device that produces a generally-unchanging correspondence between memory locations and screen locations. More generally, it's a device that produces a direct correspondence between a given memory heap and the output of the video display. The memory doesn't have to be mapped to a given location. That's primarily a feature for easy addressing. Framebuffer cards exist that allow their memory to be accessed via ports, bankswitched memory locations, and other "fun" (*cough*) tricks.

 

A graphics accelerator is a device for shovelling large amounts of data into and within a frame buffer.

 

Close. A graphics accelerator is a device to translate abstract graphics commands (commonly 2D or 3D vector commands) into a more displayable form. That could be framebuffer data (as with many rasterizers) or it could be a set of commands (as with the ANTIC's control of the GTIA).

 

A video overlay generator is a device to inject an alternate source of video data for part of the frame, often obscuring data read from a framebuffer.

 

Drop the part about the framebuffer and you've got it spot on. Keep in mind that sprite hardware is also a video overlay device. (The sprite overlays the background in the video signal.) Last I checked, there's no framebuffer driving the 2600.
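
(Per pixel, an overlay of that sort comes down to nothing more than this decision, made in the video chain as the beam scans rather than by writing into any buffer - a trivial software rendering of the idea:)

/* If a sprite covers the current beam position, its pixel wins;
   otherwise the background (playfield) pixel is emitted. */
unsigned char output_pixel(unsigned char background,
                           int sprite_present, unsigned char sprite_pixel)
{
    return sprite_present ? sprite_pixel : background;
}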

 

In its simplest meaning, and as far as the Graphics hardware engineers are concerned, a frame buffer is simply that; the video memory that holds the pixels from which the video display (frame) is refreshed.

 

It is the memory. Not the memory CHIP, and not the device that reads data from the memory, but the memory itself. In its default text mode, the Apple II uses RAM in the region $400-$7F7 as a framebuffer. This isn't a special chip; that RAM is stored in the same chips as everything else from $0000-$3FFF.

 

No, no it isn't. The memory is the "buffer". (Frontbuffer or backbuffer, depending on whether it's being displayed or not.) The framebuffer is the device that renders the video signal based on the buffered data. As I stated before, the memory used may not even be mapped into the CPU's memory locations.

 

As I said previously, "framebuffer" is sometimes colloquially used to describe a generic video buffer, but its technical meaning is much different.

 

(For what it's worth, I also used to think that "framebuffer" was a loose term for "the thing that displays the graphics." It was a real eye opener to learn that specifying "framebuffer" meant I might be getting very different hardware than what I wanted.)


 

A framebuffer is an area of RAM. It may exist within dedicated chips, or it may be borrowed from other chips. In some cases, hardware will force a particular area of memory to be used as a framebuffer, but in many cases software can control how memory is used. An area of RAM is a framebuffer when it's being used as one. When it isn't, it isn't.


 

No, no it isn't. The memory is the "buffer". (Frontbuffer or backbuffer, depending on whether it's being displayed or not.) The framebuffer is the device that renders the video signal based on the buffered data. As I stated before, the memory used may not even be mapped into the CPU's memory locations.

 

As I said previously, "framebuffer" is sometimes colloquially used to describe a generic video buffer, but its technical meaning is much different.


I just saw that copyright law thread and . . . wow. I'm not smart enough to understand most of what's in this thread so far, but on the copyright issues you were just flat-out making things up.

 

We've all had bad assumptions, that's okay. But defending such assumptions in the face of overwhelming expert evidence is just insane. In the same way you made up copyright rules which simply don't exist, you're making up definitions for "Framebuffer" that simply don't exist. Just googling around in my ignorance, I found multiple references which state that a framebuffer is not necessarily a device as such, but an application of a device or software. It appears to be the ends, not the means which make a framebuffer.


I was going to correct some of the more glaring errors in jbanes' latest post, but for all the good it would do I think I'll find this more satisfying--

 

jbanes, you are a clueless fuckwit.

 

Ahh, that felt good. Anyway...

 

Wikipedia says you're wrong.

Webopedia says you're wrong.

The Free Computing Dictionary says you're wrong.

OpenGL.org says you're wrong.

LinuxQuestions.org says you're wrong.

ComputerHope.com says you're wrong.

HWUpgrade.com says you're wrong.

Uniblue says you're wrong.

 

The Sun usage of "framebuffer" is nonstandard. Get over it.

Edited by ZylonBane
