Posts posted by jbanes


  1. On the other hand, I think that mini-arcades in airports and other places that people have to kill some time would do very well to focus on things similar to the $0.25 bartop games.  Such places aren't really going to be in competition with home systems since the people waiting in airports aren't going to have ready access to those.  And on such systems, approachable gameplay will be far more important than super-duper graphics, sound effects, motion feedback, etc.

     

    I see no reason $0.25/play shouldn't do quite well at such places.


     

    The first video game I ever saw as a kid was a Pacman Cocktail Table at my local Pizza Hut. It was pushed up against the wall near the counter as there was very little space in the restaurant.

     

    I've always wondered why someone didn't open a restaurant that featured ruggedized versions of these as every table? Get people to plunk money in as they wait for their food. Of course, the food has to be good, but otherwise the idea seems sound enough. (I hear that Nolan Bushnell is planning something similar with modern games?) The biggest problem I see is that the initial startup cost would be high, and you'd have a lot of potential maintenance issues. But still, it could be a very attractive restaurant. Especially in high-traffic areas (like airports) where a little gimmickry goes a long way. Not to mention all the people who would sit down and play just because they're waiting for their plane. :)


  2. While I agree that the Laserdisc would have been mostly useless, I chose it because it's the best out of three useless options.

     

    The Mindlink was junk. Nothing more than a cutesy headband that would "read" your brow. If it had been released, it would have gone the way of the PowerGlove. Except with less fanfare.

     

    As Bruce pointed out, the 5200 adapter would have been nothing more than a 5200 machine that plugged into your 7800.

     

    The Laserdisc, while guaranteed not to be successful, might have been at least interesting. i.e. It could have been the SuperCharger of the 7800. Just more expensive. At the very least, it might have been useful with a lightgun for an early version of Mad Dog McCree. :D

     

    As for the keyboard, I think it might have done alright for itself. I have to disagree that it was a distraction. It could have been used to create more complex games that needed more keys/buttons than the 7800 could provide. It could also be used to load games from tape, allowing for large worlds that could never have fit on a cartridge. Basically, the keyboard could have had a purpose, even though it may have seemed gimmicky.

     

    I wonder if anything else could have been done with that expansion port?

    I don't see why not. For example, you could create a CD-ROM drive with extra RAM for the machine. Maybe even include a cool video or sprite compression decoder to free up the CPU. All you really need is the documentation that explains the pinouts. (And some hardware design experience, but that goes without saying. ;))


  3. Or, he may have been referring to the Flashback 3 rumors. Curt has hinted that there will be a cartridge port of some kind on it, leading many AtariAgers to believe that the FB3 will be a 400/800 clone. Curt is also working on the previously-unreleased Keyboard component for the 7800.

     

    Basically, there's a lot of cool stuff going on with the ancient Atari hardware. Though I'm not sure I'd go as far as to say that Atari is "making a new computer".


  4. I also think a Stargate (the tv show) sidescroller or vertical scroller (like Ikari Warriors) adventure type of game would be fun as well if it could be done to look halfway decent.


     

    Technically speaking, an Atari video game for Stargate Atlantis already exists. Gameplay is pretty simple too:

     

    The city is under siege by the Wraith, and you must use your rail gun installations (recently shipped in from Earth) to defend the city against the Wraith! Watch out, though! If a dart manages to penetrate the shield, it will beam in a task force of Wraith soldiers who will destroy that structure. If the city sustains too much damage, you'll have no choice but to evacuate in the Daedalus and destroy the city!

     

    Hurry, you're the only hope the Pegasus Galaxy has against the Wraith! :ponder: :D


  5. You'll have to excuse the slight shift in perspective here, but if I may wax poetically for a moment....

     

    "Say not, Why were the days which have gone by better than these? Such a question comes not from wisdom." -Ecclesiastes 7:10

     

    The arcades that we knew in our younger days are long gone, never to be brought back to life. Concerning ourselves with the difficulties of creating new machines is probably a pointless exercise, in part because of the points Mr_8bit_16bit brought up.

     

    That being said, I still enjoy playing Galaga, Cruisin' USA, Killer Instinct, After Burner, BattleZone, TMNT, Tempest, and the hundreds of other games. I also enjoy taking my family to the little hotdog places around Chicago so we can get some tasty fast food grub (Vienna Beef, Mmmmm...) and play a few of the multi-arcade machines they have set up. (My older son seems to like DigDug, while the younger one just likes to hit the ball in Pinball.) I see no reason why bringing these experiences back in an arcade setting wouldn't work.

     

    Look at it this way. There's currently a booming market in PNP TV Games, primarily because they deliver to people's homes all the original games they used to (and still do) enjoy. In my experience, these TV Games also help bring back some of the social aspect that's been missing from gaming for so long.

     

    What I'm getting at is, I don't see why someone can't run a classic arcade that runs existing machines in a traditional arcade setting with food, tokens, loud noises, and other aspects that we used to enjoy. So what if Sky Kid is sitting next to Hydrothunder? The point is to create a common area where people can have fun! No, it won't be exactly like the arcades were back in the day. In fact, such arcades would probably be idealized versions. But they'll still be fun, and that's all that counts. :)

     

    It's been mentioned around here before that a few of these types of arcades have popped up in heavily trafficked areas. If I ever take a vacation in Geneva, WI, I'll have to look up the one that was mentioned there. Bound to be lots more fun than the gutted arcades in the Wisconsin Dells.

     

    Edit:

     

    Good News: I found the post I referenced here. :)

    Bad News: Most of the reviews are not positive. :(


  6. Three more things that made the arcade experience different from the home experience: -1- The vanity board (somewhat supported via the Internet, but it's not really the same) -2- Being able to actually see other people play; -3- Putting in quarters.  Although the financial drain posed by the latter is in some ways annoying, it does tend to change the way games are approached.

     

    Right you are. The social aspect of gaming just can't be overstated. IMHO, gaming went down the drain the moment the PS1 grabbed the market away from Nintendo. Instead of games that encouraged siblings, friends, and older family members to play together, gaming became a form of unsatisfiable escapism.

     

    Surprisingly, quarter draining is also a very important element. It seems a little weird, but it places a limit on your gameplay that is difficult to overcome. On a home port of a game you can keep playing until you reach some arbitrary limit of lives or continues. In the arcade, you can keep playing as long as you want, but only if you can bleed enough quarters to keep up. In a social game like TMNT, Simpsons, or X-Men, there's a lot of pressure (but positive pressure, IMHO) to keep playing. Even games like Killer Instinct keep people plunking quarters so they can prove who has better mastery of the game, and/or take over the single player game.

     

     

    I'd be curious to know what studies have been done on game pricing versus demand, both with regard to price variations among games in an arcade, and with regard to a price structure for an arcade as a whole.


     

    This is a good question. One has to wonder if such studies were done before arcade manufacturers ran themselves bankrupt creating new machines. Anecdotal evidence definitely suggests that a quarter/token at a time is ideal for "quarter sucking games". i.e. Players will be attracted by the low cost, and then kept at the game machine by the low cost of continuing their game. OTOH, most racing games would feel too "cheap" (as you say) if they were anything less than 50 cents. Why? Because you are paying for a fixed amount of gameplay. (Usually one level or track.) This is different from quarter suckers that tend to prompt you for more money at much more arbitrary intervals.

     

    I'd be interested in seeing if a proper study agreed. :)


  7. His postulation for Tekken killing the arcade is that the home version was arcade perfect and almost simultaneous in its release. And that once arcade perfect was capable at home, the desire to go to the arcade to pump laundry money into arcade machines went away... after all, they were no longer technically superior... they had nothing to offer that your TV at home couldn't with a PS1 plugged in.

     

    An interesting take on the market downturn. Though I'm not really sure I can agree with it. I was a big arcade-goer at the time, and despite the near flawless transition of SF:Rush and other arcade games to home systems, it just wasn't the same. The arcade experience included things like rumble chairs, steering wheels, gas pedals, stick shifts, force feedback, recoiling guns, solid joysticks, and other goodies that helped make the game what it was.

     

    Or in other words, it was really about the total package. With a home system, your total package is just the game. It's a game, very much like any other. You use the same controller (no matter how unnatural it is), and play it on the same console with the same sound system and the same video. A lot of what made the game interesting in the arcade is lost. What you're left with is a "game" that's designed to draw quarters from you, except that you don't need to plunk any quarters into the machine!

     

    Even games that don't have such specialized controls tend to be more fun in the arcade than at home. Killer Instinct is a perfect example. The arcade version of the game can be run from home on a PC emulator. Is it as much fun? Nope. Just not the same experience.

     

    Home games tend to benefit from more drawn-out play. Sure, a quick Atari game can be fun, but for the money you spend on an average console, you tend to expect to get hours upon hours of enjoyment out of a game. Arcade ports don't provide that.

     

    As a customer of the arcades when they died, I feel I may be able to shed some light on why they died.

     

    The key thing I remember about Hydrothunder is that it was very much the last interesting game to hit the arcades. All the previous good games were slowly disappearing, but weren't being replaced by fresh new games. The few new machines that showed up were rehashes of concepts that had been done a billion times before. (e.g. Another F1 racer game, another Street Fighter Alpha game, another poor shoot-the-zombie game with cheap-ass-guns, etc.)

     

    But why weren't the games being replaced? Where was all the innovation, the pushing of the limits?

     

    The other thing that stuck out to me at the time was that the machines were getting VERY expensive to produce. So expensive that prices for a single game were growing by leaps and bounds. It started at 50 cents a game, ballooned to 75 cents a game, and hit a dollar per game in no time. Some arcades charged $1.50 a game!

     

    Basically, this hyperinflation was unsustainable. Customers were driven away from the arcades by high prices while the machines grew in expense. What was an arcade owner to do? Stop buying new games, and replace some of the older games with classic money makers like Skeeball, that's what. Some of my favorite arcades from days gone by are either nothing but ticket games or are simply out of business. The few that have any arcade games at all tend to follow the same formula:

     

    1. 2 or 3 DDR or DDR clone machines

    2. 1 or 2 racing games

    3. A few gimmicky games (e.g. Sword fighting with a wand, rotating tail gunner, etc.) that aren't actually that much fun to play

     

    A perfect case in point is the Chuck E. Cheese restaurants. Remember when they had TMNT, Super Mario, Excite Bike, Simpsons, Sky Kid, and other fun games? Yeah, well they're all just rides and ticket games now. You have to really search to find an actual video game. Big Rigs and some Firefighter game tend to be "it". It saddens me too, because I remember Showbiz/Chuck E. as a great arcade when I was a kid. Now I can't share those memories with my kids. :(

     

    So that's my opinion. The arcade games got too competitive and basically put themselves out of business. If the home consoles had anything to do with it, they simply helped force that competition by ensuring that arcade game producers would have to try harder than another Pacman to keep kids coming to the arcades.


  8. Perhaps a better example of the actual death warrant of the arcade would be, say, Hydro Thunder.


     

    Them's fightin' words! :x

     

    Seriously, what's wrong with Hydrothunder? I always thought it was a great game. Probably the last arcade game to ever drain quarters from my pocket. It was especially fun if you found an arcade that had four of the machines linked up. Almost (but not quite) as fun as playing against friends in San Francisco: Rush.

     

     

    STUN Runner used a standard raster monitor, in other words.  I don't think there were any vector-monitor games made after 1985 or so.


     

    Indeed. From KLOV:

     

    Monitor:
    
       * Orientation: Horizontal
       * Type: Raster: Standard Resolution
       * CRT: Color

     

    Shadow, maybe you're thinking of the 3D Vector Graphics that the game used? Like most modern 3D displays, the graphics are rasterized prior to display. I don't think a filled-poly vector display was ever produced.
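    "Rasterized prior to display" just means the vector endpoints get scan-converted into pixels before anything hits the tube. The classic way to do that for a line segment is Bresenham's algorithm; a quick sketch (my own illustration, not tied to any particular game's hardware):

```python
def rasterize_line(x0, y0, x1, y1):
    """Bresenham's algorithm: convert a vector segment into the grid
    of pixels a raster monitor can actually show."""
    pixels = []
    dx, dy = abs(x1 - x0), -abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx + dy
    while True:
        pixels.append((x0, y0))
        if (x0, y0) == (x1, y1):
            break
        e2 = 2 * err
        if e2 >= dy:   # step horizontally
            err += dy
            x0 += sx
        if e2 <= dx:   # step vertically
            err += dx
            y0 += sy
    return pixels

diag = rasterize_line(0, 0, 3, 3)   # a 45-degree vector becomes 4 pixels
```

    A true vector monitor skips this step entirely and steers the beam along the segment itself, which is why filled polygons never really fit that display technology.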


  9. #2: The entire concept of the type of machine that it was, a video game console with program cartridges, was also not theirs to protect. That falls under Mr. Baer and later, Magnavox & Fairchild. Atari only made video games a household word by producing and selling their own versions, they didn't invent/develop the technology.

     

    Mr. Baer's design (the Odyssey) didn't have cartridges in the same way that the 2600 did. The Odyssey's cartridges contained none of the game code or hardware, and merely rerouted the circuits inside the Odyssey to activate a game that was already in the system. It's difficult to say if Atari would have been able to block the VES/Channel F had they filed a patent immediately after they'd developed the cartridge idea.

     

    As for the rest of the stuff, Atari only needs to show that they significantly improved on existing concepts in order to be granted a patent. They would then have exclusive rights to those improvements for the duration of the patent. Atari, in fact, did file a patent on the TIA in 1976, and was granted the patent in 1978. It didn't seem to do them much good against Coleco, though. The judge decided that because Coleco had used off the shelf parts instead of Atari's custom hardware, Atari's patents didn't apply. (Does anyone actually have the full text of the judge's findings in the Atari v. Coleco case? I've been trying to find it for a little while now.)

     

    Addendum: Here's the patent Atari filed on their cartridge design. I believe someone mentioned this earlier in the thread. (supercat?)

     

    #3: As a continuation to 1, there is no software in it, copyrighted or otherwise. Just 99% free use hardware. Even the Fairchild Channel F, which pre-dates the 2600, had a BIOS and thus copyrightable code contained within.


     

    Software wouldn't have protected the Channel F prior to 1980. Before the 1980 amendment to the Copyright Act of 1976, there was a lot of question over whether software was copyrightable. Thus Atari took the patent route instead of bothering with BIOS software. It's difficult to say if it would have helped them, though. In 1982, Compaq reverse-engineered the PC BIOS. The courts decided that it was a completely clean-room implementation, and that Compaq was in the clear. On the other hand, Keith Robinson seems to believe that Coleco failed to produce its Intellivision module for the Colecovision because of fears over software copyright infringement. So, it's hard to say what would have happened if Atari had used software as a protection mechanism.


  10. ...Nolan was smart enough to get people together to come up with this stuff, smart enough to make their efforts work-for-hire product...

    ...but not smart enough to get legal protection for his stuff?

     

    Who was giving him legal advice, John Santangelo?

     

    That's not really fair to Nolan. You need to realize that the Atari VCS was a followup to the single-purpose Pong units. Atari's plan was to expand their Pong market by creating a more general machine that played about 10 cartridges which could be purchased separately. I seriously doubt they foresaw that the VCS would become as popular as it did. They certainly didn't expect their own programmers to leave and create competing companies. (i.e. Activision and Imagic)

     

     

    Sheesh. And I thought it was bad enough that Toys-R-Us is selling two store-brand knockoffs of the same Hasbro/MB game. Knockoffs are bad enough...

    ...but not even bothering to legally protect your work except after people have started copying it is downright short-bus stoopid. :dunce:


     

    Hindsight is 20/20. Back when the VCS was made, there wasn't a whole lot of legal protection in the computer industry. Filing for a patent was like announcing your inventions to your competitors. A lot of the intellectual property rights of computer companies didn't stabilize until the 1980's, in part because of Atari's suits.

     

    The weapon of choice used by modern console makers is the Digital Millennium Copyright Act, which makes it illegal in many circumstances to break the encryption or lock-out software present on console systems. Such provisions didn't exist in the 1970's, when it wasn't clear if software could be copyrighted and the USPTO rejected patent applications on software. As a result, "security through obscurity" and healthy doses of trade secret contracts were seen as the best solution.


  11. I don't know if all versions of the 7800 are incompatible with Pitfall II. There is a compatibility analysis here: http://www.emucamp.com/vgee/2600/2600faq.htm#78incom

     

    Unfortunately, Pitfall II is not mentioned. However, they do describe how to identify the oldest, most compatible version of the 7800 console. If any 7800 can play Pitfall II, it's probably that one.


     

    If I understand that link right, the way to tell if you have a first release or not is to check for the expansion port. Since I do have the "Expansion Port" sticker and the port itself, I assume that I have the first revision. Those are reasonably rare, aren't they?


  12. That's what I'm wanting.


     

    I agree whole heartedly. This thread is long ruined. I don't know why anyone's still here.

     

    <Ferris-Bueller>You're still here? It's over. Go home. Go!</Ferris-Bueller>

     

    (Makes shooing motion.)

     

    :)


  13. I've heard that Pitfall II is supposed to be rare

     

    Someone will correct me if I'm mistaken, but I don't think Pitfall 2 works on the 7800.


     

     

    Nope.


     

    Wait, I'm confused. I have a brand-spanking old 7800 at home. Last weekend I picked up Pitfall II from Sean Kelly's shop for $7.99. I took it home and popped it into my 7800 with no problems to speak of. Are we talking about some special version of Pitfall 2? Because I found it neither rare nor incompatible.


  14. So on one hand there's ZylonBane putting on his best tough-guy impression to distract everyone from the fact that he didn't have the balls to take up the challenge (which he soundly "lost"), SuperCat PMing me to convince me that "framebuffer" has been "retronymed" to mean something else despite the mountain of evidence (including some modern evidence produced by none other than ZylonBane), and Danno telling me that he didn't really want an apology and telling me to <really-gruff-voice>"Stay out of my PM box!"</really-gruff-voice>

     

    :rolling: :lol: :rolling: :lol: :rolling: :lol:

     

    You're all a bunch of right loons, you are!

     

    :rolling: :lol: :rolling: :lol: :rolling: :lol:

     

    (wipes tears from eyes)

     

    Ah, man. Thanks guys, I needed that.


  15. Then you just go on saying "dumb framebuffer", and we'll all carry on with the modern usage.


     

    Fine with me. I never demanded anything else.

     

    And you owe a lot of people apologies for train wrecking a perfectly good conversation.

     

    For those of you who are interested in the historical usage, I prepared a few links for the challenge that ZylonBane refused to accept. You may find this to be of interest if you're the type who enjoys studying the history of computer science.

     

    ------------

     

    In 1969 Joan Miller experimented with a paint program on a 3-bit framebuffer developed at Bell Labs(1). While the concept of a framebuffer had been theorized about for quite a long time, this was the first known example of such hardware.

     

    In 1972, Richard G. Shoup created the first complete, fully functional framebuffer along with a paint program to utilize it. This system was dubbed "SuperPaint"(2), and had a user interface very similar to paint programs we use today. The hardware was implemented as a 307,200-pixel shift register, allowing pixels to be accessed only when the specific scan line and pixel time were reached. This shift register was synchronized with the television scan rate. Shoup also implemented the ability to read in a video signal by synchronizing the television signal between the inputs and outputs. This allowed SuperPaint to also be the first example of a video capture system. The complete SuperPaint system currently resides in the permanent collection of the Computer Museum History Center in Mountain View, California.
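    To make the shift-register limitation concrete, here's a toy simulation (my own sketch, not the actual SuperPaint design): pixels circulate past a single read/write tap in sync with the video scan, so a write can only land at the moment its slot comes around.

```python
from collections import deque

class ShiftRegisterFB:
    """Toy model of a shift-register framebuffer: pixels circulate
    past a single tap once per refresh, so a pixel can only be
    modified at the moment it passes the tap."""

    def __init__(self, size):
        self.pixels = deque([0] * size)
        self.pending = {}  # address -> value, applied as pixels pass by

    def write(self, addr, value):
        # Writes are queued; they land during the next pass of that pixel.
        self.pending[addr] = value

    def refresh(self):
        """One full pass: clock every pixel past the tap, applying
        any queued write at its slot, and emit the video stream."""
        stream = []
        for addr in range(len(self.pixels)):
            pixel = self.pixels.popleft()
            if addr in self.pending:
                pixel = self.pending.pop(addr)
            stream.append(pixel)
            self.pixels.append(pixel)
        return stream

fb = ShiftRegisterFB(8)
fb.write(3, 7)          # takes effect only as slot 3 passes the tap
first = fb.refresh()    # the queued write lands mid-pass
second = fb.refresh()   # now stable
```

    A random-access framebuffer, like the later Evans & Sutherland design, removes exactly this restriction: the CPU can touch any pixel at any time.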

     

    In 1974, Evans & Sutherland brought the first commercial framebuffer(3) (designed by Jim Kajiya with full Random Access Memory(6)) to the market. The device cost upwards of $50,000, but started a revolution in graphics development across universities nationwide(5).

     

    Within a few years, memory started to become cheap enough to allow devices like the Apple II to contain framebuffers. By the 1980's, Unix vendors began appearing to provide high-quality graphics workstations to the market. SGI(7)(8), HP(9), DEC(10), and Sun Microsystems(11)(12) all released framebuffers throughout the 80's, and well into the 90's.
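    The Apple II is a good illustration that a framebuffer's memory needn't even be laid out linearly: its hi-res screen is scanned out of main RAM in a famously interleaved order. The standard line-address formula can be sketched as follows (the formula itself is the well-known hi-res layout; the function name is mine):

```python
def hires_line_addr(line, base=0x2000):
    """Base address of one scan line in Apple II hi-res page 1.
    The interleave is a quirk of how the video counters were wired."""
    if not 0 <= line < 192:
        raise ValueError("hi-res has 192 lines")
    return (base
            + 0x400 * (line % 8)          # line within a group of 8
            + 0x80 * ((line // 8) % 8)    # group within a 64-line band
            + 0x28 * (line // 64))        # which 64-line band

# Consecutive scan lines are far apart in memory:
assert hires_line_addr(0) == 0x2000
assert hires_line_addr(1) == 0x2400
```

    The video circuitry walks this layout automatically; software just has to live with it.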

     

    Development didn't stop there, however, and manufacturers began to add Graphics Accelerator chips to accelerate their framebuffers for text modes, graphics primitives, and many other features used by the emerging GUI systems. The final result is the highly advanced 2D graphics cards we have today. They couple a graphics accelerator, framebuffer, and video overlay device to produce high-quality imagery at blistering speeds. Many also include 3D Vector Processors which can be used to rasterize millions of 2D or 3D vector shapes per second to the framebuffer.

     

    ------------

     

    1. http://accad.osu.edu/~waynec/history/PDFs/Annals_final.pdf

    2. http://accad.osu.edu/~waynec/history/PDFs/14_paint.pdf

    3. http://accad.osu.edu/~waynec/history/lesson15.html

    4. http://www.siggraph.org/movie/

    5. http://accad.osu.edu/~waynec/history/PDFs/paint.pdf

    6. http://research.microsoft.com/users/kajiya/

    7. http://scanimate.zfx.com/DVD2T.html

    8. http://hardware.majix.org/computers/sgi.iris/iris3130.shtml

    9. http://openpa.net/systems/snakes.html

    10. http://q.dyndns.org/~blc/DS3100/specs.html

    11. http://www.sunhelp.org/faq/FrameBufferHistory.html

    12. http://www.sunhelp.org/faq/FrameBuffer.html

     

    And with that, my friends, I bid this thread adieu. Thank you to those of you who had positive contributions to add. I hope that we can intelligently revisit many of these points at a future date, in a hopefully less hostile forum. Good day.


  16. What many people have said to you many times, using iteratively smaller and smaller words, is that the MEMORY is the framebuffer.

     

    Which, I keep repeating, is a colloquial definition that is not precisely correct. According to this definition, rendering a frame of animation to memory is a "framebuffer". But technically, this is incorrect. A framebuffer is a device that uses buffered pixel data to drive a display. Period, end of story. That's a framebuffer.
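    For illustration, the distinction can be sketched in a few lines (a hypothetical toy model, obviously no real video hardware involved): the bytearray is just memory; the scan-out logic that walks it in raster order and feeds a display is the framebuffer.

```python
class DumbFramebuffer:
    """Toy model of the distinction: the *memory* is just an array;
    the *framebuffer device* is the scan-out logic that turns it
    into a raster video stream, one line per horizontal sweep."""

    def __init__(self, width, height):
        self.width, self.height = width, height
        self.memory = bytearray(width * height)  # the buffer itself

    def plot(self, x, y, value):
        self.memory[y * self.width + x] = value  # random access, CPU side

    def scan_out(self):
        """What the display hardware does ~60 times a second: walk
        the buffer in raster order and emit pixels to the monitor."""
        for y in range(self.height):
            row = self.memory[y * self.width:(y + 1) * self.width]
            yield list(row)  # one scan line's worth of pixel data

fb = DumbFramebuffer(4, 2)
fb.plot(2, 1, 255)
frame = list(fb.scan_out())
```

    Take away scan_out() and you're left with a plain byte array; nothing about it is a framebuffer until something drives a display from it.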

     

    And yet, you refuse to grasp this point. It's like you've brainwashed yourself or something.

     

    I grasp your point just fine. I keep having to tell you that your definition of framebuffer, while acceptable in many non-technical circles, is skewed. You don't seem to want to follow that.

     

    Incidentally, in the Linux world, a framebuffer is "a hardware-independent abstraction layer". That's right, it's a completely software concept. Do you want to call up every Linux dev in the world and tell them they're using the wrong word?


     

    The Linux Framebuffer driver is a "Virtual Framebuffer" similar to the X Virtual Framebuffer. It emulates the hardware, allowing programs designed for a real framebuffer to operate. From your link:

     

    The Linux framebuffer (fbdev) is a graphic hardware-independent abstraction layer to show graphics on a console without relying on system-specific libraries such as SVGALib or the heavy overhead of the X Window System.

     

    It was originally implemented to allow the Linux kernel to emulate a text console on systems such as the Apple Macintosh that do not have a text-mode display, and was later expanded to Linux's originally-supported IBM PC compatible platform, where it became popular largely for the ability to show the Tux logo on boot up.

     

    You'll note that nowhere in your link does it say, "a framebuffer is memory". It clearly states that the driver is a software emulation of a hardware device. It has been expanded to provide flat memory emulation for the PC VESA modes, which are usually bank switched.

     

    From the Xvfb man page:

     

    Xvfb is an X server that can run on machines with no display hardware and no physical input devices. It emulates a dumb framebuffer using virtual memory.

     

    Did you catch that? It EMULATES a dumb framebuffer using virtual memory. In case that didn't sink in, let me repeat it. It EMULATES a dumb framebuffer using virtual memory.

     

    If it was just a memory backing as you say, Xvfb would be a framebuffer rather than emulating one.

     

    The problem is that at some point along the line, people didn't understand the technical nature of a framebuffer and started referring to buffer memory as the framebuffer itself. This is incorrect, even though it's an error that's often repeated.


  17. I'm genuinely at a loss at to why jbanes thinks the Apple has a framebuffer, but the Atari doesn't.


     

    Way to avoid the challenge, there. If I really am as witless as you make me out to be, then why not take me up? The gauntlet is on the ground. Are you going to pick it up?

     

    I'll explain (if you're actually interested in hearing). I've said many, many, many times that the ANTIC can emulate a framebuffer. The ANTIC isn't designed as a framebuffer (rather, more of a natural evolution of the playfield design of the Atari 2600), but it can be programmed to act as one. Once it's programmed to act as one, it is "emulating" a framebuffer. From the dictionary:

     

    3. Computer Science. To imitate the function of (another system), as by modifications to hardware or software that allow the imitating system to accept the same data, execute the same programs, and achieve the same results as the imitated system.

     

    1. To take as a model or make conform to a model: copy, follow, imitate, model (on, upon, or after), pattern (on, upon, or after).

     

    Why you find that to be such a hard concept to grasp, I do not understand. The end result is the same: You have a working framebuffer, sans the drawbacks of the ANTIC/GTIA system. Yet you insist that I'm "wrong". From there, you and Supercat have come to maintain that a "framebuffer" is (precise definition here) nothing more than a buffered frame in memory. By that definition, you could have a machine without a video output and still have a framebuffer.
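    To put the "programmed to act as one" point in concrete terms: ANTIC executes a display list, and a list consisting of nothing but bitmap-mode lines makes the chip scan a linear buffer just like a dumb framebuffer. Here's a sketch of how such a list might be built (byte values per my reading of the ANTIC documentation; a real full-screen display also needs a second LMS where screen memory crosses a 4K boundary, which I've left out):

```python
def antic_linear_dl(screen_base, dl_base, lines=192):
    """Build an ANTIC display list that scans `lines` rows of
    mode F (320-pixel hi-res, 1 scan line each) from a linear
    buffer at screen_base, i.e. programs ANTIC to behave like
    a plain framebuffer."""
    dl = bytearray([0x70, 0x70, 0x70])   # 3 x "8 blank scan lines"
    dl += bytes([0x4F,                   # mode F + LMS (Load Memory Scan)
                 screen_base & 0xFF,     # screen address, lo byte
                 screen_base >> 8])      #                 hi byte
    dl += bytes([0x0F] * (lines - 1))    # remaining mode F lines
    dl += bytes([0x41,                   # JVB: jump & wait for VBlank
                 dl_base & 0xFF,
                 dl_base >> 8])          # loop back to the list itself
    return dl

# Hypothetical addresses, just for illustration:
dl = antic_linear_dl(0x8000, 0x7C00)
```

    Once this list is running, the 6502 sees a linear bitmap it can poke at will, which is exactly the framebuffer behavior being argued about.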

     

    I maintain that the precise definition of "framebuffer" is a device that drives a display based on a buffered grid of "Picture Element" (pixel) data.

     

    So, either we can agree that we just have different terminology (and you can stop casting insults at people), or we can complete a challenge to show who is "correct". What say you?


  18. To hell with it. I can't sleep now anyway.

     

    Heh... "fluffwit"...

     

    So, let's get this straight. From the moment this thread started, you've been nothing but negative and abrasive toward the discussion. Your comments directed at me have been nothing but inflammatory, and wanting in any real technical data. All the while you claim you're "trying to mix in a little information", of which you'd provided zero. You got supercat to charge in arguing semantics he doesn't understand either, and you haven't conceded a single point ANYONE has made in this thread.

     

    Now you have the gall to try to start a flamewar by casting insults?

     

    I'll tell you what. How about we settle this right here and right now? We'll solve the matter of the framebuffer, permanently.

     

    To recap the overall situation, you contest on the basis of (lemme see here) 1 stub of an article, 4 non-authoritative sources, and 3 sources that don't disagree with a thing I've said (I'll let you figure out which is which), that the "framebuffer" as a device is an invention of Sun Microsystems' marketing, and that everyone else in the world only uses the term to refer to any memory anywhere that used to hold a frame of data, regardless of whether or not it's driving a monitor. Is that correct?

     

    Well, let's have a contest to prove it. If you can prove that I'm not right in a fixed amount of time, you win. If you give up and I can't prove I'm correct by the end of that time, you win. What do you say? Are you up for a simple challenge that will settle this matter once and for all? Shall we match wits and see whose are "fluffier"?

     

    Supercat, how about you? You in?

     

    Edit: Clarified framebuffer definition as per Bryan's post.


  19. I am a hardware/software engineer who has designed and implemented a 320x200x4gray LCD controller for an 80186-clone single-board computer (which had a DMA controller, but no built-in video of any sort).  I know what's entailed in display generation.  I would refer to the area of memory that I had the DMA controller clock out to the display as a framebuffer, even though the designers of the SBC probably never imagined it as such.

     

    Nice to meet you. It sounds like you created a Shared Memory Architecture framebuffer through the DMA controller. Since I'm assuming that's its designed purpose, I see no issue with calling it a framebuffer. The primary difference is that you're driving an LCD display (a fairly digital process, and nowhere near as finicky on the timing) rather than an electron beam.

     

    The concept of a framebuffer is to buffer the data for driving the electron beam in much the same way you use a FIFO buffer to drive a serial port. The biggest difference is that when framebuffers allow random access to the video buffer, they become useful for emulating the capabilities of sprite hardware.

     

    For an LCD display, driving the display means that there's no need to worry about the timing of a physical beam. (No HSync, no VSync, no unviewable area, etc.) You just cycle through the pixels one by one, preferably fast enough to prevent them from fading. Most LCDs have a driver chip that does this automatically, allowing a variety of signals to be sent to the display. Some even run programs on an internal microprocessor to interpret the incoming signals into something usable by the panel. (e.g. Most LCD TVs accept the NTSC signal for backward compatibility. They have no need for such complex signals, though.)

     

    Frontbuffer and backbuffer are appropriate terms for systems that have more than one framebuffer and provide the ability to switch among them.

     

    Indeed. Otherwise it's just referred to as the "video buffer", "display buffer", or "graphics buffer". These terms are pretty much interchangeable, though the latter is often applied to per-pixel modes while "display buffer" is often used to refer to text-mode data.

     

    This is how I'm referring to it. The ANTIC is not a true framebuffer device. It actually has more in common with modern 3D Vector hardware in that it is a separate processor that queues up commands for just in time processing.
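    To illustrate the distinction, here's a toy sketch of what "queuing up commands for just-in-time processing" means: each frame, the chip walks a command list in RAM and produces scanlines on the fly, rather than scanning out a fixed pixel grid. The opcodes and encoding below are made up for illustration; they are not the real ANTIC instruction set.

```python
# Toy model of a display-list-driven video chip. Opcodes, memory layout,
# and modes are simplified stand-ins, not the real ANTIC encoding.

BLANK = 0x00   # hypothetical: emit N blank scanlines
MODE  = 0x01   # hypothetical: emit one mode line, fetching screen bytes
JUMP  = 0x02   # hypothetical: end of list

def scan_out(display_list, ram):
    """Walk the command list each frame, just in time, producing scanlines."""
    scanlines = []
    pc = 0
    while True:
        op = display_list[pc]
        if op == JUMP:
            break
        elif op == BLANK:
            count = display_list[pc + 1]
            scanlines.extend([None] * count)          # blank lines carry no data
            pc += 2
        elif op == MODE:
            addr, width = display_list[pc + 1], display_list[pc + 2]
            scanlines.append(ram[addr:addr + width])  # fetch line data from RAM
            pc += 3
    return scanlines

ram = list(range(100))
dlist = [BLANK, 2, MODE, 10, 4, JUMP]
lines = scan_out(dlist, ram)   # two blank lines, then one line fetched at 10
```

    Note that nothing here requires a pixel-per-memory-cell grid; the display is regenerated from commands every frame, which is the point being argued.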

     

    Can the ANTIC write to RAM at all? It might be useful in some cases if it could but I know of no such ability.

     

    I'll grant you that I'm far from an expert on the ANTIC, but as far as I know, no. Why does it need to? (i.e. What are you getting at?) It takes commands in from RAM, and outputs them to the GTIA. The GTIA then drives the video signal.

     

    Probably the biggest difference I see is that the ANTIC doesn't have its own bus. Performance-wise, you just can't do much worse than that.

     

    Sure you can. How about a display controller that can't fetch anything from memory at all and must be spoon-fed by the processor?

     

    As I said: "you just can't do much worse than that". No engineer in his right mind would spoon-feed the display controller unless he a) dedicated the processor to it or b) didn't actually need the processor time. Granted, the 2600 gets close, but it was breaking new ground in making formerly expensive features inexpensive.

     

    In 40-column text mode, the Atari runs a dot clock of 7.16 MHz (chroma×2), a character clock of 0.84 MHz, a memory clock of 1.79 MHz, and the CPU also at 1.79 MHz.  Every frame will represent 29,344 memory/CPU cycles, of which something over 9,000 are consumed by the display, leaving about 20,000.  The Apple II also runs a dot clock of 7.16 MHz, but the character clock is 1.02 MHz.  The CPU clock is also 1.02 MHz, and the CPU alternates memory fetches with the display hardware.  Each frame represents 16,768 cycles, all of which are available to the CPU.

     

    Seems the Atari doesn't do too badly.  There are some real advantages to having a fixed alternation of display/processor memory access so I can't fault the Apple for its approach, but I wouldn't condemn the performance of the Atari.

     

    Mmm. Except that my argument was based on the fact that the ANTIC could have had a separate bus. This would have greatly improved performance over the existing solution.

     

    The Apple advantages are:

     

    1. The Apple gets away with 100% of its cycles vs. the 68% of the Atari. The Atari's advantage is entirely in the CPU clock. Had Apple released a higher-speed version in the '80s, the Apple II could have easily outpaced the contemporary 5200. Instead, it gave it a run for its money using tech from 1977. (The same year the 2600 was released.) Had the ANTIC been able to read its commands and data without disturbing the processor, it would have left the CPU with 100% of its time as well.

     

    2. The Apple doesn't have to render 60 frames per second. The framebuffer allows it to skip several frames, thus giving it more processing time. 30 FPS was considered silky smooth back then (still is, really), meaning that you could double your processing time to 33,536 cycles. The difference between 30 and 60 FPS would likely go unnoticed by gamers.
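    The arithmetic is trivial to check, using the cycle figure quoted above:

```python
# The Apple II figure quoted earlier: CPU cycles per displayed frame.
cycles_per_frame = 16_768

# Rendering at 30 fps while the display refreshes at 60 fps means each
# rendered frame gets two display frames' worth of CPU time.
cycles_at_30fps = cycles_per_frame * 2
print(cycles_at_30fps)
```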

     

    And why are you referring to character mode? The primary comparison is between the Apple II and the 5200. Unless I missed something, the 5200 was not booted into character mode very often.

     

    Agreed. :) The PC had an accelerated framebuffer because it was designed for "serious" home and business applications. The use of a framebuffer and character ROM greatly simplified the programming, making it much more useful for business and BASIC programming.

     

    The CGA framebuffer could only be written during hblank or vblank when using 80-column text mode. This severely limited screen update speed. Graphics mode did not have this limitation. Performance of CGA games was in fact often somewhat better than their Apple II counterparts.

     

    Indeed. It's worth noting that the Apple II was also not a "game machine" either, despite its reasonably good capabilities. Both the PC and the Apple II were often pressed into service, however. The greatest issue I took with CGA mode was that it was just plain ugly. The choice of cyan, magenta, and white did not make for very good graphics. The few games I remember playing on it, however, worked well enough. There just weren't many of them. The CGA 320x200 graphics mode did have an alternate color palette of red, green, and brownish-orange, but I can't ever remember it being used.

     

    IBM actually made a move away from supporting gaming by introducing the far superior EGA graphics adapter. The EGA adapter was capable of up to 640×350 with 16 colors from a 64-color palette. It looked great for business applications and basic graphics work. However, this resolution easily outstripped the ability of many programs to keep up with the rendering, making slow programs and screen tearing a common occurrence. Some popular games were made for this mode (e.g. EGA Trek, EGA Asteroids, and Where in Time is Carmen Sandiego), but it was otherwise ignored until the introduction of VGA and the 286 processor.

     

    unnumbered - Hack 80-column character mode so characters are two scanlines high, yielding a pseudo 160x100 16-color mode.

     

    As I remember, PC-BASIC used to be able to access this mode by setting the characters per line correctly. I was a bit disappointed when I found that this trick didn't work in the later GW-BASIC language.
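    For the curious, a hedged sketch of how that pseudo-mode addresses "pixels". The half-block glyph 0xDD and the two-bytes-per-cell text buffer layout are the standard recipe as I recall it, but the details are from memory and worth double-checking against the CGA documentation:

```python
# Pseudo 160x100: 80x25 text mode with the character height cut to 2
# scanlines, yielding 100 visible rows of cells. Every cell holds the
# left-half-block glyph 0xDD, so the attribute byte's foreground nibble
# colors the left "pixel" and its background nibble colors the right one.
# Offsets are relative to the CGA text buffer at segment 0xB800.

COLUMNS = 80

def set_pixel(buf, x, y, color):
    cell = y * COLUMNS + (x // 2)     # which character cell on screen
    attr = cell * 2 + 1               # 2 bytes per cell: glyph, attribute
    if x % 2 == 0:                    # left pixel -> foreground nibble
        buf[attr] = (buf[attr] & 0xF0) | (color & 0x0F)
    else:                             # right pixel -> background nibble
        buf[attr] = (buf[attr] & 0x0F) | ((color & 0x0F) << 4)

# Fill the "screen" with half-block glyphs, then plot one pixel.
buf = bytearray([0xDD, 0x00] * (COLUMNS * 100))
set_pixel(buf, 3, 1, 0x9)             # x=3, y=1, color 9 (bright blue)
```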

     

    SuperCat: "Even in character mode, I see no reason not to refer to it as a framebuffer device since the display was generated by clocking data out of RAM in real time with a 1:1 correspondence between RAM and screen locations." (Sorry, the board breaks if this is in quote tags)

     

    Um, no. The resolution of the screen was still 320×200 or 640×200, but the CGA card was automatically looking up the bitmaps and producing the correct multi-pixel video signal. True framebuffers are generally dumb devices that produce a signal based on data fed to them according to various timing registers. By fiddling with the timing registers (such as the undocumented VGA "Mode X") you could directly modify the resolution of the video signal.

     

    Edit: I'll respond to your other reply tomorrow. Right now I need to hit the hay.


  20. Of course, graphics mode is only wonderful if you have enough RAM to accommodate your display and enough processor/memory bandwidth to fill up the RAM quickly.

     

    A framebuffer device always has the RAM onboard unless the system uses a Shared Memory Architecture. Agreed on the memory bandwidth.

     

    Even with the restricted writing, 80x25x16-color text mode was faster than 640x200 black and white graphics and in many cases looked better too.

     

    Definitely faster since you have hardware acceleration. Technically you're still in a graphics mode, but the blitting is being done automatically by the accelerator.

     

    As for whether it looks better or not, that's a non-argument. A text display is a form of graphics mode. Whether you're blitting, using sprite overlays, or letting the hardware blit to a framebuffer is irrelevant. The results will generally look the same at the same resolution. Sun systems tend to look much better because they have high-quality software that can produce spectacular results on rather "dumb" hardware. Contrast this with the Apple Macintosh, which produces a greater number of character glyphs at high resolution rather than using the resolution to produce nicer-looking text. (You can see this mode by activating the OpenFirmware prompt.)

     

    What matters is the result, not the means.  Would you object to saying that Adventure uses a frame counter at addresses $AF and $A0 to control color flashing when the game is idle?  The 2600, after all, has no hardware frame counter unlike some other computers (the CGA has a hardware frame counter that blinks the cursor, e.g.)  So is it wrong to refer to $AF/$A0 as a frame counter?

     

    Yes, it's a frame counter done in software. But since the processor wasn't purpose-built for this specific task, you can't call the 6502 a frame counter. Just as you can't call the ANTIC a framebuffer. The ANTIC is a graphics co-processor (generally referred to as a graphics accelerator) that drives the GTIA to produce a video signal. You can write software that makes it act like a framebuffer, but it's still a "soft" implementation.
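    A software frame counter of the kind described can be sketched in a few lines. The 60 fps tick, the 16-frame period, and the palette values below are illustrative only, not Adventure's actual timing:

```python
# Nothing in the hardware counts frames; the game increments a RAM
# location once per vertical blank and derives effects from it.

frame_counter = 0

def on_vblank():
    global frame_counter
    frame_counter = (frame_counter + 1) & 0xFF   # wrap like a byte in RAM

def flash_color(palette):
    # Advance to a new palette entry every 16 frames.
    return palette[(frame_counter >> 4) % len(palette)]

palette = [0x0E, 0x86, 0xC8, 0x48]
for _ in range(60):                   # simulate one second at 60 fps
    on_vblank()
color = flash_color(palette)          # 60 >> 4 == 3 -> palette[3]
```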

     

    A frame buffer is an area of memory that is clocked out to the display in such fashion as to produce a generally-unchanging correspondence between memory locations and screen locations.

     

    Close. A framebuffer is a device that produces a generally-unchanging correspondence between memory locations and screen locations. More generally, it's a device that produces a direct correspondence between a given memory heap and the output of the video display. The memory doesn't have to be mapped to a given location. That's primarily a feature for easy addressing. Framebuffer cards exist that allow their memory to be accessed via ports, bankswitched memory locations, and other "fun" (*cough*) tricks.
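    The "correspondence" in question is nothing more than fixed address arithmetic. A minimal sketch, with arbitrary width and depth:

```python
# With a linear buffer, a pixel's screen position fixes its offset,
# whether the buffer lives in CPU address space, behind an I/O port,
# or in a bankswitched window.

WIDTH, BYTES_PER_PIXEL = 320, 1

def offset_of(x, y, pitch=WIDTH * BYTES_PER_PIXEL):
    return y * pitch + x * BYTES_PER_PIXEL

origin = offset_of(0, 0)    # 0
pixel = offset_of(10, 2)    # 2 * 320 + 10 == 650
```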

     

    A graphics accelerator is a device for shovelling large amounts of data into and within a frame buffer.

     

    Close. A graphics accelerator is a device that translates abstract graphics commands (commonly 2D or 3D vector commands) into a more displayable form. That could be framebuffer data (as with many rasterizers) or it could be a set of commands (as with the ANTIC's control of the GTIA).

     

    A video overlay generator is a device to inject an alternate source of video data for part of the frame, often obscuring data read from a framebuffer.

     

    Drop the part about the framebuffer and you've got it spot on. Keep in mind that sprite hardware is also a video overlay device. (The sprite overlays the background in the video signal.) Last I checked, there's no framebuffer driving the 2600.
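    The overlay idea is simple enough to model in a couple of lines. This is a toy per-dot multiplexer, not the TIA's actual priority logic:

```python
# Per dot, a sprite pixel (when present) replaces the background pixel
# in the outgoing signal. No framebuffer is involved on either side;
# the mux happens as the line is generated.

def mux(background_line, sprite_line):
    return [s if s is not None else b
            for b, s in zip(background_line, sprite_line)]

bg = [1, 1, 1, 1, 1]
sp = [None, 7, 7, None, None]   # sprite covers dots 1 and 2
mixed = mux(bg, sp)             # [1, 7, 7, 1, 1]
```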

     

    In its simplest meaning, and as far as the Graphics hardware engineers are concerned, a frame buffer is simply that; the video memory that holds the pixels from which the video display (frame) is refreshed.

     

    It is the memory. Not the memory CHIP, and not the device that reads data from the memory, but the memory itself. In its default text mode, the Apple II uses RAM in the region $400-$7F7 as a framebuffer. This isn't a special chip; that RAM is stored in the same chips as everything else from $0000-$3FFF.

     

    No, no it isn't. The memory is the "buffer". (Frontbuffer or backbuffer, depending on whether it's being displayed or not.) The framebuffer is the device that renders the video signal based on the buffered data. As I stated before, the memory used may not even be mapped into the CPU's memory locations.

     

    As I said previously, "framebuffer" is sometimes colloquially used to describe a generic video buffer, but its technical meaning is much different.

     

    (For what its worth, I also used to think that "framebuffer" was a loose term for "the thing that displays the graphics." It was a real eye opener to learn that specifying "framebuffer" meant I might be getting very different hardware than what I wanted.)


     

    A framebuffer is an area of RAM. It may exist within dedicated chips, or it may be borrowed from other chips. In some cases, hardware will force a particular area of memory to be used as a framebuffer, but in many cases software can control how memory is used. An area of RAM is a framebuffer when it's being used as one. When it isn't, it isn't.


     

    No, no it isn't. The memory is the "buffer". (Frontbuffer or backbuffer, depending on whether it's being displayed or not.) The framebuffer is the device that renders the video signal based on the buffered data. As I stated before, the memory used may not even be mapped into the CPU's memory locations.

     

    As I said previously, "framebuffer" is sometimes colloquially used to describe a generic video buffer, but its technical meaning is much different.
