Everything posted by jbanes

  1. jbanes

    5200 vs. 7800

    Which is why I keep repeating, over and over, that this is a colloquial definition that is not precisely correct. According to this definition, rendering a frame of animation to memory is a "framebuffer". But technically, this is incorrect. A framebuffer is a device that uses buffered pixel data to drive a display. Period, end of story. That's a framebuffer. I grasp your point just fine. I keep having to tell you that your definition of framebuffer, while acceptable in many non-technical circles, is skewed. You don't seem to want to accept that. The Linux Framebuffer driver is a "Virtual Framebuffer" similar to the X Virtual Framebuffer. It emulates the hardware, allowing programs designed for a real framebuffer to operate. From your link: You'll note that nowhere in your link does it say, "a framebuffer is memory". It clearly states that the driver is a software emulation of a hardware device. It has been expanded to provide flat memory emulation for the PC VESA modes, which are usually bank switched. From the Xvfb man page: Did you catch that? It EMULATES a dumb framebuffer using virtual memory. In case that didn't sink in, let me repeat it. It EMULATES a dumb framebuffer using virtual memory. If it were just a memory backing as you say, Xvfb would be a framebuffer rather than emulating one. The problem is that at some point along the line, people didn't understand the technical nature of a framebuffer and started referring to buffer memory as the framebuffer itself. This is incorrect, even though it's an error that's often repeated.
  2. jbanes

    5200 vs. 7800

    Way to avoid the challenge, there. If I really am as witless as you make me out to be, then why not take me up? The gauntlet is on the ground. Are you going to pick it up? I'll explain (if you're actually interested in hearing). I've said many, many, many times that the ANTIC can emulate a framebuffer. The ANTIC isn't designed as a framebuffer (rather, more of a natural evolution of the playfield design of the Atari 2600), but it can be programmed to act as one. Once it's programmed to act as one, it is "emulating" a framebuffer. From the dictionary: 3. Computer Science. To imitate the function of (another system), as by modifications to hardware or software that allow the imitating system to accept the same data, execute the same programs, and achieve the same results as the imitated system. 1. To take as a model or make conform to a model: copy, follow, imitate, model (on, upon, or after), pattern (on, upon, or after). Why you find that to be such a hard concept to grasp, I do not understand. The end result is the same: You have a working framebuffer, sans the drawbacks of the ANTIC/GTIA system. Yet you insist that I'm "wrong". From there, you and Supercat have come to maintain that a "framebuffer" is (precise definition here) nothing more than a buffered frame in memory. By that definition, you could have a machine without a video output and still have a framebuffer. I maintain that the precise definition of "framebuffer" is a device that drives a display based on a buffered grid of "Picture Element" (pixel) data. So, either we can agree that we just have different terminology (and you can stop casting insults at people), or we can complete a challenge to show who is "correct". What say you?
  3. jbanes

    5200 vs. 7800

    To hell with it. I can't sleep now anyway. So, let's get this straight. From the moment this thread started, you've been nothing but negative and abrasive toward the discussion. Your comments directed at me have been nothing but inflammatory, and wanting in any real technical data. All the while you claim you're "trying to mix in a little information" of which you've provided zero. You got supercat to charge in on arguing semantics he doesn't understand either, and you haven't given in to a single point ANYONE has made in this thread. Now you have the gall to try to start a flamewar by casting insults? I'll tell you what. How about we settle this right here and right now? We'll solve the matter of the framebuffer, permanently. To recap the overall situation, you contest on the basis of (lemme see here), 1 stub of an article, 4 non-authoritative sources, and 3 sources that don't disagree with a thing I've said (I'll let you figure out which is which), that the "framebuffer" as a device is an invention of Sun Microsystems' marketing, and that everyone else in the world only uses the term to refer to any memory anywhere that used to hold a frame of data, regardless of whether or not it's driving a monitor. Is that correct? Well, let's have a contest to prove it. If you can prove that I'm not right in a fixed amount of time, you win. If you give up and I can't prove I'm correct by the end of that time, you win. What do you say? Are you up for a simple challenge that will settle this matter once and for all? Shall we match wits and see whose are "fluffier"? Supercat, how about you? You in? Edit: Clarified framebuffer definition as per Bryan's post.
  4. jbanes

    5200 vs. 7800

    Nice to meet you. It sounds like you created a Shared Memory Architecture framebuffer through the DMA controller. Since I'm assuming that's its designed purpose, I see no issue with calling it a framebuffer. The primary difference is that you're driving an LCD display (a fairly digital process, and nowhere near as finicky on the timing) rather than an electron beam. The concept of a framebuffer is to buffer the data for driving the electron beam in much the same way you use a FIFO buffer to drive a serial port. The biggest difference is that when framebuffers allow random access to the video buffer, they become useful for emulating the capabilities of sprite hardware. For an LCD display, driving the display means that there's no need to worry about the timing of a physical beam. (No HSync, no VSync, no unviewable area, etc.) You just cycle through the pixels one by one, preferably fast enough to prevent them from fading. Most LCDs have a driver chip that does this automatically, allowing for a variety of signals to be sent to the display. Some even run programs on an internal microprocessor so that they can interpret the signals into something usable by the display. (e.g. Most LCD TVs emulate the NTSC signal for backward compatibility. They have no need for such complex signals, though.) Indeed. Otherwise it's just referred to as the "video buffer", "display buffer", or "graphics buffer". These terms are pretty much interchangeable, though the latter is often applied to per-pixel modes while "display buffer" is often used to refer to text mode data. Can the ANTIC write to RAM at all? It might be useful in some cases if it could but I know of no such ability. I'll grant you that I'm far from an expert on the ANTIC, but as far as I know, no. Why does it need to? (i.e. What are you getting at?) It takes commands in from RAM, and outputs them to the GTIA. The GTIA then drives the video signal. Sure you can. 
How about a display controller that can't fetch anything from memory at all and must be spoon-fed by the processor? As I said: "you can't get much worse than that". No engineer in his right mind would spoon-feed the display controller unless he a) dedicated the processor or b) didn't actually need the processor time. Granted, the 2600 gets close, but it was breaking new ground in making formerly expensive features inexpensive. Mmm. Except that my argument was based on the fact that the ANTIC could have had a separate bus. This would have greatly improved performance over the existing solution. The Apple advantages are: 1. The Apple gets away with 100% of its cycles vs. the 68% of the Atari. The Atari advantage is entirely in the CPU clock. Had Apple released a higher speed version in the 80's, the Apple II could have easily outpaced the contemporary 5200. Instead, it gave it a run for its money using tech from 1977. (The same year as the 2600 was released.) Had the ANTIC been able to read its commands and data without disturbing the processor, it would have left the CPU with 100% of its time as well. 2. The Apple doesn't have to render 60 frames per second. The framebuffer allows it to skip several frames, thus giving it more processing time. 30 FPS was considered silky smooth back then (still is, really), meaning that you could double your processing time to 33,536 cycles. The difference between 30 and 60 FPS would likely go unnoticed by gamers. And why are you referring to character mode? The primary comparison is between the Apple II and the 5200. Unless I missed something, the 5200 was not booted into character mode very often. The CGA framebuffer could only be written during hblank or vblank when using 80-column text mode. This severely limited screen update speed. Graphics mode did not have this limitation. Performance of CGA games was often in fact somewhat better than Apple II counterparts. Indeed. 
It's worth noting that the Apple II was not a "game machine" either, despite its reasonably good capabilities. Both the PC and the Apple II were often pressed into service, however. The greatest issue I took with CGA mode was that it was just plain ugly. The choice of cyan, magenta, and white did not make for very good graphics. The few games I remember playing on it, however, worked well enough. There just weren't many of them. The CGA 320x200 graphics mode did have an alternate color palette of red, green, and brownish-orange, but I can't ever remember it being used. IBM actually made a move away from supporting gaming by introducing the far superior EGA graphics adapter. The EGA adapter was capable of up to 640×350 with 16 colors from a 64 color palette. It looked great for business applications and basic graphics work. However, this resolution easily outstripped the ability of many programs to keep up with the rendering, making slow programs and screen tearing a common occurrence. Some popular games were made for this mode (e.g. EGA Trek, EGA Asteroids, and Where in Time is Carmen Sandiego), but it was otherwise ignored until the introduction of VGA and the 286 processor. As I remember, PC-BASIC used to be able to access this mode by setting the characters per line correctly. I was a bit disappointed when I found that this trick didn't work in the later GW-BASIC language. SuperCat: "Even in character mode, I see no reason not to refer to it as a framebuffer device since the display was generated by clocking data out of RAM in real time with a 1:1 correspondence between RAM and screen locations." (Sorry, the board breaks if this is in quote tags) Um, no. The resolution of the screen was still 320×200 or 640×200, but the CGA card was automatically looking up the bitmaps and producing the correct multi-pixel video signal. True framebuffers are generally dumb devices that produce a signal based on data fed to them according to various timing registers. 
By fiddling with the timing registers (such as the undocumented VGA "Mode X") you could directly modify the resolution of the video signal. Edit: I'll respond to your other reply tomorrow. Right now I need to hit the hay.
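To put rough numbers on the frame-budget argument above, here's a quick sketch. The clock rates and the 68% figure are treated as stated in the post; they are illustrative assumptions, not measured values.

```python
# Back-of-the-envelope frame budgets, per the discussion above.
# Assumed figures: ~1.79 MHz for the 5200's 6502 with ~68% of cycles
# left after ANTIC DMA, and ~1.02 MHz for the Apple II with no DMA stealing.

def cycles_per_frame(clock_hz, fps, cpu_share=1.0):
    """CPU cycles available to game logic in one displayed frame."""
    return int(clock_hz / fps * cpu_share)

atari_60 = cycles_per_frame(1_790_000, 60, cpu_share=0.68)
apple_60 = cycles_per_frame(1_020_000, 60)
apple_30 = cycles_per_frame(1_020_000, 30)  # a framebuffer lets you skip frames
```

Dropping from 60 to 30 FPS exactly doubles the per-frame budget, which is the point being made about the framebuffer's flexibility.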
  5. jbanes

    5200 vs. 7800

    A framebuffer device always has the RAM onboard unless the system uses a Shared Memory Architecture. Agreed on the memory bandwidth. Definitely faster since you have hardware acceleration. Technically you're still in a graphics mode, but the blitting is being done automatically by the accelerator. As for whether it looks better or not, that's a non-argument. A text display is a form of graphics mode. Whether you're blitting, using sprite overlays, or allowing the hardware to blit to a framebuffer or not is irrelevant. The results will generally look the same at the same resolution. Sun systems tend to look much better because they have high quality software that can produce spectacular results on rather "dumb" hardware. Contrast this with the Apple Macintosh which produces a greater number of character glyphs at high resolution rather than using the resolution to produce nicer looking text. (You can see this mode by activating the OpenFirmware prompt.) Yes, it's a frame counter done in software. But since the processor wasn't purpose built for this specific task, you can't call the 6502 a frame counter. Just as how you can't call the ANTIC a framebuffer. The ANTIC is a graphics co-processor (generally referred to as a graphics accelerator) that drives the GTIA to produce a video signal. You can write software that makes it act like a framebuffer, but it's still a "soft" implementation. Close. A framebuffer is a device that produces a generally-unchanging correspondence between memory locations and screen locations. More generally, it's a device that produces a direct correspondence between a given memory heap and the output of the video display. The memory doesn't have to be mapped to a given location. That's primarily a feature for easy addressing. Framebuffer cards exist that allow their memory to be accessed via ports, bankswitched memory locations, and other "fun" (*cough*) tricks. Close. 
A graphics accelerator is a device to translate abstract graphics commands (commonly 2D or 3D vector commands) into a more displayable form. That could be framebuffer data (as with many rasterizers) or it could be a set of commands (as with the ANTIC's control of GTIA). Drop the part about the framebuffer and you've got it spot on. Keep in mind that sprite hardware is also a video overlay device. (The sprite overlays the background in the video signal.) Last I checked, there's no framebuffer driving the 2600. It is the memory. Not the memory CHIP, and not the device that reads data from the memory, but the memory itself. In its default text mode, the Apple II uses RAM in the region $400-$7F7 as a framebuffer. This isn't a special chip; that RAM is stored in the same chips as everything else from $0000-$3FFF. No, no it isn't. The memory is the "buffer". (Frontbuffer or backbuffer, depending on whether it's being displayed or not.) The framebuffer is the device that renders the video signal based on the buffered data. As I stated before, the memory used may not even be mapped into the CPU's memory locations. As I said previously, "framebuffer" is sometimes colloquially used to describe a generic video buffer, but its technical meaning is much different. A framebuffer is an area of RAM. It may exist within dedicated chips, or it may be borrowed from other chips. In some cases, hardware will force a particular area of memory to be used as a framebuffer, but in many cases software can control how memory is used. An area of RAM is a framebuffer when it's being used as one. When it isn't, it isn't. No, no it isn't. The memory is the "buffer". (Frontbuffer or backbuffer, depending on whether it's being displayed or not.) The framebuffer is the device that renders the video signal based on the buffered data. As I stated before, the memory used may not even be mapped into the CPU's memory locations. 
As I said previously, "framebuffer" is sometimes colloquially used to describe a generic video buffer, but its technical meaning is much different.
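To make the buffer-versus-device distinction above concrete, here is a toy model in Python. Everything here, class and method names included, is illustrative rather than any real driver API: the buffer is nothing but memory, and the "framebuffer device" is the part that scans it out into a signal.

```python
# Toy model of the argument above: the *buffer* is just memory; the
# *framebuffer device* is what scans that memory out as a video signal.

class FramebufferDevice:
    def __init__(self, memory, width, height):
        self.memory = memory          # plain RAM, possibly shared with the CPU
        self.width = width
        self.height = height

    def scan_out(self):
        """One 'refresh': read pixels in raster order, yield scanlines."""
        for y in range(self.height):
            row = self.memory[y * self.width:(y + 1) * self.width]
            yield bytes(row)          # stands in for driving the display

vram = bytearray(4 * 3)               # the buffer: nothing but memory
vram[0] = 255                         # the CPU pokes a pixel
signal = list(FramebufferDevice(vram, 4, 3).scan_out())
```

Note that `vram` on its own does nothing; only the device that walks it in raster order turns it into a picture, which is exactly the distinction being argued.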
  6. jbanes

    5200 vs. 7800

    To be clear, "framebuffer" can be correctly used to mean a RAM location holding screen data. More correctly, however, it refers to a hardware device that drives a monitor using pixel data from RAM. Such devices were once valued for their simplicity, despite their expense. As in: "This Sun workstation is equipped with a simple framebuffer capable of driving the monitor at 800x600x256." A RAM location that stores a screenful of data is properly referred to as a "backbuffer" or "frontbuffer". As in: "I wrote the screen to the backbuffer, then flipped it to be the frontbuffer during the Vertical Sync." This is how I'm referring to it. The ANTIC is not a true framebuffer device. It actually has more in common with modern 3D vector hardware in that it is a separate processor that queues up commands for just-in-time processing. Probably the biggest difference I see is that the ANTIC doesn't have its own bus. Performance-wise, you just can't do much worse than that. Agreed. The PC had an accelerated framebuffer because it was designed for "serious" home and business applications. The use of a framebuffer and character ROM greatly simplified the programming, making it much more useful in business and BASIC programming. IBM made a half-hearted attempt to customize the PC for "family use" (i.e. games) with the PCjr, but the implementation sucked so bad that it was never taken seriously. (Ah, I still remember that stupid "grab the key" game that could be found by hitting the right function key when the built-in BASIC ROM started. And let's not forget how it would freeze if you hit a key sometime within the minute and a half while the memory was being checked! Stupid PCjr.) It's worth noting, however, that the PC didn't have a true framebuffer device either. The framebuffer had a character blitter added to it that allowed the hardware to produce text. This stands in direct contrast to the Unix-machine framebuffers which drove everything through software blitting. 
The PC had different graphics modes similar to the ANTIC to help programs produce text screens. Many Unix machines, OTOH, had to draw everything in software. The advantage to the full framebuffer was that there was no such thing as graphics modes. "Text Mode" was just an operating system driver that converted ASCII codes to bitmaps on the screen. You could call a graphics library at any time to produce pretty pictures right on top of the console. If X-Windows took over, all it did was disable the console driver and begin writing to the frontbuffer. (I'm somewhat simplifying this. X-Windows often switches a few registers to change the parameters of the video signal so that a higher resolution can be used.) That's why if you've ever seen a Sun machine, the console looks so damn beautiful. Pure white background with black anti-aliased text. *hugs Ultra 10 box* So beautiful. *sniff* Of course, for video games the differences don't really matter. Save for the fact that the PC is a pain in the ass to code for. Oh, I love it. "You're wrong because you're right! See?" Let me ask you this: If you have a port that allows you to use the CPU to write a value to the monitor cable, is it a framebuffer device? No? What if you implement a program that produces a video signal with the correct timing. Is it a framebuffer then? Sort of. Your program is emulating (or simulating if you prefer) the actions of a common piece of hardware. But it wasn't designed to perform this role, nor is it the default state of the hardware. That's why I refer to it as an "emulated framebuffer". And just like with the ANTIC, such a device would tie up the system's resources as the picture is being drawn. No, the Atari can be programmed to do the same thing. That's software my friend, like it or not. Ultimately you're wrong. There are frame buffers, then there are graphics accelerators, and then there are video overlay generators. A modern graphics card tends to incorporate all three. 
The frame buffer is the destination for the commands given to the graphics accelerator, which is overridden by the video overlay device to produce effects like a mouse cursor. In engineering these terms have very specific meanings. They used to have different meanings to consumers as well. Today they tend to be used interchangeably outside of engineering, but we're all engineers here, right? Right? *crickets chirp* Hmm. From the Sun framebuffer FAQ: <southern-lawyer>Your Honor, Ah rest mah case.</southern-lawyer> (For what it's worth, I also used to think that "framebuffer" was a loose term for "the thing that displays the graphics." It was a real eye opener to learn that specifying "framebuffer" meant I might be getting very different hardware than what I wanted.)
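The "text mode is just a software driver over the framebuffer" idea above can be sketched like so. The 8x8 font byte pattern here is made up for illustration; a real machine would pull these rows from a character ROM.

```python
# Sketch of a software text-mode driver: look each character up in a
# font table and blit its bitmap into a flat 1-bit-per-pixel buffer.
# The FONT glyph below is a made-up stand-in for real character ROM data.

FONT = {"A": [0x18, 0x24, 0x42, 0x7E, 0x42, 0x42, 0x42, 0x00]}  # 8 rows, 1 byte each

def blit_char(fb, fb_width, ch, col, row):
    """Expand one character cell into pixels at (col, row) in character cells."""
    for dy, bits in enumerate(FONT[ch]):
        for dx in range(8):
            x, y = col * 8 + dx, row * 8 + dy
            fb[y * fb_width + x] = (bits >> (7 - dx)) & 1

fb = bytearray(64 * 16)   # a tiny 64x16 pixel "framebuffer"
blit_char(fb, 64, "A", 0, 0)
```

Since the console is just software writing pixels, a graphics library can draw "right on top of" it by writing into the same `fb`, which is the flexibility being praised.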
  7. jbanes

    5200 vs. 7800

    Indeed. Page flipping is also often used as a feature in framebuffer hardware that's got way more onboard memory than it needs for a single frame. Obviously it's a lot faster than double buffering, which requires that the full screen be dumped to vidmem after being constructed in main memory. The cool part about the ANTIC's approach, however, is that it can not only perform a page flip, but a complex display list (I love that name) could actually interlace various line buffers to produce different neat effects on a shoestring memory AND CPU budget. That's a rare thing in my experience. Usually you trade one for the other. Of course, after all the *#$@! "fun" I had with SuperVGA bank switching, I never thought I'd actually be praising the ability to have a disjointed video memory map.
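A minimal sketch of the flip-versus-copy tradeoff described above. The `front` variable stands in for the scan-out base-address register that page-flipping hardware lets you repoint; names and sizes are illustrative.

```python
# Page flipping vs. double buffering, modeled in miniature.

WIDTH, HEIGHT = 320, 200
pages = [bytearray(WIDTH * HEIGHT), bytearray(WIDTH * HEIGHT)]
front = 0                       # which page the (imaginary) scan-out hardware reads

def flip():
    """Page flip: O(1) -- just repoint the scan-out address."""
    global front
    front = 1 - front

def blit_backbuffer(backbuffer):
    """Double buffering: O(n) -- dump the whole constructed frame into vidmem."""
    pages[front][:] = backbuffer

pages[1][0] = 7                 # draw into the off-screen page...
flip()                          # ...then make it visible in one cheap step
```

The flip costs one register write regardless of resolution, while the blit cost grows with the frame size, which is why flipping needs that "more onboard memory than a single frame".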
  8. jbanes

    5200 vs. 7800

    Ok, fine. How? I'm looking at the ANTIC docs right now. Taking Mode 8 (mentioned by Jaybird previously) as an example, "This mode, as other non-character graphics modes do, uses data in the display buffer as a bit map to be displayed. A command to display in mode 8 will cause the ANTIC chip to read the next 10 bytes in the display buffer." Each mode after that specifies the same thing, only with different pixel sizes and color counts. Now I have no doubt that you can configure a display list to emulate the framebuffer. That's still not a true framebuffer, though it's pretty darn close. The biggest problem with this scheme is that you're locking out your CPU every time the graphics hardware needs to be fed data. (Most true framebuffers have their own memory to generate the signal off of. This frees the processor to do other things. The IBM PCjr differed in that it used a Shared Memory Architecture, but as far as I know this didn't impact the main processor.) As Bryan stated, this scheme results in a significant drop in processing power. I originally thought that the GTIA must be fed on a different bus, but I was corrected on that one. Which means that there are distinct disadvantages to the ANTIC emulation of a framebuffer.
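The "read the next N bytes per mode line" behavior quoted above can be sketched as a tiny display-list walk. The byte counts and mode numbers here are simplified placeholders, not the real ANTIC instruction encoding.

```python
# Toy display-list walk: each entry names a mode, and the scan-out
# fetches that mode's byte count from the display buffer in sequence.
# MODE_BYTES values are illustrative (e.g. "mode 8 reads the next 10 bytes").

MODE_BYTES = {8: 10, 15: 40}

def scan_display_list(display_list, display_buffer):
    """Return the byte slice each mode line would fetch during scan-out."""
    pos, lines = 0, []
    for mode in display_list:
        n = MODE_BYTES[mode]
        lines.append(bytes(display_buffer[pos:pos + n]))
        pos += n
    return lines
```

Every one of those fetches happens on the shared bus during the frame, which is the CPU lock-out problem the post describes.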
  9. jbanes

    5200 vs. 7800

    Dude, chill out. Again, we have a failure to communicate here. Do you realize that the "Apple routines" are just software? There's nothing magical about them. They're just 6502 code. Yes, I do realize this. I was stating that the standard Apple line drawing routines (at least the ones I mentioned, see the paragraph below) are too primitive to implement Bresenham's. They draw a line either vertically or horizontally. As I said, Chopper Command probably uses custom drawing routines for speed. It's one of the first things that game programmers did back in the day. I was digging a bit more through the reference material, and it looks like you're right. There's an HPLOT command in the AppleSoft commands for Hi-Res mode. I was looking at the Integer BASIC command set as an example. Thanks for pointing that out. Two points: 1. Define "real-time". Like I said, with a framebuffer you can drop your framerate and still produce a highly playable game. 1MHz is definitely not a lot, but to rotate a single sprite that isn't that bad. If you're running at 10FPS, you have about 100,000 cycles per frame. Since you're not forced to spend the majority of that updating the screen, you're left with a LOT of time to do your work in comparison to a 2600/5200/7800. So rotating a single line-art sprite *is* possible. 2. Out of this World was an all-polygon game. Without some sincerely bizarre magic, you can't render that many polys on a 1MHz system. Hell, most Apogee and id games could barely run on a 10MHz PC with full 2D graphics. And those guys were wizards at this stuff. That doesn't mean that you can't compromise. Might I remind you of... (wait for it) Ballblazer? Do you realize that I pointed to a hypothetical version of Gravitar? Of course I know that Chopper Command used a set of bitmaps! I chose a different situation because I was trying not to give the impression that I was saying that Chopper Command used line art for the chopper. 
I've already named the primary fill areas on the Choplifter screen: The scoreboard, the ground, the stars, etc. I have questions about the base itself, and about the rotor on the chopper. I have no doubt that the Chopper, the People, the Tanks, and the Planes are buffered sprites. The general topic here is advantages of 5200 vs. the 7800. The 5200 gained a lot of Apple ports because it was able to emulate a framebuffer. The 5200 does NOT have a true framebuffer, as the main processor has to set up a blit of each partial line of data via a display list at the time of the screen rendering. (Thank you to those who pointed me toward the ANTIC docs. They were very informative.) A true framebuffer generates a signal off of video memory, and will do so independent of instructions from the processor. This is probably why the 5200 Choplifter flickers while the Apple II version doesn't. (At least in the emulators. I have no reason to believe that the real hardware is any different, though.) Now, will you please calm down?
  10. jbanes

    5200 vs. 7800

    I used to do quite a bit of assembly on the IBM PC, and I have tried my hand at the 6502. (It's kind of fun to do, BTW. The instruction set is so simple in comparison to most other processors.) Not the Apple routines. These are too primitive for Bresenham's algo. They draw only horizontal or vertical lines. Which I'm not saying that Choplifter is using, but probably something similar, perhaps custom. He needs a fill routine for several areas on his screen, plus point routines to create the stars. If this was an Intel, I'd use a REP STOSW loop to blit the fill as fast as possible. Sadly, you can't do that sort of thing on a 6502, and have to resort to a standard loop or some sort of hardware acceleration. On most Ataris I've seen, this isn't such a big problem as each byte of data is used to produce 8 or 4 pixels of 1-bit or 2-bit color. I confess to not knowing all the details about how the Apple II framebuffer worked (other than that it's worse than programming bank-switched SuperVGA), so I wouldn't quite know how the developer would go about writing the high-performance line, point, and fill routines, but the concepts are the same. You have a starting point and an ending point, and the routine performs a fill between them. Taking the Atari 2600 and 7800 as examples, you can't really perform that same sort of fill without some trickery. A vertical line is not a vertical line, but rather a blip on each line that needs to be rendered. This makes programming these systems rather interesting since you don't have the memory to simply pre-compute the values, then pump them to the screen when the time comes. (Side Question: Does anyone know if the SuperCharger games used framebuffer emulation? With an extra 6K of RAM to work with, it strikes me that they could have done much of the playfield rendering in memory during the Vertical Sync. The graphics could have been pumped directly from memory before and during each scanline. But I'm getting off topic.) 
If you want high performance, you don't use Bresenham's. At least not back then. One can get away with a lot on modern computers, but back then you tried to use something faster. Even Bresenham's, however, can be much faster in specific circumstances. For example, if you wanted to port Gravitar to a system, it's going to be a LOT faster to rotate only the end points and draw the sprite line by line rather than rotating each individual pixel. This is effectively the method used in most early 3D engines. Rather than rotate and scale each pixel in the texture, it was far faster to rotate the ends and use a line drawing algo. Of course, Affine Texture Mapping wasn't exactly the best looking, but it was screaming fast. Certainly. Sprite hardware will always be faster than a frame buffer. The primary advantage to a frame buffer is that you are not limited (graphics-wise) by the hardware. So if you need to draw a highly detailed background with a large number of foreground objects (such as Microgammon SB), you're not limited by a specific number of sprites or the auto-expanding of the playfield registers/display commands. The Apple II, as I understand it, was actually clocked slower than even the Atari 2600. As you said, it's amazing that it was ever able to do an action game like Choplifter. Of course, with a framebuffer you do have one major advantage: Framerate. The Ataris forced programs to keep up the 60Hz signal no matter what. With a framebuffer, you can drop to 10FPS and the hardware will make sure that the screen doesn't flicker.
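The "rotate only the endpoints" trick described above, sketched with a textbook Bresenham line. This is a generic modern implementation for illustration, not period 6502 code.

```python
import math

def bresenham(x0, y0, x1, y1):
    """Standard integer Bresenham: all pixels on the line (x0,y0)-(x1,y1)."""
    points = []
    dx, dy = abs(x1 - x0), -abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx + dy
    while True:
        points.append((x0, y0))
        if (x0, y0) == (x1, y1):
            break
        e2 = 2 * err
        if e2 >= dy:
            err += dy; x0 += sx
        if e2 <= dx:
            err += dx; y0 += sy
    return points

def rotate_endpoint(x, y, angle, cx=0, cy=0):
    """Rotate ONE point about (cx, cy) -- done per endpoint, not per pixel."""
    c, s = math.cos(angle), math.sin(angle)
    return (round(cx + (x - cx) * c - (y - cy) * s),
            round(cy + (x - cx) * s + (y - cy) * c))
```

For a line-art sprite, two sin/cos rotations plus one integer line walk replace a trig computation per pixel, which is the whole speed argument.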
  11. You're probably right, but it's fun to kick around ideas like this. See if you can get people thinking outside the box, as it were. Looks like at the moment we've got a three-way tie between "Yay", "Nay", and "Huh?" We'll have to see if a consensus emerges or if the idea is just too weird.
  12. jbanes

    5200 vs. 7800

    Nah, just trying to communicate. I'm not always successful, I'm afraid. Like I said, I called it "vector-like" for want of a better term. Believe it or not, I am listening to what people are saying, and I am learning new things. Sadly, I'm not always good at making that clear. My apologies if my failure to communicate is causing you any grief.
  13. That's not true of all the games, however. Take Shark! Shark! as an example. Simple, straightforward, and easy to play. Similarly, Astrosmash, Atlantis, Beauty and the Beast, and many others are easy to pick up and play. These could easily encourage users to read the manual for more complex games like Star Strike. Also, you may be underestimating the difficulty in getting started with a VCS clone. It might have been obvious in the late 1970's to use the reset button to start a game, but in 2006 it often confuses players. Similarly, games like Radar Lock, Quadrun, and Fatal Run require manuals to understand their complex controls or gameplay. Indeed. I was thinking about throwing one of those promo images for the overlays onto the box mockup I did. I don't see any reason why the overlays for the bundled games can't be included. They're not that expensive to manufacture. Hrumph. You guys are no fun.
  14. jbanes

    5200 vs. 7800

    Who the heck is talking about shape tables? I'm talking about basic drawing commands. You know:

    PLOT x,y - Draws a dot at location x,y
    HLIN x1,x2 at y - Draws a horizontal line from x1,y to x2,y
    VLIN y1,y2 at x - Draws a vertical line from x,y1 to x,y2

    Various points are filled in using routines like these, such as the stars, ground, and score table. Modern libraries (and even several on other systems at the time) have much more complex routines that allow for diagonal lines, filled primitives, and other fun features. (Which again, are vector in nature, but are rasterized to the screen.) So the stars are bitmaps? The scoreboard? The ground? I find that highly unlikely. Like I said, it's not true vector graphics. But I don't know what else to call it in comparison to the full sprite versions of the Sega and 7800 versions.
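For comparison, the three primitives above are trivial to express over a plain pixel buffer. This is a hypothetical Python sketch of the same interface, not the actual ROM routines:

```python
# PLOT / HLIN / VLIN over a simple 2-D pixel buffer (illustrative names).

WIDTH, HEIGHT = 40, 48
screen = [[0] * WIDTH for _ in range(HEIGHT)]

def plot(x, y, color=1):                 # PLOT x,y
    screen[y][x] = color

def hlin(x1, x2, y, color=1):            # HLIN x1,x2 at y
    for x in range(x1, x2 + 1):
        screen[y][x] = color

def vlin(y1, y2, x, color=1):            # VLIN y1,y2 at x
    for y in range(y1, y2 + 1):
        screen[y][x] = color
```

Stars, ground, and scoreboard are all expressible as calls like these, which is why drawing them as stored bitmaps would be the unlikely choice.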
  15. I want the FB3 to be... ...An Intellivision! ... Um, actually the FB4 since the FB3 is probably finalized at this point... You can pick your jaws up off the floor now. I haven't quite lost it. (Yet.) Just hang with me for a moment while I explain. You see, while Atari has a large number of classical holdings, they are really only well known for one of them. That one is the Atari 2600. The problem is that they've already used up their 2600 card with the Flashback 2.0. From here on out, Atari is going to be fighting an uphill battle in getting their new, non-2600 Flashback units accepted. Hopefully they'll be very cool and will go over well, but each generation is going to have to be more technologically sophisticated. Even doing an 8-bit computer as has been suggested would be much more difficult than the original 2600-on-a-chip. (Not to say that they can't do it. I have faith in Curt and his team.) To make matters worse, any attempts to create a Flashback 5200 or 7800 will be hampered by the fact that these consoles' libraries are mostly composed of Arcade ports for which Atari may or may not have the rights. What's needed is a simple hardware platform with an abundance of games that are not tied up in complex licenses. Now consider something for a moment. IntellivisionLives.com has rights to hundreds of games that are not encumbered by third-party licenses. They've got the software, they've got the original team of Intellivision developers, and they've got everything except the resources to build Intellivision hardware. Their attempts at a PNP system to date have been poor, and have completely missed any sort of attraction the system might have to consumers. As far as most people are concerned, it's just another gamepad with lousy games. Atari, OTOH, has the resources to build the hardware. They also have the "Flashback" name, which is well on its way to becoming its own brand. What they need are unencumbered games. 
So if Atari enters into a deal with Intellivision, both companies win! My friends, I give you: The Intellivision Flashback!

Just imagine the library this thing could have:

    Shark! Shark!
    Star Strike
    Minotaur
    Astrosmash
    Atlantis
    Beauty and the Beast
    Bowling
    Chip Shot Super Pro Golf
    Diner
    Space Spartans
    White Water!
    Dragonfire
    Demon Attack
    Stampede
    World Series Baseball

And many, many other cool games! All packed in the stylish and sleek form factor of the original Intellivision. Personally, I think it would sell like hotcakes.

Minor Edit: Fixed some typos

Edit: Looks like Curt has turned his attention to making an actual Intellivision Flashback! So if we can convince Keith, the FB4 may very well be an Intellivision console!
  16. That's why Infotari *cannot* be that old Atari. Anyone carrying the Atari name must be a new company, with new ideas, but leveraging the old name to market those ideas.

    So far the Flashback 2.0 is the smartest move that Infotari has made yet. It has done well for the company, and has paved a road forward that I feel they would be wise to follow. The problem is that they're going to have to get creative with this road. There are only so many old systems to revive. I think that Infotari would do well to investigate some of the holdings they inherited with the Atari brand, and perhaps use 20/20 hindsight to bring previously unreleased concepts to market.

    Take the holographic handheld as an example. The implementation from back in the day just wouldn't work now. But nothing stops Infotari from developing a new device that builds on that technology. Imagine, for example, a handheld gaming system that replaced its screen with every cartridge. The cartridge itself would consist of the transparent hologram, LCD pictures, and coloring over the LCD pictures. The coloring would be lit up by the unit's backlight, giving the illusion of a very colorful handheld game. Priced right, such a device could do well in today's market.

    Another idea is a laser-projector, vector-graphics console. Can you imagine reliving Asteroids, Tempest, Gravitar, and other vector favorites, but projected directly onto a large wall instead of a tiny Vectrex screen? The technology exists, and is currently in use by hobbyists.

    Basically, Infotari can have a very interesting future ahead of them. But they're going to have to play this differently than everyone else, and follow the niche they've created.
  17. jbanes

    5200 vs. 7800

    *sigh* I stated "vector-like" and "vector-style" several times before I dropped the moniker. It should be obvious what I'm talking about. Basically, the 5200 version appears to emulate the Apple II's use of line, box, and other primitives that can be achieved on most systems via their basic graphics routines. The routines are actually vector in nature, but are rasterized to a framebuffer. The graphics work on the 5200 because the 5200 has enough memory to emulate a framebuffer, whereas the 7800 doesn't. I *believe* that the 7800 could perform the same graphical feats, save for the fact that it lacks the memory. That's part of what makes it so much easier to draw bitmap mountains in the background rather than twinkling stars. (Like in the original.)

    People who make stupid statements like this are ALWAYS doing themselves a disservice. This is the exact same thinking that produces the "5200 is a waste of plastic and solder" line. Neither one is true, and you ought to be ashamed of yourself.

    The truth of the matter is quite simple. The NES beat the 7800 technologically in the same way that the 7800 "beat" the 5200. Like we've been discussing, the 5200 had more memory, more powerful controllers (even if the non-centering stick sucked), and a few other points in its favor. It's generally beaten out by the 7800, however, by the fact that the 7800 can produce more sprites that look better, and more complex playfields. In the same way, the Nintendo beats out the 7800 not on the number of sprites, but on its "cheap trick" of providing a tiling background and more earthy color tones. In addition, the Nintendo also had far superior sound and music ability, in a time when gamers were not used to background music in their home consoles. These factors combined to make a "superior" console to the 7800. Sure, the 7800 could push more sprites, but that doesn't help anyone when you have no need to render so many.

The Nintendo's sprite capability was sufficient for most games, leaving the rest of its features to carry the day. Combine this with the superior artistry demonstrated in the games from Japan, and you will find that the 7800 never stood a chance.
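A quick back-of-the-envelope sketch of the memory argument (the mode dimensions are illustrative rather than a specific chip mode; the RAM totals, 16K for the 5200 and 4K for the 7800, are the commonly cited figures):

```python
def framebuffer_bytes(width, height, bits_per_pixel):
    """Bytes needed to buffer one full frame of pixel data."""
    return width * height * bits_per_pixel // 8

# Two typical 8-bit-era bitmap modes (illustrative numbers):
hires = framebuffer_bytes(320, 192, 1)      # 1 bit/pixel  -> 7680 bytes
fourcolor = framebuffer_bytes(160, 192, 2)  # 2 bits/pixel -> 7680 bytes

# ~7.5K fits comfortably in the 5200's 16K of RAM, but a full bitmap
# of this size cannot fit in the 7800's 4K -- the memory gap at issue.
print(hires, fourcolor)
```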
  18. Not a chance. That show (Enterprise) was a hideous abomination that never should have been. Bad acting, bad plots, bad continuity, and the producers absolutely despised the fans. I can't believe that there were those trying to keep it on the air. They should have spent their energies on Firefly, a show people actually wanted to see. Oh well, no accounting for bad taste.

    No, I just happen to know a few things about business, that's all. I'm not necessarily saying that fans should buy out Infotari, I'm just pointing out how it could work.
  19. First and foremost, you need to realize that a holding company would be set up to actually "own" the mark. The fans who ponied up the cash would merely be investors. Since a company is directly responsible to its investors, the holding company would need to do whatever the fans say. (Might make sense to run it as a co-op.)

    Secondly, I think that such a company would do well to expand on its hardware console business. The Flashbacks are a great start, and the old ST-style lines can be continued with new computers for kids. A new game console should be released, targeting the "casual gamer" and "young children" markets. That would keep it from going head to head with the Big 3 consoles. Yet through a fairly traditional licensing scheme, the fantari company could make much of its money off of third-party title royalties, many of which would be created by fans. Basically, fantari could cream LeapFrog and VTech in one fell swoop, and use the name recognition to convince parents to purchase their hardware and software for both their kids and themselves.

    Thirdly, since fantari would be a "real" company, it could make the necessary deals with manufacturers to get its hardware produced at industrial rates. The only barrier would be the capital, which would be provided by investors and early sales. If necessary, fantari could run very small batches at higher prices to begin with, then use that capital to produce larger batches at lower costs.

    Sound like a plan? Cool. Now who's going to be in charge of raising the capital?
  20. Definitely the NES. Despite Nintendo's attempts to keep the number of games down, the library ballooned out of control by the time the SNES came around. You may remember the Toys'R'Us promotion where they had an entire WALL of games for $20 or less.

    The 2600 had a respectable list, but the newness of the console made it slow to take off in title count (remember, it was only designed for about 10 games!), then the crash of '83 brought cartridge production to a screeching halt. All in all, it got about 7 years of development.

    In comparison, the NES was going strong in both the Japanese and American markets for about 8 or 9 years. Note that while the NES wasn't released in the US until 1986, the massive software base for the Famicom provided the NES with a large launch library of titles. The last official title to be released for the NES was Wario's Woods in 1994. (Three years after the release of the SNES in the US!)
  21. You know, that game was amazingly addictive. I almost never pumped quarters into a machine when I was a kid (I'd wait until I got to Showbiz, where my parents gave me tokens), but something about S.T.U.N. Runner made me sink tons of money into it. I think it was a combination of innovative gameplay and the controller. Since you can't reproduce the controller at home, it's just not as much fun to emulate. It's too bad S.T.U.N. Runner fell into the dustbin of history. I always got the impression that the game didn't do all that spectacularly when it was released.
  22. Centipede. I can just boot the sucker up, sit back, and mindlessly blow things away. Millipede smokes my rear way too fast. Besides, all those mode switches interrupt my mindless shooting trance.
  23. jbanes

    5200 vs. 7800

    Very nice. When can I get a cart? However, could you explain the graphics modes a bit? The Frogger is certainly interesting, but how does it actually compare to the 5200 hi-res mode? Choplifter, Microgammon SB, and Mini-Golf were all very detailed vector art. (They were also ported from the Apple II, so it's no wonder they all used this mode.) Can the 7800 do the same thing? Score one for the 5200! I knew that 16K was useful for something. Indeed. Choplifter for the 7800, for example, eschewed the vector mode in favor of traditional bitmaps. That made it not quite like the Apple II version, and not quite like the Sega version. Sadly, it seems to be generally hated for that. Personally, I find it to be a lot of fun, but the gameplay is a bit different.
  24. jbanes

    5200 vs. 7800

    Yes, thank you. Superchip, not Supercharger. My bad. Here's an interesting question. I got the impression that games like Choplifter (the original Apple II port, not the Sega one), Mini-Golf, and Backgammon were only available on the 5200 because the 5200 was better able to produce hi-res vector-like graphics. Is that a correct supposition, or is it that no one ever bothered writing a hi-res game for the 7800?
  25. jbanes

    5200 vs. 7800

    *jaw drops* But that defeats the entire point of a DMA transfer! It might as well use the 6502 to do the dirty work if it's going to halt it! Was this some sort of misplaced attempt at multi-processor synchronization, or was there some technical constraint that they didn't take the time to iron out?
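For what it's worth, halting the CPU during DMA is less self-defeating than it first looks. Here's a toy comparison (both per-byte cycle counts are assumptions for illustration; the CPU figure roughly approximates a simple 6502 LDA/STA/INX/BNE copy loop):

```python
# Cycle-stealing DMA halts the CPU, but the video chip can fetch a byte
# per bus cycle, while a CPU copy loop spends several cycles of
# instruction overhead per byte moved. Both constants are assumptions.
DMA_CYCLES_PER_BYTE = 1   # one bus cycle per fetched byte while the CPU is halted
CPU_CYCLES_PER_BYTE = 8   # rough cost of a 6502 load/store/increment/branch loop

def transfer_cycles(nbytes, cycles_per_byte):
    """Total bus cycles consumed moving nbytes at the given per-byte cost."""
    return nbytes * cycles_per_byte

scanline = 40  # bytes of display data per line in a wide mode (illustrative)
print(transfer_cycles(scanline, DMA_CYCLES_PER_BYTE))  # cycles stolen by DMA
print(transfer_cycles(scanline, CPU_CYCLES_PER_BYTE))  # cycles if the CPU copied
```

Under these assumed numbers the halted-CPU DMA fetch still costs an eighth of what a software copy would, so the design trades a brief stall for a much cheaper transfer.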