bmcnett
Everything posted by bmcnett
-
Software that relies on the four-sprite limit would have worked just as well if it had uploaded partially vertically occluded sprite shapes to VRAM instead. Why bend over backwards to support such software rather than the 99.9% of software that would flicker less? I just don't get it.
-
don't forget that to exercise your "superiority" of a 5 bit frame buffer on NES you need 5 bit sprites and tiles, which requires 2.5x the ROM. what a disaster! a conventional Genesis title needed only 2x the ROM of NES for twice as many colors overall as 5 bit!
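The ROM arithmetic here is just linear scaling with bits per pixel; a two-line check (the 2bpp/4bpp tile depths for NES and Genesis are the only assumptions, and they match the post):

```python
# Graphics ROM scales linearly with bits per pixel.
# NES tiles are 2bpp; Genesis tiles are 4bpp.
nes_bpp = 2
print(5 / nes_bpp)  # hypothetical 5bpp NES graphics: 2.5x the ROM
print(4 / nes_bpp)  # Genesis 4bpp graphics: 2.0x the ROM
```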
-
huh. no? software that looks funny without the limitation is probably more like 1 in 1000, and software that malfunctions without it is probably zero. why make most games flicker needlessly, just to make a very few games support gradual vertical clipping of sprites? as for having more colors, it's very cheap and 100% backwards compatible, so nobody would even know it's there unless they looked on purpose. it doesn't affect anyone who doesn't want to use it, so those who don't like it can safely pretend it doesn't exist. anyway if someone wants no improvements to their vdp, why buy an improved vdp at all?
-
Personal opinion: there's a lot more love for removing the limitation than for keeping it. So maybe 1% of programs rely on it. 99% don't. Go with the silent majority.

I hope you will implement at least my idea for one 16-color palette for each of the two color nybbles, since its only cost is an extra set of 16 color registers. Hey, twice the on-screen colors, and nothing really new to implement for you or for app developers to learn. My idea for combining both color nybbles into an index to a table of 256 color pairs (512 on-screen colors) improves the visuals more dramatically, but requires a total of 512 color registers and a little explanation for app developers too, so I understand if you don't want to go there.

As for more colors in sprites, you can add an extra bit per pixel and treat the bits like a binary number between 0 and 3, or you can implement something like what I was talking about, which costs the same in terms of bits but is 256 colors. Though you get SNES-quality sprites for NES prices, you also require app developers to learn something extremely alien, which may not be a valid tradeoff.

To keep maximum backwards compatibility with old modes, I would recommend that tile memory be 64x32, in which the 32x24 on-screen tiles scroll and wrap around according to hardware registers. The eight-bit tile indices of the old VDP would remain, but there would be another tile memory elsewhere to hold the upper bits of the sixteen-bit tile indices.

Someone will use the lines to remake Battlezone and Elite, and demo coders will go nuts with the lines for a while. If you implement 3D matrix math and perspective transform in the VDP, 3D wireframe games will be about as smooth as they've ever been on any platform. Can't think of any practical use for the ellipses.

good idea! not sure what this means. could you explain? obviously. The Coleco guys would have a field day with this, trust me.
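The two proposed palette schemes can be sketched as decoders of one TMS9918-style color byte (upper nybble = color A, lower nybble = color B). The function names and sample palettes are mine, purely for illustration:

```python
# Sketch of the two palette schemes for the color byte.

def decode_two_palettes(color_byte, palette_a, palette_b):
    """Scheme 1: each nybble indexes its own 16-color palette.
    Costs one extra set of 16 color registers; 32 on-screen colors."""
    color_a = palette_a[(color_byte >> 4) & 0x0F]
    color_b = palette_b[color_byte & 0x0F]
    return color_a, color_b

def decode_color_pairs(color_byte, pair_table):
    """Scheme 2: the whole byte indexes a table of 256 color pairs.
    Costs 512 color registers; 512 on-screen colors."""
    return pair_table[color_byte]

# Example: color byte 0x21 picks entry 2 of palette A, entry 1 of palette B.
palette_a = list(range(16))       # stand-in color values
palette_b = list(range(16, 32))
print(decode_two_palettes(0x21, palette_a, palette_b))  # (2, 17)
```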
-
Not at all, there's a lot of useful factoids in your rants. They just aren't conducive to conversation. I hope you understand that I can't reply to all your points.

I'm going to disagree with your claim that the framebuffer architecture never succeeded in 2D game consoles by coincidence alone. A frame buffer requires not just a lot of RAM to make all the pixels addressable, but requires all the backgrounds and sprites to end up in a unified palette space. NES graphics would therefore typically require a 5-bit frame buffer, or an extra 37.5k RAM - but wait, that's just for the front buffer. You also need a back buffer to render to, so now an extra 75k RAM is required just for frame buffers on NES. Also, the background must be written freshly to the back buffer each frame, which commits a lot of time and bandwidth to pixels that aren't even moving.

SNES would need an 8-bit frame buffer, or an extra 120k RAM just for frame buffers, and Genesis would need a 6-bit frame buffer, or an extra 90k RAM just for frame buffers. It's just ludicrous. The RAM isn't just expensive to buy and put in the console - that's also a lot of bus bandwidth to read the frame buffer, read the graphic, combine them, then write it back out.

The frame buffer was a liability for game consoles until 3D became obligatory. 3D doesn't absolutely require a frame buffer (the DS' algorithm for 3D would work with a scanline buffer) but as a practical matter, after a DS-like level of geometric complexity is exceeded, a frame buffer is cheaper than alternatives.

p.s. oh yeah, Intellivision would have required 15k extra RAM for its frame buffers! That's on top of the 1456 bytes of RAM it already had.
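For what it's worth, the arithmetic behind those RAM figures checks out in a few lines. The display resolutions (256x240 for NES/SNES/Genesis, 160x96 at 4bpp for Intellivision) are assumptions on my part, since the post only states the totals:

```python
# Double-buffered frame buffer RAM for the consoles discussed above.
# Resolutions and the Intellivision bit depth are assumed, not from the post.

def framebuffer_bytes(width, height, bits_per_pixel, buffers=2):
    """RAM for `buffers` full frame buffers (front + back), in bytes."""
    return width * height * bits_per_pixel * buffers // 8

KB = 1024
print(framebuffer_bytes(256, 240, 5) / KB)  # NES, 5bpp:           75.0 KB
print(framebuffer_bytes(256, 240, 8) / KB)  # SNES, 8bpp:         120.0 KB
print(framebuffer_bytes(256, 240, 6) / KB)  # Genesis, 6bpp:       90.0 KB
print(framebuffer_bytes(160, 96, 4) / KB)   # Intellivision, 4bpp: 15.0 KB
```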
-
I'm curious what that meant. Amiga had a blitter from day one, and ST got one later. They needed them because they had no decent sprite/tile chip, and so every frame something had to copy all the background pixels, and then copy all the sprite pixels on top of them. It made for a lot of bytes being read, shifted, masked, and written to the frame buffer. The blitter took this load off the CPU when it could, but couldn't reduce the bus traffic it generated. This sort of system made for worse-looking games than SNES/Genesis and with a higher system price. For a drop-in replacement GPU for an old computer like TI-99/4A or Atari 8-bit, a blitter is far more desirable because it's just impossible for the CPU to copy pixels by itself at interactive rates. Still no replacement for good sprite/tile logic when it comes to games.
-
Ok, thanks for providing your example. Yes, that is a very nice property of byte-per-pixel frame buffers. Reading and writing individual pixels on the CPU is relatively simple, because a pixel is as big as the data from one VRAM read/write. A byte-per-pixel mode with a 256 color palette would be a very nice thing to have in a VDP, and probably not the hardest thing to implement, either. I wouldn't ever argue against such modes - but I also don't think that they are ideal for sprite/tile videogames, which is what I care about most. If the goal with a new VDP is to simply put whatever in there and see what happens, then my prediction is that you'll see a lot of interesting demos, and games that are lopsided toward having a smaller quantity of prettier graphics. I've been working against that in my fictional VDP design. If any of my ideas are useful in the real world, well that would be nice too.
-
Though more pixels and more bits-per-pixel is a problem for storage and I/O no matter what you do, there are ways to mitigate its impact on the CPU's ability to update the screen interactively. The most obvious solution I could think of is to "tile" all the new fat-pixel modes. Regardless of the fact that each pixel is uniquely addressable, each 8x8 pixels gets a "tile word" that says which tile to draw there. If there are enough tiles to choose from, there should be no need to use any tile more than once. So then each pixel is uniquely addressable, just as if there weren't tiles at all. But when the CPU wants to update the screen, it can touch just the "tile words" when that is adequate. For example, it could run a windowing system in a 512x384 24-bit pixel mode, where windows are snapped to 8 pixel intervals, and text is an 8x8 font. A window can display a 24-bit image, but moving the window around or updating text would manipulate only "tile words." If a tile word is 16 bits, and a tile covers 8x8 pixels, then the cost is 0.25 bits per pixel in the worst case of a full-screen update. Since the screen has 4x the pixels of a standard mode, the cost is equivalent to 1 bit per pixel in standard mode. That's quite a lot compared to old applications, but it's well within the realm of possibility, especially if VDP writes are full-speed outside of VBLANK. There are good reasons **not** to go with 24-bit RGB, and instead to go with a 16-bit HSV, but that's for another post...
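The worst-case cost quoted above can be verified directly; the 512x384 resolution, 8x8 tiles, and 16-bit tile words all come from the post itself:

```python
# Worst-case CPU cost of a full-screen update in the "tiled fat-pixel"
# mode: one 16-bit tile word per 8x8-pixel cell.
WIDTH, HEIGHT = 512, 384
TILE, WORD_BITS = 8, 16

cells = (WIDTH // TILE) * (HEIGHT // TILE)             # 64 * 48 = 3072 tile words
update_bytes = cells * WORD_BITS // 8                  # bytes to rewrite every cell
bpp_equivalent = cells * WORD_BITS / (WIDTH * HEIGHT)  # 0.25 bits per pixel

# The screen has 4x the pixels of a standard 256x192 mode, so the cost
# is equivalent to 1 bit per pixel in standard mode:
standard_equiv = bpp_equivalent * (WIDTH * HEIGHT) / (256 * 192)
print(update_bytes, bpp_equivalent, standard_equiv)  # 6144 0.25 1.0
```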
-
No, not always. It depends upon what you're doing and how you're doing it. Again, it depends on what the mode does and what you're trying to achieve. Yes, I can imagine cases where what you are saying is true, though I wish you would have given an example so that this would remain a conversation. Replying to my message with "no it depends" is not giving me much to work with. But in the general case of bitmapped graphics, such as we see in 99% of real games for old computers, more pixels and more bits-per-pixel means more bytes to store and move. No matter what the VDP is capable of, the CPU is stuck moving all those bytes from ROM or some slow old I/O bus into the VDP. Though the VDP can have fancy sprites and a blitter, for scrolling games the CPU will eventually need to update the pixels. A VDP that is completely programmable can fix even that, but then you have what amounts to a CPU inside the VDP. I guess that's something interesting, but it doesn't seem to me like a TI-99/4A or ColecoVision anymore. It seems more like a fifteen-year-old PC that is stuck somehow inside a TI-99/4A, which feeds it bytes very very slowly.
-
Hi folks. In my spare time I've also been designing a chip to replace ColecoVision's VDP, but I never intended to use my design in anything but an emulator. I PM'ed matthew about some of my ideas and he said I could dump them here.

It may seem obvious that a new VDP should have "new modes" with more pixels and more bits-per-pixel, but the reality is somewhat more complicated. One problem is that increasing the pixels or bits-per-pixel effectively reduces the amount of graphics that can fit in a kilobyte. This means that less graphics will fit into a cartridge, cassette or disk than people are accustomed to. Another problem is that increasing the pixels or bits-per-pixel effectively reduces the amount of graphics that the CPU can manipulate in a second. This means less action is possible than people are accustomed to. So I have been working on ways to increase visual quality without increasing the number of pixels or bits-per-pixel. Here is what I have so far:

The VDP's nicest graphics mode is 256x192, two bits per pixel. Each 8x1 pixel chunk gets two bytes: the first byte says which pixels are color A or B, and the second byte has the palette indices of colors A and B with four bits per index (16 colors total.) Most obviously, these sixteen colors should be user-definable, which matthew has already discussed. There are other ways to interpret the second byte, however:

1. The first four bits can index into one 16-color palette, and the second four bits can index into another 16-color palette, for 32 colors total.
2. The first four bits can choose one of sixteen 4-color palettes, and the second four bits can store the palette indices of colors A and B as two bits each, for 64 colors total.
3. The entire byte can be a single index into a palette of 256 color-pairs, for 512 colors total.

It is also possible to get more visual quality - with the same two bits per pixel - by looking at bigger chunks.

1. An 8x2 pixel chunk has four bytes: two bytes to say which pixels are which colors, and two bytes that contain one four-bit palette index for each of four 2x2 subchunks. So each 2x2 subchunk gets only two colors, but there are 16 sets of two colors to choose from with complete freedom.
2. A 4x4 pixel chunk has four bytes. Let's imagine that each pixel has eight bits. It just happens to share the upper bits with its neighbors. The first two bytes contain one unique bit (bit 0) for each pixel, and then there are four three-bit values, shared by all the pixels in each of four 2x2 subchunks as bits 1,2,3. Then there's one four-bit value, shared by all the pixels in the 4x4 chunk as bits 4,5,6,7. The eight bits are used to index into a palette. SO: each 4x4 can choose one of sixteen 16-color palettes. Each 2x2 can choose one of eight color-pairs from that palette. Each pixel can choose color A or B from that color pair. Tricky to author, but the result is nearly 8-bit color quality.

There is plenty more I've been working on, but this is enough for now. I hope this will be of interest, and if anyone has tips for me, I'm all ears!
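To make the 4x4-chunk scheme concrete, here's a minimal decoder sketch. The post fixes the bit budget (16 + 12 + 4 = 32 bits per chunk) but not an exact packing, so the packing below is my own assumption, purely for illustration:

```python
# Decode one 4x4-pixel chunk (4 bytes, 2 bits/pixel on average) into a
# 4x4 grid of 8-bit palette indices. Assumed packing:
#   bytes 0-1: one bit per pixel (bit 0 of each pixel's index), row-major
#   bytes 2-3: four 3-bit subchunk values (bits 4..15) + one 4-bit
#              chunk value (bits 0..3)

def decode_4x4_chunk(b0, b1, b2, b3):
    lsb_bits = (b0 << 8) | b1   # one bit per pixel, MSB first, row-major
    packed = (b2 << 8) | b3
    sub = [(packed >> (4 + 3 * i)) & 0x7 for i in range(4)]  # bits 1-3 per 2x2
    top = packed & 0xF                                       # bits 4-7, whole chunk
    grid = []
    for y in range(4):
        row = []
        for x in range(4):
            bit0 = (lsb_bits >> (15 - (y * 4 + x))) & 1
            s = sub[(y // 2) * 2 + (x // 2)]
            row.append((top << 4) | (s << 1) | bit0)  # 8-bit palette index
        grid.append(row)
    return grid

# All "B" pixels (every bit-0 set), subchunk and chunk values zero:
print(decode_4x4_chunk(0xFF, 0xFF, 0x00, 0x00)[0])  # [1, 1, 1, 1]
```

So each 4x4 picks one of sixteen 16-color palettes (the 4-bit value), each 2x2 picks one of eight color-pairs within it (the 3-bit value), and each pixel picks A or B (the unique bit), exactly as described above.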
-
The Bit Wars: Was it all BS?
bmcnett replied to toptenmaterial's topic in Classic Console Discussion
Neither does a 32-bit CPU have any practical advantage over a 16-bit CPU when it comes to 2D sidescrollers. Most 2D game physics rely on object coordinates. The average 2D game uses a resolution of 256x256 pixels. It takes 256 screens for a level to be more than 65536 pixels long. Has there ever been a 2D game with levels that are more than 256 screens long? This is one of the reasons I always thought the 68000 was a highly overhyped and overrated CPU. Another being that 4-cycle memory accesses cancel out the performance advantage of having a 16-bit data bus. The 68000 is easy to program, and can run at adequate speeds off of slower clocked memory chips, but just because it has 16 32-bit registers doesn't mean it's (16 regs)*(32 bit)/(8 bit) = 64x more powerful than an 8-bit 6502 with 1 register, as some people like to believe.

yeah, you're totally right about this one. a home computer fan would tell you that a 32-bit CPU can copy pixels faster than a 16-bit CPU, but why is the CPU copying pixels? because you're on a home computer that has no sprite chip, that's why. -
Ah, yeah you're right about that one! Stupid mistake on my part.

It's not clear to me how the removal of features from just the 400's chips would have saved money in 1979. If anything, two sets of chips would have cost more money? You'd want to eventually merge all those chips, and a side effect of that would be the removal of useless features, but it's the merging of the chips itself that saves you money eventually if volume is high enough. But removing a big physical hunk of electronics - like a keyboard? That reduces costs up and down the supply chain. Your box gets smaller, so you can fit more on a truck! I don't understand your vision of cost management.

We're just going to disagree about whether a keyboard belongs in a game console. History seems to have shown that demand for keyboards and keypads in game consoles is low, but everyone's entitled to an opinion.

I'm not talking about home computer video hardware. This may not be obvious because I'm talking about Atari 400. It's on the table only because we're talking about 5200 and Atari's design process. With the exception of 2600, 5200, 7800 and Jaguar, I don't know of a successful game console from 2600<X<PS1 that didn't have hardware sprites and hardware tiles. Pretty much all the arcade hits from Pac Man onward had them, too. Atari's game console designs didn't have them and were more flexible, but in retrospect this was a liability after 2600. A8 had great hardware tiles for 1979, I'll give you that. For 1982, not so good. But enough to get a lot of great games onto 5200.

Yes, you've found the other problem with Atari's game console designs: a shared bus for GPU/CPU. 5200 and 7800 had it, I don't know about Jaguar. That was a good idea for a home computer, and a bad idea for a game console. You're right when you're talking about home computers, but I'm simply not talking about them.

Can you list arcade boards that did frame buffer graphics?
I guess the very late sprite games with lots of really huge sprites did that. But by that time, the world had largely switched to 3D which always has a frame buffer.
-
great work, bravo!
-
Makes you wonder why. Has anyone ever spoken with the people who made the decision to give the IntelliVision or ColecoVision joysticks and keypads that resembled no arcade game in history? My Dad was in charge of design at Coleco at the time, but he took whatever info he had to his grave in 2009. Can't remember him ever displaying an opinion about games or computers - he seemed to hate them both. I remember once telling Dad that my Amiga had 4,096 colors. He made a frown and said that if sixteen colors was good enough for his pencil set, should be good enough for me! LOL
-
No, the Mk.III/SMS was natively 100% backwards compatible out of the box with the older (ColecoVision-like) SG-1000 Mk.I/II. The consumer had to buy an adapter, just like with Coleco and 2600. The SMS chips are in every Genesis, but that's a technical detail, of interest only to nerds like you and me. None of my friends with Coleco could play 2600 games, and none of my friends with Genesis could play SMS games, either. As far as they were concerned, it was the same deal.

Yeah, I remember that too. Of course the 400 was priced way out of the mass market for a few years, and the same was true of most consoles since PS1. Whether Atari needed a new console in '79 or not, they sold the 400 as a games machine anyway. All I'm suggesting is that (if they had perfect foresight, which nobody does) Atari could have launched the 400 in '79 with a form factor that would support a gradual decline into the mass market in '82 with backwards compatibility back to '79, rather than launching the 5200 in '82 with '79 hardware and zero backwards compatibility.

Even if a keyboard were free or printed money, its presence in every 400 caused games to require it, which made it impossible to later release a backwards-compatible game console that didn't also require a keyboard. IIRC there were "computer" games that legitimately required a keyboard, and "arcade" games that gratuitously required the keyboard for one or two keys like the space bar. It would have been nice to force a distinction between the two, for the purpose of my argument, which you don't seem very interested in.

You mean a keyboard would be a selling point for a game console in 1982? Well yes, and so would an internal hard disk. The problem then is to get the consumer to pay for these things, when the competition is offering a system that provides them the basics for a lower price.

No, that's very wrong, the VCS is the ONLY Atari system that ever did that as such. There must be some misunderstanding here.
The 5200 couldn't do sprites that moved vertically without the CPU to move the pixels around, and the 7800 and Jaguar* couldn't do sprites OR tiled backgrounds without the CPU to manage scanlines or "zones." None of these platforms let you display a sprite by poking bytes for X, Y, COLOR, and SHAPE. All of them required you to write some kind of "system" that burned CPU cycles and consumed unpredictable amounts of RAM.

*I may be a little wrong about Jaguar here, someone correct me if I am.

"The limit of 4 8-pixel-wide monochrome sprites per scanline was a much greater limitation than the hassle of managing those sprites on-screen. (hence software sprites in character or bitmap graphics modes become an attractive option) The lack of flexible color indexing for character graphics was also a disadvantage against the TMS9918 and VIC-II (at least you had the 5 color mode, but that only gives you 1 optional added color and none for the 1bpp modes). DLIs go a long way towards helping that, but that's still limited on a horizontal line basis (and uses some CPU time - the NES could use raster interrupts for color reloading as well to take advantage of a palette that was considerably larger than even the GTIA/7800 palette, but few if any games used that - the NES has ~56 colors/shades in its default palette but 3 added registers that shift it with 1-1-1 RGB values for a total of some ~448 unique colors/shades but only one bank of ~56 to be indexed on any scanline - and then the limits of CRAM allowing 13 colors for the BG and 12 more for sprites - all as 3 color palettes plus one common BG color)"

See, I understood everything you just said, because I'm a meganerd like you. But 90% of game programmers I know aren't meganerds. The design says to put the sprite over there, so they go read the hardware manual to see where to poke the X, Y, SHAPE, and COLOR. For better or worse, these people were disqualified from Atari development.
-
Yep. It's possible for simpler graphics to be "better". Sometimes I love the fuzzy glow around 2600 graphics on an old CRT. Blocky shapes can stimulate the imagination, too. Did you read my quote above, about the kid who said that he liked radio more than TV "because the pictures were better"? I've never enjoyed a tank game more than 2600 Combat, because the graphics weren't distracting.
-
No, no no. You won't ever find me arguing about "better graphics" because there's no such thing. I think Zaxxon highlights the respective weaknesses of CV and Atari - CV couldn't scroll, and Atari had fewer pixels with fewer colors. That's why I posted the videos. And sometimes I prefer 2600 to PC because "the pictures are better."
-
No doubt 5200 had the more popular arcade games, but the graphics were a toss up.

1. Atari had hardware scrolling, CV didn't.
2. CV had 32 16x16 hardware sprites, Atari had 4 8x256 hardware sprites. Arcade sprites were 16x16 at the time.
3. CV had 256-pixel-wide multicolor backgrounds, Atari had 160.
4. Atari could do scanline tricks, CV couldn't.

Coleco had the advantage with early arcade games that didn't scroll, because it could do 16 background colors per scanline. Atari had the advantage with later scrolling games, because it had hardware scrolling. Though even this is a matter of taste. I prefer Coleco Zaxxon for having more color and detail, but I guess other people prefer Atari Zaxxon for its smooth scrolling.

http://www.youtube.com/watch?v=aebXvAkjWho
http://www.youtube.com/watch?v=sGIAKKsNylU
-
I programmed a few games for it. We used RGBASM for assembly and NO$GMB for debugging. Anywhere you find those tools, you'll find good docs on the GB's custom Z80. That was ten years ago though, so there is probably a new generation of tools. Martin Korth, the author of NO$GMB, was helpful to us when we had problems. IIRC the GB's Z80 was missing the extended registers like AF' but it had new instructions like LD A,[HL+] which auto-incremented HL after loading from it.
-
Right, the adapter - like Coleco's adapter that played 2600 games. They were selling the 400 in '79 anyway - might as well have started establishing it as a viable gaming platform starting then, instead of waiting until '82 and releasing the 5200, which was essentially a keyboardless 400 with zero backwards compatibility. As for the 8k RAM, well that's twice as much as NES, which arrived in America in '84! Those who needed more RAM for computing could have upgraded.

When I was talking about releasing the 400 with an optional keyboard, I wasn't thinking about cost reduction in '79. I was thinking about forcing the mass market "videogame" cartridges from '79 onward to require no keyboard. Then a cost-reduced slim model could have been released in '82, maybe with no keyboard option at all, and still play all the "videogame" cartridges. Well yes, for the reason that all SKUs had a keyboard. Remove the keyboard from the cheapest SKU, and prevent most games from requiring one.

Maybe. But hopefully with a big library of cartridges from '79 to '82 already under your belt, the Donkey Kong license (and subsequent NES lockouts) wouldn't have been such an issue.

I put the phrase "Jay Miner architecture" in quotes for this very reason. Of course Jay didn't design all those video systems. But the idea of a simplified GPU that required CPU support was Jay's. This idea made megabucks for 2600, and so it persisted in 5200, 7800 and Jaguar even if it wasn't practical and Jay wasn't personally involved anymore. By the time 5200 came out, arcade machines had sprite hardware - and so did ColecoVision and NES. Doing sprites on the 5200, 7800 or Jaguar is a nest of prickly tradeoffs and fun hacks that is still keeping programmers busy in 2011. Doing sprites on ColecoVision and NES means writing bytes for X, Y, and SHAPE. In retrospect Atari's more flexible GPU design was a productivity trap.
-
Backward compatibility hasn't really been such a big part of TV console history. The 7800, PS2 and Wii had pretty complete backwards compatibility, and the PS3 and XB360 had just a little bit. It wouldn't have motivated me to buy a 5200 - I already had a 2600 and the games were getting old anyway. Since no TV console before 5200 had backwards compatibility, it wasn't an expected feature.

With the benefit of 20/20 hindsight, here's my opinion (worth every penny you paid for it): The 5200 should have been a 400 without the keyboard. Even better - the 400 should have had an optional keyboard and two joystick ports. They should have put all the 5200 development money into marketing this gimped 400 from 1979 as a high-end game system and focusing game development on the 400/800. By the time something like ColecoVision rolled around in 1982, there'd be three years of releases for 400 in the market already. Atari should have then countered with a cost-reduced 400, repackaged older games in cheap packs and had the next-next thing waiting in the wings - something incompatible built from cheap off-the-shelf parts without custom chips or shared memory.

"The Jay Miner architecture" was a hacker's paradise and a love of mine, but it never translated into sales after 2600. The 5200, 7800, Amiga and Jaguar didn't have a GPU with good fixed-function sprites and tiles and separate video memory, and this was a disaster for inexpensive game development, debugging, hardware upgrades, backwards compatibility and emulation. If Atari had countered ColecoVision with a similar cheap box made of generic parts, they could have focused their budget on games instead.

I program PS3 at work and for fun I study old game systems. I tell ya, I think I understand all the ones that made money, but I still don't really understand the limits of what 7800 or Jaguar can do. This is exciting for me as a programmer, but it is a bad thing for the mass marketability of a system.
There just aren't that many programmers out there that enjoy hacking on a strange graphics chip, and there were even fewer twenty years ago.
-
The Bit Wars: Was it all BS?
bmcnett replied to toptenmaterial's topic in Classic Console Discussion
Yeah we're in violent agreement! -
The Bit Wars: Was it all BS?
bmcnett replied to toptenmaterial's topic in Classic Console Discussion
Sega Master System was 8-bit and had 4bpp tiles ^^ Read: "reasonable proxy." After all, this is a marketing term. "SMS was 8-bit" is focusing on CPU register width, which itself was never important to game marketers or their market. People wanted the 15-color sprites, and "16-bit" was a catchphrase for that. Was SMS 16-bit? I don't remember how it was marketed, but I think they could have gotten away with it (Turbo Grafx-16 did.) The SMS hardware palette was a little cartoony compared to the others. Not too many flesh tones to choose from.

I believe the PS3 and 360 both have 64-bit CPUs. The games use the 32-bit registers and 128-bit registers, but not the 64-bit registers. It's safe to say that the CPUs are capable of 64-bit, should anyone ever want to go there. -
The Bit Wars: Was it all BS?
bmcnett replied to toptenmaterial's topic in Classic Console Discussion
Though you're right that "bits" didn't refer to any specific quantity, the term wasn't meaningless. If anyone had tried to market a "16-bit" game system with three-color sprites, people would have laughed. There was a quantum leap between three-color sprites and fifteen-color sprites, where suddenly you get multiple shades for skin, hair, and clothing. Games went from looking like a coloring book to looking like a comic book. Everyone could feel that when the "16-bit" systems arrived with their fifteen-color sprites. People were willing to pay for that quantum leap - and if the marketers had said "fifteen colors!", Atari could have marketed its 2600 as "128 colors!", which misses the point entirely!

Later, when the next generation after "16-bit" started rolling in, there was confusion about the next quantum leap: would it be truecolor sprites, CD video or textured triangles? There was a period between SNES and PS1 when consoles tried all three. In the end, the market decided against truecolor sprites. You look at 2D games on Jaguar vs. Genesis and the lack of colors on Genesis doesn't turn you off, if you're like most people. People didn't like CD video either, because it wasn't pretty like a movie OR interactive like a game. So the next quantum leap became textured triangles, and by that time the marketers found the term "3D", so the talk about bits began to die. Nintendo and Jaguar kept talking 64 bits but nobody knew what they were talking about, since one looked worse than PS1 and the other looked better.

So aside from a brief period when NES and SNES were on the market, "bits" was pretty useless and confusing. Don't know why we'd talk about it today! -
The Bit Wars: Was it all BS?
bmcnett replied to toptenmaterial's topic in Classic Console Discussion
You mean like Sega Genesis?
