Multijointed Monster Maker

It's 1990 and you're designing a game system. What would you do?


I see that the OP prefers specifics, so here's something a bit more detailed:

 

"TurboGrafx-32"

 

32-bit CPU (internal data, external data, and address bus), e.g. a 68020

An appropriately beefed-up GPU

RAM: 2 or 4 MB (whichever's the better price/performance trade-off)

Blitter

Pixel scaler

Maximum resolution: 720x486

Palette: 24-bit (or 16-bit color, if 24 is too expensive or has too much overhead)

On-screen colors: 32,000 (if not more)
Colors per sprite: 32
Max sprite size: 64x64
Max on-screen sprites: 256

Audio: wavetable synthesis, 16-bit at 44,100 Hz, hardware D/A conversion

Cartridge slot
 

Possible CD-ROM support (with a lot of buffering).
I hesitate to include CD-ROM support after seeing the gawd-awful early CD-ROM game releases of the 1990s.

Edited by Nebulon


 

 

Possible CD-ROM support (with a lot of buffering).

I hesitate to include CD-ROM support after seeing the gawd-awful early CD-ROM game releases of the 1990s.

Yeah, 1990 CD-ROM drives are why I left mine open to "expandability" down the road. The time just wasn't right, yet.


Yeah, 1990 CD-ROM drives are why I left mine open to "expandability" down the road. The time just wasn't right, yet.

 

Though it wasn't a commercial success in the US, I'd say PC Engine CD (launched in 1988 in Japan, 1990 here in the States) worked just fine for the time.



Now, back to 1990: for PCs, 2D accelerators were starting to become popular for Windows, but I don't think many games took advantage of that hardware (which makes the point a bit moot).

 

 

True.

 

In 1990 the only real blitter available for the PC was the ATI Mach-8 (and maybe the IBM 8514/A). I spoke to one of the engineers who worked on it back then, and he admitted that it was a half-baked solution at the time. They were trying to implement something similar to the Amiga blitter, but not getting very good results (the Mach-8 was slow and ran hot). Of course, look where ATI is today. Clearly, persistence pays off.

 

So yes, even if PC games did try to make use of it in 1990, the efficiency and performance just wasn't there yet.



 

Though it wasn't a commercial success in the US, I'd say PC Engine CD (launched in 1988 in Japan, 1990 here in the States) worked just fine for the time.

True, but it only had to support a TG16, not these "dream machines".


 

Exactly. You just can't design a console specifically for 2D scrolling games with some sprites... it's 1990, not 1980. I'm not quite sure, but something like the Atari Jaguar might be the answer.

 

The closest things to good, low-cost framebuffer+blitter optimized consoles around in 1990 were the Lynx and Flare's Slipstream. The former wasn't fast or powerful enough to be used directly for a TV-based console (not enough bandwidth, the framebuffer size was too limited for a TV screen, and it only did 16 colors/4-bit pixels), and a 4 MHz 65C02 was marginal, though probably no worse than the SNES's 2.68 MHz 65816. (Both were much weaker than the 7.16 MHz 650x derivative in the TG16/PCE, the 7.67 MHz 68k of the MD, or the 7.16 MHz Amiga 68k, even with wait states in 5/6-bitplane modes and blitter bandwidth.)

 

The Slipstream, OTOH, relied on PSRAM to be fast enough for some interleaved DMA, so it wasn't as cheap as pure DRAM, but still cheaper than the multi-bus arrangements with VRAM, PSRAM, and/or SRAM on the PCE, SNES, and MD. A 5.9 MHz 8086 isn't all that great a CPU either, and it would've needed address-space expansion for cart-based games, though it was interesting as a floppy-disk-based console. It was mapped to support up to 256 kB of PSRAM and 512 kB of DRAM, and there was a fair bit of interleaving it could do.

It offered 256x200 (up to 256x256) with 256 colors from 4096 (12-bit RGB), 8-bit chunky graphics, a blitter optimized for both sprite and line drawing (plus block copy for background drawing), and a 16-bit DSP useful for sound synthesis and 3D math. (Slaving the DSP for simple PCM playback of Amiga MODs should've left a lot of time for math coprocessing, while doing realtime synth would've eaten a lot more of it.)

The x86 CPU and 256-color graphics plus a potentially large chunk of RAM might have made it appealing for PC game ports of the period, and Lucasfilm toyed with licensing it in 1989 while Konix was gearing up to release the Multisystem (all of which fell through, of course). 128 kB PSRAM plus 512 kB DRAM would fit rather well with 640k real-mode PC games (enough PSRAM to double-buffer into, with some left over for fast blitter/DSP access). So it was ready for 1989 mass production, and would've played into late-'80s Atari's computer-game licensing model, which they'd aimed at with the 7800 and would continue (somewhat) with the Lynx. (Plus 880 kB double-density floppy disks were a much smaller risk than ROM carts for publishers, and the proprietary data/sector format would've been appealing for those worried about piracy... I think they included some other security features too.)

 

Being framebuffer based also meant double-buffered rendering could drop the framerate to increase complexity (more parallax, sprites, etc) like with PC/ST/Amiga games, though dropping below 30/25 FPS would probably be unappealing for most games ... there were already a number of racing games and shooters that ran at 20 fps on the MD or SNES. (something like Galaxy Force would look a lot nicer on the Slipstream, probably) And 3D/Pseudo 3D stuff would be much nicer to work with, as would be realtime scaling effects. (rotation would be possible too, but a lot more math intensive than simple stretching/shrinking ... and combining realtime scaling with pre-rotated animation would tend to look much better ... something appealing for a hefty chunk of RAM and floppy disk storage, just as pre-shifted graphics were appealing on the ST but would be horrible on a ROM based console without 512kB of RAM to pre-shift things into)

 

Atari's Panther didn't go with that design philosophy at all... it was more of a supercharged 7800 and a prelude to the Jaguar's object processor, but it required 35 ns SRAM (basically high-speed cache RAM) for the object list and some high-speed sprites/objects, while intending to pull most data from 32-bit (slow/cheap) ROM, hence only using 32 kB of RAM... plus an Ensoniq (DOC/DOC-II or OTIS) PCM chip with its own private DRAM to work in, and a 16 MHz 68k that would've been getting tons of wait states like the 6502 in the 7800. They also made the odd choice of only 32 CRAM entries, but in 18-bit RGB (260k colors), and while that meant 5-bit line buffers, it also meant 18-bit CRAM entries and more chip space for the 18-bit video DACs (VDACs are fast and not 'cheap' in terms of chip space... a big reason the Mega Drive stuck with 9-bit RGB). They should've easily managed 6-bit line buffers and 64 colors from 12-bit RGB. It used an 8-bit linear offset for palette select with 1-, 3-, or 15-color objects (similar to the Jaguar), as well as unpacked 8-bit-per-pixel objects using the full 32 (or 64) colors.

 

In any case, the Panther seems like a really bad set of compromises and not optimized for cost to performance. They probably could've salvaged it with a bit of added custom hardware, especially given they brought Martin Brennan onboard in 1989 to work on the Panther chip itself (and he was one of the 3 designers from Flare who'd done the Slipstream and would do the Jaguar), adding something like a bus controller chip to mediate between Panther and 68k accesses to SRAM and ROM, and possibly add a block of DRAM to work in (and be faster than ROM for sprites) while potentially cutting the cart bus down to 16-bits to save cost and potentially even using a cheaper 8 MHz 68k and a heavier emphasis on interleaved DMA from DRAM and ROM. (work within the existing Panther limits and work around them with a memory mapping and bus-flipping arrangement) Or they could've ditched the 68k in favor of something cheaper, like an embedded 650x (they were already using those in the Lynx, but an 8 MHz one with on-chip zero page would be really nice), could potentially be embedded into a custom chip, and use even cheaper 8-bit ROM and DMA everything into system RAM like the Lynx. (a Hitachi 6309 at 3.58 or 4 MHz would also be really appealing, though the latter would be an overclock)

 

But given the Panther wasn't even ready for production at the time, and the 7800's sales were declining (along with the ST and 8-bit) in '89, especially compared to the 87/88 peak of the ST and 7800, an earlier release would be better, and the Slipstream chipset was ready-made, non-exclusively licensed, and had a bunch of software developers in the UK already working on it. Plus it was built on gate array logic rather than custom or standard cell masks, so was easier to start in smaller volumes at lower costs/risks. (though note, the custom system ASIC in the Sega CD was also a gate array part) The Slipstream wasn't great, but it was there and possible for an immediate 1989 launch or at least rushed test market.

 

 

 

But getting into 100% hypothetical stuff that didn't exist at all at the time? Atari's engineers could've looked at the Lynx, seen its design philosophy, and either run with it themselves (or rather, commissioned a consulting group to handle it) or gone back to the ex-Atari, ex-Amiga engineers who'd designed the Lynx chipset to do a console. The same sort of unified-memory, DRAM-optimized set-up would've worked great for a home console, but it would've needed to be at least 16 bits wide on the bus end, possibly using a 32 MHz DRAM controller... though 16 MHz could probably make do. (32.22 MHz divided by 2 would be good for NTSC timing: 3.58 x 9 = 32.22.)

 

Sticking with that and the Panther's theme of a 16 MHz 68k with cycle-stolen DMA, but going with the Lynx's low-cost DRAM+blitter+framebuffer arrangement, and doubling the internal and external width for the video DMA and blitter operations: the 16-bit FIFOs/latches on an 8-bit bus become 32 bits on a 16-bit bus, with a slow (random) DRAM read followed by a page-mode read... probably 4+2 cycles, or 372 ns at 16.11 MHz in 120 ns DRAM, similar to the Lynx. (100 ns DRAM and a 32.22 MHz controller could probably get that down to 7+3 cycles or 310 ns, but 120 ns Lynx-speed DRAM would be a much more conservative/realistic goal.) You'd get 10.74 MB/s with that, and using cycle-stealing DMA to do a 256x224 NES/SNES (or lower-res MD) style screen at 60 Hz would use about 32% of the bus time, meaning the 68k would run closer to an effective 10.95 MHz, or somewhat better thanks to internal register operations that avoid waits. That's a greater percentage of CPU time than the Lynx's CPU loses, but you're using more than 2x the bandwidth for a display like this, and an 11-ish MHz 68k would be plenty for the time.
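Putting numbers on that paragraph (same assumptions as above: 16.11 MHz bus, 4+2 cycle 32-bit fetches from 120 ns DRAM, 256x224 8bpp display at 60 Hz):

```python
# Back-of-envelope check of the cycle-stealing DMA figures above.
bus_hz = 16.11e6
cycles_per_fetch = 4 + 2          # random access + one page-mode access
bytes_per_fetch = 4               # 32 bits per fetch
fetch_time = cycles_per_fetch / bus_hz            # ~372 ns
bandwidth = bytes_per_fetch / fetch_time          # bytes per second

display_bytes = 256 * 224 * 60                    # 8bpp screen at 60 Hz
display_share = display_bytes / bandwidth         # fraction of bus time

print(f"{fetch_time * 1e9:.0f} ns per 32-bit fetch")
print(f"{bandwidth / 1e6:.2f} MB/s peak")
print(f"{display_share * 100:.0f}% of bus time for video DMA")
print(f"effective 68k clock ~{16.11 * (1 - display_share):.2f} MHz")
```

Which lands right on the 372 ns / 10.74 MB/s / 32% / ~10.95 MHz figures quoted.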

 

The Lynx's RLE compressed 1/2/4-bit texture format was also really nice, and extending that to a Panther/Jaguar style 8-bit offset in 256 colors (rather than 4-bit in 16 colors) would work really well, plus allowing direct 8bpp textures too. (maybe RLE, but potentially just uncompressed stuff, especially useful for treating portions of the framebuffer as objects for certain effects)

 

256 colors from 12-bit RGB would also be fine for the time, though 15/16-bit RGB would be nice. (you could also do software based translucency or shading effects via look-up tables, probably in ROM, especially if using 256x256x8-bit tables for translucent blending: 64 kB)
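The 64 kB blend-table idea can be sketched like so; this is scaled down to a made-up 16-color palette so it builds instantly (the real table would be 256x256 entries, baked offline and kept in ROM):

```python
# Entry blend_lut[a][b] is the palette index closest to a 50/50 mix of
# colors a and b. The palette here is an arbitrary 12-bit RGB spread,
# purely for illustration.

def toy_palette():
    # 16 made-up 12-bit RGB entries (r, g, b each 0..15)
    return [((i * 5) % 16, (i * 3) % 16, (i * 7) % 16) for i in range(16)]

def nearest(rgb, palette):
    r, g, b = rgb
    return min(range(len(palette)),
               key=lambda i: (palette[i][0] - r) ** 2 +
                             (palette[i][1] - g) ** 2 +
                             (palette[i][2] - b) ** 2)

pal = toy_palette()
blend_lut = [[nearest(((pal[a][0] + pal[b][0]) // 2,
                       (pal[a][1] + pal[b][1]) // 2,
                       (pal[a][2] + pal[b][2]) // 2), pal)
              for b in range(16)] for a in range(16)]

# blending a color with itself should give that color back
print(all(blend_lut[i][i] == i for i in range(16)))  # True
```

At runtime a translucent pixel is then just one table lookup, which is why the idea was attractive for software effects.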

 

Include the 16-bit multiplier unit and sprite scaling capability of the Lynx, and add a bit more to the Sound hardware, say take the Lynx oscillators+DACs and allow at least one DMA channel to feed them for 8 or 16 bit PCM. (if you used word-interleaved LRLR stereo a la STe, you could use a single DMA channel for 8 or 16-bit stereo as well, and be pretty nice for software mixed sound while having 2 or 3 DACs free for chip-synth sounds)
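The word-interleaved LRLR idea just means a single DMA stream feeds both DACs, with alternating words going left and right; sketched:

```python
# One buffer, one DMA channel: even words -> left DAC, odd words -> right.
def interleave(left, right):
    out = []
    for l, r in zip(left, right):
        out += [l, r]              # L, R, L, R, ...
    return out

print(interleave([100, 101, 102], [200, 201, 202]))
# [100, 200, 101, 201, 102, 202]
```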

 

256 kB of 16-bit wide 120 ns DRAM would've been a very good cost compromise for 1990 with a framebuffer based console, and have plenty of space to load code and data into, and decompress graphics and sound into from slow cart ROM. (though unlike the Lynx, you could also work in ROM directly, for cases where that's useful ... large RLE textures and look-up tables would come to mind) And while it's no DSP, a fast 16-bit multiply unit would work around one of the 68000's biggest bottlenecks for software rendered 3D. (incidentally something that the Mega Drive really missed out on, as something as cheap and simple as the NEC µPD77C25 used as Nintendo's DSP-1 at the launch of the SNES would've allowed for something probably exceeding Super FX class 3D on the MD at much lower cost ... albeit the same goes for the Atari Falcon, if they wanted a much cheaper sound/fixed-point Math co-pro than the 56k the Falcon got, and potentially STe/MegaSTE vintage ... though embedding a custom 650x+multiplier chip for sound coprocessing and some 3D acceleration would've probably been cheaper for Atari with a 650x license already in use and all)

 

Oh, and of course, you avoid the issue of a relatively alien architecture as the Panther Object Processor presented (and Jaguar later did). The 7800 was the closest thing out there prior to it, and it was rather niche in itself (and never really broke into the European market, either, so not tapping into the wealth of smaller software houses there, especially Atari-ST friendly ones). Software rendering and hardware-accelerated blitter rendering were much more well understood and also somewhat easier to use to simulate tile and sprite based graphics, but with the added flexibility of using framerates lower than the screen refresh rate without tearing (sprite drop-out) or flicker issues.

 

 

Martin Brennan joining the Panther project in 1989 might have been an opportunity to kick some sense into things, but with all that interest in the Lynx (and it going to market in 1989) on TOP of a major stake in the computer market, it's really weird that the Panther existed at all in the form it did. (It's a novel design, and the sort that an industry leader might be able to pull off, but not something good for a second-tier player... let alone one built around Jack Tramiel's no-nonsense, straight low-cost consumer-market ethos... and cutthroat negotiation, for that matter. Then again, it was Sam in charge by '89 and Mike Katz had left the games division as well, so leadership was certainly lacking; but I thought it was Leonard and Gary who were more involved on the tech-management and logistics end... marketing issues and negotiating with suppliers and vendors might have been Sam's fault, but it doesn't explain the odd engineering choices.)

 

Plus a blitter/framebuffer optimized design would be more useful as a component in a computer, even if just tacked on as an upgrade via the STe's genlock feature. (Ie, rather than a unified bus with cycle stealing, attach the new video+sound subsystem more like a PC expansion card... or the way most game consoles do subsystems, on a dedicated bus connected via I/O ports and DMA interfaces.) Standard STe graphics/sound for compatibility, and enhanced features for 256-color chunky-pixel modes and possibly high-res 4-bit packed-pixel modes. (Plus, with ST video disabled, and 120 ns DRAM with a 16 MHz DRAM controller, you'd be able to use a 16 MHz 68000 without wait states, sans DMA for disk access and such, and no need for a cache like the MEGA STe used.) Just use a 16.0 MHz clock for ST compatibility rather than NTSC timing. (You could do VGA-style 640x480x4-bit and 320x480x8-bit 31 kHz video that way, though you'd need more page-mode bus saturation with linear reads and fewer DMA slots left for the blitter to work in... and the CPU would be best only accessing in vblank, while the blitter and DMA sound could still use hblank DMA slots. Plus ST-style 31 kHz monitor res at a 32 MHz pixel clock leaves a LOT of hblank time available, so that'd be handy here; dropping to 24 MHz, closer to the VGA-standard 25 MHz, would cut into that and not be slow enough to allow any interleaved DMA cycles, so 32 MHz ST-style would be handy, plus it'd allow 640x400 16-shade grayscale on old mono monitors.)

 

 

 

See also: https://phe.rockefeller.edu/LogletLab/DRAM/dram.htm

Note the DRAM prices were falling sharply in 1990/91, where they'd jumped up in 1988 then stagnated (crippling the ST's low cost and high performance... or the potential for the 1040STf to become the basic standard in '88), so it was a good time to release a console with a decent chunk of RAM and have it drop in price over the following couple of years. Atari, OTOH, had the very bad luck of going with 2 MB in 1993/94 with the Jaguar, at a time RAM prices again rose and then stagnated (due in part to a massive resin factory fire in Japan that crippled plastic IC packaging volumes and glob-top assembly), so it ended up staying relatively expensive and not being nearly as good a value as anticipated. It wasn't until mid-way through 1996 that prices dropped again, ie after the Jaguar was discontinued. (It wouldn't have been until the 1996 holiday season that a Jag+CD combo could've been a solid budget-market competitor: ie undercutting the PlayStation and liquidation-priced 3DO even at a sub-$200 level vs the $250 PSX/N64 of that period.) Hell, they probably could've come out with an inexpensive 2 MB 16-bit DRAM cart for the Jag CD by 1997 due to the falling price of 2 MB DRAM chips. (You'd need a DRAM controller onboard for that, and it'd be 5-cycle latency like the fastest cart ROM setting, but still pretty useful. There's also a 2-cycle high-speed test ROM setting, but that'd only be useful for fast SRAM additions... ie for things like a 64 kB blitter texture buffer.)

 

In any case, 1990 was a solid time to release a console, and one Atari had the misfortune of passing up. (Those DRAM prices also would've made the 128k+512k PSRAM/DRAM floppy-disk Slipstream console a good investment, though that's partially hindsight; it would've taken luck plus good market foresight in 1990... though they could've launched with 256 kB and quickly offered an inexpensive 256k upgrade card as the market trend became definitive in 1991.)


Sega Genesis specs with a better color palette and the Amiga sound chip.

Incidentally, the MD's VDP was designed to support 128 colors (or rather 121: 8x 15-color palettes + 1 BG color) from 12-bit RGB (4096 colors) and had external expansion pins for that, but they were left unconnected on the MD itself and used later for the System C arcade board (which also ditched the Z80 in favor of an 8.9 MHz 68000 and a PCM chip).

 

Had Sega wanted to use that full capability in 1988, they'd have omitted the CRAM and DACs entirely from the VDP and used an external RAMDAC chip (as the PC Engine did), and probably could've made up the cost difference by removing the Z80+RAM and having the 68k handle the sound drivers alone. (Just add a simple 8-bit DMA sound channel and you're good for sample playback and software mixing too... interrupt-driven PCM is horrible on a 68k and cycle-timed loops aren't practical for most purposes either, so DMA is the way to go. On a 650x based platform like the PC Engine, interrupt-based PCM was viable, and a 7 kHz driver would tend to eat 5% of CPU time for tight code: Malducci's driver manages such. Plus you can do channel-pairing tricks to get 10-bit resolution, and double-buffer sample chunks into wave RAM to get better than 7 kHz without added hardware, though you'd need to sacrifice 4 channels to do 10-bit mono that way. And using 4/5-bit PCM, even for some sample-based music, would be pretty useful and doable with just 2 paired channels at up to 32x7 kHz... so also tons of potential for interleaved/multiplexed mixing, but I digress.)
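The channel-pairing trick works out like this numerically, assuming idealized DACs and an exact 1/32 volume ratio between the two channels (real hardware volume steps only approximate that):

```python
# Split a 10-bit sample across two 5-bit channels: the top 5 bits play
# on a full-volume channel, the bottom 5 bits on a channel attenuated
# to 1/32. The analog sum of the two reconstructs the original.

def split_10bit(sample):           # sample in 0..1023
    hi = sample >> 5               # top 5 bits, loud channel
    lo = sample & 0x1F             # bottom 5 bits, quiet (1/32) channel
    return hi, lo

def analog_sum(hi, lo):
    return hi * 32 + lo            # what the mixed output represents

for s in (0, 1, 500, 1023):
    hi, lo = split_10bit(s)
    assert analog_sum(hi, lo) == s
print("10-bit reconstruction OK")
```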

 

There was also Ricoh's 8-channel PCM chip that Sega later used in the Mega CD, and was already using in the arcades in 1989 on the System 18 board, but that's unnecessary added cost and overkill compared to the potential of software mixing with DMA sound. (OTOH it was MUCH cheaper than the Sony SPC700 module of the SNES... and manufactured by Nintendo's prime chip vendor Ricoh... and would've been an interesting choice to see tweaked as an embedded CPU+sound chip on the SNES end, with a much faster 65816 and faster RAM, rather than wasting money on the Sony module and cheaping out with DRAM and a slow DRAM controller. Compare NEC, who managed with 70 ns DRAM and a fast controller to allow full-speed 7.16 MHz 650x operation in 1988 with their CD-ROM system... 2.68 MHz is SAD in the SNES. Throwing in 256 bytes of RAM for on-chip zero page would also be nice, and would help somewhat with the poor compilers for those attempting to use C on the SNES.)

 

 

 

True.

 

In 1990 the only real blitter available for the PC was the ATI Mach-8 (and maybe the IBM 8514/A). I spoke to one of the engineers who worked on it back then, and he admitted that it was a half-baked solution at the time. They were trying to implement something similar to the Amiga blitter, but not getting very good results (the Mach-8 was slow and ran hot). Of course, look where ATI is today. Clearly, persistence pays off.

 

So yes, even if PC games did try to make use of it in 1990, the efficiency and performance just wasn't there yet.

The PC world also had the issue of VGA compatibility, and ATI took the route of an 8514 clone, but used a separate VGA core + RAM to provide compatibility there, and nothing fancy like genlock to allow overlay of the two screens. Plus, you had 4-bit color modes using bitplanes and 8-bit chunky modes (not to mention the odd organization of the nonlinear unchained VGA 8bpp mode: not planar, just interleaved across the 4 64k banks of VGA space... probably due to the way they got the necessary bandwidth while focusing on linear pixel space in 4-bit mode rather than, say, linear 32-bit aligned addresses in chunky mode).

 

OTOH, ATI probably could've made a low-cost, fast VGA card that simply had some nice added features while focusing on basic VGA compatibility. Remapping RAM to 32 bits wide would be relatively straightforward for a much more friendly/fast (especially for 32-bit CPUs and VESA) linear 32-bit-word-organized 8-bit packed-pixel framebuffer, and it could also support DMA from main RAM, allowing fast updates of partial or entire screens (entire ones for double-buffered full-frame rendering, partial ones for looping single-buffered scrolling-type graphics, where DMA mostly fills in the off-screen portions being scrolled in). Simple DMA block-copy and fill functions would be good enough for basic acceleration, rather than a full blitter, and would cater to 8bpp modes and 512 kB DRAM. (Which becomes appealing as soon as you adopt high enough bandwidth to do 640-pixel-wide 8bpp modes and 640x480 in 256 colors, while still being compatible with fixed-frequency VGA monitors; 640x400 could still be double-buffered, so good for 3D games.)

 

You'd also want vblank interrupts to make for fast and simple page-flipping without tedious status register polling. (also very useful for color cycling effects via palette swaps, or 256 color video playback that re-loads the colors for each frame or on key frames: something you can't really do without double buffering or really fast DMA able to copy full frames in vblank ... so using Mode 13h would be out on ISA video cards, while double or triple buffered mode X would be possible via ISA cards ... or of course, a mapper-modified Mode X allowing 32-bit linear pixel organization, though obviously you'd need 2 port or DMA writes on 16-bit ISA for that)

 

DMA functionality without any bit manipulation features would still be useful for 4-bitplane VGA modes, but less useful than something like the Atari STe blitter or Amiga Blitter. (hardware bitfield operations, bitmasking, bit-precise line fill and sprite drawing, etc) But with a CPU with a barrel shifter and fast bit manipulation instructions, you'd be OK software rendering and DMAing that way anyway. (the 68000 was not such a CPU, but a 386SX could handle such ... I forget where the 286 fits in there) So a fast enhanced VGA card that still lacked double-bandwidth modes (640 pixel 8bpp) could still be appealing with DMA copy and such, and offer relatively fast ISA bus performance. (and if it got popular enough, you'd probably have seen games exploiting the DMA function for primitive blitting or screen block/tile updates at 320x240 with square pixels and fast/efficient 32-bit word-packed 8-bit pixels rather than funky mode X, speeding up software blits to the back buffer in main RAM even if copying over ISA was a bottleneck)
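The "funky Mode X" vs linear-framebuffer contrast above is mostly an addressing question. A toy model (simplified: real unchained VGA also involves plane-select register writes, omitted here):

```python
# Unchained VGA spreads consecutive pixels across four 64k planes,
# while a linear 8bpp framebuffer is just y*pitch + x.
WIDTH = 320

def linear_addr(x, y):
    return y * WIDTH + x                       # one flat byte offset

def mode_x_addr(x, y):
    plane = x & 3                              # which of the 4 planes
    offset = y * (WIDTH // 4) + (x >> 2)       # byte offset within plane
    return plane, offset

print(linear_addr(5, 2))    # 645
print(mode_x_addr(5, 2))    # (1, 161)
```

The per-pixel plane juggling is exactly what makes software blits into Mode X slower and uglier than into a 32-bit word-packed linear buffer.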

 

Gaining market acceptance would be key to getting software like games to support it, but a low cost, enhanced VGA card would seem much more likely to gain such than an 8514 clone.

 

Hmm, perhaps even easier to gain acceptance would be a simple 2- or 4-page hack of Mode 13h, allowing a mapper/bank-switching scheme to let software treat each page like 13h, but with additional control-register settings that allow page-flipping, and thus more easily letting software optionally support that with less modification of its rendering code. (Just allow 2 banks to be selected, one designated as the active screen and one as the back buffer currently being copied to; you could potentially have 3 back buffers and a quad-buffered arrangement for smoother average framerate, of course.) So you get the speed and simplicity of Mode 13h without the hassle of single-buffering and the ugliness of screen tearing, either without v-sync or over ISA where there's no time to copy 64 kB in one vblank period. (If you dropped to 60 Hz for 320x200 with a border and more vblank time, you'd still only get 62 kB at the absolute max over 8 MHz 16-bit ISA... so with a fast CPU and tight polling of the vblank status register, you could avoid tearing if you had a border or status bar or such that didn't need to be re-copied every frame... plus square pixels, which is nice, though the letterboxing isn't so nice.)
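A rough check of that "62 kB per vblank" ceiling. Assumptions are mine: zero-wait-state 16-bit transfers at 8 MHz (the absolute best case; real ISA cycles take ~3 BCLKs, so actual rates are far lower), and a 60 Hz VGA-style frame of 525 total lines with 400 active (double-scanned 320x200), leaving 125 border/blanking lines:

```python
# Theoretical ceiling on bytes copyable during vertical blanking over ISA.
isa_bytes_per_sec = 8e6 * 2              # 16 bits per 8 MHz clock
line_time = 1 / (525 * 60)               # ~31.7 us per scanline
blank_time = (525 - 400) * line_time     # border + blanking lines
max_copy = isa_bytes_per_sec * blank_time

print(f"{max_copy / 1024:.0f} KiB per vblank at the theoretical max")
```

Which lands right around the 62 kB figure, confirming a full 64 kB Mode 13h frame just doesn't fit.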


Oh, and I forgot to mention: even without the Z80, you could leave in all the other Master System compatibility bits (I/O, sound, VDP, etc) and just stick the Z80 into the Power Base Converter. (Most or all of the necessary I/O and memory addresses are accessible through the cart slot as-is, so you might not even need to change that.) You could also have ditched the side expansion port in favor of a VRAM expansion port (there's another 64 kB of VRAM space unused by the VDP) and used fewer pins for that as well (the dual 8-bit data ports plus multiplexed address lines and DRAM control signals). On that note, upgrading the PSG to allow it to run at lower clock rates (or just 1.79 MHz, half of normal) would make it much more useful for music, though adding Game Gear style stereo functionality would be nicer.

 

The cart slot is already a much better expansion port than the side port (originally earmarked for a floppy drive before the CD-ROM was pressed into that role), but a cart-slot module based expansion would be far more flexible and efficient... and you probably wouldn't need that redundant 68000. (It's faster, sure, but swap it for a DSP co-pro of some sort and you've got a generally more useful system, especially for 3D.)

 

You could also just put the VRAM expansion lines on the cart slot, potentially on outboard keyed portions (7800/SNES/Jaguar style) to keep PCB costs down on standard carts. (actually, there's a TON of expansion pins that most games don't need and would've been cheaper/better off if segregated from the normally used ROM cart bits ... probably just 48-50 pins needed for most games, including a couple pairs of VCC and GND lines)

 

If you added that second VRAM bank onboard the CD itself, it'd also open up interesting possibilities for other changes, like having the added graphics co-pro ASIC render straight into that VRAM bank, or at least have faster and more flexible DMA than the MD's native VDP (faster VRAM, maxing out DRAM/PSRAM bandwidth, CPU-synched interleaved DMA modes, among other possibilities). Or just include two extra VRAM banks that can be flipped like Sega CD word RAM or 32x framebuffers (or Saturn VDP-1 framebuffers).

 

With 121 colors from 12-bit RGB from the start, the need for video expansion would be less too, but tweaking that a bit more and allowing one or both BG layers to be disabled in favor of linear bitmap framebuffers (with an eye toward software-rendered effects, even without expansion hardware) would be interesting. Plus you wouldn't need to monopolize both VRAM ports if you disabled both tilemap layers and used the serial bus for framebuffer scanning. (You could do two 15-color 4-bit planes, one 121-color 8-bit plane, or two half-res 8-bit planes, and potentially make use of unused color values for shadow/highlight translucency effects, though you could also just use one bit for per-pixel priority to allow objects to be drawn in front of or behind the sprite layer.) Doing a linear bitmap is much simpler than a tilemap, and the system is already using packed-pixel data.

 

Short of that, you could also tweak something the VDP can already do: lowres direct color via mid-screen CRAM DMA updates. The problem with that is it halts the CPU for the entirety of active display, but allowing the tilemap layers to be disabled and DMA'ing from VRAM itself would allow the same effect: a direct 16-bit (unpacked 12-bit) color bitmap at up to 160x200. Plus sprites could potentially still be enabled if this were a feature rather than just an exploit. (Practically speaking, you'd want to limit that to smaller render windows due to VRAM space limits... right up until you added external VRAM as in the above CD unit suggestion.)

 

Note the real-world hack mode using this is limited to 9-bit RGB encoded as unpacked 12-bit (you have 3 nybbles per 16-bit word, just with the 4th bit ignored on all three: the VDP natively works in 12-bit RGB, remember; it just had its CRAM and color DACs truncated to 9 bits to save chip space).
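That unpacked layout is easiest to see as bit packing. The nybble positions here follow the Mega Drive CRAM convention as I understand it (R in bits 1-3, G in 5-7, B in 9-11, with the low bit of each nybble ignored); treat the exact bit order as an assumption:

```python
# 9-bit RGB packed into a 16-bit word, one nybble per component,
# low bit of each nybble unused: ----BBB0 GGG0RRR0.

def pack_cram(r, g, b):            # r, g, b each 0..7
    return (b << 9) | (g << 5) | (r << 1)

def unpack_cram(word):
    return (word >> 1) & 7, (word >> 5) & 7, (word >> 9) & 7

w = pack_cram(5, 2, 7)
print(hex(w))                      # 0xe4a
print(unpack_cram(w))              # (5, 2, 7)
```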

 

 

 

Oh and on that note, I believe the PC Engine was also designed with 12-bit color in mind and the expansion port actually allows for upgrading the RAMDAC, but they didn't use that feature on any of the CD expansion units. (you could've had 481 colors from 4096 12-bit RGB instead of 512 color 9-bit RGB) Oddly enough, the SuperGrafx also retains the 9-bit color limit, in spite of using dual VDPs. (the pixel bus on the expansion slot also provides other information, so an upgraded RAMDAC/mixing chip could potentially add things like translucency effects in hardware)

 

The PC Engine is one console that was pretty close to ideal for its time, but the upgrades didn't push it nearly as far as it could've been ... and marketing was poor in the US and it failed to get a European release at all. (unfortunate given the tiny PC Engine form factor would've probably sold well as-is)

 

They probably should've had at least 2 controller ports on the TG-16 variant, though, and offered 3+ button controllers sooner, then made 6-button ones standard. And they should've either made the SuperGrafx an expansion unit, built it into a second-gen CD-ROM base interface, or gone another direction with video expansion and added a framebuffer bitmap layer instead, with the VDC function probably built into the upgraded RAMDAC chip and piggybacking on existing CRAM entries for the 255 colors (either software-rendered or blitter-accelerated... probably blitter-accelerated).

 

The original 1988 CD-ROM unit could've been simplified and made generally more useful by omitting the ADPCM chip, using a unified block of 128 kB DRAM, and either adding simple 8- or 16-bit DMA sound or just relying on software-driven playback instead. (Given how poor a lot of ADPCM sounded, and how poorly it buffered and streamed for cutscenes, even simple 4-bit or 5-bit LPCM would've been competitive at the same bitrates. You can also do software DPCM/ADPCM decoding pretty easily, and software 8- or 10-bit PCM fairly easily by pairing channels at offset volume levels, and software mixing is far more flexible than a fixed, single ADPCM channel. That was also a huge limitation of the X68000's sound system: a single 8-bit PCM channel would've been far more useful.)
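The paired-channel PCM trick mentioned above can be sketched quickly: play the top 5 bits of a 10-bit sample on one channel at full volume and the low 5 bits on a second channel attenuated by a factor of 32, so the analog mix approximates the full 10-bit value. This is a minimal model of the idea, not any console's actual driver code (function names are made up):

```c
#include <stdint.h>

/* Split a 10-bit unsigned sample across two 5-bit channels:
 * hi5 plays at full volume, lo5 at 1/32 volume. */
static void split_10bit(uint16_t s10, uint8_t *hi5, uint8_t *lo5)
{
    *hi5 = (uint8_t)((s10 >> 5) & 0x1F);  /* coarse channel */
    *lo5 = (uint8_t)( s10       & 0x1F);  /* fine channel   */
}

/* Model of what the analog mixer reconstructs: hi*32 + lo. */
static uint16_t mix_10bit(uint8_t hi5, uint8_t lo5)
{
    return (uint16_t)(((uint16_t)hi5 << 5) | lo5);
}
```

In practice the DAC nonlinearity and volume-step granularity make this a bit lossier than the ideal model, but the principle is just place-value arithmetic.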

 

In any case, no sound upgrade at all would've been fine for the first-gen CD unit, and they could've added something fancier and more generally useful around 1991 as part of the Super CD upgrade. (An entire base interface unit replacement: say, 512 kB DRAM, the VDC/color upgrade, and perhaps one of NEC's embedded DSPs coupled with 16-bit DMA stereo, allowing CPU- or DSP-driven software mixing as well as slaving the DSP as a 16-bit multiply-accumulate copro for assisting with 3D or scaling effects.)
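The kind of software mixing loop a CPU or slaved DSP would run to feed a 16-bit DMA channel is short; this is a hedged C sketch under assumed parameters (4 voices, 8-bit signed sources, 0..256 per-voice volume, a fixed divide-by-4 for headroom), not anyone's real driver:

```c
#include <stdint.h>

#define NVOICES 4

/* Saturate a 32-bit accumulator to the 16-bit output range. */
static int16_t clamp16(int32_t v)
{
    if (v >  32767) return  32767;
    if (v < -32768) return -32768;
    return (int16_t)v;
}

/* Mix one output sample from NVOICES 8-bit voices, each scaled by a
 * 0..256 volume: one multiply-accumulate per voice, then headroom
 * scaling and saturation before handing the sample to DMA. */
static int16_t mix_sample(const int8_t src[NVOICES], const uint16_t vol[NVOICES])
{
    int32_t acc = 0;
    for (int i = 0; i < NVOICES; i++)
        acc += (int32_t)src[i] * (int32_t)vol[i];  /* multiply-accumulate */
    return clamp16(acc / NVOICES);                 /* crude fixed headroom */
}
```

The inner multiply-accumulate is exactly why slaving a DSP as a MAC coprocessor helps: the same unit serves both mixing and 3D/scaling math.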

Share this post


Link to post
Share on other sites

 

I've been impressed with Comanche: Maximum Overkill for a while now. It will run on a 386SX, so if I were somehow transported to 1990 I might try to make a system based around that processor or some sort of clone. Probably would use cartridges, so I could put extra RAM/co-processors in the carts.

 

Examples of a 386 PC running games:

 

Share this post


Link to post
Share on other sites

^^Wow, you get me and my interests from that period quite well, outside of the usual stuff people went nuts over from Apogee, Epic Games, and id Software at the time. Stunts, I still feel, is unmatched in what it pulled off, and Comanche is just an insanely excellent treat. I put so many hours into that, and I even knew a guy who made early PC cockpits/sims and did a sit-down version in a chair, with a throttle assembly and stick, to play that very game. It made a real impression on me how damn immersive such things can be with an already-loved game.

 

I really wish Comanche would pop up on GOG some day, as well as Stunts.

 

Likely just going to have to go the oldwarez (abandonware) route or find one cheap on eBay to ever have that fun again using DOSBox.

Share this post


Link to post
Share on other sites

I'd make the SNES, but I'd build it around a 10 MHz Motorola 68000 and double the memory. The OP said $300 max, so I figure I have an extra $100 in budget over what the system cost when it launched. That would give me 256 KB of main memory, 128 KB of sound memory, a main CPU nearly as fast as the Neo Geo's, plus all the fancy graphics/audio hardware that the SNES had.

 

You should get something approaching arcade-perfect ports for those CPS1 games that came out on the system. We had to wait for the PSX/Saturn to get Street Fighter Collection with those big sprites.

Share this post


Link to post
Share on other sites

NES was fun back in the day, but what we have now should've been had back then. My friend said even 64-bit would be impossible, that it would be like watching TV. But eventually it came about. I would've been the coolest kid if my birthday party in 1990 had a Nintendo 64.

Share this post


Link to post
Share on other sites
