kool kitty89

  1. Looking over the schematics more myself, it looks like the Game SHIFTER chip (4118 ASIC) might not be the entire PANTHER video ASIC, but just the video shifter portion with pixel data registers and palette registers (32 18-bit CRAM entries), not the object processor or line buffers. The Panther dev systems have an additional Toshiba gate array chip onboard, which may be the actual object processor chip with line buffers and a data port for the SHIFTER to access the line RAM and DMA data from. It looks like there are multiple sets of 5 registers with 16-bit inputs accumulating to 5-bit outputs, which seems like it'd be used for bitplane data in 16-bit word chunks like the ST SHIFTER uses, just with 5 rather than 4 bitplanes max. That seems kind of inefficient for a system using chunky pixel data natively: no need for that much chip space when you could just read and latch an entire chunky pixel at once, or 2 pixels packed into a 10-bit word (logically treated as 8-bit packed pixels in a 16-bit word, but it could be done with 10 data lines and 10- and 5-bit data words internally). So it seems like Atari may have been planning to use a version of the SHIFTER with support for 5-bitplane modes at one point and recycled it as the video generator portion of the Panther, which would also mean they went through the trouble of having the object processor ASIC translate packed pixel data to bitplane words internally. (Or, assuming the CPU-addressable and object-processing back buffer end all works on 8-bit chunky pixel data, the translation would be through memory mapping on the output data port the SHIFTER reads from, assuming my inference is at all accurate.)

That in turn implies they could have been intending the 4118 ASIC itself as a standalone video chip in a computer, like a further upgrade to the STe's SHIFTER or a competing design that ended up scrapped (like maybe lacking scroll registers or DMA sound support). Or is the scrolling and fine address control handled on the MMU/MCU end in the STe? Though the schematic drawings are dated well after the STe hardware was designed (or even introduced), and the 4118 chip uses an 84-pin PLCC package like the GST SHIFTER, so it may have been intended as a plug-in, pin-compatible upgrade to the STe. (Or a factory assembly-line replacement, since Atari wasn't particularly keen on actually shipping upgrade parts, but then the STe did have its SHIFTER socketed, so it'd be an easy dealer-level chip upgrade or an end-user one.)

5-bitplane modes with an 18-bit RGB colorspace would've been interesting though, and assuming they were targeting the same 8 MHz dot clock and bus timing arrangements, that would imply a 10 MHz CPU clock and 20 MHz MMU clock. (Though they could've just used 32/5 MHz for the dot clock, so 6.4 MHz, or 6.44 MHz from the STe's NTSC 32.2159 MHz clock; rough numbers for all of this are sketched at the end of this post.) The latter might actually make more sense given Atari seemed resistant to messing with the MMU timing at all, plus 6.44 MHz gives much closer to square pixels for NTSC (and should have less severe composite video artifacts) and would also fill out the screen to the borders, but conversely a 320-pixel-wide display would tend to overflow off screen on TVs (or monitors without generous border adjustment), leaving around 307 pixels visible. That's also assuming a 320x200 resolution was intended and addressing matched that, but if they stuck with a 32 kB screen space, 256x200 would use the exact same 32000 bytes as all the existing ST screen modes and also use the exact same screen and border space.
(Albeit if that's the case, it would've been handy if they also let it switch to 5.37 MHz for a nice full-width TV display and a clean 1.5x colorburst in NTSC for good TV game purposes.) Given Atari's track record with the ST by 1989, yet another 32000-byte screen mode with identical hblank/vblank timing to the existing STe seems pretty likely, and not particularly amazing, but at the same time still better than what they actually put out there in the STe range. (It would've been particularly nice in the MEGA STe ... or a cheaper, cacheless 16 MHz STe+ in the STfm case sort of deal.) That might also explain why the Panther's development got strung along through 1990: the 1989 system needed two chips for video and wasn't cost-effective enough, and they just stuck with the 5-bit wide line RAM throughout. (Granted, upgrading to 8 bits and 256 colors would've made going to a single chip even harder, but then doing away with bitplane translation and doing all the video generation internally with an 8-bit digital video bus output would mate well to a VGA-style 256x18-bit RAMDAC for a more economical 2-chip solution for an 8-bit chunky pixel system, and would've made the Panther far more relevant in 1991.) Gate array logic is also relatively easy to modify and usually fast to have new masks made for testing, so that should've worked out well too. (And the object processor architecture itself already treated line RAM as 8 bits wide logically, so programming wouldn't be any different; palette select was via an 8-bit linear offset for lower color depths, and the RLE pixel mode used an 8-bit color select with 3 bits totally wasted.) I also assume a RAMDAC would be cheaper than repurposing the TT SHIFTER for a 2-chip solution using 256x12-bit color. (Plus 18-bit color is nicer.)

Though come to think of it, the TT SHIFTER works similarly at least: it's got the 64-bit wide bus buffering data and spitting it out faster through a 16-bit port to the SHIFTER, right? ... Sort of like it could've been used on a cheaper 32-bit wide set-up at half the resolution. (Like 640x240x4bpp and 320x240x8bpp, plus 512x200x6 or 400x256x6 on TVs/mid-res monitors, and 320x480x4bpp at VGA res or 640x480x2bpp or 768x400x2bpp, given a 32 MHz dot clock on normal VGA monitor calibration would allow more like 819 pixels in the space 640 usually goes; also potentially 512x400x3bpp and 256x400x6bpp.) I wonder if something like that was originally planned with the earlier 68020 machine developments and the so-called SHIFTER II. (Except it still would've been a fine fit for a 16 MHz 68020 or EC020 machine as a lower/mid-range mainstream market complement to the TT in 1990, or a cheaper version with a 16 MHz 68000 with or without MEGA STe style cache, especially with a 32-bit latch connected to 32-bit RAM.) Actually, a 12 MHz 68020 on a 32-bit bus with the same old 500 ns ST MMU bus slots would work out with 3 wait states and 6 CPU clocks per access slot. Plus 12.5 MHz was the bottom-end 68020 offering, but I think that was only true somewhat early on and the 16 MHz grade replaced it. (And it seems to only exist in the PGA version, not the PLCC or CLCC surface-mount versions.) 16 MHz with 8-cycle slots would be OK-ish (still faster than the Amiga 1200), but much better with optional local fast RAM.
(Though a 16 MHz 68000 or 68010 using 8-cycle slots on a 32-bit bus latch could avoid wait states when doing 32-bit aligned operations and while filling prefetch, without needing additional cache; that's also assuming data gets latched within 125 ns, which might mean changing MMU parameters for faster DRAM access timing even if cycle times stayed at 250 ns. That would be right for 120 ns DRAM, since 187.5 ns cycles would probably be stressing that too far out of spec and need 100 ns chips: the ST's 248~250 ns cycles already pushed parts rated for 260~270 ns cycles, though well within other tolerances and operating temperature/other conditions.) With normal ST MMU timing, you'd be stuck with wait states on a 16 MHz 68000, and misaligned 32-bit accesses (9 or 10 cycles rather than aligning with 8), but that'd still help for slower instructions doing 32-bit read/write operations at least. (Especially stuff like 32-bit arithmetic; really slow things like multiplication and division would get a big boost even if you had just the 16-bit bus latch to work with, for things like 3D calculations, 2D scaling, sound sample scaling or interpolation, DSP-type stuff.)

For that matter, a stripped-down STe with that 16 MHz CPU and the 32-color Game SHIFTER might have been OK as a game console in 1989. (Or, going really cheap and feeding the SHIFTER with a 26.85/26.6 MHz clock so the MMU gets a 13.4/13.3 MHz signal: you end up with slower 298 ns DRAM cycles and can use cheap, old 200 ns DRAM like left-over XE/XEGS DRAMs, 596 ns 68k access slots, and a 6.7 MHz CPU speed or better. Tap the 1/4 clock output from the MMU for 3.355/3.325 MHz and use a PLL for a 3x multiplier to 10.07/9.98 MHz and 6-cycle, 2-wait-state bus slots, or just run straight off the 13.3 MHz clock with 8-cycle bus slots, though you could potentially add a small 16 kB of 200 or 150 ns SRAM/PSRAM to work in without wait states.) And even 128 kB of slow 16-bit DRAM would be a lot more flexible than the 32 kB the Panther was planned to have. (64 kB double-buffered framebuffer, 64 kB to work in; plus ROM could be on the local CPU bus and have fewer wait states, and/or let the blitter interleave ROM access with the CPU, or both: with 300 and 200 ns ROM cycle options interleaving in 600 or 400 ns slots, and the blitter having 6.7 and 10 MHz clock settings while the CPU has 2-wait and no-wait-state settings.) There's also going the other direction and using a faster MMU with slower SHIFTER DMA and more CPU+blitter access slots. (Using a simple 3-way split with 200 ns bus cycles using 120 ns DRAM and a 20 MHz MMU, and/or a smarter MMU on top of that that gives vblank and maybe hblank SHIFTER bus cycles to the blitter and maybe the CPU.) Messing around with timing and bus sharing is also a lot easier on a platform that doesn't have to be backwards compatible. (Plus they could've dropped the LMC1992 and used a more barebones sound amp circuit, but a nice hack would've been a 16-bit mono mode that combined the two 8-bit channels, or the same thing but also dropping the discrete 8-bit DACs for barebones R2R networks that could be chained to 16-bit mono.)
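For anyone who wants to sanity-check the clock and screen-size figures above, here's the rough arithmetic I'm working from, written out as a little Python calculator. The master clock value, the mode sizes, the ST-style 500 ns bus-slot assumption, and all the function names are mine, not anything from Atari documentation:

NTSC_MASTER = 32.215905e6  # STe NTSC master clock, Hz (my assumption)

def fb_bytes(width, height, bpp):
    # framebuffer size in bytes for a packed or bitplane mode
    return width * height * bpp // 8

def dot_clock(divisor):
    # candidate dot clocks derived from the master clock
    return NTSC_MASTER / divisor

def cpu_cycles_per_slot(cpu_mhz, slot_ns=500):
    # CPU clocks that fit in one ST-style MMU access slot
    return cpu_mhz * slot_ns / 1000.0

print(dot_clock(4) / 1e6, dot_clock(5) / 1e6, dot_clock(6) / 1e6)
# -> ~8.05, ~6.44, ~5.37 MHz
print(fb_bytes(320, 200, 5), fb_bytes(256, 200, 5))
# -> 40000 bytes vs the familiar 32000 bytes (same as the existing ST modes)
print(cpu_cycles_per_slot(16), cpu_cycles_per_slot(12))
# -> 8 cycles for a 16 MHz 68000, 6 cycles for a 12 MHz 68020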
  2. https://www.chzsoft.de/asic-web/ There's a lot of neat leaked documentation, schematics, and some commentary on Atari ASIC designs there, some prototypes and some that ended up in the STe, TT, Falcon, and Jaguar. It's also interesting to note the dates on various documents and technical drawings. (Among other things, some of them line up with other info I've seen in old interviews or magazine articles pointing to the STe chipset being designed over a year before it shipped and then delayed for some reason, existing ST sales/demand being high enough is one cited reason; the TT chips also seem to be significantly older than most of the system, but that's also obvious from the 1988 copyright date printed on several chips actually used in production TTs.) I'm not sure if this fits best in the ST/Falcon section or Prototypes, but it seems mostly to be Atari 68k-based computer stuff, so it seemed relevant here.

The most intriguing thing in there for me is the GAME SHIFTER, which seems like it might be the PANTHER object processor ASIC, or at least the video generator portion of it. It's numbered ST-4118, and that's also printed on the socketed 84-pin PLCC seen on the Panther development system. (And that chip has lines connecting to a resistor array containing 3x 12-resistor banks, almost certainly R2R networks for 18-bit RGB output, which the Panther used.) The fact that it has an ST-related part number could be related to the Styra Semiconductor Corporation as noted on that page, but many of the ST-related ASICs also share that designation, so it's not really clear. (It's also not clear if Styra was even making those custom chips for Atari.) It makes me wonder if the Panther graphics chip(s) were intended for an upgraded Atari ST as well. (If it was intended for the computer market, it would've been kind of weird, but neat in some ways: the line buffers could be exploited to line-double 15 kHz lowres framebuffer modes to VGA sync rates using the same memory bandwidth, the object list processing could be used for GUI window acceleration, obviously TOS-support-dependent, and a few other non-game-related things. And the Panther definitely used packed/chunky pixels and not bitplanes, so all the advantages of that as well, though with the disadvantage of using 8bpp for just 32 colors due to the 5-bit line RAM and 32x18-bit CLUT; 4/2/1-bit bitmap objects could use any consecutive set of those 32 colors via an 8-bit offset, like the Jaguar and also like the Lynx's 4-bit palette offset.)

Note: one of the laziest uses for the multiple-framebuffer window and 32-color palette set-up would be an AGA Amiga style 2-playfield mode with a dedicated 15 colors + transparent + BG color, or an even lazier OCS/ECS Amiga simulation of a single 16-color playfield and 15-color sprites. (Far poorer utilization than optimizing graphics around the 8-bit offset feature of the palette, which itself is obviously far more work than if a full 256-entry CLUT and 8-bit line RAM were present, but still a lot better than the STe's color limitations and arguably better than the Mega Drive's 4x15+1 color 9-bit RGB CRAM.) If the Panther object processor actually used its line RAM more like the 7800's MARIA does, then 320x5 bits could be expanded to 800x2 bits or 1600x1 bit, which would be of little use to games but very useful for highres computer graphics.
(more situations where lines could be double-buffered for efficiency/flexibility and potential line doubling, plus even more necessary if the Panther lacked the ability to chain line buffer reads, limiting screen width to a single line buffer's length) Using the ST Blitter would also be fastest on 1-bit bitmaps, so a highres framebuffer mode or drawing to multiple 1-bit object windows. (even more useful assuming the Panther can do opaque 2-color 1-bit bitmaps like the 7800 Kangaroo mode: rectangles without sprite transparency) Also worth noting: the Panther development board also uses the same 32.2159 MHz oscillator frequency as the NTSC STe uses. (good for monitor compatibility and system clock compatibility, not so good if games actually used the same 8.05 MHz dot clock due to the large border and NTSC color artifact issues, though 32.2/5 = 6.44 MHz would be much better and also have nearly square pixels) Presumably, in the ST, the Panther would be using the same 32kB of fast 32-bit SRAM (35 ns chips used on the dev boards, but being used at 31 ns) and could exploit ST RAM like 500 ns 16-bit ROM using the existing SHIFTER DMA cycles, except able to use an entire H-time worth of bandwidth (or multiple H periods) aside from the DMA sound access slots. (disabling interleaved DMA for 250 ns saturated/cycle stealing access would allow higher bandwidth at the expense of CPU performance, unless you also used a 12 or 16 MHz 68000 and give the CPU access to all bus cycles as well for serial use of the full bus bandwidth; better still if the Blitter and DMA chip could use both even and odd bus cycles as well) Though aside from that, some 16-bit wide 120 ns SRAM used as dedicated object data/framebuffer memory could allow neat high bandwidth modes (and potentially CPU interleaved 250 ns modes too) and 256 kB of SRAM on an upper-scale model would allow single-buffered 1600x1200x1bpp 60 Hz or double-buffered 800x600x2bpp 60 Hz (and potentially 50/50 bus interleaved with the CPU/DMA chips). Which would be nice monochrome/grayscale workstation class resolutions in 1989/1990, sort of like a low-cost competitor to NeXT, especially if they included a 56k DSP option. (which has implications for both audio work and 3D graphics)
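To put rough numbers on the MARIA-style line-RAM repacking and the monochrome workstation-mode idea above, here's the back-of-envelope math in Python form; the 320-entry 5-bit line RAM, the 60 Hz refresh, and the helper names are all my own assumptions, not figures from any Panther documentation:

LINE_RAM_BITS = 320 * 5   # 320 entries x 5 bits per line (my assumption)

def repacked_width(bpp):
    # horizontal resolution if the same line RAM is read out at a lower depth
    return LINE_RAM_BITS // bpp

def fb_kb(width, height, bpp, buffers=1):
    # framebuffer footprint in kB
    return width * height * bpp * buffers / 8 / 1024

def fetch_mb_per_s(width, height, bpp, hz=60):
    # raw pixel-fetch bandwidth, ignoring blanking, in MB/s
    return width * height * bpp * hz / 8 / 1e6

print(repacked_width(5), repacked_width(2), repacked_width(1))
# -> 320, 800, 1600 pixels per line
print(fb_kb(1600, 1200, 1), fetch_mb_per_s(1600, 1200, 1))
# -> ~234 kB single-buffered, ~14.4 MB/s of fetch
print(fb_kb(800, 600, 2, buffers=2))
# -> ~234 kB double-buffered, i.e. either fits in 256 kB of SRAM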
  3. So the 5-cycle access timing did work? That'd have some significant implications for blitter texture mapping performance, among other things. (Tests Kskunk ran years ago came to the conclusion that the blitter tops out at 5 cycles per pixel/texel while doing scaling/rotation or texture mapping, so you can go way faster than from DRAM, which takes 11 cycles, but you won't be able to use the full speed of any of the onboard SRAM; and using the GPU scratchpad, or even trying to reserve line buffer space as a texture buffer/cache, has other performance implications too, like blocking GPU access to SRAM and depriving the GPU of what little high-speed RAM it has to work with anyway.) The cheapest SDRAM currently on Mouser (https://www.mouser.com/ProductDetail/Winbond/W9712G6KB-25?qs=sGAEpiMZZMu4dzSHpcLNgnY7VH5pAzisodYhy9hoWwFIaNTUkscd%2Bg%3D%3D a 4-bank interleaved 64 Mbit 16-bit wide DDR2-800 chip, $1.37/100) is specced at 57.5 ns random read/write times at 800 MHz, so well below the 188 ns 5-cycle limit (and the 75 ns 2-cycle one, for that matter), though that's also DRAM controller speed dependent. (The cycle-time arithmetic is sketched at the end of this post.) Still, from the old SDR SDRAM datasheets I've looked at, pretty much everything is going to be way below that limit. The 70 ns FPM DRAM used in many Jaguars (some used 80 ns) can even do better than that with a fast DRAM controller, with the spec limit at 130 ns RC (but the Jaguar does 188 ns), and you can fudge that sometimes too, like Atari did with the 150 ns DRAM in the ST doing 250 ns RC when it was rated for 260. (I'm pretty sure the issue is the Jaguar having only 3-cycle RC as the next step beyond 5 cycles, and 3 is too fast, especially the 1-cycle RAS-to-CAS: you'd need 60 or 50 ns DRAM for that, so you get 3+2 and 2+1 but not 2+2 cycle precharge + RAS-to-CAS settings, and 2+2 would've worked fine with the RAM they used, even the 80 ns stuff; that's the timing Flare used on the 1993 version of their Slipstream ASIC. I suspect they were hoping for 32~33 MHz out of the Jaguar, probably using the same 32.216 MHz NTSC clock from the Panther and STe, but couldn't get the chips fast enough for it.)

If the firmware is updatable on the SDRAM cart, and the SDRAM could be reconfigured so the Jaguar can actually see some or all of it as RAM (treating it as slow SRAM/PSRAM), that'd be super useful. Not as cool as if the 2-cycle mode worked, but still cool. (It'd let you keep the DSP and 68k out of 64-bit DRAM as much as possible, much more than what ROM access already would allow.) I don't think the Jaguar's memory interface would let you map cartridge RAM into the unused DRAM address space (particularly the unpopulated second 4 MB bank), but that'd be neat. The full 24-bit address bus is on the cartridge slot, but it'd be more a matter of what TOM's memory map actually allows across the cartridge port, and whether it only expects DRAM in that address range. (Including the multiplexed DRAM address lines not present on the cartridge slot, sadly ...
otherwise it'd have been relatively easy to add DRAM to games too, though they also could've just run the DRAM connections to another super-cheap edge connector expansion port routed to the back or side.) Texture mapping from alternate memory gives you other boosts to blitter performance, too, like avoiding any page breaks or read/write changes to DRAM access while rendering, which means you can mix and match texture mapping and gouraud-shaded fill operations within the same drawing operation or even mid-line, plus you'd be able to read the Z buffer faster as well (though you have the 1-cycle read/write change overhead for that). One case probably useful for mixing textured and untextured surfaces and Z-buffer reads would be using untextured polygons in the distance (sort of like the low-detail textures used in the distance on platforms that relied heavily on texture cache space and bandwidth, especially if they did bilinear filtering), so instead of blurry, smudgy low-res distance detail drop-off, you'd get just plain, neutral-colored polygons fading into the distance. I assume the 2-cycle ROM mode is also broken for on-cart SRAM and that using the various other I/O timings also isn't possible, otherwise that'd be really useful. (And small-ish chunks of on-cart RAM are sort of a retro-savvy trick too, i.e. relatively common on cartridge-based game consoles with significant RAM limitations.) 150 ns SRAM or 100 ns (maybe 120 ns) PSRAM would max out that 188 ns (5-cycle) cartridge bus timing, too, which would've been on the cheap/low end when the Jaguar was actually out there, but it just had nowhere near the support/backing (first or third party) that the likes of even the 7800 had (with the variety of games using 8 kB, and 2 or 3 using 32 kB SRAMs onboard). The Jaguar CD didn't include any either, though I wonder if John Carmack had considered including some RAM on-cart to help with Quake on the Jaguar, given the very serious performance boost it would provide and that it could potentially even allow a smaller and/or slower ROM to be used, or keep from needing more and faster ROM. (Probably 64 kB as two 8-bit chips or a single 32kx16-bit PSRAM like the Sanyo ones Sega was using in the Mega Drive/Genesis Model 2.) Faster texture mapping also means more bus time free for the other processors, so all sorts of nice gains there. (Including more time/resources to spend doing some sneaky realtime or intermittent hidden decompression loads from ROM, both to DRAM and to update the texture buffer.) Oh right, and the Blitter only does one pixel read/write at a time when texture mapping, so anything wider than 16 bits won't gain you anything. You could also let the 68000 work in that added RAM at times (along with ROM) and reduce the headache it gives to GPU/OPL/blitter accesses to DRAM. (And when the 68k is under higher priority, namely during an interrupt routine, the time it normally spends hogging the bus might actually allow some cycles free for GPU/OPL/Blitter activity ... I'm not sure how well TOM's memory interface is set up for interleaved bus access, i.e. if it at least has bus latches on the various bus masters to allow some parallelism a la ST/Amiga or Flare's own Slipstream from version 2.0 onward.)

Yep, but sometimes those problems can get fixed on the developer end by hardware and not just programming. Add a little bank-switching or memory mapping logic, some RAM, a sound chip here or there ... an I/O chip for extra controller ports.
Or, of course, tons of extra ROM that was totally impractical early in the platform's life (or, as with homebrew, was totally impractical for all of its original life). You just see the 'use more RAM' 'cheating' on home computer platforms (including old IBM compatibles) more often. The irony with the Jaguar is that it had tons of RAM for a game console at the time it was test-marketed in 1993 or launched in '94, but to do that they also made some big trade-offs that didn't pay off, including not expecting the DRAM market to jump drastically in price in mid/late 1993 and then stagnate for 2-3 years after that. (That burned Atari big time back in 1988/89, and it was happening all over again.) The hardware itself is pretty flexible and could've had other configurations/uses (like on a graphics accelerator card); it supported 2 DRAM banks and variable bus sizing, plus a wide range of CPU architectures. It's really optimized for high-bandwidth serial bus usage, which is great for fast, wide, heavily buffered or cached burst operations, but horrible for the few things that aren't suited to that (and horrible for any activity the DSP or 68000 have on the bus, as both have horribly slow bus interfaces on top of being limited to 16 bits width, and the 68k has no cache or local RAM of any kind to work in). Populating the second DRAM bank with 16-bit (or wider) DRAM would've allowed full-speed texture mapping and a nice big chunk of DRAM to pull textures from; having the DSP and 68k work in that (or even on-cart RAM) helps somewhat, but not nearly as much as if they'd been given their own slow bus to work on. (And probably putting the cartridge interface on that slow bus too; it could be just 16 bits wide to cut cost and allow less fragile/finicky connectors to be used, more like what the Mega Drive or ISA cards use ... less like PCI or VLB.) Atari just would've had to cut corners elsewhere to push towards the $150 price point they wanted. (Which, of course, the 2 MB configuration failed to meet until 1995.)

They also didn't work around any of those main bottlenecks with the CD add-on. (Like slapping a CPU-stand-in microcontroller and some RAM on there, or just another 68000 and a small ASIC handling interfacing with a shared block of RAM or something like that; another J-RISC chip as the MCU would've been neat, but made no sense with the low volumes Atari was dealing with by that point; licensing the Philips CD-ROM chipset was enough of a mistake from that standpoint, rather than buying totally off-the-shelf parts.) Or perhaps the more conservatively sensible move would have been using the minimal 512 kB of 64-bit RAM (4x 1 Mbit DRAMs instead of 4x 4 Mbit ones) and sticking an expansion port on the back or side to add more later, especially to hedge their bets for the limited test-market period. You'd want a local CPU bus there too, to hedge bets over the 'is the CPU really acceptable on the shared bus?' question. I'd call that Jack Tramiel-esque conservative, except planning for modular expansion was one thing the ST line had chronic problems with. (Though the 130XE and European 800XE at least had the ECI interface to mostly carry over the XL PBI ... and that's more than any standard model ST ever got, or even the Falcon for that matter.)

Jaguar owners didn't take to soldering RAM onto the motherboard themselves, either. Or installing clip-on piggyback RAM or CPU upgrades from 3rd-party kits. Wait ... actually, an ST/Amiga style 68000 accelerator board with cache would actually be pretty interesting. Or ...
I wonder how well asynchronous overclocks work. (They're common on the Mega Drive, but that's with PSRAM and SRAM ... and VRAM through I/O ports, and ROM + wait states that's only sometimes fast enough to cope.) I've seen full-system Jaguar overclocks with modified BIOS ROMs, but not just async 68k clocks using an external oscillator and halt button + turbo switch, which might work depending on how wait states are dealt with. (Or not even an external oscillator, but tapping the 26.6 MHz signal from the board; a lot more stable than the 10.7/13.4 MHz video clock signal sometimes used on the Mega Drive.) Someone could still potentially fix that with some sort of Jaguar expansion module, maybe as part of some future flash cart project, and it might not be all that expensive to do with some savvy use of components, but then you'd need enough community interest to actually pursue it. (That, and these days the cheap add-on CPU/MCU + I/O + DRAM interface chip would usually be an ARM MCU/SoC, which can be very cheap but also way, way overkill: i.e. you'd want to consciously throttle the CPU performance to be realistic, outside of just wanting to max out the Jaguar's native graphics+sound pushing abilities, sort of like installing an old S3 ViRGE graphics card or Rage II mated to a 1 GHz Pentium 3 system ... or a 1.4 GHz Tualatin server chip.) There are a couple of open-source 68000 FPGA core projects out there too, and that would be more retro-savvy for a Jaguar add-on, but I'm not sure it'd be realistic cost-wise compared to using an embedded ARM chip. (It'd have to be in the sub-$5 range in bulk, given there are a number of pretty versatile ARM cores below that, some not even in the bulk category, and I mean like 900 MHz Cortex-A7 MCUs, 64 kB cache, DDR/2/3 SDRAM interface, flash interface, USB, UART, etc.) Albeit all the cheap modern stuff (including DRAMs) requires I/O rail voltage conversion, though I don't think that's a big deal.

Small-ish asynchronous SRAM chips were the only thing I found in the 5V I/O compatible range that'd be relevant to a Jaguar cart or expansion module project. (Mostly 16-bit wide 1 Mbit and 4 Mbit chips in the $1.50-3.75 range, which might even be relevant to homebrew developers actually considering hard-copy cartridge releases: a 12 ns 64kx16-bit part, that's 128 kB, at $1.45 per 100 units caught my eye while browsing: https://www.mouser.com/ProductDetail/ISSI/IS61C6416AL-12TLI?qs=sGAEpiMZZMt9mBA6nIyysK9MWTGEIBNWCJ0f%2B8johDE%3D ) Having a C-friendly CPU/MCU with at least a modest chunk of RAM to work in (let alone a big chunk of SDRAM) would also open the doors to a number of homebrew developers who've had an interest in the Jaguar but passed it up at some point, in part due to portability of other projects. (I know Chilly Willy was looking at the Jaguar before he decided to go for Sega 32X and Sega CD development work back around 2009, and some of the Game Boy Advance homebrew scene ended up spilling over to the 32X; for that matter, a cheap ARM-based chip in a Jaguar cart would create a good deal of potential overlap there, and with the Nintendo DS homebrew scene, among other ARM platforms: especially quirky, interesting game console hardware platforms.) Anyway, too bad this project is stalled for now, but I totally get crappy life stress issues and unfortunate turns of events.
(It's one of the reasons I've been pretty much absent around my old forum hangouts and extremely sporadic for the last 5-ish years.) I actually thought I'd dropped a brief comment in this thread last year after stumbling on it, but I guess I didn't. I also assume it's way too late to make any comments/suggestions on what might be worth including on the cart given it's at the ready-for-manufacture state, aside maybe from modifications on the firmware end, though if that's flashable, those updates can continue after the boards ship, too. That said, I did think it was at least interesting that there's a fairly cheap 8 MB (4Mx16-bit) PSRAM chip available on Mouser at $1.98/100 units currently, and that seems like a particularly appealing way to go about things. (It's 133 MHz rated, but random read/write speed is 70 ns; that's still plenty fast for the Jaguar, and a small interface ASIC acting as a memory mapper and 32-bit bus latch could treat it as 140 ns 32-bit SRAM, still plenty fast for the Jaguar's 5-, 6-, or 10-cycle ROM modes, though at 16 bits it could use the fastROM 2-cycle mode if not for the bugs ... or depending on what those bugs actually involve.) It's barely more expensive than the cheapest 16 MB SDRAM I could find on there, and interfacing is a lot simpler, but then you still need the bus latch and memory mapping logic along with the SD card interface. I also haven't been watching prices or doing any sort of prototype builds or projects myself that involve those sorts of parts, so I'm not sure how much those prices fluctuate. OTOH, that PSRAM chip might be appealing to other designers out there, or even some homebrew cart game developers. (If you're already capable of using 5V-compatible ROM chips at acceptable prices then a smaller 5V-rated SRAM would make more sense, but if you're dealing with ~1.8 volt I/O translation already, that PSRAM seems potentially intriguing.)
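Since the cycle-count figures in this post are easy to get mixed up, here's the arithmetic I'm using, assuming the stock 26.59 MHz Jaguar system clock; the comparisons in the comments are just my own summary of the discussion above, not datasheet quotes:

JAG_CLOCK_HZ = 26.59e6   # stock Jaguar system clock (my assumption)

def cycles_to_ns(cycles):
    # convert bus cycles at the system clock to nanoseconds
    return cycles / JAG_CLOCK_HZ * 1e9

for c in (2, 3, 5, 6, 10, 11):
    print(c, "cycles =", round(cycles_to_ns(c)), "ns")
# 2 -> ~75 ns, 5 -> ~188 ns, 11 -> ~414 ns; this is where the comparison
# above comes from: 150 ns SRAM or 100-120 ns PSRAM fits easily inside the
# 5-cycle (188 ns) cartridge timing, while nothing on the cart can make
# use of a broken 2-cycle (75 ns) mode anyway.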
  4. I don't see pinout diagrams listed online for the EXT port, but I'm almost positive it's a complete Sega Mega Drive controller port, just with the opposite gender. (So a 6-bit parallel port plus a serial data signal, I think through the 'select' line normally used to handle the button multiplexing.) I don't think much or any homebrew stuff makes use of it either. http://www.sega-16.com/forum/showthread.php?5129-EXT-port-on-model-1 Now that one's pretty easy to find documentation of: http://www.hardwarebook.info/Mega_Drive_Expansion The expansion connector is much more limited than the cartridge slot: it's only got 17 address lines and lacks address 0, so it's a 16-bit word bus with a 256 kB address range. It's missing a bunch of neat, useful expansion signals the cart slot has, and it wouldn't be capable of 32X-style genlock video overlay because of that, let alone the address space constraints. (The address issue is also why the MD end of things can only see portions of the MCD's memory and hardware in 128 kB blocks with bank switching, also making it more difficult to port MD games to the CD.) There'd have been a lot more freedom in interfacing had the normal cart slot been used to interface the Mega CD instead ... though a CD drive module more like the original PC Engine CD-ROM² wouldn't have been limited by that interface, and neither would the floppy disk drive add-on Sega had planned to use that port for initially. Sadly, neither the expansion port nor the cartridge slot exposed the 10 data lines needed for the MD VDP's pixel/color expansion bus (which would allow overlaying video in the digital domain and expanding the color palette/CRAM entries), as opposed to the more comprehensive expansion port the PC Engine has. (More ironic given NEC opted to circumvent that with the SuperGrafx when most/all of those features could've been done efficiently via a modular add-on, including one that simply swapped out the CD drive's base tray; and of course Sega had to jump through more hoops to implement the 32X and couldn't upgrade the native MD graphics output: the SuperGrafx didn't upgrade the color depth, but it easily could have.)

The MCD has its internal mixing/output disabled by default (it outputs just its internal sound through the RCA jacks and mixes audio through the MD), but when the aux audio input is grounded, it disables the sound going into the MD and mixes the MD audio through its internal mixing circuit, ending up with a cleaner, higher quality output. (More so when the MD in question has poorer sound amp circuitry.) I'm not sure how that works with a MD/Genesis II or if you're stuck with the crappy mixing on the MD side. All MCDs/Sega CDs came with an appropriate mixing cable, too, and all model 2 CD units also came with an extension block for the model 1 MD/Genesis. (Also an RF grounding/isolation plate that screws onto the console and locks it into the slots on the CD unit.) Early in the MD's life, Sega also offered stereo PC-style amplified speakers you could plug into the headphone jack on the model 1 to get stereo sound output. http://www.sega-16.com/forum/showthread.php?16473-Does-anybody-own-a-set-of-theese

That's just a type of dithering, with the side effect of being easier to compress the graphics and/or ending up with more solid-looking blending, or sometimes even just looking better without composite video or RF smearing or artifacting.
(The waterfalls in Sonic games using that technique tend to look OK through raw RGB or emulators, but plenty of other cases look rather bad.) Results also vary widely depending on the video encoder used and in NTSC vs PAL. (PAL tends to smear/blend rather than artifact, but NTSC ends up with color errors and 'rainbow banding' in either vertical or diagonal lines across the screen: checkerboard dithering causes diagonal lines, 'strippy' AKA 'column' dithering leaves vertical bands.) The effect is limited or absent in the lower-res H32 mode (256x224), probably due to a combination of less chroma bandwidth required and the dot clock being a less fractional multiple of the NTSC color clock. (H32 is 5.3693 MHz, 1.5x the NTSC 3.5795 MHz burst, but the H40 320-pixel-wide mode is at 6.7116 MHz, or 1.875x chroma.) The Sony CXA1145 encoder commonly used exhibits the rainbow chroma artifacts dramatically in composite video and faintly in S-video. It's also there with the later CXA1645 encoder, but more subtle and almost absent in S-video. The oft-dreaded Samsung KA2195D common to many model 2 Genesis consoles had generally poor and blurry composite output and lacks a luminance signal output (so S-video is impossible to get out of it), but it seems immune to the rainbow artifacts and ends up blurring things so badly-but-evenly that dither bars/stripes blend very solidly and look like actual translucency. (I suspect some games were developed with test/dev systems using that encoder, too, with art optimized around what it was outputting.) It ends up looking more like really blurry PAL-style composite video with horizontal smearing (but without the vertical color blending/accumulation). The composite video filter/smear effect that Kega Fusion uses (or used to use) is very similar to what the Samsung chip outputs. I'm pretty sure that only holds true for the higher-res H40 mode, and H32 has coarse enough pixels to not blend 100%. (Though it can be pretty close, and other artifacts aren't as bad, so games like Virtua Racing and some styles of FMV will look better with it.)

On top of all that there are two other things. One is analog video noise present in the RGB signal that (depending on the console and the TV, and not limited to the Model 1) shows up as bright/dark vertical line variances on each column of pixels. That could be the jailbars some refer to, but similarly there's also jailbar-style dot crawl present in composite video, worse on some TVs than others and generally worst with the Samsung KA2195D video encoder. (This is more often called 'picket fence' artifacting, not jailbars.) I've also seen jailbars show up on some TVs in S-video or through RGB-to-component video adapters, but some TVs and monitors seem to hide or filter it out better. (CrossBow mentioned it's interference/crosstalk from the NTSC chroma signal, which it could be, but I'm not sure that explains the range of scenarios I've seen it crop up in; he also describes it as solid sections of color and not bright/dark shades on the luminance end.) I'm pretty sure the rainbow banding is just NTSC chroma artifacting created by luminance data being misinterpreted as chroma due to the limitations of the video encoders used (typical chroma-luma crosstalk). You have the opposite problem (dot crawl, though on the MD it's typically vertical bars fringing areas of high-contrast color, not swarming dots) with chroma corrupting the luma data, but that's only present in composite video and not in S-video.
(An artifact created by combining the two signals onto one line rather than inherently created during the RGB-to-NTSC encoding process.) Rainbow banding is almost certainly related to high-contrast luminance regions getting misinterpreted as chroma data in the classic NTSC composite video color phase artifacting sense (the same thing that gives you Apple II, CGA, etc. artifact colors), and because the dot clock is a somewhat ugly multiple of the NTSC chroma clock (15/8), the artifacts aren't solid/consistent at all: they oscillate across the screen and fluctuate/flash/shift when scrolling occurs. (The dot clock/colorburst ratios are worked out in the sketch after this post.) The Sony video encoders seem to deal with the lower-res 3/2x chroma dot clock (mode H32) better and don't seem to have that problem, and as I mentioned above, the effect seems totally absent on the otherwise rather awful Samsung encoder. (Though that one has awful picket fence or jailbar-style dot crawl artifacts.)

Hmm http://www.sega-16.com/forum/showthread.php?24968-MD-Genesis-Jailbars-Revisited Skimming that thread, it seems to be an artifact of the RGB encoder itself, but not quite like I thought. It's an issue of the chroma clock (color carrier) signal being used by the encoder at all, and not a matter of interference between signals through traces on the board, but inside the video encoder chip. (Just disconnecting/lifting that pin on the encoder will basically eliminate the jailbar artifacts, but also give you crisp, luma-only black and white composite/S-video ... like on the Amiga 500 TV out ... or if you wire a C64 or Atari 8-bit video cable wrong and put luma on the composite line. Kind of ironic it was the Amiga and not the Atari ST that did that, especially since the ST is way, way less NTSC color encoder friendly with that 8/16 MHz dot clock, plus Commodore already had those chroma/luma S-video C64 monitors that should've suited the Amiga fine, so it's kind of weird ... granted, it's also weird that Atari put separate Y/C output on their 8-bit computers yet didn't offer a Y/C monitor to exploit it ... or even crisp/clear grayscale composite video cables using the luma signal alone for nicer/clearer text and sharp grayscale graphics more suited to the 'serious' computer scene ... or just people who wanted less eyestrain.) Err wait, I guess it would be pretty easy to just wire the ST's RGB+sync into a weighted luminance signal via an adapter cable, but I don't think I've seen one of those. (Would've been handy for the first-gen 520ST without composite/RF output, and on later models if you didn't have an RGB monitor but cared more about picture/text quality than color.) Then again, this is coming from someone who had a used grayscale VGA monitor on a homebrew multimedia/family PC as a kid in the early 90s. (Which worked well enough until we got some games using red/cyan stereo 3D glasses effects ... )
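For reference, the dot-clock-to-colorburst ratios I keep quoting work out like this, assuming the usual NTSC Mega Drive master clock and the commonly cited H32/H40 divisors (values and naming are mine, just for illustration):

NTSC_BURST = 3.579545e6      # NTSC color subcarrier, Hz
MD_MASTER = 53.693175e6      # NTSC Mega Drive master clock, Hz (assumed)

for name, div in (("H32 (256 wide)", 10), ("H40 (320 wide)", 8)):
    dot = MD_MASTER / div
    print(name, round(dot / 1e6, 4), "MHz,", round(dot / NTSC_BURST, 3), "x burst")
# H32 -> 5.3693 MHz, a clean 1.5x the burst
# H40 -> 6.7116 MHz, the messy 1.875x (15/8) ratio behind the rainbow banding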
  5. Instead of 2 look-ups, I think you could just use a single 256x2-byte table that takes an 8-bit PCM value (whatever format you want to use: unsigned, signed 2's complement, sign-magnitude, etc.) and spits out 2 nybbles of POKEY volume data unpacked into 2 bytes. Then it should just be a matter of 2 writes to POKEY volume registers. (A sketch of that table is at the end of this post.) That sort of look-up table system might be faster or friendlier on a processor with more registers to work with (load both 8-bit values into register space), but it's still probably the fastest option. You could also buffer some length of those volume byte pairs in zero page and keep overhead during the interrupt routine to a minimum, assuming you're using interrupts and not cycle-counted code for the playback routine. (You'd then probably have 2 sets of buffers: one normal, linear 8-bit PCM stream mixing buffer, for adding channels together and scaling note frequencies, etc., and then a second buffer made up of converted bytes.) Doing the same thing on a single Covox-style DAC would still be faster and a bit simpler, though. I'm not sure how 4 DAC ports would compare. (With that you've got less overhead on the mixing end of things, but you have 4 PCM streams to manage during the playback routine, and even more work than that if you're also doing frequency scaling during that portion of playback and not just reading from 4 PCM mixing buffers at a constant DAC sample rate.) Albeit with the single 8-bit DAC you can also use interleave/multiplex mixing to allow full 8-bit samples to be played back at the expense of oversampling (and loss of effective playback rate), i.e. a 32 kHz playback routine could be used to interleave-mix 4 8 kHz 8-bit PCM streams.

That latter method of multiplex (or interleave) mixing would also favor the straight PWM technique, since the oversampling would shift the squeal artifacting well out of the audible range (and it would tend to be filtered out more by the sound circuits, either by intentional filters or just by exceeding the bandwidth of the existing circuits/amps/etc.). With straight PWM you also just need 1 look-up from a simple 256x1-byte table, but the playback routine is still more complex. (Using that byte to set a POKEY timer to count down the desired pulse duration/width, and if using interrupts and not precise code and/or polling timer status, you'd need 2x as many interrupts as a single Covox channel.) The hi/low 2-POKEY-channel method also still needs 2x the interrupts (using 2 POKEY timers), like straight PWM, but it is simpler in using the same pulse-width setting at all times and not making PWM part of the sample-setting (or look-up) routine, instead making it part of an 8-bit DAC emulator routine. PWM could potentially be used for better than 8-bit resolution, but using POKEY timers would make that tricky (at least in 8-bit timer mode), though there might be some other work-arounds. (Like taking the POKEY channel being used for 256-step PWM and also modulating its 16 volume levels, so you get a linear 12-bit output.) Given that'd need 12-bit math for adding channels, it's also probably not that useful compared to just adding to saturation at 8 bits, especially with preprocessed samples (so no need to clamp at 8 bits and prevent overflow errors), though it'd be a neat trick nevertheless. (Like if POKEY had been included in the Atari ST.) Except you could still stick with 8-bit precision and use PWM on a channel while simultaneously using 4-bit volume modulation.
You'd then just need 16 linear (or rather 15 linear, non-zero) pulse-width steps to complement the 16 volume levels. Too bad POKEY only has the random pulse-wave and square wave outputs; if it had variable pulse width, you could use actual PDM (with tons of oversampling). Though using the random pulse waveform might be interesting in trying to hide or dither the PWM squeal noise. (You might need to drop the channel frequency down close to or into the audible range to do that, and it might just make things worse, but it might shape the noise into more of a hiss.) Or if nothing else, doing some sort of sample playback through a random-pulse wave tone output might make for weird/interesting distortion sound effects. The SID chip has 12-bit precision duty cycle control over its pulse wave channel, so the C64 could potentially use that for up to 12-bit DAC output (though simple 8-bit would be more useful), and while you could also use the lowpass filter to hide the squeal, you'd effectively be doing PDM and not PWM by having the oscillator set to max frequency and modulating the pulse duty cycle. (... And then this just feeds into the what-if of the Atari ST having been a Commodore product and ending up with one or more SID chips for its sound output; not that they also wouldn't go well in an Amiga ... or 2 SIDs with PAULA's L/R outputs wired through each, with SID filters and all.)

I think PDM is also a misnomer for the technique I originally suggested in the other thread: it's really just amplitude modulation, and even straight 4-bit PCM on POKEY is actually a special case of pulse-amplitude modulation (since you're actually volume-modulating a square wave signal, not just a line voltage) ... which is also kind of funny given PAM could refer to the Atari 5200 as well. And since my suggested method (with the paired channels) uses a fixed pulse width on the low channel, it's still really just pulse-amplitude modulation. But that other method I just mentioned, combining a variable pulse width with variable pulse amplitude, would be some sort of hybrid pulse-width/pulse-amplitude modulation. (And unless I'm mistaken, that's not one of the earlier PWM techniques used in the demo ROM; I had the impression those were straight 1-bit on/off pulses with all the volume/amplitude data expressed via the pulse width alone.) You'd also still have the problem of audible squeal even if you did use the volume modulation, but maybe not as bad. (Since the upper 4 bits would be handled by channel volume, 0 would be silent with no squeal, and low/quiet samples would have proportionally quiet squeal, so you'd only get the full-volume pulse-carrier tone/squeal for samples in the 240-255 range, and for really loud samples the squeal also becomes less obvious and annoying.) With the 2-channel additive 'PDM' method, the noise is further reduced, as the low-duty-cycle pulse channel is effectively at half the linear amplitude when it's on, so what squeal there is from that is going to be half as loud in the worst case and much quieter than that on average. In fact, you could reduce that to practically zero if you used 2 channels, but instead of setting the volume of both, the low channel gets modulated between volume settings 1 and 0 and you use PWM to achieve the 16 effective amplitude steps for the low 4 bits of pseudo-DAC output (modeled numerically at the end of this post). Edit: re-reading my original post in that other thread, it seems I already suggested the single-channel pulse-width+amplitude modulation route as well.
I'll have to check out what methods people have actually been working with. (OTOH I didn't specifically mention setting the second channel to volume level 1 and using very quiet PWM to achieve the low 4 bits of an 8-bit linear amplitude output.) But given the actual implementation of the 2-channel method still ends up monopolizing 3 POKEY channels due to the way the timers have to be configured, the method of using a single channel for the sound output, with PWM only used to provide the low 4 bits, might be appealing. (I think you could get away with using 2 channels for that and leaving 2 free to use for other things; you'd need one channel with timer interrupts to set the sample rate and to provide the actual sample sound output, then you'd need another channel with a timer dedicated to providing the pulse-width timing.) You could even use a third POKEY timer to control a second pulse-width parameter and use that on the fourth channel to get 2-channel 8-bit DAC output, but that's more work and 3 interrupts instead of 2, plus twice as many look-ups compared to just adding channels together and outputting them as a single stream. It's a shame POKEY doesn't support a one-shot timer pulse mode like the MOS 6522 VIA (and maybe the RIOT) does, since that effectively does the work for you. (And you'd have 2 16-bit interval timers to use, so one could set the sample rate and the other could provide the variable-width pulse output ... say, toggling GTIA's beeper output routed through a lowpass filter to cut the squeal.) Atari already had POKEY covering the serial port end and the VIA had that bug to work around anyway, so it's obvious why they went with the cheaper PIA. (And the RIOT would've presumably been more expensive, too.)
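A quick sketch of the 256-entry unpacking table from the top of this post, written in Python just to show the idea (the real playback routine would obviously be 6502 code writing the two bytes to AUDC registers). It assumes unsigned 8-bit input, and it assumes the 'fine' channel gets attenuated to 1/16 weight downstream, whether by resistor mixing or by the quiet-PWM trick discussed above; the names are mine:

VOLUME_ONLY = 0x10   # POKEY AUDCx "volume only" mode bit

def build_table():
    # 8-bit sample -> (coarse AUDC byte, fine AUDC byte)
    table = []
    for sample in range(256):          # unsigned 8-bit PCM in
        hi = sample >> 4               # top nybble -> full-weight channel
        lo = sample & 0x0F             # low nybble -> 1/16-weight channel
        table.append((VOLUME_ONLY | hi, VOLUME_ONLY | lo))
    return table

TABLE = build_table()
# Per-sample work is then just two indexed loads and two AUDC writes,
# e.g. sample $A7 -> $1A on the coarse channel, $17 on the fine one.
assert TABLE[0xA7] == (0x1A, 0x17)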
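And a numeric model of the 'volume + quiet PWM' variant, just to show that the averaged output really is linear across the 8-bit range; this is a model of the idea rather than playback code, and it assumes the PWM carrier is fast enough to be filtered out or sit above audibility:

def effective_level(sample):
    # coarse channel: plain 4-bit volume; fine channel: volume 1 with a
    # duty cycle of lo/16, so its *average* contribution is lo/16 of a step
    hi = sample >> 4
    lo = sample & 0x0F
    return hi + lo / 16.0

levels = [effective_level(s) for s in range(256)]
# every step is exactly 1/16 of a volume unit, i.e. a linear 8-bit ramp
assert all(abs((levels[s + 1] - levels[s]) - 1 / 16.0) < 1e-9 for s in range(255))
print(effective_level(0x00), effective_level(0x80), effective_level(0xFF))
# -> 0.0, 8.0, 15.9375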
  6. A command cache (or scratchpad ... or just a prefetch queue) for the blitter probably would've been the more elegant and practical solution, yes, but I was mostly just summarizing a comment kskunk made on the issue years back. That said, wouldn't the rasterization situation also be different on the Jaguar II, given Oberon's blitter had a trapezoid drawing function, so you only need to build lists of trapezoid segments to build triangles (or quads) rather than rasterizing line by line? (Incidentally, I believe several of the early PC 3D accelerators worked on trapezoids internally for polygon drawing, and some documented it as such: it's in the S3 ViRGE manual at least, described in the section on 2D polygon rendering depicting the trapezoidal segments used for arbitrary polygon fill operations. A rough sketch of that triangle-to-trapezoid split is at the end of this post.) And on the texture mapping bottleneck, John Carmack's suggestion (in the context of something cheap and simple that they should have already included) was a 64-bit destination buffer in the texture mapping pipeline. (Though given how slow the texture mapping unit is, per kskunk's 5-cycle peak test results, that wouldn't help all that much for raw speed, no more than populating the second DRAM bank the system already supported ... so it's somewhat moot there, though I'm also unsure Carmack was aware of the existing support in the DRAM controller or the 5-cycle bottleneck of the blitter.) kskunk and several others (Gorf, Crazyace, I think maybe Atari Owl) also went over the problems of using GPU SRAM as a texture cache, particularly how it kills GPU performance if used heavily; however, kskunk's later tests seem to point to use of the line buffers as texture RAM being a lot more useful, possibly also useful as a texture render buffer. (The latter wouldn't be faster per se, but rendering from line buffer RAM into line buffer RAM, then blitting to the framebuffer in 64-bit, phrase-aligned chunks, would greatly reduce time on the bus and mean that much less contention.) Honestly, for a game like Quake with the lighting model used, dropping textures entirely at a certain distance (Z-sorted as an extension of the existing ray-casting visibility system a la PC Quake) would've made a ton of sense to minimize texture mapping overhead. (PC Quake was already heavily optimized to minimize actual rendering time, with lots of computation or table-based optimizations to spend as little bandwidth as possible drawing to the framebuffer, and an engine like that would adapt well to the Jaguar's bottlenecks, albeit trading more of the tables for raw realtime computation and trading the Pentium FPU pipeline-specific tricks for other things.)

The SVP-chip version of Virtua Racing used a 16-bit DSP to handle the 3D math, yes, though I think it also assisted with drawing the polygons, using the 128 kB of DRAM the cart included as a framebuffer as well as work RAM (or local memory for paging things in and out of local DSP memory). It's a DSP though, not a CPU or MCU (so unlike the 32X, or even the primitive 16-bit RISC MPU in the Super FX chip) and not good for all that much else; it's not flexible general-purpose processing like the Jaguar's GPU and DSP, but it's good for 3D and probably OK for simple pixel/line/block fill operations. (As a DSP it also should have done well as a sound processor, but Sega didn't use it as such ...
no DACs or audio input lines are connected on that cart.) Unlike the Jaguar, but like the 32X, you did have multiple buses to work with, and the local DRAM could be flipped on and off the 68k bus for copying the render buffer into VRAM. Now, the MD's 68k was still fast enough to do some software rendering on its own, and having a much simpler DSP co-processor that simply handled the vertex math and left all the rasterization to the 68k probably would've worked better than Super FX-driven 3D on the SNES (or been competitive, at least), but there are no other examples of co-processor or add-on chips used on the Mega Drive at all, unless you count the Mega CD. (And unfortunately, unlike Mode 7 on the SNES, the fast multiplier unit that must be embedded in the Sega CD Gate Array for the scaling/rotation function isn't directly accessible to either CPU, otherwise it'd be handy for polygonal 3D when the scaling/rotation function wasn't in use. It'd be really handy if they'd still let the Gate Array's blitter functionality work for simple copy/fill operations in proper nibble-precise tilemap pixel organization, but ... nope. Honestly, with the amount of RAM it had along with that sort of hardware assist, I'd think it would've handled Virtua Racing well enough, and probably a solid port of X-Wing, at least the more limited floppy disk version for PC, with flat shading, 1 MB RAM compliance and such.)

The Jag was way more powerful than any of that, though ... but yes, being able to interleave 68k access to some extent would give the advantages of a separate/local bus as on the MD. I'm not sure how the timing of the bus latches in the Jaguar works or if interleaving was really a major consideration, but it certainly had been when Flare designed the Slipstream and included a 16-bit bus latch to minimize the time the 8086 spent on the bus (in that case interleaving on PSRAM cycles with the CPU working in SRAM or DRAM on the same bus: the video processor would only work in PSRAM, so the bus cycle interleaving was based around the 12-ish MHz video DMA clock, fetching a 16-bit word once every 4 of those 12 MHz clocks). The Slipstream 4 (1993-vintage hardware done in parallel with the Jaguar) switched to dual DRAM banks with page-mode support and up to 32-bit bus width, but the intended 12 MHz 386SX should still have interleaved with video DMA cycles for most video modes provided it worked in the separate DRAM bank. (The blitter and DSP DMA may have remained serial-only with the CPU, though.) The 68k also doesn't need to hit the bus all that often to stay nearly at full speed, so a feature for consistent, periodic wait states for slow-ish interleaved bus access would've made it a far better investment in the system as a whole; albeit using a 20 MHz rated 68k and running it at 3/4 the system clock (19.95 MHz) would've probably been more useful as well.
(You don't need the 68k bus cycles to align with the system bus cycles anyway, not like ST/Amiga style 4-T-state interleaving, so the 1/2 system clock rate wasn't all that useful other than just being cheap/simple to divide.) That, and they probably spent way too much on Jerry given its limited use. (It's a decent audio DSP, but too bottlenecked with its very slow bus cycle times and some other bugs, and the need to page code to local RAM to work, thus nixing it as a stand-in CPU, at least when coupled with the slow bus connection, as opposed to paging code modules to the GPU; the DSP even made a poor geometry processor, as writes to main RAM were twice as slow as reads, slower than 68k writes in fact: 12 cycles for a 16-bit word, while reads were less crippled at a hard-coded 6 cycles.) The DSP's slow bus cycles would have made it reasonable for interleaved access on a separate memory bank (the second DRAM bank and ROM bank), but otherwise it's pretty crippled and a serious bus hog. (Compared to just including a rudimentary DMA sound/UART ASIC ... or possibly including an embedded, easily licensed low-cost MPU like a 65C02 or 65816 a la Lynx ... as a sound processor, or maybe in lieu of the 68k, though that'd make coding in C a lot tougher ... for lazy C or 68k assembly source ports, for what that's worth.)

Well, that, or the Flare II team could have ditched the 68k and JERRY some time early in 1992 in favor of a nice, flexible microcontroller. Hitachi's SH-1 had just been released and would've been really appealing as a low-cost CPU+sound processor combo, but aside from happening to catch that brand-new design being released, there was also AMD's embedded range of the 29000 series, particularly the bottom end of the microcontrollers in that family, the Am29205. (Both that and the SH-1 had onboard DRAM controllers, and both also used 16-bit data buses for lower cost and pin count, so it also would have been fairly simple to include a local, dedicated bus to work in rather than sharing the graphics bus ... Flare could've just dropped to a conventional main bus + GPU bus layout and also used a simpler, 16-bit cart slot and copied/DMAed data to the Jaguar graphics bus through I/O ports rather than sharing everything ... plus, no fragile and more expensive 32-bit MCA/VESA/PCI-ish connector for the cart slot to deal with, just something close to SNES/Mega Drive or ISA slot pins.) Though, that said, it also shouldn't have been too tough for Flare to include a (slow) 16-bit DRAM controller and basic sound hardware (possibly the 16-bit Flare DSP or just DMA sound) and a UART on a 16-bit bus controller ASIC to complement a 13.3 or 19.95 MHz 68000. (A unified bus reduces cost, but swapping JERRY for a much smaller, simpler, and slower ASIC, possibly with a lower pin count, also saves cost and would have just made more sense ...
it also could have used gate array logic, slower, lower density, but for a much smaller and simpler custom chip it would have the advantages of being much easier to prototype, faster to bug-fix, and cheaper to start up production than the standard cell ASICs used for TOM and JERRY) Oh, also note, the 3DO was horribly bottlenecked when it came to anything short of solid-shaded polygons, as heavy use of textures (3D or 2D) was mutually exclusive with CPU operation since main RAM was texture/sprite/etc RAM, plus it used forward texture mapping (like the Saturn, and Lynx for that matter, though just sprite-drawing) where texels are read out a line at a time and drawn over multiple times to the screen if down-scaled or folded, reducing fillrate further and also breaking translucent blending and Gouraud shading. (or corrupting both due to drawing folded pixels multiple times and warping the shading gradient). Plus you had strict library-level coding on the 3DO without the ability to clean up bad compiler output with some hand-tuned ARM assembly. If the Jaguar had a bit of smart bus interleaving on multiple memory banks, the 68k might not have fared that badly next to 3DO games. (albeit the CD-ROM mass storage issue was a factor for actual software development ... and PC games that the Jaguar really would've been well suited for: especially various flight/combat sims using shaded 3D or limited texture mapping ... and heavy keyboard commands that made the Jag-pad really appealing)

They weren't greedy, they were poorly managed (I blame Sam Tramiel mostly) and extremely desperate, plus it was also just poor timing as DRAM prices stagnated from 1993-1995 and finally dropped again just after the Jaguar was discontinued. (by fall of 1996, the Jaguar Duo could probably have been a genuine low-cost/budget range alternative to the PSX and Saturn ... and the idea of including a RAM expansion cart as standard and offering it at a very low price to existing users would all have been feasible) But in 1993, Atari was desperate; they had a big lawsuit over old patents pending with Sega (which would create a windfall in 1994) but in the meantime they were struggling, downsized to a skeleton of a company, and made the decision to discontinue their computers, somewhat marginalize the Lynx, and put a ton of effort into a Jaguar propaganda campaign to drum up investor cashflow, and it worked: it scared the crap out of Sega (at least the Japanese executives) and got Motorola and IBM onboard for manufacturing along with sufficient investment backing to bring the thing to market. Still, it was wholly mismanaged and the UK/European market entrance was both late and particularly poorly supported ... all really bad decisions on top of cancelling the Falcon. (cancelling the Falcon in the US and continuing to market computers only to the smaller, easier to support/market to UK and European market, or at least the UK, France, and Germany, would've been more reasonable ... more so if they'd worked the Jaguar chipset, or TOM specifically, into a second-gen Falcon project, like as part of the Falcon040 or a lower-cost '030 counterpart) The further irony, of course, is that CBM fell out of the computer market, leaving a void for the Wintel invasion to finally take the UK/Europe at a time Atari might have continued to compete (especially with the shift towards open-source GNU OS extension development with MiNT/MultiTOS), plus Sega ended up dropping the Game Gear, leaving even less competition for the Lynx. 
(plus the Game Boy Color failed to even match the Lynx's hardware and continued cost/size/power reduction left tons of room for Lynx updates to compete) Atari's lack of a significant mainstream game console from 1990-1993 (or given the Jaguar remained niche ... from 1990 onward) was a big gap on top of the ST and 7800 sales struggling somewhat in '89, and Sam Tramiel's management ... or perhaps more specifically: Jack's retirement as CEO and Mike Katz's leaving as president of the Games division seriously crippled the whole operation. Katz thought failing to agree on terms with Sega for Mega Drive distribution was a mistake, but even that aside, I can't imagine he wouldn't have guided things better after that with the Panther development, Lynx release, Panther cancellation and possible short-term alternatives, etc. (they needed something reasonable to launch in 1990-91, maybe 92 ... and a 'fixed' Panther without so many shortcomings or a derivative of one of the many incremental developments of Flare's Slipstream ... or a much more conservative Jaguar, which would also fall into the 'fixed' Panther configuration: ie support for FPM DRAM operation, enough RAM for a decent-sized framebuffer, addition of a blitter, integrated sound hardware, and some intention for polygonal 3D, but without the custom RISC architecture being implemented: the Flare DSP was enough for geometry co-processing along with CPU-assisted triangle set-up and blitter line fills)

Anyway, I wouldn't blame greed as one of the Jaguar's main problems, or Atari's ... though lack of honesty might have been a major problem along with poor management and negotiation skills on Sam's part. (dishonesty with developers during the Jag's lifespan seemed to be one of the problems ... dishonesty with investors was too, which was forgivable to some extent in 1993 with Atari being on the verge of collapse, but much less so after they got market recognition and just seemed to go ... weird or incompetent with what added resources they were afforded) The late introduction of the CD add-on, DRAM prices keeping the base system price point high, and Sony's splash in the market all didn't help, of course. Actually, with all that in mind, Atari probably made a bad bet dropping computers in favor of a new game console ... the Jag chipset (or just TOM) might have been more successful in the Falcon series than it ended up being as a console. (as it was, I think the Jaguar's sluggish sales didn't compare too well to the Falcon's sales for the short time it was on the market) Plus the DRAM cost overhead was a lot more justified in a computer system, and floppy disk software was the norm, so no cart vs CD headache to decide over (or be the only console on the market using floppy disks ... especially in 1993/94), plus ... TOM would've been a lot more potent alongside a 68030 on a dedicated bus (even the 16 MHz 030 on the 16-bit bus of the Falcon 030), and that's not just hindsight ... though obviously pure spec-fic fantasy. (well ... I can't help but imagine Jack Tramiel would've put more interest in working a potent new graphics/multimedia processor into a low-cost, mass market home computer rather than 'just a games machine' ... but ... ) Oh, and, back on the topic of real-world relevant stuff: I'd missed out on the new (or last couple years of) Jaguar Flash cart development project (and thread on that), so my comments on a RAM cart earlier in the thread are a bit moot there, as such a RAM cart is in the works, just not Jag-CD oriented. 
16 MB of SDRAM acting as cart ROM or RAM is pretty neat, though I'm not sure whether the full 16 MB or just 6 MB (no bank switching) is planned to be implemented; either way the project looks super neat. (and totally relevant to some neat workarounds for homebrew programmers to exploit ... provided folks are interested in that, and interested in digital distribution of freeware or online store style software distribution ... or crowdfunded early access sponsored stuff with free distribution later on ... that and just freely distributed tech demos and hacks, as are common for homebrew on a bunch of old game systems and several computers, even 'obscure' or 'failed' ones like the 32x)
  7. Yes, sort of: http://www.konixmultisystem.co.uk/index.php?id=interviews&content=martin Martin Brennan was brought in to consult on the Panther project in 1989 (the production-ready 8086 version of the Slipstream ASIC was completed by then). John Mathieson would join the Flare II (Jaguar) project later on, around 1991 I believe, while Ben Cheese (the DSP and sound guy from Flare I) would move on to Argonaut and design the Super FX GSU, then help found the Argonaut RISC Core spin-off company. (note the GSU-1 was not a DSP like in the Slipstream ASIC, but a fast little 16-bit RISC CPU: 16 16-bit registers, 16-bit address bus, 8-bit external data bus, and the multiply-accumulate performance was poorer than the Flare DSP's: 1 cycle for an 8x8=>16-bit multiply and 4 cycles for 16x16, vs 1-cycle 16x16-bit on the DSP, but as a CPU it was much more flexible and could run most of the game engine on its own, plus do the polygon drawing operations in its 32kx8-bit SRAM chip, and was optimized for bitplane and tile conversions). http://www.konixmultisystem.co.uk/index.php?id=downloads (see the Slipstream 1.06 documents for the 1989 8086 production version)

Anyway, Brennan was brought in on the Panther project around then (see the interview linked above). Meanwhile Konix was having trouble and LucasFilm/Arts decided not to go through with their prior considerations with licensing the Slipstream chipset for the US market. (Konix had a non-exclusive license, so Flare could have sold it to anyone else on varying terms, sort of like the Amiga chipset prior to the 1984 CBM buyout debacle) They also continued developing the Slipstream in parallel with the Jaguar and expanded it in various steps up to a 32-bit data, 24-bit address, 2-bank DRAM based system with a somewhat Jaguar-like Blitter, 25-ish MHz suggested clock rate, support for a variety of CPUs (though a 12 MHz 386SX was the working model in 1994), a 25 MHz DSP, and a CD-ROM controller/interface. It also worked in 15/16-bit RGB rather than CRY color and had the whole system contained in one ASIC. (DSP+blitter+VDC+UART+CD-controller) Though the CD-ROM interface was apparently buggy at the time. See: "Slipstream Rev4 Reference Guide v3.3" http://www.konixmultisystem.co.uk/index.php?id=downloads

Now, what I wonder about is why Martin Brennan moved forward with the Panther project while not pitching the Slipstream to Atari. (or maybe he did but made no mention of it in the interview) It was a nice little flexible, reasonably potent system, though it had its share of limitations compared to the Mega Drive (already on the market, and Atari Corp themselves had reviewed the hardware in 1988 and decided not to take Sega's licensing/distribution terms for North America: Mike Katz had wanted to, Jack Tramiel and Dave Rosen couldn't agree on favorable terms, plus they'd still be contending for the UK/European market) Plus it was ready-made and ready for mass production, and the chips were made on Gate Array logic so should have been fairly adaptable to second sourcing to whatever vendors Atari had the best deals with. On top of that it had software in development already on the Konix end, a bunch of UK developers familiar with the hardware and its quirks, and had a somewhat home computer or PC style architecture in general that would lend itself well to computer game ports (plus actual IBM compatible ports using 8086 Real Mode ... 
ugly, yes, but for PC games already coded for such, or for 286 in 640k or less of RAM, it would be a smoother transition for assembly-language coded games) It relied on PSRAM to get good bandwidth and do some interleaving, though it had DRAM for the blitter and CPU to optionally use (up to 256 kB PSRAM and 512kB DRAM) and was fastest at rendering 256 color graphics. (using an 8bpp chunky framebuffer at 256x200 or up to 256x256 display, and allowing more compact 16 color 4-bit sprites/objects to be used via a mask register; 16-color framebuffer modes were slower to render to as the blitter had to do reads before writes to work on nybbles ... see the little sketch at the end of this post for why) It also had a fast DSP and fast line-fill operations useful for doing flat shaded polygons or a mix of other effects (including scaling or even scaling/rotation texture mapping type effects) in realtime. (though sound engines doing DSP-intensive synthesis routines would make that hard, ones just using sample based sound like Amiga MOD or such would use very little DSP time at all, especially for 4-channel music and a couple of SFX channels, even if doing ADPCM decoding as well) The 6 MHz 8086 was slow, but relatively cheap. However, it would've been a bit painful to adapt to ROM carts due to the 1MB address limit (and less than 256 kB was reserved for ROM in the Slipstream). OTOH the system was intended to use 880 kB floppy disks instead, and a DSDD floppy drive would add to the base unit cost, but make it even more appealing to computer game developers (and a lower risk all around than manufacturing masked ROMs ... something impossible for some smaller devs and publishers at the time). Plus you could make big home computer/PC style multi-disk games with lots of multimedia features. Given Atari's focus on the home computer game license side (during the 7800 era) on top of its library of ST games, plus its existing supply chain for DSDD floppy drives for the ST line, it would seem an appealing option. (plus the proprietary 880k format and some level of encryption would be appealing for copy protection, and also avoid the need for funky floppy disk DRM schemes typical of the era)

The Panther OTOH was half-finished, rather odd, and not all that cost-effective. (to keep costs down it used 32kB of 32-bit VERY fast SRAM, we're talking 35 ns, like 25 MHz 386s and 486s were using for cache, but aside from a proposed 64 kB of DRAM for the Ensoniq DOC-II sound chip, that was it for onboard RAM) It worked like the 7800, using cycle-stolen DMA to load sprite pointers and data (it was a direct precursor to the Jaguar's Object Processor, but with no optimization for DRAM use) and would mainly rely on reading from 32-bit wide ROMs. It had a 16 MHz 68000, but with the existing 1989/1990 configuration, the 68k would spend tons of time halted for DMA, much like the 7800's 6502, plus it'd have wait states if working in slow/cheap ROM, while fast ROM (like the PC Engine/TG-16 used) would've been really costly for a company like Atari (NEC had in-house manufacturing but still typically used half the ROM size of contemporary publishers: like 256k where the MD was commonly using 512k in 1989) and while adding some more hardware could have fixed some of that and potentially cut costs by removing the Ensoniq chip (say a DMA sound + bus controller + DRAM interface chip) and allowed use of slow and even 16 or 8-bit wide ROMs loaded into DRAM, that was yet more added work and not something that even happened up to 1991 when the Panther was formally cancelled. 
So given the Panther lingered on in development to early 1991, the ready-made Slipstream becomes even more strange to pass up, plus tweaking things to allow a 12 MHz 68000 or 286 given the added year of development time should've been child's play compared to completing + fixing the Panther. (68k would be cheaper and generally more friendly, but 286 would make existing Konix dev work easy to port, plus lots of PC games ... either case would also give 24-bit address space to use for cart ROM if they decided to ditch floppies) DRAM prices had also dropped a great deal in both 1990 and 1991, and loading the maxed out 512kB DRAM would've been easy. (128kB PSRAM would've been enough for most purposes as well, though 256kB would be nicer: you only need 128k to double buffer the max 256x256 PAL screen, but having fast RAM left over for DSP DMA use would be significant, including doing 3D matrix processing and spitting out vertex data) *Note, they could easily have just kept the 1MB address space limit for the chipset itself and let the host CPU alone work in 24-bit space. (that'd sort of be like an Amiga based console where the OCS could just access 512kB and most/all ROM stuff would be up to the CPU copying to RAM as needed) Oh, and Atari had already been sourcing 12 MHz 286s for their PC line around this time, so that would be another consideration for that choice.

I say floppy disks would be the most novel option at the time and bridge the gap between cart and CD based consoles. Albeit on a purely engineering note (and one Kskunk made years ago) a CD-ROM drive is actually cheaper to manufacture than a DSDD floppy drive (and vastly cheaper than something like a ZIP drive or LS disk drives) but the tech was all patented and had a premium on it in the early 90s and also didn't have the raw volumes for economies of scale quite yet (the Jaguar was released around the time the scales were tipping) so a common DSDD floppy drive would be the cost-effective mass storage option in 1989-1991 for sure. (720k PC/Atari, 800k Apple, 880k Amiga, all the same drive and disk track system, though using different sector sizes, also little endian data for PC, same for the Slipstream ... ignoring a 68000 based one) Oh, and the Multisystem's post-Konix era development as a set-top box included 286, 386, and 68000 configurations, mostly at 12-12.5 MHz. Or at least the 68000 came up at one point: http://www.konixmultisystem.co.uk/index.php?id=multisystem2 ... that whole situation was a mess (not the hardware, but ... Wyn Holloway's end of things)

As an aside, I think failing to capitalize heavily on the home computer and PC/DOS game market was one of the Jaguar's failings, though also one partially forced by using ROM carts. The keypad on the controller and the capabilities of the system would've made it really neat for early 90s PC games, 3D and otherwise, including Wing Commander I and II (III would need CD), X-Wing, various Lucas Arts adventure games, etc. (most of that sans full on FMV games could be done via floppy, but I don't think 1.76 MB DSHD floppy disks would've been all that appealing in 1993/94 ... or more likely to get weird looks than 880k would have back in 1990) The Panther's gamepad was essentially the same as the Jaguar's, so equally well suited to keyboard-heavy games (with or without overlays) but the 3 face buttons would've been much less outdated for 1990. 
(they used the same I/O port mapping as the STe joyports anyway, so STe/Falcon games could use them) Albeit, if using the existing I/O ports the Slipstream ASIC had, you'd need to reduce the number of key inputs, or add another chip for expanded I/O. There's 16 I/O ports for the joysticks already, plus 3 potentiometer inputs and a light pen input, so you could have partial STe port compatibility with 8-bits of I/O per channel, 2 analog POT/paddle/axis inputs on one port and one paddle plus a light pen (light gun) input on the other. Doing a bit of multiplexing like the Mega Drive did (6-bits of I/O in that case, though only multiplexing 2 of those for more buttons) would've been one route to get the full pinout. (plus doing 8-bits + ground is already going to make a pretty thick joypad cable, doing the full 12 bits of I/O the STe/Jag used would be less than cost-effective) *Of course, the STe's ports were originally intended to allow splitters for 4 single-button Atari joysticks or 4 paddles, and the pin designation heavily points to this. http://old.pinouts.ru/Inputs/EnhancedJoystickAtari_pinout.shtml (neat, but overkill) The cost of a little multiplexing logic would be well worth avoiding thick, expensive, awkward cables in any case. (Nintendo OTOH had been using serial based controllers since the Famicom, but the approach at hand is already 8-bit parallel oriented and multiplexing that would be pretty safe to get a good cost compromise ... you could also use an analog matrix like the VCS keypads, but that's both odd and not really cost-effective by then: Gravis used analog lines for its gamepad's d-pad, but that was partially due to making it compatible with 2-axis analog joysticks, allowing normal joystick games to use 8-way digital control via a primitive resistor DAC)

Also, side note on the Falcon, but had Flare spun off a cut-down DSP-only ASIC designed to work around the Falcon's DMA/bus timing, with a bit more on-chip RAM (like 2kB rather than the 1kB of the 1989 Slipstream ... or technically 1.5 kB, but the last 512 bytes doubled as CRAM and were used up when all 256 colors were employed) and run at 16 MHz, it would've been a major cost saving measure over the Motorola 56k and its 192 kB of super-fast 25 ns SRAM (that's 33 MHz 386/486 cache RAM there). That, and possibly ditching the added Falcon sound channels in favor of 16-bit PWM DAC output from the DSP (at 16 MHz, the existing PWM registers would allow up to 125 kHz 14-bit stereo, up from 93 kHz in the standard Slipstream at 11.9 MHz, though somewhat less than the 208 kHz 14-bit stereo the Jaguar was capable of: all of these use pairs of PWM registers generating 7 bits each, added to get 14 bits, and the carrier-rate arithmetic is sketched at the end of this post ... though the PWM registers in JERRY might be broken on the Jaguar as I think it used external 16-bit DACs). You'd need the 8-bit STe PCM channels there for compatibility in any case. That or the Falcon DSP should've been an optional module. (neat for a dedicated sound processing system and neat for 3D coprocessing, but a significant detriment to the price point ... then again offering a MEGA STe style 16 MHz 68000+16k cache in place of the 68030 would've also been an appealing lower-end option in 1992, and might be faster than the 030 in situations where the tiny 256 byte caches it had were insufficient and VIDEL was stealing tons of bus time, like in the 16-bit color mode or even some of the 256 color modes: a plain 68000 working without wait states in 16 kB of cache would have lots of appeal there ... 
oh, and the 16 MHz blitter would get much more use than for just compatibility) And for anyone wondering: the existing Slipstream would've been a poor add-on for the STe as it supported 256 and 512 pixel modes that would leave huge borders if synched to ST SHIFTER pixel rates (8/16 MHz), plus that'd require a 16 MHz Slipstream and faster PSRAM anyway. (commissioning the Flare team to design an ST-flavor of ASIC would've been interesting ... and more worth their time than the Panther IMO, but then VIDEL was really OK as it was in 1992 and offered good backwards compatibility, while the Jaguar was really epic at the time and a bug-fixed JERRY chip given dedicated RAM to work in and genlocked onto Falcon video would've been awesome in 1994, possibly as part of the cancelled Falcon 040 ... though a 24-40 MHz 030 based system with 32-128kB of board-level cache would've been fine as well, competitive with the 40 MHz 386 systems still popular at the time and then some ... but a 24 MHz 68040 would certainly have been nice; 40 MHz 030 is just kind of nice given it's really easy to get off the ST-compatible 8 MHz base clock, also nice for the 26.6 MHz the Jaguar chipset was managing: ie 2/3 of 40 MHz)

Oh duh, I forgot: the Slipstream hardware also would've been really appealing to cross-develop Lynx games for. The CPU architecture is different, but the mix of blitter+coprocessor+framebuffer and packed pixels was quite similar, as was the 3D/pseudo 3D functionality and emphasis on mass storage. (the Lynx's chipset was originally going to use tapes, but very slow, cheap 8-bit ROM ended up being the practical solution for a handheld ... meanwhile floppies were the go-to option for a console) And following suit from the 7800 (and Epyx connection) the Lynx was already leveraged fairly heavily towards the computer game pool of developers and publishers. Ah, the Slipstream and Lynx also both used 12-bit RGB, like the Amiga and STe as well. (the Lynx and STe were just limited to 16 colors ... though for the Lynx's screen that was arguably overkill: same for the Game Gear doing 31 colors in 12-bit, aside from a few games using the Master System mode) As for exclusive games: Starglider III was planned (though probably had some elements re-used for Starfox after being cancelled), and Jeff Minter had a lot of neat ideas going on, including Attack of the Mutant Camels, now playable via emulator. (though the sound is a bit bugged, or it's due to lack of lowpass filtering)

Edit: I forgot: by 1989, Konix had moved on to a 256 kB fully-loaded PSRAM configuration due to complaints from developers running out of memory, particularly for games using page-flipping (double buffering), though DRAM still wasn't included as standard at that point, I think. On that note, Atari could've released a 256 kB system and stuck in a pair of 30-pin SIMM slots for RAM expansion. (a nice cost-effective idea at the time and borrowing from the ethos of the STe, but in hindsight a VERY good idea as not only did DRAM prices drop fast, they then stagnated in 1992/93 while the price of 256 kB SIMMs dropped through the floor due to the limited demand: hence the popularity of SIMM savers at the time to re-use older low-density SIMMs as 1 MB) Plus slow, old, 150 ns DRAM would be fine in the Slipstream, as would anything newer, so they could literally use refurbished SIMMs if they wanted. 
(and people could upgrade using cheap second-hand SIMMs or cheap overstock/surplus ones common on the market) This was also evident in ST magazine ads at the time that had the 512kB upgrade kits much cheaper per-byte than all the other options. (520 STe to 1040 STe upgrades were cheap, just add 2 256 kB SIMMs, while other configurations required 1 MB SIMMs or possibly non-standard 512kB SIMMs, which were much more expensive for the same amount of RAM, but you only got four slots on the STe, so 1MB was the max using 256k SIMMs ... and SIMM savers wouldn't fit in an STe case, maybe the MEGA STE) That situation with 256 kB SIMMs is narrated here from the early 90s PC perspective: http://www.redhill.net.au/b/b-93.html http://www.redhill.net.au/b/b-94.html

That RAM price stagnation was very much like what had crippled Atari in 1988, driving the price of the 520ST up to Amiga 500 levels and killing the 1040ST's potential to become the mainstream/baseline standard. (for general-purpose use a 1040ST at a similar price to the Amiga 500 would be an obvious sell, but even for games, the added RAM and savvy RAM-centric tricks like heavy use of pre-shifted animation would've made the platform cut in further to the Amiga and general computer/console game markets ... potentially even better sound, given larger samples for software MOD or, better, ST-specific tracker formats for intros, cutscenes, and non-gaming purposes) Had they standardized the Blitter with the 1040STf in '88, that'd also boost things a bit, including for 3D stuff. (8 MHz 68k + 1 MB of RAM, look-up tables for faster multiplication, and a blitter for fast polygon fills ... also faster at sprite drawing and block-copy, and realtime bit-shifts rather than pre-shifting; a significant boost even without hardware scrolling ... also more potential to eat up CPU time doing interrupts for sample playback) In any case, RAM prices jumped up in '88 and stagnated. (they didn't jump as much in '93, but they stagnated heavily to 1996) See: https://phe.rockefeller.edu/LogletLab/DRAM/dram.htm under "Annual DRAM price-per-bit ($ per Mbit)"
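On the 16-color blitter penalty mentioned earlier in this post: the reason is simply that a 4bpp destination forces a read-modify-write on every byte touched, while an 8bpp destination is a straight write. A minimal, generic C illustration of the difference (this is not actual Slipstream blitter behaviour or register usage, and the nybble ordering is an arbitrary assumption):

    /* Generic illustration: why 4bpp framebuffer writes cost more than 8bpp ones.
       Each 4-bit pixel needs a read-modify-write of the byte holding it, while
       an 8-bit pixel is a single write.  Not real Slipstream blitter code. */
    #include <stdint.h>
    #include <string.h>

    static void plot8(uint8_t *fb, int x, uint8_t c)
    {
        fb[x] = c;                                    /* one write per pixel */
    }

    static void plot4(uint8_t *fb, int x, uint8_t c)  /* c = 0..15 */
    {
        uint8_t b = fb[x >> 1];                       /* read the byte holding two pixels */
        if (x & 1)
            b = (uint8_t)((b & 0xF0) | (c & 0x0F));   /* modify one nybble */
        else
            b = (uint8_t)((b & 0x0F) | (uint8_t)(c << 4));
        fb[x >> 1] = b;                               /* write it back */
    }

    int main(void)
    {
        uint8_t fb8[16], fb4[8];
        memset(fb8, 0, sizeof fb8);
        memset(fb4, 0, sizeof fb4);
        for (int x = 0; x < 16; x++) {
            plot8(fb8, x, (uint8_t)x);                /* 1 memory access per pixel */
            plot4(fb4, x, (uint8_t)x);                /* 2 memory accesses per pixel */
        }
        return 0;
    }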
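And for the PWM sound rates quoted further up (93, 125, and 208 kHz): those all fall out of clock / 2^7, i.e. a free-running 7-bit PWM counter, which is my reading of how the paired PWM registers are clocked rather than anything stated outright in the docs. Quick check:

    /* Rough check of the 7-bit PWM carrier rates quoted above: carrier = clock / 128,
       assuming a free-running 7-bit PWM counter (my assumption). */
    #include <stdio.h>

    int main(void)
    {
        const double clk[]  = { 11.9e6, 16.0e6, 26.6e6 };
        const char  *name[] = { "Slipstream (11.9 MHz)",
                                "16 MHz DSP ASIC",
                                "Jaguar (26.6 MHz)" };
        for (int i = 0; i < 3; i++)
            printf("%-22s -> %.0f kHz PWM carrier\n", name[i], clk[i] / 128.0 / 1e3);
        return 0;  /* prints ~93, 125, and 208 kHz */
    }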
  8. Why not just use an SD cart that uses SRAM as the simulated ROM, but allow for games to reserve part of that address space for variable use? (or rather than 'reserve' just require software to manage the memory in a responsible manner and avoid writing to address space that's supposed to be treated as ROM) Some of the Mega Drive (and I imagine SNES, GB, etc) flash carts are actually SRAM carts, though I'm not aware of homebrew software exploiting that. (it's just a design choice and leads to faster loading and avoiding burn-out of flash memory)

Firstly, it at the very least greatly benefits texture mapping speed, especially for large textures where buffering them into GPU SRAM or line buffer SRAM would be impractical. (plus GPU SRAM chokes the GPU if used for textures, while at least line RAM exploits allow the GPU to continue working) So you can actually hit the peak 5.32 Mpixels/s fillrate of the blitter for texture mapping (or scaled/rotated objects ... or just scaled objects where you need per-pixel granularity with the framebuffer that OPL sprites wouldn't provide ... you can't use the Z-buffer for OPL sprite priority over a 3D scene, unfortunately) I was also mistaken earlier, it's not 10, but 11 cycles to render a single pixel when texture mapping: 2 cycles for the read, 3 for rowchange, 1 for R/W change, 2 for write, another 3 for rowchange, and repeat. (the arithmetic is sketched at the end of this post) Using 2 separate banks of DRAM (or any memory with cycle times no more than 5 cycles) takes 5 cycles instead; I thought it could be faster, but see below for Kskunk's quote: the blitter can't render textures faster than 5 cycles (26.6 MHz ticks) per pixel, thus the worst-case timing in DRAM (5-cycles for a read+rowchange) wouldn't slow down the blitter at all. To put it another way, for a game that uses texture mapping in main memory, you'd spend 45.5% of the time you normally would on the bus, leaving that much more time for other things. (the more texture-heavy a game is, the more dramatic the advantage)

On the 'back then' hypothetical end, we could argue the Jag CD came bundled with the RAM cart at its release in 1995, if you want to fixate on the split development base issue. (and in the case of a 32kx16-bit PSRAM chip on-cart or a pair of 32kx8-bit SRAMs, it would've been both cheap and foolproof enough to pull off: with a DRAM cart, it might be cheap enough, but I could imagine delays in actually implementing a little DRAM controller/interface ASIC if they weren't planning ahead ... they obviously hadn't planned ahead for RAM expansion given the lack of external connectivity for the unpopulated main DRAM bank: that would've been cheaper and simpler to expand than any of the above, and taken fewer pins given the multiplexed nature of DRAM ... a little 40-pin edge connector would've been sufficient for a 16-bit DRAM interface, that or just put the DRAM control lines on the cart slot for intended expansion use) Or if they were really planning ahead, perhaps even arrange the Jaguar's DRAM originally as 512kB 64-bit in one bank (4 64kx16-bit DRAMs) and 1MB 32-bit DRAM in the second (two 256kx16-bit DRAMs) while still using a 32-bit wide cartridge bus, but adding the necessary DRAM control signals and allowing that connector to serve both as the interface for the cart address space at 32-bits wide AND to allow expansion of the other 32-bits of the second DRAM bank. 
(ie the CD add-on could have another 1MB 32-bit wide chunk of DRAM, but mapped to the same 64-bit addresses as the existing 1MB bank, interleaving on a 32-bit word basis and now providing two complete 64-bit DRAM banks, 2MB + 512kB ... and now you could have a cart passthrough without touching any of the cart ROM address space ... plus it's cheaper, no DRAM controller, much more seamless, and much more generally useful for the fast, fully 64-bit portions of the system) On top of all that, the Jag would've been moderately cheaper to manufacture at launch and still a good bit more flexible/powerful due to the reduced bus contention. (fewer page-breaks by doing more work in different DRAM banks as much as possible, plus faster texture mapping and faster 32/64-bit blits as well, as source could be in one bank with destination in the other, keeping nice 2-cycle page mode accesses going) And 1.5 MB was still plenty for the time, and much nicer than what Sega CD or 32x programmers had to work with.

Now if they wanted to get fancier and make JERRY slightly less crippled, they'd have also added a 32-bit wide CPU in place of the 68k. (68EC020, 386DX, ARM60, maybe one of the lower-end embedded flavors of the AM29000 series, etc ... the 020's I-cache would help a bit too, but whatever was cheapest would be best ... the Jag was already designed with big or little endian in mind, so reconfiguring that would've been less of an issue ... a Cyrix 486DLC with the 1kB on-chip cache also was nice ... or IBM's similar chips, but those probably would've only been cheap from the desktop PC perspective, not from an embedded system/console standpoint: the AM29000's low end options also lacked cache, but you've got the massive array of 192 32-bit registers to consider ... a neat complement to the 64-register J-RISCs) But more to the point at hand: Jerry is more of a bottleneck in the CD than with cart games as you can have it work largely in ROM to read samples or other data or copy code (or have the blitter copy chunks to JERRY's SRAM) while avoiding hitting main DRAM and thus avoiding performance-killing page-breaks caused by rowchange. (Jerry's accesses are so slow anyway that ROM isn't that big of a bottleneck, and games using it basically just for audio would be fine, even if doing sample based music+SFX; streaming compressed 2/4-bit ADPCM samples, or even CVSD (including 2-bit flavors of it), would be especially interesting: 2, 3, and 4-bit ADPCM flavors had long been promoted by Covox as low-overhead compression formats for PCs, targeting the low-end systems using parallel port DACs, but applicable to pretty much anything else capable of PCM too: CVSD, especially 1-bit CVSD, is obviously better suited to speech compression than musical instruments; plus the DSP can do filtering and interpolation of lower sample rate stuff and minimize both ROM space and bus time needed to stream the samples ... and still probably sound a lot nicer than the SNES, quality of compositions aside of course) In any case, without ROM, the DSP now needs to read from main DRAM, which means page-breaks for TOM where there might otherwise just be some waits. Meanwhile, adding even a chunk of slow RAM (or even a small chunk of RAM) would offload that significantly. That aside, wouldn't handling RAM on cart be similar to using ROM of a similar width? 
(likewise you COULD directly texture map from ROM, but it would've been slow back then, usually 8 cycles, 10 for slow/cheap stuff iirc, plus it'd mean using uncompressed 16-bit textures rather than unpacking them into RAM) Now you also could've had carts that had RAM right on them, like several 7800 games did and some SNES and even MD games (well ... just Virtua Racing with the DRAM for the SVP chip, I think, ignoring SRAM for battery saves) and a couple 7800 games had even used 32kx8-bit SRAM chips back around 1987 (both Summer and Winter Games did that iirc, only using 16kB as that's what they needed for the memory map they used and because 2 8kB chips took up too much space to fit, and a single 32kB chip was cheaper than a modified cart PCB/case at the time, apparently) so 64kB of 16-bit SRAM/PSRAM slapped on cart wouldn't seem too unusual for 1994-96 ... or later. (had the Jag done well with carts). But you needed at least enough confidence in the platform and investment funds handy to actually manufacture carts like that. (making masked ROMs at all was a big problem, and a big reason some folks suggested the Jag should've been a CD system from the start ... not for performance, but for sheer economy of development and attracting more devs and publishers who'd otherwise be unwilling to risk the overhead of a ROM based system: that and Atari could do things like ship out free demo discs both pack-in with consoles and at promotions, and even jump onboard the wave of Shareware distribution at the time ... plus still be vastly cheaper than the 3DO, but that's yet another topic)

But on the issue of bus sharing and interleaving, is there too much of a delay for granting the bus to TOM, the OPL, or Blitter to do any useful interleaving between slow, periodic accesses? Like the 8-cycle reads of the 68k (not that it even hits the bus for every memory cycle) or 6-cycles for the DSP. I believe you only need a 75-ish ns (2-cycles at 26.6 MHz) period for the actual read/write strobe, and while you couldn't interleave accesses in a single bank of DRAM at that speed (as there's 3 cycles for rowchange and another to switch read/write if needed) having accesses in different DRAM banks with different rows being accessed and held open (for page mode) would allow overlap of everything but the actual read/write strobes. Now, a higher priority processor on the bus couldn't take open cycles from a lower one as it already has priority, so you need situations where the slow processors have priority, but leave enough time to grant one or more accesses to lower-priority processors/DMA-channels/etc (any bus master). The 68k is normally bottom priority, so it would be difficult to actually put it in a situation where TOM, the Blitter, or OPL could wait for holes in 68k accesses, but the DSP normally has fairly high priority and that could be exploited. Further, the 68k has higher priority when doing interrupts, so coding a game where the 68k is being used pretty much exclusively as an interrupt handler would make that arrangement viable. (as such you could potentially split up general processing duties between the DSP and 68k while not too horribly hogging the bus)

From the Jaguar Reference Manual V8:

The Jag II fixed that with double-buffered blitter registers instead. And I say fixed and not 'would have fixed' as I'm pretty sure that was functional on the preproduction Oberon (Tom II) chips used on the development systems in 1995. 
(Puck was not on those, just old Jerry, as crippled as ever, except 32-bits wide thanks to using the 68020 in place of the 68k ... something they might not have needed to retain for the production version if Puck's features worked correctly, allowing a cheap 68000 to be stuck on there for compatibility: indeed, better compatibility than an '020 would provide, plus a 68k would have been a reasonable fit on the slow 16-bit sample RAM bus Puck was to use, sort of like the 68EC000 in the Sega Saturn audio system) Playing devil's advocate here, I'd point out that Kskunk's Skunkboard (and any modern homebrew ROM cart based games that got enough traction to be manufactured in masked ROM) could be run fast enough to allow texture mapping from ROM without waits, but more than that it could allow GPU code/data/DMA fetches from ROM at full speed as well. (using the high speed ROM cycle mode that was originally intended for testing only) The blitter and OPL doing 64-bit bus operations would still be faster in DRAM though, in cases where sequential access is possible.

But beyond that, you could build an SRAM cart that could either be simple SRAM (only useful for loading from CD), made into a passthrough cart and only using part of the address space (allowing ROM as well, possibly bank-switched), or just a full 6 MB 70 ns SRAM cart used for CD homebrew. Or add an SD card interface (or CF, XD, etc: the latter would probably be easier given it's parallel, but SD is obviously the most popular and what most 'flash' carts use, regardless of whether they load into flash memory or SRAM on-cart: the latter has the advantages of speed and not wearing out from successive writes) And given the hardware hacking stuff folks do (overclocks included), I'd think wiring up the unused DRAM bank would also be an interesting possibility ... probably not as simple as the old piggyback RAM upgrade on the ST, but also not totally different. (and SOJ leads aren't too bad to work with ... touching the leads on TOM would be iffy OTOH) Oh, and Kskunk's experimenting with texture mapping on internal SRAM showed that the logic for calculating scaling/rotation in the blitter limited pixels to 5 cycles at best, so even random read/write DRAM speed would be fast enough to do texture mapping at the peak rate. (that would include page-breaks in the second DRAM bank or a slower DRAM controller onboard a cartridge with no page-mode support and basically behaving like PSRAM at the 5-cycle cart I/O mode) https://forum.beyond3d.com/posts/1936444/ (Nammo is Kskunk) There's also stuff in that thread about rendering directly to the line buffers and potentially doing beam-racing style 3D at 50/60 Hz, but that's better for another thread.

My current new favorite is actually: what if Atari had spun off Flare II's Jaguar II project to Sega in 1996 during all the second-guessing with the Saturn? (plus the unfinished Puck chip with RCPU and DSP could be displaced by some neat Hitachi Super H RISC chip with built-in UART and such ... or a PowerPC 602, convenient 64-bit bus there) But again not the topic here. More on topic, I'd say bundling a 512kB DRAM cart with the CD to boost performance/flexibility a bit might've made an impact; that, and they just had the bad luck of choosing to cut their losses early in 1996 before DRAM prices dropped like a rock. 
(they had the misfortune of test-marketing the Jaguar at about the same time as the big epoxy resin factory fire in Japan that caused RAM and other IC prices to jump up then stagnate, just like they did in 1988: the latter hurt the ST big time and crippled the 1040ST's potential of becoming the bottom-end standard, plus made the Amiga end up price matching the ST that year ... the 520STfm and A500 starting 1988 at $300 vs $500 and meeting at $400 mid-year) Granted, the reason Atari desperately needed good luck to survive was mostly related to Sam Tramiel's abysmal management. (Atari Corp was best under Jack and Mike Katz ... 1985-88) Oh, but I doubt anyone would bother with a DRAM cart for modern homebrew. SRAM is much easier to do and cheap enough not to bother with anything else, plus 2-cycle read/write times offer more flexibility for the fast parts of the system. (the DSP and 68k would be fine with 186 ns cycles ... the DSP can only do reads at 6-cycle intervals anyway, and the 68k takes 4 of its own cycles, i.e. 8 system clock cycles, as it runs at half the speed)
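To tie the texture-mapping numbers in this post together (the 11-cycle single-bank case, the blitter's internal 5-cycle floor, the 5.32 Mpixel/s peak, and the 45.5% figure), here's a trivial back-of-the-envelope check; the per-access cycle counts are the ones quoted above and 26.6 MHz is the usual Jaguar system clock:

    /* Back-of-the-envelope check of the per-pixel texture-mapping timings
       discussed above (cycle counts as quoted in this post). */
    #include <stdio.h>

    int main(void)
    {
        const double sys_clk = 26.6e6;          /* Jaguar system clock, Hz */

        int single_bank = 2 + 3 + 1 + 2 + 3;    /* read, rowchange, R/W turn, write, rowchange = 11 */
        int dual_bank   = 5;                    /* blitter's internal per-pixel minimum */

        printf("single bank: %d cycles/pixel -> %.2f Mpixels/s\n",
               single_bank, sys_clk / single_bank / 1e6);
        printf("dual bank:   %d cycles/pixel -> %.2f Mpixels/s\n",
               dual_bank, sys_clk / dual_bank / 1e6);
        printf("bus time vs. single bank: %.1f%%\n",
               100.0 * dual_bank / single_bank);
        return 0;  /* ~2.42 vs 5.32 Mpixels/s, and 45.5% of the bus time */
    }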
  9. I haven't read the whole thread to see if Curt or Marty or some other historians on the site already corrected this, but Warner-Atari heavily invested in the Amiga chipset and had licensed it, planning it as a home computer, arcade machine, and game console (codenamed MICKY). They also had several in-house 16-bit projects (68000, x86, and maybe 16032/32016: I think Tramel Technology dabbled with the latter before switching to the 68k). Amiga ended up cheating its way out of its contracts with all licensees, and at least in Atari Inc's case, illegally 'refunding' the investments made while claiming to have failed to produce working silicon. Meanwhile they'd signed an exclusive agreement with Commodore. The confusion going on in June/July of 1984 at Atari Inc, and Warner's horrifically managed liquidation of the company without notifying executives (especially Atari President James Morgan), led to that slipping through the cracks and some lower level management accepting Amiga Inc's refund check without reading over the contract properly. (it's that same sloppiness that led to Tramiel's poor reputation and the myth that he 'fired everyone' when taking over ... rather than the reality that Atari Inc was liquidated, the arcade business spun off and the home/consumer business's assets sold off ... it's also that mess that led to some of the neat in-house designs, hardware and documentation along with engineers walking off or becoming fragmented) It was also that breach of contract that leveraged Atari Corp's later settlement with CBM over the ST lawsuit. (the Amiga contract was brought in to counter-sue them) Incidentally, the Amiga contract allowed a game console/arcade machine to be released in 1984, a computer in 1985 with no more than 128kB of RAM, and unlimited hardware configurations from 1986 on. (had Tramiel gotten hold of that license, I imagine they'd have made do with 128k and perhaps shipped without GEM initially, just the text based portion of TOS, and also probably been forced to include RAM expansion via slots or DIP sockets and possibly even use an external cart slot for the OS ROMs ... or slave the cart slot to that purpose while abandoning any intent to use ROM cart based software: though using internal ROM sockets and intending service centers to install OS ROMs as they arrived may have been the more natural decision) The ST was originally intended to have a 128k (130ST) as the bottom-end model, of course, but that was abandoned as RAM prices fell and the OS became too large. (and a cut-down version without a DOS at all and just BASIC and tape drive interface routines became unappealing)

On the note of the actual thread topic, though: why not add MARIA to the 8-bit chipset? This is something that came to mind while I was looking at the flaws and problems (and possible fixes) for what made the Panther problematic a few years later, but in any case: Replace FREDDIE and possibly the MMU with a new gate array ASIC, performing the memory mapping and DRAM controller duties, fast enough to allow Amiga-speed bus cycles (280 ns) and to service existing ANTIC and SALLY access times while only using 50% of the bus cycles, but rather than fiddling with ANTIC or SALLY timing at all (or trying to spin off 3.58 MHz 6502s or what not) use that added bandwidth in lieu of cart ROM access for MARIA graphics data, and use the new mapper/controller chip to interleave things seamlessly to avoid the need for CPU halts during MARIA DMA. 
(though you'd still need to wait for vblank to do MARIA register updates and list/pointer updates in SRAM) Bump MARIA SRAM up to 8kB (a single 8kB SRAM, cheaper, less board space, etc ... 32k would be nice, but not really needed given you're pulling most graphics data from DRAM). Probably map the normal 48k MARIA cart ROM space directly into A8 space and put MARIA registers and SRAM onto the 16k bank switched segment. Possibly add the ability to enable/disable either of the 8k cart ROM chunks to allow the full 64kB of DRAM to be used and those 8k banks flipped in as needed. (I forget if the player 3 and 4 GTIA trigger inputs were used already, but those might be handy for using an additional 2-bits of bank select control) You'd still need on-cart bank-switching logic to extend beyond 16k as well, but you'd make the most of RAM this way and avoid the issue of MARIA/A8 DMA conflicts in ROM. (cheap ROM being too slow to interleave in, at least when both ANTIC/SALLY and MARIA are trying to access it) You'd thus have a really nice system with MARIA graphics operating without holey DMA and genlocked over GTIA graphics (MARIA was designed with this in mind for the planned laserdisc expansion, so genlock with GTIA should be quite possible, particularly as all would be running off a synchronized clock and using common color/pixel clock times or integer multiples of those: ie if one was using 320 pixel and the other 160 pixel modes). MARIA allows for up to 4-bit pixels in its objects, which could potentially also allow a 12-color linear bitmap screen overlay on top of ANTIC+GTIA character or bitmap modes, or turn off the latter entirely for 100% CPU time in a 12-color bitmap. (or 13 colors given GTIA's background color should still be available) For typical late 80s console/arcade games, I imagine it'd be appealing to use 3 or 12 color MARIA sprites over a 5-color ANTIC character scroll layer with GTIA sprites used for a bit of added color. (so 12 color sprite layer + 9 color background) Doing proper genlock would also give nicer video output than the 7800's hacked solution of merging TIA and MARIA video lines. (a simple disconnector switch also solves that, of course)

Further, this sort of machine would have been a much more potent game console to release for 1987 than the XEGS, while also better meriting the price points the XEGS was initially sold for (substantially more than the $99.99 65XE or $89.99 7800 and of course $49.99 for the 2600Jr). The deluxe package XEGS with light gun and keyboard originally retailed for $199.99, and I rather doubt the added MARIA+SRAM + Gate Array chip and 150 ns DRAM rather than old 200/250 ns stock would've pushed it even that high. (probably more like $150 in a basic set and $200 with keyboard and games and/or possibly other software) https://youtu.be/2N2BUTIpnDI?t=97 Plus you'd have a game machine with substantially greater advantages over the NES and Master System. (still some trade-offs like the lower resolution for most purposes, but a monster sprite engine for the time and some pretty nice colors all around ... and the flexibility to do some nice software rendered effects to a linear framebuffer and a ton of RAM for a console at the time, and chunky pixel graphics, so very well suited to storing compressed data on cart to save space) You'd also have a lot more CPU time to do complex POKEY modulation effects (or 4-bit PCM) or possibly make some use of the GTIA beeper channel. 
(though that would probably be more useful if you added GTIA beeper control to the new ASIC, maybe slaving it to some neat PWM sound ... possibly even useful for sample playback, but I'm mostly just thinking fixed-volume variable duty cycle pulse wave stuff ... though slaving it as a PWM DAC would certainly be interesting, I'm not sure what sort of resolution you'd get out of it: if you could toggle it at 7.16 MHz, that'd allow 28 kHz 8-bit sample playback, which would be quite neat, especially if it was DMA loaded ... though a CPU-loaded FIFO would be pretty good, too; the quick arithmetic behind these figures is at the end of this post) You could obviously have a 128 kB variant of that on the computer end of things, but a game console would probably be better to stick with 64k. (you could drop lower, but that would hamper the compatibility and selling point for promoting expanded A8 development in general as a computing platform on top of enhanced game machine, plus 64kx4-bit DRAMs were a very economical density at the time and using 2 or 4 16kx4-bit ones for a 16 or 32k system would seem a poor value by comparison) And, of course, such a game console would squarely sit in the Home Computer category as far as Nintendo's predatory licensing was concerned, and would soundly avoid the sort of problems the 7800 and Master System both suffered from.

Edit: you could also use that faster DRAM timing to allow for a 3.58 MHz 6502, but I'm not sure existing (even new production 1987) NMOS SALLY chips would tolerate that well enough, and 65C02s were around, but then you had to deal with RDY rather than HALT among other things (short of making a CMOS SALLY). OTOH, using that 7.16 MHz bus/DRAM controller ASIC clock divided by 3, you'd get a more likely SALLY-tolerant 2.39 MHz, which would be a nice speed boost for some things, and still wouldn't change ANTIC timing. (just more wait states for SALLY when overlapping with MARIA DMA cycles) 3.58 MHz would obviously be nicer, though. (even more wait states for MARIA, but still a speed gain, and faster interrupt response) You'd need normal 1.79 MHz modes for full compatibility. (also standard XL/XE memory map modes, possibly disabling the cart-slot banking if that proved problematic) Wait: RDY in the 65C02 behaves like HALT on SALLY, doesn't it, since it's CMOS and static and thus needs no refresh? So you could use a 65C02 in there without problem, and use 3 or 4 MHz rated chips at 3.58 MHz. (unless there's any software using undocumented NMOS-specific opcodes or such, you shouldn't have compatibility issues ... plus you get the enhanced instructions, some more than others depending which 'C02 variant they used ... probably Rockwell though, given Atari Corp was using them a fair bit already for chip vending) That's aside from other hypotheticals, like if Atari had taken Synertek's assets when Honeywell liquidated. (Synertek was in trouble with Superfund cleanup/lawsuit issues, so it would've been on favorable terms; it'd be another risk/reward investment for Tramiel to make like he did with Atari Inc's assets, though, and it was sold off in 1985, when Atari Corp was already pretty deep in investment debt) Synertek had already been manufacturing 65C02s prior to being shuttered, for what that's worth, along with second-sourcing a bunch of Atari's custom chips, so it would've been a solid fit all-around. (albeit slightly more so had the ST used more MOS chips for its 8-bit serial and I/O stuff rather than Motorola ones) And why use a gate array for the new ASIC? 
It'd be much faster for testing/prototyping than a full custom chip (especially without an in-house chip fab) and would be much lower risk to produce at low volumes, hedging their bets on a potential flop. (if it really took off, they could probably spin off a full custom or standard cell ASIC to not only replace it, but embed the DRAM controller+MMU+CPU+ANTIC+GTIA+MARIA+POKEY+PIA all on one dense CMOS ASIC with a single 8-bit I/O bus and 16-bit plus bank-selected address bus, making it a solid budget console/computer platform around 1989 into the early 90s and also making a nice platform to cross-develop Lynx games for) You could also switch to a single 128kx8-bit DRAM chip by 1990/91 and discontinue the 64k models entirely. It's worth noting that plenty of manufacturers stuck with gate arrays throughout their platforms' lives in spite of high volume production, so that's always an option too. (and you didn't need the raw logic speed that standard cell and full custom CMOS parts were doing in the late 80s ... Standard Cell also might not have been very widely used yet) Sega used lots of Gate Array chips for various things in the arcade and home consoles. (and the custom graphics/interface chip of the Sega CD was simply called the Gate Array in most documentation/programming literature) Flare Technologies also used Gate Array chips for their Slipstream hardware (the Jaguar was Standard Cell, though), which makes plenty of sense given they'd come from Sinclair, who'd used some of the pioneering Gate Array (ULA) production for the ZX-81 and Speccy.
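Since this post leans on a handful of clock figures (the 280 ns interleaved slots, the 2.39 MHz SALLY option, and the 28 kHz PWM playback), here's the quick arithmetic behind them; the only assumption of mine is treating the beeper-as-PWM-DAC case as one 8-bit sample per 256 ticks of the 7.16 MHz clock:

    /* Clock arithmetic behind the figures quoted above (standard NTSC A8 clocks). */
    #include <stdio.h>

    int main(void)
    {
        const double cpu_clk  = 1.79e6;   /* stock SALLY/6502 clock */
        const double fast_clk = 7.16e6;   /* proposed bus/DRAM controller ASIC clock */

        printf("stock bus cycle: %.0f ns\n", 1e9 / cpu_clk);                    /* ~559 ns */
        printf("interleaved slot (half of that): %.0f ns\n", 1e9 / cpu_clk / 2.0); /* ~280 ns */
        printf("7.16 MHz / 3 = %.2f MHz (faster SALLY option)\n", fast_clk / 3.0 / 1e6);
        printf("7.16 MHz / 4 = %.2f MHz (stock-compatible rate)\n", fast_clk / 4.0 / 1e6);
        /* 8-bit PWM from a 7.16 MHz toggle rate: one sample per 256 clocks (assumption) */
        printf("8-bit PWM sample rate: %.1f kHz\n", fast_clk / 256.0 / 1e3);    /* ~28 kHz */
        return 0;
    }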
  10. Oh, and I forgot to mention, even without the Z80, you could leave in all the other Master System compatibility bits (I/O, sound, VDP, etc) and just stick the Z80 into the Power Base Converter. (most or all of the necessary I/O and memory addresses are accessible through the cart slot as is, so you might not even need to change that.) You could also have ditched the side expansion port in favor of a VRAM expansion port (there's another 64 kB of VRAM space unused by the VDP) and use fewer pins for that as well. (the dual 8-bit data ports plus multiplexed address lines and DRAM control signals) On that note, upgrading the PSG to allow it to run at lower clock rates (or just 1.79 MHz, half of normal) would make it much more useful for music, though adding Game Gear style stereo functionality would be nicer. The Cart slot is already a much better expansion port than the side port (originally earmarked for a floppy drive before the CD-ROM was pressed into that role) but a cart-slot module based expansion would be far more flexible and efficient ... and you probably wouldn't need that redundant 68000. (it's faster, sure, but swap that for a DSP co-pro of some sort and you've got a generally more useful system, especially for 3D) You could also just put the VRAM expansion lines on the cart slot, potentially on outboard keyed portions (7800/SNES/Jaguar style) to keep PCB costs down on standard carts. (actually, there's a TON of expansion pins that most games don't need and would've been cheaper/better off if segregated from the normally used ROM cart bits ... probably just 48-50 pins needed for most games, including a couple pairs of VCC and GND lines)

If you added that second VRAM bank onboard the CD itself, it'd also open up interesting possibilities for other changes, like having the added graphics co-pro ASIC render straight into that VRAM bank, or at least have faster and more flexible DMA than the MD's native VDP (faster VRAM, maxing out DRAM/PSRAM bandwidth, CPU-synched interleaved DMA modes, among other possibilities). Or just include two extra VRAM banks that can be flipped like Sega CD word RAM or 32x framebuffers (or Saturn VDP-1 framebuffers). With 121 colors from 12-bit RGB from the start, the need for video expansion would be less too, but tweaking that a bit more and allowing one or both BG layers to be disabled to allow linear bitmap framebuffers instead (with an eye for software rendered effects, even without expansion hardware) would be interesting, plus you wouldn't need to monopolize both VRAM ports if you disabled both tilemap layers and used the serial bus for framebuffer scanning. (you could do two 15-color 4-bit planes, one 121-color 8-bit plane, or two half-res 8-bit planes, and potentially make use of unused color values for shadow/hilight translucency effects, though you could also just use one bit for per-pixel priority to allow objects to be drawn in front of or behind the sprite layer) Doing a linear bitmap is much simpler than a tilemap, and the system is already using packed pixel data.

Short of that, you could also tweak something the VDP can already do: lowres direct color via mid-screen CRAM DMA updates. The problem with that is it halts the CPU for the entirety of active display, but allowing the tilemap layers to be disabled and DMA'ing from VRAM itself would allow for the same effect, direct 16-bit (unpacked 12-bit) color bitmap at up to 160x200. Plus sprites could potentially still be enabled if this was a feature rather than just an exploit. 
(practically speaking, you'd want to limit that to smaller render windows due to VRAM space limits ... right up until you added external VRAM like in the above CD unit suggestion) Note the real-world hack mode using this is limited to 9-bit RGB encoded as unpacked 12-bit (you have 3 nybbles per 16-bit word, just with the 4th bit ignored on all three: the VDP natively works in 12-bit RGB, remember, it just had CRAM and the color DACs truncated to 9-bits to save chip space). Oh and on that note, I believe the PC Engine was also designed with 12-bit color in mind and the expansion port actually allows for upgrading the RAMDAC, but they didn't use that feature on any of the CD expansion units. (you could've had 481 colors from 4096 12-bit RGB instead of 512 color 9-bit RGB) Oddly enough, the SuperGrafx also retains the 9-bit color limit, in spite of using dual VDPs. (the pixel bus on the expansion slot also provides other information, so an upgraded RAMDAC/mixing chip could potentially add things like translucency effects in hardware) The PC Engine is one console that was pretty close to ideal for its time, but the upgrades didn't push it nearly as far as it could've been ... and marketing was poor in the US and it failed to get a European release at all. (unfortunate given the tiny PC Engine form factor would've probably sold well as-is) They probably should've had at least 2 controller ports on the TG-16 variant, though, and offered 3+ button controllers sooner, them made 6-button ones standard, and should've either made the Supergrafx an expansion unit, built into a second-gen CD-ROM base interface, or gone another direction with video expansion and added a framebuffer bitmap layer instead, with the VDC function probably built into the upgraded RAMDAC chip and piggybacking on existing CRAM entries for the 255 colors. (either software rendered or blitter accelerated ... probably blitter accelerated) The original 1988 CD-ROM unit could've been simplified and made generally more useful by omitting the ADPCM chip, using a unified block of 128 kB DRAM, and either adding simple 8 or 16-bit DMA sound, or just relying on software driven playback instead. (given how poor a lot of ADPCM sounded, and how poorly it buffered and streamed for cutscenes, even simple 4-bit or 5-bit LPCM would've been competitive at the same bitrates, but you can do software DPCM/ADPCM decoding pretty easily and also do software 8 or 10-bit PCM fairly easily with paired channels at offset volume levels, and software mixing is far more flexible than a fixed, single ADPCM channel: that was also a huge limitation of the X68000's sound system, a single 8-bit PCM channel would've been far more useful) In any case, no sound upgrade at all would've been fine for the first gen CD unit, and they could've added something fancier and more generally useful around 1991 as part of the Super CD upgrade. (an entire base interface unit replacement, say 512kB DRAM, the VDC/color upgrade, and perhaps one of NEC's embedded DACs coupled with 16-bit DMA stereo, allowing CPU or DSP driven software mixing as well as slaving the DSP as a 16-bit multiply-accumulate copro for assisting with 3D or scaling effects)
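To make that "unpacked 12-bit" CRAM encoding concrete, here's a rough sketch of the packing; the function names are just illustrative, and the 12-bit variant is of course the hypothetical expanded part:

```c
#include <stdint.h>

/* Mega Drive CRAM word layout: ----BBB-GGG-RRR-
 * Each component gets a nybble, but stock hardware only honors the top 3 bits
 * of each (the low bit of every nybble is the one the 9-bit CRAM/DACs ignore).
 * A hypothetical full 12-bit part would simply honor that extra bit too. */
static uint16_t cram_word_9bit(unsigned r, unsigned g, unsigned b)   /* r,g,b: 0..7 */
{
    return (uint16_t)(((b & 7u) << 9) | ((g & 7u) << 5) | ((r & 7u) << 1));
}

static uint16_t cram_word_12bit(unsigned r, unsigned g, unsigned b)  /* r,g,b: 0..15 */
{
    return (uint16_t)(((b & 15u) << 8) | ((g & 15u) << 4) | (r & 15u));
}
```

A 160-pixel direct-color line is then just 160 such words streamed into CRAM across the scanline, which is exactly why the hack eats basically the entire bus during active display.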
  11. Incidentally, the MD's VDP was designed to support 128 colors (or 121 colors: 8x 15-color palettes + 1 BG color) from 12-bit RGB (4096 colors) and had external expansion pins for it, but they were left unconnected on the MD itself and used later for the System C arcade board (which also ditched the Z80 in favor of an 8.9 MHz 68000 and a PCM chip). Had Sega wanted to use that full capability in 1988, they'd have omitted the CRAM and DACs entirely from the VDP and used an external RAMDAC chip (as the PC Engine did), and probably could've made up the cost difference by removing the Z80+RAM and having the 68k handle the sound drivers alone. (Just add a simple 8-bit DMA sound channel and you're good for sample playback and software mixing too ... interrupt-driven PCM is horrible on a 68k and cycle-timed loops aren't practical for most purposes either, so DMA is the way to go. On a 650x based platform like the PC Engine, interrupt-based PCM was viable and a 7 kHz driver would tend to eat about 5% of CPU time for tight code: Malducci's driver manages that. Plus you can do channel-pairing tricks to get 10-bit resolution and double-buffer sample chunks into wave RAM to get better than 7 kHz without added hardware, though you'd need to sacrifice 4 channels to do 10-bit mono that way, and using 4/5-bit PCM even for some sample-based music would be pretty useful and doable with just 2 paired channels at up to 32x7 kHz ... so also tons of potential for interleaved/multiplexed mixing, but I digress.) There was also Ricoh's 8-channel PCM chip that Sega later used in the CD, and was already using in the arcade on System 18 in 1989, but that's unnecessary added cost and overkill compared to the potential of software mixing with DMA sound. (OTOH it was MUCH cheaper than the Sony SPC700 module of the SNES ... and manufactured by Nintendo's prime chip vendor Ricoh ... and would've been an interesting choice to see tweaked as an embedded CPU+sound chip on the SNES end, with a much faster 65816 and faster RAM, rather than wasting money on the Sony module and cheaping out on RAM with DRAM and a slow DRAM controller. Compare NEC, who managed with 70 ns DRAM and a fast controller to allow full-speed 7.16 MHz 650x operation in 1988 with their CD-ROM system ... 2.68 MHz is SAD in the SNES. Throwing in 256 bytes of RAM for on-chip zero page would also be nice and help somewhat with the poor compilers, for those attempting to use C on the SNES.)

The PC world also had the issue of VGA compatibility, and ATI took the route of an 8514 clone, but used a separate VGA core + RAM to provide compatibility there and nothing fancy like genlock to allow overlay of the two screens. Plus, you had 4-bit color modes using bitplanes and 8-bit chunky modes (not to mention the odd organization of nonlinear unchained VGA 8bpp mode: not planar, just interleaved across the four 64k banks of VGA space ... probably due to the way they got the necessary bandwidth while focusing on a linear pixel space in 4-bit mode rather than, say, linear 32-bit aligned addresses in chunky mode). OTOH, ATI probably could've made a low-cost, fast VGA card that simply had some nice added features while focusing on basic VGA compatibility. Remapping RAM to 32 bits wide would be relatively straightforward for a much more friendly/fast (especially for 32-bit CPUs and VESA) linear, 32-bit-word-organized, 8-bit packed pixel framebuffer, and it could also support DMA from main RAM, allowing fast updates of partial or entire screens (entire ones for double-buffered full-frame rendering, partial ones for looping single-buffered scrolling-type graphics, where DMA mostly fills in the off-screen portions being scrolled in). A simple DMA block copy and fill function would be good enough for basic acceleration rather than a full blitter, and would cater to 8bpp modes and 512 kB DRAM (which becomes appealing as soon as you adopt high enough bandwidth for 640-pixel-wide 8bpp modes and 640x480 in 256 colors while still being compatible with fixed-frequency VGA monitors; 640x400 could still be double-buffered, so good for 3D games). You'd also want vblank interrupts to make for fast and simple page-flipping without tedious status register polling. (Also very useful for color cycling effects via palette swaps, or 256-color video playback that reloads the colors each frame or on key frames: something you can't really do without double buffering or really fast DMA able to copy full frames in vblank ... so using Mode 13h would be out on ISA video cards, while double or triple buffered Mode X would be possible ... or of course a mapper-modified Mode X allowing 32-bit linear pixel organization, though obviously you'd need two port or DMA writes on 16-bit ISA for that.) DMA functionality without any bit manipulation features would still be useful for 4-bitplane VGA modes, but less useful than something like the Atari STe or Amiga blitters (hardware bitfield operations, bitmasking, bit-precise line fill and sprite drawing, etc.). But with a CPU with a barrel shifter and fast bit manipulation instructions, you'd be OK software rendering and DMAing that way anyway. (The 68000 was not such a CPU, but a 386SX could handle it ... I forget where the 286 fits in there.) So a fast enhanced VGA card that still lacked double-bandwidth modes (640-pixel 8bpp) could still be appealing with DMA copy and such, and offer relatively fast ISA bus performance. (And if it got popular enough, you'd probably have seen games exploiting the DMA function for primitive blitting or screen block/tile updates at 320x240 with square pixels and fast, efficient 32-bit-word-packed 8-bit pixels rather than funky Mode X, speeding up software blits to the back buffer in main RAM even if copying over ISA was a bottleneck.) Gaining market acceptance would be key to getting software like games to support it, but a low-cost, enhanced VGA card seems much more likely to gain that than an 8514 clone.

Hmm, perhaps even easier to gain acceptance would be a simple 2 or 4-page hack of Mode 13h, using a mapper/bank-switching scheme to let software treat each page like 13h, with additional control register settings for page-flipping, so software could optionally support it with less modification to its rendering code. (Just allow two banks to be selected, one designated as the active screen and one as the back buffer currently being copied to; you could potentially have three back buffers and a quad-buffered arrangement for a smoother average framerate, of course.) So you get the speed and simplicity of Mode 13h without the hassle of single-buffering and the ugliness of screen tearing, either without vsync or over ISA where there's no time to copy 64 kB in one vblank period. (If you dropped to 60 Hz for 320x200 with a border and more vblank time, you'd still only get 62 kB at the absolute max over 8 MHz 16-bit ISA ... so with a fast CPU and tight polling of the vblank status register, you could avoid tearing if you had a border or status bar or such that didn't need to be re-copied every frame ... plus square pixels, which is nice, though the letterboxing isn't so nice.)
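For what it's worth, page flipping on stock VGA already comes down to reprogramming the CRTC start address; the painful part is exactly the status-register polling mentioned above. A minimal sketch, assuming a DOS-style compiler that provides outp()/inp():

```c
#include <conio.h>   /* outp()/inp() on DOS compilers (Borland/Watcom); adjust as needed */

#define INPUT_STATUS_1 0x3DA   /* bit 3 = vertical retrace in progress */
#define CRTC_INDEX     0x3D4
#define CRTC_START_HI  0x0C
#define CRTC_START_LO  0x0D

/* Point the CRT controller's display start at 'offset' into video memory. */
static void set_display_start(unsigned offset)
{
    outp(CRTC_INDEX, CRTC_START_HI);
    outp(CRTC_INDEX + 1, (offset >> 8) & 0xFF);
    outp(CRTC_INDEX, CRTC_START_LO);
    outp(CRTC_INDEX + 1, offset & 0xFF);
}

/* Flip to the finished back page the tedious way: by polling the status register. */
static void flip_at_vblank(unsigned back_page_offset)
{
    while (inp(INPUT_STATUS_1) & 0x08) ;    /* let any retrace in progress finish */
    while (!(inp(INPUT_STATUS_1) & 0x08)) ; /* wait for the next retrace to begin */
    set_display_start(back_page_offset);
}
```

A card that raised a proper vblank interrupt would let you drop that busy-wait entirely, which is the whole argument for having one.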
  12. The closest things to a good, low-cost framebuffer+blitter optimized console around in 1990 were the Lynx and Flare's Slipstream. The former wasn't fast or powerful enough to be directly used for a TV-based console (not enough bandwidth, framebuffer size limits too small for a TV screen, plus it only did 16 colors/4-bit pixels), and a 4 MHz 65C02 was marginal, though probably no worse than the SNES's 2.68 MHz 65816. (Much weaker than the 7.16 MHz 650x derivative in the TG16/PCE, the 7.67 MHz 68k of the MD, or the 7.16 MHz Amiga 68k, even with wait states in 5/6-bitplane modes and blitter contention.) The Slipstream, OTOH, relied on PSRAM to be fast enough for some interleaved DMA, so not as cheap as pure DRAM, but still cheaper than the multi-bus arrangements with VRAM, PSRAM, and/or SRAM on the PCE, SNES, and MD. Plus a 5.9 MHz 8086 isn't all that great a CPU either, and it'd need address space expansion for cart-based games, though it was interesting as a floppy-disk-based console. It was mapped to support up to 256 kB of PSRAM and 512 kB of DRAM, and there was a fair bit of interleaving it could do: 256x200 (up to 256x256) with 256 colors from 4096 (12-bit RGB), 8-bit chunky graphics, and a blitter optimized for both sprite and line drawing (and block copy for background drawing), plus a 16-bit DSP useful for sound synthesis and 3D math. (Slaving it to simple PCM playback for Amiga MODs should've left a lot of time for math coprocessing, while doing realtime synth would've eaten a lot more of it.) The x86 CPU and 256-color graphics, plus a potentially large chunk of RAM, might have made it appealing for PC game ports of the period, and Lucasfilm toyed with licensing it in 1989 while Konix was gearing up to release the Multisystem (all of which fell through, of course). 128 kB PSRAM plus 512 kB DRAM would fit rather well with 640k real-mode PC games (enough PSRAM to double-buffer into and have some left over for fast blitter/DSP access). So it was ready for 1989 mass production, and would've played into late-80s Atari's computer-game licensing model, the one they'd aimed at with the 7800 and would continue (somewhat) with the Lynx. (Plus 880k double-density floppy disks were a much smaller risk than ROM carts for publishers, and the proprietary data/sector format would've been appealing for those worried about piracy ... I think they included some other security features too.)

Being framebuffer based also meant double-buffered rendering could drop the framerate to increase complexity (more parallax, sprites, etc.) like with PC/ST/Amiga games, though dropping below 30/25 FPS would probably be unappealing for most games ... there were already a number of racing games and shooters that ran at 20 fps on the MD or SNES. (Something like Galaxy Force would probably look a lot nicer on the Slipstream.) And 3D/pseudo-3D stuff would be much nicer to work with, as would realtime scaling effects. (Rotation would be possible too, but a lot more math-intensive than simple stretching/shrinking ... and combining realtime scaling with pre-rotated animation would tend to look much better ... something appealing with a hefty chunk of RAM and floppy disk storage, just as pre-shifted graphics were appealing on the ST but would be horrible on a ROM-based console without 512 kB of RAM to pre-shift things into.)

Atari's Panther didn't go with that design philosophy at all ... it was more of a supercharged 7800 and a prelude to the Jaguar's object processor, but it required 35 ns SRAM (basically high-speed cache RAM) for the list and some high-speed sprites/objects, while intending to pull most data from 32-bit (slow/cheap) ROM, hence only 32k of RAM ... plus an Ensoniq (DOC/DOC-II or OTIS) PCM chip with its own private DRAM to work in, and a 16 MHz 68k that'd be getting tons of wait states like the 6502 in the 7800. Plus they made the odd choice of only 32 CRAM entries, but in 18-bit RGB (260k colors), and while that meant 5-bit line buffers, it also meant 18-bit CRAM entries and more chip space for the 18-bit video DACs (video DACs are fast but not 'cheap' in terms of chip space ... a big reason the Mega Drive stuck with 9-bit RGB). They should've easily managed 6-bit line buffers and 64 colors from 12-bit RGB. It used an 8-bit linear offset for palette select with 1, 3, or 15-color objects (similar to the Jaguar), as well as unpacked 8-bit-per-pixel objects using the full 32 (or 64) colors. In any case, the Panther seems like a really bad set of compromises and not optimized for cost-to-performance. They probably could've salvaged it with a bit of added custom hardware, especially given they brought Martin Brennan onboard in 1989 to work on the Panther chip itself (and he was one of the three designers from Flare who'd done the Slipstream and would do the Jaguar): add something like a bus controller chip to mediate between Panther and 68k accesses to SRAM and ROM, possibly add a block of DRAM to work in (faster than ROM for sprites), while potentially cutting the cart bus down to 16 bits to save cost, and potentially even using a cheaper 8 MHz 68k with a heavier emphasis on interleaved DMA from DRAM and ROM. (Work within the existing Panther limits and work around them with a memory mapping and bus-flipping arrangement.) Or they could've ditched the 68k in favor of something cheaper, like an embedded 650x (they were already using those in the Lynx, but an 8 MHz one with on-chip zero page would be really nice), potentially embedded into a custom chip, and used even cheaper 8-bit ROM and DMA'd everything into system RAM like the Lynx. (A Hitachi 6309 at 3.58 or 4 MHz would also be really appealing, though the latter would be an overclock.) But given the Panther wasn't even ready for production at the time, and the 7800's sales were declining (along with the ST and 8-bit) in '89, especially compared to the 87/88 peak of the ST and 7800, an earlier release would have been better, and the Slipstream chipset was ready-made, non-exclusively licensed, and had a bunch of software developers in the UK already working on it. Plus it was built on gate array logic rather than custom or standard cell masks, so it was easier to start in smaller volumes at lower cost/risk. (Though note the custom system ASIC in the Sega CD was also a gate array part.) The Slipstream wasn't great, but it was there and possible for an immediate 1989 launch or at least a rushed test market. But getting into 100% hypothetical stuff that didn't exist at all at the time? Atari engineers could've looked at the Lynx, seen its design philosophy, and either run with it themselves (or rather, commissioned a consulting group to handle it) or gone back to the ex-Atari, ex-Amiga engineers who'd designed the Lynx chipset to do a console.

The same sort of unified memory, DRAM-optimized setup would've worked great for a home console, but it would've needed to be at least 16 bits wide on the bus end and possibly use a 32 MHz DRAM controller ... though 16 MHz could probably make do. (32.22 MHz divided by 2 would be good for NTSC timing: 3.58 x 9 = 32.22.) Stick with that and the Panther's theme of a 16 MHz 68k with cycle-stolen DMA, but go with the Lynx's low-cost DRAM+blitter+framebuffer arrangement, doubling the internal and external width for the video DMA and blitter operations: the 16-bit FIFOs/latches on an 8-bit bus become 32-bit on a 16-bit bus, with a slow (random) DRAM read followed by a page-mode read ... probably 4+2 cycles, or 372 ns, at 16.11 MHz with 120 ns DRAM, similar to the Lynx (100 ns DRAM and a 32.22 MHz controller could probably get that down to 7+3 cycles or 310 ns, but 120 ns Lynx speed would be a much more conservative/realistic goal). You'd get 10.74 MB/s with that, and using cycle-stealing DMA to do a 256x224 NES/SNES-style (or lower-res MD-style) screen at 60 Hz would use about 32% of the bus time, meaning the 68k would run closer to an effective 10.95 MHz, or somewhat better thanks to internal register operations that avoid waits. That's a greater percentage of CPU time than the Lynx's CPU loses, but you're using more than 2x the bandwidth for a display like this, and an 11-ish MHz 68k would be plenty for the time. The Lynx's RLE-compressed 1/2/4-bit texture format was also really nice, and extending that to a Panther/Jaguar style 8-bit offset into 256 colors (rather than 4-bit into 16 colors) would work really well, plus allowing direct 8bpp textures too. (Maybe RLE, but potentially just uncompressed, especially useful for treating portions of the framebuffer as objects for certain effects.) 256 colors from 12-bit RGB would also be fine for the time, though 15/16-bit RGB would be nice. (You could also do software-based translucency or shading effects via look-up tables, probably in ROM, especially if using 256x256x8-bit tables for translucent blending: 64 kB.) Include the 16-bit multiplier unit and sprite scaling capability of the Lynx, and add a bit more to the sound hardware: say, take the Lynx oscillators+DACs and allow at least one DMA channel to feed them for 8 or 16-bit PCM. (If you used word-interleaved LRLR stereo a la STe, you could use a single DMA channel for 8 or 16-bit stereo as well, which would be pretty nice for software-mixed sound while leaving 2 or 3 DACs free for chip-synth sounds.) 256 kB of 16-bit-wide 120 ns DRAM would've been a very good cost compromise for 1990 in a framebuffer-based console, with plenty of space to load code and data into, and to decompress graphics and sound into from slow cart ROM. (Though unlike the Lynx, you could also work from ROM directly where that's useful ... large RLE textures and look-up tables come to mind.) And while it's no DSP, a fast 16-bit multiply unit would work around one of the 68000's biggest bottlenecks for software-rendered 3D. (Incidentally something the Mega Drive really missed out on, as something as cheap and simple as the NEC µPD77C25 used as Nintendo's DSP-1 at the launch of the SNES would've allowed something probably exceeding Super FX class 3D on the MD at much lower cost ... and the same goes for the Atari Falcon, if they'd wanted a much cheaper sound/fixed-point math co-pro than the 56k the Falcon got, potentially of STe/MegaSTE vintage ... though embedding a custom 650x+multiplier chip for sound coprocessing and some 3D acceleration would've probably been cheaper for Atari, with a 650x license already in use and all.)
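Just to illustrate that 64 kB blend-table idea from a few sentences back: precompute, for every pair of 8-bit palette indices, the palette entry closest to their 50/50 mix, and translucency at draw time becomes a single table lookup per pixel. Names and the 50/50 ratio are purely for the sketch:

```c
#include <stdint.h>

typedef struct { uint8_t r, g, b; } rgb_t;   /* hypothetical 256-entry palette */

/* Find the palette index whose color is nearest (by squared distance) to (r,g,b). */
static uint8_t nearest_index(const rgb_t *pal, int r, int g, int b)
{
    unsigned best = 0, best_err = ~0u;
    for (unsigned i = 0; i < 256; i++) {
        int dr = pal[i].r - r, dg = pal[i].g - g, db = pal[i].b - b;
        unsigned err = (unsigned)(dr*dr + dg*dg + db*db);
        if (err < best_err) { best_err = err; best = i; }
    }
    return (uint8_t)best;
}

/* Build the 256x256 (64 kB) table mapping any two indices to their blended index. */
void build_blend_lut(const rgb_t *pal, uint8_t lut[256][256])
{
    for (unsigned a = 0; a < 256; a++)
        for (unsigned b = 0; b < 256; b++)
            lut[a][b] = nearest_index(pal,
                (pal[a].r + pal[b].r) / 2,
                (pal[a].g + pal[b].g) / 2,
                (pal[a].b + pal[b].b) / 2);
}
/* At draw time a translucent pixel is then just: dst = lut[dst][src]; one read per
 * pixel, which is why burning 64 kB of ROM on the table is attractive. */
```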
Oh, and of course, you avoid the issue of the relatively alien architecture the Panther object processor presented (and the Jaguar later did). The 7800 was the closest thing out there prior to it, and it was rather niche in itself (and never really broke into the European market either, so no tapping into the wealth of smaller software houses there, especially Atari-ST-friendly ones). Software rendering and hardware-accelerated blitter rendering were much better understood, and also somewhat easier to use to simulate tile- and sprite-based graphics, with the added flexibility of using framerates lower than the screen refresh rate without tearing (sprite drop-out) or flicker issues. Martin Brennan joining the Panther project in 1989 might have been an opportunity to kick some sense into things, but with all that interest in the Lynx (which went to market in 1989) on TOP of a major stake in the computer market, it's really weird that the Panther existed at all in the form it did. (It's a novel design, the sort an industry leader might be able to pull off, but not something good for a second-tier player ... let alone one built around Jack Tramiel's no-nonsense, straight low-cost consumer market ethos ... and cutthroat negotiation for that matter. Then again it was Sam in charge by '89 and Mike Katz had left the games division as well, so leadership was certainly lacking, but I thought it was Leonard and Gary who were more involved on the tech-management and logistics end ... marketing issues and negotiating with suppliers and vendors might have been Sam's fault, but that doesn't explain the odd engineering choices.)

Plus a blitter/framebuffer-optimized design would be more useful as a component in a computer, even if just tacked on as an upgrade via the STe's genlock feature. (I.e., rather than a unified bus with cycle stealing, attach the new video+sound subsystem more like a PC expansion card ... or the way most game consoles do subsystems: on a dedicated bus connected via I/O ports and DMA interfaces.) Standard STe graphics/sound for compatibility, and enhanced features for 256-color chunky pixel modes and possibly highres 4-bit packed pixel modes. (Plus, with ST video disabled, 120 ns DRAM, and a 16 MHz DRAM controller, you'd be able to run a 16 MHz 68000 without wait states, aside from DMA for disk access and such, and with no need for a cache like the MEGA STe used.) Just use a 16.0 MHz clock for ST compatibility rather than NTSC timing. (You could do VGA-style 640x480x4-bit and 320x480x8-bit 31 kHz video that way, though you'd need more page-mode bus saturation with linear reads and fewer DMA slots left for the blitter to work in ... and the CPU would be best only accessing during vblank, while the blitter and DMA sound could still use hblank DMA slots. Plus ST-style 31 kHz monitor res at a 32 MHz pixel clock leaves a LOT of hblank time available, so that'd be handy here, while dropping to 24 MHz, closer to the VGA-standard 25 MHz, would cut into that and not be slow enough to allow any interleaved DMA cycles; so 32 MHz ST-style would be handy, plus it'd allow 640x400 16-shade grayscale on old mono monitors.)

See also: https://phe.rockefeller.edu/LogletLab/DRAM/dram.htm Note that DRAM prices were falling sharply in 1990/91, where they'd jumped up in 1988 and then stagnated (crippling the ST's low cost and high performance ... or the potential for the 1040STf to become the basic standard in '88), so it was a good time to release a console with a decent chunk of RAM and have it drop in price over the following couple of years. Atari, OTOH, had the very bad luck of going with 2 MB in 1993/94 with the Jaguar, at a time when RAM prices again rose and then stagnated (due in part to a massive resin factory fire in Japan that crippled plastic IC packaging volumes and glob-top assembly), so it ended up staying relatively expensive and not being nearly as good a value as anticipated. It wasn't until midway through 1996 that prices dropped again, i.e., after the Jaguar was discontinued. (It wouldn't have been until the 1996 holiday season that a Jag+CD combo could've been a solid budget-market competitor: undercutting the PlayStation and liquidation-priced 3DO even at a sub-$200 level vs. the $250 PSX/N64 of that period.) Hell, they probably could've come out with an inexpensive 2 MB 16-bit DRAM cart for the Jag CD by 1997 given the falling price of 2 MB DRAM chips. (You'd need a DRAM controller onboard for that, and it'd be 5-cycle latency like the fastest cart ROM setting, but still pretty useful; there's also a 2-cycle high-speed test ROM setting, but that'd only be useful for fast SRAM additions ... i.e., things like a 64 kB blitter texture buffer.) In any case, 1990 was a solid time to release a console, and one Atari had the misfortune of passing up. (Those DRAM prices also would've made the 128k+512k PSRAM/DRAM floppy disk Slipstream console a good investment, though that's partially hindsight, and pure luck plus foresight of good market prediction in 1990 ... though they could've launched with 256 kB and quickly offered an inexpensive 256k upgrade card as the market trend became definitive in 1991.)
  13. One big consideration for RAM expansion on the Jaguar wouldn't be sheer added storage capacity, but speed improvements. Even a relatively small chunk of 75 ns SRAM or PSRAM (like 64 or 128 kB), 16 bits wide, on a RAM cart for the CD (or hypothetically slapped on as an add-on for a ROM-based game) could dramatically speed up texture mapping and potentially make the 68000 a bit more useful as well. The blitter can only render one pixel at a time in DRAM, and does so slowly, using 5 cycles per read and 5 per write (10 cycles per pixel). You can speed this up by loading small textures into GPU RAM, but that seriously hampers the GPU itself (contention for the GPU scratchpad will kill GPU performance). Having textures stored outside the 2 MB, 64-bit-wide DRAM bank would allow writes to DRAM in page mode (2 cycles) and fast 2-cycle memory accesses to SRAM or PSRAM, for 4 cycles per texel rendered (possibly 2 cycles per texel with SRAM; I forget whether the blitter can interleave read and write cycles across separate memory banks ... it'd be 4 cycles with PSRAM in any case, due to the slower random read/write cycle times compared to SRAM). Worst case would be page breaks in DRAM, where you'd end up at 5 cycles per texel, but that's still 2x as fast as normal Jaguar texture mapping. I also forget if the cart bus has a locked minimum cycle time, but that might also be 4 cycles. (These are all 26.6 MHz cycles, mind you.) Use of the 68k would also be aided when working in 16-bit P/SRAM, by not contending for DRAM and by allowing interleaved DRAM accesses within 68k bus cycles, ST/Amiga style, sort of (the difference being separate memory banks and the use of page-mode DRAM access). You could nest up to three page-mode DRAM accesses into a single 68000 bus cycle and have it working in the little P/SRAM bank without waits. The DSP also has a 16-bit bus and is somewhat slow/broken due to bugs, so it's a bus hog in main DRAM too, but it could be a fair bit more useful doing some parallel processing in added P/SRAM. (In typical games it's best just working from ROM for the same reason, loading sound data and such.) You also don't need anything wider than 16 bits for texture mapping (straight, uncompressed 16bpp textures), and likewise nothing wider for helping the 68k, or the DSP for that matter (also only 16-bit access).

Throwing a RAM cart like that in the box with the Jaguar CD would've made it a much more substantial upgrade back in '95, but it'd still be neat to see for homebrew stuff today (more likely SRAM + ROM arrangements, though). They could've also built it into the CD unit, but given all the delays it saw, and the added (if simple) logic required to disable the RAM when a cart is inserted, lumping it into a cartridge makes much more sense. They could've put SRAM or PSRAM on normal ROM game carts back in the day too, but it didn't make sense with the sort of sales volumes they were dealing with, and the most demanding games would've been better on CD anyway. (Stuff like Quake and Tomb Raider would've seriously benefited from something like that and would've been hard-pressed to fit on carts anyway, especially at sane price points ... added RAM aside.) It's also worth noting that the Jaguar itself supported two DRAM banks, but only one was populated (so two pages could be kept open at once, plus some interleaved accesses without nearly as much of a hit to the bus). This isn't relevant for a cart-based expansion, and they didn't include a dedicated RAM expansion slot either, but it's an interesting missed opportunity to consider: putting the DRAM control and address pins on an expansion port, even limited to 16 bits wide, would've been significant and much more cost-effective than an SRAM expansion. Though given the added pin count and traces needed for a dedicated expansion port, it would've made far more sense to just add the DRAM control lines to the cart slot and allow up to 32-bit-wide DRAM to be added on that end, mapped into the empty 4 MB DRAM bank address space. (They'd planned the Jag CD before the base system was test-marketed, and the Jag cart slot has a pretty hefty pin count, so that would've made a great deal of sense.) *The Jaguar's memory map includes two banks of 4 MB each for DRAM and a single 6 MB bank for cartridges (ROM or P/SRAM), with the remainder of the address space dedicated to internal registers, I/O, and such. They also could've used a different base DRAM configuration and populated both banks, but that either would've added cost (like 2.5 MB with just a fifth 512 kB 16-bit DRAM chip added) or required a mix of different DRAM chips with less total RAM. (Say 4x 128 kB chips for a 64-bit bank and 2x 512 kB 16-bit chips for a 1 MB 32-bit bank ... less RAM and less cost-effective smaller DRAMs for the 64-bit chunk, more board space taken up, but a potentially slightly lower initial price point, substantially better texture mapping, and more flexible use of the 68k and DSP. You could also drop down to 1 MB total system RAM with just a single 512 kB chip 16 bits wide, which would be fine for a lot of early/mid-90s cart games, arcade ports and SOME PC/Amiga ports, but you'd definitely want RAM expansion on a CD-based console in that case; the lower initial price point might have been worthwhile.) Oh, and Doom might still have been easier to code on a 1 MB (0.5 MB 64-bit + 0.5 MB 16-bit) Jaguar in spite of the lower RAM quantity, due to better use of the 68k and DSP, as well as the ability to keep textures in the 16-bit bank for faster access. (And given how much profanity Carmack left in his comments in Jag Doom's source code, you can bet the bugs /and/ the broken DSP bus controller were high on his list of complaints.) That's also still way more RAM than the 32X port had to work with slightly later (adapted from Jag Doom's source code, no less), and that port also had to make do with less ROM (3 vs 4 MB). Jag Doom probably would've had some nice music in such a situation as well. (Though 32X Doom also has a 7.67 MHz 68000 with its own private bus and 64 kB of PSRAM to work in totally in parallel; I think 32X Doom just slaves the 68k to I/O and sound and leaves the brunt of things to the 32X side.)

In case you're interested: https://console5.com/wiki/Genesis_2 https://console5.com/wiki/SRAM_512Kb:_32K_x_16-bit Sega was using 64 kB, 16-bit-wide PSRAM chips by Sanyo in later model Mega Drive 2s (and other varieties of 32kx16-bit PSRAMs on all but the really early revisions; the VA0 model 2 had a pair of 32kx8-bit PSRAMs instead). The Sanyo LC331632M was offered in 70 ns speeds as well (the MD mostly used 120 ns). And also a neat resource for DRAM pricing up through the 1990s: https://phe.rockefeller.edu/LogletLab/DRAM/dram.htm
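Rough back-of-the-envelope numbers for that texture-mapping speedup, using the cycle counts above (26.6 MHz bus clock, page-break penalties ignored):

```c
#include <stdio.h>

/* Compare the stock one-bank case (5-cycle DRAM read + 5-cycle DRAM write per
 * texel) against a separate 16-bit SRAM/PSRAM texture bank (2-cycle read +
 * 2-cycle page-mode DRAM write), per the figures quoted in the post. */
int main(void)
{
    const double clock_hz = 26.6e6;
    const double pixels   = 320.0 * 240.0;     /* one full-screen textured fill */

    const int cycles_stock    = 5 + 5;
    const int cycles_ram_cart = 2 + 2;

    printf("stock:    %.2f Mtexels/s, %.1f ms per 320x240 fill\n",
           clock_hz / cycles_stock / 1e6, pixels * cycles_stock / clock_hz * 1e3);
    printf("RAM cart: %.2f Mtexels/s, %.1f ms per 320x240 fill\n",
           clock_hz / cycles_ram_cart / 1e6, pixels * cycles_ram_cart / clock_hz * 1e3);
    return 0;
}
```

Call it roughly 29 ms vs. 12 ms for a full 320x240 textured fill, before page breaks and everything else eating into it.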
  14. I'd love to rehash some of the comparisons and Amiga vs. ST stuff ... or the business decisions, or the merits of the STe or Falcon being discussed in this thread, but I don't have the time to read through it right now. Instead, how about something on point with the original topic? Hopefully no one's posted this video already: Nice selection of MOD players on the STe (and one on the ST, plus native Amiga playback for reference). One of those uses the 50 kHz mode with multiplex mixing, which I believe is sample interleaving (i.e., a 50 kHz channel effectively becomes two 25 kHz channels by interleaving samples on a byte basis when mixing, rather than adding them together). The advantage there is slightly lower CPU overhead, but more importantly the ability to keep the full 8-bit resolution intact and not deal with clipping or quality loss. (The latter would certainly occur if you mixed a bunch of 8-bit samples at 16-bit resolution and output only the upper 8 bits to the sound hardware ... some PC sound drivers for 8-bit Sound Blaster cards do this.) Incidentally, the multiplex-mixed MOD player sounds closest to the Amiga player, to my ears at least. (Though the comparisons for all the 25 kHz and 50 kHz players are pretty close ... the 12.5 kHz one is obviously another story, as is the ST player using the YM2149 for PCM.) I forget if the STe has two DMA sound channels or not. Wikipedia mentions stereo mode being handled with byte interleaving, LRLRLR style, which implies there's only one DMA channel actually being used. Regardless, I was also unsure whether the 50 kHz mode functions as distinct left and right stereo channels. (I.e., given the LRLR interleave, does stereo have half the max sample rate of mono, or does stereo mode allow double the DMA bandwidth of mono? I assumed it works as one left 50 kHz 8-bit channel and one right 50 kHz 8-bit channel, so the latter case of 2x the bandwidth.) It might be similar to the PWM audio in the Sega 32X, insofar as having a 'mono' mode that simply writes the same data to the left and right buffers, while stereo just allows either to be written individually (with stereo and mono modes having the same resolution and frequency limits). Chilly Willy would know much more about the 32X example if he's around. There are also various 8-channel MODs, among other things, on both the Amiga and STe. (Turrican II's intro on the Amiga uses software-mixed music, going well beyond Paula's native limitations; it's rather impressive they managed to make the ST version sound like such a reasonable approximation given the limitations there ... I believe that's a normal 4-channel MOD, but some other tricks might be thrown in.) I'm not sure if Turrican II uses the 'effects' mode of the Amiga or just software-mixes in the normal 4-channel mode. Effects mode (only one left and one right channel, but with 6-bit logarithmic volume control) on the Amiga would be much more in line with the STe's capabilities and best suited for software mixing with more demand for stereo panning effects. The STe would probably have a notable advantage here, between the higher sample rate and the LMC 1992 mixing/effects chip to work with (i.e., not as good for games, but more useful for professional audio applications). The 8-bit sample limit is still a pretty big hurdle for the professional audio end of things, though, and it complicates good-quality software mixing (you need good optimization to avoid clipping while also avoiding overly quiet samples that end up with effectively much less than 8-bit resolution).
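If it helps, the difference between the two mixing approaches boils down to something like this (buffer names made up; both voices assumed already resampled to 25 kHz, signed 8-bit):

```c
#include <stdint.h>
#include <stddef.h>

/* Classic additive mix: sum and clamp, then hold each mixed sample for two
 * output slots to fill the 50 kHz stream. Costs headroom/resolution. */
void mix_additive(const int8_t *a, const int8_t *b, int8_t *out50k, size_t n25k)
{
    for (size_t i = 0; i < n25k; i++) {
        int s = a[i] + b[i];                         /* -256..254 */
        if (s > 127) s = 127; else if (s < -128) s = -128;
        out50k[2*i] = out50k[2*i + 1] = (int8_t)s;
    }
}

/* Multiplex mix: interleave the voices sample-for-sample, so each voice plays
 * at an effective 25 kHz but keeps its full 8-bit resolution, no clipping. */
void mix_multiplexed(const int8_t *a, const int8_t *b, int8_t *out50k, size_t n25k)
{
    for (size_t i = 0; i < n25k; i++) {
        out50k[2*i]     = a[i];   /* even slots carry voice A, untouched */
        out50k[2*i + 1] = b[i];   /* odd slots carry voice B, untouched */
    }
}
```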
You could potentially wire the Amiga for 16-bit mono by merging the left and right channels while in effects mode and setting the volume on one to be 255 times as loud as the other, adding together to 16-bit linear PCM (rather than the 14-bit nonlinear hack), and it could have done this out of the box if Paula had simply allowed a mono switch in software. The STe, OTOH, should actually allow this, given the LMC 1992's functionality lets the left and right sound channels be panned left, center, or right, effectively leaving both as mono. (I'm not sure of the volume granularity on the LMC 1992, but if it allows any combination of settings such that one channel is approximately 255 times as loud as the other, you could get 16-bit mono out of it.) On a side note, the STe audio is also somewhat comparable to the Sound Blaster Pro, or somewhat more capable given the 44 kHz 8-bit mono and 22 kHz 8-bit stereo limitations of that card (and it beat it to market by two years). The SB Pro also had a whole mess of FM synth channels with its dual OPL2s, of course, and the somewhat more feature-rich OPL3 in the SB Pro 2.0 (and SB16) later on, but those got fairly mediocre use, almost never in conjunction with PCM, and are mostly relevant to games rather than professional audio/music applications. (I'd argue the ST would have gotten much more use out of such FM synth chips had it ever had them, thanks to the UK/Euro chiptune scene; you saw plenty of that on the Mega Drive, but that's another story as well.) The sound hardware of the STe may not have been expressly designed for professional applications over games or home entertainment, but it certainly seems more useful for the former and shines best next to its contemporaries when used that way. (Shame it wasn't added earlier, though ... like with the 1040ST or MEGA ST, even without the LMC 1992.)
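The principle of that trick, in rough code form: the loud channel carries the top byte of each 16-bit sample and the quiet one the bottom byte, with the analog gains set roughly 256:1 apart so they sum back to (near) 16-bit linear at the output. Offset-binary is used here purely to keep the byte split obvious; a real driver would match whatever sample format the DACs expect:

```c
#include <stdint.h>

/* Split 16-bit samples across two 8-bit channels whose gains differ by ~256:1.
 * Buffer names are illustrative, not any particular driver's API. */
void split_16bit(const int16_t *in, uint8_t *loud, uint8_t *quiet, unsigned n)
{
    for (unsigned i = 0; i < n; i++) {
        uint16_t u = (uint16_t)(in[i] + 32768);  /* signed -> offset binary */
        loud[i]  = (uint8_t)(u >> 8);            /* coarse 8-bit value */
        quiet[i] = (uint8_t)(u & 0xFF);          /* fills in the remaining resolution */
    }
}
```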
  15. I'm just known for very long, often (at least partially) rambling posts. Don't worry, I'm not one of the more ... uh ... temperamental members of the community. (I'd like to think the opposite, really; I got along pretty well with Gorf back when he was still hanging around, and I learned a lot from him, Atariowl, CrazyAce, and Kskunk -Kskunk probably most heavily on the actual electrical engineering and hardware design end of things.) Honestly, if Atari had been in a sound financial situation to launch the Jaguar in 1993/94, then yes, dropping a bit more and eating the added cost would have been fine, but then waiting for a mass release in 1994 and avoiding the pre-release PR stunt in '93 would have been possible too. Throwing in an added 512 kB of DRAM would be better either way, especially if the 'addition' came down to more RAM or a more potent CPU (68EC020, Cyrix/IBM 386/486+cache, or 386DX -no cache, but at least bumping JERRY onto a 32-bit wide bus). Both would be nice, but there would still be serious real-world cost constraints. (For 1994, a 25 or 26.6 MHz Cyrix 486DLC plus an additional 1 MB 32-bit bank of DRAM would be really nice and probably not crazy on cost, but probably pushing it a bit -it MIGHT have been cheaper than using a 68EC020-25, though.) The 68k+2.5 MB arrangement is probably the safer bet, though, ideally with another six months or so to clean up bugs and spin off revised TOM and JERRY parts (plus cut a little off JERRY's cost with a 128-pin, 16-bit-bus part rather than the 144-pin package). x86 would be really nice for source ports of assembly-language PC games of the time, maybe speeding up Doom's development too. (I'm tempted to suggest a 386DX again for that JERRY bus boost ... or a Cyrix or IBM 486SLC.) X-Wing and TIE Fighter, along with LucasArts' adventure games, would be really nice on a Jaguar CD. (TIE Fighter's smooth-shaded lighting engine would be perfect for the Jaguar's 3D.) And yeah, the added 512 kB of DRAM is more for the blitter's benefit than the CPU's, and you don't NEED a 1 MB 32-bit bank to make good use of a 32-bit CPU+JERRY interface (just use the 2 MB block for that, and leave the 512 kB 16-bit block just for blitter textures). I'm tempted not to suggest the bottom-barrel 386SX given its more limited advantages over the 68k. (A 25 MHz 386SX might have had some merit ... but the added cache on the Cyrix and IBM parts just makes those so much more useful -Cyrix had more second sources and had their SLC on the market much earlier and in volume compared to the faster, larger-cache IBM counterparts, so probably the more realistic option -they were on the market by the time the preproduction Jaguar dev systems hit in '92, so a fairly real consideration, especially if Atari could get a deal on down-binned parts otherwise not very marketable.) I also just kind of like Cyrix in general; neat company at an interesting time. Oh, right, and they also probably could've kept the 26.6 MHz timing AND dropped the DRAM cycle time to 4 clocks if they'd bumped up to 70 ns DRAMs rather than 80 ns. (A better option than dropping back to 25 MHz, unless you're also using 25 MHz rated CPUs -and don't want to overclock.) I was thinking of 1 MB in terms of what the Sega CD had to work with (512 kB program RAM + 256 kB of word RAM -sometimes used as a render buffer for the graphics ASIC, depending on the mode used), and in terms of PC games typically using 640k or 1 MB up to 1993 ... but 4 MB minimum got pretty common right after that, and 2/2.5 MB would make ports way, way easier.

And yeah, Slipstream was more a suggestion of having SOMETHING to field, but the Panther might've been a better option there (just not with 32 kB) ... I suppose if they got a really, REALLY good deal on the Ensoniq DOC II, and used 128 kB SRAM + 64 or 128 kB DRAM for the sound chip, it'd be usable as-is (no redesigns to the custom chip) per the 1989 Panther prototype. (128 kB + 128 kB was probably the smarter move given how cheap 128k DRAMs were at the time ...) They were using 35 ns SRAMs on the prototype for SOME reason, though (and lots and lots of 100 ns SRAMs for the ROM emulation bank ... 2 MB of SRAM ... 16 128kx8-bit chips). And yeah, the 1989 Slipstream might have managed a better Star Fox (or Starglider III -or Return of Starglider, as was the working title). Some of the developers were pretty scathing about it in interviews, but that was the 8088 prototype with something like 1/3 or less the real-world CPU performance (way worse bus contention), plus a MOD player would take way less DSP time than FM synth (and sound nicer), so less contention between sound and 3D there too. (That said, a 16 MHz 68000 might manage a better-than-SNES Star Fox too, and with 128k you'd have room for a framebuffer-sized object on the Panther -and the majority of CPU time available due to the DMA-light nature of framebuffers vs. many sprites ... using hardware-scaled sprites for a fair amount of stuff would probably be wise too, though.) I've been rather pessimistic about the cost of the Ensoniq chip, but it'd be a pretty sweet setup if affordable. Slipstream is nice if you like the idea of a floppy-disk-based console, though. (Probably good for Wolfenstein-style games too ... then again, so would the Panther ... 32 colors vs. 256 colors, though, or more likely 16 colors dithered to simulate 256 -or ... 136 colors at a 160x200 effective screen size.) Oh, and Wing Commander 1 and 2 (3 would be a good fit for the Jaguar CD). The Slipstream was also ready for mass production in an at least usable form in 1989; the same might not be true of the Panther. (And if the work to 'fix' the Panther was more than, say, tweaking the Slipstream to take a 12 MHz 286 or 68000, it might not be worth the difference.) Oh, and floppy-wise, it's probably not worth worrying about backwards compatibility on the console front, so the Jaguar could potentially be the same regardless of anything preceding it. (Including media -using a simple SRAM+battery backup save system on the Jaguar would probably be fine too ... preferably in memory card format, and a basic slow 32kx8-bit SRAM would be fine for that -the Saturn managed to fit lots more save blocks into that than a 128k flash card on the PSX, and save much faster ... shame it was integrated and not in a card format -and the 512 kB save cart was way too expensive.) A huge chunk of this is off topic, yes, but I think it's important given the Jaguar's success was crippled by Atari's financial situation in 1993. Even a mediocre (but profitable) market success in 1989-1993 would have made a world of difference for them. (7800-level success should have been more than enough ... probably enough to keep the ST/Falcon and Lynx afloat too -both had plenty of potential left; Atari was just too hamstrung to manage all of that at once ... a chicken-and-egg, need-money-to-make-money sort of scenario -i.e., a bit more buffer to get through the roughest spots and they could have pulled out of the downward financial and management spiral, and the Lynx and computers could have remained assets along with the Jaguar ... and Slipstream/Panther in the budget-console market.) They totally could have ridden cost/size-reduced Lynx variants (with improved screens and battery life) into the late 1990s. (A lower-cost, lower-power non-backlit screen model would be wise too, but color LCDs weren't good enough for that until about 1996, I think ... had they gone grayscale back in 1989 with color optional, it would have worked, but doing that after the fact isn't practical -developers don't plan for it, so games look wrong in black and white. 16 shades of gray would be pushing it in 1989 too, but better than the GB's 4 shades would probably be possible ... maybe 8 or 16, with the condition that anything NEEDING good visibility would use higher contrast and anything subtle might or might not actually be visible.)