kool kitty89

  1. A command cache (or scratchpad ... or just prefetch queue) for the blitter probably would've been the more elegant and practical solution, yes, but I was mostly just summarizing a comment kskunk made on the issue years back. That said, wouldn't the rasterization situation also be different on the Jaguar II, given Oberon's blitter had a trapezoid drawing function? You'd only need to build lists of trap-segments to form triangles (or quads) rather than rasterizing line by line. (Incidentally, I believe several of the early PC 3D accelerators worked on trapezoids internally for polygon drawing, and some documented it as such: it's in the S3 ViRGE manual, at least, described in the section on 2D polygon rendering depicting the trapezoidal segments used for arbitrary polygon fill operations.) And on the texture mapping bottleneck, John Carmack's suggestion (in the context of something cheap and simple that they should have already included) was a 64-bit destination buffer in the texture mapping pipeline. (Though given how slow the texture mapping unit is, per kskunk's 5-cycle peak test results, that wouldn't help all that much for raw speed, no more than populating the second DRAM bank the system already supported ... so it's somewhat moot there, though I'm also unsure Carmack was aware of the existing support in the DRAM controller or the 5-cycle bottleneck of the blitter.) kskunk and several others (Gorf, Crazyace, I think maybe Atari Owl) also went over the problems with using GPU SRAM as a texture cache, particularly how it kills GPU performance if used heavily; however, kskunk's later tests seem to point to use of the line buffers as texture RAM being a lot more useful, possibly also useful as a texture render buffer. (The latter wouldn't be faster per se, but rendering from line buffer RAM into line buffer RAM, then blitting to the framebuffer in 64-bit, phrase-aligned chunks, would greatly reduce time on the bus and mean that much less contention.)
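The trap-segment idea above can be sketched in plain C: a y-sorted triangle splits at its middle vertex into at most two trapezoids, each described by a top/bottom edge plus left/right x-steps per scanline, which is the kind of list a trapezoid-fill blitter would consume. This is purely illustrative; the struct fields and split rule are my assumptions, not Oberon's actual command format.

```c
// Decompose a y-sorted, non-degenerate triangle (y0 <= y1 <= y2, y0 < y2)
// into at most two trapezoid segments for a hypothetical trapezoid-fill unit.
typedef struct {
    float y_top, y_bot;   // vertical extent of the segment
    float xl, xr;         // left/right x at y_top
    float dxl, dxr;       // per-scanline x steps for the two edges
} Trap;

int triangle_to_traps(float x0, float y0, float x1, float y1,
                      float x2, float y2, Trap out[2]) {
    int n = 0;
    float dx02 = (x2 - x0) / (y2 - y0);          // slope of the long edge
    if (y1 > y0) {                               // upper segment (apex at v0)
        float dx01 = (x1 - x0) / (y1 - y0);
        Trap t = { y0, y1, x0, x0,
                   dx01 < dx02 ? dx01 : dx02,    // leftmost-growing edge
                   dx01 < dx02 ? dx02 : dx01 };
        out[n++] = t;
    }
    if (y2 > y1) {                               // lower segment
        float xm = x0 + dx02 * (y1 - y0);        // long-edge x at the split
        float dx12 = (x2 - x1) / (y2 - y1);
        Trap t = { y1, y2,
                   x1 < xm ? x1 : xm,  x1 < xm ? xm : x1,
                   x1 < xm ? dx12 : dx02,
                   x1 < xm ? dx02 : dx12 };
        out[n++] = t;
    }
    return n;                                    // number of segments emitted
}
```

A quad would just contribute one more segment per extra vertex, which is the appeal over per-scanline setup.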
Honestly, for a game like Quake with the lighting model used, dropping textures entirely at a certain distance (Z-sorted as an extension of the existing ray-casting visibility system a la PC Quake) would've made a ton of sense to minimize texture mapping overhead. (PC Quake was also already heavily optimized to minimize actual rendering time, with lots of computational or table-based optimizations to spend as little bandwidth as possible drawing to the framebuffer, and an engine like that would adapt well to the Jaguar's bottlenecks, albeit trading more of the tables for raw realtime computation and trading the Pentium FPU pipeline-specific tricks for other things.) The SVP-chip version of Virtua Racing used a 16-bit DSP to handle the 3D math, yes, though I think it also assisted with drawing the polygons, using the 128 kB of DRAM the cart included as a framebuffer as well as work RAM (or local memory for paging things in and out of local DSP memory). It's a DSP though, not a CPU or MCU (so unlike the 32x, or even the primitive 16-bit RISC MPU in the Super FX chip) and not good for much else: not flexible general-purpose processing like the Jaguar's GPU and DSP, but good for 3D and probably OK for simple pixel/line/block fill operations. (As a DSP it also should have done well as a sound processor, but Sega didn't use it as such ... no DACs or audio input lines connected on that cart.) Unlike the Jaguar, but like the 32x, you did have multiple buses to work with, and the local DRAM was able to be flipped on and off the 68k bus for copying the render buffer into VRAM.
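The distance-cutoff idea reduces to a per-surface renderer pick; a minimal sketch, where the threshold constant, names, and enum are illustrative assumptions rather than anything from the actual Jaguar Quake work:

```c
// Pick a surface renderer by viewspace depth: past the cutoff, the
// lighting model alone carries the look, so fall back to a cheap
// flat/lit fill and skip the expensive texture-mapped inner loop.
#define Z_TEXTURE_CUTOFF 512.0f   /* illustrative units, tuned per game */

typedef enum { DRAW_TEXTURED, DRAW_FLAT_LIT } DrawMode;

DrawMode pick_draw_mode(float nearest_z) {
    return (nearest_z > Z_TEXTURE_CUTOFF) ? DRAW_FLAT_LIT : DRAW_TEXTURED;
}
```

Because surfaces come out of the visibility pass roughly depth-sorted already, the test costs almost nothing per surface.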
Now, the MD's 68k was still fast enough to do some software rendering on its own, and having a much simpler DSP co-processor that simply handled the vertex math and left all the rasterization to the 68k probably would've worked better than SuperFX-driven 3D on the SNES (or been competitive, at least), but there are no other examples of co-pro or add-on chips used on the Mega Drive at all, unless you count the Mega CD. (And unfortunately, unlike Mode 7 on the SNES, the fast multiplier unit that must be embedded in the Sega CD Gate Array for the scaling/rotation function isn't directly accessible to either CPU; otherwise it'd be handy for polygonal 3D when the scaling/rotation function wasn't in use ... really handy if they also let the Gate Array's blitter functionality work for simple copy/fill operations in proper nibble-precise tilemap pixel organization, but ... nope. Honestly, with the amount of RAM it had along with that sort of hardware assist, I'd think it would've handled Virtua Racing well enough, and probably a solid port of X-Wing, at least the more limited floppy disk version for PC: flat shading, 1 MB RAM compliance and such.) The Jag was way more powerful than any of that, though ... but yes, being able to interleave 68k access to some extent would give the advantages of a separate/local bus as on the MD. I'm not sure how the timing of the bus latches in the Jaguar works or if interleaving was really a major consideration, but it certainly had been when Flare designed the Slipstream and included a 16-bit bus latch to minimize the time the 8086 spent on the bus (in that case interleaving on PSRAM cycles with the CPU working in SRAM or DRAM on the same bus: the video processor would only work in PSRAM, so the bus cycle interleaving was based around the 12-ish MHz video DMA clock, fetching a 16-bit word once every 4 of the 12 MHz clocks).
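A back-of-envelope check on that Slipstream interleave: one 16-bit word every 4 ticks of the video clock pins the display's bandwidth to a fixed, predictable slice, leaving the other slots for the CPU or blitter. A quick sketch (using an even 12 MHz; the real clock is nearer 11.9 MHz):

```c
// Bytes per second consumed by video DMA that fetches one 16-bit
// (2-byte) PSRAM word every 4 ticks of the video clock.
unsigned long video_dma_bytes_per_sec(unsigned long video_clk_hz) {
    return (video_clk_hz / 4) * 2;
}
```

So the display takes roughly 6 MB/s of PSRAM bandwidth, and everything outside those fetch slots is free for interleaved CPU work.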
The Slipstream 4 (1993-vintage hardware done in parallel with the Jaguar) switched to dual DRAM banks with page-mode support and up to 32-bit bus width, but the intended 12 MHz 386SX should still have interleaved with video DMA cycles for most video modes provided it worked in the separate DRAM bank. (The blitter and DSP DMA may have remained serial-only with the CPU, though.) The 68k also doesn't need to hit the bus all that often to stay nearly at full speed, so a feature providing consistent, periodic wait states for slow-ish interleaved bus access would've made it a far better investment in the system as a whole, albeit using a 20 MHz rated 68k and running it at 3/4 the system clock (19.95 MHz) would probably have been more useful as well. (You don't need the 68k bus cycles to align with the system bus cycles anyway, not like ST/Amiga style 4-T-state interleaving, so the 1/2 system clock rate wasn't all that useful other than just being cheap/simple to divide.) That, and they probably spent way too much on Jerry given its limited use (it's a decent audio DSP, but too bottlenecked with its very slow bus cycle times and some other bugs, plus the need to page code to local RAM to work, thus nixing it as a stand-in CPU, at least when coupled with the slow bus connection, as opposed to paging code modules to the GPU; the DSP even made a poor geometry processor, as writes to main RAM were twice as slow as reads, slower than 68k writes in fact: 12 cycles for a 16-bit word, while reads were less crippled at a hard-coded 6 cycles). The DSP's slow bus cycles would have made it reasonable for interleaved access on a separate memory bank (the second DRAM bank and ROM bank) but otherwise it's pretty crippled and a serious bus hog. (Compared to just including a rudimentary DMA sound/UART ASIC ... or possibly including an embedded, easily licensed low-cost MPU like a 65C02 or 65816 a la Lynx ... 
as a sound processor, or maybe in lieu of the 68k, though it'd make coding in C a lot tougher ... for lazy C or 68k assembly source ports, for what that's worth.) Well, that, or the Flare II team could have ditched the 68k and JERRY some time early in 1992 in favor of a nice, flexible microcontroller. Hitachi's SH-1 had just been released, and would've been really appealing as a low-cost CPU+sound processor combo, but aside from happening to catch that brand new design being released, there was also AMD's embedded range of the 29000 series, particularly the bottom end of the microcontrollers in the family, the AM29205. (Both that and the SH-1 had onboard DRAM controllers, and both also used 16-bit data buses for lower cost and pin count, so it also would have been fairly simple to include a local, dedicated bus to work in rather than sharing the graphics bus ... Flare could've just dropped to a conventional main bus + GPU bus layout and also used a simpler, 16-bit cart slot and copied/DMA'd data to the Jaguar graphics bus through I/O ports rather than sharing everything ... plus, no fragile and more expensive 32-bit MCB/VESA/PCI-ish connector for the cart slot to deal with, just something close to SNES/Mega Drive, or ISA slot pins.) Though, that said, it also shouldn't have been too tough for Flare to include a (slow) 16-bit DRAM controller and basic sound hardware (possibly the 16-bit Flare DSP or just DMA sound) and UART on a 16-bit bus controller ASIC to complement a 13.3 or 19.95 MHz 68000. (A unified bus reduces cost, but swapping JERRY for a much smaller, simpler, and slower ASIC, possibly with a lower pin count, also saves costs and would have just made more sense ... 
it also could have used gate array logic: slower, lower density, but for a much smaller and simpler custom chip it would have the advantages of being much easier to prototype, faster to bug-fix, and cheaper to start up production than the standard cell ASICs used for TOM and JERRY.) Oh, also note, the 3DO was horribly bottlenecked when it came to anything short of solid-shaded polygons, as heavy use of textures (3D or 2D) was mutually exclusive with CPU operation since main RAM was texture/sprite/etc. RAM; plus it used forward texture mapping (like the Saturn, and the Lynx for that matter, though that's just sprite-drawing) where texels are read out a line at a time and drawn to the screen multiple times if down-scaled or folded, reducing fillrate further and also breaking translucent blending and Gouraud shading. (Or corrupting both, due to drawing folded pixels multiple times and warping the shading gradient.) Plus you had strict library-level coding on the 3DO without the ability to clean up bad compiler output with some hand-tuned ARM assembly. If the Jaguar had a bit of smart bus interleaving on multiple memory banks, the 68k might not have fared that badly next to 3DO games. (Albeit the CD-ROM mass storage issue was a factor for actual software development ... and PC games that the Jaguar really would've been well suited for: especially various flight/combat sims using shaded 3D or limited texture mapping ... and heavy keyboard commands that made the Jag-pad really appealing.) They weren't greedy, they were poorly managed (I blame Sam Tramiel mostly) and extremely desperate, plus it was also just poor timing, as DRAM prices stagnated from 1993-1995 and finally dropped again just after the Jaguar was discontinued. (By fall of 1996, the Jaguar Duo could probably have been a genuine low-cost/budget range alternative to the PSX and Saturn ... 
and the idea of including a RAM expansion cart as standard and offering it at a very low price to existing users would all have been feasible.) But in 1993, Atari was desperate: they had a big lawsuit over old patents pending with Sega (which would create a windfall in 1994) but in the meantime they were struggling, downsized to a skeleton of a company, and made the decision to discontinue their computers, somewhat marginalize the Lynx, and put a ton of effort into a Jaguar propaganda campaign to drum up investor cashflow. And it worked: it scared the crap out of Sega (at least the Japanese executives) and got Motorola and IBM onboard for manufacturing, along with sufficient investment backing to bring the thing to market. Still, it was wholly mismanaged, and the UK/European market entrance was both late and particularly poorly supported ... all really bad decisions on top of cancelling the Falcon. (Cancelling the Falcon in the US and continuing to market computers only to the smaller, easier to support and market to UK and European market, or at least the UK, France, and Germany, would've been more reasonable ... more so if they'd worked the Jaguar chipset, or TOM specifically, into a second-gen Falcon project, like as part of the Falcon040 or a lower-cost '030 counterpart.) The further irony, of course, is that CBM fell out of the computer market, leaving a void for the Wintel invasion to finally take the UK/Europe at a time Atari might have continued to compete (especially with the shift towards open-source GNU OS extension development with MiNT/MultiTOS), plus Sega ended up dropping the Game Gear, leaving even less competition for the Lynx. (Plus the Game Boy Color failed to even match the Lynx's hardware, and continued cost/size/power reduction left tons of room for Lynx updates to compete.) Atari's lack of a significant mainstream game console from 1990-1993 (or, given the Jaguar remained niche ... 
from 1990 onward) was a big gap on top of the ST and 7800 sales struggling somewhat in '89, and Sam Tramiel's management ... or perhaps more specifically: Jack's retirement as CEO and Mike Katz's leaving as president of the Games division seriously crippled the whole operation. Katz thought failing to agree on terms with Sega for Mega Drive distribution was a mistake, but even that aside, I can't imagine he couldn't have guided things better after that with the Panther development, Lynx release, Panther cancellation and possible short-term alternatives, etc. (They needed something reasonable to launch in 1990-91, maybe '92 ... a 'fixed' Panther without so many shortcomings, or a derivative of one of the many incremental developments of Flare's Slipstream ... or a much more conservative Jaguar, which would also fall into the 'fixed' Panther configuration: i.e. support for FPM DRAM operation, enough RAM for a decent-sized framebuffer, addition of a blitter, integrated sound hardware, and some intention for polygonal 3D, but without the custom RISC architecture being implemented: the Flare DSP was enough for geometry co-processing along with CPU-assisted triangle set-up and blitter line fills.) Anyway, I wouldn't blame greed as one of the Jaguar's main problems, or Atari's ... though lack of honesty might have been a major problem, along with poor management and negotiation skills on Sam's part. (Dishonesty with developers during the Jag's lifespan seemed to be one of the problems ... dishonesty with investors was too, which was forgivable to some extent in 1993 with Atari being on the verge of collapse, but much less so after they got market recognition and just seemed to go ... weird or incompetent with what added resources they were afforded.) The late introduction of the CD add-on, DRAM prices keeping the base system price point high, and Sony's splash in the market all didn't help, of course.
Actually, with all that in mind, Atari probably made a bad bet dropping computers in favor of a new game console ... the Jag chipset (or just TOM) might have been more successful in the Falcon series than it ended up being as a console. (As it was, I think the Jaguar's sluggish sales didn't compare too well to the Falcon's sales for the short time it was on the market.) Plus the DRAM cost overhead was a lot more justified in a computer system, and floppy disk software was the norm, so no cart-vs-CD headache to decide over (or being the only console on the market using floppy disks ... especially in 1993/94), plus ... TOM would've been a lot more potent alongside a 68030 on a dedicated bus (even the 16 MHz '030 on the 16-bit bus of the Falcon 030), and that's not just hindsight ... though obviously pure spec-fic fantasy. (Well ... I can't help but imagine Jack Tramiel would've put more interest in working a potent new graphics/multimedia processor into a low-cost, mass market home computer rather than 'just a games machine' ... but ... ) Oh, and, back on the topic of real-world relevant stuff: I'd missed out on the new (or last couple years of) Jaguar Flash cart development project (and the thread on that), so my comments on a RAM cart earlier in the thread are a bit moot there, as such a RAM cart is in the works, just not Jag-CD oriented. 16 MB of SDRAM acting as cart ROM or RAM is pretty neat, though I'm not sure whether the full 16 MB or just 6 MB (no bank switching) is planned to be implemented, but the project looks super neat. (And totally relevant to some neat workarounds for homebrew programmers to exploit ... provided folks are interested in that, and interested in digital distribution of freeware or online-store style software distribution ... or crowdfunded early-access sponsored stuff with free distribution later on ... 
that and just freely distributed tech demos and hacks, as are common for homebrew on a bunch of old game systems and several computers, even 'obscure' or 'failed' ones like the 32x)
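To put the JERRY bus-cycle figures mentioned above (12 cycles per 16-bit write, 6 per read) into bandwidth terms, a rough sketch; the ~26.6 MHz value is the commonly quoted Jaguar system clock, and the cycle counts are taken straight from the post:

```c
// Effective DSP main-bus bandwidth in bytes/second, given a fixed
// cycle cost per 16-bit (2-byte) transfer at the system clock rate.
double dsp_bus_bytes_per_sec(double clk_hz, int cycles_per_word) {
    return (clk_hz / cycles_per_word) * 2.0;
}
```

At ~26.6 MHz that works out to roughly 8.9 MB/s for reads but only about 4.4 MB/s for writes, which is exactly why the DSP made such a poor geometry processor: spitting transformed vertex data back to main RAM is the slow direction.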
  2. Yes, sort of: http://www.konixmultisystem.co.uk/index.php?id=interviews&content=martin Martin Brennan was brought in to consult on the Panther project in 1989 (the production-ready 8086 version of the Slipstream ASIC was completed by then). John Mathieson would join the Flare II (Jaguar) project later on, around 1991 I believe, while Ben Cheese (the DSP and sound guy from Flare I) would move on to Argonaut and design the Super FX GSU, then help found the Argonaut RISC Core spin-off company. (Note the GSU-1 was not a DSP like in the Slipstream ASIC, but a fast little 16-bit RISC CPU: 16 16-bit registers, 16-bit address bus, 8-bit external data bus, and the multiply-accumulate performance was poorer than the Flare DSP's: 1 cycle for an 8x8=>16-bit multiply, 4 cycles for 16x16, vs a 1-cycle 16x16 on the DSP; but as a CPU it was much more flexible and could run most of the game engine on its own, plus do the polygon drawing operations in its 32kx8-bit SRAM chip, and was optimized for bitplane and tile conversions.) http://www.konixmultisystem.co.uk/index.php?id=downloads (see the Slipstream 1.06 documents for the 1989 8086 production version) Anyway, Brennan was brought in on the Panther project here. Meanwhile, Konix was having trouble and LucasFilm/Arts decided not to go through with their prior considerations of licensing the Slipstream chipset for the US market. (Konix had a non-exclusive license, so Flare could have sold it to anyone else on varying terms, sort of like the Amiga chipset prior to the 1984 CBM buyout debacle.) They also continued developing the Slipstream in parallel with the Jaguar and expanded it in various steps, up to a 32-bit data, 24-bit address bus, 2-bank DRAM based system with a somewhat Jaguar-like blitter, a 25-ish MHz suggested clock rate, support for a variety of CPUs (though a 12 MHz 386SX was the working model of 1994), a 25 MHz DSP, and a CD-ROM controller/interface.
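Those multiply figures can be turned into a rough per-vertex cost comparison for a 3x3 rotation (9 multiplies); a sketch using only the multiply costs quoted above, with adds, loads, and stores deliberately ignored:

```c
// Multiply-only cycle cost of one 3x3 matrix * vector transform
// (9 multiplies) per vertex. Per the figures above: 4 cycles per
// 16x16 multiply on the GSU-1 vs 1 cycle on the Flare DSP.
int rotate3x3_mul_cycles(int cycles_per_16x16_mul) {
    return 9 * cycles_per_16x16_mul;
}
```

So on raw multiply throughput alone the Flare DSP does a vertex rotate in a quarter of the GSU's cycles, which is the trade against the GSU's far greater flexibility as a CPU.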
It also worked on 15/16-bit RGB rather than CRY color and had the whole system contained in one ASIC (DSP+blitter+VDC+UART+CD-controller), though the CD-ROM interface was apparently buggy at the time. See: "Slipstream Rev4 Reference Guide v3.3" http://www.konixmultisystem.co.uk/index.php?id=downloads Now, what I wonder about is why Martin Brennan moved forward with the Panther project while not pitching the Slipstream to Atari. (Or maybe he did but made no mention of it in the interview.) It was a nice, flexible, reasonably potent little system, though it had its share of limitations compared to the Mega Drive (already on the market, and Atari Corp themselves had reviewed the hardware in 1988 and decided not to take Sega's licensing/distribution terms for North America: Mike Katz had wanted to, but Jack Tramiel and Dave Rosen couldn't agree on favorable terms, plus they'd still be contending for the UK/European market). Plus it was ready-made and ready for mass production, and the chips were made on gate array logic so should have been fairly adaptable to second-sourcing to whatever vendors Atari had the best deals with. On top of that, it had software in development already on the Konix end, a bunch of UK developers familiar with the hardware and its quirks, and a somewhat home computer or PC style architecture in general that would lend itself well to computer game ports (plus actual IBM compatible ports using 8086 Real Mode ... ugly, yes, but for PC games already coded for such, or for the 286 in 640k or less of RAM, it would be a smoother transition for assembly-language coded games). It relied on PSRAM to get good bandwidth and do some interleaving, though it had DRAM for the blitter and CPU to optionally use (up to 256 kB PSRAM and 512 kB DRAM), and was fastest at rendering 256-color graphics.
(Using an 8bpp chunky framebuffer at 256x200 or up to a 256x256 display, and allowing more compact 16-color 4-bit sprites/objects to be used via a mask register; 16-color framebuffer modes were slower to render to, as the blitter had to do reads before writes to work on nybbles.) It also had a fast DSP and fast line-fill operations useful for doing flat shaded polygons or a mix of other effects (including scaling or even scaling/rotation texture mapping type effects) in realtime. (Though sound engines doing DSP-intensive synthesis routines would make that hard, ones just using sample-based sound like Amiga MOD or such would use very little DSP time at all, especially for 4-channel music and a couple of SFX channels, even if doing ADPCM decoding as well.) The 6 MHz 8086 was slow, but relatively cheap. However, it would've been a bit painful to adapt to ROM carts due to the 1 MB address limit (and less than 256 kB was reserved for ROM in the Slipstream). OTOH the system was intended to use 880 kB floppy disks instead, and a DSDD floppy drive would add to the base unit cost, but make it even more appealing to computer game developers (and a lower risk all around than manufacturing masked ROMs ... something impossible for some smaller devs and publishers at the time). Plus you could make big home computer/PC style multi-disk games with lots of multimedia features. Given Atari's focus on the home computer game license side (during the 7800 era) on top of its library of ST games, plus its existing supply chain of DSDD floppy drives for the ST line, it would seem an appealing option. (Plus the proprietary 880k format and some level of encryption would be appealing for copy protection, and would also avoid the need for funky floppy disk DRM schemes typical of the era.) The Panther, OTOH, was half-finished, rather odd, and not all that cost-effective.
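The nybble penalty mentioned above is the classic packed-4bpp read-modify-write: the byte holding the neighboring pixel has to be read, masked, and written back, an extra bus read the 8bpp chunky modes never pay. A plain-C illustration (the nybble ordering, high nybble = even x, is my assumption, not the Slipstream's documented layout):

```c
// Write one 4-bit pixel into a packed 4bpp framebuffer. Each byte
// holds two pixels, so the other pixel's nybble forces a read (the
// extra cycle the blitter pays) before the masked write-back.
void put_pixel_4bpp(unsigned char *fb, int x, unsigned color) {
    unsigned char b = fb[x >> 1];                   /* read-before-write */
    if (x & 1) b = (b & 0xF0) | (color & 0x0F);     /* odd x: low nybble */
    else       b = (b & 0x0F) | ((color & 0x0F) << 4);
    fb[x >> 1] = b;                                 /* write back */
}
```

An 8bpp store is a single write with no mask, which is exactly why the 256-color modes rendered faster.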
(To keep costs down it used 32 kB of 32-bit VERY fast SRAM, we're talking 35 ns, like 25 MHz 386s and 486s were using for cache, but aside from a proposed 64 kB of DRAM for the Ensoniq DOC-II sound chip, that was it for onboard RAM.) It worked like the 7800, using cycle-stolen DMA to load sprite pointers and data (it was a direct precursor to the Jaguar's Object Processor, but with no optimization for DRAM use), and would mainly rely on reading from 32-bit wide ROMs. It had a 16 MHz 68000, but with the existing 1989/1990 configuration, the 68k would spend tons of time halted for DMA, much like the 7800's 6502, plus it'd have wait states if working in slow/cheap ROM, while fast ROM (like the PC Engine/TG-16 used) would've been really costly for a company like Atari (NEC had in-house manufacturing but still typically used half the ROM size of contemporary publishers: like 256k where the MD was commonly using 512k in 1989), and while adding some more hardware could have fixed some of that and potentially cut costs by removing the Ensoniq chip (say, a DMA sound + bus controller + DRAM interface chip) and allowed use of slow and even 16- or 8-bit wide ROMs loaded into DRAM, that was yet more added work and not something that even happened up to 1991, when the Panther was formally cancelled. So given the Panther lingered on in development to early 1991, the ready-made Slipstream becomes even stranger to pass up, plus tweaking things to allow a 12 MHz 68000 or 286 given the added year of development time should've been child's play compared to completing and fixing the Panther. (68k would be cheaper and generally more friendly, but 286 would make existing Konix dev work easy to port, plus lots of PC games ... either case would also give 24-bit address space to use for cart ROM if they decided to ditch floppies.) DRAM prices had also dropped a great deal in both 1990 and 1991, and loading the maxed-out 512 kB DRAM would've been easy.
(128 kB PSRAM would've been enough for most purposes as well, though 256 kB would be nicer: you only need 128k to double buffer the max 256x256 PAL screen, but having fast RAM left over for DSP DMA use would be significant, including doing 3D matrix processing and spitting out vertex data.) *Note, they could easily have just kept the 1 MB address space limit for the chipset itself and let the host CPU alone work in 24-bit space. (That'd be sort of like an Amiga based console where the OCS could just access 512 kB and most/all ROM stuff would be up to the CPU copying to RAM as needed.) Oh, and Atari had already been sourcing 12 MHz 286s for their PC line around this time, so that would be another consideration for that choice. I'd say floppy disks would be the most novel option at the time, bridging the gap between cart and CD based consoles. Albeit, on a purely engineering note (and one kskunk made years ago), a CD-ROM drive is actually cheaper to manufacture than a DSDD floppy drive (and vastly cheaper than something like a ZIP drive or LS disk drives), but the tech was all patented and had a premium on it in the early 90s, and also didn't have the raw volumes for economies of scale quite yet (the Jaguar was released around the time the scales were tipping), so a common DSDD floppy drive would be the cost-effective mass storage option in 1989-1991 for sure. (720k PC/Atari, 800k Apple, 880k Amiga: all the same drive and disk track system, though using different sector sizes; also little-endian data for PC, same for the Slipstream ... ignoring a 68000 based one.) Oh, and the Multisystem's post-Konix era development as a set-top box included 286, 386, and 68000 configurations, mostly at 12-12.5 MHz. Or at least the 68000 came up at one point: http://www.konixmultisystem.co.uk/index.php?id=multisystem2 That whole situation was a mess (not the hardware, but ... 
Wyn Holloway's end of things.) As an aside, I think failing to capitalize heavily on the home computer and PC/DOS game market was one of the Jaguar's failings, though also one partially forced by using ROM carts. The keypad on the controller and the capabilities of the system would've made it really neat for early 90s PC games, 3D and otherwise, including Wing Commander I and II (III would need CD), X-Wing, various LucasArts adventure games, etc. (Most of that, sans full-on FMV games, could be done via floppy, but I don't think 1.76 MB DSHD floppy disks would've been all that appealing in 1993/94 ... or rather, more likely to get weird looks than 880k would have back in 1990.) The Panther's gamepad was essentially the same as the Jaguar's, so equally well suited to keyboard-heavy games (with or without overlays), but the 3 face buttons would've been much less outdated for 1990. (They used the same I/O port mapping as the STe joyports anyway, so STe/Falcon games could use them.) Albeit, if using the existing I/O ports the Slipstream ASIC had, you'd need to reduce the number of key inputs, or add another chip for expanded I/O. There are 16 I/O ports for the joysticks already, plus 3 potentiometer inputs and a light pen input, so you could have partial STe port compatibility with 8 bits of I/O per channel, 2 analog POT/paddle/axis inputs on one port and one paddle plus a light pen (light gun) input on the other. Doing a bit of multiplexing like the Mega Drive did (6 bits of I/O in that case, though only multiplexing 2 of those for more buttons) would've been one route to get the full pinout. (Plus doing 8 bits + ground is already going to make a pretty thick joypad cable; doing the full 12 bits of I/O the STe/Jag used would be less than cost-effective.) *Of course, the STe's ports were originally intended to allow splitters for 4 single-button Atari joysticks or 4 paddles, and the pin designation heavily points to this.
http://old.pinouts.ru/Inputs/EnhancedJoystickAtari_pinout.shtml (neat, but overkill) The cost of a little multiplexing logic would be well worth avoiding thick, expensive, awkward cables in any case. (Nintendo, OTOH, had been using serial-based controllers since the Famicom, but the approach at hand is already 8-bit parallel oriented, and multiplexing that would be pretty safe to get a good cost compromise ... you could also use an analog matrix like the VCS keypads, but that's both odd and not really cost-effective by then: Gravis used analog lines for its gamepad's d-pad, but that was partially due to making it compatible with 2-axis analog joysticks, allowing normal joystick games to use 8-way digital control via a primitive resistor DAC.) Also, side note on the Falcon: had Flare spun off a cut-down DSP-only ASIC designed to work around the Falcon's DMA/bus timing, with a bit more on-chip RAM (like 2 kB rather than the 1 kB of the 1989 Slipstream ... or technically 1.5 kB, but the last 512 bytes doubled as CRAM and were used up when all 256 colors were employed) and run at 16 MHz, it would've been a major cost saving measure over the Motorola 56k and its 192 kB of super-fast 25 ns SRAM (that's 33 MHz 386/486 cache RAM there). That, and possibly ditching the added Falcon sound channels in favor of 16-bit PWM DAC output from the DSP (at 16 MHz, the existing PWM registers would allow up to 125 kHz 14-bit stereo, up from 93 kHz in the standard Slipstream at 11.9 MHz, though somewhat less than the 208 kHz 14-bit stereo the Jaguar was capable of: all these systems used pairs of PWM registers to generate 7 bits each and add to 14 bits ... though the PWM registers in JERRY might be broken on the Jaguar, as I think it used external 16-bit DACs). You'd need the 8-bit STe PCM channels there for compatibility in any case. That, or the Falcon DSP should've been an optional module.
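The PWM sample rates quoted above check out arithmetically: a 7-bit PWM register reloads every 2^7 = 128 clocks, so the maximum sample rate is simply clock/128 (pairing two registers raises the resolution toward 14 bits, not the rate). A quick sanity check:

```c
// Max sample rate of a 7-bit PWM output: one period per 2^7 = 128
// clocks. Summing a pair of such registers gives ~14-bit resolution
// at this same rate.
unsigned long pwm7_max_sample_rate(unsigned long clk_hz) {
    return clk_hz / 128;
}
```

16 MHz gives 125 kHz, 11.9 MHz gives ~93 kHz, and 26.6 MHz gives ~208 kHz, matching the figures in the post.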
(Neat for a dedicated sound processing system and neat for 3D coprocessing, but a significant detriment to the price point ... then again, offering a MEGA STe style 16 MHz 68000+16k cache in place of the 68030 would've also been an appealing lower-end option in 1992, and might be faster than the '030 in situations where the tiny 256 byte caches it had were insufficient and VIDEL was stealing tons of bus time, like in the 16-bit color mode or even some of the 256 color modes: a plain 68000 working without wait states in 16 kB of cache would have lots of appeal there ... oh, and the 16 MHz blitter would get much more use than for just compatibility.) And for anyone wondering: the existing Slipstream would've been a poor add-on for the STe, as it supported 256 and 512 pixel modes that would leave huge borders if synched to ST SHIFTER pixel rates (8/16 MHz), plus that'd require a 16 MHz Slipstream and faster PSRAM anyway. (Commissioning the Flare team to design an ST-flavor of ASIC would've been interesting ... and more worth their time than the Panther IMO, but then VIDEL was really OK as it was in 1992 and offered good backwards compatibility, while the Jaguar was really epic at the time, and a bug-fixed JERRY chip given dedicated RAM to work in and genlocked onto Falcon video would've been awesome in 1994, possibly as part of the cancelled Falcon 040 ... though a 24-40 MHz '030 based system with 32-128 kB of board-level cache would've been fine as well, competitive with the 40 MHz 386 systems still popular at the time and then some ... but a 24 MHz 68040 would certainly have been nice; a 40 MHz '030 is just kind of nice given it's really easy to get off the ST-compatible 8 MHz base clock, also nice for the 26.6 MHz the Jaguar chipset was managing: i.e. 2/3 of 40 MHz.) Oh duh, I forgot: the Slipstream hardware also would've been really appealing to cross-develop Lynx games for.
The CPU architecture is different, but the mix of blitter+coprocessor+framebuffer and packed pixels was quite similar, as was the 3D/pseudo 3D functionality and emphasis on mass storage. (the Lynx's chipset was originally going to use tapes, but very slow, cheap 8-bit ROM ended up being the practical solution for a handheld ... meanwhile floppies were the go-to option for a console) And following suit from the 7800 (and Epyx connection) the Lynx was already leveraged fairly heavily towards the computer game pool of developers and publishers. Ah, the Slipstream and Lynx also both used 12-bit RGB, like the Amiga and STe as well. (the Lynx and STe were just limited to 16 colors ... though for the Lynx's screen that was arguably overkill: same for the Game Gear doing 31 colors in 12-bit, aside from a few games using the Master System mode) As for exclusive games: Starglider III was planned (though it probably had some elements re-used for Star Fox after being cancelled), and Jeff Minter had a lot of neat ideas going on, including Attack of the Mutant Camels now playable via emulator. (though the sound is a bit bugged, or it's due to lack of lowpass filtering) Edit: I forgot: by 1989, Konix had moved on to a 256 kB fully-loaded PSRAM configuration due to complaints from developers running out of memory, particularly for games using page-flipping (double buffering), though DRAM still wasn't included as standard at that point, I think. On that note, Atari could've released a 256 kB system and stuck in a pair of 30-pin SIMM slots for RAM expansion. 
(a nice cost-effective idea at the time and borrowing from the ethos of the STe, but in hindsight a VERY good idea as not only did DRAM prices drop fast, they then stagnated in 1992/93 while the price of 256 kB SIMMs dropped through the floor due to the limited demand: hence the popularity of SIMM savers at the time to re-use older low-density SIMMs as 1 MB) Plus slow, old, 150 ns DRAM would be fine in the Slipstream, as would anything newer, so they could literally use refurbished SIMMs if they wanted. (and people could upgrade using cheap second-hand SIMMs or cheap overstock/surplus ones common on the market) This was also evident in ST magazine ads at the time that had the 512kB upgrade kits much cheaper per-byte than all the other options. (520 STe to 1040 STe upgrades were cheap, just add 2 256 kB SIMMs, while other configurations required 1 MB SIMMs or possibly non-standard 512kB SIMMs, which were much more expensive for the same amount of RAM, but you only got four slots on the STe, so 1MB was the max using 256k SIMMs ... and SIMM savers wouldn't fit in an STe case, maybe the MEGA STe) That situation with 256 kB SIMMs is narrated here from the early 90s PC perspective: http://www.redhill.net.au/b/b-93.html http://www.redhill.net.au/b/b-94.html That RAM price stagnation was very much like what had crippled Atari in 1988, driving the price of the 520ST up to Amiga 500 levels and killing the 1040ST's potential to become the mainstream/baseline standard. (for general-purpose use a 1040ST at a similar price to Amiga 500 would be an obvious sell, but even for games, the added RAM and savvy RAM-centric tricks like heavy use of pre-shifted animation would've made the platform cut in further to the Amiga and general computer/console game markets ... 
potentially even better sound, given larger samples for software MOD or, better, ST-specific tracker formats for intros, cutscenes, and non-gaming purposes) Had they standardized the Blitter with the 1040STf in '88, that'd also boost things a bit, including for 3D stuff. (8 MHz 68k + 1 MB of RAM, look-up tables for faster multiplication, and a blitter for fast polygon fills ... also faster at sprite drawing and block-copy, and realtime bit-shifts rather than pre-shifting; a significant boost even without hardware scrolling ... also more potential to eat up CPU time doing interrupts for sample playback) In any case, RAM prices jumped up in '88 and stagnated. (they didn't jump as much in '93, but they stagnated heavily into 1996) See: https://phe.rockefeller.edu/LogletLab/DRAM/dram.htm under "Annual DRAM price-per-bit ($ per Mbit)"
  3. Why not just use an SD cart that uses SRAM as the simulated ROM, but allow for games to reserve part of that address space for variable use. (or rather than 'reserve' just require software to manage the memory in a responsible manner and avoid writing to address space that's supposed to be treated as ROM) Some of the Mega Drive (and I imagine SNES, GB, etc) flash carts are actually SRAM carts, though I'm not aware of homebrew software exploiting that. (it's just a design choice and leads to faster loading and avoiding burn-out of flash memory) Firstly, it at the very least greatly benefits texture mapping speed, especially for large textures where buffering them into GPU SRAM or line buffer SRAM would be impractical. (plus GPU SRAM chokes the GPU if used for textures, while at least line RAM exploits allow the GPU to continue working) So you can actually hit the peak 5.32 Mpixels/s fillrate of the blitter for texture mapping (or scaled/rotated objects ... or just scaled objects where you need per-pixel granularity with the framebuffer that OPL sprites wouldn't provide ... you can't use the Z-buffer for OPL sprite priority overlapping a 3D scene, unfortunately) I was also mistaken earlier: it's not 10, but 11 cycles to render a single pixel when texture mapping. 2 cycles for the read, 3 for rowchange, 1 for R/W change, 2 for write, another 3 for rowchange, and repeat. Using 2 separate banks of DRAM (or any memory with cycle times no more than 5 cycles) takes 5 cycles instead; I thought it could be faster, but see below for kskunk's quote: the blitter can't render textures faster than 5 cycles (26.6 MHz ticks) per pixel, thus the worst-case timing in DRAM (5 cycles for a read+rowchange) wouldn't slow down the blitter at all. To put it another way, for a game that uses texture mapping in main memory, you'd spend 45.5% of the time you normally would on the bus, and that much more time for other things. 
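The cycle arithmetic above can be tallied in a few lines of Python (the per-step figures are straight from the post; the 5-cycle floor is kskunk's measured blitter limit):

```python
CLOCK_HZ = 26.6e6                 # Jaguar system clock

# Texture-mapped pixel in a single DRAM bank: read + rowchange + R/W
# turnaround + write + rowchange (per the breakdown above).
single_bank = 2 + 3 + 1 + 2 + 3   # = 11 cycles/pixel
dual_bank = 5                     # blitter's internal per-pixel floor

mpix = lambda cyc: CLOCK_HZ / cyc / 1e6
print(f"single bank: {single_bank} cycles -> {mpix(single_bank):.2f} Mpixels/s")
print(f"dual bank:   {dual_bank} cycles -> {mpix(dual_bank):.2f} Mpixels/s")
print(f"bus time vs. single bank: {dual_bank / single_bank:.1%}")
```

This reproduces both numbers quoted above: the 5.32 Mpixels/s peak and the 45.5% bus-time figure for the dual-bank case.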
(the more texture-heavy a game is, the more dramatic the advantage) On the 'back then' hypothetical end, we could imagine the Jag CD coming bundled with the RAM cart at its release in 1995, if you want to fixate on the split development base issue. (and in the case of a 32kx16-bit PSRAM chip on-cart or a pair of 32kx8-bit SRAMs, it would've been both cheap and foolproof enough to pull off: with a DRAM cart, it might be cheap enough, but I could imagine delays in actually implementing a little DRAM controller/interface ASIC if they weren't planning ahead ... they obviously hadn't planned ahead for RAM expansion given the lack of external connectivity for the unpopulated main DRAM bank: that would've been cheaper and simpler to expand than any of the above, and taken fewer pins given the multiplexed nature of DRAM ... a little 40-pin edge connector would've been sufficient for a 16-bit DRAM interface, that or just put the DRAM control lines on the cart slot for intended expansion use) Or if they were really planning ahead, perhaps even arrange the Jaguar's DRAM originally as 512kB 64-bit in one bank (4 64kx16-bit DRAMs) and 1MB 32-bit DRAM in the second (two 256kx16-bit DRAMs) while still using a 32-bit wide cartridge bus, but adding the necessary DRAM control signals and allowing that connector to serve both as the interface for the cart address space at 32-bits wide AND to allow expansion of the other 32-bits of the second DRAM bank. (ie the CD add-on could have another 1MB 32-bit wide chunk of DRAM, but mapped to the same 64-bit addresses as the existing 1MB bank, interleaving on a 32-bit word basis and now providing two complete 64-bit DRAM banks, 2MB + 512kB ... and now you could have a cart passthrough without touching any of the cart ROM address space ... 
plus it's cheaper, no DRAM controller, much more seamless, and much more generally useful for the fast, fully 64-bit portions of the system) On top of all that, the Jag would've been moderately cheaper to manufacture at launch and still a good bit more flexible/powerful due to the reduced bus contention. (fewer page-breaks by doing more work in different DRAM banks as much as possible, plus faster texture mapping and faster 32/64-bit blits as well, as source could be in one bank with destination in the other, keeping nice 2-cycle page mode accesses going) And 1.5 MB was still plenty for the time, and much nicer than what Sega CD or 32x programmers had to work with. Now if they wanted to get fancier and make JERRY slightly less crippled, they'd have also added a 32-bit wide CPU in place of the 68k. (68EC020, 386DX, ARM60, maybe one of the lower-end embedded flavors of the AM29000 series, etc ... the 020's I-cache would help a bit too, but whatever was cheapest would be best ... the Jag was already designed with big or little endian in mind, so reconfiguring that would've been less of an issue ... a Cyrix 486DLC with the 1kB on-chip cache was also nice ... or IBM's similar chips, but those probably would've only been cheap from the desktop PC perspective, not from an embedded system/console standpoint: the AM29000's low end options also lacked cache, but you've got the massive array of 192 32-bit registers to consider ... a neat complement to the 64-register J-RISCs) But more to the point at hand: Jerry is more of a bottleneck in the CD than with cart games, as you can have it work largely in ROM to read samples or other data or copy code (or have the blitter copy chunks to JERRY's SRAM) while avoiding hitting main DRAM and thus avoiding performance-killing page-breaks caused by rowchange. 
(Jerry's accesses are so slow anyway that ROM isn't that big of a bottleneck, and games using it basically just for audio would be fine, even if doing sample-based music+SFX, especially if streaming compressed samples; 2/4-bit ADPCM or even CVSD would be interesting, or 2-bit flavors of CVSD: 2, 3, and 4-bit ADPCM flavors had long been promoted by Covox as low-overhead compression formats for PCs, targeting the low-end systems using parallel port DACs, but applicable to pretty much anything else capable of PCM too: CVSD, especially 1-bit CVSD, is obviously better suited to speech compression than musical instruments; plus the DSP can do filtering and interpolation of lower sample rate stuff and minimize both ROM space and bus time needed to stream the samples ... and still probably sound a lot nicer than the SNES, quality of compositions aside of course) In any case, without ROM, the DSP now needs to read from main DRAM, which means page-breaks for TOM where there might otherwise just be some waits. Meanwhile, adding even a chunk of slow RAM (or even a small chunk of RAM) would offload that significantly. That aside, wouldn't handling RAM on cart be similar to using ROM of a similar width? (likewise you COULD directly texture map from ROM, but it would've been slow back then, usually 8 cycles, 10 for slow/cheap stuff iirc, plus it'd mean using uncompressed 16-bit textures rather than unpacking them into RAM) Now you also could've had carts that had RAM right on them, like several 7800 games did and some SNES and even MD games (well ... 
just Virtua Racing with the DRAM for the SVP chip, I think, ignoring SRAM for battery saves) and a couple 7800 games had even used 32kx8-bit SRAM chips back around 1987 (both Summer and Winter Games did that iirc, only using 16kB as that's what they needed for the memory map they used and because 2 8kB chips took up too much space to fit, and the cost of 32kB was cheaper than a modified cart PCB/case at the time, apparently) so 64kB of 16-bit SRAM/PSRAM slapped on cart wouldn't seem too unusual for 1994-96 ... or later. (had the Jag done well with carts). But you needed at least enough confidence in the platform and investment funds handy to actually manufacture carts like that. (making masked ROMs at all was a big problem, and a big reason some folks suggested the Jag should've been a CD system from the start ... not for performance, but for sheer economy of development and attracting more devs and publishers who'd otherwise be unwilling to risk the overhead of a ROM based system: that and Atari could do things like ship out free demo discs both pack-in with consoles and at promotions, and even jump onboard the wave of Shareware distribution at the time ... plus still be vastly cheaper than the 3DO, but that's yet another topic) But on the issue of bus sharing and interleaving, is there too much of a delay for granting the bus to TOM, the OPL, or Blitter to do any useful interleaving between slow, periodic accesses? Like the 8-cycle reads of the 68k (not that it even hits the bus for every memory cycle) or 6 cycles for the DSP. I believe you only need a 75-ish ns (2 cycles at 26.6 MHz) period for the actual read/write strobe, and while you couldn't interleave accesses in a single bank of DRAM at that speed (as there's 3 cycles for rowchange and another to switch read/write if needed), having accesses in different DRAM banks with different rows being accessed and held open (for page mode) would allow overlap of everything but the actual read/write strobes. 
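To put rough numbers on that: if only the ~2-cycle read/write strobe actually occupies the shared bus, the slack inside each slow master's access period is what's available for interleaved accesses to the other bank. A back-of-envelope sketch, using only the access periods quoted above (this is my own arithmetic on the post's figures, not a measured result):

```python
STROBE = 2  # cycles the data bus is actually driven per access (from above)

# Access periods quoted above: 8 system cycles per 68k bus cycle,
# 6 cycles per DSP load/store.
for master, period in [("68k access", 8), ("DSP load/store", 6)]:
    slack = period - STROBE
    print(f"{master}: {period} cycles total, {slack} cycles of slack "
          f"-> room for {slack // STROBE} interleaved strobes in the other bank")
```

So in principle a 68k access leaves room for three overlapped strobes and a DSP access for two, provided the other bank's row is already open in page mode, which is exactly the scenario described above.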
Now, a higher priority processor on the bus couldn't take open cycles from a lower one as it already has priority, so you need situations where the slow processors have priority, but leave enough time to grant one or more accesses to lower-priority processors/DMA-channels/etc (any bus master). The 68k is normally bottom priority, so it would be difficult to actually put it in a situation where TOM, the Blitter, or OPL could wait for holes in 68k accesses to work, but the DSP normally has fairly high priority and that could be exploited. Further, the 68k has higher priority when doing interrupts, so coding a game where the 68k is being used pretty much exclusively as an interrupt handler would make that arrangement viable. (as such you could potentially split up general processing duties between the DSP and 68k while not too horribly hogging the bus) From the Jaguar Reference Manual V8: The Jag II fixed that with double-buffered blitter registers instead. And I say fixed and not 'would have fixed' as I'm pretty sure that was functional on the preproduction Oberon (Tom II) chips used on the development systems in 1995. (Puck was not on those, just old Jerry, as crippled as ever, except 32-bits wide thanks to using the 68020 in place of the 68k ... something they might not have needed to retain for the production version if Puck's features worked correctly, allowing a cheap 68000 to be stuck on there for compatibility: indeed, better compatibility than an '020 would provide, plus a 68k would have been a reasonable fit on the slow 16-bit sample RAM bus Puck was to use, sort of like the 68EC000 in the Sega Saturn audio system) Playing devil's advocate here, I'd point out that Kskunk's skunkboard (and any modern homebrew ROM cart based games that got enough traction to be manufactured in masked ROM) could be run fast enough to allow texture mapping from ROM without waits, but more than that it could allow GPU code/data/DMA fetches from ROM at full speed as well. 
(using the high speed ROM cycle mode that was originally intended for testing only) The blitter and OPL doing 64-bit bus operations would still be faster in DRAM though, in cases where sequential access is possible. But beyond that, you could build an SRAM cart that could either be simple SRAM (only useful for loading from CD), made into a passthrough cart and only using part of the address space (allowing ROM as well, possibly bank-switched), or just a full 6 MB 70 ns SRAM cart used for CD homebrew. Or add an SD card interface (or CF, XD, etc: the latter would probably be easier given it's parallel, but SD is obviously the most popular and what most 'flash' carts use, regardless of whether they load into flash memory or SRAM on-cart: the latter has the advantages of speed and not wearing out from successive writes) And given the hardware hacking stuff folks do (overclocks included), I'd think wiring up the unused DRAM bank would also be an interesting possibility ... probably not as simple as the old piggyback RAM upgrade on the ST, but also not totally different. (and SOJ leads aren't too bad to work with ... touching the leads on TOM would be iffy OTOH) Oh, and Kskunk's experiments with texture mapping in internal SRAM showed that the logic for calculating scaling/rotation in the blitter limited pixels to 5 cycles at best, so even random read/write DRAM speed would be fast enough to do texture mapping at the peak rate. (that would include page-breaks in the second DRAM bank or a slower DRAM controller onboard a cartridge with no page-mode support, basically behaving like PSRAM at the 5-cycle cart I/O mode) https://forum.beyond3d.com/posts/1936444/ (Nammo is Kskunk) There's also stuff in that thread about rendering directly to the line buffers and potentially doing beam-racing style 3D at 50/60 Hz, but that's better for another thread. 
My current new favorite is actually: what if Atari had spun off Flare II's Jaguar II project to Sega in 1996 during all the second-guessing with the Saturn. (plus the unfinished Puck chip with RCPU and DSP could be displaced by some neat Hitachi Super H RISC chip with built-in UART and such ... or a PowerPC 602, convenient 64-bit bus there) But again not the topic here. More on topic, I'd say bundling a 512kB DRAM cart with the CD to boost performance/flexibility a bit might've made an impact; that, and they just had the bad luck of choosing to cut their losses early in 1996 before DRAM prices dropped like a rock. (they had the misfortune of test-marketing the Jaguar at about the same time as the big Sumitomo resin factory fire in Japan that caused RAM and other IC prices to jump up then stagnate, just like they did in 1988: the latter hurt the ST big time and crippled the 1040ST's potential of becoming the bottom-end standard, plus made the Amiga end up price matching the ST that year ... the 520STfm and A500 starting 1988 at $300 vs $500 and meeting at $400 mid-year) Granted, the reason Atari desperately needed good luck to survive was mostly related to Sam Tramiel's abysmal management. (Atari Corp was best under Jack and Mike Katz ... 1985-88) Oh, but I doubt anyone would bother with a DRAM cart for modern homebrew. SRAM is much easier to do and cheap enough not to bother with anything else, plus 2-cycle read/write times offer more flexibility for the fast parts of the system. (the DSP and 68k would be fine with 186 ns cycles ... the DSP can only do reads at 6-cycle intervals anyway, and the 68k takes 4, but 8 system clock cycles as it runs at half the speed)
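Circling back to the CVSD streaming idea above: the algorithm is simple enough that a toy codec fits in a few lines. This is a generic illustrative sketch only; the step sizes, adaptation ratio, and the three-equal-bits run rule are my own picks for demonstration, not any Covox or Jaguar format:

```python
def _adapt(step, hist, min_step=1.0, max_step=64.0, ratio=1.5):
    # Syllabic adaptation: three equal bits in a row means the estimate is
    # lagging the signal, so grow the step; otherwise let it decay.
    if len(hist) == 3 and len(set(hist)) == 1:
        return min(step * ratio, max_step)
    return max(step / ratio, min_step)

def cvsd_encode(samples):
    bits, acc, step, hist = [], 0.0, 1.0, []
    for s in samples:
        b = 1 if s > acc else 0          # 1 bit per sample: step up or down
        bits.append(b)
        hist = (hist + [b])[-3:]
        step = _adapt(step, hist)
        acc += step if b else -step      # mirror of the decoder's state update
    return bits

def cvsd_decode(bits):
    out, acc, step, hist = [], 0.0, 1.0, []
    for b in bits:
        hist = (hist + [b])[-3:]
        step = _adapt(step, hist)
        acc += step if b else -step
        out.append(acc)
    return out
```

Because encoder and decoder run the identical state update, `cvsd_decode(cvsd_encode(x))` tracks `x` after a short warm-up; a real DSP version would add the lowpass filtering and interpolation mentioned above, and a 2-bit flavor would spend the extra bit on step magnitude.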
  4. I haven't read the whole thread to see if Curt or Marty or some other historians on the site already corrected this, but Warner-Atari heavily invested in the Amiga chipset and had licensed it, planning it as a home computer, arcade machine, and game console (codenamed MICKY). They also had several in-house 16-bit designs (68000, x86, and maybe 16032/32016: I think Tramel Technology dabbled with the latter before switching to the 68k). Amiga ended up backing out of its contracts with all licensees, and at least in Atari Inc's case, illegally 'refunding' the investments made while claiming to have failed to produce working silicon. Meanwhile they'd signed an exclusive agreement with Commodore. The confusion going on in June/July of 1984 at Atari Inc, and Warner's horrifically managed liquidation of the company without notifying executives (especially Atari President James Morgan), led to that slipping through the cracks and some lower level management accepting Amiga Inc's refund check without reading over the contract properly. (it's that same sloppiness that led to Tramiel's poor reputation and the myth that he 'fired everyone' when taking over ... rather than the reality that Atari Inc was liquidated, the arcade business spun off, and the home/consumer business's assets sold off ... it's also that mess that led to some of the neat in-house designs, hardware and documentation along with engineers, walking off or becoming fragmented) It was also that breach of contract that leveraged Atari Corp's later settlement with CBM over the ST lawsuit. (the Amiga contract was brought in to counter-sue them) Incidentally, the Amiga contract allowed a game console/arcade machine to be released in 1984, a computer in 1985 with no more than 128kB of RAM, and unlimited hardware configurations from 1986 on. 
(had Tramiel gotten hold of that license, I imagine they'd have made do with 128k and perhaps shipped without GEM initially, just the text based portion of TOS, and also probably been forced to include RAM expansion via slots or DIP sockets and possibly even use an external cart slot for the OS ROMs ... or slave the cart slot to that purpose while abandoning any intent to use ROM cart based software: though using internal ROM sockets and intending service centers to install OS ROMs as they arrived may have been the more natural decision) The ST was originally intended to have a 128k (130ST) as the bottom-end model, of course, but that was abandoned as RAM prices fell and the OS became too large. (and a cut-down version without a DOS at all and just BASIC and tape drive interface routines became unappealing) On the note of the actual thread topic, though: why not add MARIA to the 8-bit chipset? This is something that came to mind while I was looking at the flaws and problems (and possible fixes) of what made the Panther problematic a few years later, but in any case: Replace FREDDIE and possibly the MMU with a new gate array ASIC, performing the memory mapping and DRAM controller duties, fast enough to allow Amiga-speed bus cycles (280 ns) and to service existing ANTIC and SALLY access times while only using 50% of the bus cycles, but rather than fiddling with ANTIC or SALLY timing at all (or trying to spin off 3.58 MHz 6502s or what not) use that added bandwidth in lieu of cart ROM access for MARIA graphics data, and use the new mapper/controller chip to interleave things seamlessly to avoid the need for CPU halts during MARIA DMA. (though you'd still need to wait for vblank to do MARIA register updates and list/pointer updates in SRAM) Bump MARIA SRAM up to 8kB (a single 8kB SRAM, cheaper, less board space, etc ... 32k would be nice, but not really needed given you're pulling most graphics data from DRAM). 
Probably map the normal 48k MARIA cart ROM space directly into A8 space and put MARIA registers and SRAM onto the 16k bank switched segment. Possibly add the ability to enable/disable either of the 8k cart ROM chunks to allow the full 64kB of DRAM to be used and those 8k banks flipped in as needed. (I forget if the player 3 and 4 GTIA trigger inputs were used already, but those might be handy for using an additional 2-bits of bank select control) You'd still need on-cart bank-switching logic to extend beyond 16k as well, but you'd make the most of RAM this way and avoid the issue of MARIA/A8 DMA conflicts in ROM. (cheap ROM being too slow to interleave in, at least when both ANTIC/SALLY and MARIA are trying to access it) You'd thus have a really nice system with MARIA graphics operating without holey DMA and genlocked over GTIA graphics (MARIA was designed with this in mind for the planned laserdisc expansion, so genlock with GTIA should be quite possible, particularly as all would be running off a synchronized clock and using common color/pixel clock times or integer multiples of those: ie if one was using 320 pixel and the other 160 pixel modes). MARIA allows for up to 4-bit pixels in its objects, which could potentially also allow a 12-color linear bitmap screen overlay on top of ANTIC+GTIA character or bitmap modes, or turn off the latter entirely for 100% CPU time in a 12-color bitmap. (or 13 colors given GTIA's background color should still be available) For typical late 80s console/arcade games, I imagine it'd be appealing to use 3 or 12 color MARIA sprites over a 5-color ANTIC character scroll layer with GTIA sprites used for a bit of added color. (so 12 color sprite layer + 9 color background) Doing proper genlock would also give nicer video output than the 7800's hacked solution of merging TIA and MARIA video lines. 
(a simple disconnector switch also solves that, of course) Further, this sort of machine would have been a much more potent game console to release for 1987 than the XEGS, while also better meriting the price points the XEGS was initially sold for (substantially more than the $99.99 65XE or $89.99 7800 and of course $49.99 for the 2600Jr). The deluxe package XEGS with light gun and keyboard originally retailed for $199.99, and I rather doubt the added MARIA+SRAM + gate array chip and 150 ns DRAM rather than old 200/250 ns stock would've pushed it even that high. (probably more like $150 in a basic set and $200 with keyboard and games and/or possibly other software) https://youtu.be/2N2BUTIpnDI?t=97 Plus you'd have a game machine with substantially greater advantages over the NES and Master System. (still some trade-offs like the lower resolution for most purposes, but a monster sprite engine for the time and some pretty nice colors all around ... and the flexibility to do some nice software rendered effects to a linear framebuffer and a ton of RAM for a console at the time, and chunky pixel graphics, so very well suited to storing compressed data on cart to save space) You'd also have a lot more CPU time to do complex POKEY modulation effects (or 4-bit PCM) or possibly make some use of the GTIA beeper channel. (though that would probably be more useful if you added GTIA beeper control to the new ASIC, maybe slaving it to some neat PWM sound ... possibly even useful for sample playback, but I'm mostly just thinking fixed-volume variable duty cycle pulse wave stuff ... though slaving it as a PWM DAC would certainly be interesting, I'm not sure what sort of resolution you'd get out of it: if you could toggle at 7.16 MHz, that'd allow 28 kHz 8-bit sample playback, which would be quite neat, especially if it was DMA loaded ... 
though a CPU-loaded FIFO would be pretty good, too) You could obviously have a 128 kB variant of that on the computer end of things, but a game console would probably be better to stick with 64k. (you could drop lower, but that would hamper the compatibility and selling point for promoting expanded A8 development in general as a computing platform on top of enhanced game machine, plus 64kx4-bit DRAMs were a very economical density at the time and using 2 or 4 16kx4-bit ones for a 16 or 32k system would seem a poor value by comparison) And, of course, such a game console would squarely sit in the Home Computer category as far as Nintendo's predatory licensing was concerned, and would soundly avoid the sort of problems the 7800 and Master System both suffered from. Edit: you could also use that faster DRAM timing to allow for a 3.58 MHz 6502, but I'm not sure existing (even new production 1987) NMOS SALLY chips would tolerate that well enough, and 65C02s were around, but then you had to deal with RDY rather than HALT among other things (short of making a CMOS SALLY). OTOH, using that 7.16 MHz bus/DRAM controller ASIC clock divided by 3, you'd get a more likely SALLY-tolerant 2.39 MHz, which would be a nice speed boost for some things, and still wouldn't change ANTIC timing. (just more wait states for SALLY when overlapping with MARIA DMA cycles) 3.58 MHz would obviously be nicer, though. (even more wait states for MARIA, but still a speed gain, and faster interrupt response) You'd need normal 1.79 MHz modes for full compatibility. (also standard XL/XE memory map modes, possibly disabling the cart-slot banking if that proved problematic) Wait: RDY in the 65C02 behaves like HALT on SALLY, doesn't it, since it's CMOS and static and thus needs no refresh? So you could use a 65C02 in there without problem, and use 3 or 4 MHz rated chips at 3.58 MHz. (unless there's any software using undocumented NMOS-specific opcodes or such, you shouldn't have compatibility issues ... 
plus you get the enhanced instructions, some more than others depending which 'C02 variant they used ... probably Rockwell though, given Atari Corp was using them a fair bit already for chip vending) That aside from other hypotheticals, like if Atari had taken Synertek's assets when Honeywell liquidated. (Synertek was in trouble with Superfund cleanup/lawsuit issues, so it would've been on favorable terms, though another risk/reward investment for Tramiel to make like he did with Atari Inc's assets; it was sold off in 1985, when Atari Corp was already pretty deep in investment debt) Synertek had already been manufacturing 65C02s prior to being shuttered, for what that's worth, along with second-sourcing a bunch of Atari's custom chips, so it would've been a solid fit all-around. (albeit slightly more so had the ST used more MOS chips for its 8-bit serial and I/O stuff rather than Motorola ones) And why use a gate array for the new ASIC? It'd be much faster for testing/prototyping than a full custom chip (especially without an in-house chip fab) and would be much lower risk to produce at low volumes, hedging their bets on a potential flop. (if it really took off, they could probably spin off a full custom or standard cell ASIC to not only replace it, but embed the DRAM controller+MMU+CPU+ANTIC+GTIA+MARIA+POKEY+PIA all on one dense CMOS ASIC with a single 8-bit I/O bus and 16-bit plus bank-selected address bus, making it a solid budget console/computer platform around 1989 into the early 90s and also making a nice platform to cross-develop Lynx games for) You could also switch to a single 128kx8-bit DRAM chip by 1990/91 and discontinue the 64k models entirely. It's worth noting that plenty of manufacturers stuck with gate arrays throughout platforms' lives in spite of high volume production, so that's always an option too. (and you didn't need the raw logic speed that standard cell and full custom CMOS parts were doing in the late 80s ... 
Standard Cell also might not have been very widely used yet) Sega used lots of Gate Array chips for various things in the arcade and home consoles. (and the custom graphics/interface chip of the Sega CD was simply called the Gate Array in most documentation/programming literature) Flare Technologies also used Gate Array chips for their Slipstream hardware (the Jaguar was Standard Cell, though), which makes plenty of sense given they'd come from Sinclair, who'd used some of the pioneering Gate Array (ULA) production for the ZX-81 and Speccy.
  5. Oh, and I forgot to mention, even without the Z80, you could leave in all the other Master System compatibility bits (I/O, sound, VDP, etc) and just stick the Z80 into the Power Base Converter. (most or all of the necessary I/O and memory addresses are accessible through the cart slot as is, so you might not even need to change that.) You could also have ditched the side expansion port in favor of a VRAM expansion port (there's another 64 kB of VRAM space unused by the VDP) and use fewer pins for that as well. (the dual 8-bit data ports plus multiplexed address lines and DRAM control signals) On that note, upgrading the PSG to allow it to run at lower clock rates (or just 1.79 MHz, half of normal) would make it much more useful for music, though adding Game Gear style stereo functionality would be nicer. The cart slot is already a much better expansion port than the side port (originally earmarked for a floppy drive before the CD-ROM was pressed into that role), but a cart-slot module based expansion would be far more flexible and efficient ... and you probably wouldn't need that redundant 68000. (it's faster, sure, but swap that for a DSP co-pro of some sort and you've got a generally more useful system, particularly for 3D) You could also just put the VRAM expansion lines on the cart slot, potentially on outboard keyed portions (7800/SNES/Jaguar style) to keep PCB costs down on standard carts. (actually, there's a TON of expansion pins that most games don't need and would've been cheaper/better off if segregated from the normally used ROM cart bits ... 
probably just 48-50 pins needed for most games, including a couple pairs of VCC and GND lines) If you added that second VRAM bank onboard the CD itself, it'd also open up interesting possibilities for other changes, like having the added graphics co-pro ASIC render straight into that VRAM bank, or at least have faster and more flexible DMA than the MD's native VDP (faster VRAM, maxing out DRAM/PSRAM bandwidth, CPU-synched interleaved DMA modes, among other possibilities). Or just include two extra VRAM banks that can be flipped like Sega CD word RAM or 32x framebuffers (or Saturn VDP-1 framebuffers). With 121 colors from 12-bit RGB from the start, the need for video expansion would be less too, but tweaking that a bit more and allowing one or both BG layers to be disabled to allow linear bitmap framebuffers instead (with an eye for software rendered effects, even without expansion hardware) would be interesting, plus you wouldn't need to monopolize both VRAM ports if you disabled both tilemap layers and used the serial bus for framebuffer scanning. (you could do two 15-color 4-bit planes or one 121-color 8-bit plane, or two half-res 8-bit planes, and potentially make use of unused color values for shadow/highlight translucency effects, though you could also just use one bit for per-pixel priority to allow objects to be drawn in front of or behind the sprite layer) Doing a linear bitmap is much simpler than a tilemap, and the system is already using packed pixel data. Short of that, you could also tweak something the VDP can already do: lowres direct color via mid-screen CRAM DMA updates. The problem with that is it halts the CPU for the entirety of active display, but allowing the tilemap layers to be disabled and DMA'ing from VRAM itself would allow for the same effect, direct 16-bit (unpacked 12-bit) color bitmap at up to 160x200. Plus sprites could potentially still be enabled if this was a feature rather than just an exploit. 
(practically speaking, you'd want to limit that to smaller render windows due to VRAM space limits ... right up until you added external VRAM like in the above CD unit suggestion) Note the real-world hack mode using this is limited to 9-bit RGB encoded as unpacked 12-bit (you have 3 nybbles per 16-bit word, just with the 4th bit ignored on all three: the VDP natively works in 12-bit RGB, remember, it just had CRAM and the color DACs truncated to 9-bits to save chip space). Oh and on that note, I believe the PC Engine was also designed with 12-bit color in mind and the expansion port actually allows for upgrading the RAMDAC, but they didn't use that feature on any of the CD expansion units. (you could've had 481 colors from 4096 12-bit RGB instead of 512 color 9-bit RGB) Oddly enough, the SuperGrafx also retains the 9-bit color limit, in spite of using dual VDPs. (the pixel bus on the expansion slot also provides other information, so an upgraded RAMDAC/mixing chip could potentially add things like translucency effects in hardware) The PC Engine is one console that was pretty close to ideal for its time, but the upgrades didn't push it nearly as far as it could've been ... and marketing was poor in the US and it failed to get a European release at all. (unfortunate given the tiny PC Engine form factor would've probably sold well as-is) They probably should've had at least 2 controller ports on the TG-16 variant, though, and offered 3+ button controllers sooner, then made 6-button ones standard, and should've either made the SuperGrafx an expansion unit, built into a second-gen CD-ROM base interface, or gone another direction with video expansion and added a framebuffer bitmap layer instead, with the VDC function probably built into the upgraded RAMDAC chip and piggybacking on existing CRAM entries for the 255 colors. (either software rendered or blitter accelerated ... 
probably blitter accelerated) The original 1988 CD-ROM unit could've been simplified and made generally more useful by omitting the ADPCM chip, using a unified block of 128 kB DRAM, and either adding simple 8 or 16-bit DMA sound, or just relying on software driven playback instead. (given how poor a lot of ADPCM sounded, and how poorly it buffered and streamed for cutscenes, even simple 4-bit or 5-bit LPCM would've been competitive at the same bitrates, but you can do software DPCM/ADPCM decoding pretty easily and also do software 8 or 10-bit PCM fairly easily with paired channels at offset volume levels, and software mixing is far more flexible than a fixed, single ADPCM channel: that was also a huge limitation of the X68000's sound system, a single 8-bit PCM channel would've been far more useful) In any case, no sound upgrade at all would've been fine for the first gen CD unit, and they could've added something fancier and more generally useful around 1991 as part of the Super CD upgrade. (an entire base interface unit replacement, say 512kB DRAM, the VDC/color upgrade, and perhaps one of NEC's embedded DACs coupled with 16-bit DMA stereo, allowing CPU or DSP driven software mixing as well as slaving the DSP as a 16-bit multiply-accumulate copro for assisting with 3D or scaling effects)
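On the channel-pairing idea for higher-resolution PCM mentioned above: the arithmetic is simple enough to sketch. This is my own illustration (function names are hypothetical), assuming an ideal 32:1 volume ratio between a "coarse" and a "fine" 5-bit channel:

```python
def split_10bit(sample):
    """Split a 10-bit sample (0..1023) across two 5-bit channels:
    one at full volume (top bits), one at 1/32 volume (bottom bits)."""
    assert 0 <= sample <= 1023
    coarse = sample >> 5     # top 5 bits, full-volume channel
    fine = sample & 0x1F     # bottom 5 bits, channel at 1/32 volume
    return coarse, fine

def dac_sum(coarse, fine):
    """What the summed analog output represents, in 10-bit steps,
    assuming the volume ratio between the channels is exactly 32:1."""
    return coarse * 32 + fine

# Round-trip check: the channel pair reconstructs the full 10-bit value
for s in (0, 31, 32, 511, 1023):
    assert dac_sum(*split_10bit(s)) == s
```

In practice the DAC/volume steps wouldn't be perfectly linear, so the effective resolution would land somewhere short of a true 10 bits, but the principle holds.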
  6. Incidentally, the MD's VDP was designed to support 128 colors (or 121 colors: 8x 15-color palettes + 1 BG color) from 12-bit RGB (4096 colors) and had external expansion pins for that, but they were left unconnected on the MD itself and used later for the Arcade System C. (which also ditched the Z80 in favor of an 8.9 MHz 68000 and PCM chip) Had Sega wanted to use that full capability in 1988, they'd have omitted the CRAM and DACs entirely from the VDP and used an external RAMDAC chip (as the PC Engine did) and probably could've made up the cost difference by removing the Z80+RAM and had the 68k handle the sound drivers alone. (just add a simple 8-bit DMA sound channel and you're good for sample playback and software mixing too ... interrupt-driven PCM is horrible on a 68k and cycle-timed loops aren't practical for most purposes either, so DMA is the way to go: on a 650x based platform like the PC Engine, interrupt based PCM was viable and a 7 kHz driver would tend to eat 5% of CPU time for tight code: Malducci's driver manages such; plus you can do channel-pairing tricks to get 10-bit resolution and double-buffer sample chunks into wave RAM to get better than 7 kHz without added hardware, though you'd need to sacrifice 4 channels to do 10-bit mono that way, and using 4/5-bit PCM, even for some sample based music would be pretty useful and doable with just 2 paired channels at up to 32x7 kHz ... so also tons of potential for interleaved/multiplexed mixing, but I digress) There was also Ricoh's 8-channel PCM chip Sega later used in the CD, but was already using in 1989 in the arcade on the System 18, but that's unnecessary added cost and overkill compared to the potential of software mixing with DMA sound. (OTOH it was MUCH cheaper than the Sony SPC700 module of the SNES ... and manufactured by Nintendo's prime chip vendor Ricoh ... and would've been an interesting choice to see tweaked as an embedded CPU+Sound Chip on the SNES end ... 
with a much faster 65816 and faster RAM rather than wasting money on the Sony module and cheaping out on RAM with DRAM and a slow DRAM controller: compared to NEC, who managed with 70 ns DRAM and a fast controller to allow for full-speed 7.16 MHz 650x operation in 1988 with their CD-ROM system ... 2.68 MHz is SAD in the SNES; throwing in 256 bytes of RAM for one-chip zero page would also be nice and help somewhat for poor compilers for those attempting to use C on the SNES) The PC world also had the issue of VGA compatibility, and ATI took the route of an 8514 clone, but used a separate VGA core + RAM to provide compatibility there and nothing fancy like genlock to allow overlay of the two screens. Plus, you had 4-bit color modes using bitplanes and 8-bit chunky modes (not to mention the odd organization of nonlinear unchained VGA 8bpp mode: not planar, just interleaved in the 4 64k banks of VGA space ... probably due to the way they got the necessary bandwidth while focusing on linear pixel space in 4-bit mode rather than say, linear 32-bit aligned addresses in chunky mode). OTOH, ATI probably could've made a low cost fast VGA card that simply had some nice added features while focusing on basic VGA compatibility. Remapping RAM to 32-bits wide would be relatively straightforward for a much more friendly/fast (especially for 32-bit CPUs and VESA) linear 32-bit word organized 8-bit packed pixel framebuffer, and also support DMA from main RAM, allowing fast updates of partial or entire screens. (entire ones for double-buffered full-frame rendering, partial ones for looping single-buffered scrolling type graphics, where DMA is mostly filling in portions of the off-screen bits being scrolled-in) Simple DMA block copy and fill functions would be good enough for basic acceleration rather than a full blitter and would cater to 8bpp modes and 512kB DRAM. 
(which become appealing as soon as you adopt high enough bandwidth to do 640 pixel wide 8bpp modes and 640x480 256 colors ... while still being compatible with fixed-frequency VGA monitors; while 640x400 could still be double-buffered, so good for 3D games) You'd also want vblank interrupts to make for fast and simple page-flipping without tedious status register polling. (also very useful for color cycling effects via palette swaps, or 256 color video playback that re-loads the colors for each frame or on key frames: something you can't really do without double buffering or really fast DMA able to copy full frames in vblank ... so using Mode 13h would be out on ISA video cards, while double or triple buffered mode X would be possible via ISA cards ... or of course, a mapper-modified Mode X allowing 32-bit linear pixel organization, though obviously you'd need 2 port or DMA writes on 16-bit ISA for that) DMA functionality without any bit manipulation features would still be useful for 4-bitplane VGA modes, but less useful than something like the Atari STe blitter or Amiga Blitter. (hardware bitfield operations, bitmasking, bit-precise line fill and sprite drawing, etc) But with a CPU with a barrel shifter and fast bit manipulation instructions, you'd be OK software rendering and DMAing that way anyway. (the 68000 was not such a CPU, but a 386SX could handle such ... I forget where the 286 fits in there) So a fast enhanced VGA card that still lacked double-bandwidth modes (640 pixel 8bpp) could still be appealing with DMA copy and such, and offer relatively fast ISA bus performance. 
(and if it got popular enough, you'd probably have seen games exploiting the DMA function for primitive blitting or screen block/tile updates at 320x240 with square pixels and fast/efficient 32-bit word-packed 8-bit pixels rather than funky mode X, speeding up software blits to the back buffer in main RAM even if copying over ISA was a bottleneck) Gaining market acceptance would be key to getting software like games to support it, but a low cost, enhanced VGA card would seem much more likely to gain such than an 8514 clone. Hmm, perhaps even easier to gain acceptance would be a simple 2 or 4-page hack of Mode 13h, allowing a mapper/bank-switching scheme to let software treat each page like 13h, but have additional control register settings that allow page-flipping and thus more easily allow software to optionally support that with less modification of their rendering code. (just allow for 2 banks to be selected, one designated for the active screen and one designated as the back buffer currently being copied to: you could potentially have 3 back buffers and a quad-buffered arrangement for smoother average framerate, of course) So you get the speed and simplicity of mode 13h without the hassle of single-buffering and ugliness of screen tearing either without v-sync or over ISA where there's no time to copy 64kB in one vblank period. (If you dropped to 60 Hz for 320x200 with a border and more vblank time, you'd still only get 62kB at the absolute max over 8 MHz 16-bit ISA ... so with a fast CPU and tight polling of the vblank status register, you could avoid tearing if you had a border or status bar or such that didn't need to be re-copied every frame ... plus square pixels, which is nice, though the letterboxing isn't so nice)
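To put rough numbers on that vblank copy budget, here's a back-of-envelope calculation. The bus clock, wait-state count, and video timing figures here are assumptions for illustration, not measurements:

```python
# How much can you copy over 16-bit ISA during vertical blanking?
BUS_CLK = 8_000_000      # 8 MHz ISA bus clock (assumption)
CLKS_PER_XFER = 2        # 0-wait-state 16-bit transfer (assumption)
BYTES_PER_XFER = 2
peak_rate = BUS_CLK // CLKS_PER_XFER * BYTES_PER_XFER   # bytes/second

# 31.5 kHz / 60 Hz timing: 400 active (line-doubled 200) of 525 total lines
LINES_TOTAL, LINES_ACTIVE = 525, 400
frame_time = 1 / 60
vblank_time = frame_time * (LINES_TOTAL - LINES_ACTIVE) / LINES_TOTAL

budget_kb = peak_rate * vblank_time / 1024
print(round(budget_kb))   # ~31 kB: well short of Mode 13h's 64 kB frame
```

Under these assumptions the budget is only ~31 kB; the ~62 kB ceiling quoted above corresponds to one 16-bit transfer per bus clock (16 MB/s), which is the theoretical best case rather than what cards typically sustained. Either way, a full 64 kB frame copy in one vblank doesn't fit, which is the point about needing a border/status bar or page-flipping.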
  7. The closest things to a good, low-cost oriented framebuffer+blitter optimized console around in 1990 were the Lynx and Flare's Slipstream. The former wasn't fast or powerful enough to be directly used for a TV based console (not enough bandwidth and existing framebuffer size restrictions were too limited for a TV screen, plus it only did 16 colors/4-bit pixels) and a 4 MHz 65C02 was marginal, though probably no worse than the SNES's 2.68 MHz 65816. (much weaker than the 7.16 MHz 650x derivative in the TG16/PCE or 7.67 MHz 68k of the MD, or 7.16 MHz Amiga 68k, even with wait states in 5/6-bitplane modes and blitter bandwidth) The Slipstream, OTOH, relied on PSRAM to be fast enough for some interleaved DMA, so not as cheap as pure DRAM, but still cheaper than the multi-bus arrangements with VRAM, PSRAM, and/or SRAM on the PCE, SNES, and MD. Plus a 5.9 MHz 8086 isn't all that great of a CPU either, and it'd need address space expansion for cart based games, though was interesting as a floppy disk based console. It was mapped to support up to 256 kB of PSRAM and 512 kB DRAM, and there was a fair bit of interleaving it could do. It offered 256x200 (up to 256x256) in 256 colors from 4096 (12-bit RGB) with 8-bit chunky graphics and a blitter optimized for both sprite and line drawing (and block-copy for background drawing), plus a 16-bit DSP useful for sound synthesis and 3D math. (slaving it for simple PCM sound playback for Amiga MOD should've also left a lot of time for math coprocessing, while doing realtime synth would've eaten a lot more time). The x86 CPU and 256 color graphics plus potentially large chunk of RAM might have made it appealing for PC game ports of the period, and Lucasfilm toyed with licensing it in 1989 while Konix was gearing up to release the Multisystem (all of which fell through, of course). 128kB PSRAM plus 512kB DRAM would fit rather well with 640k real-mode PC games. 
(enough PSRAM to double-buffer into and have some left over for fast blitter/DSP access) So it was ready for 1989 mass production, and would've played into late-80s Atari's computer-game licensing model they'd aimed at with the 7800 and would continue (somewhat) with the Lynx. (plus 880k double density floppy disks were a much smaller risk than ROM carts for publishers, and the proprietary data/sector format would've been appealing for those worried about piracy ... I think they included some other security features too) Being framebuffer based also meant double-buffered rendering could drop the framerate to increase complexity (more parallax, sprites, etc) like with PC/ST/Amiga games, though dropping below 30/25 FPS would probably be unappealing for most games ... there were already a number of racing games and shooters that ran at 20 fps on the MD or SNES. (something like Galaxy Force would look a lot nicer on the Slipstream, probably) And 3D/Pseudo 3D stuff would be much nicer to work with, as would be realtime scaling effects. (rotation would be possible too, but a lot more math intensive than simple stretching/shrinking ... and combining realtime scaling with pre-rotated animation would tend to look much better ... something appealing for a hefty chunk of RAM and floppy disk storage, just as pre-shifted graphics were appealing on the ST but would be horrible on a ROM based console without 512kB of RAM to pre-shift things into) Atari's Panther didn't go with that design philosophy at all ... it was more of a supercharged 7800 and prelude to the Jaguar's object processor, but required 35 ns SRAM (basically high speed cache RAM) for the list and some high-speed sprites/objects, while intending to pull most from 32-bit (slow/cheap) ROM, hence only using 32k of RAM ... plus an Ensoniq (DOC/DOC-II or OTIS) PCM chip with its own private DRAM to work in, and a 16 MHz 68k that'd be getting tons of wait states like the 6502 in the 7800. 
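On the realtime stretching/shrinking mentioned above: nearest-neighbour scaling of the kind blitters and software renderers of the era did is basically fixed-point stepping through the source. A rough sketch of the idea (my own illustration, not how any particular chip implements it), using 8.8 fixed-point:

```python
def scale_row(src, out_width):
    """Nearest-neighbour horizontal scale of one row of pixels.
    Walks the source with an 8.8 fixed-point step, so stretching
    repeats source pixels and shrinking skips them."""
    step = (len(src) << 8) // out_width   # source pixels per output pixel, 8.8
    pos = 0
    out = []
    for _ in range(out_width):
        out.append(src[pos >> 8])         # integer part selects the source pixel
        pos += step
    return out

# Stretch 4 pixels to 8 (each repeated), shrink 4 to 2 (every other kept)
assert scale_row([1, 2, 3, 4], 8) == [1, 1, 2, 2, 3, 3, 4, 4]
assert scale_row([1, 2, 3, 4], 2) == [1, 3]
```

Per-output-pixel cost is just one add and one indexed fetch, which is why scaling is so much cheaper than rotation (which needs two such steppers plus per-scanline setup math).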
Plus they made the odd choice of only 32 CRAM entries, but in 18-bit RGB (260k colors), and while that meant 5-bit line buffers, it meant 18-bit CRAM entries and more chip space for the 18-bit video DACs (fast video DACs aren't 'cheap' in terms of chip space ... a big reason the Mega Drive stuck with 9-bit RGB). They should've easily managed 6-bit line buffers and 64 colors from 12-bit RGB, and it used an 8-bit linear offset for palette select using 1, 3, or 15 color objects (similar to the Jaguar) as well as unpacked 8-bit per pixel objects using the full 32 (or 64) colors. In any case, the Panther seems like a really bad set of compromises and not optimized for cost to performance. They probably could've salvaged it with a bit of added custom hardware, especially given they brought Martin Brennan onboard in 1989 to work on the Panther chip itself (and he was one of the 3 designers from Flare who'd done the Slipstream and would do the Jaguar), adding something like a bus controller chip to mediate between Panther and 68k accesses to SRAM and ROM, and possibly add a block of DRAM to work in (and be faster than ROM for sprites) while potentially cutting the cart bus down to 16-bits to save cost and potentially even using a cheaper 8 MHz 68k and a heavier emphasis on interleaved DMA from DRAM and ROM. (work within the existing Panther limits and work around them with a memory mapping and bus-flipping arrangement) Or they could've ditched the 68k in favor of something cheaper, like an embedded 650x (they were already using those in the Lynx, but an 8 MHz one with on-chip zero page would be really nice), could potentially be embedded into a custom chip, and use even cheaper 8-bit ROM and DMA everything into system RAM like the Lynx. 
(a Hitachi 6309 at 3.58 or 4 MHz would also be really appealing, though the latter would be an overclock) But given the Panther wasn't even ready for production at the time, and the 7800's sales were declining (along with the ST and 8-bit) in '89, especially compared to the 87/88 peak of the ST and 7800, an earlier release would be better, and the Slipstream chipset was ready-made, non-exclusively licensed, and had a bunch of software developers in the UK already working on it. Plus it was built on gate array logic rather than custom or standard cell masks, so was easier to start in smaller volumes at lower costs/risks. (though note, the custom system ASIC in the Sega CD was also a gate array part) The Slipstream wasn't great, but it was there and possible for an immediate 1989 launch or at least rushed test market. But getting into 100% hypothetical stuff that didn't exist at all at the time? Atari engineers could've looked at the Lynx, seen its design philosophy and either run with it themselves (or rather, commissioned a consulting group to handle it) or gone back to the ex-Atari, ex-Amiga engineers who'd designed the Lynx chipset to do a console. The same sort of unified memory, DRAM-optimized set-up would've worked great for a home console, but it would've needed to be 16-bits wide on the bus end at least and possibly using a 32 MHz DRAM controller ... though 16 MHz could probably make do. (though 32.22 MHz divided by 2 would be good for NTSC timing, 3.58x9 = 32.22) Sticking with that and the Panther's theme of a 16 MHz 68k with cycle-stolen DMA, but going with the Lynx's low-cost DRAM+blitter+framebuffer arrangement, doubling the internal and external width for the video DMA and blitter operations: so 16-bit FIFOs/latches on an 8-bit bus become 32-bits on a 16-bit bus with a slow DRAM (random) read followed by a page-mode read ... 
probably 4+2 cycles or 372 ns at 16.11 MHz in 120 ns DRAM, similar to the Lynx (100 ns DRAM and a 32.22 MHz controller could probably get that down to 7+3 cycles or 310 ns, but 120 ns Lynx speed would be a much more conservative/realistic goal). You'd get 10.74 MB/s with that, and using cycle-stealing DMA to do a 256x224 NES/SNES (or lower res MD) style screen at 60 Hz would use about 32% of the bus time, meaning the 68k would be closer to 10.95 MHz, or somewhat better due to internal register operations that avoid waits. This is a greater percentage of CPU time than the Lynx's CPU hits, but you're using more than 2x the bandwidth for a display like this, and an 11-ish MHz 68k would be plenty for the time. The Lynx's RLE compressed 1/2/4-bit texture format was also really nice, and extending that to a Panther/Jaguar style 8-bit offset in 256 colors (rather than 4-bit in 16 colors) would work really well, plus allowing direct 8bpp textures too. (maybe RLE, but potentially just uncompressed stuff, especially useful for treating portions of the framebuffer as objects for certain effects) 256 colors from 12-bit RGB would also be fine for the time, though 15/16-bit RGB would be nice. (you could also do software based translucency or shading effects via look-up tables, probably in ROM, especially if using 256x256x8-bit tables for translucent blending: 64 kB) Include the 16-bit multiplier unit and sprite scaling capability of the Lynx, and add a bit more to the sound hardware, say take the Lynx oscillators+DACs and allow at least one DMA channel to feed them for 8- or 16-bit PCM. 
(if you used word-interleaved LRLR stereo a la STe, you could use a single DMA channel for 8 or 16-bit stereo as well, and be pretty nice for software mixed sound while having 2 or 3 DACs free for chip-synth sounds) 256 kB of 16-bit wide 120 ns DRAM would've been a very good cost compromise for 1990 with a framebuffer based console, and have plenty of space to load code and data into, and decompress graphics and sound into from slow cart ROM. (though unlike the Lynx, you could also work in ROM directly, for cases where that's useful ... large RLE textures and look-up tables would come to mind) And while it's no DSP, a fast 16-bit multiply unit would work around one of the 68000's biggest bottlenecks for software rendered 3D. (incidentally something that the Mega Drive really missed out on, as something as cheap and simple as the NEC µPD77C25 used as Nintendo's DSP-1 at the launch of the SNES would've allowed for something probably exceeding Super FX class 3D on the MD at much lower cost ... albeit the same goes for the Atari Falcon, if they wanted a much cheaper sound/fixed-point Math co-pro than the 56k the Falcon got, and potentially STe/MegaSTE vintage ... though embedding a custom 650x+multiplier chip for sound coprocessing and some 3D acceleration would've probably been cheaper for Atari with a 650x license already in use and all) Oh, and of course, you avoid the issue of a relatively alien architecture as the Panther Object Processor presented (and Jaguar later did). The 7800 was the closest thing out there prior to it, and it was rather niche in itself (and never really broke into the European market, either, so not tapping into the wealth of smaller software houses there, especially Atari-ST friendly ones). 
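The bus-budget arithmetic a couple paragraphs back can be sanity-checked quickly; this just re-derives the figures from the text (a rough model of the hypothetical 16-bit/16.11 MHz setup, not any real chipset):

```python
CLK = 16.11e6            # hypothetical DRAM controller clock (NTSC-derived)
CYCLES = 4 + 2           # one random DRAM access + one page-mode access
BYTES = 4                # two 16-bit fetches per access pair

access_ns = CYCLES / CLK * 1e9       # ~372 ns per 32-bit fetch
bandwidth = BYTES * CLK / CYCLES     # ~10.74 MB/s peak

display = 256 * 224 * 60             # 8bpp 256x224 at 60 Hz, bytes/second
share = display / bandwidth          # ~0.32 of bus time eaten by video DMA
cpu_equiv = 16.11 * (1 - share)      # 68k left with the equivalent of ~10.95 MHz
```

These match the ~372 ns, 10.74 MB/s, ~32%, and ~10.95 MHz figures above; the real 68k would do a bit better still, since internal register operations don't touch the bus at all.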
Software rendering and hardware-accelerated blitter rendering were much better understood and also somewhat easier to use to simulate tile and sprite based graphics, but with the added flexibility of using framerates lower than the screen refresh rate without tearing (sprite drop-out) or flicker issues. Martin Brennan joining the Panther project in 1989 might have been an opportunity to kick some sense into things, but with all that interest in the Lynx (and it going to market in 1989) on TOP of a major stake in the computer market, it's really weird that the Panther existed at all in the form it did. (it's a novel design, and the sort that an industry leader might be able to pull off, but not something good for a second-tier player ... let alone one built around Jack Tramiel's no-nonsense, straight low-cost consumer market ethos ... and cutthroat negotiation for that matter: then again it was Sam in charge by '89 and Mike Katz had left the games division as well, so leadership was certainly lacking, but I thought it was Leonard and Gary who were more involved on the tech-management and logistics end ... marketing issues and negotiating with suppliers and vendors might have been Sam's fault, but it doesn't explain the odd engineering choices) Plus a blitter/framebuffer optimized design would be more useful as a component in a computer, even if just tacked on as an upgrade via the STe's Genlock feature. (ie rather than a unified bus with cycle stealing, attach the new video+sound subsystem more like a PC expansion card ... or the way most game consoles do subsystems on a dedicated bus connected via I/O ports and DMA interfaces) Standard STe graphics/sound for compatibility and enhanced features for 256 color chunky pixel modes, and possibly highres 4-bit packed pixel modes. 
(plus, with ST video disabled and 120 ns DRAM + a 16 MHz DRAM controller, you'd be able to use a 16 MHz 68000 without wait states, sans DMA for disk access and such, and no need for a cache like the MEGA STe used) Just use a 16.0 MHz clock for ST compatibility rather than NTSC timing. (you could do VGA style 640x480x4-bit and 320x480x8-bit 31 kHz video that way, though you'd need to use more page-mode bus saturation with linear reads and fewer DMA slots left for the blitter to work in ... and the CPU would be best only accessing in vblank, while the blitter and DMA sound could still use hblank DMA slots, plus ST style 31 kHz monitor res at 32 MHz pixel clock leaves a LOT of hblank time available, so that'd be handy here; while dropping to 24 MHz, closer to VGA standard 25 MHz, would cut into that and not be slow enough to allow any interleaved DMA cycles, so 32 MHz ST style would be handy, plus it'd allow 640x400 16-shade grayscale on old mono monitors) See also: https://phe.rockefeller.edu/LogletLab/DRAM/dram.htm Note the DRAM prices were falling sharply in 1990/91 where they'd jumped up in 1988 then stagnated (crippling the ST's low cost and high performance ... or potential for the 1040STf to become the basic standard in '88) and it was a good time to release a console with a decent chunk of RAM and have it drop in price in the following couple years. Atari OTOH had the very bad luck of going with 2 MB in 1993/94 with the Jaguar, at a time RAM prices again rose and then stagnated (due in part to a massive resin factory fire in Japan that crippled plastic IC packaging volumes and glob-top assembly) so it ended up staying relatively expensive and not being nearly as good a value as anticipated. It wasn't until mid-way through 1996 that prices dropped again, ie after the Jaguar was discontinued. 
(it wouldn't have been until the 1996 holiday season that a Jag+CD combo could've been a solid budget market competitor: ie undercutting the Playstation and liquidation-priced 3DO even at a sub-$200 level vs the $250 PSX/N64 of that period) Hell, they probably could've come out with an inexpensive 2 MB 16-bit DRAM cart for the Jag CD by 1997 due to the falling price of 2MB DRAM chips. (you'd need a DRAM controller onboard for that, and it'd be 5-cycle latency like the fastest cart ROM setting, but still pretty useful: there's also a 2-cycle high-speed test ROM setting, but that'd only be useful for fast SRAM additions ... ie for things like a 64kB blitter texture buffer) In any case, 1990 was a solid time to release a console, and one Atari had the misfortune of passing up. (those DRAM prices also would've made the 128k+512k PSRAM/DRAM floppy disk Slipstream console a good investment, though that's partially hindsight, and pure luck + foresight of good market prediction in 1990 ... though they could've launched with 256kB and quickly offered an inexpensive 256k upgrade card as the market trend became definitive in 1991)
  8. One big consideration for RAM expansion on the Jaguar wouldn't be sheer added storage capacity, but speed improvements. Even a relatively small chunk of 75 ns SRAM or PSRAM (like 64 or 128 kB) 16-bits wide on a RAM cart for the CD (or hypothetically slapped as an add-on for a ROM based game) could dramatically speed up texture mapping and potentially make the 68000 a bit more useful as well. The blitter can only render one pixel at a time in DRAM, and does so slowly, using 5 cycles per read and write (10 cycles per pixel). You can speed this up by loading small textures into GPU RAM, but that seriously hampers the GPU itself (contention for the GPU scratchpad will kill GPU performance); having textures stored outside of the 2MB 64-bit wide DRAM bank instead would allow writes to DRAM in page-mode (2-cycles) and fast 2-cycle memory accesses to SRAM or PSRAM for 4 cycles per texel rendered (possibly 2-cycles per texel with SRAM; I forget if the blitter can interleave read and write cycles in separate memory banks ... it'd be 4-cycles with PSRAM in any case, due to the slower random read/write cycle times than SRAM). Worst-case would be on page-breaks in DRAM, where you'd end up with 5 cycles per texel, but that's still 2x as fast as normal Jaguar texture mapping. I also forget if the cart bus has a locked minimum cycle time, but that might also be 4-cycles. (these are all 26.6 MHz cycles, mind you) Use of the 68k would be aided when working in 16-bit P/SRAM by not contending for DRAM and allowing interleaved DRAM accesses within 68k cycles ST/Amiga style, sort of. (the difference being separate memory banks and use of page-mode DRAM access) You could nest up to 3 page-mode DRAM accesses into a single 68000 bus cycle and have it working in the little P/SRAM bank without waits. 
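The cycle counts above work out as follows (figures taken straight from the text; a rough comparison, not a cycle-accurate model of the blitter):

```python
CLK_MHZ = 26.6           # Jaguar system clock

# Cycles per texel under the different memory arrangements discussed
dram_only = 5 + 5        # 5-cycle texture read + 5-cycle write, same DRAM bank
sram_tex = 2 + 2         # 2-cycle SRAM texel read + 2-cycle page-mode DRAM write
page_break = 5           # worst case when the DRAM write hits a page break

rate_base = CLK_MHZ / dram_only    # ~2.66 Mtexels/s in stock DRAM-only mapping
rate_fast = CLK_MHZ / sram_tex     # ~6.65 Mtexels/s with textures in SRAM/PSRAM

speedup_best = dram_only / sram_tex      # 2.5x
speedup_worst = dram_only / page_break   # still 2x on page breaks
```

So even the worst case (page break on every write) doubles texture throughput, which is the argument for a modest 16-bit SRAM/PSRAM bank being such a disproportionate win.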
The DSP also has a 16-bit bus and is somewhat slow/broken due to bugs, so it's a bus hog in main DRAM too, but could be a fair bit more useful doing some parallel processing in added P/SRAM. (in typical games it's best just working from ROM for the same reason, loading sound data and such) You also don't need anything wider than 16-bits for texture mapping (straight, uncompressed 16bpp textures) and likewise nothing wider than that for helping with the 68k, or DSP for that matter (also only 16-bit access). Bundling a RAM cart like that with the Jaguar CD would've made it a much more substantial upgrade back in '95, but it'd still be neat to see for homebrew stuff today. (more likely to see SRAM + ROM arrangements, though) They could've also built it into the CD unit, but given all the delays it saw and the added (if simple) logic required to disable the RAM when a cart was inserted, lumping it into a cartridge made much more sense. They could've put SRAM or PSRAM on normal ROM game carts back in the day too, but it didn't make sense with the sort of sales volumes they were dealing with, and the most demanding games would've been better on CD anyway. (stuff like Quake and Tomb Raider would've seriously been aided by something like that and would've been hard-pressed to fit on carts anyway, especially at sane price points ... added RAM aside) It's also worth noting that the Jaguar itself supported 2 DRAM banks, but only one was populated. (so two pages could be kept open at once and some interleaved accesses without nearly as much of a hit to the bus) This isn't relevant for a cart-based expansion and they didn't include a dedicated RAM expansion slot, either, but it's an interesting missed opportunity to consider. (putting the DRAM control and address pins on an expansion port ... 
even limited to 16-bits wide would've been significant and much more cost effective than an SRAM expansion) Though given the added pin count and traces needed for a dedicated expansion port, it would've made far more sense to just add the DRAM control lines to the cart slot and allowed for up to 32-bit DRAM to be added on that end mapped to the empty 4MB DRAM bank address space. (they'd planned the Jag CD before the base system was test-marketed, and the Jag cart slot has a pretty hefty pin-count, so that would've made a great deal of sense) *The Jaguar's memory map includes 2 banks of 4MB each for DRAM and a single 6MB bank for cartridges (ROM or P/SRAM) with the remainder of address space dedicated to internal registers, I/O, and such. They also could've used a different base DRAM configuration and populated both banks, but that either would've added to cost (like 2.5 MB with just a 5th 512kB 16-bit DRAM chip added) or required a mix of different DRAM chips to be used with less total RAM. (say 4x 128kB chips for a 64-bit bank and 2 512kB 16-bit chips for a 1MB 32-bit bank ... less RAM and less cost-effective smaller DRAMs for the 64-bit chunk, more board space taken up, but potentially slightly lower initial price point and substantially better use of texture mapping as well as more flexible use of the 68k and DSP ... you could also drop down to 1 MB total system RAM with just a single 512kB chip 16-bits wide, which would be fine for a lot of early/mid 90s cart games, arcade games and SOME PC/Amiga ports, but you'd definitely want RAM expansion on a CD based console in that case; the lower initial price point might have been worthwhile) Oh, and Doom might have still been easier to code on a 1 MB (0.5 MB 64-bit + 0.5 MB 16-bit) Jaguar in spite of the lower RAM quantity due to better use of 68k and DSP, as well as ability to keep textures in the 16-bit bank for faster access. 
(and given how much profanity Carmack left in his comments in Jag Doom's source code, you can bet the bugs /and/ broken DSP bus controller were high on his list of complaints) That's also still way more RAM than the 32x port had to work with slightly later (adapted from Jag Doom's source code, no less), and that port also had to make do with less ROM (3 vs 4 MB). Jag Doom probably would've had some nice music in such a situation as well. (though 32x Doom also has a 7.67 MHz 68000 with its own private bus and 64 kB of PSRAM to work in totally in parallel, though I think 32x Doom just slaves the 68k to I/O and sound and leaves the brunt of things to the 32x side)

In case you're interested:
https://console5.com/wiki/Genesis_2
https://console5.com/wiki/SRAM_512Kb:_32K_x_16-bit

Sega was using 64kB 16-bit wide PSRAM chips by Sanyo in later model Mega Drive 2s. (and other varieties of 32kx16-bit PSRAMs on all but the really early revisions: the VA0 model 2 had a pair of 32kx8-bit PSRAMs instead) The Sanyo LC331632M was offered in 70 ns speeds as well. (the MD mostly used 120 ns)

And also a neat resource for DRAM pricing up through the 1990s: https://phe.rockefeller.edu/LogletLab/DRAM/dram.htm
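The bank-configuration arithmetic discussed above can be sanity-checked with a quick sketch. The `bank()` helper is purely hypothetical; the chip counts, sizes, and widths are the ones named in the post, not figures from official Jaguar documentation:

```python
# Hypothetical helper: total width and capacity of a DRAM bank built
# from identical chips of a given capacity (kB) and data width (bits).
def bank(chips, chip_kb, chip_bits):
    return {"width_bits": chips * chip_bits, "total_kb": chips * chip_kb}

# As shipped: one populated bank of 4x 512 kB x16 DRAMs = 64-bit wide, 2 MB.
print(bank(4, 512, 16))   # {'width_bits': 64, 'total_kb': 2048}

# Discussed alternative: 4x 128 kB x16 for a 0.5 MB 64-bit bank,
# plus 2x 512 kB x16 for a 1 MB 32-bit second bank (2.5 MB... no, 1.5 MB
# total here; the 2.5 MB option instead adds a 5th 512 kB chip to the stock bank).
print(bank(4, 128, 16))   # {'width_bits': 64, 'total_kb': 512}
print(bank(2, 512, 16))   # {'width_bits': 32, 'total_kb': 1024}
```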
  9. I'd love to rehash some of the comparisons and Amiga VS ST stuff ... or business decisions, or merits of the STe or Falcon going on in this thread, but don't have the time to read through it right now. Instead, how about something on point with the original topic? Hopefully no one's posted this video already: Nice selection of MOD players on the STe (and one on the ST, plus native Amiga playback for reference). One of those uses the 50 kHz mode with multiplex mixing, which I believe is sample-interleaving (ie a 50 kHz channel effectively becomes two 25 kHz channels by interleaving samples on a byte basis when mixing rather than adding them together). The advantage there being slightly lower CPU overhead, but more importantly the ability to keep the full 8-bit resolution intact and not deal with clipping or quality loss. (the latter would certainly occur if you mixed a bunch of 8-bit samples to 16-bit resolution and output the upper 8 bits to the sound hardware ... some PC sound drivers for 8-bit Sound Blaster cards do this) Incidentally, the multiplex-mixed MOD player seems closest to the Amiga player, to my ears at least. (though the comparisons for all of the 25 kHz and 50 kHz players are pretty close ... the 12.5 kHz one is obviously another story, as is the ST player using the YM2149 for PCM)

I forget if the STe has 2 DMA sound channels or not. Wikipedia mentions stereo mode being handled with byte interleaving LRLRLR style, which implies there's only one DMA channel actually being used. Regardless of that, I was also unsure if the 50 kHz mode functioned as distinct left and right stereo channels or not. (ie given the LRLR interleave, does stereo have half the max sample rate of mono, or does the stereo mode allow for double the DMA bandwidth of mono? ... 
I assumed it works as one left 50 kHz 8-bit channel and one right 50 kHz 8-bit channel, so the latter case of 2x the bandwidth) It might be similar to the PWM audio in the Sega 32x insofar as having a 'mono' mode that simply writes the same data to left and right buffers, while stereo just allows either to be written to individually. (with stereo and mono modes having the same resolution and frequency limits) Chilly Willy would know much more about the 32x example if he's around.

There's also various 8-channel mods among other things on both the Amiga and STe. (Turrican II's intro on the Amiga uses software-mixed music, going well beyond Paula's native limitations; it's rather impressive they managed to make the ST version sound like such a reasonable approximation given the limitations there ... I believe that's a normal 4-channel MOD, but some other tricks might be thrown in) I'm not sure if Turrican II uses the 'effects' mode of the Amiga or just software mixes in the normal 4-channel mode. Effects mode (only one left and one right channel, but with 6-bit logarithmic volume control) on the Amiga would be much more in line with the STe's capabilities and best suited for software mixing with more demand for stereo panning effects. The STe would probably have a notable advantage here, between the higher sample rate and the LMC 1992 mixing/effects chip to work with. (ie not as good for games, but more useful for professional audio applications) The 8-bit sound limit is still a pretty big hurdle for the professional audio end of things, though, and complicates good quality software mixing (you need good optimization to avoid clipping while also avoiding overly quiet samples that end up effectively much less than 8-bit resolution). 
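The difference between additive software mixing and the multiplex (sample-interleave) mixing described above can be sketched like this. The sample data is hypothetical and this is idealized arithmetic, not actual STe driver code:

```python
# Two hypothetical channels of signed 8-bit samples (-128..127).

def additive_mix(ch_a, ch_b):
    """Sum two channels into one stream at the original rate and clamp.
    Loud passages clip (or you pre-attenuate and lose effective resolution)."""
    return [max(-128, min(127, a + b)) for a, b in zip(ch_a, ch_b)]

def multiplex_mix(ch_a, ch_b):
    """Interleave samples byte-by-byte into one stream at twice the channel
    rate: e.g. two 25 kHz channels become one 50 kHz stream. Every sample
    keeps its full 8-bit resolution, so clipping can't occur."""
    out = []
    for a, b in zip(ch_a, ch_b):
        out.append(a)
        out.append(b)
    return out

ch_a = [100, 90, 80]
ch_b = [60, -20, 70]
print(additive_mix(ch_a, ch_b))   # [127, 70, 127] -- first and last samples clip
print(multiplex_mix(ch_a, ch_b))  # [100, 60, 90, -20, 80, 70]
```

The trade-off matches the post: multiplexing halves each channel's effective sample rate (25 kHz per voice out of a 50 kHz stream) in exchange for full 8-bit resolution and no clipping handling.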
You could potentially wire the Amiga to be 16-bit mono by merging the left and right channels while in effects mode and setting the volume of one to be 256 times as loud as the other, adding together to 16-bit linear PCM (rather than the 14-bit nonlinear hack), and it could have done this out of the box if Paula simply allowed for a mono switch in software. The STe, OTOH, should actually allow this to be done given the LMC 1992's functionality allowing left and right sound channels to be panned left, center, and right, or effectively leaving both as mono. (I'm not sure of the volume granularity on the LMC 1992, but if it allows any combination of settings such that one channel is approximately 256 times as loud as the other, you could get 16-bit mono out of it)

On a side note, the STe audio is also somewhat comparable to the Sound Blaster Pro, or somewhat more capable given the 44 kHz 8-bit mono and 22 kHz 8-bit stereo limitations of that. (it also beat that to market by 2 years) The SB Pro also had a whole mess of FM synth channels with its dual OPL2s, of course, and the somewhat more feature-rich OPL3 in the SB Pro 2.0 (and SB-16) later on, but those got fairly mediocre use, almost never in conjunction with PCM, and are mostly relevant for games rather than professional audio/music applications. (I'd argue the ST would have gotten much more use out of such FM synth chips had it ever had them ... due to the UK/Euro chiptune scene: you saw plenty of that on the Mega Drive; but that's another story as well) The sound hardware of the STe may not have been expressly designed for professional applications over games or home entertainment, but it certainly seems to be more useful for such and shines best next to contemporaries when used as such. (shame it wasn't added earlier, though ... like with the 1040 ST or MEGA ST, even without the LMC 1992)
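The two-channel 16-bit trick described above is just positional arithmetic: if the "high" channel's output is weighted 256:1 over the "low" channel's, their analog sum represents `hi*256 + lo`, an exact 16-bit linear value. A minimal sketch with idealized unsigned math (not actual Paula or LMC 1992 register code, and assuming the hardware could really hit an exact 256:1 weighting):

```python
# Hypothetical illustration of building 16-bit output from two 8-bit DACs.

def split_16bit(sample):
    """Split a 16-bit sample (0..65535) into the two 8-bit channel values."""
    return sample >> 8, sample & 0xFF

def combined_output(hi, lo):
    """What the analog sum of the two channels represents, with the 'hi'
    channel weighted 256 times louder than the 'lo' channel."""
    return hi * 256 + lo

sample = 0xBEEF
hi, lo = split_16bit(sample)
assert combined_output(hi, lo) == sample  # lossless 16-bit reconstruction
```

Any deviation from the exact 256:1 ratio turns this into an approximation, which is essentially why the real-world Amiga technique settles for ~14-bit results with Paula's 6-bit volume control.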
  10. I'm just known for very long, often (at least partially) rambling posts. Don't worry, I'm not one of the more ... uh ... temperamental members of the community. (I'd like to think the opposite, really, I got along pretty well with Gorf back when he was still hanging around, I learned a lot from him, Atariowl, CrazyAce, and Kskunk -Kskunk probably most heavily on the actual electrical engineering and hardware design end of things)

Honestly, if Atari had been in a sound financial situation to launch the Jaguar in 1993/94, then yes, dropping a bit more and eating the added cost would be fine, but then waiting for a mass release in 1994 and avoiding the pre-release PR stunt in '93 would have been possible too. Throwing in an added 512 kB DRAM would be better either way though, especially if the 'addition' came down to more RAM or a more potent CPU (68EC020, Cyrix/IBM 386/486+cache, or 386DX -no cache but at least bumping JERRY onto a 32-bit wide bus). Both would be nice, but there would still be serious real-world cost constraints. (for 1994, a 25 or 26.6 MHz Cyrix 486DLC + additional 1 MB 32-bit bank of DRAM would be really nice and probably not going crazy with cost, but probably pushing it a bit -it MIGHT have been cheaper than using a 68EC020-25, though) The 68k+2.5 MB arrangement is probably the safer bet though, ideally with another 6 months or so to clean up bugs and spin off revised TOM and JERRY parts. (plus cut a little off JERRY's cost with a 128-pin 16-bit bus part rather than the 144 pin package) x86 would be really nice for source-ports of assembly language PC games of the time, maybe speeding up DOOM's development too. (I'm tempted to suggest a 386DX again for that JERRY bus boost too ... or a Cyrix or IBM 486SLC) X-Wing and Tie Fighter along with LucasArts' adventure games would be really nice on a Jaguar CD. 
(Tie Fighter's smooth shaded lighting engine would be perfect for the Jaguar's 3D) And yeah, the added 512 kB DRAM is more for blitter benefit than CPU, and you don't NEED a 1 MB 32-bit bank to make good use of a 32-bit wide CPU+JERRY interface. (just use the 2 MB block for that, leave the 512 kB 16-bit block just for blitter textures) I'm tempted not to suggest the bottom-of-the-barrel 386SX given the more limited advantages over the 68k. (a 25 MHz 386SX might have had some merit though ... but the added cache on the Cyrix and IBM parts just makes those so much more useful -Cyrix had more second sources and had their SLC on the market much earlier in volume than the faster, larger-cache IBM counterparts, so probably the more realistic option -they were on the market by the time the preproduction Jaguar dev systems hit in '92, so a fairly real consideration, especially if Atari could get a deal for down-binned parts otherwise not very marketable) I also just kind of like Cyrix in general, a neat company at an interesting time.

Oh, right, and they also probably could've kept the 26.6 MHz timing AND dropped DRAM cycle time to 4 clocks if they'd bumped up to 70 ns DRAMs rather than 80 ns. (a better option than dropping back to 25 MHz unless you're also using 25 MHz rated CPUs -and don't want to overclock) I was thinking of 1 MB in terms of what the Sega CD had to work with (512 kB program RAM + 256 kB of word RAM -sometimes used as a render buffer for the graphics ASIC, depending on the mode used), and thinking in terms of PC games typically using 640k or 1 MB up to 1993 .... but 4 MB minimum got pretty common right after that, and 2/2.5 MB would make ports way, way easier. And yeah, Slipstream was more a suggestion of having SOMETHING to field, but the Panther might've been a better option there (just not with 32kB) ... 
I suppose if they got a really, REALLY good deal on the Ensoniq DOCII, used 128 kB SRAM + 64 or 128 kB DRAM for the soundchip, it'd be usable as-is (no redesigns to the custom chip) per the 1989 Panther prototype. (128 kB + 128 kB was probably the smarter move given how cheap 128k DRAMs were at the time ... ) They were using 35 ns SRAMs on the prototype for SOME reason, though. (and lots and lots of 100 ns SRAMs for the ROM emulation bank ... 2 MB of SRAM ... 16 128kx8-bit chips)

And yeah, the 1989 Slipstream might have managed a better Star Fox (or Starglider III -or Return of Starglider, as was the working title). Some of the developers were pretty scathing in interviews on it, but that was the 8088 prototype with something like 1/3 or less real-world CPU performance (way worse bus contention), plus a MOD player would take way less DSP time than FM synth (and sound nicer), so less contention for sound vs 3D there too. (that said, a 16 MHz 68000 might manage a better-than-SNES Star Fox too, and with 128k you'd have room for a framebuffer-sized object in the Panther -and the majority of CPU time available due to the DMA-light nature of framebuffers vs many sprites ... using hardware scaled sprites for a fair amount of stuff would probably be wise, too though) I've been rather pessimistic over the cost of the Ensoniq chip, but it'd be a pretty sweet set-up if affordable. Slipstream is nice if you like the idea of a floppy disk based console, though. (probably good for Wolfenstein style games too ... then again, so would the Panther ... 32 colors vs 256 colors though, or more likely 16 colors dithered to simulate 256 colors -or ... 136 colors, 160x200 effective screen size) Oh, and Wing Commander 1 and 2. (3 would be a good fit for the Jaguar CD) The Slipstream was also ready for mass production in an at least usable form in 1989; the same might not be true for Panther. (and if the work to 'fix' Panther was more than say ... 
tweaking the Slipstream to take a 12 MHz 286 or 68000, it might not be worth the difference) Oh, and floppy-wise, probably not worth worrying about backwards compatibility on the console front, so Jaguar could potentially be the same regardless of anything preceding it. (including media -using a simple SRAM+battery backup save system on the Jaguar would probably be fine too ... preferably in memory card format, and a basic slow 32kx8-bit SRAM would be fine for that -the Saturn managed to fit lots more save blocks in that than a 128k flash card on the PSX, and save much faster ... shame it was integrated and not in a card format -and the 512 kB save-cart was way too expensive)

A huge chunk of this is off topic, yes, but I think it's important given the Jaguar's success was crippled by Atari's financial situation in 1993. Even a mediocre (but profitable) market success in 1989-1993 would have been a world of difference for them. (7800 level success should have been more than enough ... probably enough to keep the ST/Falcon and Lynx afloat too -both had plenty of potential left, Atari was just too hamstrung to manage all of that at once ... chicken-egg need-money-to-make-money sort of scenario -ie a bit more buffer to get through the roughest spots and they could have pulled out of the downward financial and management spiral, and the Lynx and computers could have remained assets along with the Jaguar ... and Slipstream/Panther in the budget-console market) They totally could have ridden on cost/size reduced Lynx variants (with improved screens and battery life) into the late 1990s. (a lower cost, lower-power non-backlit screen model would be wise too, but color LCDs weren't good enough for that until about 1996 I think ... 
had they gone grayscale back in 1989 with color optional, it would have worked, but doing that after the fact isn't practical -developers don't include features for such so games look wrong in black and white -16 shades of gray would be pushing it in 1989 too, but better than the GB's 4 shades would probably be possible ... maybe 8 or 16 with the condition that anything NEEDING good visibility would use higher contrast and anything subtle might or might not actually be visible)
  11. That seems unlikely to be the main reason. The YM2151 would've just been too expensive, and IF Atari wanted to use SOME FM sound chip in place of the YM2149, the YM2203 would make perfect sense as it offers identical I/O functionality and backwards compatibility while also being cheaper than the YM2151. (or at least should be cheaper and a less high-end part than the 2151; the OPL would be the other cheaper option, but I doubt cheaper than the 2203 and less useful given the lack of I/O and use of 2-op rather than 4-op FM -yes, 9 rather than 3 channels, but not nearly as nice, plus the 2203 includes the YM square wave channels too) The YM2149 was better in both being cheap and having General Instrument manufacturing compatible chips (so Atari could pick whichever vendor was cheaper at the time and had a second source if shipments were delayed).

Given how high-end the YM2151 was in 1985, Atari was probably better off looking into the likes of Ensoniq or such for competitive pricing. Anything in that range would be more an add-on, but potentially a cartridge type arrangement rather than a more costly MIDI box. (the existing ST cart slot has enough address space to support the Ensoniq DOC used in the Apple IIgs, but an external audio mixing line would be needed -adding audio in/out lines on the cart slot would've been nice, especially with the YM's 3-channel output for potential stereo effects through an external device) Still, simple DMA sound would be the cheapest enhancement and probably something omitted more due to time to market than any material cost. (though not including it in the 1040ST or 520STf in 1986, or at least the MEGA models in 1987, was more a mistake, same for omitting scroll registers in the SHIFTER -a bigger problem than lack of blitter, and the main two things that even lazy Amiga games tended to show off) Scrolling's also useful for business applications, especially in a single-tasking environment with full-screen applications. 
(scroll most of the page and just re-draw the static border and mouse cursor -nice for paint programs too) I suggested the same thing in a thread a few years ago, that along with dual POKEYs in the ST and similar things (including omitting both ACIAs and the Hitachi keyboard scanner MCU in favor of a parallel keyboard interface), but there were a bunch of practical counters to it, particularly that a DMA sound circuit should be much cheaper than any of those options and more powerful. (in fact, the same criticism comes up from Genesis/MD programmers who have any hardware design knowledge -the Z80 is a massive waste in there and even gimped as a poor-man's less-than-cost-effective PCM sound driver; adding a DMA sound circuit and dropping the Z80+SRAM into the Power Base Converter should have cut costs and made the MD easier to get good sound out of -and no Z80-68k bus arbitration logic needed, either ... smaller, simpler, cleaner board and the only backwards compatible hardware being in the VDP and I/O ends of things)

There's probably a better argument that the Apple II to Macintosh transition should've been backwards compatible (use the 6502 bus for I/O and video handling, let the 68k run without wait states), especially since up to 1984, all Apple IIs had still been plain old 1 MHz systems, making that much more potential for cycle-stealing bandwidth without totally halting that little 8-bit processor. 
(you could easily manage 3 MB/s on the 6502 bus without asserting wait states, PLUS the 6502 is also extremely fast on interrupts, so if they DID go lazy and omit DMA sound, it'd be pretty reasonable to just throw in a bare DAC port for the CPU to write to manually with an interval timer setting the sample rate -still a waste though, better to use DMA sound and make use of that 1 MHz CPU for coprocessing/I/O handling) The Apple II was a much more costly/higher margin product than any of Atari Inc or Atari Corp's machines, so a more costly arrangement like that makes a lot of sense. (same for Tandy using a 4 MHz Z80 + 6 MHz 68000 in the Model 16 ... the same WOULD have made sense if IBM had gone for a Z80 for the PC rather than an 8088 and dropped in a 68000 later on instead of upgrading to a 286 -a 5.37 MHz Z80 would be faster/nicer for a lot of stuff than a 4.77 MHz 8088 anyway, even with a primitive bank-switching scheme like the TRS-80 Model 2 used ... even a 3.58 MHz Z80 might've been preferable in some ways -sticking with the NTSC-centric timing IBM used)

But back on topic: the ST in large part was to be a better, color-capable Macintosh for a rock-bottom price. And honestly, hardware-wise, the DMA sound of the Mac was the only technical shortcoming of the 520ST. (the CPU was faster and wait-free, the monochrome mode was higher res, the standard monitors were larger -though H/V beam pitch/height control pots would've been nice to hide overscan, it had 2 color graphics modes and a nice palette, and the overall base system was more capable than a Mac or PC/AT short of the latter's standard hard drive -Atari should've gotten on the ball with a SCSI hard-drive interface via the DMA port much sooner too, particularly in time for the MEGA ST a couple years down the road -I understand the pricing wasn't right for the 520 or 1040 in Atari's business model, but the Mega should've fit even a luxury priced drive)

Interesting, but that doesn't sound quite right. 
Firstly though, I will say that I meant that Katz and Jack Tramiel worked well together on a management/business level, not that they LIKED each other per se. Both were capable of dealing with a critical, no-nonsense hard business style from what I've seen/read, and given Katz didn't leave until AFTER Jack had stepped down as CEO, I can't imagine he was pressured into leaving (or outright fired as he was at Sega) for failing to fall in line and humor upper management's egos. (I could see him leaving in disappointment over the loss of the Genesis contract, or in disappointment in what Sam Tramiel was doing, or just general wear and tear over many factors -he did leave to take an extended vacation from the industry as a whole that was only cut short at the behest of Sega) I also meant that, regardless of their relationship, Atari Corp never regained the level of management ability shown during the Jack+Katz days. (either on the computer or games end of things)

From Jack's point of view it really seems foolhardy to judge the XEGS that way ... the ST and STe perhaps, and the Panther must have been on the drawing board in 1988 as well if not wirewrapped (it was down in silicon the following year, following revision and refinement from Martin Brennan), but the XEGS made no sense there (and honestly, it made very little sense next to the 65XE selling for less -a deluxe gaming-bundled 65XE would've made more sense and simplified mass production) A very stripped down STe might have made a decent budget-market console for 1989, but really would've made more sense back in '87 had they had a SHIFTER with at LEAST hardware scrolling at the time. (with scrolling you can get away with a lot more without needing massive pre-shifting tricks and thus make do with around 128 kB of DRAM rather than 512 kB ... a scrolling SHIFTER with DMA sound would've done it, along with scrapping the mess of computer-specific I/O hardware in favor of something cheap and simple ... 
a 256x200 32 color/shade mode would've been nicer too, better pixel size for NTSC and better color without more CRAM and just one bit for high/lo intensity -ie 3-3-3-1 RGBI) Now ... a 7800 expansion module in 1987 would probably have been a better idea than the XEGS too, probably easier to expand on their existing budget market than a 16-bit console too. 32kx8-bit SRAMs were getting cheap by that time (cheap enough that Epyx put them on carts that only needed 16k), so something like 32k+POKEY would've been nice, maybe with SIO and keyboard ports too. (an XEGS style keyboard port would be useful there) Sacrificing 4 kB of that RAM to mask the existing SRAM block would be a nice option if they planned to upgrade the base unit into a Super 7800 type system. (if they wanted to keep it add-on only, it'd be better to just bank switch the 32k SRAM into the existing 48k address space)

32k would be enough to load tape based games into and/or use framebuffer based 4 color backgrounds, or multiple 3 color playfields even. (and software sprites or -more likely/useful- software driven character modes on those playfields that take less time per frame to build than the CPU+DMA overhead of MARIA sprites; each 160x192 3-color plane would be 7.5 kB, or a bit more if you use overscan to help with scrolling -hiding character updates in the borders) Probably make Summer and Winter games both launch titles for the expansion system and cancel the more costly versions with onboard SRAM. (but beef the new versions up with some nice POKEY tunes) You'd also have a system that (for cart games at least) was definitively superior to the 65XE. (better graphics, equally fast CPU -with similar free CPU time in framebuffer displays, and better sound via TIA+POKEY) No low res 9/16 color modes, though, for what those're worth. 
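As a quick check of the playfield-size figure above: a 3-color + transparent playfield is 2 bits per pixel, so the arithmetic works out as follows (the helper is purely illustrative):

```python
# Hypothetical helper: bytes needed for one bitmap plane of the given
# dimensions and bit depth.
def plane_bytes(width, height, bpp):
    return width * height * bpp // 8

size = plane_bytes(160, 192, 2)  # 160x192 playfield, 2bpp (3 colors + transparent)
print(size, size / 1024)         # 7680 bytes = 7.5 kB
```

Two such planes plus working space would indeed fit comfortably in a 32k expansion, which is the point being made in the post.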
If they did release an all in one Super 7800 (or 7800Plus) or whatever, using a nicer arrangement for the TIA/MARIA video would be worth it, even if it amounted to a literal manual switch. (composite video connectors like the XEGS used would be nice too) A cost cut, merged TIA+RIOT would be nice too. (arguably a better cost-cutting measure than the single-chip late-gen 2600 Jr, unless it was cheaper to just re-use that chip and disable the 6507 when using MARIA) Hell, given how late the 7800 was introduced overseas, it might make more sense to go straight for the 32k+POKEY equipped model there. (that might actually have a chance at competing alongside the Master System in the budget end of the market there, especially with SIO tape drives used for budget software) The $200 price tag the XEGS was touting in 1987 would be much more justified for a deluxe bundled 7800+Expansion module (or integrated all in one 7800EX) with pack-in games showing off the new capabilities ... maybe include Ballblazer as a built-in game too. (nice POKEY-compatible game that'd be confusing to re-release separately) Another bonus to drawing more attention to the 7800 is the sprite/list system is similar to what the Panther and Jaguar later used, so if Atari was keen on sticking with that (rather than say, dropping it in favor of a blitter based system -or in the Jaguar's case, blitter without OPL) it'd give developers more room to get comfortable with using the novel/different architecture. (I'm not so sure they should've gone with Panther, but this would at least give it more merit ... designing the Panther around 4 of those fairly cheap 32kx8-bit SRAMs rather than the rather costly 8kx8-bit 35 ns -386/486 cache class- SRAMs, it'd have been far more useful ... 
maybe needing an added latch to make it work right, or clocking the graphics chip and CPU a bit slower, but totally worth it -using an 8 MHz pixel clock like the ST is kind of nasty on NTSC TVs anyway, blurry and leaving a big border at 320x200 ... 12.5 MHz or 13.3 MHz like the Jag used would be nice, or something more universally NTSC/PAL-compliant ... plus SRAM makes interleaved bus sharing so so SO easy it's a horrible waste not to take advantage of) I'm also assuming the 16 MHz Panther used a simple 8 MHz dot clock and not something more variable (with the 32.22 MHz master clock, you'd get a nice 320x200 screen at 1/5 speed -6.44 MHz, just below the MegaDrive's 320 wide mode, and 1/6x would give a nice 5.37 MHz if you wanted to exactly match the SNES or MD low-res mode -useful for ports using common pixel art for similar aspect ratios)

^Honestly, I think they'd have been wiser to invest more in the ST chipset at the time, possibly with re-spinning it as a game system as a secondary goal. (putting more effort into making the 1040STe really nice and a serious upgrade -like a 16 MHz CPU, 256 color chunky/packed pixel modes, maybe a 16 MHz blitter with fast block/line fill and move for chunky/packed pixel operation, and overscan support for the SHIFTER with more variable dot clock/resolution modes) And honestly, adding that to the SHIFTER should be a simpler job, chip-design-wise, than the Panther by a good margin. (hell ... even with the 1989 STe as it was, if they'd turned around and spun a 16 MHz, 256 color capable, chunky-pixel arranged system for 1990 ... something cheaper/simpler by far than the TT030 -ie about as cheap as the existing 1040STe- it'd be pretty nice ... 
hell, given how little the BLiTTER was actually used, and given how decent a 16 MHz 68000 would be at software blits for 8-bit chunky pixels, they could save the space/cost and omit that entirely -the TT030, unfortunately, did not use chunky pixels and thus had less than elegant handling of its added 256 color VGA monitor modes)
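The dot-clock divider arithmetic above, assuming the 32.22 MHz master clock figure given in the post (these are the post's numbers, not figures from official Panther documentation):

```python
# Candidate dot clocks from integer dividers of a 32.22 MHz master clock.
MASTER_MHZ = 32.22

for div in (4, 5, 6):
    print(f"/{div}: {MASTER_MHZ / div:.2f} MHz dot clock")
# /4 gives ~8.06 MHz (ST-like, coarse for NTSC),
# /5 gives ~6.44 MHz (just under the Mega Drive's 320-wide mode),
# /6 gives ~5.37 MHz (matching the SNES/MD low-res dot clock).
```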
  12. Or as I concluded with above (yeah, long post ... ) had CBM not bought out Amiga and broken contracts with all their former investors, the Amiga could have been a semi-open licensed standard, possibly including both Atari and Amiga. At least it would if Atari Inc had continued to exist (game/arcade chipset in 1984, computer in 1985 with 128k RAM cap, and unrestricted use of the chipset in 1986 -per Atari's contract). Whether Atari Corp would have opted for that (had Atari Inc still been liquidated but Amiga not defaulted on their chipset ... or withheld the preproduction chips promised to Atari in early 1984) is another matter. That 128k limit would be a big deal compared to the ST ... then again, without the Amiga contract default, CBM's injunction might have gone unchallenged leaving the licensed Amiga (MICKEY) design as their best bet anyway. (forcing them to make do with a 128k system for 1985 might not have been such a bad thing either, plenty of IBM and Apple machines were making do with that or less -remember the 64k Mac was there initially) Plus it'd force Atari to include RAM expansion provisions unlike the ST. (and a more flexible expansion port than the cart slot on the ST ... probably -a 40 pin socket would work rather well for a 16-bit SIMM-like RAM expansion card interface, multiplexed address lines and all ... come to think of it, that could've allowed any base model ST to be expanded to the maximum 4 MB via a simple no-frills cartridge, shame ... and make the 1040ST memory configuration much more standard) But yeah, Amiga could've been an industry wide standard if CBM+Amiga Inc hadn't gone proprietary.
  13. I still think having Atari Inc as an autonomous subsidiary would have been better (with better chosen management -closer to what Morgan was in 1984), but short of that, at least not having conflicting dual-management would have been a big improvement, especially from anyone experienced with the entertainment industry. Really, any change that could avoid the distribution network and market modeling/analysis problems Atari had would have pretty much assured they had enough margin of error to solve most other issues in due time. (solid distribution system = no overproduction of carts, no bloated market, no massive cost overruns due to overproduction, no odd target market errors as with the 1200XL and 5200 -and lack of 600- in 1982, no crash in 1983, etc)

There's also the opposite problem: Intel had competition from AMD, Cyrix, IBM, and Harris putting out competitive or faster CPUs at competitive or lower prices. NEC did too to some extent, but Intel managed to slow their penetration into the market given they weren't an official licensee. (Cyrix managed to leverage alternate legal precedent to avoid major delays in the 486SLC/DLC's release, partially due to the later date and extensive precedent from NEC and AMD) There were other competitors and licensees too, but most of those were 8088/86 licensees only. AMD and Harris offered 16 and 20 MHz 286s (and 25 MHz in Harris's case) when Intel topped out at 12 MHz rated parts to keep the gap to the 386SX wider. In Motorola's case, they refused to license anything beyond the 68000 (or CMOS variant thereof), and no 3rd parties took it upon themselves to engineer their own competing derivatives. No 16-bit bus 68000s with clock speeds rated beyond 16.7 MHz (Motorola had 20 MHz parts), no added MMU or cache logic, 32-bit external bus, or other modifications for more versatile or powerful CPUs. 
Hitachi's HD68HC000 in particular SHOULD have been able to yield well at clock speeds much higher than 16 MHz (CMOS Hitachi parts of that vintage have pretty good reputations for both low power usage and good overclocking compared to MC68HC000s -which themselves aren't bad at all for overclocks) 3rd party accelerator board manufacturers did up-rate 68000s for speeds into the 36 MHz range, so Atari and other OEMs potentially could have done the same, but that still wouldn't cover the lack of internal L1 cache or 32-bit data and address buses. (admittedly the cache and bus-width issues could be worked around at the chipset level with 32-bit latches and board-level caches -or cache integrated into the chipset ASICs themselves) Computer systems actually using 68000s beyond 8 and 10 MHz were also fairly rare, so that's another factor. (seems an odd engineering choice in general on many counts, and lack of MMU support is only a partial explanation given there WERE usable workarounds for that issue on the 68000 and single-task embedded platforms or computers -particularly the ST- that weren't implementing memory protection -though the MEGA ST and later models DID feature tweaks to allow protection through the system's MMU) See: http://www.dadhacker.com/blog/?p=1383

On the CPU end of the market, Motorola competed fine with Intel right up into the 486 era. Had they focused on a die-shrunk 3.3/3.52V version of the 68040, they might have kept pace with or stayed ahead of the late generation 486s and early Pentium as well. The 68060's performance gain over the 040 was fairly poor, while the 040 had excellent ALU and FPU performance compared to the 486 and could've dug into the Pentium market fairly heavily if it could reach higher clock speeds. (a die shrink also should have meant easily expanding the 4+4 kB cache size to at least double that, like late-gen 486s did) Hell, Apple might have held off their transition to PowerPC if Motorola had 100+ MHz 68040s out by 1994. 
(clock for clock, the 68040 was probably closer to Cyrix's 5x86 than a 486, ignoring finer architectural differences between x86 and 68k -or IA-32) Plenty of companies did better than Intel in SPITE of lacking the same level of manufacturing facilities. Cyrix did fairly well using 2nd-sourced manufacturing through TI and others in their early days, then made rather impressive CPUs once they got onto IBM's higher-priority, modern manufacturing infrastructure (at least closer to AMD's and Motorola's production facilities), and even managed to outperform Intel's top-rated CPUs for a brief period when the 6x86-200 was released (150 MHz, 75 MHz bus) with real-world aggregate application performance beyond that of the Pentium 200 or Pentium Pro. (though 32-bit Unix/Linux, OS/2, and Windows NT generally ran faster on the Pentium Pro) AMD had outstanding performance with the K6 family before the world-leading Athlon came on the scene (and the K6-III had a brief period as the fastest x86 CPU on the market at its introduction), and Cyrix fared reasonably well with the M2 (6x86MX) initially but quickly slid downhill following their National Semiconductor merger, falling from standout budget-to-upper-midrange CPU maker, to marginally attractive budget buy, to off the map entirely by 2000. (though to be fair, the final Cyrix design VIA almost released before dropping it in favor of an IDT design under a Cyrix label probably would have outperformed the latter by a good margin, it just wasn't super cheap like the IDT part and had a lower clock speed -IDT's WinChip-based 'Cyrix III' chip could be clocked higher and thus marketed more easily, plus used about half the die space and power of the Cyrix-designed 'Joshua' part -the latter probably would've performed fairly close to a K6-III or Coppermine Pentium III of similar clock speed) Not to even get into performance on the PowerPC end of things. 
(which DID avoid the major pitfalls of the 68k architecture -IBM and Motorola being very open with cross-licensing and competitive pricing, though raw demand was far lower than 68k in its heyday, let alone x86) The 68k family had no real price/performance range to match the likes of the 386DX-40 or Cyrix, IBM, and AMD 486 parts in the early 90s. Sure, the 68000 from its various vendors was plenty attractive next to most 286s and slower 386SXs (and cheaper to manufacture than either), but that fell out of relevance beyond the embedded scene by the mid 90s. (though again rather moot due to the very limited number of companies actually USING 16 MHz 68000s, or even 12 MHz ones) Atari's ST line would've been pretty interesting if they'd been using 12 or 16 MHz 68000s by the time they were using 12 and 16 MHz 286s in their AtariPC line. (they also had a rather nice 386DX-40 machine in their ABC line a bit later ... good luck managing a 33 or 40 MHz 68030 in the Falcon at that time, or 33 MHz 68020 even ... or 25 MHz 68EC020) The TT030 was really an odd duck for Atari to release ... in 1990 you had the 8 MHz ST, Mega ST, STe, and then the 32 MHz (very expensive) TT030 and nothing in-between, at a time when mainstream PC buyers were mostly heading towards 12 or 16 MHz 286s, 386SXs were mostly a marketing gimmick not worth the cost (and not performing better short of an SX-25), and 386DXs were luxury items just behind the 486SX and workstation-class DX. (1991 would change all that with AMD's DX40 and 1992 with Cyrix breaking into the market) This in the context of people who built/upgraded their own PCs or went to PC shops to do so, not looking at those buying the (rarely worthwhile) overpriced and vastly underpowered pre-built machines or astronomically priced business/workstation class machines. (that said, Atari's PCs actually seem very close to some of the more rational/nice custom built jobs of the era ... 
the 386SX-16 ABC was probably a ripoff -as nearly all 386SXs were, but their 286 and 386DX-40 machines seem pretty damn nice for the time -and indeed, going from a 286-12 to a 386DX40 was one of the most sensible and rational real-world upgrades you could do with a PC ... next beyond that would be a 486DX/66 or more likely waiting for a DX/100 -especially if that DX40 had board-level L1 cache or at least sockets to upgrade later ... most PC games in 1994 would run fine on a decent DX-40 rig with VGA ... an ISA video card might struggle a bit with screen tearing, VESA would be better -though for 320x200 256 color stuff, a fast 16-bit ISA card would probably suffice) And on a side note, the 68000 MIGHT have fared a bit differently if the Amiga chipset had been cross-licensed and Amiga Inc stayed independent and fabless. (a variety of manufacturers all with their own machines derived from the same core chipset, including Atari Inc, no Commodore buyout, no MOS-exclusive manufacturing, etc ... plus maybe seeing the thing hit the market in 1984 rather than '85 ... maybe implementation of the Ranger chipset too -though I think going with a 32-bit DRAM chipset would have been more useful and cheaper than the VRAM arrangement, unless they used DRAM for most of ChipRAM and only a small amount of 16-bit VRAM for the framebuffer ... a 512k+128k machine would've been nice for 1987)
  14. Hardware-wise, ignoring a ton of other variables and missed opportunities on the R&D end from 1989 to 1993 (when the chipset was frozen for production and industrial design of the PCB, case, etc. was laid down for mass production), the only big engineering mistake seems to have been heavily banking on DRAM prices dropping rapidly. The Jaguar was screwed over by the same situation as the ST in 1988: a DRAM shortage/crisis stagnating (and even INCREASING) prices of existing DRAM grades. (that more than anything else killed the ST's momentum or its supremacy in the 16-bit computer market in Europe ... or potential edge over the Amiga in any market -lack of major hardware updates didn't help, nor did Sam taking over as CEO in 1989 ... ) It's compounded by the fact that the single-bank 2MB arrangement also squanders some very nice features of the chipset, and for a cart based system RAM capacity isn't all that critical (yes, nice fast memory to work in AND decompress into, but there's moderation). Cutting down to 1 MB and lumping another MB onto the CD drive later on would have been a far, far safer bet and very practical for the time. Using 4 128kB 16-bit wide DRAMs for one 512 kB bank and a single 512kB 16-bit wide DRAM for the 2nd bank would be great. (allow a lot of interleaving with 68k and DSP accesses in the 16-bit bank, speed up blitter texture rendering a ton, and retain peak bandwidth for OPL-intensive games and minimal framebuffer scanning overhead) You'd even still have room to Z-buffer with some freedom. (tougher if you interleaved with the framebuffer on a 64-bit phrase basis -ie 4x 16-bit screens rather than 3x- but still doable if most/all textures are in the other bank and you're not using many sprites) With interleaving, they might have done better to stick with 25 MHz rather than pushing to that extra 26.6 MHz, given it'd allow 4-clock random access cycles rather than 5, for better interleaving and synchronization with 68k cycles. 
(lose a little on peak FPM bandwidth, but gain a lot on average performance and latency) Plus you can use 12.5 MHz rated 68ks and save a little bit. They probably could have cut more than $25 off the raw component costs, and a good deal more when scaling that up to final retail distribution pricing (kskunk said a safe rule of thumb is double, but it's obviously more complex than that in real world terms). Managing a $199 price point for the promotional test market in '93 and $149.99 for a bare-bones core-system arrangement for the 1994 launch might have been doable then. ROM is in a separate bank as well and wouldn't screw with DRAM cycle timing, so having the DSP mainly limited to ROM fetches might have been one more way to minimize bus strangling. (for games using the DSP purely for sound -hard to do much else given its slow bus logic- ROM would be fine for streaming samples for sample-based stuff, rather than wavetable synth using ROM and scratchpad RAM alone) Hell, cutting the RAM down AND going with a CD-ROM based machine from day one probably would have made sense for the time too; Atari apparently had problems negotiating for cheap enough drives and controller chipsets, but they probably could have compromised more on that to make it to market sooner. (licensing that Philips CD controller chipset was a bad investment anyway, better to buy off the shelf until they have enough volume to merit long-term investment like that) Flair had already embedded a CD-ROM decoder in their more modest Slipstream 3 ASIC, so taking that as a priority sooner should have allowed it to be crammed into JERRY as well ... though its existing UART bugs don't make that promising. 
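The 25 vs 26.6 MHz tradeoff above is easy to rough out numerically. A minimal sketch, assuming a 64-bit phrase per access and a 2-clock fast-page-mode burst cycle (my own illustrative figures, not measured Jaguar bus timings):

```python
# Back-of-envelope DRAM throughput for the two clock options discussed above.
# Cycle counts are illustrative assumptions, not measured Jaguar timings.
BUS_BYTES = 8  # 64-bit phrase per access

def bandwidth_mb_s(clock_mhz, cycles_per_access, bus_bytes=BUS_BYTES):
    """Throughput in MB/s when one access completes every `cycles_per_access` clocks."""
    return clock_mhz / cycles_per_access * bus_bytes

# Random (page-miss) access: 5 clocks at 26.6 MHz vs 4 clocks at 25 MHz
random_266 = bandwidth_mb_s(26.6, 5)   # ~42.6 MB/s
random_250 = bandwidth_mb_s(25.0, 4)   # 50.0 MB/s

# Fast-page-mode burst: assume 2 clocks per access at either speed
fpm_266 = bandwidth_mb_s(26.6, 2)      # ~106.4 MB/s
fpm_250 = bandwidth_mb_s(25.0, 2)      # 100.0 MB/s
```

On these assumptions, the 25 MHz / 4-cycle option gives up a few percent of page-mode peak but comes out well ahead on random accesses, which is where interleaved 68k/DSP traffic would actually live.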
(not sure why they didn't go cheaper/simpler with JERRY and use an up-clocked Flair DSP, save a ton on silicon, and leave less space for bugs -maybe allow use of simpler/faster-to-engineer gate array logic to get the bugs smoothed out quicker than standard cell) Using a 1x rather than 2x speed drive is a given for a 1993/94 timeframe too. (keeping the total system cost close to the Sega CD's price would have been a nice marketing angle ... new 64-bit machine CHEAPER than the combined cost of a Genesis + CD bundle ... let alone 3DO) They critically failed to introduce the Jaguar to the UK (or mainland Europe) in a timely manner ... striking the London and Paris test markets was a big blow there, but failing to ramp up distribution for the 1994 launch was the real nail in the coffin. (rather than trying for a nationwide US launch they probably should have focused on their strongholds ... places with remaining market recognition and/or relatively cheap/dense distribution and viral marketing potential; I think California might have been on that list, or at least select regions thereof, and the UK was a big, big win if they wanted to actually make money on the system -there's bleeding money on marketing to try and make your system flashy and mass market worldwide, and then there's compromising to produce a semi-niche, low-budget next-gen product that's at least profitable and has the potential -if you're lucky- to make it big if that market sector pans out ... and you have the investor backing and management to back it up -they needed another Michael Katz type manager at that point ... 
pulling another 7800 level success and then some) That said, they also could have just kept the Jaguar in development longer and used the simpler, cheaper single-chip ASIC Slipstream 4 design Flair also had ready by 1993 (1992 as well, but there were multiple revisions and the Slipstream archive site only has the 1993 V 3.3 manual available -the 1988 and 1989 versions of the prototype and production 16-bit Slipstream are there too, but nothing on the 1990 or 1992 revisions). It was a less powerful but critically cheaper and simpler 32-bit system targeting a 386SX, though I believe it was (like the Jaguar) able to swap big and little endian modes and support a variety of CPU types. (swapping that 386SX for a cheaper, more elegant 68000 would probably have been the natural solution) It supported dual DRAM banks and FPM access as well, but no CRY color or 64-bit stuff, just a blitter, DSP, I/O hardware (including CD-ROM decoder), video controller, and memory controller. (it used 18-bit RGB paletted and 5-6-5 RGB direct color, with Gouraud shading/lighting and color blending done using that rather than CRY methodology -I'd have thought going 4-4-4-4 RGBY would have been a simpler extension to the older Slipstream configuration, with simple 4-bit additive-based color blending and lighting ... and an easy option for a 4-4 C/Y paletted 256 color arrangement with fast shading, but they went more typical highcolor or SVGA-like it seems) See: http://www.konixmultisystem.co.uk/index.php?id=downloads It'd be really interesting to find the missing Slipstream developments from the period of the Jaguar's development too, especially something to fill Atari's massive 1990-1993 console market gap better than the existing Panther or the 8086-based Slipstream 1.06 of 1989 could have done -the latter DID push the alternate budget market angle, targeting floppy disk software, and did have pretty notable developer support already AND pending interest from Lucasfilm Games. 
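For what it's worth, the 4-4-4-4 RGBY idea I floated above is easy to sketch: pack four 4-bit channels into a 16-bit pixel and do lighting/shading as a saturating add on the intensity nibble. This is purely a hypothetical illustration of the scheme (the pack order is my own choice), not how any shipped Slipstream revision actually worked:

```python
# Hypothetical 4-4-4-4 RGBY pixel format with 4-bit additive intensity shading.
# Channel order (R high, Y low) is an arbitrary assumption for illustration.

def pack_rgby(r, g, b, y):
    """Pack four 4-bit channels (0-15 each) into a 16-bit pixel."""
    return (r << 12) | (g << 8) | (b << 4) | y

def add_intensity(pixel, delta):
    """Additively shade by bumping the Y (intensity) nibble, clamped to 0..15."""
    y = pixel & 0xF
    y = max(0, min(15, y + delta))
    return (pixel & 0xFFF0) | y

p = pack_rgby(12, 5, 3, 8)        # 0xC538
brighter = add_intensity(p, 10)   # Y saturates at 15 -> 0xC53F
darker = add_intensity(p, -12)    # Y clamps at 0 -> 0xC530
```

The appeal is that shading touches only one nibble per pixel (no per-channel multiply as 5-6-5 Gouraud needs), which is the "fast shading" angle mentioned above.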
(as it is, I'm kind of surprised Atari commissioned the Jaguar rather than settling for the simpler Slipstream ... or didn't funnel Flair's engineering expertise into the ST's chipset enhancement/consolidation in 1989-1992 leading up to the Falcon's release, or the MEGA STe for that matter) I say 1990 gap due to the 7800 declining heavily in 1989 and nearly dropping off the map in 1990 in the US. (though it was just getting its very late start overseas) If you haven't looked at it before, the Slipstream Archive page also added the manual for the 1989 8086 (not 8088) based production Slipstream 1.06 design with tweaks and bug fixes. (including a 16-bit DRAM interface and 16-bit bus latch for the 8086 to interleave memory cycles rather nicely -rather than halting it during active display as the 8088 prototype did) And the Slipstream 4 (Rev 3.3) lists copyrights of 1988, 89, 90, 92, and 93; '88 is the 8088 prototype's date and '89 is the 8086 production version, but the '90 and '92 revisions are unaccounted for. (as are the 'Slipstream 2' and 'Slipstream 3' naming conventions ... unless the 8088 to 8086 transition marked Slipstream 2, in which case that still leaves 3) Edit: Just quickly glancing at the last couple pages of this thread, there are some blatant misconceptions going on (or possibly trolling, not going to bother guessing which). The 7800 was NOT a failure, and Jack Tramiel did NOT try to 're-engineer' the company into a computer company: Warner liquidated Atari suddenly and without warning in July 1984 (4th of July weekend, of all things) with James Morgan and the rest of Atari upper management (let alone other staff) uninformed and left in chaos ... Tramiel and co. had also not been informed by Warner how poorly/rashly this was being managed, nor had Warner pointed out that James Morgan should be contacted ASAP to cooperate with the transition. This left a confused mess for Tramiel to deal with when he began reviewing the liquidated assets of the then-defunct Atari Inc. ... 
and staff that had not been notified that the company they worked for no longer existed (a legal mess too, given all the breaches of contract ... really big SNAFU on Warner's part). Warner ALSO failed to include the 7800's license/rights in the sale, which cost nearly a year of renegotiation and litigation until it was smoothed out in early 1985. Tramiel's company (Tramel Technologies Ltd.) took on Atari's old assets and was renamed Atari Corporation in 1984. He brought Mike Katz (former president of Epyx and later president of Sega of America) in to develop Atari Corp's new games division in 1985, to build up marketing/distribution and new software development for the 7800 and 2600. (or rather, he brought Katz on for games management/marketing, and Katz convinced Tramiel to establish a separate division for that) It was during that time that Katz also realized Nintendo had exclusive contracts with most Japanese arcade companies and decided to primarily pursue American computer game publishers. (and leverage his relationship with Epyx heavily) The 7800 was a market success for Atari Corp, holding the second highest market share (above the Master System, behind the NES) from 1986-1988 (with '87 and '88 being very strong years for hardware sales, though '87 was probably the market-share peak given Nintendo's massive increase in sales in '88). 3.77 million were sold in total between 1986 and 1990, 3.76 million of that through 1989. 
(sales dropped off drastically in 1990 with under 100k sold) See: http://www.gamasutra.com/blogs/MattMatthews/20090526/84050/Atari_7800_Sales_Figures_1986__1990.php Jack Tramiel stepped down from his role as CEO in late 1988 (Sam taking over, and Jack staying on in the less hands-on role of chairman of the board), and Mike Katz left in early 1989 and went on an extended vacation from the industry, only to be pulled out early by Sega co-founder David Rosen (founder of Rosen Enterprises -one of Sega's original parent companies- and chairman of Sega of America) to become president of Sega of America shortly after the Genesis's launch. (he orchestrated the 'Genesis Does' campaign, established a working relationship with EA, and built up Sega's sports lineup, among other things) It's also notable that in late 1988, Rosen had been in negotiations with Jack Tramiel and Katz to distribute the MegaDrive/Genesis in North America. (in large part due to Katz's success with the 7800) Katz favored it, but Rosen and Tramiel couldn't agree on terms. Katz was also gone by the time the Lynx was on the table (ironic given his connections to Epyx), so he had no hand in building up its launch or form factor. I assume that the combination of the Panther being in development (possibly alongside an ST/STe derived console) and the offer only extending to North America (leaving a conflict of interest in Europe) contributed to Tramiel and Rosen's disagreements. (honestly, the loss of Katz and Jack's management talent was probably a far bigger blow to Atari Corp than failing to secure that license ... between that loss of management and the sting of the 1988 DRAM crisis, the ST never regained its momentum, and I can only imagine how Katz might have contributed to the Lynx or the development of the Panther -or work with Martin Brennan and John Mathieson from Flair)
  15. They had the Atari 600 in the works to complement the release of the 1200XL (and replace the Atari 400) but canceled it in favor of targeting the higher end of the market entirely. (similar mistake made with the 5200) If you want to go any cheaper, drop in a chiclet or membrane keyboard in lieu of a mechanical one. (but given the cost of the machine and its 16 kB, a full-throw keyboard would probably be worthwhile to counter the VIC-20 in quality and performance terms -maybe a hard-capped collapsing rubber dome keyboard, but most of those were pretty chiclet-like in '82 ... the XE and ST keyboards were a very nice compromise, but I don't think those options existed in '82 -still, something better than the 400's flat, zero-feedback membrane board was definitely out there ... even the Commodore Max's membrane board had raised, collapsing bubble style keys) They should have offered a 400 with a full-throw keyboard prior to that too, especially with the 32k models. (and potential 16, 32, and 64k machines to follow on with the XL series -2kB DRAMs were still cheaper in 1982, though slapping a full 32 ICs on the board for 64k probably wasn't worth the trade-off; 16k would have been, and probably 32k as well ... I'd get into 48k vs 64k too, but that arrangement alone didn't really contribute to cost or compatibility issues on the 1200XL -the specific WAY they configured the 64kB and the new OS had problems, but none that couldn't have been fixed if looked at seriously or that were worth resorting to an unexpanded 48k machine ... 
unless they did so intentionally to save cost and leave bank-switching logic on an external module to be connected to the PBI) http://phe.rockefeller.edu/LogletLab/DRAM/dram.htm (DRAM price list, 5th chart over, gives a decent idea of what engineers would have had to consider in '81 and '82 -the price disparity of 64kbit vs 16kbit chips in '81 in particular would have given a lot of pause to new developments targeting the higher density) Anyway, had they actually focused on tailoring hardware to the UK/European marketplace earlier on (ie as soon as they started moving into that market), cutting the costs out of the 400/800 as far as anything FCC-specific went (the multi-board configuration with all the aluminum castings) would've been a big target to go after, both reducing the price point and improving profit margins. (plus the more compact form factor would sell better) That'd also give them a leg up once the FCC Class B requirement came on the scene, expediting a new release before the 600 and 1200 could be completed. I also err on the side of skepticism as far as the 3200/Sylvia project goes, given the time gap between its development and the 5200's release. It seems more likely that concerns beyond time to market dictated its cancellation. (to the extent that my argument for a 6502C+SRAM hack is practically moot -the 3200 also should have provided a unique architecture harder to produce unlicensed games for than the 5200 or 8-bit computers, if they cared to implement such -also moot until Atari kicks their butt in gear regarding their OWN distribution management)
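The board-space side of that 1982 DRAM decision is simple arithmetic: with the x1-organized parts standard in that era (4116-style 16kbit, 4164-style 64kbit), bytes of RAM = chips × density / 8. A quick sketch (chips_needed is just my own helper for illustration):

```python
# DRAM chip counts for a given RAM size, assuming 1-bit-wide (x1) parts
# as was standard for 16kbit and 64kbit DRAMs circa 1982.
def chips_needed(ram_kb, density_kbit):
    """ICs required to build `ram_kb` kilobytes from x1 parts of `density_kbit`."""
    return ram_kb * 8 // density_kbit

n_64k_from_16k = chips_needed(64, 16)  # 32 ICs: the "full 32 ICs" case above
n_64k_from_64k = chips_needed(64, 64)  # 8 ICs with the pricier 64kbit parts
n_16k_from_16k = chips_needed(16, 16)  # 8 ICs: why 16k was an easy call
```

So a 64k machine on cheap 16kbit parts quadruples the IC count (and board area, sockets, and power) versus 64kbit parts, which is exactly the trade-off against the 1981/82 price disparity shown in the linked chart.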