kool kitty89


About kool kitty89

  • Rank
    River Patroller
  • Birthday 08/08/1989

Profile Information

  • Gender
  • Location
    San Jose, CA

  1. I should probably post about this on Spritesmind or something (I'm not sure I even have an account on there), but might as well mention it here given the conceptual overlap and on the off-chance any of the Atari 8-bit computer coders out there are also messing with PC Engine stuff. (I've seen a couple people doing that in the past, 6502 code heads and all) And heck, I've had this in my head for over 6 years now and haven't really talked about it anywhere. But it seems to me you can use the 32x5-bit wave sample buffers in the HuC6280 CPU/sound chip to store 5-bit PDM information, so you get the 32 amplitude levels from each 5-bit word, plus 32 additional linear levels from the pulse density (hypothetically more than that with more complex noise shaping, but I'm thinking simpler than that but still nicer sounding than straight PWM), so an effective 10-bit output. In this case I think it's more literally PDM and not sort of twisting the definition as I did with my starting post for this topic. It's sort of like pushing bits through a serial shift register in loop mode at a high idle clock rate, but you're shifting out 5-bit words rather than 1 bit. (so in effect you're spitting out a looping, oversampled high frequency multi-pulse waveform that gets effectively filtered down to a linear amplitude sample) You'd need to double buffer samples using 2 of the sound channels intermittently, muting the back buffer and loading it, then waiting to update the sample output (via cycle counted code or a timer interrupt). I was mostly thinking in terms of timer interrupts here since this method could allow similar sample rates to the other 10-bit PCM hack (using 2 channels and a volume offset for upper and lower 5-bit values, or less of an offset for just 8-bit resolution, but in any case requiring the CPU to set every single sample manually, so one interrupt per sample played if using an interrupt-driven routine). 
So with the onboard timer interrupt, you get approximately 6.99 kHz or an integer division of that. So that'd be 6.99 kHz 10-bit PCM output, potentially with software-mixing of multiple 8-bit channels, ADPCM or DPCM decoded samples, etc (with a look-up table to convert 10-bit linear PCM to the appropriate 32x5-bit register data). But more useful/less CPU intensive than that would be to chop the 32x5-bit buffer into 4 chunks of 8x5-bits for 4x 8-bit resolution interleaved/multiplexed channels. (useful for simplified code and low overhead mixing of 8-bit samples, or compressed samples decoded to 8 bits, plus additional software mixing on any of those 4 channels to 8-bit saturation: also lighter since pure 8-bit math is faster than mixing to 16 bits and clipping to 10) Further, if you could synchronize the timer interrupt with the wave buffer playback frequency, so rather than looping the sample only plays once per interrupt, you could also potentially use the same basic technique to achieve 27.96 kHz 8-bit playback with that same 6.99 kHz interrupt. (worst case you'd still have some 6.99 kHz carrier tone aliasing where the idle high looping technique oversamples those overtones out of the audible range) You could also use slower timer intervals for less overhead, but then the aliasing falls more and more into the audible and nasty range. Since the wave generator and timer are based on the same source clock, I'd think you could synchronize them as such, but not sure. The oversampling 'whole 32x5-bit buffer as a single amplitude sample' seems like the more foolproof option, and the one that seems more like 'why didn't someone do that back in the 90s' sort of thing. Plus it's got the advantage of leaving the 4-bit logarithmic volume control totally free, so you can play with master volume on the simulated-PCM channel(s) by just setting the volume register as normal. 
(or potentially do Amiga-style extended resolution by modulating that volume level on top of everything else: something that does work on the PC Engine, but requires intermittent channel use to avoid clicks/pops when the volume register is set, which is irrelevant if you're already using 2 channels in a double buffered set-up) Or take another 2 channels (for double buffering) and use the same mechanism there for 10-bit output, then use a volume offset to allow that to be added with the first channel for 20-bit output. Of course, that's kind of pointless at 7 kHz, and more interesting if you could get the ~28 kHz 8-bit synchronized method working and used the volume offset for low+high 8-bits and 28 kHz 16-bit mono output. (which would be pretty nice, potentially a lot nicer and more flexible for streaming compressed or uncompressed PCM audio from the CD Unit than trying to buffer into the hardware ADPCM decoder: useful compared to straight redbook audio even with 16-bit 28 kHz LPCM since it takes up less space on disc and could allow more dynamic uses of sound/music data and interspersed game data loading, but also much more useful for software mixing multiple sampled SFX channels rather than choppy single-channel ADPCM SFX) **that and this leads into another topic regarding how the ADPCM decoder in the CD-ROM module was kind of pointless and would've been nicer to have a full 128kB of program RAM instead of the separate 64kB of ADPCM buffer RAM. ** on a side note, a typical 8-bit shift register in loop mode could be used for 1-bit PDM with effective 3-bit linear amplitude per byte loaded. (I looked into that a while back for potentially hacking VIA or ACIA type chips for sampled sound output, but their max bitrate seemed too low to be really useful for that: though using them for looping 1x8-bit sample, complex pulse tone/noise waveforms seemed a little more interesting) I'd think POKEY's serial data output could also be hacked similarly, for what that's worth.
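The core trick described above — treating the whole 32x5-bit waveform buffer as one oversampled amplitude sample — boils down to a simple encoding step. Here's a minimal Python sketch of it, assuming a Bresenham-style spread of the "raised" entries so the pulse-density energy stays at the highest possible frequency; `encode_pdm` is a hypothetical name (on real hardware this would presumably be a precomputed lookup table rather than runtime math):

```python
def encode_pdm(sample: int) -> list[int]:
    """Map a 10-bit linear sample to a 32-entry, 5-bit wave buffer whose
    average level (after filtering) equals sample/32 in DAC steps.

    base    = sample // 32 -> the coarse 5-bit level of every entry
    residue = sample % 32  -> how many entries sit one level higher
    Note: only 0..992 (31*32) is exactly representable this way; the top
    31 codes saturate, which is an assumption of this sketch.
    """
    sample = max(0, min(sample, 31 * 32))  # clamp; codes above 992 saturate
    base, residue = divmod(sample, 32)
    buf = [base] * 32
    # spread the `residue` raised entries as evenly as possible
    for i in range(residue):
        buf[(i * 32) // residue] = base + 1
    return buf
```

The invariant to check is that the buffer's total (and thus its filtered average) matches the input exactly, which is what makes this a linear 10-bit-ish output rather than straight PWM.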
  2. Aside from the very limited supply side of things and retro tech going up in price a good deal across the board (Sega stuff was vastly cheaper 5-10 years ago), there's a few other things in play. You've got younger generations discovering these consoles and having interest in them, driving a new source of demand. You've got local shops and online sellers having less and less access to surplus/used goods compared to a few years back, and you've got a chunk of international sellers who might have somewhat (or much) lower prices but much higher shipping costs, at least for individual items. It seems like a lot of the second hand goods and electronics recycling/resale market has moved away from US thrift and warehouse stores and gotten moved overseas (East Asia and Eastern Europe for the most part from what I can see). Though it also seems like some stock that was formerly in California (especially around here in Santa Clara Valley) has moved to parts of the Southwest, especially old PC hardware. (Texas, Arizona, New Mexico, Nevada all seem to have warehouses sorting through supplies of old motherboards among other things, and I mean 80s and 90s era stuff) I'm not sure, but a disproportionate number of Atari Jaguars may have been sold in CA and stayed around here longer than elsewhere, and Atari was obviously based in Sunnyvale and I think had some of their inventory stored in Sunnyvale or Santa Clara warehouses (or a near-by industrial/warehouse district). There were some 235,000 or so Jaguars built, and while that's not a lot by video game console standards, the limited sales left a big portion of that stock unsold: as part of the Atari Corp liquidation in 1996, there were still in the area of 100,000 of those in inventory (sales had reached 100k by the end of 1994, but appear to have stagnated thereafter). I assume some of that ~100k surplus figure also included completed Jaguar system boards that hadn't yet been assembled into cases. 
Best Electronics had spare/replacement Jaguar boards for some years after their NOS boxed systems dried up, but those eventually ran out as well. But on top of that, eBay itself is probably part of the problem: they've been keeping shorter and shorter records of sales, thus making it really difficult to make well educated pricing decisions as a buyer or seller. At this point, it's well under 6 months' worth of completed listings that can be viewed. So more than just making it difficult to price things, it's incredibly hard to work out how common or uncommon it is for given items to even go up for sale or what sort of things tend to have seasonal availability. And unless there's some 3rd party archive site taking up this role, you're stuck with personally keeping track of the market for years at a time if you want to actually have a record of things coming and going. I'm not sure, but it seems like encouraging ignorance on the side of the buyer (more so than the seller) and having the most visible listings be overpriced BIN options that almost never sell has led to inflated prices that more buyers impulsively give into or find attractive by comparison to even more inflated listings. eBay staff/management may have actually taken notice of and actively encouraged the behavior. Their restriction on completed listing info doesn't seem proportional to archived data storage/management constraints or costs, particularly for simple read-only fixed archives of older listings that could be placed on a lower bandwidth server, potentially with restricted image data sizes/capacity per listing as well. (so you'd lose some high-res photos, or even all the photos other than the thumbnail, but retain the actual listing information and description)
  3. Yes, not to get into grittier details, but I was mistaken with that old post (actually about the first year I really got into the retro gaming/computer community/discussion scene online too), and I've learned a lot since then. However, as far as I've come across in recent articles, research, and discussion, Sega never released complete, let alone detailed sales figures for the Master System or Mega Drive, and the fragmented evidence and figures available for the SMS are sparser than those used to approximate MegaDrive/Genesis sales. For that matter, I recall some of Nintendo's own figures being slightly vague in some respects and also lacking complete breakdowns by region and year (also the issue of fiscal year vs calendar year and different regional fiscal years). Atari Corp's 7800 sales figures are among the most detailed I've seen, and I think they use US fiscal years and not calendar years for their dates. (and are for the US, I think or was it all of their North American market? ... I'm not sure how Canadian sales were handled or if those were all imports made by local non-Atari distributors or mail-order, and included in US sales) And honestly, the 7800 sales figures in the US vs apparent (approximate) Master System sales and market share are some of the more interesting comparisons to consider. That and also bear in mind something I overlooked back in 2010: the PAL 7800 arrived very, very late, basically when the platform was dying in the US (and really in need of a successor) and extremely dated for a new release, especially in the UK and continental European markets that were saturated with 8-bit and increasingly inexpensive home computers. The Atari ST was past its prime by that point, too, and on the verge of being problematic even as a good budget PC/game console option (taking both hardware and software prices into consideration). 
Wikipedia lists 1987 for the 7800's release in PAL regions for some reason, and maybe the PAL version of MARIA was ready that year, but I'm pretty sure the 1989 date is correct for the actual release, as RFGeneration has, along with the publishing dates on PAL game cartridges. http://www.rfgeneration.com/PHP/gethwinfo.php?ID=U-032-H-00010-A Atari also kept supporting/publishing software in PAL markets after US distribution had stopped, and I think some of the very latest publication dates of any 7800 games (homebrew aside) were for European releases in 1991. Sentinel may have been the last and was a PAL exclusive prior to ResQsoft's 2002 release for NTSC. A lot of people are surprised and dubious of the SMS vs 7800 sales, but it makes sense to me, as does the confusion. The 7800 had 3 things going for it: the Atari name, the big back library of 2600 games (and some new ones), and the low price of the console and games (both the discounted 2600 games and new games). The biggest weak point was probably not making the difference between 2600 and 7800 games clear enough and somewhat confusing labeling/marketing on that front. (the actual technical limitations and Atari's monetary or in-house software investment constraints come after that, same for the confusing release and market position of the XEGS in 1987, and I'm leaving out Nintendo's own licensing contracts and market manipulation as a separate category altogether) The 7800 was fairly consistently priced in the $85-90 range, cheaper than the NES, and more so than the Master System. And the NES's price advantage over the SMS was also probably a major issue, though I'm less sure of the exact dates and price changes. You also only really had 1987 and 1988 as full years of the SMS being Sega's front-line competitor in North America. 
The Genesis released in North America in August of 1989, and by that Christmas sales season they had Michael Katz's management and marketing style influencing things, though I think 1990 is when that hit full force. (he'd been the one handling Atari Corp's game/entertainment division as President from 1985 to late 1988 and left the company on an -intended- extended vacation from the industry shortly after Atari and Sega's negotiations for Mega Drive distribution fell through, and also before Jack Tramiel stepped down as President and CEO; Sega's David Rosen convinced Katz to join Sega of America's team as President, and jump back into the game earlier than Katz had planned)
  4. This might just be a throw-away idea, but with the various (attempted and completed) expansion modules for other classic game consoles and home computers (Atari and otherwise), plus the efforts I've seen for quirky sound and/or memory expansion boards or specialized retro sound cards in the retro Sega and PC scenes, it seemed worth throwing out there. (that and the level of features SNES flash carts have reached) That, and I think I've made the mistake of interjecting random mostly-off-topic ideas like that into other homebrew projects too often, back when I was an active poster. Or with a single post late last year here: https://atariage.com/forums/topic/254003-upcoming-jaguar-game-drive-cartridge/page/125/?tab=comments#comment-4386972 I realized a more plausible and practical possible Jaguar upgrade would be a DRAM (SDRAM or PSRAM, depending on cost and availability) based SD-card style flash cart with provisions for mapping some of the RAM into use as actual expanded RAM (and not just pretending to be ROM) or at least allowing that RAM to be read and written to by an on-cart CPU or coprocessor. The added CPU or MCU could be any number of things, but for folks actually interested in something remotely plausible and period-accurate to the mid 90s and the Jaguar's existing architecture, just including a faster 68000 or 68EC000 with zero wait state access to local RAM would make sense. And here's the thing, it could be a basic PCB kit as well as completed boards, and include the necessary provision for flash drive interface + DRAM (potentially separate pads for DRAM and PSRAM and jumpers to select one or both) and then pads for a 68-pin PLCC socketed 68000 or 68EC000. 
(pads for a surface-mount PLCC would probably be cheaper, but the socket seems a lot safer and a lot easier to install for those with basic soldering equipment, or no equipment at all for people buying boards with sockets already installed) Or if you really wanted to be safe, pads for DIP-64 68000s could be provided as well. You could thus have decentralized manufacture of the boards (downloadable free plans, various people making and offering kits or just the bare PCBs, and others potentially selling completed boards in various states of completeness) and also let homebrew devs choose what features to bother supporting or requiring. (though I'd think outright requiring all features to run the game at all would be less popular) Making the whole project open-source (and potentially compatible with) SaintT's SDRAM board would also make far more sense than starting from scratch. You could do a lot of that as a single-chip FPGA too (FPGA + RAM), but with the current availability of cheap 20 MHz 68HC000s (and to some extent EC000s), mostly from China through eBay (or potentially from less reliable/safe auction/sale sites, which I'd avoid), it seems like a discrete socketed 68000 would make the most sense. Supplies of 16 MHz 68HC000s in 64 pin DIP packages are also relatively cheap currently and 12 MHz Hitachi 68HC000s are around at low prices intermittently as well (and HD68HC000s are very low power and quite good overclockers from what I understand). And to cover the uncertain future availability of certain speed grades (and a potential flood of counterfeit/remarked chips), and to allow for overclocking, there could be jumpers or software selection for different clock rates (probably using PLLs and dividers off the native Jaguar ~13.3 MHz signal). 
A base 13.3 MHz would be safe/conservative (and still offer a lot of advantages with a slow CPU on a separate bus, not bothering the Jaguar chipset itself, sort of like a slow PC with an all-in-one 3D accelerator + DSP sound board). Then you could have 16.7 MHz (x5/4), 17.7 (x4/3), 19.9 (x3/2), and 26.6 MHz (x2). You could go higher, but 26.6 MHz seems reasonable for a virtually foolproof overclock on 20 MHz parts (and many, if not most CMOS 16 MHz parts). You could probably omit the 16.7 MHz setting, too, and simplify the clock synthesizer arrangement and also only need 2 bits for selecting the clock speed (be it set in software or via jumpers or a switch block). Allowing data to be streamed from the SD card (in place of a CD-ROM or hard drive) seems like it would open up a lot more opportunities for homebrew as well, plus potentially allow emulation of the Jaguar CD itself and effectively allow much more ready access to the Jaguar CD library without one of the scarce, expensive, Jaguar CD units in good working order. (and I know there's the whole piracy vs abandonware debate elsewhere, but I'd just stay out of that and focus on being free, open, and just rely on the honor system for that: ie for folks who have real copies of the game but not working CD units, plus Good Old Games style cheap license purchases and DRM-free digital downloads for any publisher/rights holder of out of print software to bring out of abandonment) Well, that and I guess if Jaguar emulation software gets good enough and flexible enough, homebrew games could even be offered bundled with embedded emulators. (like GoG does with DOSBOX and ... maybe some other old home computer platforms, I'm kind of out of the loop on that) But there's obviously something special about using real hardware, and the Jaguar's quirky architecture makes it harder to capture via emulation.
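The ratio arithmetic above is easy to sanity-check in a few lines. Here's a sketch of the 2-bit clock-select idea (with the 16.7 MHz x5/4 setting omitted, as suggested), starting from the Jaguar's ~13.3 MHz system clock (26.59/2 = 13.295 MHz NTSC); the table and function names are mine, purely illustrative:

```python
BASE_MHZ = 13.295  # Jaguar NTSC bus clock / 2, the "~13.3 MHz" in the text

# 2-bit select -> PLL multiply ratio (numerator, denominator)
RATIOS = {
    0b00: (1, 1),  # 13.3 MHz: safe/conservative baseline
    0b01: (4, 3),  # ~17.7 MHz
    0b10: (3, 2),  # ~19.9 MHz
    0b11: (2, 1),  # ~26.6 MHz: near-foolproof overclock for 20 MHz parts
}

def cpu_clock(select: int) -> float:
    """Return the on-cart CPU clock in MHz for a given 2-bit selector."""
    num, den = RATIOS[select & 0b11]
    return BASE_MHZ * num / den
```

Dropping the x5/4 option is what lets the whole thing fit in 2 bits, whether those come from jumpers or a software-writable register.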
  5. Looking over the schematics more myself, it looks like the Game SHIFTER chip (4118 ASIC) might not be the entire PANTHER video ASIC, but just the video shifter portion with pixel data registers and palette registers (32 18-bit CRAM entries) but not the object processor or line buffers. The Panther dev systems have an additional Toshiba gate array chip onboard, which may be the actual object processor chip with line buffers, with a data port for the SHIFTER to access the line RAM and DMA data from. It looks like there's multiple sets of 5 registers with 16-bit inputs accumulating to 5-bit outputs, which seems like it'd be used for bitplane data in 16-bit word chunks like the ST SHIFTER uses, just with 5 rather than 4 bitplanes max, which seems kind of inefficient for a system using chunky pixel data natively (no need for that much chip space used when you could just read and latch an entire chunky pixel at once, or 2 pixels packed into a 10-bit word; logically treated as 8-bit packed pixels in a 16-bit word, but could be done on 10 data lines and 10 and 5-bit data words internally). So it seems like Atari may have been planning to use a version of the SHIFTER with support for 5 bitplane modes at one point and recycled it as the video generator portion of the Panther, which would also mean they went through the trouble to have the object processor ASIC translate packed pixel data to bitplane words internally. (or assuming the CPU-addressable and object processing back buffer end does all work on 8-bit chunky pixel data, the translation would be through memory mapping on the output data port the SHIFTER reads from, assuming my inference is at all accurate) That in turn implies they could have been intending the 4118 ASIC itself as a standalone video chip in a computer, like a further upgrade to the STe's SHIFTER or a competing design that ended up scrapped. 
(like maybe lacking scroll registers or DMA sound support) Or is the scrolling and fine address control handled on the MMU/MCU end in the STe? Though the schematic drawings are dated well after the STe hardware was designed (or even introduced) and the 4118 chip uses an 84-pin PLCC package like the GST SHIFTER, so it may have been intended as a plug-in, pin-compatible upgrade to the STe. (or factory assembly line replacement, since Atari wasn't particularly keen on actually shipping upgrade parts, but then the STe did have its SHIFTER socketed, so it'd be an easy dealer-level chip upgrade or an end-user one) 5-bitplane modes with an 18-bit RGB colorspace would've been interesting though, and assuming they were targeting the same 8 MHz dot clock and bus timing arrangements, that would imply a 10 MHz CPU clock and 20 MHz MMU clock. (though they could've just used 32/5 MHz for the dot clock, so 6.4 MHz, or 6.44 MHz for the STe's NTSC 32.2159 MHz clock) The latter might actually make more sense given Atari seemed to be resistant to messing with the MMU timing at all, plus 6.44 MHz gives much closer to square pixels for NTSC (and should have less severe composite video artifacts) and would also fill out the screen to the borders, but conversely would tend to overflow 320 pixels off screen on TVs (or monitors without generous border adjustment) for around 307 pixels visible. That's also assuming a 320x200 resolution was intended and addressing matched that, but if they stuck with a 32kB screen space, 256x200 would use the exact same 32000 bytes as all the existing ST screen modes and also use the exact same screen and border space size. 
(albeit if that's the case, it would've been handy if they also let it switch to 5.37 MHz for a nice full width TV display and nice 1.5x colorburst in NTSC for good TV game purposes) Given Atari's track record with the ST by 1989, yet another 32000 byte screen mode with identical hblank/vblank timing to the existing STe seems pretty likely, and not particularly amazing but at the same time still better than what they actually put out there in the STe range. (and would've been particularly nice in the MEGA STe ... or a cheaper, cacheless 16 MHz STfm-cased STe+ sort of deal) That might also explain why the Panther's development got strung along through 1990: the 1989 system needing 2 chips for video and not being cost-effective enough, and they just stuck with the 5-bit wide line RAM throughout. (granted, upgrading to 8-bits and 256 colors would've made going to a single chip even harder, but then doing away with bitplane translation and doing all the video generation stuff internally with an 8-bit digital video bus output would mate well to a VGA style 256x18-bit RAMDAC for a more economical 2-chip solution for an 8-bit chunky pixel system, and would've made the Panther far more relevant in 1991) Gate array logic is also relatively easy to modify and usually fast to have new masks made for testing, so that should've worked out well too. (and the object processor architecture itself already treated line RAM as 8-bits wide logically, so programming wouldn't be any different, palette select was via an 8-bit linear offset for lower color depths, and the RLE pixel mode used an 8-bit color select with 3 bits totally wasted) I also assume a RAMDAC would be cheaper than repurposing the TT SHIFTER for a 2-chip solution using 256x12-bit color. (plus 18-bit color is nicer) Though come to think of it, the TT SHIFTER works similarly at least: it's got the 64-bit wide bus buffering data and spitting it out faster through a 16-bit port to the SHIFTER, right? ... 
sort of like it could've been used on a cheaper 32-bit wide set-up at half the resolution. (like 640x240x4bpp and 320x240x8bpp plus 512x200x6 or 400x256x6 on TVs/mid-res monitors and 320x480x4bpp at VGA res or 640x480x2bpp or 768x400x2bpp given 32 MHz dot clock on normal VGA monitor calibration would allow more like 819 pixels in the space 640 usually goes; also potentially 512x400x3bpp and 256x400x6bpp) I wonder if something like that was originally planned with the earlier 68020 machine developments and so-called SHIFTER II. (except it still would've been a fine fit for a 16 MHz 68020 or EC020 machine as a lower/mid-range mainstream market complement to the TT in 1990, or a cheaper version with 16 MHz 68000 with or without MEGA STe style cache, especially with a 32-bit latch connected to 32-bit RAM) Actually a 12 MHz 68020 on a 32-bit bus with the same old 500 ns ST MMU bus slots would work out with 3 wait states and 6 CPU clocks per access slot. Plus 12.5 MHz was the bottom-end 68020 offering, but I think that was only true somewhat early on and the 16 MHz grade replaced it. (and seems to only exist in the PGA version, not the PLCC or CLCC surface-mount versions) 16 MHz with 8 cycle slots would be OKish (still faster than the Amiga 1200), but much better with optional local fastRAM. 
(though a 16 MHz 68000 or 68010 using 8-cycle slots on a 32-bit bus latch could avoid wait states when doing 32-bit aligned operations and while filling prefetch without needing additional cache; also assuming data gets latched within 125 ns, which might mean changing MMU parameters for faster DRAM access timing even if cycle times stayed at 250 ns, which would be right for 120 ns DRAM since 187.5 ns cycles would probably be stressing that too far out of spec and need 100 ns chips: the ST's 248~250 ns cycles pushed 260~270 ns rated parts much more within other tolerances and operating temperature/other conditions) With normal ST MMU timing, you'd be stuck with wait states on a 16 MHz 68000, and mis-aligned 32-bit accesses (9 or 10 cycles rather than aligning with 8), but that'd still help for slower instructions doing 32-bit read/write operations at least. (especially stuff like 32-bit arithmetic, and really slow things like multiplication and division would get a big boost even if you had just the 16-bit bus latch to work with, for things like 3D calculations, 2D scaling, sound sample scaling or interpolation, DSP type stuff) For that matter, a stripped-down STe with that 16 MHz CPU and 32-color Game SHIFTER might have been OK as a game console in 1989. (or going really cheap and feeding the SHIFTER with a 26.85/26.6 MHz clock, so the MMU gets a 13.4/13.3 MHz signal, you end up with slower 298 ns DRAM cycles and can use cheap, old 200 ns DRAM like left over XE/XEGS DRAMs and 596 ns 68k access slots and a 6.7 MHz CPU speed or better: tap the 1/4 clock output from the MMU for 3.355/3.325 MHz and use a PLL for a 3x multiplier to 10.07/9.98 MHz and 6-cycle 2-wait-state bus slots, or just run straight off the 13.3 MHz clock with 8-cycle bus slots, though you could potentially add a little 16kB of 200 or 150 ns SRAM/PSRAM to work in without wait states) And even 128 kB of slow 16-bit DRAM would be a lot more flexible than the 32kB the Panther was planned to have. 
(64 kB double buffered framebuffer, 64 kB to work in, plus ROM could be on the local CPU bus and have fewer wait states and/or letting the blitter interleave ROM access with the CPU, or both: with 300 and 200 ns ROM cycle options interleaving in 600 or 400 ns slots and the blitter having 6.7 and 10 MHz clock settings while the CPU has 2-wait and no-wait state settings) There's also going the other direction and using a faster MMU with slower SHIFTER DMA and more CPU+blitter access slots. (using a simple 3-way split with 200 ns bus cycles using 120 ns DRAM and 20 MHz MMU, and/or a smarter MMU on top of that that gives access to vblank and maybe hblank SHIFTER bus cycles to the blitter and maybe CPU) Messing around with timing and bus sharing is also a lot easier on a platform that doesn't have to be backwards compatible. (plus they could've dropped the LMC1992 and used a more barebones sound amp circuit, but a nice hack would've been a 16-bit mono mode that connected to the 8-bit channels, or the same thing but also dropped the discrete 8-bit DACs for barebones R2R networks that could be chained to 16-bit mono)
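Several of the screen-size and bus-slot figures in this post are easy to get wrong, so here's a quick arithmetic check in Python (the function names are mine, not from any Atari documentation):

```python
def fb_bytes(width: int, height: int, bpp: int) -> int:
    """Framebuffer size in bytes for a given resolution and bit depth."""
    return width * height * bpp // 8

# A 5-bitplane 256x200 mode really does match the classic 32000-byte
# ST screen, same as 320x200x4, 640x200x2, and 640x400x1:
st_modes = [fb_bytes(320, 200, 4), fb_bytes(640, 200, 2), fb_bytes(640, 400, 1)]
five_plane = fb_bytes(256, 200, 5)

def slot_cycles(cpu_mhz: float, slot_ns: float = 500) -> float:
    """CPU clocks per ST-style 500 ns MMU access slot."""
    return cpu_mhz * slot_ns / 1000
```

At 320x200 the 5-plane mode would instead need 40000 bytes, which is presumably why a 32kB screen space points at 256x200; and the slot math gives the 6 clocks (12 MHz 68020, 3 wait states) and 8 clocks (16 MHz 68000) cited above.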
  6. https://www.chzsoft.de/asic-web/ There's a lot of neat leaked documentation, schematics, and some commentary on Atari ASIC designs there, some prototypes and some that ended up in the STe, TT, Falcon, and Jaguar. Also interesting to note the dates on various documents and technical drawings. (among other things, some of them go along with other info I've seen in old interviews or magazine articles pointing to the STe chipset being designed over a year earlier than it shipped and delayed for some reason: existing ST sales/demand being high enough is one cited reason; the TT chips also seem to be significantly older than most of the system, but that's also obvious from the 1988 copyright date printed on several chips actually used in production TTs) I'm not sure if this fits best in the ST/Falcon section or Prototypes, but it seems mostly to be Atari 68k based computer stuff, so seemed relevant here. The most intriguing thing in there for me is the GAME SHIFTER, which seems like it might be the PANTHER object processor ASIC or at least the video generator portion of it. It's numbered ST-4118 and that's also printed on the socketed 84-pin PLCC seen on the Panther development system. (and that chip has lines connecting to a resistor array containing 3x 12 resistor banks, almost certainly R2R networks for 18-bit RGB output, which the Panther used) The fact that it has an ST related part number could be related to the Styra Semiconductor Corporation as noted on that page, but many of the ST-related ASICs also share that designation, so it's not really clear. (and also not clear if Styra was even making those custom chips for Atari) It makes me wonder if the Panther graphics chip(s) were intended for an upgraded Atari ST as well. 
(if it was intended for the computer market, it would've been kind of weird, but neat in some ways: the line buffers could be exploited to line-double 15 kHz lowres framebuffer modes to VGA synch rates and use the same memory bandwidth, the object list processing could be used for GUI window acceleration, obviously TOS-support-dependent, and a few other non-game related things; and the Panther definitely used packed/chunky pixels and not bitplanes, so all the advantages of that as well, though the disadvantage of using 8bpp for just 32 colors due to the 5-bit line RAM and 32x18-bit CLUT, but 4/2/1-bit bitmap objects could use any consecutive set of those 32 colors via an 8-bit offset, like the Jaguar and also like the Lynx's 4-bit palette offset) **note one of the laziest uses for the multiple-framebuffer window and 32 color palette set-up would be an AGA Amiga style 2 playfield mode with dedicated 15 colors + transparent + BG color, or an even lazier OCS/ECS Amiga simulation of a single 16 color playfield and 15 color sprites. (far poorer utilization than optimizing graphics around the 8-bit offset feature of the palette, which itself is obviously far more work than if a full 256 entry CLUT and 8-bit line RAM was present, but still a lot better than the STe's color limitations and arguably better than the Mega Drive's 4x15 +1 color 9-bit RGB CRAM) If the Panther object processor actually used its line RAM more like the 7800's MARIA does, then 320x5-bits could be expanded to 800x2-bits or 1600x1-bit, which would be of little use to games but very useful for highres computer graphics. (more situations where lines could be double-buffered for efficiency/flexibility and potential line doubling, plus even more necessary if the Panther lacked the ability to chain line buffer reads, limiting screen width to a single line buffer's length) Using the ST Blitter would also be fastest on 1-bit bitmaps, so a highres framebuffer mode or drawing to multiple 1-bit object windows. 
(even more useful assuming the Panther can do opaque 2-color 1-bit bitmaps like the 7800 Kangaroo mode: rectangles without sprite transparency) Also worth noting: the Panther development board also uses the same 32.2159 MHz oscillator frequency as the NTSC STe uses. (good for monitor compatibility and system clock compatibility, not so good if games actually used the same 8.05 MHz dot clock due to the large border and NTSC color artifact issues, though 32.2/5 = 6.44 MHz would be much better and also have nearly square pixels) Presumably, in the ST, the Panther would be using the same 32kB of fast 32-bit SRAM (35 ns chips used on the dev boards, but being used at 31 ns) and could exploit ST RAM like 500 ns 16-bit ROM using the existing SHIFTER DMA cycles, except able to use an entire H-time worth of bandwidth (or multiple H periods) aside from the DMA sound access slots. (disabling interleaved DMA for 250 ns saturated/cycle stealing access would allow higher bandwidth at the expense of CPU performance, unless you also used a 12 or 16 MHz 68000 and gave the CPU access to all bus cycles as well for serial use of the full bus bandwidth; better still if the Blitter and DMA chip could use both even and odd bus cycles as well) Though aside from that, some 16-bit wide 120 ns SRAM used as dedicated object data/framebuffer memory could allow neat high bandwidth modes (and potentially CPU interleaved 250 ns modes too) and 256 kB of SRAM on an upper-scale model would allow single-buffered 1600x1200x1bpp 60 Hz or double-buffered 800x600x2bpp 60 Hz (and potentially 50/50 bus interleaved with the CPU/DMA chips). Which would be nice monochrome/grayscale workstation class resolutions in 1989/1990, sort of like a low-cost competitor to NeXT, especially if they included a 56k DSP option. (which has implications for both audio work and 3D graphics)
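The framebuffer size figures above are easy to verify; a quick sketch (assuming exactly 256 kB of dedicated SRAM and no overhead for palettes or object lists):

```python
# Rough check of the framebuffer sizes mentioned above.
def framebuffer_bytes(width, height, bpp):
    """Size of one framebuffer in bytes."""
    return width * height * bpp // 8

sram = 256 * 1024  # 262144 bytes

single_mono = framebuffer_bytes(1600, 1200, 1)    # 240000 bytes
double_2bpp = 2 * framebuffer_bytes(800, 600, 2)  # 2 x 120000 = 240000 bytes

# Both configurations fit within 256 kB with a little room to spare.
assert single_mono <= sram
assert double_2bpp <= sram
```

So either mode leaves about 22 kB of the 256 kB free for display lists or other housekeeping.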
  7. So the 5-cycle access timing did work? That'd have some significant implications for blitter texture mapping performance, among other things. (tests Kskunk ran years ago came to the conclusion that the blitter tops out at 5 cycles per pixel/texel while doing scaling/rotation or texture mapping, so you can go way faster than from DRAM, which takes 11 cycles, but won't be able to use the full speed of any of the onboard SRAM: and using the GPU scratchpad or even trying to reserve line buffer space as texture buffer/cache has other performance implications, too, like blocking GPU access to SRAM and depriving the GPU of what little high-speed RAM it has to work with anyway) The cheapest SDRAM currently on Mouser (https://www.mouser.com/ProductDetail/Winbond/W9712G6KB-25?qs=sGAEpiMZZMu4dzSHpcLNgnY7VH5pAzisodYhy9hoWwFIaNTUkscd%2Bg%3D%3D a 4-bank interleaved 64 Mbit 16-bit wide DDR2-800 chip, $1.37/100) is specced at 57.5 ns random read/write times at 800 MHz, so well below the 188 ns 5-cycle limit (and the 75 ns 2-cycle one, for that matter), though that's also DRAM controller speed dependent. Still, from the old SDR SDRAM datasheets I've looked at, pretty much everything is going to be way below that limit. The 70 ns FPM DRAM used in many Jaguars (some used 80 ns) can even do better than that with a fast DRAM controller, with the spec limit at 130 ns RC (but the Jaguar does 188 ns), though you can fudge that sometimes too, like Atari did with the 150 ns DRAM in the ST doing 250 ns RC when it was rated for 260. 
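Those nanosecond figures fall straight out of the cycle counts; a quick check, assuming the stock Jaguar bus clock of roughly 26.59 MHz:

```python
# Cycle-count to nanoseconds conversion for the Jaguar bus timings above.
BUS_MHZ = 26.59  # stock Jaguar system clock, in MHz

def cycles_to_ns(cycles, mhz=BUS_MHZ):
    """One bus cycle lasts 1000/MHz nanoseconds."""
    return cycles * 1000.0 / mhz

five_cycle = cycles_to_ns(5)  # ~188 ns (5-cycle ROM mode)
two_cycle = cycles_to_ns(2)   # ~75 ns  (2-cycle fastROM mode)

# A 57.5 ns random-cycle SDRAM comfortably beats both limits:
assert 57.5 < two_cycle < five_cycle
```

The same arithmetic shows why 3-cycle RC (~113 ns) is out of reach for 70-80 ns FPM DRAM rated at 130 ns RC, while a 2+2-cycle precharge + RAS-CAS pattern (~150 ns) would have been fine.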
(I'm pretty sure the issue is the Jaguar having only 3-cycle RC as the next step beyond 5 cycles, and 3 is too fast, especially the 1-cycle RAS-to-CAS: you'd need 60 or 50 ns DRAM for that, so you get 3+2 and 2+1 but not 2+2 cycle precharge + RAS-CAS timing, and 2+2 would've worked fine with the RAM they used, even the 80 ns stuff, and that's the timing Flare used on the 1993 version of their Slipstream ASIC; I suspect they were hoping for 32~33 MHz out of the Jaguar, probably using the same 32.216 MHz NTSC clock from the Panther and STe, but couldn't get the chips fast enough for it) If the firmware is updatable on the SDRAM cart, and the SDRAM could be reconfigured so the Jaguar can actually see some or all of it as RAM (treating it as slow SRAM/PSRAM), that'd be super useful. Not as cool as if the 2-cycle mode worked, but still cool. (it'd let you keep the DSP and 68k out of 64-bit DRAM as much as possible, much more than what ROM access already would allow) I don't think the Jaguar's memory interface would let you map cartridge RAM into the unused DRAM address space (particularly the unpopulated second 4 MB bank), but that'd be neat. The full 24-bit address bus is on the cartridge slot, but it'd be more a matter of what TOM's memory map actually allows across the cartridge port, and whether it only expects DRAM in that address range. (including the multiplexed DRAM address lines not present on the cartridge slot, sadly ... 
otherwise it'd have been relatively easy to add DRAM to games, too, though they also could've just run the DRAM connections to another super-cheap edge connector expansion port routed to the back or side) Texture mapping from alternate memory gives you other boosts to blitter performance, too, like avoiding any page breaks or read/write changes to DRAM access while rendering, which means you can mix and match texture mapping and gouraud shaded fill operations within the same drawing operation or even mid-line, plus you'd be able to read the Z buffer faster as well (though you have the 1-cycle read/write change overhead for that). One case probably useful for mixing textured and untextured surfaces and Z-buffer reads would be using untextured polygons in the distance (sort of like the low-detail textures used in the distance on platforms that relied heavily on texture cache space and bandwidth, especially if they did bilinear filtering), so instead of a blurry, smudgy lowres drop in distant detail, you'd get just plain, neutral-colored polygons fading into the distance. I assume the 2-cycle ROM mode is also broken for on-cart SRAM and using the various other I/O timings also isn't possible, otherwise that'd be really useful. (and small-ish chunks of on-cart RAM are sort of a retro-savvy trick too, i.e. relatively common on cartridge based game consoles with significant RAM limitations) 150 ns SRAM or 100 ns (maybe 120 ns) PSRAM would max out that 188 ns (5-cycle) cartridge bus timing, too, which would've been on the cheap/low end when the Jaguar was actually out there, but it just had nowhere near the support/backing (first or third party) that the likes of even the 7800 had (with the variety of games using 8kB and 2 or 3 using 32kB SRAMs onboard). 
The Jaguar CD didn't include any either, though I wonder if John Carmack had considered including some RAM on-cart to help with Quake on the Jaguar, given the very serious performance boost it would provide and potentially even allow a smaller and/or slower ROM to be used, or keep from needing more and faster ROM. (probably 64kB as 2 8-bit chips or a single 32kx16-bit PSRAM like the Sanyo ones Sega was using in the Mega Drive/Genesis Model 2) Faster texture mapping also means more bus time free for the other processors, so all sorts of nice gains there. (including more time/resources to spend doing some sneaky realtime or intermittent hidden decompression loads from ROM, both to DRAM and to update the texture buffer) Oh right, and the Blitter only does one pixel read/write at a time when texture mapping, so anything wider than 16 bits won't gain you anything. You could also let the 68000 work in that added RAM at times (along with ROM) and reduce the headache it gives to GPU/OPL/blitter accesses to DRAM. (and when the 68k is under higher priority, namely during an interrupt routine, the time it normally spends hogging the bus might actually allow some cycles free for GPU/OPL/Blitter activity ... I'm not sure how well TOM's memory interface is set up for interleaved bus access: i.e. if it at least has bus latches on the various bus masters to allow some parallelism a la ST/Amiga or Flare's own Slipstream from version 2.0 onward) Yep, but sometimes those problems can get fixed on the developer end by hardware and not just programming. Add a little bank-switching or memory mapping logic, some RAM, a sound chip here or there ... an I/O chip for extra controller ports. Or, of course, tons of extra ROM that was totally impractical early in the platform's life (or, as with homebrew, was totally impractical for all of its original life). You just see the 'use more RAM' 'cheating' on home computer (including old IBM-compatible) platforms more often. 
The irony with the Jaguar is that it had tons of RAM for a game console at the time it was test-marketed in 1993 or launched in '94, but to do that they also made some big trade-offs that didn't pay off, including not expecting the DRAM market to drastically jump up in price in mid/late 1993 and then stagnate for 2-3 years after that. (that burned Atari big time back in 1988/89 and it was happening all over again) The hardware itself is pretty flexible and could've had other configurations/uses (like on a graphics accelerator card) and supported 2 DRAM banks and variable bus sizing, plus a wide range of CPU architectures. It's really optimized for high bandwidth serial bus usage, which is great for fast, wide, heavily buffered or cached burst operations, but horrible for the few things that aren't suited to that (and horrible for any activity the DSP or 68000 have on the bus, as both have horribly slow bus interfaces on top of being limited to 16-bit width, and the 68k has no cache or local RAM of any kind to work in). Populating the second DRAM bank with 16-bit (or wider) DRAM would've allowed full-speed texture mapping and a nice big chunk of DRAM to pull textures from; having the DSP and 68k work in that (or on-cart RAM even) helps somewhat, but not nearly as much as if they'd been given their own slow bus to work on. (and probably putting the cartridge interface on that slow bus; it could be just 16 bits wide to cut cost and allow less fragile/finicky connectors to be used, more like the Mega Drive or ISA cards use ... less like PCI or VLB) Atari just would've had to cut corners elsewhere to push towards the $150 price point they wanted. (which, of course, that 2MB configuration failed to meet until 1995) They also didn't work around any of those main bottlenecks with the CD add-on. 
(like slapping a CPU-stand-in microcontroller and some RAM on there, or just another 68000 and a small ASIC handling interfacing with a shared block of RAM or something like that; another J-RISC chip as the MCU would've been neat, but made no sense with the low volumes Atari was dealing with by that point; licensing the Philips CD-ROM chipset was enough of a mistake from that standpoint, rather than buying totally off-the-shelf parts) Or perhaps the more conservatively sensible move of using the minimal 512 kB of 64-bit RAM (4x 1Mbit DRAMs instead of 4x 4Mbit ones) and sticking an expansion port on the back or side to add more later, especially to hedge their bets for the limited test market period. You'd want a local CPU bus there too, to hedge bets over the 'is the CPU really acceptable on the shared bus?' question. I'd say Jack Tramiel-esque conservative, except planning for modular expansion was one thing the ST line had chronic problems with. (though the 130XE and European 800XE at least had the ECI interface to mostly carry over the XL PBI ... and that's more than any standard model ST ever got, or even the Falcon for that matter) Jaguar owners didn't take to soldering RAM on the motherboard themselves, either. Or installing clip-on piggyback RAM or CPU upgrades from 3rd party kits. Wait ... actually, an ST/Amiga style 68000 accelerator board with cache would actually be pretty interesting. Or ... I wonder how well asynchronous overclocks work. (they're common on the Mega Drive, but that's with PSRAM and SRAM ... and VRAM through I/O ports, and ROM + wait states that's only sometimes fast enough to cope) I've seen full system Jaguar overclocks with modified BIOS ROMs, but not just asynch 68k clocks using an external oscillator and halt button + turbo switch, which might work depending on how wait states are dealt with. 
(or not even an external oscillator, but tapping the 26.6 MHz signal from the board; a lot more stable than the 10.7/13.4 MHz video clock signal sometimes used on the Mega Drive) Someone could still potentially fix that with some sort of Jaguar expansion module, maybe as part of some future flash cart project, and it might not be all that expensive to do with some savvy use of components, but then you'd need enough community interest to actually pursue it. (that and these days, the cheap add-on CPU/MCU + I/O + DRAM interface chip would usually be an ARM MCU/SOC, which can be very cheap but also way, way overkill: i.e. you'd want to consciously throttle the CPU performance to be realistic, outside of just wanting to max out the Jaguar's native graphics+sound pushing abilities, sort of like installing an old S3 ViRGE graphics card or Rage II mated to a 1 GHz Pentium 3 system ... or a 1.4 GHz Tualatin server chip) There are a couple of open-source 68000 FPGA core projects out there too, and that would be more retro-savvy for a Jaguar add-on, but I'm not sure it'd be realistic cost-wise compared to using an embedded ARM chip. (it'd have to be in the sub $5 range in bulk given there's a number of pretty versatile ARM cores below that, some not even in the bulk category, and I mean like 900 MHz Cortex A7 MCUs, 64 kB cache, DDR/2/3 SDRAM interface, flash interface, USB, UART, etc) Albeit all the cheap modern stuff (including DRAMs) requires I/O rail voltage conversion, though I don't think that's a big deal. Small-ish asynchronous SRAM chips were the only thing I found in the 5V I/O compatible range that'd be relevant to a Jaguar cart or expansion module project. 
(mostly 16-bit wide 1Mbit and 4Mbit chips in the $1.50-3.75 range, which might even be relevant to homebrew developers actually considering hard copy cartridge releases: a 12 ns 64kx16-bit chip, that's 128kB, at $1.45 per 100 units caught my eye while browsing: https://www.mouser.com/ProductDetail/ISSI/IS61C6416AL-12TLI?qs=sGAEpiMZZMt9mBA6nIyysK9MWTGEIBNWCJ0f%2B8johDE%3D) Having a C-friendly CPU/MCU with at least a modest chunk of RAM to work in (let alone a big chunk of SDRAM) would also open the doors to a number of homebrew developers who've had an interest in the Jaguar but passed it up at some point, in part due to portability of other projects. (I know Chilly Willy was looking at the Jaguar before he decided to go for Sega 32x and Sega CD development work back around 2009, and some of the Game Boy Advance homebrew scene ended up spilling over to the 32x; for that matter, a cheap ARM-based chip in a Jaguar cart would create a good deal of potential overlap there and with the Nintendo DS homebrew scene, among other ARM platforms: but especially quirky, interesting game console hardware platforms) Anyway, too bad this project is stalled for now, but I totally get crappy life stress issues and unfortunate turns of events. (it's one of the reasons I've been pretty much absent around my old forum hangouts and extremely sporadic for the last 5-ish years) I actually thought I'd dropped a brief comment in this thread last year after stumbling on it, but I guess I didn't. I also assume it's way too late to make any comments/suggestions on what might be worth including on the cart given it's at the ready-for-manufacture state, aside maybe from modifications to the firmware end, though if that's flashable, those updates can continue after the boards ship, too. 
That said, I did think it was at least interesting that there's a fairly cheap 8 MB (4Mx16-bit) PSRAM chip available on Mouser at $1.98/100 units currently, and that seems like a particularly appealing way to go about things. (it's rated for 133 MHz, though random read/write speed is 70 ns, which is still plenty fast for the Jaguar, and a small interface ASIC acting as a memory mapper and 32-bit bus latch could treat it as 140 ns 32-bit SRAM, still plenty fast for the Jaguar's 5, 6, or 10 cycle ROM modes, though at 16 bits it could use the fastROM 2-cycle mode if not for the bugs ... or depending what those bugs actually involve) It's barely more expensive than the cheapest 16 MB SDRAM I could find on there and interfacing is a lot simpler, but then you still need the bus latch and memory mapping logic along with the SD card interface. I also haven't been watching prices or doing any sort of prototype builds or projects myself that involve those sorts of parts, so I'm not sure how much those prices fluctuate. OTOH, that PSRAM chip might be appealing to other designers out there, or even some homebrew cart game developers. (if you're already capable of using 5V compatible ROM chips at acceptable prices then a smaller 5V rated SRAM would make more sense, but if you're dealing with ~1.8 volt I/O translation already, that PSRAM seems potentially intriguing)
  8. I don't see pinout diagrams listed online for the EXT port, but I'm almost positive it's a complete Sega Mega Drive controller port, just with the opposite gender. (so a 6-bit parallel port plus a serial data signal, I think through the 'select' line normally used to handle the button multiplexing) I don't think much or any homebrew stuff makes use of it either. http://www.sega-16.com/forum/showthread.php?5129-EXT-port-on-model-1 Now that one's pretty easy to find documentation for: http://www.hardwarebook.info/Mega_Drive_Expansion The expansion connector is much more limited than the cartridge slot; it's only got 17 address lines and lacks address 0, so it's a 16-bit word bus with a 256 kB address range. It's missing a bunch of neat, useful expansion signals the cart slot has and wouldn't be capable of 32x style genlock video overlay because of that, let alone the address space constraints. (the address issue is also why the MD end of things can only see portions of the MCD's memory and hardware in 128kB blocks with bank switching: also making it more difficult to port MD games to the CD) There'd have been a lot more freedom in interfacing had the normal cart slot been used to interface the Mega CD instead ... though a CD drive module more like the original PC Engine CDROM^2 wouldn't have been limited by that interface, and neither would the floppy disk drive add-on Sega had planned to use that port for initially. Sadly, neither the expansion port nor the cartridge slot exposed the 10 data lines needed for the MD VDP's pixel/color expansion bus (which would allow overlaying video in the digital domain and expanding the color palette/CRAM entries), as opposed to the more comprehensive expansion port the PC Engine has. 
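The 256 kB figure follows directly from the pin count; a quick check (assuming the 17 lines are A1-A17, with A0 implied by the word bus):

```python
# 17 address lines selecting 16-bit words, no A0:
words = 2 ** 17          # 131072 addressable words
bytes_total = words * 2  # 2 bytes per word
assert bytes_total == 256 * 1024  # 262144 bytes = 256 kB
```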
(more ironic given NEC opted to circumvent that with the Supergrafx when most/all of those features could've been done efficiently via a modular add-on, including one that simply swapped out the CD drive's base tray; and of course Sega had to jump through more hoops to implement the 32x and couldn't upgrade the native MD graphics output: the Supergrafx didn't upgrade the color depth, but it easily could have) The MCD has its internal mixing/output disabled by default (outputs just its internal sound through the RCA jacks and mixes audio through the MD), but when the aux audio input is grounded, it disables the sound going into the MD and mixes the MD audio through its internal mixing circuit, ending up with a cleaner, higher quality output. (moreso when the MD in question has poorer sound amp circuitry) I'm not sure how that works with a MD/Genesis II or if you're stuck with the crappy mixing on the MD side. All MCDs/Sega CDs came with an appropriate mixing cable, too, and all model 2 CD units also came with an extension block for the model 1 MD/Genesis. (also an RF grounding/isolation plate that screws onto the console and locks it into the slots on the CD unit) Early in the MD's life, Sega also offered stereo PC style amplified speakers you could plug into the headphone jack on the model 1 to get stereo sound output. http://www.sega-16.com/forum/showthread.php?16473-Does-anybody-own-a-set-of-theese That's just a type of dithering with the side effect of being easier to compress the graphics and/or end up with more solid looking blending, or sometimes even just looking better without composite video or RF smearing or artifacting. (the waterfalls in Sonic games using that technique tend to look OK through raw RGB or emulators, but plenty of other cases look rather bad) Results also vary widely depending on the video encoder used and in NTSC vs PAL. 
(PAL tends to smear/blend rather than artifact, but NTSC ends up with color errors and 'rainbow banding' in either vertical or diagonal lines across the screen: checkerboard dithering causes diagonal lines, 'strippy' AKA 'column' dithering leaves vertical bands) The effect is limited or absent in the lower res H32 mode (256x224) probably due to a combination of less chroma bandwidth required and the dot clock being a less fractional multiple of the NTSC color clock. (H32 is 5.3693 MHz, 1.5x NTSC 3.5795 MHz, but H40 320 pixel wide mode is at 6.7116 MHz or 1.875x chroma). The Sony CXA1145 encoder commonly used exhibits the rainbow chroma artifacts dramatically in composite video and faintly in S-video. It's also there with the later CXA1645 encoder, but more subtle and almost absent in S-video. The oft-dreaded Samsung KA2195D common to many model 2 Genesis consoles had generally poor and blurry composite output and lacks a luminance signal output (so S-video is impossible to get out of it), but seems immune to the rainbow artifacts and ends up blurring things so badly-but-evenly that dither-bars/strips end up blending very solidly and looking like actual translucency. (I suspect some games were developed with test/dev systems using that encoder, too, with art optimized around what it was outputting) It ends up looking more like really blurry PAL style composite video with horizontal smearing. (but without the vertical color blending/accumulation) The composite video filter/smear effect that Kega Fusion uses (or used to use) is very similar to what the Samsung chip outputs. I'm pretty sure that only holds true for the higher res H40 mode and H32 has coarse enough pixels to not blend 100%. 
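The dot-clock-to-colorburst ratios quoted above check out; a quick verification (frequencies in MHz, NTSC colorburst taken as 3.579545 MHz):

```python
# Verifying the MD dot clock / NTSC chroma clock ratios.
NTSC_BURST = 3.579545
H32_DOT = 5.369318  # 256-pixel-wide mode dot clock
H40_DOT = 6.711647  # 320-pixel-wide mode dot clock

print(round(H32_DOT / NTSC_BURST, 4))  # 1.5    -> a clean 3/2 multiple
print(round(H40_DOT / NTSC_BURST, 4))  # 1.875  -> the uglier 15/8 multiple
```

The cleaner 3/2 relationship in H32 is consistent with the rainbow artifacts being far less visible in that mode.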
(though it can be pretty close, and other artifacts aren't as bad, so games like Virtua Racing and some styles of FMV will look better with it) On top of all that there are 2 other things: one is analog video noise present in the RGB signal that (depending on the console and the TV, and not limited to the Model 1) has bright/dark vertical line variances on each column of pixels. That could be the jailbars some refer to, but similarly there's also jailbar style dot crawl present in composite video, worse on some TVs than others and generally worst with the Samsung KA2195D video encoder. (this is more often called 'picket fence' artifacts and not jailbars) I've also seen jailbars show up on some TVs in S-video or through RGB to component video adapters, but some TVs and monitors seem to hide or filter it out better. (CrossBow mentioned it's interference/crosstalk from the NTSC chroma signal, which it could be, but I'm not sure it explains the range of scenarios I've seen it crop up in; he also describes it as solid sections of color and not bright/dark shades on the luminance end) I'm pretty sure the rainbow banding is just NTSC chroma artifacting created by luminance data being misinterpreted as chroma due to the limitations of the video encoders used (typical chroma-luma crosstalk). You have the opposite problem (dot crawl, though on the MD it's typically vertical bars fringing areas of high contrast color, not swarming dots) with chroma corrupting the luma data, but that's only present in composite video and not in S-video. 
(an artifact created by combining the two signals onto one line rather than inherently created during the RGB to NTSC encoding process) Rainbow banding is almost certainly related to high contrast luminance regions getting mis-interpreted as chroma data in the classic NTSC composite video color phase artifacting sense (the same thing that gives you Apple II, CGA, etc artifact colors), and due to the dot clock being a somewhat ugly multiple of the NTSC chroma clock (15/8) the artifacts aren't solid/consistent at all and oscillate across the screen and fluctuate/flash/shift when scrolling occurs. The Sony video encoders seem to deal with the lower res 3/2x chroma dot clock (H32 mode) better and don't seem to have that problem, and as I mentioned above, the effect seems totally absent on the otherwise rather awful Samsung unit. (though it has awful picket fence or jailbar style dot crawl artifacts) Hmm http://www.sega-16.com/forum/showthread.php?24968-MD-Genesis-Jailbars-Revisited Skimming that thread, it seems to be an artifact of the RGB encoder itself, but not quite like I thought. It's an issue of the chroma clock (color carrier) signal being used by the encoder at all, and not a matter of interference of signals through traces on the board, but inside the video encoder chip (just disconnecting/lifting the pin on the encoder will basically eliminate the jailbar artifacts but also give you crisp, luma-only black and white composite/s-video ... like on the Amiga 500 TV out ... or if you wire a C64 or Atari 8-bit video cable wrong and put luma on the composite cable ... kind of ironic it was the Amiga and not Atari ST that did that, especially since the ST is way way less NTSC color encoder friendly with that 8/16 MHz dot clock, plus Commodore already had those chroma/luma S-video C64 monitors that should've suited the Amiga fine, so it's kind of weird ... 
granted, also weird Atari put separate Y/C output on their 8-bit computers yet didn't offer a Y/C monitor to exploit that ... or not even crisp/clear grayscale composite video cables using the luma signal alone for nicer/clearer text and sharp grayscale graphics more suited to the 'serious' computer scene ... or just people who wanted less eyestrain) Err wait, I guess it would be pretty easy to just wire the ST's RGB+synch into a weighted luminance signal via an adapter cable, but I don't think I've seen one of those. (would've been handy for the first-gen 520ST without composite/RF output and on later models if you didn't have an RGB monitor but cared more about picture/text quality than color) Then again, this coming from someone who had a used grayscale VGA monitor on a homebrew multimedia/family PC as a kid in the early 90s. (which worked well enough until we got some games using red/cyan stereo 3D glasses effects ... )
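The weighted-luminance mix such an adapter cable would implement in resistors can be sketched numerically; the weights here are the standard ITU-R BT.601 luma coefficients (an assumption on my part, a real cable might use cruder resistor ratios):

```python
# Hypothetical RGB-to-luma mix for a grayscale composite adapter.
def luma(r, g, b):
    """Weighted luminance from 0-255 RGB components (BT.601 weights)."""
    return 0.299 * r + 0.587 * g + 0.114 * b

print(round(luma(255, 255, 255)))  # 255 -> white maps to full brightness
print(round(luma(255, 0, 0)))      # 76  -> pure red comes out fairly dark
```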
  9. Instead of 2 look-ups, I think you could just use a single 256x2-byte table that takes an 8-bit PCM value (whatever format you want to use: unsigned, signed 2's complement, sign-magnitude, etc) and spits out 2 nybbles of POKEY volume data unpacked into 2 bytes. Then it should just be a matter of 2 writes to POKEY volume registers. That sort of look-up table system might be faster or friendlier on a processor with more registers to work with (load both 8-bit values into register space), but it's still probably the fastest option. You could also buffer some length of those volume byte pairs in zero page and keep overhead during the interrupt routine to a minimum, assuming you're using interrupts and not cycle-counted code for the playback routine. (you'd then probably have 2 sets of buffers: one normal, linear 8-bit PCM stream mixing buffer, for adding channels together and scaling note frequencies, etc, and then a second buffer made up of the converted bytes) Doing the same thing on a single covox style DAC would still be faster and a bit simpler, though. I'm not sure how 4 DAC ports would compare. (with that you've got less overhead on the mixing end of things, but you have 4 PCM streams to manage during the playback routine, and even more work than that if you're also doing frequency scaling during that portion of playback and not just reading from 4 PCM mixing buffers at a constant DAC sample rate) Albeit with the single 8-bit DAC you can also use interleave/multiplex mixing to allow full 8-bit samples to be played back at the expense of oversampling (and loss of effective playback rate). I.e. a 32 kHz playback routine could be used to interleave mix 4 8 kHz 8-bit PCM streams. 
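The single-table idea at the start of this reply can be sketched quickly; this is just the table-building and lookup logic, with the actual POKEY register writes left out (for an unsigned 8-bit sample, the table entries are simply the pre-unpacked high and low nybbles):

```python
# Sketch of the 256-entry lookup table: each 8-bit unsigned PCM sample
# maps to a pair of pre-unpacked 4-bit POKEY volume values, so the
# playback routine is just two indexed loads and two register writes.
TABLE = [(s >> 4, s & 0x0F) for s in range(256)]

def pokey_volumes(sample):
    """Return the (high, low) 4-bit volume pair for an 8-bit sample."""
    return TABLE[sample]

print(pokey_volumes(0xA7))  # (10, 7)
```

On a 6502 you'd realize this as two 256-byte pages (one of high nybbles, one of low nybbles) so each lookup is a single indexed load.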
That latter method of multiplex (or interleave) mixing would also favor the straight PWM technique, since the oversampling would shift the squeal artifacting well out of the audible range (and it would tend to get filtered out by the sound circuits anyway, either by intentional filters or just by exceeding the bandwidth of the existing circuits/amps/etc). With straight PWM you also just need 1 look-up from a simple 256x1 byte table, but the playback routine is still more complex. (using that byte to set a POKEY timer to count down the desired pulse duration/width, and if using interrupts and not precise code and/or polling timer status, you'd need 2x as many interrupts as a single covox channel) The hi/low 2-POKEY-channel method also still needs 2x the interrupts (using 2 POKEY timers), like straight PWM, but is simpler in using the same pulse-width setting at all times and not making PWM part of the sample-setting (or look-up) routine, but instead part of an 8-bit DAC emulator routine. PWM could potentially be used for better than 8-bit resolution, but using POKEY timers would make that tricky (at least in 8-bit timer mode), though there might be some other work-arounds. (like taking the POKEY channel being used for 256-step PWM and also modulating its 16 volume levels, so you get a linear 12-bit output) Given that'd need 12-bit math for adding channels, it's also probably not that useful compared to just adding to saturation at 8 bits, especially with preprocessed samples (so no need to clamp at 8 bits and prevent overflow errors), though it'd be a neat trick nevertheless. (like if POKEY had been included in the Atari ST) Except you could still stick with 8-bit precision and use PWM on a channel simultaneously using 4-bit volume modulation. You'd then just need 16 linear (or rather 15 linear, non-zero) pulse width steps to complement the 16 volume levels. 
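The interleave/multiplex mixing mentioned above (four 8 kHz voices round-robined into one 32 kHz stream, so each sample keeps its full 8-bit range instead of being summed and scaled) is simple to sketch:

```python
# Sketch of interleave ("multiplex") mixing: play one sample from each
# voice in rotation at 4x the per-voice rate, rather than summing them.
def interleave(voices):
    """voices: list of 4 equal-length 8-bit PCM lists -> one 4x-rate stream."""
    out = []
    for frame in zip(*voices):
        out.extend(frame)  # one sample from each voice per output period
    return out

a, b, c, d = [10, 11], [20, 21], [30, 31], [40, 41]
print(interleave([a, b, c, d]))  # [10, 20, 30, 40, 11, 21, 31, 41]
```

The ear (and the output filtering) averages the rapid alternation, which is why the squeal from any one voice's carrier gets pushed up to the oversampled rate and out of the audible range.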
Too bad POKEY only has the random pulse-wave and square wave outputs; if it had variable pulse width, you could use actual PDM (with tons of oversampling). Though using the random pulse-waveform might be interesting for trying to hide or dither the PWM squeal noise. (you might need to drop the channel frequency down close to or into the audible range to do that, and it might just make things worse, but it might shape the noise into more of a hiss) Or if nothing else, doing some sort of sample playback through a random-pulse wave tone output might make for weird/interesting distortion sound effects. The SID chip has 12-bit precision duty cycle control over its pulse wave channel, so the C64 could potentially use that for up to 12-bit DAC output (though simple 8-bit would be more useful), and while you could also use the lowpass filter to hide the squeal, you'd effectively be doing PDM and not PWM by having the oscillator set to max frequency and modulating the pulse duty cycle. (... and then this just feeds into the what-if where the Atari ST had been a Commodore product and ended up including one or more SID chips for its sound output, not that they also wouldn't go well in an Amiga ... or 2 SIDs with PAULA's L/R outputs wired through each, with SID filters and all) I think PDM is also a misnomer for the technique I originally suggested in the other thread: it's really just amplitude modulation, and even straight 4-bit PCM on POKEY is actually a special case of pulse-amplitude-modulation (since you're actually volume-modulating a square wave signal, not just a line voltage signal) ... which is also kind of funny given PAM could refer to the Atari 5200 as well. And since my suggested method (with the paired channels) uses a fixed pulse width on the low channel, it's still really just pulse-amplitude modulation. But that other method I just mentioned, combining a variable pulse width with variable pulse amplitude, would be some sort of hybrid pulse-width/pulse-amplitude modulation. 
(and unless I'm mistaken, that's not one of the earlier PWM techniques used in the demo ROM, I had the impression those were straight 1-bit on/off pulses with all the volume/amplitude data being expressed via the pulse width alone) You'd also still have the problem with audible squeal even if you did use the volume modulation, but maybe not as bad. (since the upper 4-bits would be handled by channel volume, that'd mean 0 would be silent with no squeal and that low/quiet samples would have proportionally quiet squeal, so you'd only get the full volume pulse-carrier frequency tone/squeal for samples in the 240-255 range, and for really loud samples the squeal also becomes less obvious and annoying) With the 2-channel additive 'PDM' method, the noise is just further reduced as the low-duty-cycle pulse channel is effectively at half the linear amplitude when it's on, so what squeal there is from that is going to be half as loud in the worst case and much quieter than that on average. In fact, you could reduce that to practically zero if you used 2 channels, but instead of setting the volume of both, the low channel gets modulated between volume settings 1 and 0 and you use PWM to achieve the 16 effective amplitude steps for the low 4-bits of pseudo-DAC output. Edit: re-reading my original post in that other thread, it seems I already suggested the single-channel pulse-width+amplitude modulation route as well. I'll have to check out what methods people have actually been working with. (OTOH I didn't specifically mention setting the second channel to volume level 1 and using very quiet PWM to achieve the low 4 bits of an 8-bit linear amplitude output) But given the actual implementation of the 2-channel method still ends up monopolizing 3 POKEY channels due to the way the timers have to be configured, the method of using a single channel for the sound output with PWM only used to provide the low 4-bits might be appealing. 
(I think you could get away with using 2 channels for that and leaving 2 free to use for other things; you'd need one channel with timer interrupts to set the sample rate and to provide the actual sample sound output, then you'd need another channel with a timer dedicated to providing the pulse width timing) You could even use a third POKEY timer to control a second pulse-width parameter and use that on the fourth channel to get 2-channel 8-bit DAC output, but that's more work and 3 interrupts instead of 2, plus twice as many look-ups compared to just adding channels together and outputting them as a single stream. It's a shame POKEY doesn't support a one-shot timer pulse mode like the MOS 6522 VIA does (and maybe RIOT), since that effectively does the work for you. (and you have 2 16-bit interval timers to use, so one could set the sample rate and the other could provide the variable-width pulse output ... say toggling GTIA's beeper output routed through a lowpass filter to cut the squeal) Atari already had POKEY covering the serial port end and VIAs had that bug to work around anyway, so it's obvious why they went with the cheaper PIA. (and RIOT would've presumably been more expensive, too)
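On the "actual PDM with tons of oversampling" idea a couple of paragraphs up: the simplest form of it is a first-order delta-sigma loop, where an error accumulator decides each output bit and the stream's moving average tracks the input, with the quantization noise pushed up to where a lowpass filter removes it. A minimal sketch (names and oversampling factor are arbitrary, just for illustration):

```python
# First-order PDM (delta-sigma) modulator sketch: 1-bit output whose
# density encodes the input level.

def pdm(samples, oversample=64):
    out, err = [], 0.0
    for x in samples:                # x in 0.0..1.0
        for _ in range(oversample):
            bit = 1 if x > err else 0
            err += bit - x           # accumulate quantization error
            out.append(bit)
    return out

stream = pdm([0.25], oversample=64)
assert sum(stream) / len(stream) == 0.25   # the average recovers the level
```

With a variable-pulse-width channel you'd effectively be generating this kind of stream in hardware; POKEY can't, which is the whole complaint.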
  10. A command cache (or scratchpad ... or just prefetch queue) for the blitter probably would've been the more elegant and practical solution, yes, but I was mostly just summarizing a comment kskunk made on the issue years back. That said, wouldn't the rasterization situation also be different on the Jaguar II, given Oberon's blitter had a trapezoid drawing function, so you'd only need to build lists of trap-segments to build triangles (or quads) rather than rasterizing line by line. (incidentally, I believe several of the early PC 3D accelerators worked on trapezoids internally for doing polygon drawing, and some are documented as such: the S3 ViRGE manual, at least, depicts the trapezoidal segments used for arbitrary polygon fill operations in its section on 2D polygon rendering) And on the texture mapping bottleneck, John Carmack's suggestion (in the context of something cheap and simple that they should have already included) was a 64-bit destination buffer in the texture mapping pipeline. (though given how slow the texture mapping unit is, per kskunk's 5-cycle peak test results, that wouldn't help all that much for raw speed, no more than populating the second DRAM bank the system already supported ... so it's somewhat moot there, though I'm also unsure Carmack was aware of the existing support in the DRAM controller or the 5-cycle bottleneck of the blitter) kskunk and several others (Gorf, Crazyace, I think maybe Atari Owl) also went over the problems using GPU SRAM as a texture cache, particularly how it kills GPU performance if used heavily; however, kskunk's later tests seem to point to use of the line buffers as texture RAM being a lot more useful, possibly also useful as a texture render buffer. (the latter wouldn't be faster per se, but rendering from line buffer RAM into line buffer RAM, then blitting to the framebuffer in 64-bit, phrase-aligned chunks, would greatly reduce time on the bus, and that means much less contention)
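To make the trapezoid point concrete: any triangle splits at its middle vertex into at most two flat-topped/flat-bottomed trapezoids, so a renderer only submits one or two primitives per triangle instead of walking edges scanline by scanline. A hypothetical sketch of that decomposition (this is the general technique, not Oberon's actual command format):

```python
# Split a triangle into screen-space trapezoids at its middle vertex.
# Each trapezoid is ((x_left, x_right, y_top), (x_left, x_right, y_bottom)).

def triangle_to_traps(verts):
    (x0, y0), (x1, y1), (x2, y2) = sorted(verts, key=lambda v: v[1])
    if y0 == y2:
        return []                    # degenerate: zero height
    # x of the long edge (v0 -> v2) at the middle vertex's scanline
    xm = x0 + (x2 - x0) * (y1 - y0) / (y2 - y0)
    traps = []
    if y1 > y0:                      # upper trap (top edge degenerates to the apex)
        traps.append(((x0, x0, y0), (min(x1, xm), max(x1, xm), y1)))
    if y2 > y1:                      # lower trap
        traps.append(((min(x1, xm), max(x1, xm), y1), (x2, x2, y2)))
    return traps

traps = triangle_to_traps([(0, 0), (10, 5), (4, 10)])
assert len(traps) == 2               # one trap above the middle vertex, one below
```

A flat-topped or flat-bottomed input naturally yields a single trapezoid, so arbitrary convex polygons reduce to short trap lists the same way.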
Honestly, for a game like Quake with the lighting model used, dropping textures entirely at a certain distance (Z-sorted as an extension of the existing ray-casting visibility system a la PC Quake) would've made a ton of sense to minimize texture mapping overhead. (PC Quake was also already heavily optimized to minimize actual rendering time, with lots of computation- or table-based optimizations to spend as little bandwidth as possible drawing to the framebuffer, and an engine like that would adapt well to the Jaguar's bottlenecks, albeit trading more of the tables for raw realtime computation and trading the Pentium FPU pipeline-specific tricks for other things) The SVP-chip version of Virtua Racing used a 16-bit DSP to handle the 3D math, yes, though I think it also assisted with drawing the polygons, using the 128kB of DRAM the cart included as a framebuffer as well as work RAM (or local memory for paging things in and out of local DSP memory). It's a DSP though, not a CPU or MCU (so unlike the 32x or even the primitive 16-bit RISC MPU in the Super FX chip) and not good for all that much else, not flexible general purpose processing like the Jaguar's GPU and DSP, but good for 3D and probably OK for simple pixel/line/block fill operations. (as a DSP it also should have done well as a sound processor, but Sega didn't use it as such ... no DACs or audio input lines connected on that cart) Unlike the Jaguar, but like the 32x, you did have multiple buses to work with, and the local DRAM was able to be flipped on and off the 68k bus for copying the render buffer into VRAM. 
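The distance cutoff idea boils down to a per-polygon renderer selection keyed off the Z-sort that the visibility pass is doing anyway. A hedged sketch of my reading of it (not an actual Jaguar Quake implementation; the threshold value is arbitrary):

```python
# Pick the cheap flat-shaded path past a Z threshold; only pay for texture
# mapping up close, where it's visually worth the blitter time.

TEXTURE_CUTOFF_Z = 512.0             # illustrative threshold, tune per scene

def pick_renderer(poly_z):
    return "textured" if poly_z < TEXTURE_CUTOFF_Z else "flat"

polys = [(100.0, "near wall"), (800.0, "far wall"), (511.0, "floor")]
modes = [pick_renderer(z) for z, _ in polys]
assert modes == ["textured", "flat", "textured"]
```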
Now, the MD's 68k was still fast enough to do some software rendering on its own, and having a much simpler DSP co-processor that simply handled the vertex math and left all the rasterization to the 68k probably would've worked better than SuperFX driven 3D on the SNES (or been competitive, at least), but there are no other examples of co-pro or add-on chips used on the Mega Drive at all, unless you count the Mega CD. (and unfortunately, unlike Mode 7 in the SNES, the fast multiplier unit that must be embedded in the Sega CD Gate Array for the scaling/rotation function isn't directly accessible to either CPU, otherwise it'd be handy for polygonal 3D when the scaling/rotation function wasn't in use ... really handy if they still let the Gate Array's blitter functionality work for simple copy/fill operations in proper nibble-precise tilemap pixel organization, but ... nope: honestly, with the amount of RAM it had along with that sort of hardware assist, I'd think it would've handled Virtua Racing well enough and probably a solid port of X-Wing, at least the more limited floppy disk version for PC, flat shading, 1MB RAM compliance and such) The Jag was way more powerful than any of that, though ... but yes, being able to interleave 68k access to some extent would give the advantages of a separate/local bus as on the MD. I'm not sure how the timing of the bus latches in the Jaguar works or if interleaving was really a major consideration, but it certainly had been when Flare designed the Slipstream and included a 16-bit bus latch to minimize time the 8086 spent on the bus (in that case interleaving on PSRAM cycles with the CPU working in SRAM or DRAM on the same bus: the video processor would only work in PSRAM, so the bus cycle interleaving was based around the 12-ish MHz video DMA clock, fetching a 16-bit word once every four 12 MHz clocks). 
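Rough numbers for that interleaving scheme: one 16-bit fetch every 4 clocks of a roughly 12 MHz video clock means display DMA only claims a quarter of those bus slots, leaving the remainder for the CPU and blitter when accesses interleave cleanly. A quick back-of-envelope check (clock value taken from the "12-ish MHz" figure above):

```python
# Display DMA bandwidth share under the Slipstream-style interleave
# described above (illustrative arithmetic, not a hardware model).

VIDEO_CLOCK_HZ = 12_000_000          # "12-ish MHz" video DMA clock
WORDS_PER_SEC = VIDEO_CLOCK_HZ / 4   # one 16-bit fetch per 4 clocks

bandwidth_mb = WORDS_PER_SEC * 2 / 1_000_000   # bytes/sec for display fetch
assert bandwidth_mb == 6.0                      # 6 MB/s, 1/4 of the bus slots
```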
The Slipstream 4 (1993 vintage hardware done in parallel with the Jaguar) switched to dual DRAM banks with page-mode support and up to 32-bit bus width, but the intended 12 MHz 386SX should still have interleaved with video DMA cycles for most video modes provided it worked in the separate DRAM bank. (the blitter and DSP DMA may have remained serial-only with the CPU, though) The 68k also doesn't need to hit the bus all that often to stay nearly at full speed, so having a feature for consistent, periodic wait states with slow-ish interleaved bus access would've made it a far better investment in the system as a whole, albeit using a 20 MHz rated 68k and running it at 3/4 the system clock (19.95 MHz) would've probably been more useful as well. (you don't need the 68k bus cycles to align with the system bus cycles anyway, not like ST/Amiga style 4-T-state interleaving, so the 1/2 system clock rate wasn't all that useful other than just being cheap/simple to divide) That and they probably spent way too much on Jerry given its limited use (it's a decent audio DSP, but too bottlenecked with its very slow bus cycle times and some other bugs, and the need to page code to local RAM to work, thus nixing it as a stand-in CPU: at least when coupled with the slow bus connection, as opposed to paging code modules to the GPU; the DSP even made a poor geometry processor as writes to main RAM were twice as slow as reads, slower than 68k writes in fact: 12 cycles for a 16-bit word, while reads were less crippled at a hard-coded 6 cycles). The DSP's slow bus cycles would have made it reasonable for interleaved access on a separate memory bank (the second DRAM bank and ROM bank) but otherwise it's pretty crippled and a serious bus hog. (compared to just including a rudimentary DMA sound/UART ASIC ... or possibly including an embedded, easily licensed low-cost MPU like a 65C02 or 65816 a la Lynx ... 
as a sound processor or maybe in lieu of the 68k, though it'd make coding in C a lot tougher ... for lazy C or 68k assembly source ports, for what that's worth) Well, that, or the Flare II team could have ditched the 68k and JERRY some time early in 1992 in favor of a nice, flexible microcontroller. Hitachi's SH-1 had just been released, and would've been really appealing as a low-cost CPU+sound processor combo, but aside from happening to catch that brand new design being released, there was also AMD's embedded range of the 29000 series, particularly the bottom end of the microcontrollers in the family, the AM29205. (both that and the SH-1 had onboard DRAM controllers, and both also used 16-bit data buses for lower cost and pin count, so it also would have been fairly simple to include a local, dedicated bus to work in rather than sharing the graphics bus ... Flare could've just dropped to a conventional main bus + GPU bus layout and also used a simpler, 16-bit cart slot and copied/DMAed data to the Jaguar graphics bus through I/O ports rather than sharing everything ... plus, no fragile and more expensive 32-bit MCB/VESA/PCI-ish connector for the cart slot to deal with, just something close to SNES/Mega Drive, or ISA slot pins) Though, that said, it also shouldn't have been too tough for Flare to include a (slow) 16-bit DRAM controller and basic sound hardware (possibly the 16-bit Flare DSP or just DMA sound) and UART on a 16-bit bus controller ASIC to complement a 13.3 or 19.95 MHz 68000. (a unified bus reduces cost, but swapping JERRY for a much smaller, simpler, and slower ASIC, possibly with a lower pin count, also saves costs and would have just made more sense ... 
it also could have used gate array logic: slower, lower density, but for a much smaller and simpler custom chip it would have the advantages of being much easier to prototype, faster to bug-fix, and cheaper to start up production than the standard cell ASICs used for TOM and JERRY) Also note, the 3DO was horribly bottlenecked when it came to anything short of solid-shaded polygons, as heavy use of textures (3D or 2D) was mutually exclusive with CPU operation since main RAM was texture/sprite/etc RAM, plus it used forward texture mapping (like the Saturn, and the Lynx for that matter, though that's just sprite-drawing) where texels are read out a line at a time and drawn over multiple times to the screen if down-scaled or folded, reducing fillrate further and also breaking translucent blending and gouraud shading. (or corrupting both due to drawing folded pixels multiple times and warping the shading gradient) Plus you had strict library-level coding on the 3DO without the ability to clean up bad compiler output with some hand-tuned ARM assembly. If the Jaguar had a bit of smart bus interleaving on multiple memory banks, the 68k might not have fared that badly next to 3DO games. (albeit the CD-ROM mass storage issue was a factor for actual software development ... and for PC games that the Jaguar really would've been well suited for: especially various flight/combat sims using shaded 3D or limited texture mapping ... and heavy keyboard commands that made the Jag-pad really appealing) They weren't greedy, they were poorly managed (I blame Sam Tramiel mostly) and extremely desperate, plus it was also just poor timing as DRAM prices stagnated from 1993-1995 and finally dropped again just after the Jaguar was discontinued. (by fall of 1996, the Jaguar Duo could probably have been a genuine low-cost/budget range alternative to the PSX and Saturn ... 
and the idea of including a RAM expansion cart as standard and offering it at a very low price to existing users would all have been feasible) But in 1993, Atari was desperate: they had a big lawsuit over old patents pending with Sega (which would create a windfall in 1994), but in the meantime they were struggling, downsized to a skeleton of a company, and made the decision to discontinue their computers, somewhat marginalize the Lynx, and put a ton of effort into a Jaguar propaganda campaign to drum up investor cashflow, and it worked: it scared the crap out of Sega (at least the Japanese executives) and got Motorola and IBM onboard for manufacturing along with sufficient investment backing to bring the thing to market. Still, it was wholly mismanaged and the UK/European market entrance was both late and particularly poorly supported ... all really bad decisions on top of cancelling the Falcon. (cancelling the Falcon in the US and continuing to market computers only to the smaller, easier to support/market to UK and European market, or at least the UK, France, and Germany, would've been more reasonable ... more so if they'd worked the Jaguar chipset, or TOM specifically, into a second-gen Falcon project, like as part of the Falcon040 or a lower-cost '030 counterpart) The further irony, of course, is that CBM fell out of the computer market, leaving a void for the Wintel invasion to finally take the UK/Europe at a time Atari might have continued to compete (especially with the shift towards open-source GNU OS extension development with MiNT/MultiTOS), plus Sega ended up dropping the Game Gear, leaving even less competition for the Lynx. (plus the Game Boy Color failed to even match the Lynx's hardware, and continued cost/size/power reduction left tons of room for Lynx updates to compete) Atari's lack of a significant mainstream game console from 1990-1993 (or given the Jaguar remained niche ... 
from 1990 onward) was a big gap on top of the ST and 7800 sales struggling somewhat in '89, and Sam Tramiel's management ... or perhaps more specifically: Jack's retirement as CEO and Mike Katz's leaving as president of the Games division seriously crippled the whole operation. Katz thought failing to agree on terms with Sega for Mega Drive distribution was a mistake, but even that aside, I can't imagine he couldn't have guided things better after that with the Panther development, Lynx release, Panther cancellation and possible short-term alternatives, etc. (they needed something reasonable to launch in 1990-91, maybe 92 ... and a 'fixed' Panther without so many shortcomings, or a derivative of one of the many incremental developments of Flare's Slipstream ... or a much more conservative Jaguar, which would also fall into the 'fixed' Panther configuration: i.e. support for FPM DRAM operation, enough RAM for a decent-sized framebuffer, addition of a blitter, integrated sound hardware, and some intention for polygonal 3D, but without the custom RISC architecture being implemented: the Flare DSP was enough for geometry co-processing along with CPU-assisted triangle set-up and blitter line fills) Anyway, I wouldn't blame greed as one of the Jaguar's main problems, or Atari's ... though lack of honesty might have been a major problem along with poor management and negotiation skills on Sam's part. (dishonesty with developers during the Jag's lifespan seemed to be one of the problems ... dishonesty with investors was too, which was forgivable to some extent in 1993 with Atari being on the verge of collapse, but much less so after they got market recognition and just seemed to go ... weird or incompetent with what added resources they were afforded) The late introduction of the CD add-on, DRAM prices keeping the base system price point high, and Sony's splash in the market all didn't help, of course. 
Actually, with all that in mind, Atari probably made a bad bet dropping computers in favor of a new game console ... the Jag chipset (or just TOM) might have been more successful in the Falcon series than it ended up as a console. (as it was, I think the Jaguar's sluggish sales didn't compare too well to the Falcon's sales for the short time it was on the market) Plus the DRAM cost overhead was a lot more justified in a computer system, and floppy disk software was the norm, so no cart vs CD headache to decide over (or be the only console on the market using floppy disks ... especially in 1993/94), plus ... TOM would've been a lot more potent alongside a 68030 on a dedicated bus (even the 16 MHz 030 on the 16-bit bus of the Falcon 030), and that's not just hindsight ... though obviously pure spec-fic fantasy. (well ... I can't help but imagine Jack Tramiel would've put more interest in working a potent new graphics/multimedia processor into a low-cost, mass market home computer rather than 'just a games machine' ... but ... ) Oh, and, back on the topic of real-world relevant stuff: I'd missed out on the new (or last couple years of) Jaguar Flash cart development project (and the thread on that), so my comments on a RAM cart earlier in the thread are a bit moot there, as such a RAM cart is in the works, just not Jag-CD oriented. 16 MB of SDRAM acting as cart ROM or RAM is pretty neat, though I'm not sure if the 16 or just 6 MB (no bank switching) is planned on being implemented, but the project looks super neat. (and totally relevant to some neat workarounds for homebrew programmers to exploit ... provided folks are interested in that, and interested in digital distribution of freeware or online store style software distribution ... or crowdfunded early access sponsored stuff with free distribution later on ... 
that and just freely distributed tech demos and hacks, as are common for homebrew on a bunch of old game systems and several computers, even 'obscure' or 'failed' ones like the 32x)
  11. Yes, sort of: http://www.konixmultisystem.co.uk/index.php?id=interviews&content=martin Martin Brennan was brought in to consult on the Panther project in 1989 (the production-ready 8086 version of the Slipstream ASIC was completed by then). John Mathieson would join the Flare II (Jaguar) project later on, around 1991 I believe, while Ben Cheese (the DSP and sound guy from Flare I) would move on to Argonaut and design the Super FX GSU, then help found the Argonaut RISC Core spin-off company. (note the GSU-1 was not a DSP like in the Slipstream ASIC, but a fast little 16-bit RISC CPU: 16 16-bit registers, 16-bit address bus, 8-bit external data bus, and the multiply-accumulate performance was poorer than the Flare DSP's: 1 cycle for an 8x8=>16-bit multiply, 4 cycles for 16x16, vs a 1-cycle 16x16-bit on the DSP, but as a CPU it was much more flexible and could run most of the game engine on its own, plus do the polygon drawing operations in its 32kx8-bit SRAM chip, and was optimized for bitplane and tile conversions). http://www.konixmultisystem.co.uk/index.php?id=downloads (see the Slipstream 1.06 documents for the 1989 8086 production version) Anyway, Brennan was brought in on the Panther project. Meanwhile Konix was having trouble and LucasFilm/Arts decided not to go through with their prior considerations of licensing the Slipstream chipset for the US market. (Konix had a non-exclusive license, so Flare could have sold it to anyone else on varying terms, sort of like the Amiga chipset prior to the 1984 CBM buyout debacle) They also continued developing the Slipstream in parallel with the Jaguar and expanded it in various steps up to a 32-bit data, 24-bit address bus, 2-bank DRAM based system with a somewhat Jaguar-like blitter, 25-ish MHz suggested clock rate, support for a variety of CPUs (though a 12 MHz 386SX was the working model of 1994), a 25 MHz DSP, and a CD-ROM controller/interface. 
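Those multiply cycle counts line up with how the hardware would actually do it: a 16x16 multiply decomposes into four 8x8 partial products, which matches 1 cycle for 8x8 vs 4 cycles for 16x16 on the GSU (while the Flare DSP did the full 16x16 in a single cycle). A worked check of that decomposition:

```python
# 16x16 multiply built from four 8x8=>16-bit partial products, shifted and
# summed -- the arithmetic behind the GSU's 4-cycle 16x16 figure.

def mul16_via_8x8(a, b):
    al, ah = a & 0xFF, a >> 8
    bl, bh = b & 0xFF, b >> 8
    return (al * bl) + ((al * bh) << 8) + ((ah * bl) << 8) + ((ah * bh) << 16)

assert mul16_via_8x8(0x1234, 0x5678) == 0x1234 * 0x5678
```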
It also worked on 15/16-bit RGB rather than CRY color and had the whole system contained in one ASIC. (DSP+blitter+VDC+UART+CD-controller) Though the CD-ROM interface was apparently buggy at the time. See: "Slipstream Rev4 Reference Guide v3.3" http://www.konixmultisystem.co.uk/index.php?id=downloads Now, what I wonder about is why Martin Brennan moved forward with the Panther project while not pitching the Slipstream to Atari. (or maybe he did but made no mention of it in the interview) It was a nice little flexible, reasonably potent system, though it had its share of limitations compared to the Mega Drive (already on the market, and Atari Corp themselves had reviewed the hardware in 1988 and decided not to take Sega's licensing/distribution terms for North America: Mike Katz had wanted to, Jack Tramiel and Dave Rosen couldn't agree on favorable terms, plus they'd still be contending for the UK/European market) Plus it was ready made and ready for mass production, and the chips were made on gate array logic so should have been fairly adaptable to second sourcing to whatever vendors Atari had the best deals with. On top of that it had software in development already on the Konix end, a bunch of UK developers familiar with the hardware and its quirks, and a somewhat home computer or PC style architecture in general that would lend itself well to computer game ports (plus actual IBM compatible ports using 8086 Real Mode ... ugly, yes, but for PC games already coded for such, or for 286 in 640k or less of RAM, it would be a smoother transition for assembly-language coded games) It relied on PSRAM to get good bandwidth and do some interleaving, though it had DRAM for the blitter and CPU to optionally use (up to 256 kB PSRAM and 512 kB DRAM) and was fastest at rendering 256 color graphics. 
(using an 8bpp chunky framebuffer at 256x200 or up to 256x256 display, and allowing more compact 16 color 4-bit sprites/objects to be used via a mask register; 16-color framebuffer modes were slower to render to as the blitter had to do reads before writes to work on nybbles). It also had a fast DSP and fast line-fill operations useful for doing flat shaded polygons or a mix of other effects (including scaling or even scaling/rotation texture mapping type effects) in realtime. (though sound engines doing DSP-intensive synthesis routines would make that hard; ones just using sample-based sound like Amiga MOD or such would use very little DSP time at all, especially for 4-channel music and a couple of SFX channels, even if doing ADPCM decoding as well) The 6 MHz 8086 was slow, but relatively cheap. However, it would've been a bit painful to adapt to ROM carts due to the 1MB address limit (and less than 256 kB were reserved for ROM in the Slipstream). OTOH the system was intended to use 880 kB floppy disks instead, and a DSDD floppy drive would add to the base unit cost, but make it even more appealing to computer game developers (and a lower risk all around than manufacturing masked ROMs ... something impossible for some smaller devs and publishers at the time). Plus you could make big home computer/PC style multi-disk games with lots of multimedia features. Given Atari's focus on the home computer game license side (during the 7800 era) on top of its library of ST games, plus its existing supply chain of DSDD floppy drives for the ST line, it would seem an appealing option. (plus the proprietary 880k format and some level of encryption would be appealing for copy protection, and would also avoid the need for funky floppy disk DRM schemes typical of the era) The Panther OTOH was half-finished, rather odd, and not all that cost-effective. 
(to keep costs down it used 32kB of 32-bit VERY fast SRAM, we're talking 35 ns, like 25 MHz 386s and 486s were using for cache, but aside from a proposed 64 kB of DRAM for the Ensoniq DOC-II sound chip, that was it for onboard RAM) It worked like the 7800, using cycle-stolen DMA to load sprite pointers and data (it was a direct precursor to the Jaguar's Object Processor, but with no optimization for DRAM use) and would mainly rely on reading from 32-bit wide ROMs. It had a 16 MHz 68000, but with the existing 1989/1990 configuration, the 68k would spend tons of time halted for DMA, much like the 7800's 6502, plus it'd have wait states if working in slow/cheap ROM, while fast ROM (like the PC Engine/TG-16 used) would've been really costly for a company like Atari (NEC had in-house manufacturing but still typically used half the ROM size of contemporary publishers: like 256k where the MD was commonly using 512k in 1989) and while adding some more hardware could have fixed some of that and potentially cut costs by removing the Ensoniq chip (say a DMA sound + bus controller + DRAM interface chip) and allowed use of slow and even 16 or 8-bit wide ROMs loaded into DRAM, that was yet more added work and not something that even happened up to 1991 when the Panther was formally cancelled. So given the Panther lingered on in development to early 1991, the ready-made Slipstream becomes even stranger to pass up, plus tweaking things to allow a 12 MHz 68000 or 286, given the added year of development time, should've been child's play compared to completing and fixing the Panther. (68k would be cheaper and generally more friendly, but 286 would make existing Konix dev work easy to port, plus lots of PC games ... either case would also give 24-bit address space to use for cart ROM if they decided to ditch floppies) DRAM prices had also dropped a great deal in both 1990 and 1991, and loading the maxed out 512kB DRAM would've been easy. 
(128kB PSRAM would've been enough for most purposes as well, though 256kB would be nicer: you only need 128k to double buffer the max 256x256 PAL screen, but having fast RAM left over for DSP DMA use would be significant, including doing 3D matrix processing and spitting out vertex data) *Note, they could easily have just kept the 1MB address space limit for the chipset itself and let the host CPU alone work in 24-bit space. (that'd sort of be like an Amiga based console where the OCS could just access 512kB and most/all ROM stuff would be up to the CPU copying to RAM as needed) Oh, and Atari had already been sourcing 12 MHz 286s for their PC line around this time, so that would be another consideration for that choice. I'd say floppy disks would be the most novel option at the time and bridge the gap between cart and CD based consoles. Albeit on a purely engineering note (and one Kskunk made years ago), a CD-ROM drive is actually cheaper to manufacture than a DSDD floppy drive (and vastly cheaper than something like a ZIP drive or LS disk drives), but the tech was all patented and had a premium on it in the early 90s and also didn't have the raw volumes for economies of scale quite yet (the Jaguar was released around the time the scales were tipping), so a common DSDD floppy drive would be the cost-effective mass storage option in 1989-1991 for sure. (720k PC/Atari, 800k Apple, 880k Amiga, all the same drive and disk track system, though using different sector sizes, also little endian data for PC, same for the Slipstream ... ignoring a 68000 based one) Oh, and the Multisystem's post-Konix era development as a set-top box included 286, 386, and 68000 configurations, mostly at 12-12.5 MHz. Or at least the 68000 came up at one point: http://www.konixmultisystem.co.uk/index.php?id=multisystem2 That whole situation was a mess (not the hardware, but ... 
Wyn Holloway's end of things) As an aside, I think failing to capitalize heavily on the home computer and PC/DOS game market was one of the Jaguar's failings, though also one partially forced by using ROM carts. The keypad on the controller and the capabilities of the system would've made it really neat for early 90s PC games, 3D and otherwise, including Wing Commander I and II (III would need CD), X-Wing, various Lucas Arts adventure games, etc. (most of that, sans full-on FMV games, could be done via floppy, but I don't think 1.76 MB DSHD floppy disks would've been all that appealing in 1993/94 ... or more likely to get weird looks than 880k would have back in 1990) The Panther's gamepad was essentially the same as the Jaguar's, so equally well suited to keyboard-heavy games (with or without overlays), but the 3 face buttons would've been much less outdated for 1990. (they used the same I/O port mapping as the STe joyports anyway, so STe/Falcon games could use them) Albeit, if using the existing I/O ports the Slipstream ASIC had, you'd need to reduce the number of key inputs, or add another chip for expanded I/O. There are 16 I/O lines for the joysticks already, plus 3 potentiometer inputs and a light pen input, so you could have partial STe port compatibility with 8 bits of I/O per channel, 2 analog POT/paddle/axis inputs on one port and one paddle plus a light pen (light gun) input on the other. Doing a bit of multiplexing like the Mega Drive did (6 bits of I/O in that case, though only multiplexing 2 of those for more buttons) would've been one route to get the full pinout. (plus doing 8 bits + ground is already going to make a pretty thick joypad cable, and doing the full 12 bits of I/O the STe/Jag used would be less than cost-effective) *Of course, the STe's ports were originally intended to allow splitters for 4 single-button Atari joysticks or 4 paddles, and the pin designation heavily points to this. 
http://old.pinouts.ru/Inputs/EnhancedJoystickAtari_pinout.shtml (neat, but overkill) The cost of a little multiplexing logic would be well worth avoiding thick, expensive, awkward cables in any case. (Nintendo OTOH had been using serial based controllers since the Famicom, but the approach at hand is already 8-bit parallel oriented and multiplexing that would be pretty safe to get a good cost compromise ... you could also use an analog matrix like the VCS keypads, but that's both odd and not really cost-effective by then: Gravis used analog lines for its gamepad's d-pad, but that was partially due to making it compatible with 2-axis analog joysticks, allowing normal joystick games to use 8-way digital control via a primitive resistor DAC) Also a side note on the Falcon: had Flare spun off a cut-down DSP-only ASIC designed to work around the Falcon's DMA/bus timing, with a bit more on-chip RAM (like 2kB rather than the 1kB of the 1989 Slipstream ... or technically 1.5 kB, but the last 512 bytes doubled as CRAM and were used up when all 256 colors were employed) and run at 16 MHz, it would've been a major cost saving measure over the Motorola 56k and its 192 kB of super-fast 25 ns SRAM (that's 33 MHz 386/486 cache RAM there). That and possibly ditching the added Falcon sound channels in favor of 16-bit PWM DAC output from the DSP (at 16 MHz, the existing PWM registers would allow up to 125 kHz 14-bit stereo, up from 93 kHz in the standard Slipstream at 11.9 MHz, though somewhat less than the 208 kHz 14-bit stereo the Jaguar was capable of: all systems used pairs of PWM registers to generate 7 bits each and add to 14 bits ... though the PWM registers in JERRY might be broken on the Jaguar, as I think it used external 16-bit DACs). You'd need the 8-bit STe PCM channels there for compatibility in any case. That or the Falcon DSP should've been an optional module. 
(neat for a dedicated sound processing system and neat for 3D coprocessing, but a significant detriment to the price point ... then again offering a MEGA STe style 16 MHz 68000+16k cache in place of the 68030 would've also been an appealing lower-end option in 1992, and might be faster than the 030 in situations where the tiny 256 byte caches it had were insufficient and VIDEL was stealing tons of bus time, like in the 16-bit color mode or even some of the 256 color modes: a plain 68000 working without wait states in 16 kB of cache would have lots of appeal there ... oh, and the 16 MHz blitter would get much more use than for just compatibility) And for anyone wondering: the existing Slipstream would've been a poor add-on for the STe as it supported 256 and 512 pixel modes that would leave huge borders if synched to ST SHIFTER pixel rates (8/16 MHz), plus that'd require a 16 MHz Slipstream and faster PSRAM anyway. (commissioning the Flare team to design an ST-flavor ASIC would've been interesting ... and more worth their time than the Panther IMO, but then VIDEL was really OK as it was in 1992 and offered good backwards compatibility, while the Jaguar was really epic at the time and a bug-fixed JERRY chip given dedicated RAM to work in and genlocked onto Falcon video would've been awesome in 1994, possibly as part of the cancelled Falcon 040 ... though a 24-40 MHz 030 based system with 32-128 kB of board-level cache would've been fine as well, competitive with the 40 MHz 386 systems still popular at the time and then some ... but a 24 MHz 68040 would certainly have been nice; 40 MHz 030 is just kind of nice given it's really easy to get off the ST-compatible 8 MHz base clock, also nice for the 26.6 MHz the Jaguar chipset was managing: ie 2/3 of 40 MHz) Oh duh, I forgot: the Slipstream hardware also would've been really appealing to cross-develop Lynx games for. 
The CPU architecture is different, but the mix of blitter+coprocessor+framebuffer and packed pixels was quite similar, as was the 3D/pseudo 3D functionality and emphasis on mass storage. (the Lynx's chipset was originally going to use tapes, but very slow, cheap 8-bit ROM ended up being the practical solution for a handheld ... meanwhile floppies were the go-to option for a console) And following suit from the 7800 (and Epyx connection) the Lynx was already leveraged fairly heavily towards the computer game pool of developers and publishers. Ah, the Slipstream and Lynx also both used 12-bit RGB, like the Amiga and STe as well. (the Lynx and STE were just limited to 16 colors ... though for the Lynx's screen that was arguably overkill: same for the Game Gear doing 31 colors in 12-bit, aside from a few games using the Master System mode) As for exclusive games: Starglider III was planned (though probably had some elements re-used for Starfox after being cancelled), and Jeff Minter had a lot of neat ideas going on, including Attack of the Mutant Camels now playable via emulator. (though the sound is a bit bugged, or it's due to lack of lowpass filtering) Edit: I forgot: by 1989, Konix had moved on to a 256 kB fully-loaded PSRAM configuration due to complaints from developers running out of memory, particularly for games using page-flipping (double buffering), though DRAM still wasn't included as standard at that point, I think. On that note, Atari could've released a 256 kB system and stuck in a pair of 30-pin SIMM slots for RAM expansion. 
(a nice cost-effective idea at the time and borrowing from the ethos of the STe, but in hindsight a VERY good idea as not only did DRAM prices drop fast, they then stagnated in 1992/93 while the price of 256 kB SIMMs dropped through the floor due to the limited demand: hence the popularity of SIMM savers at the time to re-use older low-density SIMMs as 1 MB) Plus slow, old, 150 ns DRAM would be fine in the Slipstream, as would anything newer, so they could literally use refurbished SIMMs if they wanted. (and people could upgrade using cheap second-hand SIMMs or cheap overstock/surplus ones common on the market) This was also evident in ST magazine ads at the time that had the 512kB upgrade kits much cheaper per-byte than all the other options. (520 STe to 1040 STe upgrades were cheap, just add 2 256 kB SIMMs, while other configurations required 1 MB SIMMs or possibly non-standard 512kB SIMMs, which were much more expensive for the same amount of RAM, but you only got four slots on the STe, so 1MB was the max using 256k SIMMs ... and SIMM savers wouldn't fit in an STe case, maybe the MEGA STE) That situation with 256 kB SIMMs is narrated here from the early 90s PC perspective: http://www.redhill.net.au/b/b-93.html http://www.redhill.net.au/b/b-94.html That RAM price stagnation was very much like what had crippled Atari in 1988, driving the price of the 520ST up to Amiga 500 levels and killing the 1040ST's potential to become the mainstream/baseline standard. (for general-purpose use a 1040ST at a similar price to the Amiga 500 would be an obvious sell, but even for games, the added RAM and savvy RAM-centric tricks like heavy use of pre-shifted animation would've made the platform cut in further to the Amiga and general computer/console game markets ... 
potentially even better sound, given larger samples for software MOD or, better, ST-specific tracker formats for intros, cutscenes, and non-gaming purposes) Had they standardized the Blitter with the 1040STf in '88, that'd also boost things a bit, including for 3D stuff. (8 MHz 68k + 1 MB of RAM, look-up tables for faster multiplication, and a blitter for fast polygon fills ... also faster at sprite drawing and block-copy, and realtime bit-shifts rather than pre-shifting; a significant boost even without hardware scrolling ... also more potential to eat up CPU time doing interrupts for sample playback) In any case, RAM prices jumped up in '88 and stagnated. (they didn't jump as much in '93, but they stagnated heavily to 1996) See: https://phe.rockefeller.edu/LogletLab/DRAM/dram.htm under "Annual DRAM price-per-bit ($ per Mbit)"
  12. Why not just use an SD cart that uses SRAM as the simulated ROM, but allow for games to reserve part of that address space for variable use? (or rather than 'reserve', just require software to manage the memory in a responsible manner and avoid writing to address space that's supposed to be treated as ROM) Some of the Mega Drive (and I imagine SNES, GB, etc) flash carts are actually SRAM carts, though I'm not aware of homebrew software exploiting that. (it's just a design choice and leads to faster loading and avoids burn-out of flash memory) Firstly, it at the very least greatly benefits texture mapping speed, especially for large textures where buffering them into GPU SRAM or line buffer SRAM would be impractical. (plus GPU SRAM chokes the GPU if used for textures, while at least line RAM exploits allow the GPU to continue working) So you can actually hit the peak 5.32 Mpixels/s fillrate of the blitter for texture mapping (or scaled/rotated objects ... or just scaled objects where you need per-pixel granularity with the framebuffer that OPL sprites wouldn't provide ... you can't use the Z-buffer for OPL sprite priority overlapping a 3D scene, unfortunately) I was also mistaken earlier: it's not 10, but 11 cycles to render a single pixel when texture mapping. 2 cycles for the read, 3 for rowchange, 1 for R/W change, 2 for write, another 3 for rowchange, and repeat. Using 2 separate banks of DRAM (or any memory with cycle times no more than 5 cycles) takes 5 cycles instead; I thought it could be faster, but see below for Kskunk's quote: the blitter can't render textures faster than 5 cycles (26.6 MHz ticks) per pixel, thus the worst-case timing in DRAM (5 cycles for a read+rowchange) wouldn't slow down the blitter at all. To put it another way, for a game that uses texture mapping in main memory, you'd spend 45.5% of the time you normally would on the bus, and that much more time for other things. 
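The cycle arithmetic above, spelled out (assuming the quoted per-access costs and a 26.6 MHz clock):

```python
CLOCK_MHZ = 26.6  # Jaguar system clock, ticks per microsecond

# Single DRAM bank: texture read and framebuffer write share a bank, so every
# pixel pays two row changes plus a read/write turnaround (figures from above).
single_bank = 2 + 3 + 1 + 2 + 3   # read + rowchange + R/W turn + write + rowchange
dual_bank = 5                      # source and destination in separate banks

print(f"single bank: {single_bank} cycles/pixel = {CLOCK_MHZ / single_bank:.2f} Mpix/s")
print(f"dual bank:   {dual_bank} cycles/pixel = {CLOCK_MHZ / dual_bank:.2f} Mpix/s")
print(f"bus time vs single-bank case: {100 * dual_bank / single_bank:.1f}%")
```

That's where the 11-cycle, 5.32 Mpix/s, and 45.5% figures come from.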
(the more texture-heavy a game is, the more dramatic the advantage) On the 'back then' hypothetical end, we could argue the Jag CD should've come bundled with the RAM cart at its release in 1995, if you want to fixate on the split development base issue. (and in the case of a 32kx16-bit PSRAM chip on-cart or a pair of 32kx8-bit SRAMs, it would've been both cheap and foolproof enough to pull off: with a DRAM cart, it might be cheap enough, but I could imagine delays in actually implementing a little DRAM controller/interface ASIC if they weren't planning ahead ... they obviously hadn't planned ahead for RAM expansion given the lack of external connectivity for the unpopulated main DRAM bank: that would've been cheaper and simpler to expand than any of the above, and taken fewer pins given the multiplexed nature of DRAM ... a little 40-pin edge connector would've been sufficient for a 16-bit DRAM interface, that or just put the DRAM control lines on the cart slot for intended expansion use) Or if they were really planning ahead, perhaps even arrange the Jaguar's DRAM originally as 512kB 64-bit in one bank (4 64kx16-bit DRAMs) and 1MB 32-bit DRAM in the second (two 256kx16-bit DRAMs) while still using a 32-bit wide cartridge bus, but adding the necessary DRAM control signals and allowing that connector to serve both as the interface for the cart address space at 32-bits wide AND to allow expansion of the other 32-bits of the second DRAM bank. (ie the CD add-on could have another 1MB 32-bit wide chunk of DRAM, but mapped to the same 64-bit addresses as the existing 1MB bank, interleaving on a 32-bit word basis and now providing two complete 64-bit DRAM banks, 2MB + 512kB ... and now you could have a cart passthrough without touching any of the cart ROM address space ... 
plus it's cheaper, no DRAM controller, much more seamless, and much more generally useful for the fast, fully 64-bit portions of the system) On top of all that, the Jag would've been moderately cheaper to manufacture at launch and still a good bit more flexible/powerful due to the reduced bus contention. (fewer page-breaks by doing more work in different DRAM banks as much as possible, plus faster texture mapping and faster 32/64-bit blits as well, as source could be in one bank with destination in the other, keeping nice 2-cycle page mode accesses going) And 1.5 MB was still plenty for the time, and much nicer than what Sega CD or 32X programmers had to work with. Now if they wanted to get fancier and make JERRY slightly less crippled, they'd have also added a 32-bit wide CPU in place of the 68k. (68EC020, 386DX, ARM60, maybe one of the lower-end embedded flavors of the AM29000 series, etc ... the 020's I-cache would help a bit too, but whatever was cheapest would be best ... the Jag was already designed with big or little endian in mind, so reconfiguring that would've been less of an issue ... a Cyrix 486DLC with the 1kB on-chip cache would also have been nice ... or IBM's similar chips, but those probably would've only been cheap from the desktop PC perspective, not from an embedded system/console standpoint: the AM29000's low-end options also lacked cache, but you've got the massive array of 192 32-bit registers to consider ... a neat complement to the 64-register J-RISCs) But more to the point at hand: Jerry is more of a bottleneck in the CD than with cart games, as with a cart you can have it work largely in ROM to read samples or other data or copy code (or have the blitter copy chunks to JERRY's SRAM) while avoiding hitting main DRAM and thus avoiding performance-killing page-breaks caused by rowchange. 
(Jerry's accesses are so slow anyway that ROM isn't that big of a bottleneck, and games using it basically just for audio would be fine, even if doing sample based music+SFX, especially if streaming compressed 2/4-bit ADPCM samples ... even CVSD would be interesting, or 2-bit flavors of CVSD: 2, 3, and 4-bit ADPCM flavors had long been promoted by Covox as low-overhead compression formats for PCs, targeting the low-end systems using parallel port DACs, but applicable to pretty much anything else capable of PCM too: CVSD, especially 1-bit CVSD, is obviously better suited to speech compression than musical instruments; plus the DSP can do filtering and interpolation of lower sample rate stuff and minimize both ROM space and bus time needed to stream the samples ... and still probably sound a lot nicer than the SNES, quality of compositions aside of course) In any case, without ROM, the DSP now needs to read from main DRAM, which means page-breaks for TOM where there might otherwise just be some waits. Meanwhile, adding even a chunk of slow RAM (or even a small chunk of RAM) would offload that significantly. That aside, wouldn't handling RAM on cart be similar to using ROM of a similar width? (likewise you COULD directly texture map from ROM, but it would've been slow back then, usually 8 cycles, 10 for slow/cheap stuff iirc, plus it'd mean using uncompressed 16-bit textures rather than unpacking them into RAM) Now you also could've had carts with RAM right on them, like several 7800 games did and some SNES and even MD games (well ... 
just Virtua Racing with the DRAM for the SVP chip, I think, ignoring SRAM for battery saves) and a couple 7800 games had even used 32kx8-bit SRAM chips back around 1987 (both Summer and Winter Games did that iirc, only using 16kB as that's what they needed for the memory map they used and because 2 8kB chips took up too much space to fit, and the cost of 32kB was cheaper than a modified cart PCB/case at the time, apparently) so 64kB of 16-bit SRAM/PSRAM slapped on cart wouldn't seem too unusual for 1994-96 ... or later. (had the Jag done well with carts) But you needed at least enough confidence in the platform and investment funds handy to actually manufacture carts like that. (making masked ROMs at all was a big problem, and a big reason some folks suggested the Jag should've been a CD system from the start ... not for performance, but for sheer economy of development and attracting more devs and publishers who'd otherwise be unwilling to risk the overhead of a ROM based system: that and Atari could do things like ship out free demo discs both pack-in with consoles and at promotions, and even jump onboard the wave of Shareware distribution at the time ... plus still be vastly cheaper than the 3DO, but that's yet another topic) But on the issue of bus sharing and interleaving, is there too much of a delay in granting the bus to TOM, the OPL, or Blitter to do any useful interleaving between slow, periodic accesses? Like the 8-cycle reads of the 68k (not that it even hits the bus for every memory cycle) or 6-cycle accesses for the DSP. I believe you only need a 75-ish ns (2 cycles at 26.6 MHz) period for the actual read/write strobe, and while you couldn't interleave accesses in a single bank of DRAM at that speed (as there's 3 cycles for rowchange and another to switch read/write if needed), having accesses in different DRAM banks with different rows being accessed and held open (for page mode) would allow overlap of everything but the actual read/write strobes. 
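A toy model of that interleaving idea (all assumptions mine: every access boils down to a 2-cycle strobe, the other bank's rows are already open in page mode, and arbitration itself is free):

```python
STROBE = 2  # cycles per read/write strobe at 26.6 MHz (~75 ns)

def interleaved_strobes(master_period):
    """How many 2-cycle page-mode strobes in a *different* DRAM bank fit
    between one slow master's accesses (toy model, no arbitration cost)."""
    return (master_period - STROBE) // STROBE

print("68k, 8-cycle accesses:", interleaved_strobes(8), "free strobes per access")
print("DSP, 6-cycle accesses:", interleaved_strobes(6), "free strobes per access")
```

So even under these rosy assumptions, a 68k access leaves room for at most three interleaved strobes and a DSP access for two; the real question, as above, is whether bus-grant latency eats those holes.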
Now, a higher priority processor on the bus couldn't take open cycles from a lower one as it already has priority, so you need situations where the slow processors have priority, but leave enough time to grant one or more accesses to lower-priority processors/DMA-channels/etc (any bus master). The 68k is normally bottom priority, so it would be difficult to actually put it in a situation where TOM, the Blitter, or OPL could wait for holes in 68k accesses to work, but the DSP normally has fairly high priority and that could be exploited. Further, the 68k has higher priority when doing interrupts, so coding a game where the 68k is being used pretty much exclusively as an interrupt handler would make that arrangement viable. (as such you could potentially split up general processing duties between the DSP and 68k while not too horribly hogging the bus) From the Jaguar Reference Manual V8: The Jag II fixed that with double-buffered blitter registers instead. And I say 'fixed' and not 'would have fixed' as I'm pretty sure that was functional on the preproduction Oberon (Tom II) chips used on the development systems in 1995. (Puck was not on those, just old Jerry, as crippled as ever, except 32-bits wide thanks to using the 68020 in place of the 68k ... something they might not have needed to retain for the production version if Puck's features worked correctly, allowing a cheap 68000 to be stuck on there for compatibility: indeed, better compatibility than an '020 would provide, plus a 68k would have been a reasonable fit on the slow 16-bit sample RAM bus Puck was to use, sort of like the 68EC000 in the Sega Saturn audio system) Playing devil's advocate here, I'd point out that Kskunk's skunkboard (and any modern homebrew ROM cart based games that got enough traction to be manufactured in masked ROM) could be run fast enough to allow texture mapping from ROM without waits, but more than that it could allow GPU code/data/DMA fetches from ROM at full speed as well. 
(using the high speed ROM cycle mode that was originally intended for testing only) The blitter and OPL doing 64-bit bus operations would still be faster in DRAM though, in cases where serial access can exist. But beyond that, you could build an SRAM cart that could either be simple SRAM (only useful for loading from CD), made into a passthrough cart and only using part of the address space (allowing ROM as well, possibly bank-switched), or just a full 6 MB 70 ns SRAM cart used for CD homebrew. Or add an SD card interface (or CF, XD, etc: the latter would probably be easier given it's parallel, but SD is obviously the most popular and what most 'flash' carts use, regardless of whether they load into flash memory or SRAM on-cart: the latter has the advantages of speed and not wearing out from successive writes) And given the hardware hacking stuff folks do (overclocks included), I'd think wiring up the unused DRAM bank would also be an interesting possibility ... probably not as simple as the old piggyback RAM upgrade on the ST, but also not totally different. (and SOJ leads aren't too bad to work with ... touching the leads on TOM would be iffy OTOH) Oh, and Kskunk's experimenting with texture mapping on internal SRAM showed that the logic for calculating scaling/rotation in the blitter limited pixels to 5 cycles at best, so even random read/write DRAM speed would be fast enough to do texture mapping at the peak rate. (that would include page-breaks in the second DRAM bank or a slower DRAM controller onboard a cartridge with no page-mode support and basically behaving like PSRAM at the 5-cycle cart I/O mode) https://forum.beyond3d.com/posts/1936444/ (Nammo is Kskunk) There's also stuff in that thread about rendering directly to the line buffers and potentially doing beam-racing style 3D at 50/60 Hz, but that's better for another thread. 
My current new favorite is actually: what if Atari spun off Flare II's Jaguar II project to Sega in 1996 during all the second-guessing with the Saturn? (plus the unfinished Puck chip with RCPU and DSP could be displaced by some neat Hitachi SuperH RISC chip with built-in UART and such ... or a PowerPC 602, convenient 64-bit bus there) But again, not the topic here. More on topic, I'd say bundling a 512kB DRAM cart with the CD to boost performance/flexibility a bit might've made an impact; that, and they just had the bad luck of choosing to cut their losses early in 1996 before DRAM prices dropped like a rock. (they had the misfortune of test-marketing the Jaguar at about the same time as the big Sumitomo resin factory fire in Japan that caused RAM and other IC prices to jump up then stagnate, just like they did in 1988: the latter hurt the ST big time and crippled the 1040ST's potential of becoming the bottom-end standard, plus made the Amiga end up price matching the ST that year ... the 520STfm and A500 starting 1988 at $300 vs $500 and meeting at $400 mid-year) Granted, the reason Atari desperately needed good luck to survive was mostly related to Sam Tramiel's abysmal management. (Atari Corp was at its best under Jack and Mike Katz ... 1985-88) Oh, but I doubt anyone would bother with a DRAM cart for modern homebrew. SRAM is much easier to do and cheap enough not to bother with anything else, plus 2-cycle read/write times offer more flexibility for the fast parts of the system. (the DSP and 68k would be fine with 186 ns cycles ... the DSP can only do reads at 6-cycle intervals anyway, and the 68k takes 4, but 8 system clock cycles as it runs at half the speed)
  13. I haven't read the whole thread to see if Curt or Marty or some other historians on the site already corrected this, but Warner-Atari heavily invested in the Amiga chipset and had licensed it, planning it as a home computer, arcade machine, and game console (codenamed MICKY). They also had several in-house 16-bit designs (68000, x86, and maybe 16032/32016: I think Tramel Technology dabbled with the latter before switching to the 68k). Amiga ended up cheating its way out of its contracts with all licensees, and at least in Atari Inc's case, illegally 'refunding' the investments made while claiming to have failed to produce working silicon. Meanwhile they'd signed an exclusive agreement with Commodore. The confusion going on in June/July of 1984 at Atari Inc, and Warner's horrifically managed liquidation of the company without notifying executives (especially Atari President James Morgan), led to that slipping through the cracks and some lower level management accepting Amiga Inc's refund check without reading over the contract properly. (it's that same sloppiness that led to Tramiel's poor reputation and the myth that he 'fired everyone' when taking over ... rather than the reality that Atari Inc was liquidated, the arcade business spun off and the home/consumer business's assets sold off ... it's also that mess that led to some of the neat in-house designs, hardware and documentation, along with engineers, walking off or becoming fragmented) It was also that breach of contract that leveraged Atari Corp's later settlement with CBM over the ST lawsuit. (the Amiga contract was brought in to counter-sue them) Incidentally, the Amiga contract allowed a game console/arcade machine to be released in 1984, a computer in 1985 with no more than 128kB of RAM, and unlimited hardware configurations from 1986 on. 
(had Tramiel gotten hold of that license, I imagine they'd have made do with 128k and perhaps shipped without GEM initially, just the text based portion of TOS, and also probably been forced to include RAM expansion via slots or DIP sockets and possibly even use an external cart slot for the OS ROMs ... or slave the cart slot to that purpose while abandoning any intent to use ROM cart based software: though using internal ROM sockets and intending service centers to install OS ROMs as they arrived may have been the more natural decision) The ST was originally intended to have a 128k model (130ST) as the bottom end, of course, but it was abandoned as RAM prices fell and the OS became too large. (and a cut-down version without a DOS at all, just BASIC and tape drive interface routines, became unappealing) On the note of the actual thread topic, though: why not add MARIA to the 8-bit chipset? This is something that came to mind while I was looking at the flaws and problems (and possible fixes) of what made the Panther problematic a few years later, but in any case: Replace FREDDIE and possibly the MMU with a new gate array ASIC, performing the memory mapping and DRAM controller duties, fast enough to allow Amiga-speed bus cycles (280 ns) and thus able to service existing ANTIC and SALLY access times while only using 50% of the bus cycles; but rather than fiddling with ANTIC or SALLY timing at all (or trying to spin off 3.58 MHz 6502s or what not), use that added bandwidth in lieu of cart ROM access for MARIA graphics data, and use the new mapper/controller chip to interleave things seamlessly to avoid the need for CPU halts during MARIA DMA. (though you'd still need to wait for vblank to do MARIA register updates and list/pointer updates in SRAM) Bump MARIA SRAM up to 8kB (a single 8kB SRAM, cheaper, less board space, etc ... 32k would be nice, but not really needed given you're pulling most graphics data from DRAM). 
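Rough numbers for that 280 ns scheme (assumptions mine: ANTIC/SALLY keep their existing 1.79 MHz access pattern, and every fast memory cycle not used by them is available for MARIA fetches):

```python
CYCLE_NS = 280                # Amiga-speed memory cycle
total = 1e9 / CYCLE_NS        # fast memory cycles per second (~3.57 million)
a8 = 1.79e6                   # ANTIC/SALLY keep their existing 1.79 MHz timing
maria = total - a8            # leftover slots for MARIA graphics DMA

print(f"total slots:  {total / 1e6:.2f} M/s")
print(f"ANTIC/SALLY:  {a8 / total:.0%} of the bus")
print(f"MARIA gets:   {maria / 1e6:.2f} M slots/s")
```

Which is where the "only 50% of the bus cycles" figure comes from: MARIA gets roughly as much memory bandwidth as the entire stock A8 bus, without halting the CPU.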
Probably map the normal 48k MARIA cart ROM space directly into A8 space and put MARIA registers and SRAM onto the 16k bank switched segment. Possibly add the ability to enable/disable either of the 8k cart ROM chunks to allow the full 64kB of DRAM to be used and those 8k banks flipped in as needed. (I forget if the player 3 and 4 GTIA trigger inputs were used already, but those might be handy for using an additional 2 bits of bank select control) You'd still need on-cart bank-switching logic to extend beyond 16k as well, but you'd make the most of RAM this way and avoid the issue of MARIA/A8 DMA conflicts in ROM. (cheap ROM being too slow to interleave in, at least when both ANTIC/SALLY and MARIA are trying to access it) You'd thus have a really nice system with MARIA graphics operating without holey DMA and genlocked over GTIA graphics (MARIA was designed with this in mind for the planned laserdisc expansion, so genlock with GTIA should be quite possible, particularly as all would be running off a synchronized clock and using common color/pixel clock times or integer multiples of those: ie if one was using 320 pixel and the other 160 pixel modes). MARIA allows for up to 4-bit pixels in its objects, which could potentially also allow a 12-color linear bitmap screen overlay on top of ANTIC+GTIA character or bitmap modes, or turn off the latter entirely for 100% CPU time in a 12-color bitmap. (or 13 colors given GTIA's background color should still be available) For typical late 80s console/arcade games, I imagine it'd be appealing to use 3 or 12 color MARIA sprites over a 5-color ANTIC character scroll layer with GTIA sprites used for a bit of added color. (so 12 color sprite layer + 9 color background) Doing proper genlock would also give nicer video output than the 7800's hacked solution of merging TIA and MARIA video lines. 
(a simple disconnector switch also solves that, of course) Further, this sort of machine would have been a much more potent game console to release for 1987 than the XEGS, while also better meriting the price points the XEGS was initially sold at (substantially more than the $99.99 65XE or $89.99 7800, and of course $49.99 for the 2600Jr). The deluxe package XEGS with light gun and keyboard originally retailed for $199.99, and I rather doubt the added MARIA+SRAM + gate array chip and 150 ns DRAM rather than old 200/250 ns stock would've pushed it even that high. (probably more like $150 in a basic set and $200 with keyboard and games and/or possibly other software) https://youtu.be/2N2BUTIpnDI?t=97 Plus you'd have a game machine with substantially greater advantages over the NES and Master System. (still some trade-offs like the lower resolution for most purposes, but a monster sprite engine for the time and some pretty nice colors all around ... and the flexibility to do some nice software rendered effects to a linear framebuffer and a ton of RAM for a console at the time, and chunky pixel graphics, so very well suited to storing compressed data on cart to save space) You'd also have a lot more CPU time to do complex POKEY modulation effects (or 4-bit PCM) or possibly make some use of the GTIA beeper channel. (though that would probably be more useful if you added GTIA beeper control to the new ASIC, maybe slaving it to some neat PWM sound ... possibly even useful for sample playback, but I'm mostly just thinking fixed-volume variable duty cycle pulse wave stuff ... though slaving it as a PWM DAC would certainly be interesting, I'm not sure what sort of resolution you'd get out of it: if you could toggle at 7.16 MHz, that'd allow 28 kHz 8-bit sample playback, which would be quite neat, especially if it was DMA loaded ... 
though a CPU-loaded FIFO would be pretty good, too) You could obviously have a 128 kB variant of that on the computer end of things, but a game console would probably be better to stick with 64k. (you could drop lower, but that would hamper the compatibility and selling point for promoting expanded A8 development in general as a computing platform on top of enhanced game machine, plus 64kx4-bit DRAMs were a very economical density at the time and using 2 or 4 16kx4-bit ones for a 16 or 32k system would seem a poor value by comparison) And, of course, such a game console would squarely sit in the Home Computer category as far as Nintendo's predatory licensing was concerned, and would soundly avoid the sort of problems the 7800 and Master System both suffered from. Edit: you could also use that faster DRAM timing to allow for a 3.58 MHz 6502, but I'm not sure existing (even new production 1987) NMOS SALLY chips would tolerate that well enough, and 65C02s were around, but then you had to deal with RDY rather than HALT among other things (short of making a CMOS SALLY). OTOH, using that 7.16 MHz bus/DRAM controller ASIC clock divided by 3, you'd get a more likely SALLY-tolerant 2.39 MHz, which would be a nice speed boost for some things, and still wouldn't change ANTIC timing. (just more wait states for SALLY when overlapping with MARIA DMA cycles) 3.58 MHz would obviously be nicer, though. (even more wait states for MARIA, but still a speed gain, and faster interrupt response) You'd need normal 1.79 MHz modes for full compatibility. (also standard XL/XE memory map modes, possibly disabling the cart-slot banking if that proved problematic) Wait: RDY in the 65C02 behaves like HALT on SALLY, doesn't it, since it's CMOS and static and thus needs no refresh? So you could use a 65C02 in there without problem, and use 3 or 4 MHz rated chips at 3.58 MHz. (unless there's any software using undocumented NMOS-specific opcodes or such, you shouldn't have compatibility issues ... 
plus you get the enhanced instructions, some more than others depending which 'C02 variant they used ... probably Rockwell though, given Atari Corp was using them a fair bit already for chip vending) That's aside from other hypotheticals, like if Atari had taken Synertek's assets when Honeywell liquidated. (Synertek was in trouble with Superfund cleanup/lawsuit issues, so it would've been on favorable terms, though another risk/reward investment for Tramiel to make like he did with Atari Inc's assets ... though it was sold off in 1985, when Atari Corp was already pretty deep in investment debt) Synertek had already been manufacturing 65C02s prior to being shuttered, for what that's worth, along with second-sourcing a bunch of Atari's custom chips, so it would've been a solid fit all-around. (albeit slightly more so had the ST used more MOS chips for its 8-bit serial and I/O stuff rather than Motorola ones) And why use a gate array for the new ASIC? It'd be much faster for testing/prototyping than a full custom chip (especially without an in-house chip fab) and would be much lower risk to produce at low volumes, hedging their bets on a potential flop. (if it really took off, they could probably spin off a full custom or standard cell ASIC to not only replace it, but embed the DRAM controller+MMU+CPU+ANTIC+GTIA+MARIA+POKEY+PIA all on one dense CMOS ASIC with a single 8-bit I/O bus and 16-bit plus bank-selected address bus, making it a solid budget console/computer platform around 1989 into the early 90s and also making a nice platform to cross-develop Lynx games for) You could also switch to a single 128kx8-bit DRAM chip by 1990/91 and discontinue the 64k models entirely. It's worth noting that plenty of manufacturers stuck with gate arrays throughout platforms' lives in spite of high volume production, so that's always an option too. (and you didn't need the raw logic speed that standard cell and full custom CMOS parts were doing in the late 80s ... 
Standard Cell also might not have been very widely used yet) Sega used lots of Gate Array chips for various things in the arcade and home consoles. (and the custom graphics/interface chip of the Sega CD was simply called the Gate Array in most documentation/programming literature) Flare Technologies also used Gate Array chips for their Slipstream hardware (the Jaguar was Standard Cell, though), which makes plenty of sense given they'd come from Sinclair, who'd used some of the pioneering Gate Array (ULA) production for the ZX-81 and Speccy.
  14. Oh, and I forgot to mention, even without the Z80, you could leave in all the other Master System compatibility bits (I/O, sound, VDP, etc) and just stick the Z80 into the Power Base Converter. (most or all of the necessary I/O and memory addresses are accessible through the cart slot as is, so you might not even need to change that) You could also have ditched the side expansion port in favor of a VRAM expansion port (there's another 64 kB of VRAM space unused by the VDP) and used fewer pins for that as well. (the dual 8-bit data ports plus multiplexed address lines and DRAM control signals) On that note, upgrading the PSG to allow it to run at lower clock rates (or just 1.79 MHz, half of normal) would make it much more useful for music, though adding Game Gear style stereo functionality would be nicer. The cart slot is already a much better expansion port than the side port (originally earmarked for a floppy drive before the CD-ROM was pressed into that role), but a cart-slot module based expansion would be far more flexible and efficient ... and you probably wouldn't need that redundant 68000. (it's faster, sure, but swap it for a DSP co-pro of some sort and you've got a system that's generally more useful, especially for 3D) You could also just put the VRAM expansion lines on the cart slot, potentially on outboard keyed portions (7800/SNES/Jaguar style) to keep PCB costs down on standard carts. (actually, there's a TON of expansion pins that most games don't need and would've been cheaper/better off if segregated from the normally used ROM cart bits ... 
probably just 48-50 pins needed for most games, including a couple pairs of VCC and GND lines) If you added that second VRAM bank onboard the CD unit itself, it'd also open up interesting possibilities for other changes, like having the added graphics co-pro ASIC render straight into that VRAM bank, or at least have faster and more flexible DMA than the MD's native VDP (faster VRAM, maxing out DRAM/PSRAM bandwidth, CPU-synched interleaved DMA modes, among other possibilities). Or just include two extra VRAM banks that can be flipped like Sega CD word RAM or 32x framebuffers (or Saturn VDP-1 framebuffers). With 121 colors from 12-bit RGB from the start, the need for video expansion would be less too, but tweaking that a bit more and allowing one or both BG layers to be disabled to allow linear bitmap framebuffers instead (with an eye for software rendered effects, even without expansion hardware) would be interesting, plus you wouldn't need to monopolize both VRAM ports if you disabled both tilemap layers and used the serial bus for framebuffer scanning. (you could do two 15-color 4-bit planes or one 121-color 8-bit plane, or two half-res 8-bit planes, and potentially make use of unused color values for shadow/highlight translucency effects, though you could also use one bit for per-pixel priority to allow objects to be drawn in front of or behind the sprite layer) Doing a linear bitmap is much simpler than a tilemap, and the system is already using packed pixel data. Short of that, you could also tweak something the VDP can already do: lowres direct color via mid-screen CRAM DMA updates. The problem with that is it halts the CPU for the entirety of active display, but allowing the tilemap layers to be disabled and DMA'ing from VRAM itself would allow for the same effect, a direct 16-bit (unpacked 12-bit) color bitmap at up to 160x200. Plus sprites could potentially still be enabled if this was a feature rather than just an exploit. 
(practically speaking, you'd want to limit that direct-color mode to smaller render windows due to VRAM space limits ... right up until you added external VRAM like in the above CD unit suggestion) Note the real-world hack mode using this is limited to 9-bit RGB encoded as unpacked 12-bit (you have 3 nybbles per 16-bit word, just with the 4th bit ignored on all three: the VDP natively works in 12-bit RGB, remember, it just had CRAM and the color DACs truncated to 9 bits to save chip space). Oh and on that note, I believe the PC Engine was also designed with 12-bit color in mind and the expansion port actually allows for upgrading the RAMDAC, but they didn't use that feature on any of the CD expansion units. (you could've had 481 colors from 4096-color 12-bit RGB instead of 512-color 9-bit RGB) Oddly enough, the SuperGrafx also retains the 9-bit color limit, in spite of using dual VDPs. (the pixel bus on the expansion slot also provides other information, so an upgraded RAMDAC/mixing chip could potentially add things like translucency effects in hardware) The PC Engine is one console that was pretty close to ideal for its time, but the upgrades didn't push it nearly as far as they could have ... and marketing was poor in the US and it failed to get a European release at all. (unfortunate given the tiny PC Engine form factor would've probably sold well as-is) They probably should've had at least 2 controller ports on the TG-16 variant, though, and offered 3+ button controllers sooner, then made 6-button ones standard, and should've either made the SuperGrafx an expansion unit built into a second-gen CD-ROM base interface, or gone another direction with video expansion and added a framebuffer bitmap layer instead, with the VDC function probably built into the upgraded RAMDAC chip and piggybacking on existing CRAM entries for the 255 colors. (either software rendered or blitter accelerated ... 
probably blitter accelerated) The original 1988 CD-ROM unit could've been simplified and made generally more useful by omitting the ADPCM chip, using a unified block of 128 kB DRAM, and either adding simple 8 or 16-bit DMA sound, or just relying on software driven playback instead. (given how poor a lot of ADPCM sounded, and how poorly it buffered and streamed for cutscenes, even simple 4-bit or 5-bit LPCM would've been competitive at the same bitrates, but you can do software DPCM/ADPCM decoding pretty easily and also do software 8 or 10-bit PCM fairly easily with paired channels at offset volume levels, and software mixing is far more flexible than a fixed, single ADPCM channel: that was also a huge limitation of the X68000's sound system, where a single 8-bit PCM channel would've been far more useful) In any case, no sound upgrade at all would've been fine for the first gen CD unit, and they could've added something fancier and more generally useful around 1991 as part of the Super CD upgrade. (an entire base interface unit replacement: say 512kB DRAM, the VDC/color upgrade, and perhaps one of NEC's embedded DSPs coupled with 16-bit DMA stereo, allowing CPU or DSP driven software mixing as well as slaving the DSP as a 16-bit multiply-accumulate copro for assisting with 3D or scaling effects)
  15. Incidentally, the MD's VDP was designed to support 128 colors (or 121 colors: 8x 15-color palettes + 1 BG color) from 12-bit RGB (4096 colors) and had external expansion pins for that, but they were left unconnected on the MD itself and used later for the Arcade System C. (which also ditched the Z80 in favor of an 8.9 MHz 68000 and a PCM chip) Had Sega wanted to use that full capability in 1988, they'd have omitted the CRAM and DACs entirely from the VDP and used an external RAMDAC chip (as the PC Engine did) and probably could've made up the cost difference by removing the Z80+RAM and having the 68k handle the sound drivers alone. (just add a simple 8-bit DMA sound channel and you're good for sample playback and software mixing too ... interrupt-driven PCM is horrible on a 68k and cycle-timed loops aren't practical for most purposes either, so DMA is the way to go: on a 650x based platform like the PC Engine, interrupt based PCM was viable and a 7 kHz driver would tend to eat 5% of CPU time for tight code: Malducci's driver manages such; plus you can do channel-pairing tricks to get 10-bit resolution and double-buffer sample chunks into wave RAM to get better than 7 kHz without added hardware, though you'd need to sacrifice 4 channels to do 10-bit mono that way, and using 4/5-bit PCM, even for some sample based music, would be pretty useful and doable with just 2 paired channels at up to 32x7 kHz ... so also tons of potential for interleaved/multiplexed mixing, but I digress) There was also Ricoh's 8-channel PCM chip Sega later used in the CD, and was already using in the arcade in 1989 on the System 18, but that's unnecessary added cost and overkill compared to the potential of software mixing with DMA sound. (OTOH it was MUCH cheaper than the Sony SPC700 module of the SNES ... and manufactured by Nintendo's prime chip vendor Ricoh ... and would've been an interesting choice to see tweaked as an embedded CPU+Sound Chip on the SNES end ... 
with a much faster 65816 and faster RAM rather than wasting money on the Sony module and cheaping out on RAM with DRAM and a slow DRAM controller: compared to NEC, who managed with 70 ns DRAM and a fast controller to allow full-speed 7.16 MHz 650x operation in 1988 with their CD-ROM system ... 2.68 MHz is SAD in the SNES; throwing in 256 bytes of RAM for on-chip zero page would also be nice and help somewhat with the poor compilers for those attempting to use C on the SNES) The PC world also had the issue of VGA compatibility, and ATI took the route of an 8514 clone, but used a separate VGA core + RAM to provide compatibility there and nothing fancy like genlock to allow overlay of the two screens. Plus, you had 4-bit color modes using bitplanes and 8-bit chunky modes (not to mention the odd organization of nonlinear unchained VGA 8bpp mode: not planar, just interleaved in the 4 64k banks of VGA space ... probably due to the way they got the necessary bandwidth while focusing on linear pixel space in 4-bit mode rather than, say, linear 32-bit aligned addresses in chunky mode). OTOH, ATI probably could've made a low cost fast VGA card that simply had some nice added features while focusing on basic VGA compatibility. Remapping RAM to 32-bits wide would be relatively straightforward for a much more friendly/fast (especially for 32-bit CPUs and VESA) linear 32-bit word organized 8-bit packed pixel framebuffer, and also support DMA from main RAM, allowing fast updates of partial or entire screens. (entire ones for double-buffered full-frame rendering, partial ones for looping single-buffered scrolling type graphics, where DMA is mostly filling in portions of the off-screen bits being scrolled in) A simple DMA block copy and fill function would be good enough for basic acceleration rather than a full blitter, and would cater to 8bpp modes and 512kB DRAM. 
(512kB becomes appealing as soon as you adopt high enough bandwidth to do 640 pixel wide 8bpp modes and 640x480 in 256 colors ... while still being compatible with fixed-frequency VGA monitors; while 640x400 could still be double-buffered, so good for 3D games) You'd also want vblank interrupts to make for fast and simple page-flipping without tedious status register polling. (also very useful for color cycling effects via palette swaps, or 256 color video playback that re-loads the colors for each frame or on key frames: something you can't really do without double buffering or really fast DMA able to copy full frames in vblank ... so using Mode 13h would be out on ISA video cards, while double or triple buffered Mode X would be possible via ISA cards ... or of course, a mapper-modified Mode X allowing 32-bit linear pixel organization, though obviously you'd need two port or DMA writes on 16-bit ISA for that) DMA functionality without any bit manipulation features would still be useful for 4-bitplane VGA modes, but less useful than something like the Atari STe blitter or Amiga blitter. (hardware bitfield operations, bitmasking, bit-precise line fill and sprite drawing, etc) But with a CPU with a barrel shifter and fast bit manipulation instructions, you'd be OK software rendering and DMAing that way anyway. (the 68000 was not such a CPU, but a 386SX could handle such ... I forget where the 286 fits in there) So a fast enhanced VGA card that still lacked double-bandwidth modes (640 pixel 8bpp) could still be appealing with DMA copy and such, and offer relatively fast ISA bus performance. 
(and if such a card got popular enough, you'd probably have seen games exploiting the DMA function for primitive blitting or screen block/tile updates at 320x240 with square pixels and fast/efficient 32-bit word-packed 8-bit pixels rather than funky Mode X, speeding up software blits to the back buffer in main RAM even if copying over ISA was a bottleneck) Gaining market acceptance would be key to getting software like games to support it, but a low cost, enhanced VGA card would seem much more likely to gain such than an 8514 clone. Hmm, perhaps even easier to gain acceptance would be a simple 2 or 4-page hack of Mode 13h, allowing a mapper/bank-switching scheme to let software treat each page like 13h, but with additional control register settings that allow page-flipping and thus more easily let software optionally support that with less modification of its rendering code. (just allow for 2 banks to be selected, one designated as the active screen and one designated as the back buffer currently being copied to: you could potentially have 3 back buffers and a quad-buffered arrangement for smoother average framerate, of course) So you get the speed and simplicity of Mode 13h without the hassle of single-buffering and the ugliness of screen tearing, either without v-sync or over ISA where there's no time to copy 64kB in one vblank period. (If you dropped to 60 Hz for 320x200 with a border and more vblank time, you'd still only get 62kB at the absolute max over 8 MHz 16-bit ISA ... so with a fast CPU and tight polling of the vblank status register, you could avoid tearing if you had a border or status bar or such that didn't need to be re-copied every frame ... plus square pixels, which is nice, though the letterboxing isn't so nice)