Everything posted by kool kitty89
-
That one was excellent, and it also made use of the 2 fire buttons to great advantage over the A8 port.

IMO, let us not forget River Raid (by Activision). The 5200 version is my favorite port of all. I find the lack of self-centering extremely helpful.

Yeah, that's a pretty awesome game. The A8 version is good, but the single button does have limitations. Of course, Atari easily could have released controllers with 2 (or 3, or more, depending on the mechanism) buttons for the 8-bits, but they chose not to. (same for the 2600, and also for analog joysticks on either system) Not having the Activision games probably would have been a noticeable hit for the 5200, but there were so many other problems (and more with Atari in general) that it would have been a drop in the bucket.
-
Or continue to evolve the II into a system comparable to the Mac, but fully backwards compatible. (and potentially better than the Mac in other areas too, depending on the design philosophy)

Not really any more so than the Z80; the 68k was a dead end in the long run too, it just took longer. Yes, a "tacked on" multi-CPU route would be one option, but using more powerful 650x models would make more sense up to the late 80s at least. (let alone other coprocessing) In hindsight, prolonging architectural transitions for as long as possible would be the safest route given how many architectures ended up falling out of favor. That, and who knows what might have happened to the 650x if it had continued to be popular on machines that penetrated the market like x86/PC did. (x86 wasn't particularly good, but the demand drove investment in extending the architecture)

Yes, if not displaced entirely. (ie expanded the 1st party market wide enough -and with an expanded product range of higher and lower end models- that you got clones on the level of PCs and interest from 3rd parties over PCs)

Yes, though that was also a niche market and those transitions were often somewhat sloppy in nature. In the context of the Apple II being REALLY big, you'd have to think more in the context of what IBM/PC manufacturers and MS did with hardware and software extensions (technically I believe even modern PC hardware is fundamentally compatible on a low level with much older hardware, but the software/high-level side of things is a bit of a different story). And, again, the very fact of being a market-driving force could have reshaped things to the extent of not having to make such drastic changes. The Mac had to make changes due to having a tiny market share and not driving the mass market (and also using certain parts that didn't get strong mass market support -and Motorola's odd decision to clamp down on 2nd sourcing/licensing from the 68020 onward), while the PC architecture became so popular (on top of being a fundamentally evolving architecture -unlike the C64) that it drove x86 to be extended far beyond what it ever would have been if used only as Intel had envisioned. Of course, if the Apple II went the way of the PC, it might have meant Apple being pushed out of the business or restricted to certain sectors of the hardware market (albeit IBM themselves could have done a lot better than they did with the PC market if they had taken the "if you can't beat 'em, join 'em" philosophy -ie rather than trying to crush clone manufacturers, compete directly with them using in-house advantages). And of course, unlike IBM, Apple was also the main/only source for the OS on their computers, so they could have gone the Microsoft route instead and dropped hardware.

PCs ended up dropping into the lower-end cost category too, and it was following that that the market really got solidified. (the PC was not the cheapest or the most capable for a given price, obviously, but there were still lower-cost models with greater and greater cost/flexibility options as time went on -like by the early 90s, when you could start building your own machine to get the best cost/performance deal and a machine well suited to you -at least if you had the know-how or friends/family who did; paying to have a custom machine made could also end up being cheaper, and used/surplus parts warehouses also became significant for low-cost custom/homebrew machines)

I think he ended up getting lucky eventually.
If you keep trying, regardless of how high the failure-to-success ratio is (or how much it costs), you'll eventually manage something. That, and he certainly is good at manipulation and hype/PR. (again, like his "teacher" Nolan Bushnell)

Margins are important, but so are volumes. Neither tells the story overall though; you need to compare net revenue, profits, spending, etc. (and if you do compare profits, you also have to take investment spending into account -you could have a healthy company running deficits some quarters to facilitate growth, it's all a matter of context -deficits/debt on top of a shrinking market share is not good though)

One of the problems with the Mac is that it did too much in software too. Pushing that philosophy back in 1976/77 made a lot of sense, but by the early 80s you had a LOT more potential for custom coprocessor chips. (and technically, the Apple II did more in hardware than some machines; I think the ZX80/81 manages video with BIOS routines run from the CPU and only allows work to be done with the screen turned off or in vblank -the latter only on the 81- and I think TIA required a lot of CPU assistance too)

I'd argue the 64/128k Mac (among others) was pretty damn inelegant, both aesthetically and technically. And more important than "attractive" aesthetics would be "appealing" aesthetics for various markets. The "serious" business market in particular demanded many things the Mac severely lacked. (a large, high-res monitor; color -by '84, color was getting more significant on PCs and EGA had arrived; a professional-looking desktop form factor with a full-sized, fully-functional keyboard; modular/expandable hardware; etc)

One important thing about MacBook Pros is the video hardware they offer. (at least in some models) No (or almost no) other manufacturers provide laptops with reasonably powerful hardware graphics acceleration; that's pretty significant for a number of applications. The only Macs that are really useful and cost-effective on a technical level are the really high-end models (Mac Pros and MacBook Pros), for cases where you really need that performance. (it will either be about as expensive on comparable PC workstations, or not available at all)

And Jobs seemed completely (if not intentionally) blind to his own logical errors and Woz's obvious proof of real-world success. Granted, Jobs' niche market approach does work in a handful of cases (and eventually worked for the Mac -though it almost failed several times -and isn't their main product today either), but it's far more risky and wasteful overall. In fact, catering to a niche market is better used as a back-up route for a mass market product that fails to latch on to consumer interest as intended. (like the Amiga did with professional graphics and video editing, and the ST with music -of course, both became true mainstream platforms in Europe)

Again, there were plenty of other options prior to the '816. (and lots of middle ground from the II/IIe to the GS as well -they could have added more moderately improved graphics, especially a true 16-color bitmap mode at reasonable resolutions with a linear framebuffer -perhaps lower-color modes with a normal linear framebuffer as well- DMA sound or at least a bare DAC or an off-the-shelf sound chip; a nice set of programmable interval timers would be nice too)

Not just the profits, but the image of the Apple II and the name it had made for the company.
Even with the same amount of funding, Apple would have been FAR more hard-pressed to push the Mac without the existing market of the Apple II.

That was one of Ray Kassar's problems with the 8-bit line: he had a vision that was strikingly similar to what was done with the iMacs in the late 90s. (very user friendly, "smart" plug-and-play peripherals, an "appliance computer", and plans for color-coordinated models to cater to different tastes -especially to make it more attractive to women)

That's how I feel too, though if you look at what he did at NeXT, it doesn't completely match his other examples outside of the "elegant" form factor. (I'll admit many of the NeXT machines do look pretty cool) It also makes you wonder how the market would have done without his influence. How would smartphones, MP3 players, or tablet PCs be? (similar, better, or worse?) Or for that matter, what if Atari Corp or CBM had been "luckier" back in the late 80s?
-
Interesting, but that's a couple years later than I was thinking . . . more like the follow-on cross-compatible model that encompassed the Model I/II/color systems. I was thinking of a more rudimentary upgrade in the same timeline and price range (at least for bottom-end models) as the CoCo (1980), including TV output along with backwards compatibility with Model I modes. Hell, if they were still going to go with the cheaper 2.5 MHz Z80 (rather than the Z80A), they could at least have bumped it up to 2.38 MHz from the 1.79 MHz used in the Model I, using a 7.16 MHz master clock for commonality with the NTSC color clock. (the arithmetic is sketched at the end of this post) Though I think Z80As were pretty cheap by that point too. (after all, Sinclair was using them in the super low-end ZX80 at the time)

Yes, but the question is whether it would have been cheaper to use a generic ULA (probably with some discrete logic -at least initially) and common Z80 CPUs.

It probably saved chips in Woz's design. I think Woz is brilliant, but I think he spent too much time trying to be clever. A little less clever and a couple more chips might have made the machine much more capable. But then I'm not even sure Woz designed the IIe.

Couldn't they also have invested more into custom chips (or cheaper -but more limited- ULAs/PLAs/etc) to save on overall chip count, board space, and long-term cost?

The 6502 alone wasn't a dead end though; there were plenty of options for faster CPUs (even the NMOS 6502 went up to 4 MHz, though I think 3 MHz was the highest rated version to be relatively common), you had the 65C02 (and the further extended R65C02), and the 65816. (though the advantages of that over faster C02s are relatively limited) Then you've got the fact that demand for the 6502 declined (especially in terms of pushing for higher performance models), so there wasn't much incentive to push the architecture further. (imagine what would have happened to x86 if IBM hadn't adopted it) However, once you did hit a wall with the architectural limitations, you could invest in a custom derivative of the design (especially feasible with the low licensing costs of the 650x series), and that would also be attractive prior to really needing to extend the architecture significantly. (doing things like Hudson did with integrated MMU/banking logic, I/O, sound, and some added instructions) Or you could opt to switch architectures entirely and only provide support via emulation (much easier for high-level/OS-driven programs) or tack on the old hardware for compatibility and auxiliary processing (again, facilitated by the low-cost licensing of the 650x chips). A replacement CPU could be off the shelf, or totally custom and in-house . . . like Acorn did. And aside from pure CPU performance, you could address some limitations with off-the-shelf and/or custom coprocessors. (ALU/multiply/divide units, perhaps floating point, DSPs, blitters, etc) One problem was that other off-the-shelf architectures were limited: MIPS was still pretty expensive in the late 80s, ARM hadn't opened to 3rd parties yet, and then you had x86 and 68k. (68k seems the better choice for the time, but in hindsight you'd hit the obvious wall in the mid 90s again where you'd need another architectural shift) In that respect, it probably would have been safest to milk the 650x for as long as possible before making a transition.
(pushing for higher and higher clock speeds as available -and as memory speeds allowed, transitioning to the '816, probably implementing some sort of fast SRAM buffer/cache to allow faster operation than DRAM of the time allowed -at least outside of fast page accesses- and filling in the rest with coprocessors)

And again, what you are suggesting would have questionable results. Take a look at Commodore. Sure, they outsold everyone... but they didn't make as much money as Apple.

Commodore didn't do what I was suggesting either, and they had horrible management problems as well (possibly worse than Apple, or Apple was luckier). If you look at CBM from 1981-1984 compared with Apple in that timeframe, I'd suspect CBM was making a lot more (taking investment spending into account of course -ie money CBM was using to build up capital rather than retain it as liquid assets). The biggest caveat would be the brief period where CBM was selling at a loss (at least after rebates) in 1983, but I doubt that was enough to make up the difference, especially if you take Europe into account. (again, one of Apple's gaping weak points that a low-cost model could have corrected) CBM made a mess of things, more so after Tramiel left (with some exceptions, but mostly worse). They managed to lose their business/education market moving forward from the PET, jumped to the low-cost consumer market, then made a mess with a bunch of unnecessary and overlapping products, brought in the Amiga, and finally had a broad range of compatible/expandable machines with the later-gen Amiga line, but that took some 5 years after the Amiga's launch. (the Amiga's marketing and market positioning was pretty screwed up too) There were plenty of options for faster 6502s though, and other options for acceleration as mentioned above. (and, of course, they could have reasonably been pushing 2-3 MHz models back in the early 80s)

What about having the II evolve and expand like the PC?

It's pretty hard to make a GS upgrade when the 65816 was late coming out and the first batch or two of CPUs Apple received didn't even work. And you can't do much development and testing of new hardware, and a new OS, without the CPU. Apple could have released it sooner, but not by a wide margin.

Why not ditch the '816 for the time being and go straight for faster 6502s/C02s earlier on? (the IIc Plus came really late)
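For reference, here's the arithmetic behind the clock figures mentioned at the start of this post (a quick sketch in Python; the NTSC colorburst value is the standard definition, the rest just follows from it):

# NTSC colorburst is defined as 315/88 MHz
colorburst = 315 / 88            # ~3.579545 MHz
master = 2 * colorburst          # ~7.159 MHz master clock
print(round(master / 3, 3))      # 2.386 -> the "2.38 MHz" Z80 option
print(round(colorburst / 2, 3))  # 1.79  -> the "1.79 MHz" figure cited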
-
I missed this before, but you say 3 engineers? Who was the 3rd? (I didn't think Ben Cheese had been involved with the Jaguar design)
-
Atari and their pork pies (Atari STE specs)
kool kitty89 replied to oky2000's topic in Atari ST/TT/Falcon Computers
To be fair, Curt does explicitly state that the 505 was never put into mass production, let alone commercially released on the consumer market. Albeit that WAS back in 2003, so maybe you've gotten more information since then. It certainly did exist though. (as did the 504 and possibly others) The 504 seems to have entered production to some extent, but it appears to have only been used by developers and never offered at the consumer level. (apparently too expensive, but not even offered as a high-end workstation accessory) Was there any 3rd party support for CD-ROM drives or software? -
The CoCo may not have been compatible with the I/III, but it was Tandy's top seller for pretty much its entire life. What's strange is that the Laser/VZ 110/200/300 machines were actually TRS-80 clones with a 6847 and hacked ROMs. Sort of a Z80 CoCo equivalent, if you will, but minus a few features of the CoCo. It would have been interesting to see what would have happened if Tandy had gone that route for a Color Computer. It still would have required software to be rewritten, but I found everything but graphics fairly easy to port to the CoCo anyway. OK, so there is assembly as well, but I've ported from Z80 to 6809 and it wasn't that bad.

Yeah, I was just thinking of the potential for Tandy to have pushed a broad, flexible standard range of machines that built on the Model I and II, from low-end home computing to hobby to education to business. (I think the monitors used on the Model I and Model II were using TV sync rates too, so no direct conflict there either -other than RF possibly making 80-column text impractical; composite with colorburst disabled should have been fine though -see the rough bandwidth numbers after this post) Maybe they could have even invested in their own graphics ULA that combined the Model I/II text mode logic and added color capabilities and maybe bitmap graphics. (allowing RAM-defined text characters could have been more useful though -and not that hard to hack as bitmap graphics either)

Yes, at least outside of Japan. (where you had the neat FM-7 with dual 2 MHz 6809s -one dedicated to manipulating graphics in the 640x200 3-bit RGB framebuffer) Then again, without the CoCo, Fujitsu might not have gone in that direction either. Having more 6809-based machines in general would have been neat (I wonder if IBM ever considered that when developing the PC). Then again, there were a lot of cheaper (if not more cost-effective performance-wise) alternatives on the market; similar to how the 6800 wasn't as popular as cheaper contemporaries with reasonably comparable performance. (of course, Motorola never pushed hard for competitive pricing with the Z80 or 6502 AFAIK, so that was a major factor)

I wonder why they didn't just add more banks to the same address range as the language card RAM disk.

What support are you looking for here? It was the 2nd biggest seller as it was. The only computer that rivaled the II in software was the C64, and that didn't have near the business software of the II. The II series still had advertising even after the III and Mac were introduced, so it's not like there was no support at all. Could they have pushed it and got more of everything... probably, but I don't see it overtaking the C64 unless they revamped it at some point, which is why I suggested what I did. The revamp would have had to have been early in the II's life to make much difference.

I'm thinking more of the Apple II missing out on the prospect of becoming a persisting market standard like the IBM PC ended up doing. (except IBM -and 3rd parties- kept it going with evolutionary/compatible models rather than doing weird things like Apple did) Of course, IBM also failed to go into the lower-end market (they almost did that with the PCJr, but they screwed that up -Tandy shows more what it could have been, albeit without the same sort of push IBM could have had). Actually . . . IBM also started screwing up the "compatibility/expandability" aspect when they tried to set up proprietary standards retroactively. (one of the main problems with the PCJr and, of course, the PS/2 line)
They could have pushed a lot harder in general, kept the simpler baseline models in the lower-end range (continually consolidated for lower cost), and offered expanded models for the mid-range to high-end market. (eventually having a full successor for the next generation built onto the same architecture)
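On the 80-column-over-RF point above, here's a back-of-the-envelope check (my own rough numbers: ~52.6 µs of active NTSC line and an 8-pixel character cell):

active_line_us = 52.6                    # approx. active video time per NTSC scanline
dot_rate_mhz = (80 * 8) / active_line_us # dot clock needed for 80-column text
print(round(dot_rate_mhz, 1))            # ~12.2 MHz
print(round(dot_rate_mhz / 2, 1))        # ~6.1 MHz of luma bandwidth to resolve it

An RF modulator passes very roughly 3 MHz of luma, so 80 columns smear into mush, while a clean composite feed into a decent monitor (no colorburst, so no chroma filtering) can get close enough to be readable.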
-
Atari and their pork pies (Atari STE specs)
kool kitty89 replied to oky2000's topic in Atari ST/TT/Falcon Computers
Price point was more significant in Europe in general, at least up to the mid 90s. (I think economic issues were a major factor, but I'm not sure of the details) In the US, the GG ended up selling substantially better than the Lynx in spite of the later release, worse screen, higher price point, worse battery life, etc. That's almost certainly due to Sega's marketing and brand recognition at the time. (and the software) The GB had better software support than either, was generally more convenient to use in many important handheld/portable venues (size and -more so- battery life; power adapters are of pretty limited and specific utility: no good playing at/after school, on the bus, on a train/plane, etc), and of course, Nintendo had the dominant brand recognition in the US at the time and a cheaper product on top of all that. (probably much cheaper still if they'd cut their margins -ie if Atari HAD dropped the Lynx to the GB's price, Nintendo could have easily chosen to drop below that and still make a profit) Of course, both Sega and Atari dropped out of the handheld market before ever pushing directly into Nintendo's size/battery-life/etc range. (Sega especially in the US) Though Sega should have been able to push out such a model by '95 at least if they'd wanted to. (Sega of Japan, rather, since they controlled most/all of the hardware engineering) -
Yeah, same thing for every other console on the market at the time. That wasn't going to change until consumer interest in the VCS waned. (that happens to most new systems -they may get support, but until the older mass-market leader declines, they aren't going to get similar or better support) Hell, Activision (or Crane at least) actively pushed to extend the life of the VCS with the DPC chip in Pitfall II. (planned for more games, but canceled because of the crash apparently -not sure why it wasn't re-introduced in '85/86 when 2600 sales picked up again) It wouldn't have made much sense to publish for just one or the other when the architectures were so similar. That's one of the better design points of the 5200, being easy to cross-platform develop for with the A8. (granted, it's got a lot of other flaws -like being more expensive than the CV or 400 when it should have been cheaper, among other things -both technical and political/bureaucratic)
-
Playing Burned Games on Classic Consoles?
kool kitty89 replied to CGQuarterly's topic in Classic Console Discussion
Actually, there are notable cases where pressed discs are MORE DIFFICULT to read than (properly) burned CD-Rs. (by properly, I mean burned at a constant speed, not variable -some say to burn as slow as possible, but the main issue is really just being at a constant speed) The Sega CD is well known for being relatively tolerant of burned discs (even using somewhat poor quality CD-Rs -nonlinear write speed is still a big issue), but the commercially pressed discs made (in China) for Pier Solar have consistently proven to be unreliable. The main solution is to burn a backup copy. Also, it's pits and lands, not valleys. (I think valleys may sometimes be used in place of pits though)

The 3DO (especially the FZ-1) is one of the absolute worst cases of classic consoles with burned discs; it's extremely finicky. Again, slow burning speed doesn't matter so much as constant/linear speed does. The problem comes when you push 32-48x or "optimal" burn speed options, which are NOT constant speeds. I think 12-16x is still constant, but I'm not positive. (8x is definitely fine)

It's the reseeking process that will wear the drive faster; the same problem happens with worn/scratched official copies of games. That problem is magnified for Dreamcast "backups", as the data has been reorganized to fit onto a normal CD-R rather than the higher capacity GD-ROM, so there's a lot more seeking. Keeping the drive well lubricated (ie with silicone lubricant) will also help to reduce wear.
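To make the constant-vs-variable point concrete: a "constant" burn is CLV (constant linear velocity), while the high-speed "optimal" modes are CAV or zoned-CAV, where the spindle speed is fixed and the effective write speed drifts across the disc. Rough geometry (my own approximate figures for a CD's program area):

r_inner, r_outer = 25.0, 58.0        # approx. program-area radii in mm
# CLV: data rate fixed, spindle RPM drops as the head moves outward
# CAV: spindle RPM fixed, so linear write speed scales with radius
print(round(r_outer / r_inner, 2))   # ~2.32x
# ie a nominal "48x" CAV burn starts around 20x at the hub and only
# reaches 48x at the rim -- the write speed is never actually constant
-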
It's hard to imagine Activision wanting to help Atari! If I were in their shoes, I would be doing everything possible to make the 5200 a failure. Sure, Activision was using the 2600 to earn money, but only as long as they had to. The moment they could make enough money by supporting Atari's competitors, they did exactly that. I'm not trying to knock Atari here, just thought I should remind you that Atari sued Activision over and over again, and tried desperately to put Activision out of business and/or prevent Activision from ever releasing another game for an Atari console. So if Activision was complicit in the demise of the 5200, it was Atari's just reward. Suing your third parties is not the best way to get their help on your new project. - KS

Atari did that with some other 3rd parties too . . . albeit it was the initial losses in court against Activision that really opened the doors to 3rd party publishing. It made sense for Atari to want exclusivity in a market where they had no ability to enforce licensing contracts with 3rd parties (sure, Nintendo managed to do that in Japan with absolutely zero lock-out in hardware/software, but the Japanese market/business culture is a bit weird in general -ie guess what would have happened with the NES if it was as insecure as the Famicom). It wasn't until the 7800 that Atari had a machine with a lockout scheme (the best scheme on the market for several generations, for that matter), but that ironically ended up coming at a time when Atari (Corp) would have been better off with totally free and open 3rd party publishing. I rather doubt Activision would have wanted to "kill" Atari; they may have had a bone to pick with them, but it made good business sense to keep supporting them. (and many Activision games WERE best on Atari consoles or computers -later on they tended to favor the C64 more though, and then Activision fell out of its creative-freedom years into more of an average "big business" publisher)

As for the main topic: software wasn't ever one of the main problems with the 5200. The launch lineup was somewhat weak, but otherwise it had a very good showing of games for the time.
-
5200 VS 7800: Which System has the Better Library?
kool kitty89 replied to Dr Manhattan's topic in Atari 5200
Again, I do agree that calling the 7800 a "souped up 2600" is totally wrong; if anything, the A8 chipset is more applicable in that sense. (incompatible, but with much more low-level hardware similarity to the VCS than the 7800 has -comparing it when running in 7800 mode, of course) It's a bit of a shame that the A8 chipset wasn't pushed a bit further in that respect and given full backwards compatibility with the VCS. -
Exactly, even if they did lose some of their high-end customers to the low-end models, they could have gained far more from others who wouldn't have bought anything at all. Plus, they'd also be profiting from all the peripheral sales for the low-end models (they'd be expandable for sure, as I originally mentioned). And, again, they logically would have positioned the full multi-slot expansion module to be more expensive (with the price of the low-end II included) than a standalone model with built-in slots. And, again, not only could they have pushed for lower cost models, but they could have repackaged the system in a form factor to be even more competitive against PCs (a proper modular desktop box).

Yes, regardless of broadening their range in the market, they definitely should have focused on extending/evolving/supporting the Apple II. It was a simple machine, but very flexible and expandable, the same things that allowed PCs to dominate the mass market. (actually, the simplicity also was a big part of that: easy to clone . . . had the ST been an open architecture design from the start, it probably would have had a lot more potential to expand similarly -if not in the US, at least in Europe) Atari engineers had initially wanted to push the 800 with Apple II-like expansion, but upper management forced it to be an "appliance computer". (in reality, they could have done both: make the 800 a proper Apple II-like system -with more integration and cost effectiveness- and pushed more limited lower-end models with the 400 and in between -better if the 400 had 1090XL-type expansion support too though) Expandability and compatibility is the name of the game, and a few companies got that right, but oddly, many of them ended up screwing it up later on. (Tandy did continue support with TRS-80 compatible machines, but they didn't do a good job of supporting it as an open standard -though it had certainly started as such- and in particular, they introduced the totally incompatible CoCo overlapping with the original TRS-80 market sector -granted, the Model II had also been largely incompatible with the Model I, but you could argue they could have managed those 2 lines in parallel for a mid-range/low-end and a more high-end/business oriented machine -or even merged the standards later on, as they were fairly similar overall)

To expand the overall market and offer a wide range of machines. (and expand into Europe where low-end machines dominated) Again, I'm not saying they should have dropped the higher-end stuff (they should have pushed harder into the top end with the II family), but I am saying that they should have broadened the line into a full array of compatible models that could cater to most of the mass market in general.

Woz disliked the III. And he didn't even like the Macintosh. He said "let's develop a GUI for the II" but was shot down. The IIc had color but the original Macs were black and white. And the IIc was a good looking machine. I can see why Woz wanted to develop a GUI for it instead of going with the Mac line. Development of the Mac started in about '82, when the II was still a huge seller.

Heh, the Apple II had color back in 1977. Woz had a lot of good ideas that ended up getting crushed by the idiots running the company (especially Jobs). A shame Woz couldn't somehow stage a coup to drive them out of the company. Actually, Jobs and Woz are a bit like Nolan Bushnell and Ted Dabney; both stood in the limelight and took credit that wasn't theirs (or not theirs alone).
Hell, Jobs even learned a lot of his sensationalist PR BS from Nolan while at Atari. Of course, one major difference is that Dabney was forced out of Atari pretty early on, while Woz is still with Apple (more or less). A shame that Jobs couldn't get his head out of his ass long enough to realize how successful the company could be if they paired his PR skills (and general marketing) with Woz's extremely insightful engineering visions and capabilities. People like to think of Jobs as a visionary, but if anything, Wozniak had the best foresight of anyone in the company (if not some of the best of anyone ever in the industry). The Apple II was probably the most innovative and forward-thinking design to ever sport the Apple brand name; a shame they ended up running it into the ground. (rather deliberately too, especially as demonstrated in that article)

The Zip Chip. Which is the exact route they took for the IIc+.

Yeah, some years later, by 3rd parties (and finally with the IIc+). They could have been pushing 2-3 (perhaps 4) MHz NMOS 6502s years earlier. Aside from the previously mentioned wait state and faster DRAM options, they also could have implemented a small local SRAM buffer or "cache" (not a real dynamic cache, but similar in some respects) for the CPU to run at full speed off the main bus. (you could use that RAM for zero page registers and small amounts of high-speed code and data which could be updated as needed in software -sort of a software-managed cache; a rough model of the idea follows this post) They could have made the cache optional and expandable as well. (either via a special card slot or a DIP socket; if they introduced that in the early 80s, it would probably be most cost effective to start with a single 2kx8-bit SRAM chip with provisions for more, and perhaps for 8k chips as well -using a slot would have been the most flexible and taken the least board space)

The DHGR mode could be treated as 140x192x16 (using the same 16 colors as the 40x48 mode); I think at least one video card also did so.

Yeah, like that. Something like the //e card for the Mac LC line? No, that was just tacked onto the Mac; I'm talking about a ground-up design that makes use of the old hardware reasonably efficiently as an integral part of the system. (then again, there's plenty of argument for never going with the 68k at all and sticking with 650x compatible chips -you could even push custom or off-the-shelf coprocessor logic to address some shortcomings of the 650x, and there's tons of other potential for hardware acceleration for graphics and/or sound -then again, if the 650x had had higher demand, perhaps it would have been extended further in general, perhaps with additional FPU coprocessors as well)

Well, the //e Platinum (which I own) had a keypad, and there was a prototype IIgs with internal drives.

Yeah, but I'm talking about a full PC-like form factor with market positioning to match. They could have broadened the market in general though, catering to lower AND higher ends than they were at the time. The problem is that no company even tried that, and those that came close totally f*cked it up in one way or another (usually several ways). IBM almost managed to do that with the PCJr, but totally botched it . . . Tandy got it right and DID manage to offer a pretty wide range of machines at that.

Hmm, so it's not just a fixed location that gets banked to?
(like the 130XE scheme -I think the CoCo III used that sort of scheme too)

Actually, from a hardware perspective, the Apple II was cheap, but it never got the positioning to be "cheerful" or cheap on the consumer end. And AFAIK, Sinclair was ALL about invention and innovation; hell, the engineers at MOS/CBM ended up being pretty damn innovative too (just not marketed quite as such). A shame that most of Sinclair's projects ended up being market failures. (the ZX80/81/Spectrum were the main exceptions)

Honestly, I think that once the Apple II had served its purpose (in Jobs' mind), he couldn't have cared less about it. What's fascinating about him is that he doesn't have a shred of nostalgia for past achievements. He'll be thrilled to have his company popularize the next big paradigm shift in popular electronics, and will then gleefully bury it as soon as he can for the next big shift. http://webcache.googleusercontent.com/search?q=cache:QgPhaEoL_jAJ:www.netherworld.com/~mgabrys/clock/weak5.html&hl=en&gl=us&strip=1 This paints a pretty clear picture of what Apple management's mindset was regarding the Apple II, and it actually didn't depend on Jobs' presence.

That link isn't working, so, here: http://replay.web.archive.org/20080820003428/http://www.netherworld.com/~mgabrys/clock/weak5.html

That's really sad, but definitely an amazing insight into Apple. Woz (and his supporters -and the Apple II guys in general) had it right, but Jobs/marketing ruined it with the disasters that were the III, Lisa, Mac, etc. (and all the while downplaying the Apple II as obsolete crap) With that in mind, my whole supposition on expanding the II to low (and higher) end machines seems rather insignificant. They just needed to push the Apple II more in general, regardless of broadening the market model. (hell, a broader range would have been inevitable with more support in general)

To be fair, there WERE several companies who actively pushed legacy support, though many in an awkward manner. Atari Inc kept pushing the A8 (and were finally addressing some of the long-held limitations of the system with the 1090XL and such -albeit something engineers had wanted since day 1), and Tandy extended the TRS-80 with the Model III and 4, though they ended up limiting their market in general with rather modest enhancements and the incompatible CoCo in the low end. (the Model II and related line was incompatible too, of course, though probably close enough to merge with the Model I/III line if they'd wanted to later on) Hell, Bushnell (and some others at Atari) had wanted to kill off the 2600 back in '78 when it was slipping (and had intended it to be a short-lived machine), but Warner management ended up pushing on with the system, and it ended up getting absolutely massive in the following years and remained marketable through the late 80s. Warner made plenty of other mistakes, but keeping the 2600 going certainly wasn't one of them -limiting the 8-bit to an "appliance computer" was one of those problems, as was the fatally flawed distribution network. (and there's plenty of other areas where Warner Atari management was significantly better than Bushnell Atari had been, like better business sense in general -a shame they didn't have someone like James Morgan in there back in '79, with more effort to limit Warner interference on top of that)
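As a rough model of the software-managed SRAM "cache" idea a few paragraphs up (toy numbers of my own, not anything Apple actually built): the effective CPU speed is just a time-weighted average of fast SRAM accesses and wait-stated DRAM accesses:

def effective_mhz(cpu_mhz, dram_mhz, sram_fraction):
    # SRAM accesses run at full CPU speed; DRAM accesses are throttled
    cycle = sram_fraction / cpu_mhz + (1 - sram_fraction) / dram_mhz
    return 1 / cycle

# hypothetical 4 MHz 6502 whose main DRAM can only keep up at 2 MHz:
for frac in (0.0, 0.5, 0.8, 0.9):
    print(frac, round(effective_mhz(4.0, 2.0, frac), 2))
# prints 2.0, 2.67, 3.33, 3.64 -- since zero page plus a hot loop covers a
# big share of 6502 accesses, even a small SRAM captures most of the speedup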
-
Nope. You're confusing this machine with the Atari Falcon.

I know the Falcon (and some Macs) opted for 16-bit wide buses (in the Falcon's case, probably more due to cost than compatibility -for compatibility, they could always have dropped to 16-bit modes while also supporting a 32-bit bus; with 16-bit alone, you cut back on traces used), but I'd gotten the impression from several people that the 1200 used a 16-bit bus as well. (actually odd that the Falcon didn't use a 16 MHz 68EC020 rather than the 68030; that would have been a much better out-of-the-box trade-off than the 16-bit bus -if they wanted really low-cost versions, they could have opted for a 16 MHz 68000 for that matter -you had the blitter and -more so- the DSP to take up a lot of the grunt anyway -the lack of a fastRAM bus on the Falcon was also unfortunate, but at least it had packed pixel support -and 16bpp graphics for that matter- unlike the AGA chipset, though I think packed pixels may be limited to 16bpp modes with 8bpp and lower all planar -which would be unfortunate) Maybe the 16-bit comment was for chipRAM and not fastRAM. I'm not sure how they could have managed 24 bits for 2 MB without some weird addressing or wasted RAM. (wouldn't 24 bits also be a pain for the CPU to work with -maybe it drops to 16 bits instead)

Again, I think it may have been chipRAM that was being referenced with the 16-bit comment, not fastRAM. (which also means the A1200 and CD32 would have been limited to 16-bit work RAM out of the box) Maybe 16-bit chipRAM, but it seems like fastRAM was definitely 32-bit.

The A1200 wasn't really a castrated 4000 though . . . it was a low-end model using some of the same advances as the 4000 had done. (albeit CBM really should have pushed 14.3 MHz A500 derivatives prior to that, perhaps even '020-based lower-end models prior to AGA as well) I don't see how the A4000 was cheap crap at all. It was better in every respect than the 3000 (albeit less so for the time) out of the box. I agree that AGA was really a poor upgrade to the overall system (no blitter upgrade at all, no packed pixel support, etc -albeit the CD32 supported hardware packed-to-planar conversion); it wouldn't have been so bad back in '89 or a bit earlier even (or similar but limited to 15 or 12-bit RGB), but they really should have been aiming more at something on par with or better than VGA/SVGA. (packed pixel graphics, fast page mode DRAM support, and a blitter upgrade to cater to that, rather like the Amiga team went on to do with the Lynx blitter -and CBM should have had that by '89 or '90) The ST and Amiga both lagged badly in terms of hardware upgrades (especially 1st party) and general successors/evolution; Atari was actually better off in many respects as far as upgraded hardware went (far from what they should have done though), and the Amiga only remained more attractive as it had been more capable from the start.
(Atari -and PCs- thus had a lot more room to pull ahead of the aging Commodore hardware, but Atari ended up doing a mediocre job of pushing that -albeit the Falcon chipset WAS better than AGA at least, and the TT was arguably better than the A3000 on a technical level too- but PCs definitely pulled ahead rapidly -a mix of advantages and trade-offs in the late 80s, but pretty much unmitigated advantages by the early/mid 90s -with the exception of still not catering to analog SDTV-resolution editing, though the arrival of digital video editing addressed that as well)

Well, more like the 1200 was a much lower-end model out of the box without any fastRAM and using a much less powerful 68020 (or 68EC020 to be specific -not really notable since all it did was cut the pin count by limiting the address bus to 24-bit/16 MB -unlike the 386SX, which also cut data lines to 16-bit so you didn't have the option to use 32 bits at all and also couldn't use external caching). It was a lower-end machine, so that's really sort of a given. (I agree with the AGA comments to a degree -not so much in the specific reference to AAA, as that was way overkill in some areas, but AGA was definitely weak for the time: if AGA had had all its current features plus packed pixel support, buffering for efficient fast-page-mode use -and/or wider memory buses- and a blitter to match -maybe even with some added support for 3D acceleration assistance- then it would have been pretty good for the time, but as it was, AGA added very little of practical use outside of static (or minimally animated) screens; the lack of highcolor also hurt it a lot for the time -the Falcon was lacking too, but it was a lot closer to market standards at least -and the DSP meant it DID have a decent route for 3D acceleration too)

Right, and the guy is full of shit talking about 16 bits. The expansion bus *is* 32 bits wide, just like the CD32, A4000 and A3000.

A shame the CD32 didn't use fastRAM out of the box . . . 1 MB fastRAM and 1 MB chipRAM would have made it a lot more usable. (still really underpowered for the time, but a fair bit better off at least) An Amiga-based game system would have been great for '89 (or even as late as 1991), but they pushed the extremely dated (and poorly executed) C64GS instead, followed by the (again, poorly executed -also rather niche) CDTV.

Ahh jeez. Have you been spending time in the Jaguar forum or what?

Except it's the opposite: the Jag's 64-bitness comes from the external bus. (internally, the GPU/OPL/blitter are a mix of 32, 64, and heavily buffered 16-bit logic -not buffered for texture mapping, unfortunately) 24-bit data bus? (not address bus)

More than that too, since the CPU and coprocessors can both run at full bore. (so the blitter can spend far more time on the bus, on top of the CPU running at full width without wait states for coprocessor access)

Is there ANYTHING in the AGA chipset that's 32 bits (or even anything other than 16 bits)? I was under the impression it was the same old 16-bit blitter at the same speed. (a higher clock speed would do little since you're limited to random-access DRAM bandwidth anyway -it would help a little, going from 280 ns closer to 180-200 ns; a 14.3 MHz blitter could do 3-cycle delayed accesses at ~210 ns vs the 280 ns of the OCS/ECS -worked out at the end of this post; you'd need fast page mode support or a wider bus to get more bandwidth than that) 24 bits is just the depth of the master palette; the display depth is limited to 8-bit planar graphics or less.
It's generally worse than VGA (let alone SVGA), though you could use the blitter to handle packed-to-planar bitmap conversion and be fairly close to unaccelerated VGA with a similarly powerful CPU. For games, the best thing about AGA is probably the extended dual playfield support for two 4-bit layers rather than the two 3-bit layers of the OCS. (at least I think it supports that) That limits color of course (two 15-color playfields rather than a single playfield with 256/64/32/etc colors), but it allows faster rendering due to the dual framebuffers, just like the OCS dual playfield. (which was limited to two 7-color layers -plus the border color)

Of course, if CBM really was having trouble extending the old architecture, they could have primarily focused on consolidating/cost-reducing the existing hardware (with some basic enhancements that didn't require really intimate knowledge of the original designs -doubling Paula's audio would have been nice and pretty simple, maybe supporting the 14-bit channel pairing in hardware too, rather than having to work around it with dual 8-bit streams at the proper offsets). And then just "tack on" hardware upgrades with the old hardware embedded for compatibility, or something like AGA (more so with basic packed-to-planar conversion logic like the Akiko chip added) with more powerful additions outsourced or bought off the shelf. (various DSPs could have helped a great deal for multipurpose coprocessing -sound, graphics, math, etc- or TI's TMS340 series GPUs for that matter; very flexible, and the older 34010 series might have been cheap enough to add by the early 90s -also a pretty flexible/general purpose chip, optimized for video, but capable of various DSP-type tasks and even CPU-like tasks; good for 2D and 3D acceleration and could be useful for audio processing or general fixed/floating point math coprocessing as well) The other route would be going mainly with CPU grunt, but that's far less cost effective in general.

Hmm, it would have been interesting if Atari had gone more in that direction with the Falcon: have bottom-end models with a 16 MHz 68k, no FPU (maybe a socket), VIDEL video, fastRAM expansion, but a TMS34010 for general purpose coprocessing (best for graphics, but also good for sound, math, etc -and it seems to have been much easier to program than most traditional DSPs, including the MC56k; very general purpose, more like a CPU but with different performance and feature optimizations), probably with 32-bit access to video RAM (if not already using 64-bit wide shared RAM like the TT SHIFTER -not sure what VIDEL uses) and a general purpose expansion port in addition to some internal sockets. (and higher-end models with '020s, '030s, '040s, etc -I think the TMS34010 may be faster at floating point math than a 68881/2, and I know it's faster than a 287, though it's probably significantly slower than the '040 FPU) Of course, if Atari had been further ahead with their computers in other ways, they could have waited a bit and added the Jaguar chipset instead of anything off the shelf. (with the bugs worked out, the Jag GPU should have been generally better than the 340, and obviously more cost effective for Atari due to owning the IP -though you've got R&D costs and production volumes to consider too)
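For the blitter bus-timing numbers a couple of paragraphs up, the arithmetic works out like this (cycle counts are the assumptions stated in the post, not measurements):

def access_ns(clock_mhz, cycles):
    return cycles / clock_mhz * 1000

print(round(access_ns(7.16, 2)))   # ~279 ns per OCS/ECS blitter access
print(round(access_ns(14.3, 3)))   # ~210 ns for a 14.3 MHz, 3-cycle access
# on a 16-bit bus that's only ~7.2 vs ~9.5 MB/s -- hence the point that real
# gains needed fast-page-mode bursts or a wider bus, not just a faster clock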
-
The ST and Amiga were all about multimedia to me, too, but I was a kid at the time. In the mid-80s, it must have been hard to decide how much "fun" to put into a computer. IBM and Apple were printing money with machines that didn't care about multimedia and games one bit. (At least the PC had scroll registers, unlike the Mac!)

Huh, CGA had hardware scrolling? I hadn't realized that . . . (EGA too I imagine, or the PCJr/TGA for that matter). I knew VGA had scrolling (and packed pixels in 256-color mode, and some rudimentary hardware acceleration), but not the earlier ones. Also, it would have been the Apple II that was largely printing money up into the late 80s, not the Mac. (in spite of Apple's attempts to kill it off)

You're thinking of the US market though, not Europe. Europe only lost interest in "fun" computers because there stopped being competitive machines on the market to satisfy that. (you went from 8-bit and 16-bit computers dominating the game scene with consoles well behind in the mid/late 80s, to consoles gaining to the point of being on par with computers by the early 90s, and then consoles pulling ahead in the mid/late 90s) In the US, you went from the C64 being dominant in the mid 80s to consoles coming back in full force by '87 and PCs gradually expanding their multimedia capabilities. (to the point where a decent game/multimedia PC could largely match -or exceed in some respects- contemporary game consoles by the mid 90s -more so in the late 90s when hardware acceleration really got common) You had a boom of PC gaming in the mid/late 90s that tapered off to an average secondary market (more or less) in the early 2000s. (IMO PCs are still generally better values for gaming than most consoles, at least if you focus on buying upgrades at the best prices -and build a machine that's well suited to upgrades; it's still not going to be cheaper out of the box than a console, but the upgradability and -much more so- the lower cost of games -new and used- makes it far more attractive -granted, DRM has gotten pretty ridiculous on PCs, so there's that hassle to deal with -not quite as bad on consoles by comparison)

As I mentioned in my response to oky2000, I don't think Atari Corp should have done that at all. They should have aimed at a decent, flexible, low-cost machine out of the box with flexible expandability and a range of models to cater to different market sectors. (many being nearly identical internally, but differing in form factor, built-in expansion support, etc -and the high-end workstation models with exclusive features) Even the low-end models should have at least had a general purpose expansion port (RAM, coprocessors, sound, video, etc -though some things could make more sense to expand via socketed chips internally), but offer a full expansion chassis (like the 1090XL) and desktop models with that expansion built in. (or at least big box models like that; the pizzabox could require an expansion chassis as well for full expansion) The ST probably would have stayed ahead of the Amiga in Europe a lot longer if it had had some basic upgrades that pushed closer to (or ahead of, in some respects) the Amiga's capabilities without compromising cost. (hardware scrolling was the biggest issue -offering faster CPUs would have been significant across the board; scrolling was also significant for non-game applications, and a faster CPU obviously would be as well -something the Amiga lagged at severely too)
But for the market in general, they shouldn't have been aiming at the Amiga at all, but rather focusing on expanding the ST based on its original merits and on emerging PC standards. With the clone market the way it was in the US, it certainly would have made more sense to push that angle (Atari DID do that starting in '87, but I'm not sure how well they managed); as such, the ST line could be supported to cater to niche markets in the US while it remained mainstream (if not dominant) in Europe.

But what time period are you talking about for Atari and Commodore? (are you talking about Atari Inc marketing -indeed Atari Inc HAD been pushing for PC compatibles alongside expanding their 8-bit line; or are you talking about TTL/Atari Corp -a totally different company with a different agenda for the computer market)

Yes, Tandy probably could have done even better if they'd expanded their distribution beyond Radio Shack and maybe even pushed into Europe. IBM totally missed out on the lower-end/mid-range game/multimedia market when they screwed up the PCJr. Had the PCJr been like the Tandy 1000, IBM probably would have had a winner, and more clones of that standard would have popped up in general. (as it was, it's rather odd that PCs didn't have any sound cards at all until 1987 -aside from the Covox DAC; really strange)

Again, I don't think TTL marketing was ever pushing for PC compatibles at all (if they even had marketing teams), and if it hadn't been for Tramiel, Atari Inc probably would have jumped into the PC market back in '84 or '85. (they'd been pushing for '83 even, but Morgan's reorganization efforts halted a lot of projects in the interim) Of course, alongside the PC machines, Atari Inc also had several advanced (beyond Amiga) 68k workstation-class machines fully prototyped back in '83 (shelved with the reorganization). They'd been planning on having the Amiga chipset for a game console released in late 1984, but Amiga cheated them out of it with false delays (claiming they failed to produce working chips) in June of '84, and then there was the accidentally cashed return check that voided the contract (all happening within days of Warner's liquidation of Atari in early July). They also had a BSD Unix based OS with GUI intended for either their own 68k projects and/or the Amiga based design. So, under Morgan, Atari Inc might have continued pushing the 8-bit line, added a PC compatible, and probably pushed the 68k based Gaza or Sierra (or a more mass-market friendly configuration of the same chipset -less high-end workstation level), with the 2600 Jr and 7800 all being pushed in 1984/85. That also probably would have meant no good low-end 16-bit computer for Europe until the late 80s, if that. Not so bad after the 386 came on the scene with flat 32-bit addressing in protected mode though, but that's a pretty wide gap. That, or 650x based machines still being expanded/evolved through the late 80s.
(faster 6502/C02 derivatives, '816s, maybe better if the market demand spurred further architectural enhancements -but even more programming pains than x86 in some respects -or you could have had Z80 machines being pushed, maybe even the Z800/280 getting significant use)

Given the market in 1985, PCs hadn't quite saturated things yet, so Commodore or Atari Corp COULD probably have pushed in a lot further than they ended up doing (in the US), but Atari had funding limitations (mainly limiting marketing) and failed to push expandability or business-friendly form factors until years later (and generally failed to offer a flexible range of machines), while CBM ended up with a powerful but not so user-friendly OS (not bad compared to DOS at least) and got a bit weird with their market positioning on the machine. (they also lacked a wide range of machines until '87 -they should have been pushing lower-end and higher-end models than the 1000 much earlier, and the high-end models needed more features -the 2000 should have had a 14.3 MHz CPU and an FPU option)

Also, aside from direct PC compatibility, both could have pushed features to facilitate cross-compatibility, like PC compatible disk formatting (probably offering 5.25" accessory drives for better cross-compatibility) and applications that went cross-platform with file compatibility. (they could have pushed PC emulator boards much sooner too, but having cross-compatible media would have been far more significant, especially for the business market) The ST would have had an easier time with that since its floppy formatting was already very close to PCs (I think the file system may have been a bit different for TOS/GEM than DOS though -see the sketch after this post), and all you needed was double-sided 3.5" and 5.25" drive options to really facilitate cross-compatibility. Those are all things that made the Mac far more mainstream later on, but that was MUCH later, and Atari/CBM could have had a massive head start on the competition.
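On the disk format point: ST floppies are FAT12, and the BIOS parameter block in the boot sector matches DOS field-for-field (as far as I know, the differences are mostly in the boot code and TOS's checksum/serial handling). A minimal sketch that pulls the shared fields out of a raw dump -- the filename is just an example:

import struct

def read_bpb(path):
    with open(path, "rb") as f:
        boot = f.read(512)
    # little-endian fields at fixed offsets, common to DOS and TOS FAT12
    bps, spc, res, fats, roots, total = struct.unpack_from("<HBHBHH", boot, 11)
    return {"bytes/sector": bps, "sectors/cluster": spc, "reserved": res,
            "FATs": fats, "root entries": roots, "total sectors": total}

print(read_bpb("disk.img"))  # eg 512 bytes/sector, 2 FATs, 1440 sectors for 720K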
-
5200 VS 7800: Which System has the Better Library?
kool kitty89 replied to Dr Manhattan's topic in Atari 5200
Yeah, but if you couldn't have the 8-bit in there at all, which would be preferable? The expansion port was pretty pointless . . . laserdisc was a neat pipe dream, but a pipe dream nevertheless. (or if they HAD implemented it, I highly doubt it would have gone beyond a tiny niche market -maybe it would have attracted Digital Pictures after they lost support for their original plans -then again, Zito never seemed to take interest in a laserdisc based console, oddly enough; that would certainly have been the most viable option prior to CD based systems in the early 90s -he ended up waiting about 4 years to bring Night Trap and Sewer Shark to market because of that)

That's much more Warner's fault than Tramiel's, for putting Atari Corp at odds with GCC to the point where neither wanted much to do with the other by the time the 7800 ordeal was finally cleared up. (GCC was almost certainly fed up with dealing with those problems, and Tramiel probably was frustrated enough about having to shell out more for something that should have been part of the original deal) Besides, Atari Corp hardly needed GCC to manage the high score cart or a computer expansion (hell, they could have done a lot better than the joystick based keyboard interface GCC was suggesting), though losing GCC as a software developer probably hurt a bit. (mainly due to Atari Corp's generally weak position for developer support in general, albeit that was largely due to Warner's mess of a transition that put them in a much worse position than Atari Inc had been immediately prior -or even TTL in some respects- and of course, also related to Nintendo blocking Japanese -and later many US- developers from publishing for Atari -or anyone else) Atari Corp's general financial position and Nintendo's ability to pull off illegal policies are what really hurt Atari Corp in the US market. (in that sense, GCC might have been more important in being able to get more bang for the buck than some other developers -ie squeezing more out of the system within the tight budgets being pushed) But the computer add-on and high score cart concepts could have been handled rather easily by Atari Corp themselves. (either in-house or outsourced) Same for the sound expansion for that matter. (with the exception that the save functionality might not have ended up directly compatible with the original high-score supporting games)

I wanted to start a separate topic on this (I still might at some point), but: the cart slot was the REAL expansion port on the 7800; that's why the XM was possible, and something like that could have been very attractive (for Atari Corp, developers, and consumers, if marketed right) back in the late 80s. In particular, I'm thinking of such hardware being implemented before any add-ons were introduced on carts (ie before the 1987/88 games with 32kx8 SRAM chips or POKEYs came around), instead implementing an add-on supporting similar features as a one-time cost. (perhaps releasing a "7800 Plus" console with that built in) If they DID use SRAM like the Epyx games (and not gone with DRAM), they also could have included a battery and reserved a small chunk of the SRAM for save data. With a POKEY and RAM onboard, you also have the makings of a nice computer add-on interface.
(ie they could have an SIO port and keyboard interface port for a separate keyboard expansion and disk/tape drive -or bundled versions of the add-on/system with the keyboard -tape could be especially important for the European market, especially if they'd implemented a much faster FSK decoder) If they REALLY wanted to make it cost effective, they could have made 4k of the 32k SRAM unused on the add-on side, and mapped that to the original 4k SRAM for the integrated models (doing away with the two 2kx8-bit SRAMs) Hell, they could even have made use of the POT lines for additional controller inputs. (you could wire it like a conventional VCS/A8 joyport with some of the analog lines wired to function as simple digital I/O ports, but that would only give 1 usable joyport -for 2 you'd have to rely more on analog and couldn't support all the normal digital lines . . . unless you had 2 different mapping schemes depending on whether analog or digital modes were enabled; then you could have 1 fully functional joystick/pad controller port or 2 paddle compatible ones -a sketch of reading a pot line as a switch follows at the end of this post) Such an add-on/upgrade probably would have made a lot more sense for Atari Corp given their market position and low-end cost aim for games. (having support for cassettes could have been very significant in Europe as well, especially with fast load times) That sort of add-on also could have addressed the primary goals of the XEGS without undermining the 7800 and confusing the market. (albeit it would lack some of the "bonuses" of the XEGS of using up stockpiled resources -other than POKEYs and existing SIO compatible accessories) Which was a bad engineering move and generally bad trade-off. Supporting sound expansion was good, but mainly if you're thinking of doing it years later when things become far cheaper. As it was, it would have been far more cost effective to slap a POKEY or (short of that) an SN76489 (maybe an AY PSG) on the 7800 in the first place; even more so if they'd been proactive and collaborated with Atari Inc engineers on areas they were having troubles with. Even better would have been plans for a standard add-on rather than embedded cart hardware. (one time payment, more attractive all around, though a bit tougher to market) As it was though, I wonder why Atari Corp didn't push POKEY more, since they seemed to have a considerable stock of them . . . and after that started to dry up, they could have switched to cheaper off the shelf alternatives like the basic SN76489 or the (better) Yamaha/AY PSGs available. (especially the cut-down 16 pin Yamaha versions becoming available in the mid/late 80s -or the full YM2149 if they wanted to add 2 more controller ports on cart a la Codemasters) 3rd parties also would have been fully free to do as they liked for expansion; it's just that the system got virtually no 3rd party support back then, let alone developers who wanted to invest in added hardware on-cart. A short term problem that was resolved almost immediately. Atari Corp was in a very tight position financially up to 1987 at the very least, but they managed to pull through in spite of the mess Warner created. (you can fault Jack for some things, but it's really hard to tell just how things could have turned out if a proper transition had been organized -it's obvious Atari Inc under Morgan could have been much better off, but that was Warner's fault, not Tramiel's -as it was, Tramiel and Co were lucky to salvage the mess Warner forced them into) No, they just took longer to arrive.
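On the POT-line idea above, here's a tiny C sketch of reading an analog line as a digital switch. The register addresses are the A8 POKEY pot registers (POT0 read at $D200, POTGO write at $D20B), but the wiring itself is hypothetical, so the threshold and its direction are pure assumptions for illustration:

#include <stdint.h>

#define POT0   (*(volatile uint8_t *)0xD200)   /* pot 0 counter (read) */
#define POTGO  (*(volatile uint8_t *)0xD20B)   /* restart the pot scan */

int read_line_as_switch(void)
{
    uint8_t v = POT0;   /* under the assumed wiring, a closed switch
                           charges the line quickly -> a small count */
    POTGO = 0;          /* kick off the next scan */
    return v < 0x40;    /* arbitrary open/closed threshold */
}

The point is just that a pot counter read plus a compare is all the "digital" decoding such a scheme would need, at the cost of one scan period of latency per read.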
Also, I'm not sure any Nintendo or Sega games of that era ever pushed 32 kB of SRAM on cart; that's pretty significant for those Epyx games. Except that was an extreme rarity even on contemporary consoles. Even the Genesis was extremely sparing on battery saves. And the fact that most A-plus developers (as far as the mass market was concerned) were in Japan anyway. Albeit, Europe had a LOT of untapped potential Atari could have taken advantage of; that's something I really wonder about. (ie if Katz ever seriously considered pushing hard for European computer developers rather than the limited US computer developers they largely ended up with -especially given the number of compelling budget games being produced, ideal for the low-cost angle they were pushing) And aside from Nintendo, there were Atari Corp's general funding limitations and the hefty delays that Warner had induced. (again, Atari Corp probably wouldn't have directly followed what Morgan had been planning either way, but there's absolutely no doubt that Warner's complete and utter mismanagement of the split/transition/sale weakened Atari Corp's position far more than any management changes Tramiel made) Again, that would have been limited by their funding . . . and the lack of 3rd party support to go above and beyond what Atari Corp was pushing out of pocket. However, can you think of any pre-crash Atari games that pushed anywhere near the budgets of the higher-end examples seen in the late 80s? (not talking about stupid management decisions paying exorbitant licensing fees or marketing/special competition costs, but actual R&D/programming/development investment) Again, a funding issue, and by the time they were in a real position to push hard (in terms of cash flow and investment interest from creditors -to allow deficit spending for some heavy hitting investments -risky, but the best chance to really push things, and something Jack had been willing to push at CBM), Jack had stepped down and been succeeded by his (apparently) far less capable son, Sam, and they also lost Michael Katz shortly thereafter. In spite of their weak software support, funding issues, etc, etc, Atari Corp had managed to outsell Sega in the US by a considerable margin (much more so if you included 2600 sales), and had also managed to successfully establish a new 16-bit computer system on the market (somewhat niche in the US, but the dominant 16-bit computer in Europe until the very end of the 80s -and if you go by aggregate sales rather than market share, probably a fair bit longer than that). Now, it's not like Atari Corp was free from their own mistakes prior to Sam taking over, but overall they'd managed to get enough right to pull through. Some things are pure hindsight, but others seem odd even when trying to consider the perspective at the time. Things like hesitating when offered the Megadrive in mid 1988 are rather understandable, for example, and targeting the low-cost market sector to differentiate from the competition also made sense, but things like repeating the A8's mistake of omitting expansion on the ST (even a simple/compact/cheap general purpose expansion port), not pushing desktop models of the ST from the start (or ASAP), not pushing faster CPU options (or maybe even FPUs or workstation class big-box models), pushing the XEGS in 1987, etc, etc all seem rather odd.
If they'd decided to ditch the 7800 and push a direct 600/800XL based console back in '84, that would have made sense, but in '87 it was neither here nor there for bolstering the computer line or offering a "high end" game console -if they wanted either of those, the ST, especially with the Blitter and maybe a sound upgrade (especially if cut back in other areas with provisions for expansion to full ST spec), would have made a lot more sense. (or, alternatively, an upgraded computer-expandable 7800 derivative/add-on as mentioned above) -
Since we're playing the 'why' game: Why would the Amiga design team want to build an enhanced ST? Good point, and if they HAD been looking to build up engineering staff more (let alone enhanced hardware in general), why not look towards some of the former ATG engineers instead? (granted, doing that after the fact with the ST was far less sensible than trying to resurrect some of the existing ATG hardware -and software for that matter- in the first place) Really though, it would have made tons more sense to stick with the existing engineers (in house and outsourced) for the ST. There's tons of simple (many rather foolproof) enhancements that could have been done. (and those who engineered the original -albeit simple- ST hardware probably would have been the best suited to handle that) Not to mention why would he even consider the possible conflict of interest being knee deep in several lawsuits with Commodore at the time. Commodore had already tried to shut him down by slapping lawsuits against Shiraz and two other ex-Commodore engineers for theft of trade secrets, why on earth wouldn't they do it again if a bunch of Amiga people came over? What would be the point though? Many of the mistakes/missed opportunities for Atari Corp's computers weren't things that having better engineers on-hand would have helped at all. In fact, they probably would have been best off taking a minimalist approach to hardware enhancements in general (the boneheaded decision to not support simple expansion would hurt -they could have worked around that after the fact if they tried though). Simple things like adding V/H scroll registers to the SHIFTER (maybe line fill too, but well short of the later blitter -simple and cheap enough to allow such a revision to be pushed ASAP and applied to ALL STs being manufactured when introduced). On top of that, they could have added a basic DMA sound circuit and/or the YM2203 sound chip. (that's something that might have been better to initially introduce on higher-end models -especially for music-oriented systems- and then later applied as a baseline standard; short of DMA sound, you could just have a simple DAC with an array of registers to write to and avoid hardware scaling and hacking the PSG, but an embedded DMA sound chip would have been simple and a better investment overall) Then there's one of the simplest/most foolproof enhancements possible for the system: faster CPUs. Be it offering 10/12 MHz models or jumping straight to 16 MHz alone, there was a lot of potential for offering faster CPUs from day 1 (let alone later on). The only thing they'd have had to address was adding a wait state mechanism, but that should have been pretty straightforward.
(at worst, they could have forced halts rather than wait states when the SHIFTER -or FDD/HDD/etc- needed the bus, but proper wait states would have gotten better performance -due to the many times the 68k wouldn't even be contending for the bus and thus wouldn't need to be halted; or they could have even hardcoded wait states to fall within the bus access timing of an 8 MHz 68k -that would make it no better than an 8 MHz chip for bandwidth dependent performance, but far more significant for computationally dependent performance or other things with non-bus related overhead like interrupts -of course, better bus sharing logic could be added as time went on, maybe eventually adding a fastRAM bus to allow full parallel operation or resorting to buffering/caching instead) Doing minimal video enhancements (significant, but technically simple and not a major shift for manufacturing either) could have filled the gap until a true generational shift to counter VGA (and go well beyond the stagnating Amiga) in the late 80s. Keeping things minimal prior to that would also leave options much more open for backwards compatibility with minimal waste (no old model blitter to support -they could directly push for one optimized for packed pixels, etc, etc). They definitely should have gone the packed pixel route, probably with some sort of hardware interpreter built onto the planar bitmap logic (either chained like VGA, or maybe using a packed to planar conversion ASIC like the CD32). In hindsight, there's a really cool direct benefit to going that route as well: no blitter up to that point and shifting to packed pixel graphics would have made it ideal to implement the Lynx blitter (maybe some of its other coprocessing logic as well). The 68k is already designed to directly interface with a 6800/6502 compatible bus design, so interfacing with the Lynx blitter probably wouldn't have been a major issue. (plus, assuming fast CPUs were standard by that point, the ST would already have moved away from interleaved DMA in favor of increasingly optimized serial bus accesses in line with the Lynx concept) Though technically, since the blitter had only seen limited (often high-level) use in software prior to the STe, Atari could have chosen to totally drop the old blitter and pushed for an enhanced SHIFTER with packed pixel support and the Lynx blitter (again, maybe some of the other coprocessing logic too), perhaps with patches/updates to many of the more popular older programs that made use of the older MEGA blitter. That's another thing: it would have made tons of sense at the time to unify game console and computer hardware R&D as much as possible to get the best performance per cost (obviously some differences would be necessary, but having as many common components as possible would have facilitated higher volume unified production for many components, consolidation of both console and computer hardware as designs evolved, ease of cross-platform development, etc, etc -the higher volumes certain components were produced in, or planned to be produced in, the more you could push cost/performance advantageous chip design techniques -like standard cell or full custom rather than ULA or gate array ICs). In that respect, they could have been pushing a mix of ST/Lynx based hardware for computers, handhelds, and a new home console to succeed the 7800 at the end of the 80s.
(don't bother with the Panther at all -rather odd that they ever did push that; it seems like the 7800 was the inspiration, rather odd given the 7800's graphics architecture was one of its weak points in terms of mass market support -not in terms of fundamental capabilities, but in terms of being "easy" to work with in general -the Lynx was praised for being friendly to work with due to the hardware and Epyx's development tools). Heh, that may also have meant Flare designing the Jaguar with compatibility with a Lynx-related console in mind (no object processor, full focus on the blitter instead -probably meaning better texture mapping due to that too -and of course, a better funded Atari without the mad dash to push the Jag out in '93). A big mistake Atari Corp made with the ST was also not offering desktop form factor machines early on. That really could have helped position them for a "serious" business/computing/education market, even more so if big-box workstation class models had been introduced (more internal expansion, FPU options, maybe FASTRAM, all on top of faster CPU options -maybe even '020s for the workstations). Yep, the opposite is usually the case: think tank/small start-ups seeking out funding from major companies to allow them to do what they want in general. (technically, some companies have supported in-house think tanks with relative creative freedom -like Atari Inc's Advanced Technology division or Sega Technical Institute- and that's probably what CBM should have done with much of the MOS/Amiga people -spun them off into a semi-autonomous think-tank sort of company, within funding limits of course) And sometimes those designers DO end up getting pulled in/consulted for others' projects, like Martin Brennan on Panther; but that also technically led to Brennan and Mathieson convincing Atari to support Flare II and thus "do what they wanted" (more or less) anyway. Huh, funny that the Flare guys ended up doing a somewhat similar progression with Flare 1/Slipstream to Jaguar to Nuon. (with both the Nuon and 3DO being high-end multimedia entertainment oriented products) Too bad the 3DO wasn't nearly as tight of a design as the Lynx in terms of aggressive cutting edge cost/performance. (that might have given it a realistic price point in spite of the flawed -albeit experimental- business model used for that machine) Yeah, too bad they didn't end up working the awesome Lynx hardware into more designs. (be it the computers and/or a new home game console -investing in the Panther project when the Lynx chipset was on-hand sort of boggles the mind) Except that might have ended up for the better. A game machine is what the ST was most successful as when all was said and done, especially in its primary European market (of course, it was mainly business/graphics/music stuff at first, but as the price dropped, it became a real consumer-friendly games machine). Besides, being a good gaming chipset isn't mutually exclusive with a great/flexible business/art/music/etc machine.
(that's really up to marketing -and form factor is part of that; you could have basically the same machine with a different form factor to cater to totally different markets -like the Amiga 2000 vs 500, albeit the 2000 had more expandability out of the box -though it REALLY should have had a faster CPU and probably FPU support) Actually, Atari themselves had compromised the ST as a "serious" computer with the console form factor (rather than a proper desktop model) from day 1, especially in the US market and especially with how long it took to get the MEGA out. (even then they still lacked a big box model or workstation class machine . . . and then they went way overkill to a fault with the Transputer workstation when something on the level of the MSTE back in the late 80s would have been far more realistic for Atari's general market model -that and/or closer to an earlier TT for that matter, perhaps just a MEGA with a 16 MHz 68k or '020 with fastRAM and FPU option, with the video upgrade coming later on) And, again, Atari Corp really didn't need the likes of the Amiga team to evolve the ST; there's a ton of things they could have done differently on their own, but didn't for whatever reason. CBM made a bigger mistake by not managing the Amiga (or MOS for that matter) staff more carefully. Keeping them partnered in a positive relationship/working environment would have been very significant. Managing things to support creative freedom and facilitate efficient, backwards compatible upgrades to the Amiga design would have been very important. They also made that mistake when they lost many important MOS engineers. (with the right people, they could have actually evolved the C64 chipset efficiently -like do something along the lines of the C128 or C65 without tacked-on hardware and allow efficient consolidation of existing hardware, etc, etc -same for Atari for that matter; with the right engineers, they probably could have built the A8 hardware architecture into an impressive next-gen design -of course, the Advanced Technology division wasn't really moving in that direction)
-
5200 VS 7800: Which System has the Better Library?
kool kitty89 replied to Dr Manhattan's topic in Atari 5200
You picked the most expensive model 8-bit computer to make the comparison. But it's not 1982 anymore, a *working* (meaning controllers work) 800XL is now significantly cheaper, and they're almost the same thing and play the same games. They were also discontinuing the 800 at that point and the price was a fair bit lower. Of course, the comparable Atari 400 (more features than the 5200) was significantly cheaper than the 5200 by the end of 1982, and the 600 might have been in that same price range if released. (they opted to only release the 1200XL for whatever reason, a pretty nasty mistake given the push the VIC-20 got in '82 and the C64 undercutting the 1200XL -while the 600 could have offered great middle ground between the VIC and 64, with better software than either at the time and totally better hardware than the VIC) Also, by 1984, the 800XL was significantly cheaper than the 5200, interestingly enough. (the 5200 was still well above $100 SRP iirc, but the 800XL had been dropped to $99 in late summer iirc -not sure if that was being sold at a loss though, or what the 600XL was priced at for that matter) Of course, the 5200's design should have made it cheaper than the 400 or 600/XL in general, but they made many mistakes that compromised that. Even so, the fundamental design would have been cheaper than the 600XL if streamlined similarly. (the 5200 Jr was moving towards that, but way underkill for what they could have been pushing -if they'd consolidated the board and case for a minimalistic/compact/low-cost overall design, it probably could have been cheaper to manufacture than the 7800, especially after CGIA was implemented) How do you figure "less power"? Does the 5200's 6502 processor running at 1.79 MHz have a bunch of extra processing power compared to the 7800's 6502 processor running at 1.79 MHz? Or any other machine running the 6502? Yes, actually. MARIA is far more bus hungry than ANTIC/GTIA and, in certain cases, can even saturate the bus completely in active display (non vblank). Of course, you also need a bit more CPU grunt to attempt some things in general on the A8 (but now we're talking hardware graphics capabilities, not CPU power). The way MARIA works also makes interrupt routines basically useless, so more programmer-heavy software-timed routines would need to replace those (or certain things would be left out entirely). If interrupts had been as useful as on the A8 (and RIOT IRQ had been enabled in 7800 mode), you could have had PCM/PWM modulation in-game like many A8 games pushed (especially homebrew stuff -some C64 games pushed that too). The lack of IRQ support also renders much of POKEY's flexibility useless (as most of that is interrupt dependent). Usually, people aren't talking about CPU resources, but sound/graphics hardware capabilities; "power" is rather ambiguous as such. Though, in that sense, the 7800 definitely has an edge over the A8: MARIA can do a lot of things that ANTIC+GTIA can't, though the conventional bitmap/character modes of the A8 arguably make it more programmer friendly for the mass market of the time than the 7800 -I think scrolling also might be more "normal" to manage on the A8. (except MARIA could push a framebuffer too if it had enough RAM -I think the Epyx games do that) The A8 sprites are still the primitive VCS type ones though, also a major clash compared to the "normal" hardware sprites supported on mid/late 80s consoles and computers.
(ie x/y position registers and -in some cases- hardware multiplexing -I think the TMS9918 may have been the first video chip to implement that type of sprites) That's something significant to note: "difficult to program" often means "not like common standards" regardless of being difficult or easy in its own right. (of course, in some cases you certainly have more programming hassle in general . . . and if you get more market support in general for other reasons, you can end up forcing programmers to "get used to it" as was obviously the case with the VCS's hardware -programmers coming straight off the VCS probably wouldn't have found the 7800 "hard" at all, probably a joy to work with compared to the VCS ) You could also argue that the 2600 hardware is a general detriment to the system in performance and cost effectiveness. (ie take MARIA and the CPU and mate it with a POKEY for the primary I/O and sound -or an AY8910- and you'd have better sound and lower cost in general -investing in using DRAM instead of SRAM also could have been significant; more board space initially -due to DRAM refresh logic- but removing the VCS stuff would have freed that up some -and using DRAM would be cheaper in the long run by a good margin, maybe even right out of the gate) Investing in a 2 bus design with the CPU and MARIA able to run in parallel could have been worthwhile too (maybe with contention for ROM if MARIA was allowed to jump onto the main bus rather than the CPU feeding dedicated VRAM alone -or you could go the NES route with more expensive dual ROM, dual bus carts) There's a number of 32/48/64k games that were modified to work directly with the 5200 (like Rescue on Fractalus) by moving more into ROM. (some cart ports were "lazy" and directly converted the disk versions to ROM that needed to load into RAM rather than modifying the game to fully utilize ROM iirc -of course, that could technically allow games to be compressed in ROM too, but I'm not sure that was ever done) -
I think now I understand why the NES beat the 7800
kool kitty89 replied to Atari Joe's topic in Atari 7800
Nintendo may not have taken a huge risk in the US. Though some may argue that spending $50 million on the NES was huge. One thing no one can fault Nintendo for was their tenacity. They looked for a US distributor at first. Brought the NES to show after show and refined its market positioning and finally pulled it off. All the video game companies knew that Nintendo was coming to the US and what they had. If other game companies had shown even some of the tenacity that Nintendo did... Of course, Nintendo's success in Japan fueled all of that; they'd have been nothing without it. However, I do wonder how they managed to enforce their licensing contracts in Japan with absolutely zero hardware lockout (all it would have taken was a leak of the hardware documentation to go unlicensed), but in the west, they obviously had more to back that up. (maybe it's something to do with Japanese business/culture; I highly doubt Atari could have pulled off anything close to that in the US with the VCS, even if doing things as well as or better than Nintendo). They also largely maintained their success by illegal/anticompetitive tactics . . . of course, they got away with that too until competition really pushed hard enough to break them in the early 90s (NEC and then Sony in Japan, Sega in the west). Granted, if Atari had managed to push into Japan (via Namco, or another prominent company they could license to), Nintendo would have been forced to really compete on the market rather than cheat like they did. (and Sega probably would have done far better with the SG-1000, especially as a lower-end alternative to the FC that was still better than the 2600) -
There's lots of different routes they could have taken with the 5200, from something closer to the 3200 (hybrid A8/VCS chipset with VCS compatibility out of the box), to a much more cost optimized 5200 design in general (maybe with better provisions for a 2600 adapter at minimal cost and maximum convenience), to sticking with just the VCS and A8 formats in general and basically making the 5200 a directly consolized 400/600 (perhaps even using the prototype 600 as the basis for the system). Given that only a few keys were used by the majority of games, a minimalistic membrane/chiclet keypad could have been provided (either built-in or plugged in) with expansion support for a proper keyboard like the XEGS. (the "out of the box" keypad would address the foibles of the C64 GS and the XEGS sans keyboard -a lot of games used a few of the keys, but most were limited to a few common keys overall -of course, they could also have opted for the full 400-style membrane keyboard with an option for a proper keyboard add-on) The 2600 was definitely going to remain the mainstay mass-market console for a while longer in any case, and pushing computers was a safer bet than pushing a separate format (even if close to the computer hardware), especially since they still had no lockout to speak of. As it was, many of the 5200's problems could have been addressed after the fact (especially convenience/cost effectiveness/reliability issues), and that could arguably have made more sense than dropping it like they did. (much more so in hindsight with the delays over the 7800) Hell, they could have canceled a new dedicated "game" system altogether and pushed the 600 out in '82 positioned as a low-end computer and game system. Liar, never stated it was just one source. I've continuously stated that's what internal emails, documentation, and direct interviews have stated. That "one source" is simply the one source that could be used on Wikipedia until all the rest is published. Honestly, most (especially successful) video game consoles do exactly that with their successors: the successors are initially positioned in the high-end market segment while the older machine dominates the mass market (often for over a year before the new system really takes hold -longer if the new system has problems on the market). Better examples also specifically position the new console on the market to avoid conflict with the mainstay system until the latter has fully dropped in popularity on the mass market and moved into the budget-market niche. (some others end up waiting until the old system is already close to dropping out before releasing a new system at all; there are lots of trade-offs and risks involved in the various options, and Atari had no historical basis to draw on either, with the market as young as it was -albeit many ended up making similar mistakes in spite of the examples in hindsight) Interesting, I'd gotten the impression that it was development delays that led to the original 3200's cancellation rather than political/bureaucratic issues. In that sense, it's a shame that they didn't simply keep pushing directly compatible derivatives of the computers in general. (the 5200 route did have potential cost advantages, but all those were so badly missed that a reasonably clean 600 derived computer/console probably would have been a fair bit cheaper overall) Which of course was never a danger. It simply did not happen regardless, and the 2600 still remained the top selling console.
The 2600 hardware ended up breaking ahead of the Intellivision later on anyway (with programmers pushing its advantages more). Coleco had been a more significant threat though, and if it hadn't been for the crash, who knows what might have happened? (they were rapidly gaining market share in '82/83, though if it hadn't been for the existing market instability, that competition actually could have been very healthy for the overall market -Atari's monopoly had a huge amount to do with the instability in general, and of course the home computer wars pushed the '82/83 slump into a full-on crash when the market otherwise may have recovered without such an incident) Hmm, so the 5200 had never been considered to become an eventual successor to the 2600? Ie, initially be a high-end companion, and transition into the mainstream as prices dropped and if/when the 2600's popularity wavered? I wonder what Atari Inc management thought about the 7800 overall (given Warner was going to force it either way), compared to the various options with the 5200 and/or computers they had to work with. The 5200 had tons of potential on the mass market compared to the 7800; integral backwards compatibility wasn't that big of an issue in the long run anyway (it was mainly the position of the market at the time and Coleco's adapter that confused things so much). It would have helped obviously, but it wasn't necessary (and has never been a make or break issue for any game console -a bonus or good gimmick, but never the deciding factor, sometimes more trouble than it was worth too -a detriment to the hardware). That's especially significant after the fact of releasing the 5200 . . . it was out there and had considerable support; all of the major long-term problems could have been solved (cost, bulk, reliability) if Atari had invested in that. Hell, it could have ended up CHEAPER to manufacture than the 7800 by 1984 if they'd invested in a consolidated motherboard and generally low-cost design. (short of spring loaded joysticks, they also could have simply switched to using a "digital" joystick with pull-up resistors and made analog an optional accessory for the handful of games that needed it -and more that took advantage of it, but could work OK with just pull-up resistors- such joysticks would have been cheaper AND more reliable -the other issue was the flex circuitry/carbon dome switch problems) Plus, the component commonality with the A8 would favor manufacturing even more. (stockpiled chips could be used in either, consolidation of one could generally be applied to the other, etc -ie GTIA, embedded DRAM logic, other merged custom chips, etc, etc -albeit you had some 5200 specific options like cutting out the unused POKEY pins/features) The 7800 had a better reception in 1984 than the 5200 had gotten, but the context would be how a corrected 1984 (or even '83 if they'd recognized the shortcomings earlier) 5200 would have been received on the market, and that's totally up to speculation as the 5200 was never "corrected" as such. Yes, and they either failed to consider the advantages of keeping the 5200, or simply couldn't/wouldn't implement those options.
(who knows what might have happened if GCC+Warner had never pushed the 7800 at all -the 5200 Jr was a step in the right direction, but still a very modest improvement compared to what they could have been pushing in 1984 -more like what it should have been in the first place in 1982) Hmm, I thought the XEGS was planned as an entry level computer that was promoted for gaming capability. Pushing the XE chipset as a "high end" game console at the time made rather little sense . . . and the high price tag (relative to the $99 65XE) made even less sense for the time. It would have made FAR more sense to just use the normal 65XE in a gaming bundle and avoid the overhead of another case/board design. (actually it could have made a lot of sense not to change from the XL board/case design either -maybe cut costs a little, but it would seem like keeping the existing boards/cases would have had cost advantages compared to retooling -let alone avoiding market confusion with a very different looking design -as it is, the 800XL motherboard is pretty close to the same size as the XE's board -they could have switched to a cheaper keyboard while keeping the rest the same) If they really did want to push a "high end" game console in '87, they should have pushed a cut-down ST based console instead. (even that didn't make too much sense though, given the overall market at the time) Or they could have beefed up the 7800 directly (sort of like Curt's XM, but more within the limits of '87) and offered it as both an add-on and a total system. (like added RAM, POKEY, and SIO+keyboard ports) Hell, they could also have used that to totally eliminate 7800 games with onboard POKEY or RAM in favor of exclusive "Super 7800" games using the add-on/upgraded system. (they could even have used the same 32kx8-bit SRAM chips used in the Epyx games in '87/88 anyway, though given it was a separate module, they could have opted for cheaper DRAM and added interface+refresh logic -that might have even made 64k cost effective, albeit you'd need bank switching to go beyond 48k) OTOH, something like the XEGS would have been FAR more useful in '84/85 if Atari Corp had decided to ditch the 7800 (and the associated red tape). Though, for the time, it would have made more sense to push a 16k machine (ie 600XL derived) for lower cost in general. (they could have even started by repackaging remaining 600XL stock before moving on to a modified form factor -perhaps even directly derived from the 600XL's case design- with a minimalistic membrane keypad -wired via a 15-pin plug like the XEGS or embedded into the top of the system- with separate expansion for a full keyboard a la XEGS -and retaining the PBI for RAM expansion) It's odd that Atari Corp ended up repeating some of the same mistakes Atari Inc had made (lack of expansion on the computers, oddly timed conflicting/overlapping products, etc, etc). The expansion port was for laserdisc; removing it was no big deal. The cart slot was a far better general purpose expansion port for RAM+sound+I/O+coprocessing, etc. (hence why the 7800 XM is possible) I thought Katz was handling all the marketing for the entertainment side of things.
-
Yes, I don't understand why they didn't include DMA (which even the Atari STE supported), or at the very least some kind of FIFO queue. The audio hardware is certainly disappointing from this point of view. Yea, but at least software PCM management wouldn't take up that much DSP time, at least if it was done with some tight looping code (not interrupts, unless the J-RISCs can manage really fast ints -like 650x or 680x). The PWM DACs are in the Jaguar chipset, but they're not actually used on the Jaguar console (there's nothing connected to their outputs). Sound is generated by a separate 16-bit, stereo D/A audio converter, which gets its data from Jerry through an I2S serial interface. (I don't know why they did that; maybe the PWM DACs weren't working correctly.) That's really weird, though it would explain why there was no DMA support. (I'd gotten the impression the PWM DACs did have DMA, but maybe those don't either) Then there's also the way they sort of forced digital audio output in the system when it really wasn't needed, which complicated use of Red Book audio to an unnecessary degree. (if it was plain analog mixing to the Jaguar DAC output, you could have had Red Book sound with zero hit to the Jaguar bus/DSP, but as it is, they had to stream digital data to the DSP, so it wasn't nearly as useful -imagine if the Mega CD had been forced to do that) Digital audio would have been a good feature for provisions to be used in other applications (workstations, etc), but the console probably could have dropped that entirely. (in terms of actually being connected on the PCB) And you have the 4k wave ROM for other synth techniques (or to use for FM operations -rather than just using sine), but that took more skill and tool support than most developers were working with. (of course, you could also do simple "chip" synth using the wave ROM samples, and a few games seem to use that -though they could be MOD too, just like some Amiga "chip" synth sounds used AHX, but others used plain MOD) Which games used FM BTW? (Supercross 3D sort of sounds like it, but I don't remember anything else doing that) Also, a decent sample sound engine wouldn't need to take up a lot of main bus time either, even with the 8.8 MB/s bandwidth limit. You'd just have to limit the bitrate of the samples used (compression would be doubly important there to allow less ROM/RAM space to be used in general). Focusing on 2-bit ADPCM (maybe 1-bit CVSD for some things) could have been a pretty good option for that while allowing decent sample rates in general. (you'd also want to use the scratchpad as the mixing buffer for those samples -to avoid more hits to main RAM and also avoid the 16-bit write bugs JERRY suffers from) That way, if you used 22 kHz 2-bit samples (5.5 kB/s), you could manage a 32 channel sample engine with only about 2% of the main bus time (or 3% if using ROM directly) -the arithmetic is sketched below. Of course, you could use 11 kHz 4-bit ADPCM for the same bandwidth (if interpolated, it might sound better than 22 kHz 2-bit), or just plain 8-bit PCM at limited sample rates using interpolation/filtering to reduce aliasing. (or even 4-bit linear PCM -there are quite a few cases where well optimized 4-bit PCM will sound better than 8-bit at the same bitrate -ie 1/2 the sample rate) You could also push a mix of different sample formats if you were willing to deal with the added complexity (more so if you opted to push realtime synth on top of sample based stuff).
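To put rough numbers on that 2% figure, here's a minimal C sketch of the arithmetic -the 8.8 MB/s bus figure and the 22 kHz / 2-bit format are the assumptions from the paragraph above, not measured Jaguar numbers:

#include <stdio.h>

int main(void)
{
    const double bus_bytes_per_sec = 8.8e6;   /* assumed main bus bandwidth */
    const double rate_hz  = 22000.0;          /* per-channel sample rate */
    const double bits     = 2.0;              /* 2-bit ADPCM */
    const int    channels = 32;

    double ch_bytes = rate_hz * bits / 8.0;              /* 5500 B/s per channel */
    double total    = ch_bytes * channels;               /* 176000 B/s for 32 */
    double share    = 100.0 * total / bus_bytes_per_sec; /* ~2.0% of the bus */

    printf("per channel: %.0f B/s, total: %.0f B/s, bus share: %.1f%%\n",
           ch_bytes, total, share);
    return 0;
}

Plug in 11 kHz 4-bit ADPCM or low-rate 8-bit PCM and you land on the same totals, which is the whole point: the sample format just trades quality around a fixed bandwidth budget.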
Of course, you could also do a plain 8-16 channel sample system for significantly less bandwidth, but retain the use of compression/interpolation (possibly reverb effects) to go well beyond simple MOD stuff. (many common sample trackers were still only pushing up to 8 channels at the time, so one reasonable option for average developers would be to use up to 8 channels for music and a few more dedicated to SFX -implementing compression rather than plain 8-bit PCM shouldn't have been a huge step either, and interpolation/digital filtering support probably wouldn't have been either -of course, if Atari had supported tools for that in an SDK, or at least example code, that would have made things easier) There's no hard limit at all, it's just plain stereo DACs for the output (apparently even simpler than the STe in some respects -no DMA), but with a powerful coprocessor driving audio and giving tons of options and flexibility for sound. That could include surround sound, interpolation, decompression, reverb (echo) effects, etc, etc. It's more like the GBA, N64 or 32x in that respect: no fixed audio hardware other than simple DAC output (the 32x has both FIFO and DMA support though, and I think the GBA is the same -definitely DMA, not positive on the N64), with sound driven by the CPUs and/or the RSP in the N64's case. Most games used very basic Amiga-like MOD player music/sound (some doing 8 channels at least . . .). There was little use of the DSP for realtime synth (additive, FM, subtractive, etc) or for complex sample based sound in the 32 channel range (let alone interpolation, reverb, etc -I think some MOD stuff used compression, at least). Another nice thing about that flexibility is no hard limit on compression: you could use one of the "conventional" 4-bit ADPCM schemes, 2-bit ADPCM, 1-bit CVSD, or a custom format (possibly derived from one of those -like 2-bit CVSD); a minimal CVSD decoder is sketched below. Ironically, the Sega Saturn's audio DSP wasn't useful for decompression or a couple other features that would have been really useful (it was pretty much never used, with the plain 32 DMA sound channels being used directly); the 68k could do it in software, but not fast enough to drive more than a few of those channels at once. (I think the DSP intended for 3D math coprocessing might actually have been more useful for some of those things than the dedicated audio DSP -the SH2s definitely were) I asked about this before, and apparently, that was only an early plan (to use OTIS); the option was kept open, but all the early development documentation released was apparently aimed at FM synthesis. (so Atari was probably planning on using a common Yamaha synth chip -lots of options for that, no idea which chip they were aiming at. Not sure if there was any DMA sound support either) There's also mention of the dev units using the Ensoniq DOCII sound chip (presumably a direct derivative of the DOC used in the Mirage and Apple IIGS), so maybe that was what was intended for FM synthesis. (I think the way the oscillators are configured, it can be used for FM in addition to wavetable -real wavetable- and sample based synth, but I'm not positive -that would be rather wasteful compared to using a dedicated Yamaha FM chip though; STe style DMA sound driven by the 68k probably would have been fine for the time, or at least if the 68k+sound+I/O stuff was moved to a separate bus from the video stuff -apparently they already had provisions for dedicated audio RAM, so putting the 68k+DMA sound on that bus would have been interesting -one of the main problems with the Panther is heavy bus contention and limited RAM due to the high cost of the fast 32-bit SRAM being used -the Panther processor needed it, but the 68k/sound would have been fine with commodity DRAM -the Panther still wasn't a very well suited design for the market at the time though, especially odd when Atari had the much better Lynx chipset to work with)
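Since 1-bit CVSD keeps coming up as a compression option, here's a minimal decoder sketch of the general technique in C -the 3-bit run detection, step sizes, and decay rate are arbitrary illustrations, not a format any shipped game actually used:

#include <stdint.h>

/* Decode one CVSD bit: 1 = slope up, 0 = slope down. The step size
   grows after 3 identical bits in a row (syllabic companding) and
   slowly decays otherwise. No output clipping, for brevity. */
int16_t cvsd_decode_bit(int bit, int16_t *acc, int16_t *step, uint8_t *hist)
{
    *hist = (uint8_t)(((*hist << 1) | (bit & 1)) & 0x07);
    if (*hist == 0x07 || *hist == 0x00)
        *step += 16;               /* run of equal bits: signal is steep */
    else if (*step > 16)
        *step -= 1;                /* otherwise let the step size decay */
    *acc += bit ? *step : (int16_t)-(*step);
    return *acc;                   /* integrated output sample */
}

The appeal for a sound engine is obvious: 1 bit per sample means a 22 kHz voice costs only ~2.75 kB/s of bus traffic, at the price of one decode step per sample on the DSP.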
The SNES sound system was also sorely underutilized due to the needs of the market at the time (the sound system being WAY overkill) and the tools/interface provided for programmers to use. By default, the SNES uses 8 channels mixed to 32 kHz output with interpolation (forced, unfortunately) and a 4x echo buffer for reverb. That's technically 32 discrete sound channels the DSP is mixing, with most being slaved for reverb effects. Thus, if a better tool set had been implemented, developers could have had the freedom to disable reverb and have 4 channels (with no reverb) in place of that 1. (there are homebrew demos that do just that -write directly to the echo buffer to allow up to 32 channels) There's also the interpolation and BRR sample format; I'm not sure if those are hardcoded or not, but being able to change those would be great as well. (ie if the DSP/SPC700 could be programmed to disable interpolation or use another compression format -or uncompressed PCM, especially since optimized uncompressed 4-bit PCM actually sounds better than BRR/ADPCM at low sample rates, and is a slightly lower bitrate than BRR, though the same as true 4-bit ADPCM) As it was, any of that would really have been overkill anyway, and the much cheaper 8 channel Ricoh PCM chip (FM Towns, arcade, Sega CD, etc) would have been about as good for most things (if not better in some cases). Especially ironic since Ricoh was Nintendo's prime chip vendor already. Not really, there's tons of cases where you'd want many more channels in spite of limited RAM/ROM to use. You do have a LOT more RAM to work with in general though: the 8k scratchpad is just for fast code and data (it could also be used to mix samples before outputting to the DACs) while all the samples are stored in main RAM and/or cart ROM (be it uncompressed PCM, or any number of compression formats). So you have a LOT more to work with than the SNES in that respect; albeit, the SNES can have samples updated on the fly from ROM/main RAM. (if it had been allowed to read directly from ROM, it would have been FAR more flexible -and cheaper too, since you could cut out the 64k SRAM/PSRAM -maybe just 8k or such for SPC work RAM, unless it's already got work RAM on-chip) You don't use the 8k of SRAM in the Genesis for samples either (aside from a few very limited cases of small samples crammed into Z80 RAM), but instead have the Z80 pull samples from ROM directly (which it accesses in 32k banks, unfortunately with a very cumbersome serial shift register making a lot of overhead to switch banks -especially problematic for streaming multiple samples, as you'd be switching banks very often -buffering chunks of the samples into SRAM would cut out a fair amount of that overhead though, and also address bus contention issues for ROM -the 68k isn't a problem, but the VDP asserting DMA is). The bank switch sequence is sketched below.
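For reference, that bank switch overhead looks roughly like this from the Z80 side -the register address and bit order are the commonly documented Mega Drive scheme, written here as C-style pseudocode for a memory-mapped register rather than actual Z80 asm:

#include <stdint.h>

/* The Z80 sees a 32 KB window at 0x8000-0xFFFF into the 68k address
   space; which 32 KB bank it maps is selected by writing the upper 9
   address bits (A15-A23), one bit per write, to the register at 0x6000. */
#define Z80_BANK_REG (*(volatile uint8_t *)0x6000)

void z80_set_bank(uint32_t addr_68k)
{
    for (int i = 15; i <= 23; i++)       /* 9 writes per bank switch */
        Z80_BANK_REG = (uint8_t)((addr_68k >> i) & 1u);
}

Nine writes every time a stream crosses a bank boundary (more if you're juggling several streams living in different banks) is exactly the overhead that makes buffering chunks into Z80 RAM attractive.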
Anyway, the main reason no developers pushed massive sample based sound engines (or few to none pushing realtime synth for that matter) was simply the developers Atari was working with, and the budgets as well. Amiga MOD was a very common and easy to work with format (that's also why the SNES was often used as a glorified 8 channel MOD player with compression and filtering/interpolation -the latter being a double edged sword). If you'd had really heavy hitting developers pushing the system's audio, you'd probably have stuff that beat the Saturn's realtime sample/synth stuff (or the PSX's for that matter). That's not the same problem the 32x had . . . Sega likely would have pushed the PWM sound a lot further (even in the limited time it was supported) had it not been for one major issue: lack of documentation for the DMA sound feature. Without that, you're limited to using the FIFO buffers or just doing plain interrupt/software timed playback with the CPUs. The main reason for that is that the pre-release dev systems were buggy and couldn't use the DMA feature, but all commercial consoles worked (they just failed to update the tools in late 1994, or even '95). With the slave SH2 dedicated to audio and using DMA, the 32x could reasonably manage 32-64 channels mixed at 22 kHz (more variables depending on whether you're pushing compression, other effects, or if you bumped it to 44 kHz). That's actually a pretty good use for the slave CPU given the bus contention issues (sound processing being very computationally intensive, but bandwidth light), though things like 3D calculations would also fall into that category in some cases. The 32x's PWM is also configured in such a way that you can increase the max sample rate by dropping the resolution: the clock is 23.01 MHz and you could use any combination of sample rate/resolution based on that. The common one to use is ~22 kHz, which gives 10-bit resolution output (23.01 MHz / 1024 ≈ 22.47 kHz), but you could opt for ~44 kHz audio at 9-bit output (actually 44.94 kHz), or 89.88 kHz with 8-bit output. You wouldn't want to push 11-bit at 11 kHz though, as anything below ~14 kHz has PWM squeal problems; anything below 14 kHz should be scaled and output to PWM at a higher rate. (the full rate/resolution trade-off is sketched below)
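Here's that trade-off worked out in a few lines of C -the 23.01 MHz clock and the ~14 kHz squeal threshold are the figures quoted above, so treat the exact numbers as this post's assumptions:

#include <stdio.h>

int main(void)
{
    const double clock_hz = 23.01e6;  /* PWM clock from the post */
    for (int bits = 8; bits <= 11; bits++) {
        /* one PWM period is 2^bits clock ticks, so the carrier
           (max sample) rate is clock / 2^bits */
        double rate = clock_hz / (double)(1 << bits);
        printf("%2d-bit resolution -> %8.2f Hz carrier%s\n", bits, rate,
               rate < 14000.0 ? "  (below ~14 kHz: audible squeal)" : "");
    }
    return 0;
}

That prints the 89.88 kHz / 8-bit, 44.94 kHz / 9-bit, and 22.47 kHz / 10-bit combinations from the paragraph above, and shows why 11-bit (about 11.24 kHz) falls into squeal territory.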
The squeal is also avoided, I think, if you use interleave/multiplex type mixing -outputting one sample after the other at a sample rate several times higher than the individual "channels", rather than adding them- and that could be especially useful if you wanted to use uncompressed 8-bit PCM with minimal overhead: just use 89.88 kHz playback with 6 ~15 kHz channels interleaved, or 4 22 kHz channels, etc. (that's channels per L/R stereo side, so you could use that as a single bank of stereo channels, or 2 banks of hardwired left/right channels like the Amiga) That sort of mixing is also especially attractive on the Atari STe, given the limit of 8-bit resolution output but a pretty high 50 kHz max sample rate -so you could interleave 2 25 kHz channels per side, or 3 16.67 kHz channels, 4 12.5 kHz, etc, etc -no resolution loss from adding, and no overhead dealing with overflow of the limited output resolution. (a minimal sketch of the interleave idea is at the end of this post) Most/all sound engines on the Jag use main memory. (Flare had intended use of realtime synth -via FM and/or other techniques using the 4k wave ROM on-chip- to allow it to stay off the main bus almost 100% of the time, but I don't think anyone took that route at all -sort of the same thing with the Slipstream/MS: they had intended a lot of realtime synth, and FM was demonstrated in the dev tools provided by Attention to Detail, but then you hear complaints about running out of RAM with sound samples -even one mention of "FM synthesis" taking up too much memory -I'm almost positive that was referring to sampled FM instruments and not realtime FM) It should be very capable of sample based synth well beyond MOD, but that was far and away from the common tools being used to develop for the system in most cases. The most common >8 channel sample format was General MIDI's 24 channels, so a MIDI driver on the DSP would have been the most realistic way to get support (not sure if Jag Doom does that or not). I seem to recall that Atari had actually supported a General MIDI driver for the Jag, but I'm not sure on the specifics. Realtime synth is generally much MORE intensive than simple sample based stuff. (also, very small/short looping samples use a LOT more resource than long samples -the Amiga would have a hell of a time trying to do what the PC Engine does with the 32 word long looping samples it pushes in hardware, but long samples don't use much CPU time at all by comparison) You can pretty easily manage a software MOD player on an 8 MHz 68000, but don't even think about trying to push realtime FM synth.
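And the interleave/multiplex idea above as a minimal C sketch -the function and buffer names are made up for illustration; the point is that samples are never summed, so an 8-bit output can never overflow:

#include <stdint.h>

/* Write one sample from each channel in turn; playback rate must be set
   to nch times the per-channel rate, and the output filter (or the ear)
   does the averaging that summing would otherwise have done. */
void interleave_mix(const int8_t *ch[], int nch,
                    int8_t *out, int frames_per_channel)
{
    for (int f = 0; f < frames_per_channel; f++)
        for (int c = 0; c < nch; c++)
            *out++ = ch[c][f];   /* out holds nch * frames samples */
}

eg on the STe you'd run the DMA at 50 kHz and pass 2 channels per stereo side for 25 kHz voices, trading polyphony against per-channel sample rate instead of against resolution.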
-
Of course, that's assuming they couldn't have managed to pull through with the Lynx and computers alone being pushed in 1993. (maybe a Lynx III in 1993 -reflective color LCDs might have allowed a massive jump in battery life on top of lower cost and bulk; even if it only ended up with roughly 6-bit RGB quality output, that probably would have been well worth it -alongside the backlit models, of course, so as to not lose that market sector) The computers were pretty much dead in the US, but they still had a glimmer of hope in 1992 in Europe, especially with Commodore falling apart and PCs still being sluggish in penetrating the market. (the Falcon '040 had some real potential for the mid-range market on top of the low-end '030 -maybe they could have had a middle-ground model with an '030 on a full 32-bit bus and optional fastRAM -like the TT- and optional FPU -I kind of wonder why the original Falcon wasn't using a 16 MHz 68EC020, given the significantly lower cost and relatively close performance -or why they hadn't offered lower cost derivatives of the TT prior to the Falcon) Atari had probably missed their chance to make a place for themselves in the PC clone market in the US, but maybe that could still have been profitable at the time. (haven't seen much on their PC efforts) The real issue wasn't revenue/funding alone, but CREDIT/investment capital support for Atari. The weak revenue meant weaker credit, of course (short of the Tramiels being their own creditors and loaning private funds to Atari -that alone could have boosted investor confidence though). And that's what the Jaguar really helped with: not profitability, but hype driving lines of credit to facilitate deficit spending to support them in the short run. (given the financial reports, the Jaguar was never profitable; 1994 came close, but not quite -software R&D, advertising, manufacturing, etc costs continually exceeded net revenue from Jag sales, so the best thing the Jag did for Atari was hype them up enough to extend larger lines of credit) Good original titles are great if you can get them, but having lots of decent (or even average/mediocre) multiplatform titles (even if mostly "shovelware") could have been far more important. As it was, they didn't end up with that many "good" original titles at all, and many of the "original" titles were inspired by common mass market games that would have drawn far more interest than what the Jag got. (ie the hot multiplatform titles on every other system but many that the Jag didn't get, or PC ports that other consoles weren't actually getting, etc) Having original or exclusive titles may seem important on the surface, but if you don't have "shovelware" to cater to the masses, you're pretty much screwed. Yet a system with few to no compelling exclusives, but lots of good to decent versions of popular multiplatform games (maybe a few that are significantly better than on other systems, or at least cheaper), could be reasonably successful on the mass market. Of course, that's sort of a chicken and egg thing: to get really strong mass market support, you need general popularity and influence to get strong 3rd party support in general; out of pocket licensing of 3rd party games can only go so far, but probably would have been a better option than many of Atari's investments in unique games.
(and if they got lucky, maybe they'd end up with a few really good exclusives too; aside from that, the best "exclusives" would be computer ports that no one else was pushing -there were tons of Lucas Arts, Sierra, Epic Megagames/Apogee, etc, etc titles that were only on PC -in Epic's case, you also had a relatively small developer pushing low-cost, yet fairly compelling games -Jazz Jackrabbit and Blake Stone on the Jaguar could have been pretty cool, let alone Duke Nukem 3D -which went multiplatform, of course) Of course, not having ANY 4th generation game console (let alone a competitive one) was a huge hurdle to overcome for Atari, not just for market position (with consumers, media, and developers), but for Atari's own financial situation. (with even a mediocre success with a 4th gen home console -something on the level of the 7800's sales, more like what the Panther might have ended up as- a lot of heat would have been taken off the Jaguar's release, and Atari's problems of brand recognition on the market would have been greatly reduced, let alone if they actually managed to pull off competitive hardware/marketing/3rd party developer negotiations/etc against Sega and Nintendo in the US and/or Europe) Heh, maybe even Katz would have stayed if he knew something really big was on the horizon (or come back from his vacation to re-join Atari rather than Sega), or maybe gone back to Atari Corp after 1990, when he was replaced by Tom Kalinske at Sega. (of course, Katz had also favored the offer for Atari Corp to distribute the MegaDrive in the US -offered back in 1988, before the MD even launched in Japan- but Dave Rosen and Jack Tramiel couldn't agree on the terms of the partnership -mainly contention over what would happen with Europe- so that fell apart; Atari was in a much better position than Sega in the US at the time, with the 7800 outselling the SMS by a good margin, and the MD was totally untested in Japan -plus, if it hadn't been for the radical shift in management at SoA with Katz and Kalinske, the MD could have ended up another SMS-like snafu in the US regardless of the quality/quantity of 1st/2nd party Japanese software -and some of that JP software was facilitated by SoA management too, and obviously the relationships built with western developers -EA likely would have been sued over unlicensed publishing had it not been for Katz deftly managing a favorable licensing agreement that led to a strong relationship between the two companies and made EA a key element of the Genesis's US success) Well, it managed to build up enough hype to partially mitigate some horrible management mistakes made from '89-92 by Atari, and only in the short run (if it hadn't been for the Sega lawsuit winnings, the Jag would have done nothing but dragged them deeper into debt, given the financial reports from '93-95 that got posted a few months ago). In that sense, the Lynx (or even the computers) could have been more profitable in '93-96 than the Jaguar, though if they'd lost the Sega lawsuit due to 1993/94 cashflow problems (and unwillingness to loan private funds instead of using 3rd party investors), it could have ended up worse overall for them.
(though had they opted to make the minimum contribution of private funds, it actually could have been a better investment overall -and no interest being accrued by 3rd party creditors)

Yet Sega ended up with rather sloppy/inefficient (at least cost/performance wise) hardware and had massive internal management issues following 1994 (some in 1994, but mainly starting with the Japanese upper management oddly forcing SoA to push the launch date up to spring of 1995 rather than the planned Fall/Summer date -wrong for so many reasons, and totally exacerbating many of Sega's existing problems). You could argue Atari's management problems were largely tied to funding (except they started slipping almost as soon as Jack -and Mike Katz for that matter- left the company, so there's obviously more to it than that), but that definitely wasn't the case with Sega prior to late 1996.

From a hardware standpoint, certainly, it was probably the best chipset (in terms of overall cost to performance in a given configuration and manufacturer) of anything on the market in 1993 or even '94 (if you restricted the PSX chipset to a single bus and similar component/manufacturing costs to the Jaguar, it might have even been weaker overall -stripping away Sony's vertical integration and volume production, of course- and the Saturn obviously is many times less cost effective -hell, if TOM+JERRY was configured closer to the Saturn, it probably would have had significantly better performance while still being cheaper to manufacture than Sega's hardware). That's both a testament to the Flare engineers, and to the general emphasis on such a tight, high performance, low-cost optimized design. (if Sega's management had set their far more substantial R&D resources to work on a project of a similarly aggressive, high performance/low cost design, they might have ended up with something even better -probably for a lot more overhead, but also probably cleaner/less buggy and/or in less time- but that obviously wasn't the case -in many respects the Saturn is a really weird design for the aggressive home game console market)

A shame it wasn't pushed as such from the start. There could have been an emphasis on non-polygonal 2D + pseudo-3D, with minimal use of polygons when absolutely necessary, to make the best of things (with a gradual evolution of more and more polished games following that formula). You had plenty of options for scaled 2D, height maps (voxels or Doom/Wolf3D type ray-casting), flat/gouraud shaded/texture mapped polygons, and beyond (interpolated height maps are awesome). Even focusing more on gouraud shaded optimized models could have been a big help. (you also have numerous games where they probably should have dropped to lower detail/resolution/screen size to allow a more reasonable framerate -even better as an option rather than the default only- but even if you forced AvP to have the same resolution as Doom, it probably would have generally been preferable; you could even have extreme cases like a fully texture mapped polygon renderer in a 160x96 window using 2x2 scaled pixels -maybe interpolated, see the sketch below- which could have been quite useful -you also had some odd cases on the 3DO, like Doom, where high detail was forced in spite of low framerate -and where dropping the screen size was not an attractive option, dropping to 1/2 horizontal or even both horizontal and vertical resolution could have made the game far more playable) Aside from exclusive games, you already had cases like Comanche to start off with. 
(Cybermorph with voxel landscapes would have been really neat) So a decent starting point to push voxels as well as more options for scaled 2D. Scaled/rotated stuff, warped perspective (Mode 7-like) stuff, Wolf 3D, Blake Stone, etc. (a port of Wing Commander 1 and 2 would have been really nice, let alone some scaled arcade games -or clones of those, short of actual licenses, or derivatives of scaling-heavy Lynx games -then there's also the Atari Games polygon and scaling based games, and they were in a fairly decent working relationship with TWI at the time) A port of X-Wing would have been awesome for the time, among various other PC games the Jag was reasonably to exceptionally well suited for.
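To illustrate that low-res idea: a 2x2-doubled 160x96 window only needs 1/4 the fill rate of 320x192, and "interpolated" doubling just blends neighboring source pixels to soften the blockiness. Here's a minimal sketch in C -my own illustration, not code from any actual Jaguar title, and using plain 8-bit intensity values rather than the Jag's CRY/RGB16 formats:

#include <stdint.h>

#define SRC_W 160
#define SRC_H 96

/* Double a 160x96 buffer to 320x192. Nearest-neighbor doubling would just
   write each source pixel four times; this version also averages
   neighboring source pixels to soften the 2x2 blocks ("interpolated"
   doubling). Edge pixels are clamped. */
void upscale2x(const uint8_t src[SRC_H][SRC_W],
               uint8_t dst[SRC_H * 2][SRC_W * 2])
{
    for (int y = 0; y < SRC_H; y++) {
        int y1 = (y + 1 < SRC_H) ? y + 1 : y;       /* clamp bottom edge */
        for (int x = 0; x < SRC_W; x++) {
            int x1 = (x + 1 < SRC_W) ? x + 1 : x;   /* clamp right edge */
            int a = src[y][x],  b = src[y][x1];
            int c = src[y1][x], d = src[y1][x1];
            dst[2*y][2*x]         = (uint8_t)a;
            dst[2*y][2*x + 1]     = (uint8_t)((a + b) / 2);
            dst[2*y + 1][2*x]     = (uint8_t)((a + c) / 2);
            dst[2*y + 1][2*x + 1] = (uint8_t)((a + b + c + d) / 4);
        }
    }
}

The point being that the renderer only has to fill a quarter as many pixels; the doubling pass itself is just a few adds and shifts per output pixel (or essentially free if the object processor is simply told to scale the 160x96 buffer up at display time).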
-
95% as in 95% of the RISC core design, right, not the entire ASIC (JERRY being simpler with just the I/O and DACs -and 8k scratchpad- vs the MMU+OPL+blitter and 4k scratchpad)? All those were .5 micron chips? I wonder how usable the 1st revision JERRY chips were . . . if they were tolerably usable, maybe Atari could have pushed an even earlier launch with a simple DMA sound+I/O ASIC (maybe an FM synth chip too) instead of JERRY, or maybe even directly re-used the sound and controller I/O logic from the Falcon.

Thus, Atari was involved at day 1 of the design. The business side and funding were ironed out in mid 1990. From other interviews, we know that Atari was working on the Panther as early as '89, and the Panther chip had just been finished when they funded Flare, in 1990, to develop the Jaguar: http://www.konixmultisystem.co.uk/index.php?id=interviews&content=martin

Yes, Brennan's recognition of the Panther's limitations led to his suggesting that Atari drop the Panther and push for a new, more streamlined, 3D-supporting design. (which, in the end, not only added a lot of 3D grunt, but also addressed the major problems that made the Panther so unrealistic) Albeit, given Brennan's apparent dislike of the Panther concept, I wonder why they didn't go all in with the blitter in the Jaguar for both 2D and 3D performance and do away with the object processor entirely. (so more like the Lynx or Flare 1/Slipstream)

For that matter, I wonder why Atari ever had the Panther in development when the Lynx chipset made the basis for a far more flexible/programmer-friendly/realistic/cost-effective console at the time (granted, it needed a bit of tweaking to be implemented for a proper 4th gen console). Hell, even an STe-derived console could have been more practical than the Panther overall (especially if they'd added dual playfield support -basically doubling the SHIFTER- with 2 independent framebuffer scroll layers with separate palette entries as well, perhaps a 16 MHz 68k with wait states). Then there was Flare's own Slipstream chip that Brennan could have pitched to Atari while working on the Panther. (though given Atari already owned the IP to the ST/Lynx hardware, Flare would have needed to make a fairly competitive offer for that to be preferable -except the Slipstream design was definitely more attractive than the Panther design Atari had already chosen over the Lynx and ST; the Slipstream was a ready-made audio+video ASIC that meshed relatively well with programming practices of the time and had some pretty nice features, vs the video-only Panther that had yet to be implemented in silicon and was about as unfriendly as MARIA had been with the 7800 -of course, the Slipstream ASIC was designed to interface with Z80/x86 based CPUs, so they'd have had to select among those options if they were going to use the existing Flare ASICs -a 100 pin version supporting an 8-bit bus for a Z80 or 8088/188, and a 160 pin version supporting 16-bit x86 chips and adding an on-chip floppy controller and supposedly added logic for better bus sharing)

Hmm, I wonder how quickly they completed the redesigned Object Processor. 
(if they had the OPL+blitter completed by early 1991, that could have potentially meant pushing a more limited precursor of the system out by late 1991 -no GPU or JERRY, so they'd have had to work with other hardware available much more quickly, be it off the shelf or derived from the ST, Slipstream/Flare 1, or Lynx hardware)

Not having any 4th gen game console really hurt them, especially in the US market (and in general revenue/stability). Granted, a 1989-released Lynx/Slipstream (or even STe-derived) console could have been better than waiting until 1991 (even with better hardware than any of the '89 options). Even the Panther could have been better than nothing (mediocre compared to the other options -even the STe based option- in terms of practical use and ease of programming). 1991 was definitely the latest point Atari could push, both due to the competition and due to Atari's financial position ('88/89 was really the healthiest period in that respect; '91 had slipped a bit but marked the last point where they were reasonably stable, with the further decline in '92 and dire straits of 1993).

1989 marked the point when the 7800 went from the >1.3 million (US alone) annual sales of '87 and '88 down to under 700k in '89 (and then under 100k in 1990 -not comparing software sales, of course). So '89 was a good time to shift gears as such (plus their financial position at the time). Hell, they probably would have been a lot better off launching a home console version of the Lynx in 1989 and waiting until 1993 for the handheld version. (at that point, reflective color LCD screens might have been borderline acceptable in contrast/quality -of course alongside "deluxe" backlit models with higher cost, bulk, and much weaker battery life)
-
Oh, wow, I hadn't realized there was no DMA support at all (be it direct DMA from the scratchpad or DRAM, or a small buffer the DSP could write to -like the FIFO in the 32x, though there's also full DMA for that, just not officially supported in any of the dev tools -the homebrew coders figured it out thanks to a leaked Sega diagnostic cart). That would add a lot of unnecessary overhead on the DSP compared to hardware buffered playback. (why not implement a simple DMA circuit?)

Also, I thought the PWM DACs in the Jaguar were configured as 14-bit stereo supporting up to 207.8 kHz (implemented as 4 PWM registers at 26.6 MHz configured as 7-bit linear DACs and paired for one left and one right 14-bit channel at 207.8 kHz), though I'm not sure why they wouldn't have configured them as 4 8-bit linear DACs paired for 103.9 kHz 16-bit stereo instead.

Having something like the Falcon's DMA sound logic embedded on JERRY would have been great. (just use the DSP for scaling, interpolating, decompressing, managing effects like reverb and digital filtering, realtime FM/additive/wavetable/subtractive/granular/etc synth, and mixing any additional channels -when you needed more than 8- or just plain 44-50 kHz 16-bit stereo DMA sound with the DSP mixing all the time) I wonder if Flare did the same thing with the Flare 1/Slipstream design. (it would be odd given how simple a DMA PCM circuit is)

Also, the Jaguar's DSP is more like a general purpose CPU highly optimized for graphics oriented tasks (being the same RISC core as the GPU in TOM), similar to some other "real" GPUs like the older TMS340 series. Those graphics-oriented optimizations ended up providing a fair amount of performance useful for audio-DSP work as well, and while it used more gates/transistors than most contemporary dedicated DSPs of similar performance, it was a low-cost option for Atari due to owning the IP and having it on a cutting-edge .5 micron process (so cramming a lot of gates on a small chip), plus that meant putting it on-chip with the I/O and sound logic.

In any case, the DSP was still sorely underused the vast majority of the time. Mostly simple MOD players, maybe using ADPCM and/or interpolation, and a few games that may have pushed other synth methods. I think some may even be using plain uncompressed 8-bit PCM for MOD music, something on the level of the Amiga's basic DMA sound system. (in that sense it would have been far more cost effective to simply implement a PCM sound system with hardware note scaling -per-channel pitch/sample-rate control- and maybe hardware decoded ADPCM support -maybe even 2-bit ADPCM and/or 1-bit CVSD in addition to 4-bit ADPCM; that would have been FAR simpler and cheaper than adding another RISC core -maybe cheap enough to make them consider a more powerful CPU with a cache -in that case, you could even have simple Falcon level DMA sound with more CPU resources dedicated to audio work)
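For reference, those rate figures fall straight out of the PWM math (assuming the 26.6 MHz clock figure above is right): a free-running N-bit PWM DAC needs 2^N clocks per conversion, so its maximum sample rate is clock / 2^N. A quick sketch in C:

#include <stdio.h>

int main(void)
{
    double clk = 26.6e6;   /* assumed JERRY PWM clock from the figures above, in Hz */

    /* A free-running N-bit PWM DAC needs 2^N clocks per conversion, so its
       maximum sample rate is clk / 2^N. */
    double rate7 = clk / (1 << 7);   /* 7-bit: ~207.8 kHz */
    double rate8 = clk / (1 << 8);   /* 8-bit: ~103.9 kHz */

    /* Pairing two 7-bit DACs per channel: ~14-bit stereo at ~207.8 kHz.
       Pairing two 8-bit DACs per channel: ~16-bit stereo at ~103.9 kHz. */
    printf("7-bit PWM pair: %.1f kHz\n", rate7 / 1000.0);
    printf("8-bit PWM pair: %.1f kHz\n", rate8 / 1000.0);
    return 0;
}

Either way the output rate is far above the audible band, so trading half the rate for roughly 2 more bits of resolution seems like it would have been the better deal.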
-
Another note on this: Chilly Willy (32x homebrew programmer) recently mentioned that Jaguar Doom actually does all the game logic on the 68k, with only the graphics (and sound) handled by the custom chips. (presumably the 32x version Carmack assisted with in parallel with developing the Jag version also used the 68k for the game logic -in that case without bus contention, but at ~58% the clock speed- with the SH2s doing all the rendering and sound, but there's no source code available for 32x Doom like there is for Jag Doom, so there's nothing definite on that issue)
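For a rough picture of that split, here's a compilable schematic in C -just my own illustration of the division of labor described above, with made-up function names, not anything from the released Jag Doom source:

#include <stdio.h>

/* All names here are hypothetical stand-ins, not from the actual source. */
static void read_joypad(void)       { /* 68k: poll the controller */ }
static void run_game_tic(void)      { /* 68k: thinkers, physics, AI -- the "game logic" */ }
static void build_frame_setup(void) { /* 68k: decide what to draw this frame */ }
static void start_gpu_render(void)  { /* kick wall/floor/sprite drawing on the GPU+blitter */ }
static void service_dsp_audio(void) { /* DSP keeps mixing sound/music in parallel */ }
static void wait_render_done(void)  { /* 68k waits (and shares the bus) while the custom chips draw */ }
static void flip_framebuffer(void)  { /* swap display buffers */ }

int main(void)
{
    /* One iteration = one frame of the split described above: game state
       advances on the 68k, rendering and audio run on the custom chips. */
    for (int frame = 0; frame < 3; frame++) {
        read_joypad();
        run_game_tic();
        build_frame_setup();
        start_gpu_render();
        service_dsp_audio();
        wait_render_done();
        flip_framebuffer();
        printf("frame %d\n", frame);
    }
    return 0;
}

The practical upshot of that layout is the one mentioned above: the 68k spends the frame doing game logic and then contending for the bus, so the custom chips never get the whole machine to themselves.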
-
They're bad, especially when compared to Virtua Racing on the 32x, but I am able to get a pretty good pattern going and can find my way around the tracks. How much of that is luck, I don't know. It seems to me that when racing against the computer, even despite crashing and rolling your vehicle often, you can still keep pace with the rest of the field.

I'd say they're pretty bad compared to Stunt Race FX on the SNES . . .
