Everything posted by kool kitty89
-
SMB 3 didn't use the MMC 5 chip. SMB 3 used the MMC 3 chip. I got that info from http://www.gamefaqs.com/nes/916386-nes/faqs/2946 and http://en.wikipedia.org/wiki/Memory_Management_Controller . Right, thanks, I remembered right about the MMC5's capabilities (though I oversimplified the feature set), but mixed up SMB3: it uses the earlier and cheaper MMC3 and thus does not add the 8x8 attribute cells. (normal 16x16 cells) The MMC3 does add scanline interrupts to facilitate added software effects. (on top of various scrolling effects, it would allow interrupt-driven raster effects as well -like palette reloading) I'd thought the NES had built-in raster interrupts, but maybe I was mistaken. (or maybe it was hblank interrupts rather than scanline interrupts) That's just done via bank switching, not a fundamental extension of the address logic. It's actually unattractive in some areas due to the multiplexed address and data lines and the need for an external latch to make use of the expanded addressing. (so less flexible than some external bank switching schemes anyway) The HuC6280 expanded the 65C02 (or rather the "full" version with the added instructions from Rockwell -and thus all the instructions of the 65816 but without the added internal logic) to a 21-bit (2 MB) address space (also using bank switching, but non-multiplexed and with some other advantages, including internal logic facilitating single-cycle bus accesses rather than 1/2-cycle accesses requiring faster memory -the same thing several 6502 systems accomplished with external logic). The 6280 also added sound and I/O hardware on top of the CPU core and mapping logic. And, of course, there's various options for external banking with discrete logic or a custom mapper IC, or the option of merging such logic with a 6502 (or '816) core for a custom CPU ASIC.
(which was relatively common with the 6502 -especially due to it being relatively cheap/easy to license; even Atari Inc did that by merging halt logic into the custom SALLY 6502C for the A8, and Nintendo/Ricoh did it twice with the NES's CPU+sound+I/O chip and the SNES's CPU+I/O chip -the latter also added a fast multiplication coprocessor on-chip iirc) Unlike the 650x family (or even the Z80 or x86 to some extent), the 68000 was not particularly cheap/attractive to license, and thus you didn't see nearly as many licensed/custom implementations as such. (only large, dedicated foundries tended to license it -Signetics, Hitachi, etc- up to the mid 90s, when you saw it being more readily used in embedded ASICs -like Sega merging it with the VDP/system ASIC in late model Genesis consoles and with the audio ASIC of the Saturn) That's also a major reason Nintendo went with the 6502 architecture twice, and both times with custom implementations merging additional logic. (or NEC doing the same, or Atari with SALLY, or with the single chip VCS, etc) So even with CBM's in-house chip manufacturing, there would be considerable overhead from the license to counter the advantages of in-house production. (let alone economies of scale -the volumes CSG/MOS would be producing it in vs what Motorola or other licensees were putting out -unless CBM sold 68000s as a general 3rd party vendor on top of using it in-house) I dunno. Audio from the Amiga Paula chip seems "warmer" to me. The Yamaha just sounds "synth" in comparison. I can't imagine the Yamaha chip doing as good of a job with the music from, say, Shadow of the Beast. There are a lot of trade-offs, but in a pure "game" setting without an Amiga with a beefed up CPU and lots of RAM (for large samples and software mixing), the only major disadvantage of the Yamaha chip is for some percussion instruments -which is what you most often see sample channels being used for to supplement FM.
(everything else is possible to synthesize -or in some cases nearly perfectly replicate, and at much higher sample rates/clarity- with FM, though some string instruments and such may require pairing channels -which you'd also do for echo/reverb effects among other things) There's a lot of mediocre FM synth out there, especially underutilization of 4-op FM synth chips (like the YM2151 and the YM2612 in the Genesis/MD), but also some exceptional examples that really show what's possible. (even without added PCM percussion in some cases -though FM percussion is certainly one of the weakest areas, you still have some good examples from the likes of Sunsoft and Technosoft on the Genesis among others) For 2-op FM a la Adlib, weaker sound is less surprising, but even then there's tons of wasted potential. (underuse of pairing channels for more complex sounds, underuse of the variable waveforms, underuse of the full 9 channels if pairing isn't used, etc) The 4 channel limit of Paula (without software mixing) is rather significant as well in terms of the complexity of music; granted, if you pair 4 channels on the YM2151 you'd be limited to that too (but it's optional depending on the case). You see music catering to the limited hardware channels of the Amiga as such, and better examples of FM arrangements catering to the strengths of that as well. (it's really nice to have both though -and that's why you see a lot of the better sound engines on the Genesis pushing the 5th FM channel in DAC mode as an 8-bit software driven sample channel -almost always managed by the Z80- ) As for Altered Beast in particular, yes I think the YM2151 could do better renditions of most (if not all) of the tracks in the game, though "better" is relative (up to personal taste) and you wouldn't get the best example (by far) if you just tried to emulate what the Amiga does.
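(Side sketch: the 2-op building block behind all of these chips -Yamaha "FM" is really phase modulation- fits in a few lines of Python. This is a toy model of one modulator-carrier pair, not the OPM/OPN envelope and feedback pipeline; the parameter names are mine, not Yamaha register names.)

```python
import math

def fm_voice(carrier_hz, mod_ratio, mod_index, sample_rate=44100, n=64):
    """One 2-operator FM (phase modulation) voice: the modulator's sine
    output is added to the carrier's phase, as in Yamaha's OPM/OPN chips.
    mod_ratio sets the modulator frequency as a multiple of the carrier;
    mod_index scales how hard the phase is bent (more index = brighter)."""
    out = []
    for i in range(n):
        t = i / sample_rate
        mod = math.sin(2 * math.pi * carrier_hz * mod_ratio * t)
        out.append(math.sin(2 * math.pi * carrier_hz * t + mod_index * mod))
    return out

# A 440 Hz carrier with a 2x modulator -roughly a hollow, clarinet-ish tone
samples = fm_voice(440.0, mod_ratio=2.0, mod_index=1.5)
```

(Pairing channels, as mentioned above, just means running two of these voices on the same note -e.g. detuned or with different indices- for a thicker sound.)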
(For that matter, some people don't like the arranged CD-DA tracks on the FM Towns or PC Engine CD versions of Altered Beast.) You don't need a speech synthesizer either; you can do it in software with some speech samples on the cartridge and CPU driven speech algorithms triggering software playback of various samples with the right timing. (with very limited memory, you'd need to use an actual speech synthesis engine as such or very cut-down speech samples in number/length/quality, but with more memory, you can push more for straight recordings -there's also the possibility of lossy compression beyond just dropping the resolution or sample rate, but that's CPU intensive -maybe possible if you halt the game when speech is played) You'd probably want to use both for both purposes (on top of TIA) depending on the circumstances. (a lot of Genesis games make use of the limited SN76489 for music -sometimes even for lead instruments or for additive synth with FM channels, and POKEY is more useful in many ways than the simple SN PSG) Even TIA might have some value for music (mainly for bassline and percussion type stuff), but would obviously be useful for SFX as well. (it would depend on the game and the FX sounds you wanted to determine what mix of sounds you'd want) Given the limits of TIA and the fact you probably wouldn't be doing IRQ based sample playback, TIA would probably be the most attractive option for 4-bit volume modulation PCM playback. It amazes me that nobody [MOS, Atari, Synertek, Rockwell, etc.] produced a modified version of the 6502 that allowed for more memory without bank switching. Don't stone me if the Lynx's version of the CPU actually accomplished this... That has nothing to do with the NES's DMC banking though: the DMC channel is limited to reading 8 or 16k blocks of memory (via DMA) for playing 1-bit delta modulated samples at ~4 to 33.5 kHz with no CPU assistance other than switching banks if a sample exceeds the 8 or 16k limit.
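(To make that DMC format concrete, here's a rough software model of the decode step: each sample byte holds eight 1-bit deltas, LSB first, and each bit nudges a 7-bit output level up or down by 2, clamping at the ends. A sketch of the documented behavior, not a cycle-accurate model:)

```python
def decode_dmc(data: bytes, level: int = 64) -> list[int]:
    """Decode NES DMC-style 1-bit delta modulation: each bit (LSB first)
    raises or lowers the 7-bit output level by 2, clamped to 0..127."""
    out = []
    for byte in data:
        for bit in range(8):
            if (byte >> bit) & 1:
                level = min(level + 2, 127)  # 1 bit: step up
            else:
                level = max(level - 2, 0)    # 0 bit: step down
            out.append(level)
    return out

# A byte of all 1-bits ramps the level up by 16 from wherever it started
ramp = decode_dmc(b'\xff', level=64)
```

(This is also why DMC samples are so bulky for their quality: every output step costs a bit, so a sustained level needs an alternating bit pattern.)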
(iirc, an interrupt can be generated for switching banks) As for the 650x architecture itself, I'm not sure, but I'd think extending the 16-bit address space might have been tricky to do while maintaining full backwards compatibility. (or maybe it was more of a low-cost/silicon issue) It took x86 until the 386's protected mode to achieve a practical flat 32-bit address space. (the planned 32-bit successor to the '816 would probably have added flat 32-bit addressing though) The bigger issue with the 65816 was that it was only offered with address/data lines multiplexed; that wasn't so much of a problem for the 808x CPUs with their slow memory accessing, but for 650x it made things more problematic from what I understand. (on top of not resolving the 1/2 cycle memory access timing needed -without adding external logic) NEC's 6280, while simpler and less powerful (per clock) in many respects, also managed a considerably more useful design with the fully integrated mapping/MMU logic and full single cycle bus accesses, meaning memory could be clocked at the same speed as the CPU for zero wait states. On the side of missing features for the line, aside from a 32-bit memory model there's things like: -added/enhanced prefetch or full dynamic caching beyond that (going beyond the single cycle memory accessing and making it far more useful to have faster 650x chips without going beyond memory limits of the time) -a derivative of the '816 with a 16-bit external data bus (on top of non-multiplexed address lines) And a variety of other possibilities for the line. (the route taken with the Z800/280 would have been rather interesting for the 650x architecture) Yes, but very few games did so, with Phantasy Star among those exceptions. (more games pushed such in the later years of the system's life, but that's a different context than '86-89) Remember that the vast majority of Genesis/MD games were no larger than 512k for the first couple years.
(and 512k was still the common standard in '91 -Sonic 1 was only 512 kB) The Phantasy Star games were among the exceptions to that too.
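(Going back to the HuC6280's banking for a moment: the MPR scheme is simple enough to sketch. Eight registers each map one 8 KB slice of the 16-bit logical space to an 8-bit physical page, giving the 21-bit/2 MB physical space. A simplified Python model -the register layout is condensed and the values below are illustrative, not from any real ROM:)

```python
# Sketch of HuC6280-style MPR banking: the 16-bit logical address space is
# divided into eight 8 KB slices; the MPR register for a slice holds an
# 8-bit physical page number, yielding a 21-bit (2 MB) physical address.

def physical_address(logical: int, mpr: list[int]) -> int:
    """Translate a 16-bit logical address through the MPR registers."""
    assert 0 <= logical <= 0xFFFF and len(mpr) == 8
    slice_index = logical >> 13               # which 8 KB slice (top 3 bits)
    offset = logical & 0x1FFF                 # offset within the 8 KB page
    return (mpr[slice_index] << 13) | offset  # 8-bit page + 13-bit offset

# Example: slice 4 (logical 0x8000-0x9FFF) mapped to physical page 0x42
mpr = [0xF7, 0xF8, 0x00, 0x00, 0x42, 0x00, 0x00, 0x00]
addr = physical_address(0x8123, mpr)  # lands in page 0x42
```

(Compare that with the 65816's scheme, where the extra address bits ride multiplexed on the data bus and need an external latch to use at all -the 6280 keeps the whole translation on-chip.)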
-
Tramiels didn't like spending money on carts where they didn't have to. Krewat said the same thing about the Epyx titles: "Can't be 256K, got to be 128K." Peter Pachla said the same thing about fighting with the Tramiels over the additional RAM in the Jinks cart. The official dev guide says that there must be no additional RAM in any game without it in writing from a Tramiel. Sad reality. They just didn't want to have games that size. It makes sense though, especially since low cost was the one definitive selling point for the 7800. (and by '89, the 7800 was in decline with only about 1/2 the sales of '87/88) I will agree they pushed the low-cost angle too sharply for some things though, and not only could have pushed a higher end side of the software market, but probably could have pushed higher profit margins in general on the games and still had a significant advantage on the competition. (in Europe it was tighter with the heavy computer competition of budget games -yet another reason to push for Euro development support though, a great source of budget titles that would mesh well with the low cost advantage of the 7800) Hell, given the amount of hardware they had to put on some carts, from RAM to POKEY, I wonder if they ever considered passing that off to an add-on. (one that would also be cheap enough to integrate into the main board for future models -more or less something that would "correct" many of the 7800's shortcomings tied to the 1984 release plans) As it was, the production volumes of games apparently never reached the point of economies of scale that favored custom logic for bank switching (embedded in the ROMs in extreme cases), embedded low-cost sound chips (embedded in the banking ICs or maybe even ROMs in extreme cases), glob top ICs, etc. (not sure about the glob-top issue since some pirate games and relatively low production run games used glob tops too iirc) Or completed GUMBY like GCC planned.
But any additional sound chip is still an extra cost. Yes, but we don't know how capable GUMBY would have been, we don't know the added costs related to that, Atari Corp was already tired of the issues from dealing with Warner over GCC and wasn't interested in pressing on with that, etc. Any such custom IC would A. only become attractive once POKEY supplies were exhausted, and B. only if they were producing said chips in high enough volumes to be attractive over using off the shelf options. (especially if also investing in custom banking logic into which the sound logic could be integrated for a general savings on cart board space) Such investments should have paid off in the long run, but only with sufficiently high production and resulting economies of scale. As it was, I'm not sure why GCC didn't cram a smaller off the shelf sound chip into the 7800 from the start or ask for Atari Inc's assistance in designing the sound chip logic, perhaps even deriving it from POKEY. (especially if they could embed it in MARIA, but at the very least make a new low-cost POKEY with a smaller die and pin package -though keeping some of POKEY's I/O functionality could have paid off for the planned computer add-on) Very true. I wonder if part of the issue here is that they'd have to pay someone to design and develop a game like Zelda? It's probably pretty easy to fill 128K with a relatively simple game. Look at Crack'ed. The development costs on an adventure title like Zelda would be quite a bit higher though. This also reminds me: Nintendo wasn't willing to put it on cart initially either (or several other large games), hence the use of the Famicom Disk System add-on for such large games early on, though all eventually went to cart as well and the FDS ended up declining in Japan and was never released in the US. (in part due to piracy)
-
It's sort of a chicken and egg sort of thing: without strong marketing, you won't go anywhere (especially in the US), and even without strong in-house software support, strong marketing and sales of the hardware WILL drive 3rd party developers. But that's not even what I was saying alone: you'd start with the initial 7800 lineup and have more of a budget not only for marketing, but for software development as well. (funding for both is what Atari Corp lacked -they DID have some in-house game programmers a la the computer staff, but they really lacked the resources to put forth any sort of major development in general -otherwise they could have been outsourcing to 3rd parties in general much more than they did) Having exclusive games wasn't THAT critical either, but having a lot of marketable games WAS critical regardless. (exclusives are important, but just one facet of the overall issue, and one that feeds back into funding -exclusive licenses are considerably more expensive than non-exclusive ones -or even temporary exclusives that give a several week or month lead on the release on one platform over another, plus commissioned games in general) Fully independent/licensed 3rd party support would have been a huge factor too though, and had been critical for every platform since the NES. (just as marketing has -in-house software has only occasionally been the deciding factor, and even then never for all regions -Nintendo losing Square was one of the biggest single factors -if not the biggest- in their weaker competition with the N64, especially in Japan -the catalyzing effect it had on pulling other 3rd party support in favor of Sony was massive as well) I'm operating on the notion of Tramiel still not releasing the 7800 until '86, or '85 at the earliest.
In the face of that, there is no chance for them to secure JP support, because by then the Famicom had come to rule the roost in Japan and Nintendo had already done the exclusive contract thing with said JP 3rd parties. They'd have to try for western devs, but many had fallen off during the Crash or gone to the personal computer market. The 7800 would still need exclusives. Atari Games, GCC, and internal consumer developers ensure that. That's a totally different context than what I'm saying, and implies Atari Corp putting more effort into other things, which also may have left them better off. They wanted the 7800 out ASAP, and a proper transition could have facilitated that. Assuming they DID hold off on the 7800's release, that could mean various preparations for securing game licensing, building up software more, making connections with various 3rd party developers, etc. If all of that was diverted to the ST instead, that may have meant the ST got better marketing or even better hardware from the start, among other things. (and that also could have put Atari Corp in a stronger position in general) And even then they could have still looked into Japanese arcade licenses much sooner than they did. Plenty of other possibilities for this context. Again, exclusive games are well down the list of the issue: having reasonably strong 3rd party support (even for multi-platform games in general) would be a MASSIVE step up, especially if those games were good quality versions on the 7800 and got good marketing. (unless the lower price point and 2600 compatibility could push beyond that and broaden the market for Atari in spite of weaker net software support -getting European developer support would also be significant and something they missed out on historically) In that context, we're already talking about Atari Corp being behind Nintendo in general, but much better off in the market than they were historically.
(a smoother transition with a 1984 7800 release, more in-house development staff, and probably a good relationship with Atari Games would have been the context for Atari Corp to actually pull AHEAD of Nintendo at the time) Yes, but they'd have gotten nowhere without that marketing, and that's partially depicted in Europe. Sega really just lacked the right marketing; they had enough good software to compete in '86/87, and having reasonable 3rd party support would have bumped things a lot more. (again, more or less what they had with the Genesis, but earlier and with some contextual differences -rather than breaking Nintendo's stranglehold on the US market, they'd want to prevent that from forming and thus get generally competitive multiplatform 3rd party support and prevent Nintendo's exclusivity in general -including pushing even Japanese developers to support it and possibly finding loopholes around any of Nintendo's licensing contracts) Tramiel did, as did several others, but none would agree to Warner's terms (either lacking the funds or unwilling to make such an investment with the corresponding risks). That's why Warner later called Tramiel (more or less out of the blue iirc) and offered the split deal on more favorable terms. (Marty or Curt would know more, but I'm not sure what the terms were for any full Atari Inc sales proposals) They didn't need Atari Games or GCC specifically; they just needed more funding and resources for marketing and (to some degree) in-house/licensed software in general. (having the better chunk of console/computer programmers would have done that; having GCC and Atari Games would have helped too though) Sega managed to beat Nintendo in several European markets mainly due to marketing/mass market appeal and distribution.
(and the arcade tie-in was only part of it -having better Euro sports games is also a notable area, on top of the competitive marketing and perception of superiority of the SMS in general -especially the graphics being closer to 16-bit computers vs the NES looking more like the C64 to a fair extent) Yes, or no sale at all rather. A sloppy sale with other snags (which definitely could have included the 7800/GCC issues, as those had nothing to do with the split and everything to do with the transition away from Warner) could have been just as problematic, and worse than a split that got a proper transition. (the loss of Atari Games wasn't nearly as bad as the legal conflicts the sloppy management of said split forced on Atari Corp vs Games, the separate issues over the 7800/GCC, Morgan's reorganization plans, general chaos at Atari in general, etc) Best case would be that Warner had stuck it out regardless, or spun off Atari Inc in some other manner that avoided such conflicts and problems but still got it off the books. And, iirc, the litigation Nintendo was fighting. That probably helped, but was a relatively small factor in the short run, and Nintendo took years to pull their exclusivity. (no antitrust suits were ever won in court against Nintendo iirc, but some were settled favorably out of court) There were plenty of loopholes prior to that though, and the lack of interest/market share was the bigger problem for getting 3rd party support. Except that ignores that the NES shipped with the lockout chip in place already. So they already planned the enforcement of such policies in the west. Yes, but so did the 7800 from day 1, and in any case, lockout only meant they could (to some degree) protect against unlicensed development; actually enforcing restrictive contracts (beyond royalty agreements for licensing) would depend on Nintendo attaining a strong market position and market share without strong competition to make such terms unattractive to 3rd parties.
That, and they also needed the chip for region locking. (something some previous systems did by necessity due to differences in PAL and NTSC logic internally) But there was no technical limit that required such licensing whatsoever. It would have been like Atari, Mattel, Coleco, etc offering such licensing contracts to said developers. (the ONLY thing Nintendo could have supplied was development documentation; nothing else would have remotely been a factor) I need to look more into the specifics on the Famicom, but I really don't see how they managed to enforce such licensing agreements in Japan. (and extending them to western markets would have been a separate issue anyway, on top of the various loopholes for publishing under proxy or licensing the game to be published on another platform -sans games Nintendo explicitly paid to be licensed exclusively for their platform -which is a different context from the limiting licensed publisher contracts) Except Nintendo kinda had those policies in place in Japan already from my understanding. (otherwise devs in Japan would've supported the Mark III or other JP home consoles in the way they did the NES. They didn't) Yes, but how secure were they? I really don't see how there couldn't have been predominantly unlicensed Famicom publishers in Japan, let alone licensees who later jumped ship for unlicensed releases. In the west, Tengen jumped ship as such, and the only snag was the copyright infringement of the code in the RABBIT chip (they were cleared of patent infringement, but the copyright issue remained iirc). Had they done that in Japan, it would have been a non-issue. (and I think Tengen Japan may have done just that) It's rather ironic that Atari Games put all the effort into R&D for reverse engineering and implementing a clone lockout "key" while contemporaries used a simple/cheap (and legally foolproof) voltage spiking mechanism to get around the lockout chips.
(later model NESs supposedly prevented that from working, but I believe later voltage spike mechanisms also broke that, and the number of such consoles was few anyway -plus the NES2 removed the lockout chip entirely) Yes, and the only thing stopping that was funding, just as with strong marketing. (a smooth transition would have not only meant an earlier entrance of the 7800, but many other areas leading to more funding earlier on to re-invest into software and marketing) By '87/88, Atari Corp was in a much better position financially than '84/85/86, and that's part of why you saw a lot more investment in the 7800 among other things. OTOH, in an ideal case where Atari Inc stayed whole and NATCO was completed, they'd almost certainly have had BETTER games than the above list by '86/87, on top of stronger 3rd party support and a much larger chunk of multiplatform games in general. (something I forgot to mention explicitly above: every multiplatform game means 1 less exclusive for the competition) Another thing I forgot to mention earlier was that Nintendo could focus most of their resources on the NES while Atari Corp had to balance several console and computer platforms on top of the other issues. (that also feeds back into the possibility of pushing the likes of the XEGS in '85 with unified software support)
-
NATCO, perhaps; Atari Corp: no. Atari Corp WAS Tramel Technology Ltd., which had been previously founded. It was just TTL with added properties which had formerly been Atari Inc's. (ie the consumer division -of course, any added staff were a separate issue entirely and would technically be no different than hiring new staff in general to build up the company, but the added context of taking on Atari Inc's former consumer products would put that in a bit of a different context) Plus there was never any bankruptcy (Atari Corp never went bankrupt either, or Games for that matter), so it would be more like one of various other automotive companies buying out a division of GM and folding it into themselves rather than actually reorganizing GM. From the press stories of the time and what has already been posted here, it appears Jack did do the firing. Most if not all of Atari's programmers went - as an example, this was mentioned in a couple of articles related to the ST development kits. 1) NATCO was not put into play yet, hence they wound up having no case. Their grievance was that since they were sold on it and signed on, they were part owners (employee stake) and the sale was illegal. The NATCO reorganization never occurred. 2) Jack fired zero NATCO staff because of point 1. Likewise, once again, Atari Corp. was a completely different company. People were being hired over to that, not fired. I'm sure they certainly had that viewpoint based on how the transition occurred, but if they were being "fired" it was from Atari Inc., because they weren't needed at Atari Corp. Likewise, most of the press reports of the time were misleading or incorrect - they also stated Jack as "taking over" Atari, and often confused the two companies as well. That, and most of the Atari Inc computer staff (who hadn't already left) DID get hired to Atari Corp.
(including many who had been involved in games) It seems like the best case scenario for managing a transition with the split of Coin and Consumer going to TTL would be something like: notifying all Atari Inc staff well ahead of time to allow upper management to collaborate in smoothing the transition, and allowing staff in general to make preparations for finding other jobs if they couldn't be carried over. Part of that should have been reworking the NATCO plans into reorganizing Atari consumer alone, and doing so specifically in the context of TTL/Atari Corp. (preferably with a collaborative effort of Atari Inc and TTL staff and possibly even retaining Morgan or some other management staff after the completion of the transition -all on top of Warner and TTL mapping out precisely how the properties were to be divided) Yes, and he also was far more effective at managing Atari than Nolan had ever been. (in terms of a successful business) However, his management (and conflicts with Warner) led to some substantial problems elsewhere in the market. The way the A8 was managed had a lot more than just the top-end cut out; it was mismanaged in general. (even the low-end models weren't expandable enough, marketing/advertising was nowhere near that of the VCS -though it seems to have gotten better around '83/84 with the Alda ads and such- among other things -and then there's Europe . . . and Japan for that matter) Apparently there were also conflicts between Atari's computer and game divisions, but I'm not sure on the details of that. Yes, and they should have either forced them to do it, or at least carefully modeled the game distribution network on WEA's. (possibly bringing in WEA staff to help establish that -if not folding in some for actual operations of Atari's distribution) It didn't really matter how they managed to establish a functional and stable distribution network for Atari, just that it got done.
(again, probably the #1 reason for the crash -or the destabilization of Atari and the market in general that led to the crash) Electronic Gaming Monthly made a big deal about it back in 1991. Time Warner was distributing EGM, and EGM said Time Warner had informed the Tramiels of their intent to reacquire the assets of Atari Corp. following their (re)acquisition of Atari Games Corp. 1991 definitely would have been the time to pull that off: much later and you'd have a much weaker company on your hands. In 1991, Atari Corp had already fallen quite a ways from their late 80s peak, but that downward spiral would continue as the ST line was discontinued, the Lynx was pulled back, etc. (the Jag hype helped bring them out of debt and stabilize things long enough to win the Sega suit -and some other things iirc- and the Jag almost seemed to be making headway in 1994, but by '95 they were bleeding money again and burning through the Sega winnings -which is why it made so much sense for Jack to liquidate things as he did) It certainly would have been interesting to see what Time-Warner could have done with Atari Corp from '91 onward. (from the computers to the Lynx to the Jaguar -or maybe even an earlier console pushed to get them back into the mainstream home game market ASAP, and with better funding and in-house software development via AGames/Tengen/TWI on top of the management) The razor and blade model is more or less what happened once competition pushed prices down more, and more so in following generations with licensing contracts for 3rd parties and such.
-
Hell, back in '87/88 it was already extreme enough to make them opt for a 32kx8-bit SRAM in games like Winter Games over dual 8k SRAMs. (let alone DRAMs with added logic -custom logic would have made that cheaper, but they weren't using custom LSI banking logic either) Yes, it's a pretty awesome chip and hopefully we'll see some stuff even putting better Amiga tunes to shame. (and thus also better than anything Atari Games ever did with it -at least AFAIK: all the Atari Games YM2151 music -and most western arcade games in general- tends to be rather average and bland in terms of what the chip is capable of -weaker than the better examples of Adlib stuff, and the YM3812 is much less capable than the 2151 -tons of awesome 4-op FM from better examples of Megadrive/Genesis games -Japan, North America, and Europe- as well as various Japanese arcade/computer stuff) It's well beyond what most/all NES carts have onboard. (even some historical 7800 carts do a lot more than most -or even all- NES carts, though if you go by the context of the 7800 only using 16k of the 32k SRAM on the carts, it's more in the range of what a few NES/FC games pushed -FC games pushed a lot of sound add-ons, but NES didn't push any since they were never offered for the expansion port as such and the on-cart audio input was removed) A relatively small percentage of NES games use any advanced mappers or added RAM. (even SRAM for battery saves was extremely rare) Most of the added chips were just normal bank switching logic rather than mappers actually enhancing the PPU's capabilities. (or in a few cases accelerating other things -I think some were used to completely avoid the CPU overhead for switching 8/16k banks for delta modulation playback) No, those games only avoided bank switching, like the 48k (and smaller) 7800 games did. The vast majority of any added logic for later games would be simple bank switching.
(a relatively small portion of games actually used "advanced" mappers offering significant enhancement -like the MMC5 allowing 8x8 color attribute cells rather than the default 16x16) Wasn't STUN Runner's [arcade version] graphics made possible by a very special and expensive TI graphics chip? It wasn't a "graphics chip" as such; it was a graphics-oriented DSP (or DSPs rather: 2 or more depending on the Hard Drivin' based arcade board in question) used for rasterization (which they were relatively inefficient at compared to dedicated blitter logic) and for 3D calculations, etc. (with a 68010 as the main CPU for game logic and such) That DSP-like chip in question was flexible enough to also be re-used as a sound DSP on some of the boards. And the 3D math and logic (including realtime physics) would be among the biggest issues for practicality on the 7800. It would probably be possible to some extent, but extremely cut down (ie much more than the Lynx -which has a fast math coprocessor on top of a fairly fast 65C02 and a blitter for simple fill operations). Line fill is useful, but only a small component of what's necessary for building 3D graphics. Even the Jaguar might have had a hard time managing an arcade perfect STUN Runner. (then again, given the resolutions you'd be pushing on a TV, it wouldn't ever be truly arcade perfect as such -and the Jag can do a ton of stuff the arcade machine would be horrible at -the Saturn Race Drivin' port isn't arcade perfect either, but it would obviously be capable of many things the arcade board wouldn't do well) You'd still have limitations though, especially since you won't be doing any A8 style POKEY tricks.
(you could do software timed versions of many of them, but not the same interrupt driven ones -you also have less CPU time in general than the A8, let alone NES, and NES pushed quite a bit of CPU driven sound -less often graphics- stuff from envelopes to PCM) And then there's things like the hardware delta modulation channel. (you could match that to some degree with good quality 4-bit PCM, but that would need to be done in software and cater to MARIA's bus sharing issues -which are the main problems with using IRQ- and have enough CPU time for that to be practical as well -and the NES could easily beat any of that with software 7-bit PCM anyway -interrupt or software driven) So the biggest advantages for the 7800 XM for sound (sans the 2151) would be use of POKEY's (and TIA's) strengths in hardware that wouldn't require CPU intervention (you do have 6 channels vs the NES's 5 -stock, or always if you don't count Famicom- and more flexible in some areas than the NES's sound -but less so in others) The same thing as if you were going up against the C64's SID. OTOH, if you wanted to limit things for late 80s tech, you'd probably limit ROM to 256k max (256k NES games only appeared at the tail end of the 80s -512k not until the 90s and there's only a handful of such large games -Zelda was only 128k, for example), and probably limit RAM to 32k as well. However, you wouldn't have to discount the YM2151 either, just use it to simulate a cheaper sound chip from the time. (like the low cost OPLL -YM2413- used in add-ons for the SMS and MSX, or the full OPL2/YM3812 of Adlib and such or the older/slightly simpler YM3526 -or for non FM stuff as well -like a simple pulse/triangle/saw/etc sound chip)
-
Forgot to mention a blitter (or alternate graphics driving logic) with variable source and destination depth and indexed color look-up. (let alone hardware decompression logic -or a simple/low cost DSP coprocessor aimed at such -or CPU if you were doing something simple like RLE)
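Since RLE came up as the "simple enough for the CPU" option, here's a toy sketch of the kind of byte-oriented run-length decode a 6502-class CPU could handle without any decompression hardware (the (count, value) pair format here is just a hypothetical example, not any particular game's scheme):

```python
# Toy byte-oriented RLE decode: the stream is a sequence of (count, value)
# byte pairs. Simple enough that even a slow 8-bit CPU can unpack it on load.

def rle_decode(stream: bytes) -> bytes:
    out = bytearray()
    for i in range(0, len(stream), 2):
        count, value = stream[i], stream[i + 1]
        out.extend([value] * count)
    return bytes(out)

packed = bytes([5, 0x00, 3, 0x1F, 1, 0x07])
print(rle_decode(packed).hex())  # → '00000000001f1f1f07'
```

That's the whole trick: a couple of loads, a counter, and a store loop, which is why RLE is the usual fallback when there's no hardware decompression logic.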
-
Nope, I already addressed all of that (unpacking indexed tiles into RAM -let alone proper lossless compression, or using fewer bitplanes alone for some things -if not using packed pixel graphics), and 5-bits is WAY superior to the NES (better than the Genesis in many, many cases, even without added raster tricks). 4bpp would be ahead of the NES in most cases and even 3bpp would have some advantages. (the master palette depth is also important -ie the ST's 16 colors from 9 bit will often have advantages over the SMS's 16+15 color palettes from 6-bit RGB -and you only get the 15 colors for sprites) The SMS has to deal with everything being 4bpp as well and only 16k to load things into AND hold all the tilemap information, etc. (so there's a rather limited amount you can decompress/unpack into V-RAM -and maybe a tiny amount you could add to main RAM -the rest would need to be updated on the fly and uncompressed or simple enough packing/compression that the CPU can handle it on the fly -the same limitations the PCE/MD/SNES have to deal with as well, except the PCE supports 2-bit data unpacking in hardware and the SNES has 2-bpp tile modes -rarely used for more than an overlay or sparse far BG layer- and the Genesis and especially SNES have more CPU work RAM to buffer into -the PCE's pure planar graphics also make the fewer bitplanes option more realistic on the fly while the SNES's composite planar and especially MD packed pixel formats make it tougher -the packed pixels make higher performance lossless compression and CPU based pseudo-framebuffer based rendering a lot easier though -SNES's Mode 7 is also useful for software rendering due to such and more so being 8bpp with a separate 256 color palette but limited to 16k pixels max) Limitations of how indexed palettes (subpalettes) are used on a tilemap of any given depth can make for massive differences from bitmaps of only slightly higher depth or even equal depth depending on master palette. 
(in the latter case see the ST vs SMS or more so STe vs SMS -the Game Gear had 12-bit RGB though, so it would have more flexible use of color than the Lynx) Granted, several of my points about DRAM costs and shared bus efficiency would also apply to making more powerful/cost effective sprite/tile based systems too. (the shared bus issue would be less practical without caching, though having as few buses as possible would be significant too -ie eliminating dedicated sound/coprocessor buses and having shared DMA to the CPU bus for sound and such)
-
Yes, except that's not a direct comparison either. 5bpp framebuffer is SUPERIOR in most ways to what the Genesis has (and even has some advantages over its massive number of subpalettes). A framebuffer means you can have ANY indexed color on any area of the screen, so even if you have a lower per scanline (or per screen) color count, you still have many other advantages. (hence why ST graphics usually look MUCH better than NES graphics) OTOH, with the SMS, you have 2 15 color palettes to use, though only 1 can be applied to sprites, yet ST games STILL tend to look significantly more detailed color-wise. (but in that case, it's due to 9-bit vs 6-bit RGB for the most part -and in some cases, more restricted ROM space than ST RAM/disk space) I'd say that even a 3bpp framebuffer would have substantial advantages over the NES (or C64) at the same resolution, though less so if it had to be limited to the C64's color set. (with the NES's color set -let alone the A8's or NES's with RGB control used- it would be much more flexible, plus you could do additional raster effects for palette reloading) And character and framebuffer graphics modes aren't mutually exclusive either. (you could have multiple planes with the ability to enable character or framebuffer graphics in general, and the chip space used for sprite logic is generally MUCH more than that used for character generator logic -let alone a simple framebuffer manager) That and having flexible resolutions would also be important, especially if you stuck to packed pixel graphics. (where framebuffer size would be less variable via color depth) RAM for the framebuffer is an issue, but it can be cheap DRAM (and relatively slow) vs the fast SRAM/VRAM used in contemporary systems, let alone investment in bus sharing. 
(interleaved random access DMA is slow and relatively unattractive -OK in the mid 80s, but the Jaguar and Lynx did much better like other modern designs- and packed pixel graphics are more necessary for fast/efficient use of fast page mode accesses, let alone added line buffers -aside from line buffers, separate source and destination will help keep in fast page mode, and dual bank interleaving is a more comprehensive implementation of that principle -the Lynx doesn't have line buffers or dual banks/buses but does rely on fast page mode -so careful management of the game to avoid page breaks is critical to peak performance) In the late 80s and very early 90s, the Amiga's memory sharing scheme was still OK, but not really great compared to alternatives. (the overall performance of the Amiga was still fairly impressive though, and the slower memory interface would also mean less -or even no- performance drop for using ROM as the source and RAM as the destination, or various other options -it's a case where using the FASTRAM bus would also be attractive so the blitter could saturate chipram -short of a much more comprehensive redesign) By the end of the 80s, DRAM was getting cheap enough to quite feasibly have over 256 kB in a game console, and by 1991, it was easily feasible to push over 512k of DRAM (especially on a unified bus) at console prices. http://phe.rockefeller.edu/LogletLab/DRAM/dram.htm Actually, it was more feasible back in '86/87, but the DRAM shortage/crisis of '88 pushed things back up. (not until 1990 did it drop below the lowest 1987 prices) Prior to that, low resolution framebuffer based systems were practical, but limited. The Astrocade was pure framebuffer, though single buffered and only 160x88-102 2bpp. (up to 320x204 with RAM expansion) OTOH, it would have made a better low-end computer than a console. 
(it also had a 256 color palette -and technically it had split-screen indexing for 2 4 color palettes, but not proper cell based attributes like many character systems -or the ZX Spectrum with 32x24 attributes over a 1bpp framebuffer) The Lynx did so with only 64k of shared memory (ROM was used basically as a disk to load data from -and it was too slow to do much else with anyway). The double buffered framebuffer took up some 16k of that for 160x102x4bpp. And in general, a 320x204x4-bit (from 12 bit with raster interrupts for added color effects) display (taking 64k double buffered) would be relatively competitive with 4th gen consoles in general (assuming the blitter is relatively fast, let alone has resources for scaling or affine rendering -or 3D acceleration). 128k would allow that with 8bpp, plus added RAM to work in or decompress into. (even in '89, 256k DRAM wouldn't be unreasonable, 512k might be pushing it, but it would depend on what other trade-offs were made) And 8bpp would mean it would have FAR greater color flexibility than any console out until the Jaguar/3DO. (SNES technically could do 8bpp tiles, but rarely used the feature -and conventional 4-bit -sometimes 2-bit- tiles ended up usually with under 100 colors on-screen, and other trade-offs due to the cell-based nature -hence why the PCE STILL had some advantages in color in spite of 9-bit vs 15-bit RGB -albeit with significant trade-offs) The Flare 1/Multisystem design with a decent chunk of RAM would probably have made a highly competitive 4th gen console. 
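For reference, the buffer sizes quoted above do check out; a quick sketch of the arithmetic (sizes in bytes, bpp = bits per pixel):

```python
# Framebuffer size arithmetic for the configurations discussed above.
# buffers=2 models double buffering (front + back buffer).

def fb_bytes(width, height, bpp, buffers=2):
    return width * height * bpp // 8 * buffers

print(fb_bytes(160, 102, 4))   # → 16320  (Lynx-style, ~16 KB double buffered)
print(fb_bytes(320, 204, 4))   # → 65280  (~64 KB double buffered)
print(fb_bytes(256, 200, 8))   # → 102400 (exactly 100 KB double buffered)
```

So a 256 kB DRAM system with a double buffered 256x200 8bpp display really would leave roughly 156 kB free for work RAM, as claimed below.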
(it could do 4bpp at 512x200 or 256x200, but it was faster with 8bpp 256x200, and as long as you had a decent amount of work RAM, that would be preferable by far -you'd need 100kB for double buffering, and with a 256kB DRAM system, that would leave a full 156 kB for added work RAM -again, very feasible for 1989, let alone later as RAM prices dropped steeply up into the early 90s before stagnating for several years at around $3 per Mbit) Actually, a bigger issue than the framebuffer RAM (destination) would be the source: uncompressed 8-bit graphics cells/chunks would take up a good amount of space (or even 4bpp for that matter, depending on the timeframe), so you'd want to avoid using ROM as the source as much as possible unless you had hardware decompression or hardware texture indexing. (so you could have 4 -or even 2- bit sources and 8-bit destination -or more flexible if you did planar graphics- but otherwise you'd want to unpack graphics to raw 8bpp when loading into RAM -if you could do it well ahead of time, also probably use lossless compression to conserve more ROM) A system with 2 planes (or maybe more) with a choice of a few different resolutions and maybe different color depths as well as character AND framebuffer modes (or maybe framebuffer only on 1 layer), would have made a very flexible system even with no hardware sprites. (you'd use the framebuffer layer for that, and/or use character objects) Still, tons of advantages (and some trade-offs) for pure framebuffer as well. See above, you're heavily oversimplifying things in calling the Genesis equivalent to a 64 color framebuffer. (you'd be lucky to be competitive with a 5bpp buffer overall -let alone with raster effects used, and you'd better hope that framebuffer isn't using a higher output color depth like the Amiga or you'll have more trade-offs) You end up with a LOT of redundant colors if you want reasonably smooth graphics, or more colors if you can sacrifice for more clash or high contrast tiles. 
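A sketch of what that unpacking step amounts to: expanding packed 4bpp source data to one byte per pixel, optionally remapping the 16 source values into a sub-range of the 256 color palette (the lookup table here is a hypothetical stand-in for the "hardware texture indexing" idea above, done in software):

```python
# Unpack packed 4bpp pixels (two per byte, high nibble first) into 8bpp,
# optionally through a 16-entry lookup table mapping source values into
# some region of a 256 color palette. Layout and table are illustrative.

def unpack_4bpp_to_8bpp(src: bytes, lut=None) -> bytes:
    out = bytearray()
    for byte in src:
        for nibble in (byte >> 4, byte & 0x0F):
            out.append(lut[nibble] if lut else nibble)
    return bytes(out)

# Remap the 16 source values to palette entries 0x20..0x2F.
lut = [0x20 + n for n in range(16)]
print(unpack_4bpp_to_8bpp(bytes([0x01, 0xF0]), lut).hex())  # → '20212f20'
```

Doing this once at load time is what lets the cart store 4-bit (or 2-bit) art while the blitter only ever touches raw 8bpp data in RAM.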
Hence why digitized photos or FMV (the good codecs) still only end up with around 30 colors (or less) if clash is to be avoided. (and it's not that dramatically different from an optimized 16 color version of the same image -and in some ways worse than an optimized 16 color version using a higher depth palette to index from) Which is why various bus sharing schemes are important. (if you don't push a multi-bus design) And no, the RAM isn't really expensive compared to the options most consoles took. (128 kB SRAM+128k DRAM in the SNES, 64k+8k SRAM -and pretty high speed too- in the PCE back in '87, etc -compared to commodity DRAM, that's a lot of overhead for the RAM components and at a time when embedding the DRAM interface logic shouldn't be an issue at all) Let alone the advantages added RAM gives for decompressing data into (having compressed into ROM) and using slow ROM without performance loss. (so cheaper games, and still a cost competitive console design as well) Especially compared to the likes of the PC Engine using 140 ns ROMs. (something NEC's vertical integration allowed -apparently more attractive than investing in more RAM onboard the console itself... rather skimpy on the RAM in the CD units too for that matter) The Sega CD, less the CD-drive and interface, and a bit of simplification, would probably have been a highly competitive piece of console hardware in 1991, even cost wise with the 768 kB. (and even if you had the separate framebuffer bus -using the Word RAM to render into-) All it needed was logic for a framebuffer and video DAC, and it would have been pretty awesome. (swapping generic DMA sound for the Ricoh chip +64k SRAM/PSRAM would have been a significant cost savings too)
-
Or make sure the LCD set they get has good SDTV support. (it's very inconsistent and seems like a lot of the cheaper models/brands actually have BETTER and more flexible support for such, though some high-end stuff is OK too -I think Philips tends to be good, maybe Samsung, not sure about Sony -Sanyo seems OK, but those are generally lower-end sets in general -our "720p" Sanyo set also has great VGA quality for a brand that's supposed to have crap scaling/AA -and 1365x768 looks pixel perfect, though is ONLY possible via analog since HDMI thinks it's 720p native -though oddly supports 1024x768 in perfect pixels) The scaling is a bit blurry, but it's not weirdly artifacted like I've seen on a lot of HD sets (especially for SD stuff). Plus, that Sanyo set has some of the most flexible (and convenient to use) aspect ratio and overscan options I've ever seen. (a neat automatic overscan selection option can allow full overscan -with some border- or auto zooming to shave all the border off but maintain the aspect ratio -only 2 general pixel aspect ratios iirc, one for anamorphic and one for normal NTSC res) I think most dedicated LCD SDTVs have good SD support (our ~6 year old Philips set has excellent composite/S-video/component both for 240p and 480i, some of the best deinterlacing I've seen too- though RF isn't as good as some SDTVs -fine for Atari, only really an issue with really finicky Sega RF and the like -which has native composite support anyway -also has anamorphic support but no progressive scan AFAIK). Hell, if you really look hard, there are some HD sets that accept 15 kHz sync via VGA (so you could use it for an ST, Amiga, etc with no sync doubler), but those are uncommon.
-
There are few long games, but they came later. Scrapyard Dog, Midnight Mutants, Commando and Dark Chambers are all pretty long. Also, Fatal Run and Meltdown have password saves. But yeah, would have been cool to have a Zelda. According to the lore, there was supposed to be a Zelda-like game called Time Lords of Xantac. It was Tramiel-ized. Here's the details from Digital Press: ----------------------------------------------- Designed by James V Zalewski. Description: According to programmer James Zalewski, the original concept called for this game to be a “Legend of Zelda” style game to compete against Nintendo. James informed Atari that he would need more memory on the cartridge to handle everything they wanted (map, characteristics, etc). Atari refused to spring for the added memory and thus the idea was scrapped before any work was started. Part Number: CX7867. From the Digital Press "Classic" Guide. I don't get some of these claims as such. They were willing to put 32kx8 SRAM chips onboard the carts, but weren't willing to push for a bit more ROM? (what, 256k?) And then there's the lack of added sound chips used in general. If POKEY was too costly to use practically, they could have opted for the likes of the SN76489 or the 16-pin mono sound-only version of the YM2149. (available by 1987) If they had a huge back stock of POKEYs to use up, they might as well have gone with those though. (there's also the potential for a low cost in-house embedded sound chip, but given they didn't even use custom logic for bank-switching, I doubt that would have been done at the time) Short of potential add-ons of course. 
(like a 1987 counterpart to the XM -possibly using the same 32k SRAM chips and POKEY of games being developed/released in '87 already, but potentially leaving an additional port with the key and SIO lines for a keyboard/computer add-on -so cover their bases for the XEGS, more or less, but better and without the conflicts with the 7800) Except, Zelda on the NES is only 128k (plus the SRAM), and that's no larger than games Atari Corp was pushing by '89. In any case, the 7800 was already declining in '89 and they needed a new platform to push into the next generation by that point. (and had more reasonable funding to push for a much stronger launch than the 7800 had had in '86 -and a strong position of the SE in Europe, and potential Euro developer support to exploit -of course, they'd need good management and marketing to get anywhere in either market) Just cutting down the STe would have been a bit weak hardware wise and the dot clock is hardly preferable for NTSC (extremely tall pixels -compared to square- and heavy composite video artifacting, a 6 MHz mode might have been nice -maybe even 4 MHz for some low-res stuff), plus the 16 color single playfield limit (8 bitplanes with dual playfield support and 2 16 color palettes would have been a big jump -maybe a 5 bitplane 32 color mode too plus 6/7/8 bitplanes using semi-indexed colors along with RGB control with the added planes), though many of those are changes that would have greatly benefited the STe as well.
-
It's not clear to me how the removal of features from just the 400's chips would have saved money in 1979. If anything, two sets of chips would have cost more money? You'd want to eventually merge all those chips, and a side effect of that would be the removal of useless features, but it's the merging of the chips itself that saves you money eventually if volume is high enough. Less silicon=cheaper, lower pin count=cheaper (ie 28 pin POKEY), less board space=cheaper. (due to lower pin count and no PIA) Unless you want a computer add-on for the system, there's no reason to have such features at all. Actually, that's also what makes using SRAM attractive, especially once 2k densities got affordable (compared to 512 byte chips): consoles (that avoid use of framebuffers and have direct access to ROM) don't need the RAM, and unless you invest in embedded DRAM interface logic, SRAM makes a much more attractive option for such embedded systems. (even then, there's the issue of more board space due to the number of DRAM chips used -at least until 4-bit wide DRAM chips became available -you could certainly argue that by the time of the Genesis, they should have been pushing for DRAM, but the earlier you go, the more trade-offs there are -if you want the A8 framebuffer capabilities, you'd definitely want DRAM though -so not like the 3200's planned 2k SRAM as of 1981, but you'd STILL have more work RAM than the Colecovision -even if almost 1/2 was eaten up by the character index/map/table -or whatever term Atari used-) There's a reason the Colecovision, SG-1000, Master System, NES, and even PC-Engine (but not CD) used all SRAM (aside from video -PCE was also SRAM for that due to speed iirc). The limited work RAM also limited framebuffer hacks to few systems (or requiring on-cart RAM) up to the 4th gen. 
(even the PCE has too little work RAM to really allow hacked framebuffer graphics -unlike the SNES and Genesis) The CV has DRAM and embedded interface logic in the VDP (thanks to TI), and while Atari could have pushed that for the 5200 to allow the bitmap modes (and potentially decompression of ROM data into RAM), it could have been a more competitive move to take advantage of the shared system bus that didn't need the VRAM and make do with less work RAM in general. (restricting mainly to character graphics as such with 2-4k SRAM to work in -if they'd gone with embedded DRAM logic, that might have paid off more in the long run -especially when 16kx4-bit DRAMs became available so you'd only need 2 16-pin DIP DRAMs for 16k) As for common production: yes, that can be significant, but there's also a wall between cheaper/simpler components and using more expensive components across the board. Otherwise, the 6507 (which also uses a modified die compared to the 6502) wouldn't have existed, or at least would have been replaced by the 6502/6502c once Atari ramped up A8 production. Removing all those features certainly would have made it tougher, since you'd have to require/force developers to make games that stuck to those limitations for forward compatibility with a nonexistent console in general. (unless they launched the console derivative back in '79, but that probably still would have been in the $400 range, maybe closer to $300 like the Intellivision if they chucked the DRAM+logic in favor of just 2k SRAM, etc) But that would really be unattractive and rather pointless to limit the computers like that. (albeit removing PIA, dropping 4 of POKEY's ADC inputs for plain IO lines, and going with 2 ports from the start would have benefited the A8 line's cost effectiveness by a good margin as well) And that's why it makes more sense to A. stick to only full computers (with low to high end), and the closest thing to a console would be a low end gaming computer still including a keyboard. B. 
design a fully standalone console that avoids the restrictions of compatibility, or focuses on VCS compatibility alone (regardless of using some of the A8 hardware or not). The 5200 opted for incompatibility with both and had potential for being more cost effective than either option, but fell far short of that due to other areas of missed cost optimization. (such an incompatible system could still have been optimized with provisions for making an adapter module as low-cost as possible -one obvious thing would be using the 6502 as the 6507 as well as compatible controller pinouts with the lines wired to an expansion port -or the cart slot- for passthrough to RIOT+TIA I/O) The 5200's POKEY could have been cut to 28 pins from the start since SIO unused and the key inputs would be unnecessary if they were aiming at a 2 port design. (just use POKEY's POT lines for POT and I/O input plus the GTIA I/O lines) And lockout would be important to add as well. The low-cost computer idea is a good foolproof option for Atari's position at the time: one less unique product line to support (from the consumer PoV), just the VCS and the A8 line with a bottom end computer positioned in direct competition with the upper end of the console market. (but making a far better game system than the VIC-20 ) Hell, in the extreme, for compatibility alone (and not really intending the system to be used as a proper computer), they could have pushed for a super cheap ZX80/81/Timex 1000 type keyboard. (maybe with a port for an optional "real" keyboard as with the XEGS's keyboard port) Or go beyond that and only include the keys most used by games. (space bar, start, probably the number keys, etc) The inclusion of the keyboard expansion would still make it attractive as a computer too. 
(and much cheaper than any contemporary console computer upgrades since it would literally be just the keyboard -or maybe they could include the SIO lines on the same general expansion connector and have the SIO port on the keyboard unit as well) Umm, the box wouldn't get any smaller at all if you used a super compact membrane keyboard (or negligibly so). In the extreme case, using a really compact/minimalistic membrane keyboard would be almost no different from using a solid slab of plastic and only a couple buttons/switches for power and reset. (maybe option/select/pause if you didn't do that in software with controller inputs) Let alone the massive 5200 where even a full Atari 600 would have been smaller and cheaper to distribute (and possibly cheaper to manufacture), even the 1200XL has a smaller footprint (not sure if it's lighter, but the motherboard is even slightly smaller than the 5200's). If we were talking about the 1200XL, removing the keyboard would have been obvious (though switching to a cheap membrane keyboard could be close to the same thing as such), or even the 600, but that would have the same possibilities for a cheap keyboard. (and a slimmer form factor that couldn't accommodate a full-throw keyboard -again, possibly removing SIO and adding that and the key lines to an expansion port for a keyboard upgrade that included SIO) We're just going to disagree about whether a keyboard belongs in a game console. History seems to have shown that demand for keyboards and keypads in game consoles is low, but everyone's entitled to an opinion. I agree, more buttons and option keys (and accessory keyboards) are more useful in general. 
(with careful use of menus and button combos once you've gone beyond the normal limits -the Jag's keypad could really have paid off if a good amount of PC ports had been made, but in either case it should have had the Pro Controller layout from the start -and honestly, the pro controller has enough to push most complex PC games with a few tweaks and avoid the need for the keypad as such -aside from instant access to certain functions like FPS weapons, etc) As above, I think a dedicated console is preferable in general, and as such, including computer compatibility is generally impractical and would tend to be sloppy. (aiming at VCS compatibility out of the box -or optimized for a cheap adapter- would be more significant, though making the hardware similar to the A8 to facilitate ports -and sharing some production components- would also be useful) However, after the fact, or for a low-end gaming-oriented computer in general, a cheap keyboard is still a reasonable option. (especially given the various conflicts over the 5200/3200/etc) I'm not talking about home computer video hardware. This may not be obvious because I'm talking about Atari 400. It's on the table only because we're talking about 5200 and Atari's design process. With the exception of 2600, 5200, 7800 and Jaguar, I don't know of a successful game console from 2600<X<PS1 that didn't have hardware sprites and hardware tiles. Pretty much all the arcade hits from Pac Man onward had them, too. Atari's game console designs didn't have them and were more flexible, but in retrospect this was a liability after 2600. That's sheer coincidence: there were simply no blitter (etc) based systems that were pushed on the mass market by major players. 
(the Lynx was all blitter based and quite powerful, but not managed well under Atari -and at a fundamental disadvantage in cost, size, and battery life, like the Game Gear -it did outsell the GG in parts of Europe, but not the GB anywhere AFAIK -hardly surprising though) A lot of arcade games pushed blitter driven graphics as well, though less often software driven. And, like the A8 and Colecovision (MSX, etc), several later consoles pushed for software driven graphics as well where the sprites were unsuitable or undesirable. (be it pseudo bitmap or character-by-character movement) The SMS did that in several games including Space Harrier (very few hardware sprites used), and After Burner (similar), which is also why you see a lot of attribute clash in those games (especially Space Harrier), but not much flicker. (on the Genesis, such options were mainly used for 3D games, though some used software scaling/blitting in 2D games as well -sometimes blitting onto sprite objects as well for realtime scaling animation) The Sega CD was the first home console to definitively include a blitter (a few games use no sprites at all -or render animation to them), of course that went beyond a simple 2D blitter and added affine texture rendering for scaling and rotation. (and full 3D texture mapping if you draw on a line by line basis with the CPU recalculating perspective -the same thing the SNES needs to do for mode 7 warped for 3D perspective) But really, the main reason you didn't see any really successful consoles pushing such hardware is simply because none of them tried to do so. 
(had the likes of the Amiga or Flare 1 chipset been pushed by a well-known and capably managed company, they could have been exceptionally popular and competitive platforms -obviously you'd strip out the unnecessary I/O hardware and cut RAM down in the Amiga -maybe even push for use of fastRAM to allow more total bandwidth depending on the cost you were willing to push -cutting out the fastRAM bus would be a notable cost savings, especially if you're talking a late 80s release) And in any case, you wouldn't have to worry about "odd" architectures as long as you stuck with framebuffer+blitter/CPU or character+sprite (or some combination) based graphics as you'd have tons of developers experienced with any of those. (the 7800 was problematic as it didn't do any of that in a conventional manner as such and didn't get out into the market when developers were more open to "odd" hardware as such -especially if Atari had the funding to push it enough to make developers really interested, as that tends to overcome even the toughest architectural issues) That's also why the Panther was a bad idea: very much in the 7800's spirit in terms of list processing with no framebuffer or "normal" sprite or character support. The Neo Geo used ALL sprites with no character support, so it was also an oddball to port to or from, but it got a good amount of support. (both developing for and porting from) Since most games of the time were totally remade for the home conversions, similarities allowing source ports were a moot issue. (as long as the platform was something they reasonably understood in general) It's a shame they couldn't/didn't push more comprehensive indexing (with CLUTs in dedicated CRAM or main DRAM -like the TMS9918 did- and using RIOT instead of PIA would give you 128 bytes for potential CRAM entries). The C64 had some nice on-screen color flexibility, but the master palette really limited it. 
(Atari is the opposite -though good use of DLIs helps a lot) Yes, you've found the other problem with Atari's game console designs: a shared bus for GPU/CPU. 5200 and 7800 had it, I don't know about Jaguar. That was a good idea for a home computer, and a bad idea for a game console. Not really, it has the very same cost benefits for a computer and a console: like how lower cost computers have shared video memory. There's other trade-offs, but an efficient and well-designed system on a unified bus (or at least as few buses as possible) will tend to be far more cost effective (higher cost to performance ratio) than a multi-bus system. (a multi-bus design is also an easier method that requires less R&D to implement, but will also be more expensive in the long run as such -unless you go really extreme with heavy buffering to simulate multiple buses -the final model Genesis 3 used a single SDRAM bank with heavy buffering to use for both main and VRAM) A unified bus design also means memory can be allocated more efficiently. (be it ROM or RAM) That's why the VCS did it, A8, C64 (with interleaved DMA, though still less CPU time than the A8), 7800 did it, Amiga, Lynx did it, Jaguar did it, N64, Xbox did it, 360 did it, etc. There's obviously many other areas of cost trade-offs, and with similar buffering AND multiple buses, you end up with something like the PSX. (which has a chipset that could still perform capably if using a unified bus -or could even perform better than it did if they pushed the system bus to 66 MHz with SDRAM or buffered for 64-bit DMA) As such, using multiple buses on the Jaguar would have been unattractive: more investment in caching and buffering was far more attractive. (and investing in a CPU with a cache -or added logic to allow external caching for the 68000 even) Many, many trade-offs though. (like consolidation and die space: another area the Jaguar shines with an extremely ambitious .5 micron process targeted for a design laid out in 1990! 
and in production before even high-end CPUs were using .5 micron -they came in '94 with the likes of the PPC603, granted the N64 was pushing .35 micron in 1996) That and some other factors (like total RAM or use of TONS of high-speed ROM) make the difference between cost effective game console (or lower-cost home computer) and an arcade machine or high-end workstation. There's a reason no (non-arcade) cart based home consoles opted for the NES's dual external bus design again (the PCE had the video bus on the expansion port, but not on the card slot). It required multiple ROMs (minimum) and a larger connector and PCB with more traces. (and 2 separate bankswitching schemes) The onboard RAM option was less flexible, but much more cost effective. (the single bus option was a compromise that was as flexible -or more- than the NES, and cheaper than either of the other options, but also with more performance trade-offs -and the faster memory needed to counter those trade-offs would again counter the cost effectiveness, though in the 7800's case I wonder if the SRAM was fast enough to make full speed interleaving feasible -maybe with optional interleaving on the cart slot for later games with faster ROM, I'm not sure what speed MARIA accesses the bus at, but if it's just 1.79 MHz, that wouldn't have been tough at all to interleave at the time, even with DRAM like the Amiga -if it's 3.58 MHz, that's a lot tougher and you'd need at least 140 ns memory -though the PC Engine was pushing ROM at those speeds from the very start in 1987, probably facilitated by in-house production whereas the Lynx pushed for cheap ROM in the 400ns range and the Jaguar at only 375 ns standard) Plus there's the development time constraints to design for multiple buses and/or interleaving on top of cost issues. 
(with a tight design and no backwards compatibility dragging it down, the 7800 probably could have included a TMS9918-like DRAM arrangement for video updated via I/O ports and just a single 2k SRAM chip for the CPU -actually it might have been interesting if they could have pushed a 3.58 MHz 6502 given the use of SRAM -and knock it down to 1.79 MHz for ROM or allow either) Again, lots of trade-offs, not just in cost/performance, but also in terms of R&D time and budget available. The Saturn is a prime example of both the issues of multiple buses, a general lack of buffering (for wider bus width OR higher bus speeds), and lack of consolidation with cutting-edge chip fab processes (the 3DO is an even more extreme example using all 1 micron parts in 1993 -which also physically prevented use of heavy buffering without extreme expenses -though it also meant a massive potential for consolidation had it ever gotten that far -it was also licensed and sold for profit and in a higher end form factor, so even worse), plus a lack of tempered design for attractive cost (a lot of RAM, dual CPUs, etc -and using SDRAM when EDO would have had the same performance -but taken more work to interface). Hell, if the Saturn had pushed consolidation like the N64 or Jaguar (or PSX for that matter), they could have merged both VDPs and added line buffering for 64-bit DMA with almost no page breaks (maybe even caching -aside from the CPU) and/or taken advantage of the 66 MHz rated SDRAM and clocked the system bus at 2x the internal speed of the custom chips (ie ~57.4 MHz rather than ~28.7). Or various other compromises with multiple buses and slower RAM and/or narrower bus widths like knocking it down to 2 buses and both using 32-bit EDO DRAM with one shared sound+CPU bus and one video bus. 
(the N64 used 9-bit RDRAM at 500 MHz with an MMU to connect that to the 32-bit system bus -albeit only practical due to being partnered with SGI, and there's trade-offs for fast/expensive RAM on narrow/cheaper buses vs slow/cheap RAM on wide buses -or in between) As it was, the Saturn could still have been OK if they'd trimmed things more modestly in the inefficient/wasteful areas (an SH1 with 512k of SRAM just to manage CD-ROM data transfers, a highly programmable synth chip with DSP that was used almost 100% of the time just for the DMA sound channels, dual CPUs but without good support for the flexible DSP coprocessor also included -that would have made the slave CPU far less important, and the overuse of RAM -probably should have knocked it down to 2.5-3 MB and put more emphasis on expansion if needed -merging the sound and CD-ROM subsystems would have really helped with the 68k managing CD-ROM data transfers with a small ~32k cache/buffer like contemporaries and the sound RAM to work in and store samples; they could have even re-used the Sega CD interface and saved cost and R&D time -just a clock doubled CD-ROM chipset for 2x speed mode and an otherwise nearly identical 68000 interface -the 12.5 MHz 68k in the SCD was used for CD-ROM data management as well as a general purpose CPU) The Jaguar 2 stuck to a single bus design (actually with a slow sound/IO bus iirc), but with even heavier buffering, an actual GPU and texture cache, and a CPU with a cache as well (derived from the in-house GPU RISC core), all with 64-bit DMA. 
New technology shifts options for bus sharing: there was a time when interleaving was attractive, but that fell apart once fast page mode was in regular use, but then you had potential for multi-bank interleaving to avoid page breaks and enough chip space to allow for buffering (and eventually on-chip caching), and it's a combination of interleaving and caching/buffering (and ever faster memory speeds) that makes bus sharing realistic for modern consoles like the 360. (albeit you could argue that using cheaper DDR2 with dual buses could have been more cost effective in a well optimized design for that -and would have allowed more RAM without excessive cost, though single bus with DDR2 would have been cheaper than either, but be slower though also allow more flexibility due to more RAM capacity and still at lower/competitive cost) Short of that, the jag does support 2-bank interleaving to reduce page breaks for unbuffered operations (texture mapping, JERRY accesses, and 68k accesses), so adding another 512k bank could have helped a lot (both doubling texture mapping speed and cutting out a lot of 68k overhead -technically it should allow a 50/50 split from 100% FPM bandwidth and the 68k running full bore) That, and a 26.6 MHz 68000 would have been great if they were available as such without custom grading; technically you should have been able to have a 40 MHz 68000 with zero wait states with 75 ns FPM accesses. (a 26.6 MHz 386SX would probably have been the next step up, or a 13.3 MHz 386SX short of that -probably cheaper than a 68EC020 due to the higher volume production too) The CoJag didn't use dual buses, but did add a bank of dual-port VRAM in the 2nd bank and used a 25 MHz 68EC020 or R3000. 
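As a back-of-envelope sanity check on that "40 MHz 68000 with zero wait states on 75 ns FPM" claim (my own arithmetic, assuming the textbook 68000 bus cycle of 4 clocks with roughly 3 of them available for the memory to respond):

```python
def mem_window_ns(clock_mhz, clocks_available=3):
    """Approximate time memory has to respond for zero-wait-state 68000 operation.

    Assumes a standard 4-clock 68000 bus cycle, with ~3 clocks between the
    address becoming valid and data being latched (a rough rule of thumb,
    not exact datasheet timing).
    """
    return clocks_available * (1000.0 / clock_mhz)  # clock period in ns

print(round(mem_window_ns(40.0)))   # 75  -> 75 ns FPM just fits at 40 MHz
print(round(mem_window_ns(26.6)))   # 113 -> comfortable margin at 26.6 MHz
```

So the figure in the post checks out under that rule of thumb: 3 clocks at 25 ns each is exactly the 75 ns access window.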
The limitations of the Jaguar meant that they either needed a more expensive CPU with caching, a GPU cache (which may have been impractical with the resources and time Flare was working with), or dropping the pure single-bus design for an added (slow) CPU/IO/sound bus (probably 16 bits as well). The lack of texture mapping buffering is a hindsight issue, since Flare could have focused on that rather than some other areas. (high speed shading, Z-buffering, the object processor, etc -hell, they could have dropped the object processor and focused purely on making the blitter as fast and efficient as possible: I wonder if Atari management influenced them to build off the Panther's OP rather than pushing an all new design -albeit still a spiritual successor to the Flare 1 and Slipstream) Investing in JERRY (vs a simple sound and I/O ASIC) was a bad move in hindsight as well. There's probably some things they could have had the foresight to change at the time, but the management issues at Atari Corp (and general financial problems) obviously hurt things a lot. (the Jag was lucky to get as far as it did, especially with how ambitious the hardware was) But on the A8/5200 specifically, the CPU still has a lot of time in active display as video doesn't eat it up like MARIA can/does and also leaves consistent enough intervals for DMA to allow fairly even interrupts. (hence POKEY modulations are possible via interrupts whereas it's not so simple on the 7800 AND you tend to have a lot less CPU time -or at least much less consistency) IIRC, you've got close to 1.2 MHz performance in the A8 (vs 1.6 MHz with V-DMA disabled -due to DRAM refresh), so still better than the C64 or Apple II with interleaving, but weaker than the BBC Micro's 2 MHz AND interleaving. It's not a problem that went away with dual bus designs either, but the bottlenecks change. 
(except the NES where you have dual cart buses) With the CV/etc and SMS, you have to use CPU time to update VRAM (the SMS also eats through that 16k very fast since it uses 4bpp graphics) and you can only update in vblank. The PC-Engine allows interleaved access to video RAM (fast SRAM), but still requires CPU driven updates. The SNES and Genesis have DMA to main ROM in vblank that halts the CPU (burst DMA) for fast VDP updates, but still relatively limited bandwidth and NO option for slower interleaved DMA. (it would have been really useful to have a slower interleaved mode since you'd have 1/2 bandwidth with full CPU resource -and clipping the screen for more DMA wouldn't mean losing more CPU time) The SNES is even tighter than the Genesis in terms of DMA, and the CPU cycles eaten in vblank for animation heavy games roughly equate to the resources used on the PCE to copy graphics as needed on the fly. (even better would be interleaved DMA to ROM with VDP driven copying in active display -or a dual bank arrangement like many framebuffers use for hardware page flipping -the Sega CD uses that for the word RAM and a nearly identical set-up for the framebuffers in the 32x -same 80 ns DRAM chips even, though clocked at 7.6 MHz vs 12.5 in the CD) And sorry about the multi-topic rant, I tend to draw a lot of parallels with these discussions. You're right when you're talking about home computers, but I'm simply not talking about them. Can you list arcade boards that did frame buffer graphics? I guess the very late sprite games with lots of really huge sprites did that. But by that time, the world had largely switched to 3D which always has a frame buffer. 
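To put rough numbers on the interleaving theme running through this (my own arithmetic, assuming the simplest scheme of splitting each bus cycle evenly between CPU and video, which is also where the "at least 140 ns memory" figure for a 3.58 MHz bus comes from):

```python
def slot_time_ns(clock_mhz, masters=2):
    """Access window each bus master gets if a bus cycle is split evenly.

    Simplified model: one full bus cycle divided among `masters` devices,
    ignoring address setup/hold overhead.
    """
    cycle_ns = 1000.0 / clock_mhz   # one full bus cycle in nanoseconds
    return cycle_ns / masters

# 1.79 MHz bus: ~559 ns cycles, ~279 ns per interleaved slot
# -> easy for mid-80s DRAM, hence interleaving was cheap at A8/C64 speeds
print(round(slot_time_ns(1.79)))   # 279
# 3.58 MHz bus: ~279 ns cycles, ~140 ns per slot -> needs much faster memory
print(round(slot_time_ns(3.58)))   # 140
```

Which is why a "slower interleaved mode" at half bandwidth would have been nearly free on the memory side, while full-speed sharing at double-rate clocks pushes you into fast SRAM territory.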
I already addressed most of this above, there was no necessity to stick with pure character and sprite graphics as such, and computers are directly comparable to consoles (and in semi-direct competition in many cases as well -especially in Europe in the 80s and early 90s, and then worldwide once 3D hit and hardware acceleration on PCs -which had been getting ever more competitively priced and user friendly- hit in the mid/late 90s). Had the Amiga been bigger in the US, it may have ended up giving more direct competition to game consoles as well. Since none of the big home console companies (prior to the 5th generation) pushed a purely blitter based system, it's really a self-fulfilling prophecy. And again, what arcade hardware used didn't matter at all since you wouldn't be doing direct ports anyway (and often extremely heavily optimized/modified ports for weaker consoles). So all that mattered was that you had an architecture that a lot of programmers would reasonably understand: and framebuffer based blitter (or even CPU) driven graphics fall in that category, as do character based graphics and "normal" hardware sprites. (the Jag's object processor wasn't really "weird" either so long as you had a framebuffer to work with -otherwise you'd have to deal with building up the entire display with objects sort of like the 7800) Even with "odd" or "difficult" architectures, you can/will get strong support if the system is marketed/hyped well enough. (the PS2 is one of the best examples of that)
-
Yes, rather ironic that Morgan ended up pretty good while Kassar didn't though neither were experienced in the business . . . except Morgan WAS experienced in management of a consumer products based company while Kassar really wasn't AFAIK. (he had his experience in the textile industry iirc, and not even the consumer/marketing end with clothing -ie consumer products- but I may be mistaken) Given the young age of the industry, it probably would have been more feasible to (instead of Kassar) bring in someone experienced in managing/marketing some form of consumer entertainment products. (maybe even from Warner's record business -on that note, many have said it before, but using Warner's record distribution network for Atari's distribution probably would have avoided the issues seen with distribution) Morgan's halt came with unfortunate timing, but it at least made some sense. (had he come in in early/mid 1982, that may have been fine: maybe it would have even allowed some issues to be addressed before they hit the market or were canceled -1200XL, 600, 5200/3200, ET, etc -maybe even Pac Man if it got there really early in '82 -more conservative production of Pac Man could have been critical -like no more than 2 million with plans to replace it with a proper long-term "Pac Man deluxe" or such, or Ms Pac Man for that matter -they also missed out by not having a Space Invaders pack-in, but it was getting a bit late to bother with that unless perhaps as an option with Pac Man as an alternate pack-in) Also, I'd take that info on Tramiel's opinion of Morgan with a grain of salt. (it would be interesting to confirm though) That may have been supported by the 25% stake in Atari Corp, but what about how Warner turned around and sold off Atari Games to Namco within a year? And what was that about Time-Warner trying to buy Atari Corp in the early 90s you mentioned before? (might have been in another thread) That seems to often be the case. 
(more to the story) Though selling for more (for profit and for perceived quality) is something you'd apply to a luxury item . . . there's a lot better ways to push quality without having a high price. (IBM's PCs with the sturdy build quality and high quality keyboards certainly did more than the high price in that regard -that and the IBM name) A solidly built machine with high reliability and quality "look and feel" would go much further. (in that sense: the Atari 400 was much more solidly built than the VIC-20, but to an average person comparing the 2, the keyboard would probably be a major push for defining the VIC as "higher quality") 1 engineer AFAIK, and he may have still been there when the split occurred. OTOH, I think the OS work would have been separate from the ATG engineering stuff (especially since it was to be for the MICKEY design as well), so even if they lost the "easy" route to continued development of Rainbow (etc), they may still have been able to continue work on a killer OS. (starting over with the ST design wasn't THAT bad, but the decision to limit expansion and the lack of continued evolution in a timely manner were mistakes -especially given how the Amiga stagnated and how PC standards took until the late 80s for anything but top-end PCs to totally outstrip the ST's capabilities, a reasonable progression of the ST architecture could have kept it highly competitive into the early 90s -at very least though, general purpose expansion ports, possibly more socketed chips, and faster CPU and desktop models from the start would have been very significant -especially desktops with higher quality keyboards, maybe workstation class machines with FPU support) That and Miner only did the VCS (mainly TIA) with only conceptual design contributions to the A8 before he left Atari iirc. 
Doug Neubauer (POKEY), and George McLeod and Steve Smith (ANTIC and CTIA/GTIA) designed the A8 chips, probably with some other engineering assistance for the overall design configuration with the CPU, memory, and I/O. (and of course the PCB design, etc, etc) So the A8 would share a rather intimate bond with the ST via Doug Neubauer (albeit software on the ST vs hardware on the A8). I wonder if the other A8 engineers were still at AInc at the time of the split. (or if they went over to Atari Corp -given that Tramiel took on most of the computer related staff from Atari Inc, it would make sense if they were brought onboard) I also wonder if the ST had any hardware design/assistance from former Atari Inc staff and not just TTL engineers. Atari Corp didn't terminate anything, Warner did when they liquidated Atari Inc without due course of an organized transition and the forced layoffs of all non coin personnel. (Tramiel didn't do that, Warner did, Tramiel and Co. had to decide who to hire to TTL -or Atari Corp rather; it wasn't "who do we need to get rid of" but rather "who do we hire to this new company" -I think Marty or Curt made a very similar comment a while back) It wouldn't have been with Tramiel though, it would have (rightfully) been Warner's responsibility for their grievances. I also highly doubt Tramiel coerced Warner into rushing the split without a proper transition as such, that really benefited no-one other than Warner getting things off the books that much sooner.
-
From what I understand: Atari Inc staff had been promised (in oral contract, if not in writing) certain things that tied into the reorganization and NATCO plans, but Warner's split contradicted that (especially in the way they managed it with no proper transition and no due notice to Atari Inc staff). Technically, I think it was Atari Inc that they had to sue (legally), and that would have been a corporate shell at that point, though maybe it would tie into Warner, since Warner was the parent company for AInc at the time and thus responsible. (Atari Corp shouldn't have been an issue since that was Warner/Atari Inc's responsibility, not Tramiel's) That's one more way that Warner shot themselves in the foot with how they managed the split.
-
I'd say yes, if that's the context then they were stupid and incompetent. However, that configuration namely implies that the computer add-on wasn't planned from the start, or the original plans fell apart, or the designers of the Adam failed to comply with the CV's expansion limitations. The fact is, the Colecovision was a game console that was built from computer hardware and then morphed back into a computer, and is technically very similar to the Spectravideo and related MSX computers (sound chip and related I/O are the main differences -or compared to the TI99/4 it's the CPU that's the main difference, and the fact the CV has 4 times the work RAM of the TI99). The Intellivision was a tougher push for a computer add-on and would be fundamentally limited by the original design (graphics, CPU architecture, etc) as well as the expandability of the system. (it would be easier to build a computer based on and compatible with the IV than it would be to build an efficient add-on -hell, it might have been cleaner/cheaper to offer the computer standalone and as a trade-in upgrade for existing IV users rather than trying to build an add-on -or have a limited add-on and a full computer with faster CPU, more flexibility, etc) The VCS also would have been ugly to make a computer add-on for (let alone an efficient one), it would have been more realistic to build an evolutionary (or hacked) derivative of the system with a full 6502, more onboard RAM, a BIOS/OS ROM (probably including code to drive the display -like the ZX80/81), and a general purpose expansion interface on top of the VCS cart slot and a VCS compatibility mode. (possibly use a POT input to software decode cassette data, perhaps remap RIOT I/O for the keyboard interface while the keyboard is in-use, various other options as well) And, at best, that would have made a CHEAPER system than any decent computer add-on for the VCS, or short of that, still not much more costly and a lot more flexible. 
(most of which would also have to include a separate CPU and added logic -not sure exactly what they did in the Graduate though, they might have used a RAM expansion hack as well as bank switching on top of a CPU driven text/graphics hack) The CV OTOH, was close to perfect for upgrade to a computer as long as they made sure to include the necessary expansion signals (for RAM expansion and additional peripheral I/O -like the cart slot on the VIC or CoCo, or PBI on the A8 -technically the cart slot on the 7800 would be fine too, though the 5200 would need a read/write hack or an added signal to the cart slot like for the VCS adapter), if they failed to include the necessary expansion, or failed to take advantage of it for the expansion module, that was no fundamental fault of the base hardware configuration/chipset used. And from what I understand, the CV's expansion port DOES offer proper expansion in general with the main bus, read/write, IRQ, and a fair amount of address space for expansion. (20 kB iirc -any more and you'd need bank switching, but that's not a big deal) There's a reason the homebrew expansion module for the CV works like it does. (for that matter, the 7800 XM is just a keyboard away from turning the 7800 into a formidable mid 80s home computer, and POKEY already has logic for key scanning in hardware -albeit you probably wouldn't have had 128k back then or the YM2151) That's a ridiculous question (ie why are the Mac and PC still around but Amiga and ST dead, why did Tandy fall out of favor in the computer business, etc, etc), in many cases it's marketing and not much more (the IBM brand name was the biggest factor in the PC's success -that and the ease of cloning). Also, the GX4000 was an example of a BAD case of such as it was a weak system to release in 1990, though it might have been OK for 1986. 
(same issue with the CPC+ in general) A shame Amstrad hadn't picked up the Loki (Flare) team when they took over Sinclair's computer division, or licensed the Flare 1 chipset after the fact for that matter (not only completed a couple years earlier than the CPC+/GX4000, but also far more powerful, and made even more attractive with the moderately enhanced, single-chip Slipstream ASIC intended for the Konix MS -though Konix couldn't afford to buy the IP exclusively and thus Flare retained full ownership with the freedom to license/sell it to anyone they wanted -like Amstrad, Atari Corp, etc -they ended up licensing it out for several set-top box designs made in China later on -no idea if Martin Brennan ever suggested it to Atari Corp when he was working on Panther or when he convinced ACorp management to ditch Panther and have Flare 2 design the Jaguar) The good example (more or less) would be the SC-3000, which didn't do exceptionally well, but did do better than the corresponding SG-1000 and was released early enough to matter. (1983 -vs the 1990 GX4000/CPC+) The Adam was never going to beat out the PC/clone market in popularity, but it could have been reasonably successful. (it was weaker than the C64 -at least for games- but was directly compatible with the CV and could have probably put up reasonable competition) However, they didn't offer a low-end model to compete more in the VIC/TI99/Atari 400, or even C64/800XL range, but went for the all-in-one total system (sans a monitor) as the only option. 
(given the problems with it, they probably should have even held off on the desktop model initially -it still could have been out by '84 to meet the PCJr) The bottom-end system could have been nothing more than a CV (maybe slightly more RAM) with a built-in OS ROM (maybe BASIC too), keyboard and peripheral ports (perhaps cassette and a parallel I/O port) and enhanced expansion port (for RAM and added interfaces), and it would go up from there with systems with more RAM and built-in functionality all the way up to the desktop models. That and drop the funky tape drives in favor of normal (but still relatively fast -ie 1500-3000 baud like the CoCo or Speccy) cassette interface and floppy disks being the next step up. (probably SSDD 160k disks from the start given the timing -ie bypass SD drives) That and some memory and I/O logic, yes. The Colecovision has MORE memory (even CPU work RAM) out of the box than some contemporary computers (TI99/4A as before). It lacks built-in peripheral ports, but so did the likes of the TRS-80, or the Sinclair ZX series (among some others), save an analog cassette interface. (which could have been a simple add-on even without any special provisions -tons of options via the cart slot alone on the CV and with hacks MUCH easier than what the likes of the VCS had to deal with -the Z80 doesn't require Phi-2 for RAM interfacing like the 650x among other things, so a cartridge adapter for a cassette interface would have been easier than the Starpath Supercharger; likewise, if they HAD to -ie had a crap expansion port- they could have used the cart slot for the whole computer add-on with RAM/ROM accessed in banks within the cart address space along with added I/O mapped to the cart space) The software licensing schemes, relative standardization, marketing, etc. 
(consoles today aren't that different from home computers in the 80s in that respect) That, and consoles tend to be significantly more cost effective -or at least sold at much lower/negative profit margins. (the software market doesn't make it attractive for general application programming though, and the standard OSs don't either) Honestly, it was more interesting as a low-cost server due to the CELL. (but I knew a couple people who were using their PS3s as supplementary PCs or in the interim when they had no main PC available) Of course, Sony didn't really want the systems used as such (and it opened more exploits they REALLY didn't want), but otherwise they COULD have pushed the necessary software/OS support to be useful as an actual computer. I've heard a lot of different answers, but fear of piracy/emulation is the one to make the most sense. Licensing customized x86 designs and a largely compatible video chipset shouldn't have been that much of an issue if they wanted near 100% backwards compatibility. (especially due to the high level OS/API emphasis of the original model such that a hardware compatible GPU was less necessary -just one compliant with the general API and OS routines used in the Xbox, though the few games that bypassed that and went low-level would have been problematic -nothing that patches shouldn't have solved though, rather like PC games: as it is, the "compatibility" is achieved through porting the entire game over -recompiling the engine as a massive "patch"- and using the old discs for authentication and data -the bulk of the disc space is for mass data storage, the game engine/code is proportionally tiny) I highly doubt it was a performance choice since contemporary x86 cores available at the time of the 360's design (especially from AMD) were competitive if not ahead by a fair margin in per-core performance. 
(you also get greatly diminishing returns on multi-core designs for many common operations -purely computationally intensive tasks favor multi-cores, but tons of I/O or bandwidth intensive stuff just means multi-cores are wasted over single cores -or dual vs tri/quad- the only reason multi-core processors became popular is because engineers hit a wall with faster processor speeds, and improved per-clock single-core performance progresses much more slowly than clock speed increases had allowed on top of per-clock performance boosts in previous generations -which is also why the PPE in the PS3 is considerably more than 1/3 the average performance of the 360's tri-core version of the same CPU -actually more than 1/2 the performance on average iirc) People seem to automatically think "oh, oh, PPC, it must be better than x86", but really, that has next to nothing to do with modern performance since the ISA is totally separate from the actual internal CPU logic (the micro-ops translated from the external instructions). Benchmarks are limited too, but they're better than clock speeds, MIPS ratings, let alone vague architecture -or "bitness". (hence why the PIII derived Xbox CPU was generally more powerful than any other CPU in a console that generation -including the GC's PPC G3 based CPU -in fact, due to the clock per clock performance, the Xbox's CPU should outperform the Wii's CPU -also a G3 derivative- even though it's almost the same clock speed -though I used to think the opposite in my ignorance) More like conclusions that are difficult to make since there's so much that's up to hypothetical interpretation as it simply never happened: I think the Adam would have been much more successful (maybe not super well selling, but a hell of a lot more popular than the Adam as it was) if it was offered in an MSX/C64/etc like form factor (possibly in addition to a desktop bundle), but we'll never know if things would have played out that way. 
I see very little here that's semantics (other than maybe the issue of the limited release of the IV keyboard module), the rest is more comprehensive deductive reasoning under a hypothetical "what if" context. I never said that (unless you mean Marty). I don't think being an add-on was the issue: being only offered in an expensive (relatively) desk-top form factor with reliability problems and a funky tape format were the reasons it failed. (the poorly configured add-on was a problem too -though that's one issue I hadn't fully realized and had previously assumed was a proper expansion module for the system -since all it needs is RAM and some I/O for keyboard and peripherals, tacking it on makes no sense from a cost or engineering perspective -it would be like having a complete A8 computer attached to the 5200 as the computer expansion module) Maybe, though corporate management can also be a major factor in making such messes. (not so different with Atari Inc either . . . except that corporate management is also what drove Atari to the height of what it was in 1980/81, but then nearly drove it into the ground before Morgan finally started turning things around -and then Warner yet again pulled a snafu with the way they managed the split and sale to TTL)
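On the cart-slot banking idea mentioned above (RAM/ROM accessed in banks within the cart address space): the decoding involved is trivial. Here's a hypothetical Python sketch; the window base, window size, and bank-register behavior are made up for illustration, not taken from any real ColecoVision mapper.

```python
BANK_SIZE = 0x4000      # 16 KB banks (assumed)
WINDOW_BASE = 0x8000    # banked window at 0x8000-0xBFFF in the Z80 map (assumed)

class BankedCart:
    """Toy model of a bank-switched cartridge: a latched bank number
    redirects reads in a fixed CPU-address window into a larger ROM."""

    def __init__(self, rom):
        self.rom = rom
        self.bank = 0

    def select(self, bank):
        # Latch the bank register; wrap like real mappers that mask unused bits
        self.bank = bank % (len(self.rom) // BANK_SIZE)

    def read(self, addr):
        # Redirect window addresses into the currently selected bank
        return self.rom[(addr - WINDOW_BASE) + self.bank * BANK_SIZE]

# 64 KB dummy ROM where every byte equals its bank number (4 banks)
rom = bytes(i // BANK_SIZE for i in range(4 * BANK_SIZE))
cart = BankedCart(rom)
cart.select(2)
print(cart.read(0x8000))   # 2 -> reads now land in bank 2
cart.select(5)             # 5 % 4 == 1, register wraps
print(cart.read(0xBFFF))   # 1
```

The same pattern covers mapping expansion RAM or I/O registers into the cart space: it's all just address decoding plus one latched register, which is why a crap expansion port wouldn't have been a showstopper.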
-
And at the end of the day, a console designed to be a console is better as a console and a computer designed as a computer is better as a computer. The market figured it could fight the economic forces by a piecemeal attack of add-on components for consoles. It heavily depends on the circumstances in question and that's not always true. Hardware specifically designed for a game console could be (in many cases) re-used for a highly efficient computer, and vice versa. (OTOH, the amount of reworking it would take depends) The Colecovision is a prime example, it used hardware that was designed for a computer through and through, and while it was less efficient due to that, they still managed to make it work. (the fact that the hardware was in large-scale production and off the shelf would also cut out R&D overhead and issues with economies of scale -but also have the issue of relying on 3rd party parts and not being able to consolidate things without reverse engineering or licensing said parts) The fact that the Adam failed has nothing to do with it being related to a game console since the game console's hardware was designed predominantly for a computer. (the C64's hardware had generally been designed for possible use in arcades, home video games, or computers as well, but only got properly implemented as a computer -actually, the hardware is even more optimized for video games than the A8 chipset with the complete lack of bitmap graphics modes and less flexible resolution options) The MSX is almost identical to the CV hardware wise and did rather well in the low-end niche in Japan and OK in parts of Europe. (the Sord M5 and SC-3000 are even closer, and again the SC-3000 was cross compatible with Sega's SG-1000 console -the SC3000 was actually considerably more popular from what I understand) And the TMS9918 was, of course, originally designed for the TI99/4. 
(the biggest change to later systems using that VDP was the CPU: most/all used a Z80 -a lower cost mass market CPU vs the 9900, which would even have been fairly expensive for TI to produce, let alone if they'd licensed/cloned the Z80 to manufacture in-house -like NEC did) The cost of what was too high? The A8 chipset, let alone the cut-back 5200 chipset in 1982, should have been LESS EXPENSIVE for Atari to build a console with than Coleco buying off the shelf parts. (Atari already had R&D and mass production of the chipset, with only the overhead of 3rd party chip vendors vs buying for-profit off the shelf parts) Hell, in many respects it would have been more cost effective than the Intellivision, but again, the 5200 wasn't nearly as cost optimized as it could have been. (something that could largely have been addressed after the fact with further consolidation for embedded DRAM interface logic, GCIA, a much more compact motherboard and case design, etc) Likewise, the C64 (even in '82/83) could have been cut down to console form factor (sort of like the MAX, but more so) and been cost competitive with others on the market. (especially with CBM's vertical integration) Just because console/computer transitions often didn't work out is hardly grounds for calling it a bad idea. (it just means they screwed up in other ways -which the Adam obviously did- or they screwed up in the specific implementation) Except it DID happen after the crash: computer hardware being turned into consoles and vice versa (and tons of potential for more of it), but not so much in the US. (more in Europe and Japan, and rather successfully in some cases, but less so in others -like the GX4000 in Europe, though that was a bit of a joke when it was launched) The Xbox was basically a PC in a console form factor, and modern consoles have (in many ways) become very PC-like in overall functionality. (and up until recently you could even run Linux on the PS3 without hacking it)
-
Yeah, except he appeared in Sonic Adventure 2, one of the highest rated (if not the highest rated) 3D Sonic games of all time, and even up there with the better scoring 2D games with an 89% average from Metacritic and a 9.4 overall review score at IGN. (at least in terms of the critical response and review scores at the time -a shame it was released several months after the Dreamcast had been discontinued) Yes, and I recognized my inconsistency (and lack of proper separation of personal preference from the general consensus). There's plenty of other games that I like that go against the consensus (Star Wars Rebel Strike is better than RSII IMO, I rather liked Star Fox Assault -hated Command due to the touchscreen though, I like Star Voyager on the NES OK, etc -and others with similar tastes going against the grain like preferring NES JVC Star Wars to Super Star Wars, preferring the werehog levels in Unleashed, etc), and I shouldn't have tried to address so many issues at once. (ie personal preference vs retro fan consensus vs mass market appeal vs standalone quality -and, as above, true quality is much less important than marketability) I was thinking Shadow was about as mediocre as the masses claim up until I tried it again a couple months ago (previously my brother had played through to the final ending, but that was about 4 years ago and my memory was fuzzy on it). If you want a buggy/unpolished 3D Sonic game, look at STH2006 (and don't even start on the plot and cast of characters ). Granted, to some extent the original (Dreamcast or DX) Sonic Adventure: which I like, but it feels rushed and unpolished, nowhere near as awesome as SA2. (which was the highest rated 3D Sonic game ever based on review scores from the time -let alone if you go by IGN, how many games have they given 9.4 scores?) SA was good for an early DC game, and still a good game, but I'm not surprised that many complain about the bugs and lack of polish. 
I think the main problem with SA2 is people who didn't care for the "filler" levels, though they were much more fluid than SA1 (where you had a lot of redundant levels) and I even had fun with the emerald hunting stages. (enough to enjoy it in 2 player vs mode as well) There's a lot of back and forth on those 2 games here: http://www.sega-16.com/forum/showthread.php?t=16231 Then again, I'd put Mario Sunshine WELL ahead of NSMB or either Galaxy game in overall gameplay and fun. (I'd put SM64 ahead of all but Sunshine as well) Heroes was OK (another one I need to go back to), but the voices were more annoying and the plot was weird. (I'd have much preferred a Sonic Adventure 3) But as to some other points from your previous post: How can it be worse than the 2006 360/PS3 game??? (or Secret Rings . . . unless you LIKE annoying tilt controls and on-rails platforming) I guess I have a tolerance for certain things, I got very used to the shooting controls and auto-aim feature, and the "jumping off edges" thing wasn't nearly as problematic as reviews make out. (then again, I actually got used to Star Fox Command enough to enjoy the on-foot levels -though the control options needed to be more flexible and a first person POV would have helped a ton) It's also the SA2 fans who would more likely tolerate the shortcomings of Shadow. (especially if they like 3rd person shooters -and didn't mind the 3rdPS stages in Rebel Strike or Star Fox Assault -both of which I managed to enjoy) Though they'd also probably be among those who enjoyed Heroes the most. And, again, I don't see how you could put that below the 2006 game. (Secret Rings is OK if you like the controls, but they ruin it for me like Star Fox Command -that and the on-rails aspect even more so: Atomic Runner managed that OK in 2D, but I don't like it at all in 3D) Huh? I never made that argument. I actually didn't elaborate at all: it's a general thing from the art style to the sound to the music, etc. 
(a lot of the compositions are pretty good, but the arrangements are . . . off, I thought the music was just weak at first until I heard some rather awesome remixes that shifted the pacing a bit and modified the overall arrangement) But compare the gameplay to Sonic 3&K, or to fan games/hacks like Megamix or the ongoing Sonic Fan Remix (which looks and sounds absolutely amazing) and you get an even better idea of where it falls short. (again, not bad, but not outstanding) No, Sonic shouldn't just be about going fast, it should be about speed as well as exploration, navigating levels (with your preference of anything from extreme speedrunning to finding the secrets, etc), and that's what the SA games did great. Neither of them is about speed, they're actually a bit like Sonic 3/3&K that way, tons of replay value and hidden stuff to find which you'd miss if you just speedrun through the whole game. (which you can only do with Sonic and Shadow -or Sonic and Tails in SA- anyway) Both are massive games that require repeat play to get the full enjoyment of. And in any case, Sega pushed for going fast (predominantly) in Sonic 2 mostly, and that (along with Sonic 3&K) is considered the best of the classic Sonic games (especially in Europe). 3&K pushed the exploration side of things a lot more, as well as rewarding repeat play (especially due to the save system) among other things. But that was the 4th 16-bit game made, and then there's the 8-bit GG/SMS games to consider as well. Which fan game? Fan Remix uses direct remakes of the 16-bit Sonic levels... Megamix (not Sonic 1 Megamix mind you) doesn't seem to apply to your claims either. That and the SA games seem to do EXACTLY what you list above: dynamic gameplay with a lot more than "going fast", though (as with all good Sonic games) you can choose to speedrun the game and ignore all the extra stuff. 
(for the Sonic/Tails SA or Sonic/Shadow SA2 levels at least -Knuckles/Amy/Big/E102 or Tails/Eggman/Rouge/Knuckles in SA2 don't favor speedrunning as such) All favor exploration and you've also got options for multiple paths to navigate a level. (granted, none of them do what SM64/Sunshine did with the nonlinear "world" type levels and focus more on branching/linear progression where you often can't go back -the classic 2D games were often like that as well though, with lots of areas you couldn't backtrack from -a fair amount of the 2D Mario games are like that too) And back to the main topic: What was more important than any of that was a strong marketing budget from day 1 and a strong budget for software development (regardless of it being in-house or outsourced). A timely launch with strong marketing would mean a build-up of 3rd party support in addition to more revenue to feed back into the market (and/or reduce debt), and strong 3rd party support would mean Nintendo not being able to tie up western developers. (and also making the exclusivity agreements far less attractive to Japanese companies -especially if Atari had already been pushing for licensing in '84) Atari Games splitting off would have been fine as long as the split had a proper transition that avoided the conflicts of AGames and AInc and actually promoted reasonable collaboration. (such a transition would also favor much more selective layoffs of programmers and other staff -as per Morgan's plans- vs the 100% layoffs Warner forced with the liquidation of the company and the horrible transition that made it a mess for Tramiel to sort through new staff to hire -let alone a careful transition meshing more with the former NATCO plans and keeping Morgan on in the interim) The split certainly had disadvantages over keeping on with Morgan's NATCO plans, but the sloppy management of the split was MUCH more harmful overall. 
No, I didn't claim that, I claimed the "arcade at home" marketing campaign was flawed, not the games themselves. Good games (be it arcade ports or originals) are significant regardless, but pushing the arcade-at-home angle was not smart. (it should have been part of it, but the shift seen from '89/90 onward is the sort of marketing they needed: direct competitive marketing that promoted far more than the arcade software and attacked Nintendo's hardware -which the SMS could have done given its graphics capabilities, but not as extreme as the Genesis obviously) If Sega had come roaring in in '86 with excellent marketing and management (and work to build up SoA in general -and push for western developed games to supplement the Japanese stuff), they could have dug in before Nintendo made any serious headway in the US. Unlike Atari, Sega had strong software (if mainly 1st party), impressive hardware, and a marketing budget that rivaled (initially exceeded) Nintendo's. All they needed was the right management to push that for the North American market. (NEC failed far more spectacularly given their position as a megacorp) With western developers split between Sega and Nintendo (let alone some possible interest in Atari), Nintendo wouldn't have been able to push the exclusivity, and Japanese developers would even have to think twice due to the export market. (plus Sega could offer to publish 3rd party games, which they did in quite a few cases on the MD -not just commissioned games, but fully independent 3rd party games that were even published independently in Japan, on top of licensed games that Sega ported themselves like Ghouls n' Ghosts) Tramiel is tangential to the issue: we can't know for sure what would have happened, but it's painfully obvious that Atari Corp would have been much stronger with an organized transition rather than the mess Warner made of things. 
(depending on what sort of compromises Tramiel was willing to make and how well Morgan had managed to delegate things based on the original NATCO plans would be the deciding factors on just how much of a difference it would make: from Tramiel's perspective, more game sales and revenue would mean a healthier company and thus more to put towards the ST -let alone dropping the ST for Atari's own advanced hardware, which was prototyped in LSI vs on paper only for the ST, etc, etc) As am I, but I'm thinking beyond that and considering what a proper transition managed by Warner could have meant for Atari Corp. (Atari Games splitting off would not necessarily have been a bad thing, as such a transition could have favored a reasonable partnership -especially since they shared a brand name in the public's eyes -there could even have been a healthy exchange of Atari Corp hardware and AGames licenses and Tengen development resources -which sort of happened with the Jaguar to a very limited extent) Actually, they did have internal devs, they took on most of the Atari Inc computer game (and application) programmers, but without the budget it didn't matter. (and the delays and mess caused by the split killed much of the potential revenue and funding possible in '84/85) Because by that time 3rd parties were VERY tired of Nintendo's antics, and, iirc, there were some investigations into Nintendo's monopolistic tendencies. That's why it really opened up for Sega with the Genesis. It was Sega's marketing and careful management that allowed that to open up, and it took several years of persistent pushing to do so. (early on it was all 1st/2nd party and licensed stuff along with some computer developers who had yet to get tied up with Nintendo) Nintendo also had no lockout on the Famicom and thus no way to prevent 3rd parties going unlicensed as such. 
(and their success in the west would determine whether they could assert such policies or not -without a strong lead in the west, they couldn't enforce such policies) There's no reason they couldn't have done that in '86 had they had the right management, and without the strong early lead Nintendo got from fall/winter of '86 and full solidification after the 1987 holiday season, Nintendo would never have been able to establish the policies they did.
-
!!! [off topic rant] Wrong on several accounts: the 32x does almost nothing in hardware, it's more or less like a VGA PC with very little RAM, like an 8 MHz 16-bit ISA VGA card with 256 kB of video RAM (that would actually have slightly higher bandwidth than the 32x -which uses 7.67 MHz 16-bit DRAM framebuffers), and dual (fast) 486SLC CPUs with zero wait state SDRAM. (but only 256k -plus slow ROM for mass storage) The SH2s are stuck on a 16-bit bus to save cost, and SDRAM was used to save development time. Likewise the system was clocked based on the Genesis, rather than fully asynchronous like the Sega CD. (same 80 ns DRAM, and as such it could have been 12.5 MHz -and if they'd done that, they could have used cheap DRAM for main memory on a 32-bit bus rather than expensive fast SDRAM on a 16-bit bus -plus 12.5 MHz 32-bit is higher bandwidth than 23.01 MHz 16-bit) There's a LOT of other trade-offs, like going all software vs dropping a CPU in favor of a VDP that did more than manage a framebuffer and offer single plane hardware V/H (and per line) scrolling, or dropping the slave SH2 in favor of a faster/lower cost DSP even. (still inefficient as a blitter, but much faster for many dedicated tasks than the SH2 -even the low cost SSP1601 in the SVP chip would be considerably faster at some things than the SH2) All games use the 8bpp (256 color) framebuffer mode in-game (a few have highcolor splash screens I think), and the limited framebuffer bandwidth would go a long way towards making the highcolor mode unattractive. There's also the issue of all blitting being done by the CPUs: all rendering (other than simple line fill) has to be done in software, very much like VGA PC games. (there's also the Genesis graphics, of course) There's also the PWM DACs, but those are also CPU intensive to drive without DMA enabled. 
(and apparently the dev kits never had good tools supporting DMA -the preproduction dev units had broken DMA sound, but all consumer models had that fixed -that's one feature that current homebrew is taking advantage of) Some of that is excusable due to the less than 6 month development time (going from a sketch on a hotel napkin in January to a solidified prototype in June at the Summer CES to production less than 3 months later), but you'd think they could have offered various hardware acceleration options by deriving the hardware from the Sega CD's blitter+memory interface, and/or SVP, and/or DSP/VSP1 logic from the Saturn, etc. (the MCD ASIC with texture mapping blitter and memory interface would probably be the best bet to rework for a cartridge add-on -as it is, it's begging to have a framebuffer to work with rather than being tied down to the MD VDP and related VRAM/DMA/color issues) But yes, it is quite different from the 7800 since it uses genlock and an added video layer as well as a coprocessor rather than just RAM+sound upgrades, plus the 32x needs its own power supply. (or does the XM as well?) Something more like the XM would be a module with a 512k DRAM chip (like the 256kx16-bit ones already used in the Sega CD), either DMA sound or the Ricoh PCM chip used in the arcade and Sega CD, and maybe another YM2612. 
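The bus-width arithmetic behind the 12.5 MHz 32-bit vs 23.01 MHz 16-bit comparison above is easy to check: peak bandwidth on a simple synchronous bus is just clock rate times bus width. A minimal sketch (the "one transfer per clock" model is an idealization; real DRAM cycles, refresh, and contention all cut into these numbers):

```python
def peak_bandwidth_mb(clock_mhz, bus_bits):
    """Idealized peak bandwidth in MB/s: one bus transfer per clock cycle."""
    return clock_mhz * (bus_bits // 8)

# 32x as shipped: SH2s on a 16-bit bus at 23.01 MHz
actual = peak_bandwidth_mb(23.01, 16)        # ~46 MB/s
# Hypothetical from the post: 12.5 MHz (80 ns DRAM speed) on a 32-bit bus
hypothetical = peak_bandwidth_mb(12.5, 32)   # 50 MB/s
# The 32x framebuffer DRAM: 16-bit at 7.67 MHz
framebuffer = peak_bandwidth_mb(7.67, 16)    # ~15 MB/s
```

So the slower-clocked 32-bit bus does come out ahead of the faster 16-bit one, which is the point being made about trading SDRAM cost for bus width.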
(for the Ricoh chip, you'd preferably want DMA to DRAM rather than having to use the 64 kB PSRAM block of the MCD, or simpler Amiga-like DMA sound that reads directly from RAM or ROM -though you'd also have bus contention issues to deal with since the VDP asserts total bus bandwidth with burst DMA for vblank updates, and there's hardware and software solutions for that -software is to stagger V-DMA with "holes" in vblank for shared access to avoid missed reads/writes or at least reduce timing error -especially for audio reads for PCM data) Though given the context of 1994, adding a low-cost coprocessor on top of that would have made a lot more sense too. (like making the SVP into an add-on rather than putting it on cart -especially since they had no other on-cart enhancements, not even RAM, on the Genesis, unlike the NES or SNES, so it could be a standard for upgraded/enhanced games) In that case, probably the 512k DRAM chip in place of the 128k one used in Virtua Racing, and simple DMA audio with added mixing/scaling/etc handled by the CPU or SVP (it is a DSP after all, and reasonably capable for sound duties if not monopolized by other coprocessing). So tons of possibilities for decompressing sound and graphics data into the nice chunk of RAM, ports of games otherwise extremely difficult without the RAM, on the fly DSP driven decompression, various coprocessing for polygons, ray casting, etc. (let alone in conjunction with Sega CD games, coupling the SVP's 3D math capabilities with the drawing and texture mapping hardware of the Sega CD -it also includes bitmap to tilemap conversion on the fly) 128k would be cheaper, but while the 512k chip would be some ~$8 more in component costs than the 128k chip in '94, the amount of board space would be the same, and the other advantages could really pay off. 
(especially in further facilitating smaller ROM sizes in games with much more compression and thus lower prices or higher profit margins -perhaps enough to push for the module to be sold at a loss, definitely at cost at the very least) All for MUCH less than the 32x's multi-bus, multiple RAM banks, dual high performance RISC CPUs, VDC ASIC, etc. (and potentially low power enough to avoid a separate power supply -as it is, the 32x uses less than .35 amps) Something more like that probably would have been a much better idea than the 32x for multiple reasons. (available sooner -ie in place of the standalone Virtua Racing in spring of '94- much lower price point -probably less than 1/2 the price of the 32x- that could drive popularity, make it more feasible to integrate with later Genesis models, be more realistic to push for CD games using the 2nd add-on, and then there's the lower price and more modest capabilities -especially remaining color limits- that would make it clash much less with the Saturn -which, granted, had its own long list of hardware/software/timing/marketing issues- and it also would have meant adding the reasonable potential for forward compatible games that could still work -with limited content or features- on a bone stock Genesis -which you could do with the 32x, but not without sticking to Genesis graphics alone and wasting the 32x layer, or adding a lot more ROM for redundant Genesis graphics to be used without the 32x -otherwise you'd just need added code for the enhanced modes, very small compared to graphics data) The other thing is that the Genesis easily supported RAM and stereo sound on-cart (let alone added expansion signals -far more comprehensive than the gimped port the Sega CD used), but for whatever reason, no games used that at all. 
(even Sega's massive stock/supply line of 64 kB PSRAM chips -or 32kx8-bit earlier on- would have offered a very good option for embedded RAM expansion without investment in custom logic for DRAM interfacing -or more expensive SRAM- or adding another YM2612 -or various other options Sega or 3rd parties could have pushed- for more sound synth capabilities and a 2nd DAC port to optionally use, let alone low-cost embedded DSPs for math coprocessing and such for help with 3D games, realtime decompression, etc -like the SNES was using almost from day 1 with the cheap DSP-1 chip followed by later revisions as that became obsolete -bug fixes and enhancements- let alone the more substantial Super FX GSU -though that's getting close to the point where an add-on becomes more cost effective) It's rather ironic that the 7800 chucked full 32kx8-bit SRAM chips into several games back in 1987/88 and a full 40 pin DIP sound chip, yet the Genesis had absolutely nothing in its entire life on cart other than SRAM for battery backup (surprisingly uncommon), EEPROM for a handful of late games using saves, FeRAM for a few games (must have gotten a special deal on mass market testing or something given how exotic FeRAM was -or is today even- and especially given the reliability issues seen on many Sonic 3 carts), and of course the 1 game to use the SVP+128k DRAM chip (same 80 ns 64kx16-bit Toshiba DRAMs as the MCD word RAM and 32x framebuffers used) and that was in 1994. This is getting way off topic though and has all been discussed on Sega 16 before. (well, aside from the direct parallel to the 7800 XM, which hasn't come up on Sega 16) Wrong on the expansion port issue, the original 7800 expansion port was very limited and better off removed. 
The main cart slot is a FAR better expansion port (even more extreme than the Genesis's cart slot vs side port), though if they'd done like the SMS and simply cloned the cart slot connectivity (more or less) for the expansion port, that would have been a nice route that avoided a piggyback cart module for some things. On top of that, you've got the SIO interface a la POKEY with the XM as well. (and potentially could have mapped the POKEY key lines to an external connector like the XEGS did -unless the XM actually does have that as well, but I haven't seen anything about that) No, it's FAR more than anything on the Famicom as such, other than the Disk System. It adds as much RAM as Star Fox or Virtua Racing did, it adds an arcade standard YM2151 on top of POKEY, etc, etc. It's certainly different from the 32x, but MUCH more than any on-cart MAPPER or modest VRAM expansion on the NES ever did. (it's also much more than would have been feasible for the 7800 as an add-on until the early 90s -cut it back to 16-32k and POKEY and you'd arguably have something preferable to cramming those things on cart back in '87/88 ) The number of NES games to use expanded RAM is rather few, the number of NES games to use sound expansion is zero (due to the removal of the sound input on the cart slot all games had to make do with the onboard sound, and any Famicom games using expanded sound had to be reworked for the NES), MAPPERs are rudimentary ICs that range from plain bank switching (which the 7800 did with discrete logic) up to mapping out ROM to get around some limits of the PPU. 
(like allowing 8x8 color attribute cells rather than the normal 16x16 -though raster interrupts would also get around that in the vertical, and with more options since you'd be reloading the palettes completely and you could do any interval from 1 line up rather than just 8 pixels, but still stuck with the 16 pixel wide limit) I'm pretty sure that if the 32X were pushed as hard as the SNES was pushed, you'd see a hell of a lot more out of it than you would out of the SNES. The 32x is weak for 2D stuff, it's got no hardware acceleration and, again, is more like a fast 386 VGA PC with graphics overlaid (or under) the Genesis graphics. (with lots of other trade-offs and bottlenecks) It would have a hard time matching the SNES in some areas, and it would be impossible in others (it could approximate, but not fully match), but it would shine for other things like software rendered 3D and such. (somewhat like a VGA PC vs SNES ) Comparing the 32x and SNES is a bit like comparing a 128k ST and NES in that respect. (except the color advantages are a bit less for the 32x and it has the Genesis graphics layers and CPU to work with too: maybe more like the ST genlocked with the A8 vs the NES ) [/off-topic rant]
-
He already mentioned that normal 7800 controllers work fine for 2 button games, just not the joypads that came with the 2600 Jr, so that wouldn't seem to be the case. I think the VCS/A8 sticks use 6 wires for gnd, 4 directional switches, and the fire button. (the other 3 are for 5V and the 2 POT lines)
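For reference, the 6-wires-plus-3 breakdown above matches the conventional Atari DE-9 joystick pinout as it's commonly documented. A quick sketch (treat the pin assignments as a reference approximation, not authoritative for every controller variant; 7800 two-button pads repurpose the pot lines for the second button):

```python
# Conventional Atari DE-9 joystick pinout (commonly documented assignments).
ATARI_DE9_PINOUT = {
    1: "Up",
    2: "Down",
    3: "Left",
    4: "Right",
    5: "Pot B (paddle line)",
    6: "Fire",
    7: "+5V",
    8: "Ground",
    9: "Pot A (paddle line)",
}

# The 6 wires a plain digital stick uses: 4 directions, fire, and ground.
STICK_LINES = [1, 2, 3, 4, 6, 8]
# The other 3: +5V and the 2 pot lines.
OTHER_LINES = [5, 7, 9]
```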
-
No, the Mk.III/SMS was natively 100% backwards compatible out of the box with the older (Colecovision-like) SG-1000 Mk.I/II. The consumer had to buy an adapter, just like with Coleco and the 2600. The SMS chips are in every Genesis, but that's a technical detail, of interest only to nerds like you and me. None of my friends with a Coleco could play 2600 games, and none of my friends with a Genesis could play SMS games, either. As far as they were concerned, it was the same deal. No, you're thinking of the Mega Drive/Genesis, the Mk.III/Master System in Japan was 100% compatible with the preceding SG-1000 and Mk.II. The Genesis is still a different context though since it has fully embedded SMS compatibility, like the Game Gear (except the GG isn't dragged down by it like the Genesis hardware was -ie they had to sacrifice board space and VDP die space for compatibility on the MD). Requiring the adapter must have been a marketing decision, and it shouldn't have been a big issue to make the systems directly cart compatible. (and probably just require an adapter for the card games) But from a consumer PoV, it's somewhat closer to the CV: except the native controllers and controller ports can be used, the adapter was very inexpensive, and again it's more comparable to the adapters required to play Famicom games on the NES, or Japanese SMS/Mk.III/SG-1000 games on the western Master System. (or the PC Engine to TG-16 for that matter) Yes, but there's no reason they COULDN'T have done what you suggest after the fact, but retain a cheap built-in keyboard and STILL have a cheaper system than the 5200 (mainly due to the 5200 being unevenly cost cut over the 600). 
And no, the 400's price point is VERY different from the PSX and such, the 3DO's launch price is probably the only thing to come close by comparison on the game console front (the PS3's 500-600 launch price is a fraction of the 400 with inflation taken into account), that and it may have been marketed as a game machine (to some extent), but it was also marketed as a lower-end computer, and most bought it as such. (marketing in general wasn't as strong as it should have been for the A8 line though, among other issues) And? They could have released the game console with the keyboard in any case, and it would have been a plus on the market in the early/mid 80s anyway. (even a cheap keyboard like the 400's would have placed it into the low-end home computer category ) There's a LOT of other areas you'd want to cut features for a dedicated console before the keyboard: you'd want to remove the fundamental keyboard I/O from POKEY, and SIO logic from POKEY, and remove PIA, drop to 2 controller ports from the start, drop 4 of POKEY's POT lines in favor of 4 plain I/O lines, etc. That's what you'd do if you wanted a truly cost-cut console-only derivative of the system with no regard to expandability to a full computer. If you wanted it truly compatible with the computer, taking the 600 design from 1982 and swapping in a cheap keyboard wouldn't have been substantially different in cost from making the keyboard an accessory. (the cleaner, more consolidated board design compared to the 5200 should have saved cost -in spite of PIA and the keyboard- as would the smaller/lighter casing in general and the smaller amount of shelf space it would consume) Yes, except that falls apart with the system already being price competitive WITH the keyboard included and a low-cost keyboard at that, and the 5200 being less cost effective in spite of the cut-down design. 
(which, again, seems to be an odd mix of corner cutting countered by various other cost inefficiency) It not only would be a selling point, but would better combat the home computer war and continue selling when game consoles came under attack (including in the heat of the crash). Then again, you could argue the plain Atari 600 without the lower cost keyboard would have managed close to the same thing. (that DEFINITELY should have been released either way though) The Atari 400, as it was, was price competitive with the 5200 if not the Colecovision. (not that a cut-down consolized design couldn't be MORE cost effective still, but the 5200 of 1982 was not a good example of that) 1. you're only talking about sprites and 2. you're making a lot of generalizations The A8 and VCS do indeed have very similar sprite architectures, but the bitmap/character mode playfield capabilities of the A8 are what set it apart and make it much more like later contemporaries. You could have (and do have) A8 games that do everything with the character/bitmap display with no use of sprites at all (but software "sprites" or blitter objects in the more modern context) Dozens of other platforms lacked sprite position registers or any sprite hardware at all, and either software rendered to a framebuffer or character display (the VIC-20 only had characters and no framebuffer), or you had a blitter to accelerate such. (and blitter logic is what took over with the 3DO/PSX/Saturn/etc -the "sprite" engines in all of those cases were blitter driven using 2D textures) The Jaguar has both a blitter and the object processor, which is a list processor somewhat in the vein of MARIA or such, but several generations on and working with a full framebuffer rather than having to carefully arrange things on a scanline basis. 
(it doesn't use hardware sprites and it's not a blitter, but it does have a framebuffer to build up and read -the Panther didn't have enough RAM to use a framebuffer and thus was stuck with the 7800's method more so) And from what I understand, the 7800 (let alone Jaguar) doesn't burn through CPU cycles to build the display as such, the biggest hit to CPU resources is the shared system bus forcing the CPU to be halted for MARIA's burst DMA. The early concept design of MARIA with no DLLs DID eat up CPU time more like the VCS (the CPU had to manually build up the display for every scanline based on the display list), but the use of DLLs allowed MARIA to offload much of the overhead as such. If MARIA was used in a dual bus design, or with fast enough memory to allow interleaving, you'd have loads more CPU time to work with. In the Jaguar, it's sort of the opposite: the CPU is one of the least efficient (if not the least efficient) bus masters in the system, it hogs the bus when used and it's thus necessary to use as little CPU time as possible (and keep it off the bus as much as possible) to allow anywhere near peak bandwidth on the main bus. TOM has line buffers for many of its operations (every object processor operation, some blitter operations -not texture mapping- and I think some RISC GPU operations as well), so it can manage near 100% fast page bandwidth (~106 MB/s) on its own, but the 68k drags that down hugely, as does JERRY (since it was sort of hacked in with a somewhat buggy and slow DMA interface -not that big of an issue if purely used for sound synthesis though -with low bus usage). The Jaguar would have hugely benefited from either a slow bus for the 68k to be on (especially 68k and JERRY), or using a CPU with a cache (even a modest one like a 68020), but that was one of the design trade-offs to push low cost on top of other limitations in development time and funding. 
(in hindsight, one good trade-off would have been to drop the huge DSP -same RISC core as the GPU- in favor of a small, simple ASIC that just has I/O and DMA sound channels -maybe hardware ADPCM decoding logic or other features like per channel low pass filtering, supporting PCM of different bit depths, etc- and add a 2nd bus dedicated to the CPU and audio, OR shell out for a more powerful CPU with a cache and keep the single bus design -lots of options, and many other platforms made the mistake of an overbuilt sound system that would never really be worth it -the SNES was probably the first to do that -OTOH, Atari Corp had many, many other problems that were bigger than any hardware issues with the Jaguar by orders of magnitude: mainly the downward spiral the company fell into around 1989, which Sam Tramiel's weaker management compared to his father's was at least in part responsible for -if not largely responsible for) I'm not a programmer for any of these systems, just more of a tech/history geek (at least I'm not programming for any YET ), and I don't have a deeply intimate understanding of the systems, but a reasonable high-level understanding of the unique architectural aspects and limitations, so I can't offer a whole lot more than I've summarized above. (and there's a point where I hit a wall in understanding the dev manuals as well -though maybe you could glean more from that, short of poking some of the Jag programmers for more details -the documents should be reasonably easy to find online, I think Atarimuseum has them and I know they've been uploaded to AA before -there's also documents for the Jaguar II, but it was still a work in progress when those were printed) See, I understood everything you just said, because I'm a meganerd like you. But 90% of game programmers I know aren't meganerds. The design says to put the sprite over there, so they go read the hardware manual to see where to poke the X, Y, SHAPE, and COLOR. 
For better or worse, these people were disqualified from Atari development. Yes, but only a small part of my above statement was on the hardware sprite issue. TONS of developers were working with systems (namely computers) with absolutely no sprite hardware at all (namely framebuffer or character graphics manipulated by the CPU -maybe with hardware scrolling if you were lucky), not even the tricky A8/VCS type sprites. Plus you had arcade boards doing the same with CPU driven framebuffer graphics, or a blitter in some cases. OTOH: the pure X/Y register stuff was relatively new as well (though the TI99 had it in '79), and was NOT the de facto standard for console/arcade games until the mid/late 80s, and even then you had tons of computer platforms that required software and/or hardware blitting (and tricky stuff like dealing with planar bitmap graphics or attribute clash on some older systems -like MSX or Spectrum). The dominant Japanese home computer was the PC8801, with 640x200 3bpp planar graphics as the standard/lowest resolution used for games (early models used 3-bit RGB, later ones allowed indexing from 9-bit -many games opted for 2 bitplanes and lots of dithering), and only a 4 MHz Z80 to drive graphics on top of that. There was a MASSIVE range of different limitations on different systems; you had the likes of the Apple II and Spectrum getting new games into the early 90s (more so the Spectrum), and the A8 was better off in pretty much every way than the Apple II for ease of programming. 
(if you didn't like the sprites, you could focus on CPU driven character/bitmap graphics and learn the best advantages to push more color with DLIs and such -and make use of the hardware scrolling- but making some use of the sprites would be something any good A8 developer would eventually push -even if just for decals of added color/detail to bitmap/character graphics -the trade-offs of the A8 are unique, but that's the same for many platforms, and any developer working on it would learn to push it accordingly) The 7800 is a bit more distinct from that though, since MARIA is a bit further from the conventional character/bitmap modes from what I understand (aside from sprite manipulation), but maybe its character modes do work rather like contemporaries (my understanding of MARIA is very general). However, when it was being developed and planned for the original release, there was still a lack of the de facto sprite+tilemap standards in the arcade or home console industry; on top of that, the VCS was still dominating the market by a large margin, and thus the programmers experienced with that would be used to dealing with "odd" or "unconventional" architectures as such. (even though the VCS isn't really like the 7800 in how it's programmed -and obviously doesn't involve the tight CPU intervention and cycle timed code necessary for the VCS to display even a static screen) The A8's sprites likewise would be favorable for those used to dealing with the VCS, but with MUCH greater flexibility of CPU time, use of interrupts, and all the modern graphics features offered for the playfield. (be it in bitmap or character modes) If the 7800 had been released as planned in '84, and Morgan's other plans (Natco, etc) had followed through, the market would have been much more favorable for the 7800, and developer support would have pushed beyond the unique architectural limitations by the time Nintendo (or Sega) were even remote issues on the market. 
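To make the DLL point concrete: as I understand it, each DLL entry MARIA walks is just 3 bytes per screen zone, fetched with no CPU intervention. A hedged sketch of that layout (the addresses and values here are hypothetical examples, and I'm ignoring the holey-DMA bits):

```python
# Sketch of a 7800 DLL (Display List List) entry as I understand it:
# byte 0 = DLI flag (bit 7), holey-DMA enables, zone height - 1 (low 4 bits);
# bytes 1-2 = high/low bytes of that zone's display list address.
def dll_entry(dli, zone_height, dl_addr):
    assert 1 <= zone_height <= 16
    flags = (0x80 if dli else 0x00) | (zone_height - 1)
    return bytes([flags, dl_addr >> 8, dl_addr & 0xFF])

# hypothetical 16-scanline zone with a DLI at its end, display list at $1F80
entry = dll_entry(dli=True, zone_height=16, dl_addr=0x1F80)
print(entry.hex())  # -> 8f1f80
```

The point being that the CPU builds this table once (or updates a few entries per frame) and MARIA does the per-scanline work itself, which is exactly what the no-DLL early concept would have dumped back on the 6502.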
(plus, the lack of delay could have meant negotiations for licensing Japanese arcade games before Nintendo had secured exclusive rights in mid 1985 -which is when Michael Katz was attempting to license said games and was forced to resort to licensing computer games instead for a large part of the 7800's library) Granted, you could argue for the 5200 to have pressed on in spite of the problems (since it was already there, had a notable userbase, and many issues that could be corrected with new hardware revisions), but I already commented on that in my previous post. (including other things like common production of ICs for the A8 and 5200, ease of cross-platform development, etc -plus an architecture a good chunk of 3rd parties were already developing for -for A8 and 5200- and thus already past the stages of "odd" architecture being a major boundary in such cases -plus hype, market share, and userbase would solidify such developer interest regardless of difficulty of development: a la PS2 )
-
This came up in the MARIA add-on thread too, and it had to be pointed out (especially for NES vs 7800 -or CV for that matter) that lower resolution isn't THAT huge of a handicap with good art design. (there's limits, of course, but still general trade-offs -and the fact the 7800 rarely had its capabilities really pushed, but when it did, it often showed advantages over the NES -Commando looks better on the 7800 IMO, though more so with a modded 7800 for corrected composite video output -rather than the weaker output forced by the shared TIA+MARIA line out) Yep, it's more of an all-new system with the VCS hardware tacked on (and dual clock speeds for the CPU to access the old hardware -especially TIA, though RIOT technically could have been bumped to 1.79 MHz too), but also using that tacked-on hardware to a fair extent (I/O and sound) in lieu of adding more custom/off the shelf logic to the board. (again, take out TIA and RIOT and drop in an AY-3-8910, and you'll have a system that looks and plays about the same -maybe very slightly better since you'll be accessing the I/O and sound registers at 1.79 MHz, but with much better sound capabilities -though lacking TIA's periodic noise- and you'd also have a system that was simpler to design and cheaper to manufacture than the 7800, or the 5200 for that matter) I don't think many A8/5200 games use the 80 pixel GTIA modes, especially in-game; very few do AFAIK, and almost none use the lower horizontal res 4 color graphics modes. OTOH, I think a fair bit more use 160x96 graphics. (still not super common, but a few do) And plenty of the character mode stuff, of course. There are a few games (at least on the A8) that use the 320x192 modes as well, and more that use DLIs for a higher res status screen or such. 
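For reference, here's a quick summary of the A8 bitmap modes I'm talking about, written out from memory (so treat the exact figures as hedged rather than gospel):

```python
# A8 bitmap modes referenced above, as (width, height, colors) -- from memory.
A8_MODES = {
    "ANTIC D (GR.7)":  (160, 96, 4),    # the 160x96 4-color mode
    "ANTIC E (GR.15)": (160, 192, 4),   # the full-height 4-color mode
    "ANTIC F (GR.8)":  (320, 192, 2),   # hi-res: 1 hue, 2 luminances
    "GTIA 9":          (80, 192, 16),   # 16 shades of a single hue
    "GTIA 10":         (80, 192, 9),    # 9 colors via palette registers
    "GTIA 11":         (80, 192, 16),   # 16 hues at one luminance
}
for name, (w, h, colors) in A8_MODES.items():
    print(f"{name}: {w}x{h}, {colors} colors")
```

The three GTIA modes are the "80 pixel" ones mentioned above -- lots of color per pixel, but quarter horizontal resolution, which is a big part of why so few games used them in-game.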
Whether they SHOULD have canceled the 5200 in favor of the 7800 like that is another matter too: the 5200 had a lot of missed opportunities in the design from the start, including some you couldn't really go back on (like integrated VCS compatibility, or a careful design to allow a cheaper/simpler add-on module for compatibility -and have it out from day 1), but there were other flaws that could have been corrected while taking better advantage of the hardware design in general. (the 5200 should have been significantly cheaper than the 400 -or 600XL for that matter- and it had a ton of potential for consolidation: single-chip DRAM interface/refresh IC, CGIA, removing the expansion port, using a much smaller motherboard based on all of that -switching to 2 16kx4-bit DRAMs later on when those became available, etc, etc) The controllers could have been fixed much earlier (the pseudo-digital pull-up resistor option should have been cheaper and more reliable on top of controlling better for most games -the button/key issue would be a separate fix though, except using simple switches with pull-up resistors could have meant a single PCB for the key matrix and joystick switches -side buttons would still be separate though) Technically, they could even have made a deluxe 5200 model with 2 cart slots and direct VCS compatibility. 
(and have it set up to use SALLY as the 6507 to save some cost, but that would also mean having 2 5200 joyports and 2 2600 ports, so it might just end up being a mess -especially as time went on and compatibility was less and less of an issue -let alone how the JAN chip could have made the VCS adapter a fair bit cheaper) But the main thing about the 5200 vs 7800 is that the 5200 was already on the market for better or worse and had sold a few million units already (at least 2 million going by articles from the time), and while the 7800 would have been great in place of the 5200, there were many more trade-offs in releasing it after the fact rather than pushing on with the 5200 and fixing its hardware issues (aside from native compatibility). It also facilitated rather direct ports from the 8-bit, more so once they had more digital-like controllers that didn't conflict with some of the games. (rarely such an extreme case, but there were some -and the pseudo digital option should have been there from the start, and even useful after they released the nice revised controllers with the high precision spring loaded pot modules) The 3200 could have been better (potentially more cost effective than the 7800 even -especially if they merged STIA and ANTIC like CGIA), a directly compatible low-end model of the A8 (like the 1982 600 with a cheap keyboard -sort of like how the 400 was originally aimed, but finally cheap enough to be at game console prices) could have been better too (especially in light of the computer wars cutting deep into the consoles and Atari Inc's own need to push more of the computers -which was rather overdue, especially on the marketing side), and also waiting for the 7800 (or if the 3200 had been delayed until '83) probably would have been better than what happened with the 5200, but none of that mattered (except pushing the computers) after the fact with the 5200, and there were lots of trade-offs in the decision to drop it rather than 
press on and fix many of the hardware problems that could be fixed -making it more reliable and more cost effective. (hell, with a DRAM ASIC, CGIA, and no expansion port, they might have been able to shrink it down close to the 7800's motherboard size -actually it might have even been smaller than the 7800 PCB since the chipset would take up less real estate than RIOT+TIA+MARIA+SALLY, though the 8 DRAM chips and DRAM IC would take up a bit more space than the SRAM chips in the 7800 -let alone if they continued with discrete logic-) Continuing with the 5200 also would have avoided some of the issues with the split and transition to Atari Corp (no delay with the 7800, just pressing on with the 5200), and you'd have common production of all the custom chips in the 5200 and A8. (just without all the chips used in the 5200 -no PIA and no MMU/FREDDIE later on) CGIA would have been on both platforms once stockpiles of ANTIC+GTIA ran out, similar DRAM interface logic could be used on both, additional consolidation of POKEY and SALLY could be applied to both, etc. No, it's not smoother; the A8 version is reasonably competitive with the C64 with some trade-offs (and it has the right music), but the 7800 version is better looking and better sounding. (the latter thanks to POKEY in conjunction with TIA -I think the A8 POKEY composition was weaker in general, not just due to the lack of TIA though -the music could have been the same on the A8 -save for cutting out some music channels for SFX- or potentially better with some software modulation of POKEY, but that wasn't the case) Fixed that for you. It's not an amazing game, though it's a decent sidescroller (in the grand scheme of things with all the competition, it was average to mediocre -especially if you take the sound into account regardless of technical limitations); it wasn't a stellar game on the Lynx either for that matter. 
(Ninja Golf would probably be a better example of a sidescroller on the 7800 -pretty decent art design for the color and resolution too, not an exceptional game, but decent and with a neat and very original concept -makes you wonder how they could have pushed that concept with a higher budget) The only reason Scrapyard Dog is notable on the 7800 is because the library of games is limited on the 7800 as such. (and most games that are there are relatively low budget -vs still having a small library but having most games exceptionally high quality and fairly large ROM sizes for the time) That's nothing against the machine itself, but the reality of the situation Atari Corp was in at the time. (especially with a general lack of 3rd party support and relatively tight in-house budget -especially compared to the competition-) Edit: and all of the above issues are much less than the fundamental management problems and flawed distribution system that largely caused the destabilization of the market (due to oversaturation, bloating, and the "glut") that led to the crash (with some help from the computer wars), and THOSE were the real problems Atari/Warner needed to deal with. (and Morgan was doing handily by early 1984, albeit over a year later than they needed to avert disaster entirely)
-
Yes, digital (or pseudo digital -via pull-up resistors) controllers would have been better and more foolproof (probably more reliable among other things). Analog would be a nice accessory though. (the keypads probably could have been accessory controllers too -possibly including ones with built-in keypads) That, and the fire buttons could have been a little better (and everything should have used PCBs rather than flex circuits -and probably metal dome switches). Having the stick not spin and rubberized (especially like a mini CX-40 derivative) would have been great too. The ergonomics are great though, better than the 7800 proline sticks, better than the CX-40, better than the Colecovision controllers too. (the CV ones would have been great with non-recessed buttons and a stick more like the Gemini -but better quality, more like a mini CX-40 -as it is, the CV is OK for games where you don't need the fire buttons and can thus use the knob as a thumbstick, but it's not a gamepad and the button placement doesn't favor simultaneous use of the stick with your thumb) All of the above (even the stock 5200 controller) are better than the Intellivision controllers though. The Vectrex also showed that a good quality, sensitive, short-throw analog stick can be far more competitive for 8-way games. (which also implies the revised 5200 stick with compact spring-loaded pot module and shorter throw movement would be close to that, though it would hardly be as cheap as a simple digital stick or pull-up resistors) I have no idea why they didn't offer a pseudo-digital controller using pull-up resistors (as standard or an accessory) to address the analog problems after the fact. (the button issues are separate though) Not only that, but there wasn't even a 3rd party controller using such a simple hack to address the issue. 
(cheaper to build than an analog pot based mechanism on top of that) The 5200 controllers are better in concept and form factor than the 7800, CV, and Intellivision in every respect, but the implementation falls short (analog-only joystick with no centering and long throw, unreliable flex circuitry/carbon dome switches, chiclet buttons, etc). The size, shape, and feel of the 5200 controller is great, but the internals and some details of the design are . . . off. (I also don't like the spinning hard plastic joystick, but that's even worse on the 7800 proline stick, while the CV's knob is only good with the thumb and the Intellivision's disc is worse than any of those -and the general form factor is weaker, though I'll argue the tactile membrane keypad is better than the 5200 pad and arguably the CV pad as well) The pack-in is always important to some degree, but not THAT big a factor. The total launch lineup is an even bigger factor, and that's a compounded problem. Pac Man wasn't ready at launch (the main reason it wasn't the pack-in iirc), thus people couldn't even buy it after the fact. Likewise, the VCS had the decent (but simple and aging) Combat as the standard pack-in until Pac Man replaced it after '82: Space Invaders was a killer app, but it was never a pack-in standard other than the Sears Video Arcade 2. (though it probably should have been -a shame that wasn't the game they overproduced rather than Pac Man ) Space Invaders would be a bad pack-in for the 5200 too, since it was one of the few A8/5200 games that was significantly weaker than the 2600 version. (Super Breakout at least had multiplayer support -and 4 player simultaneous at that, if I'm not mistaken) What was the total launch lineup for the 5200? OTOH, the 7800 pack-in was also an issue. It really should have been Ms. 
Pac Man (at least early on): it was a couple years old by that point, but was still very hot in the arcades, is a fundamentally fun and addictive game, looks and sounds pretty good on the 7800, plays well on the 7800, and endured as a popular game for years after (to this day), including being the single best selling 3rd party published game on the Sega Genesis. (and was released on almost every game console released since the 7800/NES/SMS -all 3 of those, the Genesis, SNES, etc, etc) Had it added a 2 player mode (like the Genesis version with added features), that would have pretty much made it perfect. As I said above, all the new controllers had problems, and the analog (especially long throw and to some extent centering) aspect of the 5200 controller was and is a problem. Pac Man and Ms. Pac Man are NOT the worst cases by far though; they're OK (not ideal, but not unplayable), but some games were much more problematic. Vanguard is a more problematic case by far from the comments and reviews I've seen, and one of the cases where it's barely playable even with a new/refurbished controller (unlike Pac Man or Ms. Pac Man, where even an old/worn twitchy controller is at least moderately playable). The other thing is that analog control was unnecessary for most joystick based games of the time (driving/paddle controllers or trackballs would address most cases where analog was somewhat useful -and in some cases, you've got analog used in the wrong way, like Missile Command -should have been "speed sensitive" to simulate a trackball vs position tracking where you move the stick). An analog stick would have been a nice accessory (especially for Star Wars or Star Raiders), but digital (or pull-up resistor pseudo digital) would have made a better standard option on top of paddles, trackballs, and analog stick options. 
(one cool thing about the pull-up resistor option is that you could use the same controls for both that and the full analog stick with no difference in programming -you'd just only have the "fast" speed of movement, and the pull-up resistor option wouldn't work for Star Wars or paddle games where it's position tracking) It's also interesting to note that the original VCS/A8 controller ports have all the I/O needed to use joysticks with 100% of the features used on the 5200 sticks. (2 POT lines and 5 digital lines -you'd need to multiplex the keyboard more heavily rather than a simple 3x4 matrix and poll it in software rather than using POKEY key scanning, but functionality would be identical) That, and if you limited the 5200 to 2 ports from the start, you'd have enough total I/O to support the VCS/A8 pinout with direct compatibility. (combination of GTIA I/O lines and POKEY analog ports -possibly key inputs- hacked as plain digital I/O lines with internal pull-up resistors for the POKEY pot lines, and perhaps POKEY key inputs wired to act as 6 normal I/O lines, plus the final 4 POT lines would be wired as normal analog POT inputs -except you wouldn't even need the key inputs for 2 ports as such, so you could use those on an expansion port, dedicated keyboard port, or for a built-in keyboard)
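The pull-up resistor trick boils down to this on the software side: the switch slams POKEY's 0-228 POT counter to one extreme, so the game just thresholds the value -- the exact same code works unchanged with a true analog stick. A sketch (the threshold values are my own arbitrary picks):

```python
# Pseudo-digital reading of a POKEY pot line: a switch + pull-up pins the
# 0-228 POT counter at one extreme, so software only needs a threshold test.
# The lo/hi threshold values here are my own arbitrary choices.
POT_MIN, POT_MAX = 0, 228

def axis_from_pot(value, lo=60, hi=170):
    """Return -1/0/+1 for one axis read from a POT counter value."""
    if value <= lo:
        return -1   # stick pushed one way (counter pinned near minimum)
    if value >= hi:
        return +1   # stick pushed the other way (counter pinned near maximum)
    return 0        # centered / switch open

print([axis_from_pot(v) for v in (2, 114, 226)])  # -> [-1, 0, 1]
```

With a real pot you'd get the full 0-228 range and could derive speed from the magnitude; with the pull-up hack you only ever see the extremes, hence "you'd just only have the fast speed of movement."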
-
7800 a bandaid? Sorry dude - but if you think the 2600 or 5200 can pull off anything like Sirius, or Alien Brigade or Tower Toppler or Scapyard Dog or Midnight Mutants or Commando, I'd like to see what you're smoking. I'd contend that many of those games could have been done well on the 5200, but not the 2600. (including Scrapyard Dog -with careful use of DLIs, hardware sprites with flicker and multiplexing, and softsprites -granted, still not as good looking as the 7800 overall, and those 7800 games weren't maxing out the system either) But yes, MARIA was more advanced than anything else on the US market, still with trade-offs of course, but even more advanced than the NES in some areas. (put it on a full dual bus system -like the NES or CV/SMS/etc- and add competitive sound hardware, and you've got something that could really kick ass -albeit still with resolution trade-offs unless they added more flexibility to the graphics modes, like a higher dot clock version of the 160 wide mode -5.37 MHz would have been a nice compromise too- drop compatibility, and you might keep it in a similar price range on top of the added features -dual bus with more RAM dedicated to video or NES type wide cart slots with external video bus -if GCC made it on their own, the AY8910 would have provided some nice sound capabilities and 16 I/O ports for the controllers and any select switches) MARIA was programmed rather differently from most contemporaries, but that wouldn't have been an issue if 1) they got it out early enough for developers to get used to it (especially developers coming off the 2600 and thus used to *odd* architectures and not the *normal* character/sprite/framebuffer based graphics), and 2) they built up enough popularity (with the right advertising, licenses, 1st party software, etc) to get 3rd parties to prefer the 7800 over contemporaries. 
(the PS2 was the least friendly architecture of its generation -vs the PS1, which was the friendliest to develop for- by a large margin -and at a time when low-level programming was very unpopular, but that didn't stop it from being the most popular to develop for at the time -and even quite a few cases that pushed the hardware close to its limits). Some of the biggest limits of MARIA are due to how it was implemented in the 7800 (though some of that could be expanded -like RAM- but not the bus sharing problem), though others are tied to the short development cycle, or probably GCC being very new to LSI chip design. (which only makes their achievement more impressive though) A preference based on what WAS done, not what the actual potential is, even within the limited implementation in the 7800. (the sound issue is right on for the most part though, it IS 2600 audio, though with more CPU time to work with) That's the one area GCC should have made more compromises over. (collaborating with Atari Inc engineers to make a low-cost, low-pin-count POKEY with the I/O removed, write only registers, shrunk die, single clock input, maybe no IRQ -since the 7800 doesn't favor interrupt driven effects- maybe even down to a 16 or 18 pin DIP; otherwise they'd need to make other compromises to fit a full POKEY onboard -though maybe make use of the I/O for an expansion port for the computer add-on- and the trade-offs would be dropping one of the 2 2k SRAM chips and/or using an external RF modulator -like various computers of the time- or added production cost of a larger riser board) Short of any of that, they could push for the smaller/cheaper off the shelf SN76489 (not great, but a lot better than nothing -and with TIA to back it up), though even then there's some added board space needed. 
(using an external RF modulator might have been enough to make the added space without increasing the size of the riser board used on the first production run) Actually, more like the Master System being a SG-1000 (or Colecovision) with a bandaid since, like the 7800, the sound was non-upgraded and rather weak for the time (the same boring SN76489 -better than TIA for music, but weaker for SFX, and weaker than pretty much any other sound chip used in a console/computer save the VIC and C16/Plus4 -much more bare bones than the AY8910, let alone POKEY -granted, the AY8910 is more favorable against POKEY if you can't use any interrupts) Except the SMS and SG-1000/CV have the same exact CPU at the same clock speed (but more work RAM) vs the 7800/NES/2600/5200/etc, which use the same CPU architecture (some missing stuff from the NES's custom CPU), and the VCS's is at a lower clock speed and has more limited address space and no IRQ/NMI input. (granted, there's also the performance differences on the NES/7800/5200 due to DMA conflicts on the latter 2 cutting down CPU time -more so on the 7800 from what I understand, but it depends on the complexity of the display lists iirc) That, and I think the SMS VDP actually built onto some of the TMS9918 logic. (rather than the 7800 tacking on MARIA, or the Genesis wasting die space with the Master System VDP logic on the main VDP ASIC -probably forced them to limit the CRAM of the Genesis to 64 entries rather than 128, possibly the 9-bit RGB over 12-bit as well -granted, they probably wasted a fair amount of space with the highlight/shadow logic as well -especially for a feature that was almost never used and less needed than more palette entries or color depth . . . GTIA like pixel accumulation for 1/2 res 8bpp graphics layers would have been nice though ) NES has 4 buttons technically, and before you say they're select/option key type buttons: some games DO use them as action buttons. 
(the TG-16 did the same thing, except so extreme that they developed 4 button controllers that remapped those to main buttons as well as the middle buttons -sort of like the Jaguar Pro controller did with the keys) But yes, they're totally different systems, and if it wasn't for the tacked on VCS compatibility, the 7800 would have almost nothing in common with the VCS. (other than a CPU architecture shared by dozens of consoles, arcade machines, and computers)
-
Right, the adapter - like Coleco's adapter that played 2600 games. No, the Mk.III/SMS was natively 100% backwards compatible out of the box with the older (Colecovision like) SG-1000 Mk.I/II. The Genesis was 100% Master System compatible internally, but Sega chose to use an incompatible cartridge slot, thus NOT like the 5200 or Colecovision, but more like the Famicom vs NES, PC Engine vs TG-16, or Japanese Mk.III/SMS vs western SMS. (all natively hardware compatible, but needing a pin adaptor to play games -the difference on the Genesis is that it wasn't done for region divisions but a different reason which I'm not sure of) It's like if the 7800 used a unique cartridge port and required a piece of plastic and small PCB to plug into it for VCS compatibility. (which would have been cheap enough to do out of the box as well -though you'd only need that if you wanted to add more to the 7800's cart slot) The only reason the PBC is so bulky on the Genesis is due to the need for the card slot and a stable form factor. (otherwise it could have been the size of the Sonic & Knuckles cart or European PBC-II) They were selling the 400 in '79 anyway - might as well have started establishing it as a viable gaming platform starting then, instead of waiting until '82 and releasing the 5200, which was essentially a keyboardless 400 with zero backwards compatibility. As for the 8k RAM, well that's twice as much as NES which arrived in America in '84! Those who needed more RAM for computing could have upgraded. The 400 WAS intended as a game console as such, but it was WAY too expensive back then, and not until around '82 did that change. (or could have been in '81 if they got their act together and pushed a new consolidated design with the FCC class B regulations) The 400's price was already low (for what it offered and the cost tied to making it class B compliant) at around $500-600 in '79/80. Besides, Atari hardly needed a new console until '82. 
'82 was perfect timing IF they had gotten the hardware and marketing right, but waiting until '83 or even '84 would have been better than rushing a far from ideal system to market. (pushing the computers is a separate issue that definitely needed to be addressed as well, let alone the bigger problem of all around management and the extremely problematic distribution system) The VCS was just coming into its own in 1979, and 1980 was the really big year when it hit massively with Space Invaders on top of the back library of decent to good games. (and good marketing) As history has shown, a popular system (especially a market leading one) tends to have a healthy span of 6+ years from launch until a successor comes and the old system transitions to the budget market. (even non market leading platforms that do well tend to have close to 5 years before a successor comes along, but the really big ones are often 6 or 7 years on the market without a direct successor -ie Famicom/NES, SNES, Genesis -albeit the successors were quite problematic and the budget market transition was botched- PS1, PS2, etc) In many cases, those figures persist in spite of mounting competition: NEC and Sega well ahead of the Super Famicom (Sega with the Master System and Mega Drive as well), Dreamcast almost 2 years ahead of the PS2 in Japan (and about a year ahead in the US), GameCube and Xbox both technically superior to the PS2 (and more developer friendly) and both marketed aggressively (the Xbox mainly doing well in North America, the GC weaker on average but more spread out), and the 360 also a year ahead of the PS3. 
(and the latter has been much weaker than its predecessor, but the later release is way down on the list of problems -price, cost/performance, Blu-ray not being the same gimmick as DVD was for the PS2, etc, etc -albeit the same may have happened with the SNES if NEC had pushed the PCE/TG-16 like Sony did with the PSX some 6-7 years later -NEC's position as a megacorp with the PCE hardware at their disposal and rising Japanese market share was not unlike that of Sony in the mid 90s, Sony just took the initiative to push extremely aggressively on almost all fronts and also had the advantage of Sega screwing up heavily and Nintendo screwing up with the lack of optical storage -otherwise Nintendo may very well have dominated Japan for another generation or more, losing Square was the catalyst for that) When I was talking about releasing the 400 with an optional keyboard, I wasn't thinking about cost reduction in '79. I was thinking about forcing the mass market "videogame" cartridges from '79 onward to require no keyboard. Then a cost-reduced slim model could have been released in '82, maybe with no keyboard option at all, and still play all the "videogame" cartridges. The keyboard wasn't that much of a cost addition and would have been a very attractive feature to retain in light of the low-end computer wars cutting in on consoles. (just have a cheap keyboard for the lowest end system and a slightly less low-end model with a proper keyboard, then up to the mid-range 32/48/64k models) Many games SHOULD have had keyboard support and wouldn't have been as good without it; if you were going to make a lowest common denominator for cartridge games, at least have it include a numeric keypad and a few function buttons. (plus the space bar was usually used for pausing, a very important feature) Again, I don't think making the shift after the fact and retaining a full (but cheap) keyboard would have been much of an issue, and could have been a selling point in general. 
(you could have 5200-like controllers using the standard A8 pinout but with no keypads -relying on the system's keyboard and function keys- and thus only the joystick and fire buttons standard) Why bother though? Consumers probably wouldn't have bought such an expensive $500+ machine if all it did was play games by default, and by '82 the price of the keyboard (a cheap, minimalistic membrane board) would have been relatively small on top of consolidation and cost reduction. (it could have been significantly cheaper than the 5200 was historically; the Atari 400 already WAS price competitive with the 5200, so take that a step further with a consolidated single-board design and a similar keyboard -if not one more efficient to produce than the 400's- and you'd be even better off) OTOH, you could have removed PIA from day 1 and used 2 controller ports with 4 POKEY pot lines and 2 GTIA I/O lines per port. (with the POKEY lines configurable as analog or digital -true digital with direct CX-40 compatibility or pull-up resistor pseudo-digital- I/O lines and no need for PIA -unless the added 4 select lines on PIA were used for something else) In such a set-up, you'd probably have 2 GTIA lines and 2 POKEY lines configured to the pins used for directions on the CX-40 (as such, probably have only 4 POKEY ADC channels for POT scanning and 4 plain digital I/O lines -though it could be hacked after the fact as it was, like was possible on a 5200 alternative) and have the 2 POKEY pot lines pin compatible with the A8/VCS as well and optionally employed for paddles, 2-axis joysticks, or as analog "buttons." (let alone analog multiplexing schemes allowing more than 2 buttons on those 2 analog lines using a matrix of pull-up resistors) I put the phrase "Jay Miner architecture" in quotes for this very reason. Of course Jay didn't design all those video systems. But the idea of a simplified GPU that required CPU support was Jay's. 
This idea made megabucks for the 2600, and so it persisted in the 5200, 7800 and Jaguar even if it wasn't practical and Jay wasn't personally involved anymore. No, that's very wrong; the VCS is the ONLY Atari system that ever did that as such. The A8/5200 chipset has heavy hardware acceleration (for the time) with ANTIC managing the display lists for framebuffer or character graphics and hardware scrolling (in addition to sprites and the neat support for display list interrupts). The 7800 also had heavy acceleration compared to the VCS, but MARIA was more bus hungry and made the CPU wait much of the time in active display due to the shared bus design. (the A8 knocked the CPU down from 1.79 MHz to about 1.2 MHz performance due to video DMA, but the 7800 could be much worse, potentially saturating the bus to 100% if pushed -not CPU intensive, but halting the CPU for video access due to the shared bus architecture used to save cost over a dual-bus architecture) In fact, the only real common ground from the VCS to A8 to 7800 to Panther to Jaguar is the single bus design. (also shared with the Atari ST for that matter, except that it used interleaved DMA more like the Apple II and Amiga -later Amiga models offered the separate fastram bus, of course, as did Atari's TT) The Jaguar is the exact opposite of the VCS in terms of CPU-heavy graphics, but similar to the 7800 and A8 (more so the 7800) in terms of bus contention. 
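To put that A8 cycle-stealing figure in rough numbers, here's a back-of-the-envelope sketch (not a cycle-exact model -the fraction ANTIC actually steals varies with display mode, playfield width, and DMA settings, so the steal fraction below is an assumed average chosen to match the ~1.2 MHz figure):

```python
# Ballpark for ANTIC video DMA cycle stealing on the Atari 8-bit line.
# DMA_STEAL is a hypothetical average, not a measured constant.
CPU_CLOCK_MHZ = 1.79   # 6502C clock in NTSC A8 machines
DMA_STEAL = 0.33       # assumed average fraction of bus cycles ANTIC takes

effective_mhz = CPU_CLOCK_MHZ * (1 - DMA_STEAL)
print(f"{effective_mhz:.2f} MHz effective")  # prints "1.20 MHz effective"
```

The same arithmetic shows why the 7800's worst case is so much harsher: with MARIA able to claim up to 100% of active-display bus cycles, the effective CPU rate during those stretches drops toward zero rather than to a steady two-thirds.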
The Jaguar has EXTREMELY heavy hardware acceleration with a powerful object processor (2D background and "sprite" generation -all objects hardware scaled as well), a powerful blitter with smooth shading and texture mapping (albeit the texture mapping was put on low priority and is thus unbuffered like the Sega Saturn's -but on a shared bus vs the Saturn's multi-bus design), hardware Z-buffering, and a very powerful RISC GPU coprocessor that's a complex DSP-like processor with programmability more in line with a CPU (functionality is DSP-like in performance -and it's inefficient at many CPU tasks- but it's much easier to work with than contemporary DSPs of the time -perhaps more like Hitachi's SH-DSP line derived from their SuperH RISC CPU architecture). The Jaguar uses a minimalistic 68000 CPU for low cost and because Flare expected relatively little CPU time to be needed (thus also keeping bus contention low), but the sort of 3D games that ended up being pushed in the mid 90s demanded MUCH more CPU resources for the complex logic. 
(the GPU can be -and has been- hacked to offload some of that, but it's rather inefficient at it -only good because the 68k is so bus hungry and slow- and if the 68k was on its own bus with a chunk of work RAM, or they'd used a CPU with a cache -even a 68EC000 or perhaps a Cyrix 486SLC- that wouldn't be seen as a useful option, as the GPU has MUCH more useful and efficient operations for 3D math, transform, lighting, etc -and there's also the bugs that make things even more complicated) The Jaguar's limitations come from a combination of foresight based on the 1989/1990 market (given the Jaguar was laid down in 1990/91, Flare had some outstanding foresight, but obviously much more that they missed in hindsight -the weak texture mapping and lack of a true system/GPU cache -or CPU with cache- being among the biggest issues), but also from Atari Corp's increasingly limited funding and struggling market position under weaker management -both in terms of Sam vs Jack and the lack of Mike Katz's marketing/entertainment management skills. (and no new game console on the market to replace the 7800 -which they really needed by 1989 or 1990 on top of the business and marketing management capabilities to make that a success -and then there's the mistakes made with the computers and the decline of that market in Europe and many, many other issues culminating from the end of the 80s onward -any early mistakes and shortcomings of the ST line made in 1985-1988 are relatively small compared to the shift that led Atari into a downward spiral and being on their last legs -in debt with almost no market share- in 1993) The limit of four 8-pixel-wide monochrome sprites per scanline was a much greater limitation than the hassle of managing those sprites on-screen. 
(hence software sprites in character or bitmap graphics modes become an attractive option) The lack of flexible color indexing for character graphics was also a disadvantage against the TMS9918 and VIC-II (at least you had the 5-color mode, but that only gives you 1 optional added color and none for the 1bpp modes). DLIs go a long way towards helping that, but that's still limited on a horizontal line basis (and uses some CPU time -the NES could use raster interrupts for color reloading as well to take advantage of a palette that was considerably larger than even the GTIA/7800 palette, but few if any games used that -the NES has ~56 colors/shades in its default palette but 3 added register bits that shift it with 1-1-1 RGB values for a total of some ~448 unique colors/shades, but only one bank of ~56 to be indexed on any scanline -and then the limits of CRAM allowing 13 colors for the BG and 12 more for sprites -all as 3-color palettes plus one common BG color) With more flexible color indexing (like a couple dozen CRAM/look-up entries or more even), the system's large palette could really have shined, and character based software "sprites" could have been much more colorful as well. (though you'd have choppy per-cell character movement or attribute clash -but for fast paced games, the cell-wise movement isn't that big of a deal, and you've got 4x8 cells that are narrower than CV/SMS/NES 8x8 cells and thus will have less choppy horizontal movement than on those systems -though vertical movement would be more choppy) OTOH, the Colecovision lacked the hardware scrolling assistance of ANTIC, which would have become more and more apparent with later games. (as it is, you've got a few games like Zaxxon that show the smooth scrolling on the 5200 quite well -and the CV version had to hack in faster scrolling to make the character scrolling less choppy -the SG-1000 and MSX versions opted for normal slow speed at the expense of extreme choppiness -same for MSX R-Type)
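The NES color counts quoted above fall out of simple arithmetic (a sketch only -the "~56 usable" base-color figure is approximate, since several of the 64 hardware palette entries are duplicates or blacker-than-black):

```python
# Rough NES color math, per the approximate figures in the discussion above.
BASE_COLORS = 56           # ~56 distinct colors in the 64-entry hardware palette
EMPHASIS_COMBOS = 2 ** 3   # 3 R/G/B emphasis bits in PPUMASK -> 8 tint variants
total_shades = BASE_COLORS * EMPHASIS_COMBOS

# On-screen (CRAM) limits per frame:
# 4 background palettes of 3 colors each, plus 1 shared backdrop color,
# and 4 sprite palettes of 3 colors each.
bg_colors = 4 * 3 + 1      # 13
sprite_colors = 4 * 3      # 12

print(total_shades, bg_colors, sprite_colors)  # prints "448 13 12"
```

Since only one emphasis setting applies at a time, any given scanline still draws from a single ~56-entry bank; the ~448 figure is the union across all 8 emphasis settings, which is why mid-frame raster tricks would have been needed to show more of it.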
-
Well, according to Metacritic, they are the ninth best reviewed publisher in the world, on a list with the likes of Sony, Nintendo, and Capcom. Yes, but that's not remotely tied to the original statement about their strength as a DEVELOPER. The discussion evolved from there, but the original context was that they still had some of the top development prowess in the world. (divide the list among actual development houses and it's a different story) Publishing is another story, and Sega is a much more notable publisher today than they are a developer. (unlike 8-10 years ago when they still had a conglomerate of some of the best development/programming talent in the world in active development -maybe a little less than 8 years too, and definitely much further back than that as well, from the mid 80s to the early 2000s at least) Yes, sales, another separate issue entirely that has little to do with the fundamental quality of the game and more to do with marketing and catering to the current market niche. (many, many games sell very well and are well liked by the masses in the short run only to be labeled mediocre in retrospect -or others that sell relatively poorly or in a small/niche market may persist as critically acclaimed games among the best of all time in the long run -a lot of niche RPGs have gotten that way; Castlevania SotN is somewhat in that category as well -it sold OK, but nothing compared to the critical acclaim it got back then and the following it has as a classic today) That was one of Sega's problems on the Saturn: lots of very high quality software (at least 1st/2nd party), but not the sort that really catered to the western (especially North American) mass market. (as they did with Sonic and sports games on the Genesis -and again on the Dreamcast) Having mass market appeal is far more important than having outstanding quality and software that stands the test of time. 
(if you can do both, that's great, but quality is less important than good marketing and having products -even mediocre ones- that mesh with market demands) Some of the hardcore classic Sega fans (especially in Europe) tend to deride some of the software that SoA was pushing in the mid 90s as western development expanded, but by most accounts, those games were highly successful (in some cases integral) in North America at the time. Hell, one of the most extreme cases is with FMV, and while I agree the Sega CD wasn't marketed for the versatile software library and capabilities it had, NOT pushing FMV/multimedia would have been stupid. FMV sold, it sold on consoles (well into the 5th generation), it sold well on computers, and it was the genesis of multimedia in video games that evolved into the integrated high-end multimedia cinema pushed so heavily on the PSX up to modern consoles. (on the MCD itself you already saw a transition to FMV used exclusively for cinematic intros and cutscenes, and in increasingly high quality as better compression formats emerged -SoulStar is a fairly good example for intro+cutscene stuff, though games like Tenka Fubu had been pushing it since 1991 -the first example of FMV on the system- and Sonic CD did that too but at rather low quality/framerate for 1993 as it was uncompressed; Silpheed would be a big one too that not only pushed cutscenes but made use of streaming video in-game exceptionally well -and did so with excellent realtime FM+PCM synth music, lossless video compression, and Model 1/System 21-like polygonal graphics stylized in 16 colors that looked like they could be realtime rendered -and tricked many into thinking it WAS realtime) Yes, I never claimed they didn't have mass market appeal either, and that was one of the problems with the Saturn (among many, many other things) as I just mentioned above. 
Having the best developers and software in the world does nothing if you don't cater to what the market wants, or if you cater too much to one region (ie Japan) over another (like North America). Sega of Japan never pushed predominantly for US/western specific appeal, but it just happened that many of the games they (and several JP 3rd parties) were pushing in the late 80s/early 90s meshed rather well with the west, and SoA bolstered that with western specific development in-house (STI) and with 2nd/3rd party commissions and collaboration. (along with pure licensed 3rd party publishing) And having the software that matches the market is only one part of it; you also need the right marketing: you could have 2 products that both match the market near perfectly, one being exceptional and the other being mediocre, but if the mediocre one is marketed expertly while the exceptional one gets mediocre marketing, guess which is going to be more popular by far? (at least in North America where viral marketing is very weak, especially compared to the late 80s/early 90s European market -Sonic would have been massive in Europe even with zero marketing, but it could have fallen into obscurity in the US if Sega hadn't had the great marketing at the time -let alone the decision to make it the pack-in) They may still be among the top 10 publishers of home video games worldwide (depending on the metric -mass market appeal and success would be the most realistic, though not the one generally used for "quality" in hindsight), but they're definitely not anywhere close to what they were as a publisher at their peak(s) (where they were close to -if not- #1 in the world -even on a "quality" basis), let alone in terms of actual development capabilities. (which was the original comment) Likewise, the compilations are also a good move from a publishing PoV, though not really much to show as a DEVELOPER. 
(other than that they could manage some decent emulation that comes fairly close to the best PCs already offered, and with a pretty nice menu system -but also the annoying "unlockable" content in a vain attempt to improve replay value -it must work as a marketing ploy, but I find such content restriction annoying, let alone if I lose my save file -that happened with the Mega Man collection on the Game Cube and is the main reason I haven't bothered to play it: I didn't mind it the first time around since I didn't have to TRY to unlock anything, it was all by accident, but now I just want to get a GameShark memory card and download a pre-completed save file . . . the same thing happened with Sonic Adventure DX and Sonic Adventure 2 Battle, and while my brother and I played through both again, we stopped short of going through all the bonus missions and such to unlock added content -and again, a GameShark is looking very attractive; at least most LucasArts games have cheats to unlock most such content -a godsend for all the special features of Rebel Strike, including one of the best home ports of the Atari Star Wars arcade game -which works surprisingly well with the GC analog stick) I explained the complex context of my original statements above. As for Sega "coming back", I'm not sure what that would mean since they've had plenty of good selling and/or well reviewed games post Dreamcast. Things have gone up and down many times, even after the Sammy takeover, and especially as far as publishing (ie not just in-house games) is concerned. 
And again, personal opinions are totally separate (I think Shadow the Hedgehog is a better playing, more polished, more enjoyable game overall than Heroes, but the latter got much better reviews, closer to the Dreamcast Sonic games, while Shadow was more in the range of the Game Cube and PC ports of the DC games -again, oddly, DX and SA2B got poor reviews when the DC ones had gotten shining ones -and not even because of DX's bugs, but mainly due to "annoying" factors that had been ignored by the Dreamcast reviewers -the Xbox 360 port fared even more poorly in most commercial reviews, though Classic Game Room did a side by side with the Dreamcast original and found the 360 port -derived from the old GC/PC versions- to have more responsive controls and otherwise be nearly identical) I can't personally comment on Sonic Colors yet beyond the videos I've seen and such, but on the surface it seems to have most/all of the shortcomings of the day levels in Sonic Unleashed. (if they did actually manage to reduce the annoyance factor, I -and especially my brother- will have a much greater chance of enjoying it -I really would prefer a game expanding on the best aspects of Sonic Adventure 2 though; that's still the best 3D platformer -or at least 3D Sonic game- that Sega has ever published IMO) They've done better after the fiasco of STH'06, but with something like that, there's nowhere to go but up. 
(though I'd argue that Secret Rings and Black Knight are more frustrating to play -not buggy, but I absolutely hate the motion controlled on-rails mechanic) It does seem to be one of the highest rated (if not the highest rated) Sonic games post Dreamcast (ie after the >9.0 score of Sonic Adventure 2 -IGN gave it a 9.4 and the press average is claimed as 92% by IGN though 89% by Metacritic; it's a shame that game came several months AFTER the DC was canceled) I don't like the 2D emphasis at all (again, 2D/sidescrollers are NOT my forte, though I like some OK -none can compete with my favorite full 3D platform games -though flight/space sims among some others are more towards my favorites). OTOH, they probably should have aimed at such a "2.5D" style for a Saturn Sonic game (be it prerendered or realtime, or a combination), and doing the same for Sonic 3D Blast (on any platform) would probably have made the game much more popular and marketable (and better at standing the test of time -though I rather like it for what it is, a lot of people find it mediocre). So far, for me, Sonic Adventure 2 and Mario Sunshine are the peak of 3D platformers from Sega and Nintendo respectively. (and I'm not sure any 3rd party games go beyond those either -there are tons of other 3rd party platformers, especially on the PS2, that I've never delved into) The style in SM64 is better in some areas (darker theme at times), but the gameplay is so much smoother and more polished in Sunshine. (the camera in SM64 is worse than Sonic Adventure's IMO, let alone Adventure 2's or Sunshine's) Edit: One of the added things I preferred about Sonic Adventure 2 (also true for Adventure, but with other trade-offs) was the extremely limited use of in-game voice acting. (more on the level of SM64 or Sunshine)
-
True in the sense of it being designed by former CBM engineers, though I don't think it was derived from any pre-existing CBM design (maybe inspired by a vague concept design at CBM, but given the fact that the design hadn't even been finalized on paper until mid 1984, I rather doubt it was "stolen" technology as such). Also, I'm not sure, but Atari Corp may have used some of the former Atari Inc engineering staff that were hired into Atari Corp (not sure), and more likely, the Atari Inc computer programming staff in building the OS based on the GEMDOS prototype. The Single European Market was finalized in January? (the info online points to December of '92 being the date/deadline of the final transition) In any case, dropping the VCS/7800/A8 machines would have made sense by that point since they'd been selling increasingly poorly since the end of the 80s. (the A8 was declining before that even, and the 7800's hardware sales -in the US at least- had been exceedingly weak by 1990 -1989 saw a massive decline from the '87/'88 hardware sales too, in the US) Dropping those machines would have made sense regardless of the general success of the company, but it was the decline of the ST family in the early 90s, as well as the lack of a new home game console (and the less than stellar performance of the Lynx -though it did apparently beat out the Game Gear in the UK), along with the general downward spiral of the company since Sam took over at the end of 1988, that put Atari in a rather bad position by 1992, worsening in 1993. 
(the Jaguar hype helped hold things together a bit longer in the short run, but Jack's decision to liquidate the company and get out while they were ahead was probably best -some have pointed to 1991 as being the point of no return, and that may be an apt evaluation; at the very least, things would have been increasingly tight and it increasingly unlikely for them to pull back into being a major player on the console or computer markets) Had Atari Corp maintained (or improved) its strengths of business and entertainment/marketing management under the likes of Jack and Michael Katz, who knows what they might have managed in the long run. On another note: Jack's disagreement on the terms of Rosen's 1988 proposal for Atari Corp distributing/marketing the MegaDrive in North America may have been a mistake in hindsight -one of the major points of contention was Jack wanting to extend that to European distribution rights as well- but at the time, there were a lot of other factors to consider, and the only clear-cut advantage of partnering with Sega was their 1st party software development strengths (significant, but still short of what they grew into in the early/mid 90s), while Atari Corp had a stronger market share in the US along with a very strong European presence with the ST and the revenue/market position to put up a much better fight in both the console and computer markets than they had earlier on with the debt and bringing the ST to market. On top of that, Atari Corp had various plans for their own next gen game system (ST derived console, potential to rework the Lynx chipset into a full home console, etc) and they had stronger US marketing/advertising staff/management (though working with tight budgets), a stronger distribution network, etc compared to Sega of America at the time. If it hadn't been for the likes of Katz (and later Kalinske) building up and managing Sega in the US as they did, the Genesis wouldn't have been nearly as big as it was. 
If Sega had screwed up much like with the SMS's marketing, failed to do all the key things that Katz and Kalinske managed (competitive marketing, building up SoA in general, pushing more for western software development, smart deals with western 3rd parties -especially what Katz did with EA in the face of them going unlicensed, etc), and if NEC screwed up just as badly along with Nintendo being arrogant enough to ignore encroaching competition (and keeping their unappealing licensing contracts that 3rd parties were rapidly getting fed up with), Atari would have been in a FAR better position in the US market. They'd have had the continued business sense of Jack and the marketing/entertainment skills of Katz along with a much healthier budget to work with. The European software market offered a massive amount of talent to draw on and already had a strong bias for Atari's computers, and they could have pushed as hard as possible to gain stronger North American developer support as well (especially from computer publishers -as Katz was already pushing for at Atari- including the likes of EA, but also gradually cutting into Nintendo's hold on 3rd party console publishers and maybe even gaining a bit of Japanese support -or at least opening the doors for licensing Japanese arcade games and maybe some console/computer specific games), plus pushed for western arcade licenses and possibly built a better relationship with Atari Games. Hell, they could have started building up in-house software development on top of that and cut out the overhead and delays tied to only outsourcing software as such. 
(still outsourcing when desirable -as Sega quite often did- but balancing that with increased in-house development, starting with the remaining Atari Inc computer programmers and going up from there -and not just for games, but they could also have pushed for stronger in-house development for ST applications and updates to TOS) It still would have been a bit up to luck to hit upon the killer apps that would really drive them to massive success, but the more support you've got, the more likely that was to happen. (getting the EA tie-in like Katz did with Sega would have been very substantial for the US due to the Madden franchise -among other things- and sports games -especially Sega Sports football and Madden- were one of the biggest killer app categories for Sega in North America -another being the Sonic games, of course) That, and having a bunch of good games that sell reasonably well can eventually build up enough to supplement a lack of killer app games. (Sega's 1st and 3rd party software sales were distributed far more broadly than Nintendo's as such: you didn't see the extreme multimillion unit sellers nearly as much on the Genesis as on the SNES -or NES- but you saw a lot more games with strong sales and a lot more 1st party games in general -of course, Nintendo's Japanese sales also heavily boosted those killer multimillion selling games where Sega was mainly limited to western markets)
