Everything posted by kool kitty89
-
Huh, the 1200 has a 32-bit bus? I thought it was all 16-bit (for Chip RAM and the Fast RAM expansion bus, or at least for Chip RAM).
-
Yes, the CoCo commonly used that early on, but those chips were only common up to the early 1980s and rapidly disappeared with improvements in yields. However, there were some custom densities available later on (I believe Sega used 32kx16-bit PSRAM chips for the later model MD/Genesis consoles; up to the early model 2 systems they were using dual 32kx8-bit PSRAMs though), and apparently SDRAM was offered in 256 kB densities early on too (the 32X and Saturn use 128kx16-bit chips; the Saturn has 256kx16 as well), but maybe that was related to yields on early SDRAM chips too. (We're talking 1993-95 here.) The 600XL does use 2 4416 (16kx4-bit) DRAMs for 16 KB, and is easily upgraded internally to 64 KB using 2 4464s (64kx4-bit) and some jumpers. Late production XEs use 4464 DRAM: 2 for 64 KB on the 65XE/800XE and 4 for 128 KB on the 130XE. Thank you! That corrects the misinformation I received here: http://www.atariage.com/forums/topic/176524-7800-what-did-atari-wrong/page__st__150__p__2209704#entry2209704 ("4-bit wide DRAM? Didn't exist in 1984, or especially in 1983 when the 7800 was actually being designed. 16Kx4 would have been perfect for the consoles of the day.") So it seems 16kx4-bit DRAMs became available by 1983, though probably not for most of 1982 (or at least not cost competitive with 16kx1-bit chips, as in the 600 prototype and Atari 5200). That would also have meant significant cost reduction for the 5200 if Atari had invested in that. (A ton of potential to consolidate the 5200; it should have been significantly cheaper than the ColecoVision in the first place, but ended up rather sloppy overall. Not just in terms of cost optimization, but that was one of the issues for sure, and also one of the issues that could have been completely solved after the fact.) Did the 1064XL module use 6 4416 DRAM chips?
-
Hmm, was that also the case with the Spectrum 48K in the early 80s? Also, that's all in the context of 1-bit wide DRAM chips (albeit 2-bit wouldn't be any better). What about 16kx4-bit DRAMs? (Were there never 4-bit wide DRAMs at 8 kB densities, even by the mid/late 80s? They must have had 64kx4-bit DRAMs by the late 80s given the C64 switched to using dual 32 kB chips by that time, but maybe that was never pushed for lower densities.) Even if 4-bit wide DRAM chips became available too late to be of use for the 800, they could have been useful for the 600XL's RAM expansion (at least if the 600XL had stayed in production longer) via 6 16kx4-bit chips rather than 8 64kx1-bit chips. 2-bit wide chips would have been useful for 32K onboard RAM or expansion boards (4 32kx2-bit chips), but not too much else. (If there WERE 2-bit chips available in 1983, it might have been smart to make the 600XL a 32K machine out of the box and make the expansion board cheaper/lower power as well; the price of 8 kB densities had already dropped below 2 kB chips by '83, with a rapidly widening gap, plus you'd save traces and board space with 4 chips rather than 8.) Also, it was previously implied that you're limited to powers of 2 for DRAM densities, but it's really powers of 4 most of the time (with few exceptions, often "half bad" chips rather than genuine half densities). The "normal" DRAM densities are always powers of 4 starting at 1 kbit (128 bytes) and going up from there: 4 kbit, 16k, 64k, 256k, 1M, 4M, etc., with 4 Mbit of course being 512 kB.
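Since a lot of chip counts get thrown around above, here's a quick back-of-the-envelope sketch (plain Python, using the configurations named in the posts) of how chip depth and width translate into total RAM:

```python
# Back-of-the-envelope DRAM math for the configurations discussed above.
# A chip's capacity in bytes = depth (words) * width (bits) / 8.

def chip_bytes(depth_words: int, width_bits: int) -> int:
    """Capacity of one DRAM chip in bytes."""
    return depth_words * width_bits // 8

def bank_kb(chips: int, depth_words: int, width_bits: int) -> int:
    """Total KB from a bank of identical chips."""
    return chips * chip_bytes(depth_words, width_bits) // 1024

# 600XL stock: 2x 4416 (16K x 4-bit) -> 16 KB
print(bank_kb(2, 16 * 1024, 4))   # 16
# 600XL upgraded: 2x 4464 (64K x 4-bit) -> 64 KB
print(bank_kb(2, 64 * 1024, 4))   # 64
# 130XE: 4x 4464 -> 128 KB
print(bank_kb(4, 64 * 1024, 4))   # 128
# The "normal" densities run in powers of 4 starting at 1 Kbit:
print([4 ** n for n in range(6)])  # [1, 4, 16, 64, 256, 1024] Kbit
```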
-
How much would you pay for a Neo Geo AES
kool kitty89 replied to ATARI7800fan's topic in Classic Computing Discussion
There's been a lot of JP systems in similar/better shape than that going for around $230-250 (Buy It Now, shipped), but that's with no games. (Probably better to sell the games separately though.) -
Atari and their pork pies (Atari STE specs)
kool kitty89 replied to oky2000's topic in Atari ST/TT/Falcon Computers
For them, or as a reflective look back? As stated, for them the price point was the major issue they felt held it back. Hence the lawsuit. Verbatim from Leonard. I mean from a mass market perspective at the time: 3-4 hours (or less) was just too short to be practical. The PSP ran into the very same problems against the DS (more factors than that, of course). Price point is important, but it's almost never the deciding factor, at least for the video games market (especially in the US and Japan). Hype seems to be the #1 factor, albeit tied to many other factors to back up that hype. Also, I highly doubt Atari could ever have undercut the GB's cost; price perhaps, but only because Nintendo chose higher margins (for profits or re-investment in marketing/software development, and higher revenue in general making investors more attracted). The GB would have been fundamentally cheaper regardless, let alone if Nintendo had decided to invest in even lower cost hardware sooner than they chose to, and as such, Nintendo could have dropped margins (and price point) to below what the competition was pushing. If Atari thought price point was the most important thing, they needed a much cheaper design than the Lynx. (Hell, it's odd that Sega priced above Atari unless they had sloppy engineering/manufacturing and/or opted for higher margins than Atari, or that it had a shorter battery life given the hardware was generally simpler and the screen was smaller.) Software support (1st and 3rd party) and marketing/hype are always huge issues, in fact often the MAIN factors for the success of a game console.
(That's how Nintendo pulled things off so well: they had the exclusive contracts, but they had to get the market position to enforce those contracts as well. They had to do it in Japan, then again in the US, and failed to do so in Europe. That's also how Sega managed to get big in the early 90s thanks to Katz and Kalinske's marketing push and build-up of the western divisions and 3rd party software connections, and it's also the main reason Sega failed with the Master System in the US. The 7800 had much bigger disadvantages due to limited funding, far less ideal hardware for the market that emerged in the late 80s, and nowhere near the in-house software resources of Sega; Sega had enough to meet Nintendo and push a good way towards competing with 3rd party games on the NES as well. By 1988/89 Atari Corp was finally in a financial position to push the sort of investment needed to really succeed in the US games market; they needed to be willing to take big risks with heavy investment if they wanted to break in, and they should have had the assets and credit/investor interest to pull that off under the right management, like Sega did.) In Europe, it's a bit of another story in general, especially in the late 80s and early 90s when price point was still a major issue (less so from the mid 90s on), but the high population density and heavy game/computer magazine culture meant FAR more effective viral marketing. (The MegaDrive probably would have been a smash hit in Europe even with far weaker marketing, while it probably would have faded into obscurity without the management and investment made by SoA; Sonic wouldn't have become an immediate smash hit without decent ad campaigns and without being the pack-in.) However, if Atari DID push for such a heavy-hitting campaign in the video game market around '89/90, they really should have focused on the 4th generation home console market.
(Pushing into the handheld market was great and the Lynx was certainly innovative, and it ended up doing rather well in Europe, but home consoles were established and Atari had real potential to push into that market. Of course they had planned the ST derived system and then the Panther, but that all fell through, and waiting on the Jaguar left a massive gap where Atari's market position could have been healthy. It's also odd that they invested in the Panther at all when they had the makings for an awesome chipset via the Lynx, and even an ST/STe derived console could have been better than the Panther in some respects: at least more developer friendly and flexible, plus with common components with the computers, like the 5200. Or for that matter, even when they decided to drop the Panther, why didn't they switch to something else in the interim, since the Jag was going to be 1992 at the very earliest, and that was highly optimistic? Aside from the Panther/ST/Lynx, they also were working with Flare, and Flare had the completed Slipstream ASIC, which Konix had not licensed exclusively and thus Flare could sell to anyone else they wanted to. The Slipstream design would have made a really nice, flexible low-cost/high-performance 2D/3D game console ready made for production in 1989, at least if they opted for a Z80 or x86 based CPU, and with even more potential than the Lynx in some respects; though a Lynx based console could have been tweaked in other ways, so that's a bit open-ended.) That would be something to ask Martin Brennan about: whether he ever suggested Atari Corp use the Slipstream chipset while he was working on the Panther.
(Given he was not fond of the Panther's design and specifically suggested the Jaguar, or what would become the Jag, due to the problems with that design. Actually, I wonder why they opted to evolve the Panther's Object Processor and implement it in the Jaguar given the dislike of the concept, rather than pushing a more powerful blitter alone with simple framebuffer VDC logic.) I was always impressed by the GG back then when I saw one (hadn't realized the battery time issues until later), but never saw a Lynx growing up in the 90s in California. (I'm sure I'd have been impressed with that too.) However, I also never had any real problem with the GB's unlit screen in general; hell, reflective screens are BETTER in a lot of cases (namely in brightly lit areas, like outside, ie playing after school, etc), though the Lynx DID add a visor to address the problems of a backlit screen in direct sunlight (unlike the GG). Color LCD screens of the time were also a lot more problematic to view than monochrome screens. (Also related to the severe contrast issues on early color LCDs, the main thing that made reflective color screens impractical until the mid 90s; at least impractical for high color depth, though maybe 6-bit RGB or something close to that could have been reasonably approximated by the early 90s, probably good enough to be well worth the cost/size and, mainly, power consumption advantages.) One HUGE issue is the software, especially for the US market. Even before the GG or Lynx got dropped in the mid 90s, the GB had so many more options across the board for the games consumers were demanding and also the brand recognition to back that up (both for 1st and 3rd party games). Nintendo's exclusive Tetris license was obviously a significant factor too.
(And the fact it was a common pack-in option.) I can understand why some people find the GB simply unpleasant to play, but I can also see why many others find 3-4 hours (sometimes less on early model Lynxes and GGs) to be totally unacceptable for a portable, on-the-go system. (Let alone the bulky size vs a pocket-friendly system; the GG can just barely fit into a large front pants pocket though. And that's all aside from the hype/software/brand recognition with the public, those being the main reasons the GG did so much better than the Lynx in the US in spite of being less powerful, more expensive, having a smaller/poorer screen, and having worse battery life.) Hell, the Game Boy Pocket's drop to ~1/3 the battery life of the original was also a huge problem for many; albeit ~12 hours wasn't too bad for sure, but it was a far cry from the ~30-40 hours possible on the original models. (They probably should have used 2 AAs rather than AAAs; AAAs are also less common and less cost effective to use in general, so another headache.) -
Atari SM147 Monochrome Monitor - Impressions?
kool kitty89 replied to wood_jl's topic in Atari ST/TT/Falcon Computers
Cause of the crazy ST video signal. The ST borders are actually part of the picture from the monitor's perspective. It's all about calibration. There's no hard limit for overscan (until you hit the actual edge of practical vblank or hblank for synchronization purposes), and indeed on standard definition monitors with calibration like those of most SDTVs, you can very well end up losing some of the picture due to excessive overscan (ie you might have a 320x240 image that ends up with only 296x224 on-screen). With user adjustable controls, that's a non-issue though, and something that's even more necessary for later multi-sync monitors. (Actually, it would have been cool if Atari had supported grayscale multi-sync monitors with grayscale versions of the 200 line modes to at least allow users to run color-specific software in grayscale. Heh, that's what I was stuck with on a low-end VGA grayscale monitor as a little kid in the early 90s; hell, maybe some developers would even have supported a grayscale mode for games, with different colors used to cater to that.) A monitor "properly" calibrated for the ST video timing will show large borders when connected to an ST. It is largely because the ST's pixel clock is not correct for the screen timing. The ST's pixel clock runs too fast, so all the displayed pixels are squirted out in less time than would otherwise happen; the border color is displayed where there are no pixels. Yes, but what defines the standards for such calibration? How much is "normal" for off-screen/hblank (and vblank) area? Etc, etc. For a lot of multi-sync analog monitors, you have to manually calibrate things for certain resolutions (some resolutions have common overscan borders, some are considerably off). And given the ST's resolutions are all direct multiples of one another, a single monitor calibrated with that in mind (either to minimize borders or to compromise for square pixels in 320x200 or 640x400) would be the best by default.
(Preferably with easily accessible pot knobs for manual adjustment; technically you could have 640x200 with square pixels and a huge vertical border.) For normal TVs there's a fairly wide range of "standard" calibration. The most common for modern sets (late 80s onward) is around 224/448i vertical lines and approximately 75% of the horizontal scan visible (25% is in overscan), but the "correct" calibration for NTSC (or 60 Hz/15.7 kHz SDTV of any sort) is technically 240/480i lines and ~80% of the horizontal scan visible. Of course, some older (and even not so old, but cheap/low quality) sets will be further off and possibly even have aspect ratio issues. (Some sets don't even show a vertical border with 192 lines.) As such, the Amiga will also show a significant border: vertical just as large as the ST's (for NTSC; PAL can optionally go up to 256 lines though, non-interlaced), but a smaller (though still noticeable on most sets) horizontal border similar to the 2600/A8. (The ST's horizontal border is more like the C64's; apparently the C64 uses a 4/8 MHz dot clock rather than being tied to the NTSC color clock.) The 5.37 MHz dot clock of the TMS9918/NES/PC Engine/Genesis (some modes), SNES, etc gives almost exactly 256 pixels on NTSC TVs at the common (75% visible) calibration, while the Genesis's 6.67 MHz mode gives almost exactly 320 pixels for that common calibration. (It also gives a good middle ground between PAL and NTSC pixel aspect ratios, so games that have art designed for square pixels will look OK on both; not perfect, but not too bad, unlike 5.37 MHz, which looks OK on NTSC but super wide in PAL.) 7.16 MHz gives almost square pixels in PAL, but way too tall in NTSC, while 6.25 MHz does the same for NTSC.
(The 6 MHz used on the Neo Geo is pretty close to that, but very slightly wide; it also only ends up showing about 288 pixels on "normal" TVs, and is, of course, pretty wide in PAL.) Of course, with a computer, you can't afford any loss to the border, so the Amiga/A8 resolution was pretty much right-on for NTSC. (320x200 at 7.16 MHz was pretty much right-on for any good quality TV from the mid 80s onward, though it might be problematic for some old/cheap sets without resorting to recalibrating; albeit a lot of older sets had external pots for adjustment, even some really low-end sets like GE's Portacolor. That one even has external RGB pots on top of H/V scan adjustment.) 7.16 MHz is also pretty close to the limit for practical viewing via RF, at least for text (and only for good RF at that). Both composite and RF would be limited by chroma interference for color, though for a luminance-only signal (which all A8s with monitor ports provide) you'd be about as good as RGB through composite, though RF noise would still limit things. With a decent TV with composite input, you could reasonably manage much more than ~7-8 MHz, closer to double that, only limited by beam precision and phosphor dot pitch. Yes, just like other systems used common NTSC color clock derived frequencies. (Had the ST used composite video by default, it would have had to have a 3.58 MHz compatible oscillator on all models rather than only some.) As it is, it would have been nice if they'd used a faster master oscillator (like 16, 24, 32, or 48 MHz); they could have had a lot more options for potential dot clock rates for the SHIFTER, and more options for different CPU clocks at launch (or at least later on). Hell, even using different dot clock versions of existing modes (just with different shaped pixels) could have been really useful, namely lower-res versions of current modes.
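To put numbers on the dot-clock comparisons above, here's a rough sketch; the ~75% visible figure is the "common calibration" assumption from the post, not a broadcast standard:

```python
# Rough estimate of how many pixels a given dot clock lands in the
# visible portion of an NTSC scanline, assuming the "common" TV
# calibration of ~75% of the horizontal scan visible.

NTSC_LINE_S = 1 / 15_734  # one NTSC scanline period, ~63.6 us

def visible_pixels(dot_clock_hz: float, visible_fraction: float = 0.75) -> int:
    """Pixels of a given dot clock that fall inside the visible scan."""
    return round(dot_clock_hz * NTSC_LINE_S * visible_fraction)

print(visible_pixels(5.37e6))  # ~256: TMS9918/NES/PCE-class clocks
print(visible_pixels(6.67e6))  # ~318: roughly the Genesis 320 mode
print(visible_pixels(6.00e6))  # ~286: close to the ~288 Neo Geo figure
```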
(Like 640x200 4 color with wider pixels, with more and more pushed into overscan as the dot clock dropped, or the same for 320x200 4 color.) As it is, they at least could have offered a 4 MHz low-res mode (exactly the same as the C64's pixel resolution) with 320x200, but only about 192 pixels (or a bit less) visible on-screen with normal calibration. Actually, what might have been really interesting is if they'd used a 50 MHz master clock from day 1 and used 10 MHz CPUs as the base standard (rather than 8 MHz) and offered 12.5 and 16.67 MHz models too (maybe 12.5 initially as the high-end version and 16.67 MHz as it became widely available). Maybe use 8.33 MHz if that was an acceptable range of overclocking for 8 MHz rated parts. (Otherwise the next step down would be 7.14 MHz.) Faster oscillators (especially high precision ones) would generally be more costly though, especially ones at non-common speeds. (I think 50 MHz was pretty common though, probably a lot more so than something like the 53.7 MHz Sega was using with the Mk.III/Master System in '85/86.) -
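The hypothetical 50 MHz master-clock menu above falls out of simple integer dividers; a sketch (the 50 MHz figure and the target speeds are the post's speculation, not any shipped ST design):

```python
# Hypothetical CPU clock options from a single 50 MHz master
# oscillator, as speculated above: simple integer dividers.
MASTER_HZ = 50_000_000

def divided_mhz(divisor: int) -> float:
    """Master clock divided down, in MHz, rounded to 2 places."""
    return round(MASTER_HZ / divisor / 1e6, 2)

for d in (3, 4, 5, 6, 7):
    print(f"50 MHz / {d} = {divided_mhz(d)} MHz")
# /3 -> 16.67, /4 -> 12.5, /5 -> 10.0, /6 -> 8.33, /7 -> 7.14
```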
Atari and their pork pies (Atari STE specs)
kool kitty89 replied to oky2000's topic in Atari ST/TT/Falcon Computers
Modest upgrades to the SHIFTER like H/V scroll (maybe some really simple fill-type acceleration operations, well short of the BLITTER), maybe even adding more bitplanes (and color registers, or direct color + indexed hybrids), maybe a dual playfield mode (which would have been really significant, especially if dual 16 color layers were supported, vs the Amiga only having 8 color layers with DPF, not counting sprites), and 12/16 MHz CPU options would have been great. (CPU upgrade and scrolling would have been the most foolproof by far; that, plus simple DMA sound support and/or a YM2203 replacing the YM2149: fully backwards compatible, but adding 3 4-op FM synth channels, exactly half of the MD's YM2612. Especially with the talented Euro chiptune artists pushing it; the Japanese obviously had some great examples, while the US was pretty weak on average with FM synth in arcades/computers/consoles unfortunately.) Most of those enhancements should have been employed across the board as the base standard too, rather than retaining the old models. (Aside from high-end features and a full shift to a much more powerful standard to counter VGA, more powerful CPUs, better sound, etc on PCs; the Amiga lagged there too, only aided by having more of a head start.) That, and they made the same mistake Warner did by blocking easy expansion. (Especially a cheap single expansion port in place of the cart slot for general expansion without having to solder internally.) Then they could easily roll that into the expansion supported on pizzabox/bigbox models. (They should have been pushing both of those sooner too, if not from day 1, especially in the US.) Irrelevant; the GameBoy was more popular because of the price point at that early stage. Certainly not the technology, and the game library wasn't that large yet either. A same-priced or lower-priced Lynx would have been much more competitive, hence why they were pursuing that at the time.
Likewise the Wintel comment makes no sense; Wintel dominance didn't occur until Win95, which was after Atari Corp. had already pulled out of the computer line. 3.1 simply gave a launch pad. The price point was probably way down on the list, especially in the US. It was much more due to advertising, brand recognition (tying into market positioning), software support (with established 1st party franchises and 3rd party support), better funding in general (tying into all of that), and then the hardware advantages on top of that: the biggest being the excellent battery life of up to 40 hours (a real world figure for using good alkalines, though often closer to 20-30), close to 10x that of the Lynx (with 2/3 the batteries), and then the compact size on top of that. (The price point came after those issues, and even if the Lynx HAD dropped below the GB's price, it would have still had all those other disadvantages, at least until they could start offering models with unlit reflective color screens and then somehow boosted software and marketing support on top of that. Lacking a 4th gen home console was a major problem too.) The Game Gear was significantly more popular than the Lynx in the US in spite of being more expensive, having even worse battery life and a weaker screen (and weaker hardware in many respects). It was simply marketed better and supported by a popular brand at the time. (The GG did poorly in Europe by comparison though.) -
Hmm, it was my impression that it was RAM that took the place of the entire address range in the XLs (save I/O), with the 400/800's OS ROM address range filled with RAM. (With the OS loading into RAM rather than running straight from ROM as on the earlier models, or having a full 62 kB for low-level programs that bypassed the OS, except carts, which needed at least 8k of that for ROM banks.) Besides, even if they HAD wanted to keep that for the OS (and kept the OS ROM flat mapped into memory), they could have taken the 130XE route for memory expansion and used banking within the normal (non-ROM) 32k address range. (Stay out of the 16k that doubles as cart ROM space, to allow that RAM to be used for carts; even better if a mode was supported that let the 16k normally overridden by carts be bank switched in as well.) That would mean 48k RAM flat mapped at any given time (or 32k + cart ROM), plus the OS ROM and I/O, and additional RAM paged into the 32k (non cart conflicting) range. That would also mean no RAM wasted by cart addressing or OS usage. (And a scheme that could be expanded with no hard upper limit, just as with the 130XE's scheme or Mosaic's.) Hell, if they really wanted to make it flexible (and felt the cost was acceptable), they could have included all that AND offered a 62k (or 46k + cart) flat mapped RAM mode as well. I wasn't saying having 2k for I/O was a bad thing, but that moving on to bank switching alone with the XLs could have avoided that 2k sacrifice altogether (at the expense of somewhat less flat addressed memory). Plus, didn't they end up using some of the PIA lines for the XE memory map? (Part of the reason they dropped controller ports 3 and 4.)
-
They didn't need it for 8K, 16K, or even 32K boards. The select signals handled those boards fine. But if you try to make a 64K RAM board that covers the whole address range (like the XLs) then you miss A14 and A15. You can get A15 from the OS ROM slot, though, and could jumper it over to the RAM board. In fact, you can regenerate A14 using A15, S0, S1, S6, and S7 (all of which can be found either on the OS ROM slot or the 1st RAM slot) but it's inconvenient (and perhaps not obvious). So maybe they just did not foresee the need for more than 48K RAM in the 800. It wouldn't have mattered if Atari had gone with bank switching within the original memory map rather than remapping the system for the 1200XL. What's interesting is that the Mosaic expansion board for the 400/800 predated the 1200XL by over a year and not only used bank switching, but used part of the address range that was totally unobtrusive to the original memory map. (Ie, it used the 4k "hole" that was unused in the original map: 3 banks of 16k for RAM, 2 of which double as cart address space, the 4k "hole", 10k addressed to the OS, and 2k dedicated to I/O.) That also meant that the Mosaic board used a full 64k or more (albeit only 50k flat addressed at any given time) vs the XL/XE map that was a maximum of 62k (130XE banking was also 62k at any given time, switching 16k banks) with 2k wasted for the I/O space. But it's worse than that, since you normally also have a chunk of RAM for the OS to load into. (Though for games and applications that bypass the OS, you could have full use of the 62k.) I think Chilly Willy explained that in the "What is the Atari 400?" thread.
It would have made sense for Atari to jump in with a standard for expansion compatible with the Mosaic board (which was getting software support well prior to the XL's release and offered a standard expansion scheme with no upper limit -banking in 4k chunks), but the disadvantages would have been added complexity in the memory mapper logic (bank switching rather than direct addressing) and there may have been legal issues over using Mosaic's scheme. (and Warner/Atari probably didn't want to pay licensing fees -though I can imagine a favorable deal could have been negotiated with Mosaic since Atari's compliance with Mosaic's scheme would have obvious advantages for Mosaic while a competing standard would have the opposite effect) They probably would have avoided many (if not all) of the compatibility issues that the XL had and also could have retained the 4 controller ports. (iirc the XL mapping scheme used additional PIA I/O lines that were normally used for the controller ports -of course, dropping the ports was also a cost saving measure, but then they could at least route them to a cheap PCB edge connector expansion port -like the PBI- for addition of 2 more ports)
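The banking idea discussed above (a fixed 16k window paged over extended RAM, 130XE-style) can be sketched as a minimal address-mapping model; the bank-select interface here is simplified for illustration and is not the exact PORTB bit assignment:

```python
# Minimal model of 130XE-style bank switching: a 16 KB window at
# $4000-$7FFF that can be pointed at one of four extended 16 KB
# banks, leaving the rest of the 64 KB map flat.
# (Register layout simplified for illustration, not the real PORTB.)

WINDOW_LO, WINDOW_HI = 0x4000, 0x7FFF
BANK_SIZE = 0x4000

def physical_address(cpu_addr: int, bank_select: int, bank_enable: bool) -> int:
    """Map a 16-bit CPU address to a flat physical RAM address."""
    if bank_enable and WINDOW_LO <= cpu_addr <= WINDOW_HI:
        # Extended banks live above the base 64 KB of physical RAM.
        return 0x10000 + bank_select * BANK_SIZE + (cpu_addr - WINDOW_LO)
    return cpu_addr  # everything else stays flat-mapped

print(hex(physical_address(0x4000, 0, True)))   # 0x10000: bank 0 start
print(hex(physical_address(0x4000, 3, True)))   # 0x1c000: bank 3 start
print(hex(physical_address(0x4000, 3, False)))  # 0x4000: window disabled
print(hex(physical_address(0x8000, 3, True)))   # 0x8000: outside the window
```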
-
Yes, a lot of potential there too, and multiple routes to go for evolving the platform. It took until the IIc Plus to get a faster CPU, but they at least could have bumped it to 2 MHz with interleaving in faster RAM (like the BBC Micro), or even 3 or 4 MHz (but then it would make more sense to switch to a wait state mechanism more like the A8's DMA rather than plain interleaving in fast RAM, or to go to a dual bus design, but that's more costly). There's a lot of middle ground between the IIe and the IIGS in feature set. Maybe just a true 16 color bitmap mode (even if only 140x192), boosted CPU speed, and a basic sound chip or a simple DMA sound circuit more like the Mac's. (Hardware V/H scroll registers would have been rather significant.) Then the rest is just software: you could have a GUI running in monochrome 560x192, probably decently fast at Apple IIc Plus speeds, on top of all the software the Apple II was getting in general. Maybe an upgrade more like the CoCo I/II to CoCo III jump (double speed CPU, added interval timer functionality, added color and resolution capabilities, hardware scrolling, maybe DMA sound or at least a DAC or bank of DACs to work with), but earlier (like in '83 or '84). Hmm, maybe an AY-3-891x instead of the simple DACs, especially if they used the parallel port(s) provided by that chip. (More so if the memory map for the chip was consistent with the Mockingboard. Volume modulation on the AY also allows some fairly decent sampling, even on single channels; POKEY's 4-bit linear volume is a bit better, the SN76489 is significantly coarser, but approximating 8-bit PCM is also possible with multi-channel hacks.) Then they could bump to something closer to the IIGS, but maybe add a blitter of some sort and cut back sound a little (simple DMA sound probably would have been more economical than the Ensoniq chip; something like the Archimedes perhaps, or an 8 channel Amiga).
Using a faster CPU from day 1 would have been critical though, and a fast 65C02 (especially an R65C02 with all the added instructions) would have been better than the slow '816, plus they could have used a custom bankswitching scheme that was more desirable than the '816's segmentation. (Might have ended up with something more like Hudson's 6280, at least after further consolidation; probably initially just a plain 65C02 with external logic for banking and to allow single cycle memory access timing rather than half cycles and 2x the RAM speed. Hence why the 6280 at 7.16 MHz only needed 140 ns ROM/RAM to run at full speed; granted, you'd need FPM support to reach those speeds in DRAM, and you wouldn't get much more than 4 MHz with plain random accesses to DRAM.) Then there's also options for boosting a 650x externally with various coprocessors for fixed point math (like the multiply/divide/ALU hardware added to the SNES or Lynx; the Flare 1 did that too, but for the Z80/808x). They could also have opted for relegating the 6502 to a coprocessor on a separate bus with all the audio, video, and I/O, and adding a 68k as the new CPU (sort of an Apple II/Mac hybrid that uses the old hardware relatively efficiently). But, in hindsight, given what happened to the 68k architecture by the early 90s (Motorola limiting licensing of anything beyond the 68000, limiting competitive pricing, falling behind x86 and newer RISC designs), they may have been better off sticking with the 650x alone and making the jump to RISC sooner.
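The 140 ns figure above is just the clock period at 7.16 MHz; a quick sanity-check sketch:

```python
# Sanity check on the memory-speed claim above: a 650x core that uses
# the full clock cycle for each memory access needs RAM/ROM roughly as
# fast as one clock period.

def cycle_ns(clock_hz: float) -> float:
    """Clock period in nanoseconds."""
    return 1e9 / clock_hz

print(round(cycle_ns(7.16e6)))  # ~140 ns, matching the 6280 figure cited
print(round(cycle_ns(4.0e6)))   # 250 ns, plain random-access DRAM territory
```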
(ARM would have been the lowest cost option, and I believe they entered the open market around 1989 or 1990; prior to that Acorn had limited it to in-house designs iirc. So they could have jumped in with the ARM2/ARM2aS and ARM3 for the high-end stuff. Somewhat fitting to make the jump from 650x to ARM given that Acorn supposedly had the 650x as part of their inspiration for the design, and their previous computers had all been 6502 based.) Well, that depends on some things. Lower margins don't mean lower profits unless you don't sell proportionally more in total. You'd want high enough volumes to make those low margins MORE profitable than the high margin, low-volume production example. As volumes go up, economies of scale go up (even if they totally outsourced production, the savings would be very substantial, plus such volumes make heavy integration of components more economically attractive), so the margins would again go up as production got cheaper. (They wouldn't have to keep cutting the margins, just manage the prices on the low-end models to remain competitive in value. They lacked the hardware capabilities of much of the competition, but made up for it with software and expandability. Again, I'm thinking of a simple expansion port for the low-end models that could be extended to a full peripheral module, of course with that module set at a price that would make it more expensive to buy with the low-end machine than a standalone full Apple II.) The other thing is that they wouldn't just be pushing for the low-end. They'd have the cut-down low-end models and the full-fledged Apple II machines for the mid-range market (probably still at somewhat lower prices, but in part facilitated by the same cost savings of integration on the low-end machines), but then also be pushing for even more advanced higher-end models in addition to that.
There's all the hardware possibilities above for evolution of the line, and as it evolved, the older models could fall into the low-end and the previous bottom-end models could be discontinued (and so on). But beyond true architectural upgrades, there's form factor, and it may have been significant to offer the Apple II in an IBM-like desktop box+separate keyboard form factor with bays for internal disk drives, etc, and a keyboard more competitive with IBM's. (at least a keypad added) And then there's the potential for expansion into the much more price-sensitive European market. (again, the pretty decent tape loading speeds of the Apple II would have been an asset there) They could have had bottom-end machines competitive with the ZX Spectrum's pricing (more than the ZX80/81 though), but getting started earlier and with US software support to boost it early on. I agree that going all low-end could have been a bad move for Apple, and that's also why it probably wouldn't have been practical to push such a move until the early 80s after they'd built up a fair bit with their initial success of the Apple II in the late 70s. (especially since initial sales for computers in the late 70s were going to be limited in general, and there were the FCC issues for any TV compatible machine that forced Atari's more costly configuration of the 400 and 800, with Class B not arriving until about 1980 -the CoCo and VIC-20 seem to have arrived at just the time to cater to that -Atari lagged with cost reduced redesigns catering to that standard, let alone European models that never had such restrictions) Not Sinclair though, like Apple they had very simple hardware to offer, cheap enough to compete without any vertical integration. Yes, but high volume low-margins could be just as profitable, if not more. 
(plus I was suggesting higher end/mid-range machines to remain while specific low-end models were introduced -and replace those low-end models with upgraded ones as the higher-end machines evolved) That's basically what I'm suggesting, except introduced around 1980/81 in a form factor more like the CoCo (and no FCC Class C stuff to deal with), and an even more cost-cut Euro specific model with no shielding at all. (probably bottom end models with hard capped chicklet keyboards -like the CoCo- but then low-end models with proper keyboards at a moderately higher price -something Atari should have done for the 400) Again, I'm not saying it would have been smart for Apple to go full-in low-end, but diversify their market once they got established enough to do so. And, critically, do so with a compatible range of machines rather than totally new/incompatible machines. (Commodore had that problem for sure, Tandy did too for that matter) Diversity is something Atari would have greatly benefited from as well, like if the 800 had been closer to the Apple II (comprehensive expansion support and monitor only design spec to meet FCC Class A, then have the 400 -preferably with full keyboard models available- for the TV compatible lower-end and go on from there -better if the 400 had a PBI like port for an external expansion box compatible with said internal slots on the 800). Atari engineers initially wanted to push the 800 more into the Apple II design region, but Warner's emphasis on entertainment (and apparently Kassar's insistence on the "appliance computer" concept) prevented that. Actually, diversity in the market is something that most major computer manufacturers of the time failed at, even IBM. 
(they tried with the PCJr, but screwed up pretty badly -Tandy showed how it should have been done, except they limited it to Radio Shack distribution as with their other machines -something that probably helped the TRS-80 early on but ended up limiting Tandy's computers later on when they didn't expand to other outlets and dealers) Conversely, Commodore and Atari both could/should have been aiming at diverse markets as well, catering to the low-end, mid-range, and high end with compatible machines. (Atari sort of did that, but marketing was limited, they didn't push the 800 into a "full" computer like the Apple II -internal expansion and FCC Class A design- and CBM kept pushing incompatible machines from the 40 column PET to the 80 column to Super PET to the VIC to B128 to C64 to Plus/4 to Amiga -you did have the C128, but that was rather an inefficient way of allowing compatibility and it clashed more with the Amiga line in the end; if they'd diversified the C64 line directly -let alone if it had been directly VIC compatible- they could have totally cut out the Plus/4 and phased out the VIC in favor of just C64 compatible machines and then the Amiga as the next leap -they could have grafted C64 compatibility onto that too, but that would add other trade-offs especially if done without even making decent use of the C64 chipset) There was Tandy in the lower end market too (though also in the higher end and mid-range), but the CoCo didn't ever manage the market share of the C64 or A8 AFAIK. (a lot like the Apple II tech wise, better in some respects even, but it obviously didn't get the same kind of software support as Apple did) Yes, but I'm talking about keeping those full models and offering a cut-down package that retained expansion, but limited it to a single slot more like the Laser 128 or some other 8-bits. 
(same flexible expansion, but only 1 expansion board without adding a separate expansion module chassis -and to make the full Apple II models more attractive, the cost of that chassis plus a low-end model would need to remain higher than models with internal expansion slots) Plus, there's what I addressed above with actual expansion into even higher-end models (and progressive evolution) including a PC-like form factor in addition to the lower-end range. That's what limited (or limits) them to more niche markets when they had the potential to become a real market leader. A wide range of compatible machines complementing each other could have done wonders for Apple. (and is something that pretty much all the 8-bit computer makers lacked -some had a wide range, but not compatibility -and the value of compatibility and evolutionary design, at least for computers, can often outweigh other cost advantages of a fully incompatible design, especially for architectures fundamentally oriented around flexibility and expandability -something that Atari, Tandy, CBM, and Apple were all limited by, and something IBM was also limited by with their later attempts to regain control of the PC market rather than going with the flow of established market standards of the growing PC clone market -where IBM still would have had an advantage with high volume vertically integrated production and for setting new standards -as long as they didn't conflict with the mass market like MCA did) Its onboard feature set was even less than the CoCo's and more like the ZX Spectrum's (actually more limited in some areas), and like the CoCo and Spectrum, it had flexible expandability, but got more support for such expandability for the most part. 
(part of that was in offering the higher-end models, something they definitely should not have stopped doing if they went for the low-end -the whole idea is to expand their range of the market rather than catering to a small niche; imagine if the Atari 800 had featured the Apple II-like expansion slots engineers had wanted, or at least if PBI had been there from day 1 on the 400 and 800 and a 1090XL-like chassis was available as well at a moderate cost -if they had models with that expansion built-in, the pricing should have been managed to make the low-end models plus chassis more expensive than the high-end models, but otherwise there's no limit on the pricing) 6809 has the same address limitations as the 6502, though better features and clock per clock performance; no compatibility though. (the 65C02, especially with the added Rockwell ISA, added many more 6809-competitive features and all the instructions that the 65816 had, but with clock per clock performance disadvantages compared to the 6809 or '816 -also lower cost than any of those though) There's no reason not to just push for faster 6502s and later 65C02s, and there were even advantages to doing that rather than jumping to a 65816. (if volumes got high enough, they could even license the 65C02 core for a custom version with such bank-switching/MMU logic integrated into a single package -again, like Hudson's 6280) Why even bother with the Mac as such, why not focus on evolving the Apple II hardware (and software) to the level of the Mac? 
(from the programming PoV, a 650x -even with an extended ISA and good MMU/banking scheme- wouldn't have been as programmer friendly as the 68k and would have had performance trade-offs -some things the fast/simple instructions of the 650x are better at, other situations where the 68k ISA is better suited- but from the user PoV, as long as the applications run well and the OS was comparable, it shouldn't have mattered -the IIGS's OS was actually better than the Mac's at one point, though with a stock 2.8 MHz CPU, things ran pretty slowly -a blitter would have greatly helped that too) There's tons of middle ground from the older Apple II models to the GS, things they could have been pushing even before the IIe, let alone between the IIe and GS, but they didn't. When the GS came out, it still had an arguable value advantage over the Mac (much better features at lower cost), but the slow CPU and lack of blitter hindered it. (plus Apple's preoccupation with the Mac and generally divided support for the II that was addressed by Ransom) Until the Laser 128 with the legally reverse engineered ROMs. (a stronger Apple II market could have meant more clones like that emerging, as happened with PCs) It also offered expansion support and a numeric keypad that the IIc/IIc+ lacked.
-
Yeah, I can see that, it would be turning too hard all the time. (probably part of what makes the accelerated turning worse is just how hard you can turn at the peak rates) Huh, weird, at first glance it seems to be relatively similar to Virtua Racing, but then there's something "off" that's hard to figure out just by looking. (that would definitely explain some of the problems) That might explain the lack of turning mechanics corresponding to speed. What about the "button tapper" for pseudo-analog control? (it seems like allowing fixed rate turning combined with the button tapper would make things better in general . . . or if the game had supported the internal analog controls -specifically added to avoid the cost of external DACs, but obviously unused -I wonder if Atari ever planned to release a racing wheel, let alone an analog gamepad or joystick) I don't think it would be good to turn any faster, maybe a tad slower. (but that would require more braking) Playing it back to back with Virtua Racing, it definitely seems to oversteer at your current settings. (by comparison I pretty much never have problems oversteering in VR to the point of running into walls -though it does force a spin-out if you turn too hard at too high of a speed, so that's different-) Tapping the brake or throttle is a lot easier to manage than tapping the steering wheel IMO.
-
Atari SM147 Monochrome Monitor - Impressions?
kool kitty89 replied to wood_jl's topic in Atari ST/TT/Falcon Computers
Don't all those analog monitors have calibration support via pots? (it's just that most are inside the casing, so you have to open it to adjust it) External pots would have been great, not just for fine tuning to one's personal preference of overscan, but also for allowing perfectly square pixels (which would leave a noticeable vertical border by necessity). Also, it would allow you to stretch it even further for Mac emulation. (you'd want both overscan and positioning controls though) Also, can't you just wire an ST's monochrome lines to most/all VGA/SVGA monitors in general? (most of which have overscan control) Unlike the color modes, which use 15 kHz h-sync and would only work for standard def (or multimedia) RGB monitors and some early VGA monitors specifically designed to support EGA/CGA sync rates. (if you managed to find one of the latter, you could even rig up a simple switch with a custom/hacked cable to toggle between mono and RGB video lines from the ST) Cause of the crazy ST video signal. The ST borders are actually part of the picture from the monitor's perspective. It's all about calibration. There's no hard limit for overscan (until you hit the actual edge of practical vblank or hblank for synchronization purposes), and indeed on standard definition monitors with calibration like those of most SDTVs, you can very well end up losing some of the picture due to excessive overscan (ie you might have a 320x240 image that ends up with only 296x224 on-screen). With user adjustable controls, that's a non-issue though, and something that's even more necessary for later multi-sync monitors. 
(actually, it would have been cool if Atari had supported grayscale multi-sync monitors with grayscale versions of the 200 line modes to at least allow users to run color-specific software in grayscale -heh, that's what I was stuck with on a low-end VGA grayscale monitor as a little kid in the early 90s; hell, maybe some developers would even have supported a grayscale mode for games with different colors used to cater to that ) -
Would Apple have been successful (or more successful) if they'd pushed the Apple II into the low-end market in the early 80s? It really seems like an ideal platform to push as such given the extreme simplicity, but also the general support it got in the late 70s. They could have kept the mid-range market (maybe even extended into higher-end models in a more IBM-like form factor), but also pushed heavily for a consolidated and more minimalistic design closer in overall configuration to the CoCo (and even cheaper). Something with full compatibility with the standard Apple II line, but cut down to a much smaller motherboard with only 1 (external) expansion port that can be used to add a proper expansion box for multiple cards (like the CoCo had or Atari was supposed to with the 1090XL). By about 1980, relatively low-cost ULAs should have allowed for considerable consolidation of the Apple II's discrete logic chipset without even having to invest in full custom chips (which would only become economically viable at high volumes -but a good option to move on from commodity ULAs). In addition to that, you had FCC Class B making TV compatible machines with onboard RF adapters (without excessive shielding) realistic. The Apple II hardware should have been fundamentally cheap enough to actually match (or undercut) the offerings of the vertically integrated CBM for machines in similar configurations (like a bottom end Apple II with 4k DRAM vs the VIC-20 or a 48k/64k model against the C64), and higher volumes and further integration would have further advantages. Then there's the European market. Apple did relatively poorly in Europe as it was, but a low-cost strategy could have totally changed that. Technically speaking, the Apple II probably could have been made to be cheaper than (or close to as cheap as) the ZX Spectrum, and that would have been very significant. 
The Apple II also had a decently fast tape loading speed (about 1200 baud iirc, double Atari's and quadruple CBM's -aside from custom loaders- though somewhat slower than the Spectrum -less than 1/2 the speed of double speed loaders for the speccy or CoCo) Would it have been wise for Apple to expand their market as such, would they have been more successful or even the dominant platform of today? (the latter would be dependent on the clone market, which obviously would have expanded much faster with a more popular Apple II) And if they were successful as such, would it have been wise to continue with upgrades and backwards compatible successors to the Apple II line and avoid the Mac altogether? (or possibly merge the Mac and Apple II in as efficient a manner as possible)
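To put those tape rates in perspective, here's a rough load-time comparison (my own sketch, assuming 1 baud is about 1 bit/s and ignoring start/stop bits, leader tones, and inter-record gaps, so real-world loads were slower than this):

```python
# Rough tape load times at the data rates quoted above (approximate figures;
# framing bits, leader tones, and inter-record gaps are ignored).
def load_seconds(kbytes, baud):
    """Seconds to load a payload of `kbytes` KB at a given baud rate."""
    return kbytes * 1024 * 8 / baud

# Approximate stock-loader rates mentioned in the post.
for name, baud in [("CBM", 300), ("Atari", 600), ("Apple II", 1200)]:
    print(f"{name} at {baud} baud: 48 KB in ~{load_seconds(48, baud) / 60:.1f} min")
# Prints roughly 21.8, 10.9, and 5.5 minutes respectively.
```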
-
I think now I understand why the NES beat the 7800
kool kitty89 replied to Atari Joe's topic in Atari 7800
Yep, Nintendo railroaded Atari in court as such with that grandstanding BS and Atari f*cked up in responding to it. Still have someone on block, so didn't see the original post. The content of the quote is wildly inaccurate on so many levels, and even the context it's being presented in here is a bit off. The book it's taken from (Game Over) goes on right after that to say that even Lincoln was surprised when the jury ruled in favor of that one issue (there were three issues and the jury was deadlocked on the other two). Yeah, sorry, I responded to the direct quote as if that was directly from the trial and without knowing what actually happened. (obviously if Atari DID lose over misinformation that simply, they'd have had to have absolutely terrible legal representation) Are there even any public details on that case? But obviously, that claim was rife with errors and false accusations. (hence why I was citing it as grandstanding/BS/railroading if it actually happened like that in court) OTOH, Sam did seem to drag Atari Corp down after Jack had built it up, but that's a totally different topic and unrelated to that court case. (though it might have come up if they'd tried to sue Sony) Absolutely, there's plenty of places to blame people/firms for the crash and situation of Atari Corp and the market in general in 1985, but that's a totally different issue than things that prevented Atari Corp from building itself up after the collapse of the market and Warner's liquidation of Atari Inc. Yes, and Mike Katz probably would have been a person of interest as well given he managed the entertainment division. (though Jack was the president/CEO of the company as a whole) I can't imagine that Katz and Jack weren't brought into that trial as witnesses. Yes, and also totally disregarding Nintendo's position and actions in Japan. 
(or how they'd actually done the same thing to Sega in Japan and the US with those policies -I'm not sure why Sega didn't sue, but I can only imagine it had to do with Japanese culture) Yes, perhaps not handling things ideally (even aside from the problems caused by Warner), but hardly inept. (and as best as one tries, it's hard to say how they could have done things better without digging into hindsight). Would it be inaccurate to label that case as Nintendo railroading Atari with their greater resources for legal support? Spacedice does seem like a troll at times, or maybe he's not one formally (at least not all the time), but I'm definitely getting frustrated with parts of the discussion. (and it's distracting from more useful areas of the discussion) Hmm, it also seems like he rejects any hypothetical suggestion (or historical reference) that deviates from pushing for an Apple-like business model. (or that Apple's business model, while successful, was/is also flawed and limited them to a niche when they had the potential to be the dominant player of the home computer market with the Apple II -or that going for the low end with the Apple II could have been a good idea, among other things) But that's another topic. -
It's better, but there's still no compensation for speed. (ie changes in turning based on how fast you're going) I still had a problem with oversteering at times. I'm not sure, but playing it back to back with Virtua Racing Deluxe 32x and SVP for the Genesis, the road seems less cramped than in Checkered Flag. The framerate is definitely still a big issue, even in Project Tempest (it's choppy, but I think there's more slowdown on real hardware). I actually noticed more aliasing issues in Virtua Racing Deluxe on the 32x (some z-fighting and clipping stuff) and more obvious limits of 256 color shading and dithering in place of alpha blending (plus lower detail models in general -much more so in the SVP game obviously -and the fact it's clipped to 320x192 or 256x192 on the Genesis), but the framerate is far more important for its playability. (granted, those games were programmed by some of Sega's top development teams and based on a well designed -if simple- arcade game, so that's obviously a factor too) It seems like Rebellion should have been more willing to cut detail and maybe suffer some visible on-screen clipping issues (possibly drop to a somewhat smaller window rather than full-screen) to maintain a solid framerate (at least a solid 15 FPS if not topping out at 20-30 FPS). Some of the models actually could have looked better than they do now with fewer polygons (the in-game car models look a bit odd compared to VR on the Genesis even). Cutting detail to the environment would detract somewhat, but the framerate would be worth it, plus you'd still have the nice highcolor (CRY?) shading effects and any translucency used as well. (hell, one good way to help offset lower polygon counts would be to design the game around a smooth gouraud shaded look and take advantage of the blitter's very fast g-shading capabilities, both for general smoothing of lower polygon count models and for smoother lighting effects). 
For NTSC users, dropping to 288x192 wouldn't have been bad at all for the time. (especially if they modified the score/throttle/track map display to take up less space on-screen or moved it to a totally separate border outside of the game window) For PAL, 192 lines would leave a much larger border though. (like most Master System games, Virtua Racing, etc) I'm not sure about the flexibility of the dot clock options for the Jag, but if they're fairly flexible, maybe it would be desirable to actually drop the horizontal screen resolution rather than clipping, or at least with less of a border. (dropping to a 5.37 MHz dot clock would mean the same pixel size as the SNES and some Genesis games -Virtua Racing uses that- but something a bit higher like 6 MHz would be even more convenient for NTSC as it's very close to square pixels -I think 6.25 MHz is perfectly square for NTSC; 6 MHz is exactly what the Neo Geo uses and gives just under 288 pixels visible on most normally calibrated NTSC TVs) What they did made the game look good in screenshots, but not much else. (though gouraud-shaded optimized models would probably have looked better in screenshots too, even if the on-screen poly count was dialed back a fair bit)
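The dot clock figures above can be sanity checked with a little arithmetic (my own sketch; treating roughly 48 µs of the ~63.6 µs NTSC scanline as visible is a common rule of thumb, not a hard spec):

```python
# Visible-pixel estimate from the dot clock (rule-of-thumb numbers, not a spec).
NTSC_LINE_US = 63.556   # total NTSC scanline period in microseconds (reference)
VISIBLE_US = 48.0       # rough visible portion on a normally calibrated TV

def visible_pixels(dot_clock_mhz):
    """Approximate pixels visible across the screen at a given dot clock."""
    return dot_clock_mhz * VISIBLE_US

print(round(visible_pixels(6.0)))   # ~288 (the Neo Geo figure quoted above)
print(round(visible_pixels(5.37)))  # ~258 (the SNES/Genesis-style 5.37 MHz clock)
```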
-
I think now I understand why the NES beat the 7800
kool kitty89 replied to Atari Joe's topic in Atari 7800
It's the opposite though. Nintendo took their big risk with the launch of the Famicom in '83, but beyond that they simply forced 3rd parties to cater to them after they'd gotten established as the market leader of Japan. Nintendo didn't "license their wares" as such to be more competitive through more support, they used exclusive licensing that forced those developers to virtually only develop for Nintendo consoles or not publish for Nintendo at all and to pay Nintendo substantial royalties for the privilege. (and have full control over release dates, manufacturing, volumes, etc -3rd parties were not allowed to manufacture their games independently, something that's still a problem with the DS) Nintendo's monopoly in Japan is what gave them their most fundamental advantage worldwide. (NEC was the first to break through that in large part due to their massive corporate presence -though they failed to push that in the west like Sony did, which was good for Sega though -and favored more open competition in general when NEC could have steamrolled the western markets like Sony did later on -and if they'd managed to pull Square from Nintendo, Japan would fall in line as well) Yep, Nintendo railroaded Atari in court as such with that grandstanding BS and Atari f*cked up in responding to it. Lynxpro already addressed this a few times. Atari Corp and Atari Games should have won their antitrust suits against Nintendo, but they both screwed up. (and it certainly didn't help that Nintendo had more funds backing things up) I could make a direct counter argument to that totally BS Nintendo claim as Atari should have, but I think I've already done that in the previous discussions and I've said enough, really. Except, they shouldn't have had Sam on the stand at all since he had relatively little to do with the company at the time. Jack and Michael Katz would have been the main people of interest. 
(though twisting Sam's arm was certainly a smart thing for Nintendo's defense to pull, obviously BS, but the legal system is full of that -a shame that Sam wasn't smart enough to realize that and push Jack and Katz to respond) Hell, did Michael Katz and Jack Tramiel even have a major presence at that trial? And why didn't Sega of America sue Nintendo, or Atari bring Sega people in to support the litigation? (for Sega, it may have been the Japanese culture that held them back, even from the US branch taking legal action) Again, making use of what the system has or has not got is entirely down to the programmer. And that's the problem, you want a system that's as friendly/easy/standard to use as possible for the market at hand. You can make do without that, but it's one more strike against you. (Sony could suck it up with the PS2 as they had so many other advantages that drove developer/market interest at the time; if they'd done that with the PS1, that would have had much bigger consequences -especially if you flipped it so Sega had the likes of the PSX's programmability/tools/etc and Sony had the likes of the Saturn or Jaguar -obviously Sony would have still had many other advantages, but it wouldn't have been a perfect storm like it was) Having a more idiot proof architecture is ALWAYS important. (look what happened with sound on the Genesis, especially how often you got [...]) The Atari 8-bit chipset had [...] -
I think now I understand why the NES beat the 7800
kool kitty89 replied to Atari Joe's topic in Atari 7800
Reading this reminds me of an interview that Sam Tramiel did in NextGen magazine. Google "sam tramiel nextgen magazine interview." You mean the one where he downplays the Saturn and threatens the price dumping suit? (I assume he was basing that on the older Saturn specs that had been published in late 1993 -in EDGE and EGM iirc, which definitely were weaker than the jaguar) If you want to see someone rip on Sony's anticompetitive market tactics and monopolistic/megacorp tendencies, etc, etc, you should see some of the arguments sheath has made over at Sega-16. (not sure if he has any articles on that on his gamepilgrimage site) In any case, I doubt a dumping suit would have held up since Sony was dumping the price internationally across the board, not in the characteristic manner of selling high in one region and not another. (they were using massive investment capital leveraged by their massive size/internal funding/credit to subsidize things rather than more conventional dumping techniques like what Jack Tramiel thought the Japanese would push on the computer market and did with DRAM -legal action was taken too late to prevent the collapse of DRAM production in the US though) They were taking the megacorp approach to the conventional razor and blade business model with unprecedented losses on hardware (vs the more conventional sales at or very near cost) that the competition couldn't afford to match. Then they poured tons more money into massive ad campaigns and software (and buying up exclusives to some games and buying out some developers entirely) on top of having some of the best suited hardware for the market at the time. Then you have the competition all either screwing up and/or falling onto hard times for other reasons (or both in several cases) which turned it into a perfect storm where Sony had almost every advantage except an established position in the market. 
(even more of a perfect Storm than Nintendo with the Famicom or NES -or Atari with the VCS and all their early competition being rather weak in one way or another -Astrocade was probably the best, but expensive and not marketed well by comparison) Even if Atari had had a case, Sony would have railroaded them in court like they did Bleem. (who actually won, but collapsed due to the cost of that litigation) Or like what DRI would have come up against if they'd really pushed against IBM/Microsoft's infringement on CP/M with DOS, or how Apple probably would have railroaded DRI if they'd actually gone to court over the "look and feel" BS. (albeit if DRI had teamed up with IBM, the shoe would have been on the other foot ) Anyway there was already a thread about that interview: http://www.atariage.com/forums/topic/90927-sam-tramiel-interview-next-generation-1995/ (note some of my comments in that thread are made with ignorance of some things I've learned since then -that, and that thread is where Marty made that interesting post that rather concisely summarizes Atari Corp under Sam Tramiel) http://www.atariage.com/forums/topic/90927-sam-tramiel-interview-next-generation-1995/page__st__100__p__1823912#entry1823912 But that still fails to parallel my main points with the GC/Xbox/N64/Saturn analogy, as I was talking about Sony being in an even more aggressive position by using Nintendo's exclusivity tactics to drown the competition. Anyone who thinks that such contracts didn't have an absolutely massive impact on the Japanese and US video game markets (and some on Europe as well) is either ignorant or delusional. Remember, it was those contracts in Japan that limited Sega -and other competition- in that market and also prevented Atari from licensing or commissioning games from major Japanese developers even before Nintendo had test marketed the NES. 
Hell, even with the VCS, Japanese developers were pretty substantial; can you imagine where Atari would have been if they'd been totally locked out from all the major Japanese arcade games? (Space Invaders WAS their first massive killer app after all) Actually, even if Atari Inc had stayed under Morgan, Nintendo's lock-in of the Japanese market could have been a massive problem. Even by late 1984, Nintendo was getting substantial interest on the market and licensing games would have become an uphill battle for Atari. (regardless of having better resources than Atari Corp) They'd have a hell of a lot of a better chance of maintaining strong US developer support (maybe to the point of Nintendo modifying their contracts for US publishers), but losing Japanese developers/licenses would have been a huge disadvantage. (especially since the market was dominated by Japanese games by the late 80s) -
I think now I understand why the NES beat the 7800
kool kitty89 replied to Atari Joe's topic in Atari 7800
That's rather a blanket statement coming from somebody who has never actually programmed the 7800 (as far as I'm aware). Don't forget that all programmers are not created equal. You can't say what a system can and cannot do unless you try and make it do it. I'm not talking about the raw capabilities; in fact, I specifically qualified that in that post. I'm talking about the specific MECHANISMs used for those capabilities and how easy they are to use, both for a beginner on the system and for programmers who would be constantly working with other popular platforms at the time. (ie tilemap based character modes with indexed color attributes, hardware sprites with x/y position registers, x/y playfield scroll registers, and/or bitmap/framebuffer graphics using a blitter or CPU grunt -or various mixes of those feature sets) The sprite mechanism of the A8 would have put it at a disadvantage too, but at least it had framebuffer modes and relatively conventional character modes. That's the same reason the Panther would have been a bad idea, not meshing with easy multiplatform development or with common standards of the time. (the Panther is rather like MARIA on steroids with much higher bandwidth, higher clock speed, higher color depth, hardware sprite zooming/scaling, etc) Well, that, and the specific configuration of the Panther not only made it eat CPU cycles like crazy, but also required expensive high-speed SRAM (I think it was planned to use 30 ns SRAM) and was thus limited to only 32 kB of total onboard RAM shared with the CPU. The Jaguar's object processor avoided those problems with 64-bit word buffering in addition to line buffers, allowing commodity 75 ns FPM DRAM to be used and enough RAM to allow a framebuffer in addition to the list generated objects. 
(and more flexible use of color among other things) Again, those disadvantages with the 7800 wouldn't have mattered so much if the 7800 had dug in and gained developer interest before being "spoiled" by other hardware (except arcade and computers would have divided things regardless) and thus had more established development support with the planned 1984 release. (but still you could argue whether it would have made more sense to minimize the number of distinct products Atari was pushing out and instead push ahead with the 5200 and fix many of the problems that could be fixed, or drop it and switch to a direct derivative of the computer line -maybe keep MARIA development going for use in a next generation console) It still would have been harder to port to from the many common arcade/console/computer architectures of the time (in the case of actual ports rather than total ground-up remakes -albeit many "ports" are actually remade/custom games as such with no relation to the source code or original graphics data), but it would have had enough established support that that wouldn't be a major problem. -
Jaguar games that get a bad rep, maybe not so bad?
kool kitty89 replied to slackerwithin's topic in Atari Jaguar
Yeah, I usually like his videos (finding his channel was part of what pushed me more into retro games/computers back in early 2009), though in this specific case he did make some mistakes in the comments on the technical/development problems with the Jaguar towards the end of the video. (it wasn't a lack of dev tools, but simple/minimalistic tools without comprehensive workarounds for some of the bugs or good high-level programming support -it was generally on the level of a 4th generation introductory dev manual, just not up to the standards that Sony started pushing -something that was also a problem on the Saturn; then there's the REAL problem of the Jag simply being underfunded by Atari and most developers being limited by that as well -with some exceptions, it was Atari outsourcing to relatively small/less experienced/lower budget developers, which was understandable given their position -so you often ended up with delays or sub-par games, or more often, both) In the case of Checkered Flag though, I can only imagine why they chose to retain that control scheme for the release; I'd have thought locking in the controls at a fixed turn rate (on/off) would have been rather straightforward. (simpler than what they did implement) Anyway, yeah, Jake/MN12BIRD's reviews are usually pretty good. Oh, and iirc, Club Drive is more playable, but worse looking compared to Checkered Flag. (and a different sort of driving game) -
I think now I understand why the NES beat the 7800
kool kitty89 replied to Atari Joe's topic in Atari 7800
Because they had brand recognition, a significant market share, etc, etc. (and that market share would have been considerably higher with open 3rd party support without the walls put up by Nintendo) I mean, in spite of the budget/support issues, Atari DID have a substantial brand name at the time and DID sell 3.77 million units from '86-90. (nearly 3 million in '87 and '88 alone, the best 2 years for the system) They were also well ahead of Sega in the US in spite of Sega's better funding and software resources. Maybe Warner should've just paid GCC off for the 7800, given it to Atari Games Corp. - which they still owned a 10% stake in during Namco's takeover - and then negotiated with TTL/Atari Corp. to revert home video game console rights to the name "Atari" back to Atari Games. That would've made more sense than giving the home gaming rights to Tramiel when TTL/Atari Corp. did not retain any of the former Atari Inc. programming staff while on the other hand, Atari Games Corp. immediately wanted in on the home video game console action following the split. Game Over in its narrative clearly states Atari Games staff wanted to compete and to beat Nintendo. Had they had their own console [7800], they probably would've been more successful than what Tramiel & Co. had at it... Yes, it would have been a wise investment on Warner's part as a shareholder and if they seriously had plans of ever reclaiming Atari consumer. (but in that sense, the whole botched liquidation process was a huge blunder) If they wanted to get the debt off the books, but try to put Atari in a position to be healthy and have provisions for a buy-back, they really should have pushed for a total sale of some sort. 
(I think the issue was that they were asking for too much and may have been able to sell Atari Inc outright if they'd pushed something closer to the loan/promissory note route they did with the consumer division) But in any case, I'm not sure the 7800 was preferable to release at all, even if they'd released it in 1984 as planned. (ie there were other advantages to never investing in that development or halting it in favor of pressing on with the 5200 or switching to the XEGS route -or similar- there were many options for correcting much of the 5200's problems after the fact, or dropping it and shifting more to the computers and pushing them directly onto the console market -in either case you have a lot of commonality in hardware and software development, but the 5200 could have been cheaper than the A8 while the A8 would have full computer compatibility and a loophole for Nintendo) This is really a separate discussion entirely though. (with Tramiel, there was even more reason not to go with the 7800 as they lacked the added resources Warner had provided and ended up stripped of many of Atari Inc's other resources even, on top of the conflict over the 7800 -so there's an argument that the 7800 was unnecessary for Atari Inc in 1983/84, but much more of one for Atari Corp -for Atari Inc, maybe they could have negotiated with Warner/GCC to hold off and roll the work done on MARIA into a future design for the successor to the 5200) Then there's the whole separate argument of what Atari Inc should have done in place of the original 5200 back in 1982. (ie 3200 or similar, direct consolized A8, cheaper/low-cost optimized 5200 with provisions for cheaper/convenient VCS compatibility and an adapter at launch, etc -or maybe a directly consolized A8 with special provisions for VCS compatibility) True, or not deciding to drop it entirely back in '84 and pressing on with existing products. 
(and either consolizing the 600XL or re-releasing the 5200) Edit: Oh, and a good analogy for the Nintendo licensing situation: Take another generation when one platform/company had a substantial lead in the market, like the 5th or 6th generation. If Sony had implemented Nintendo-like licensing agreements (or at least the console exclusivity portion) for the PSX or PS2, what would have happened to the 3rd party support on the N64, Saturn, Xbox, or GameCube? By extension, with the much more limited support (pretty much only 1st party), how much less would those systems have sold and how much more limited would the resources have been to produce/commission 1st party published releases? (especially for Nintendo, since MS was managing a loss-making business with the Xbox as it was with massive subsidization from other divisions; Nintendo just had the -albeit quite successful- Game Boy/Color/Advance line going at the time -and like with the 7800 vs ST, diverting excessive resources to the GC that could go to the handhelds wouldn't have made much sense -though there's other exceptions to that parallel with the 2600 in play among other things) That's all assuming that government/legal action wouldn't stop Sony from doing so (just as Nintendo wasn't stopped in the period we're addressing) And, of course, Sony had other advantages that allowed them to put even more pressure on the market than Nintendo ever had, but without being illegal. (or at least less illegal, as what Sony was doing was to some extent similar to what led to IBM's antitrust suits years earlier -which contributed to the PC being made from mainly off-the-shelf parts and not taking advantage of IBM's vertical integration and massive capacity, and if Sony had been a US company, it might even have come up against such legal action) But Sony's impact on the market is yet another separate discussion. -
I think now I understand why the NES beat the 7800
kool kitty89 replied to Atari Joe's topic in Atari 7800
Oh, I forgot to mention, the NES and SMS both had hardware advantages as well that made them more attractive (actually even the A8/5200 was more attractive in several aspects than the 7800 in terms of ease of development relative to "common standards" for arcade/console/computer programming of the time -like character modes, bitmap/framebuffer graphics, etc). But I think that aspect of things has been done to death already. (regardless of the 7800's advantages, it was definitely less adept at and more difficult to program for common games of the time than its contemporaries, and the fact it didn't get established back in '84 when that was less of an issue obviously makes things worse) But even if Atari had had hardware close to the Master System or NES, it probably wouldn't have changed things that substantially with all else being equal. (the Master System might have actually been a fairly poor choice given the ROM sizes Atari was generally working with and how much the SMS's 4bpp graphics take up -you could compress things in ROM and unpack into VRAM -or maybe unpack some stuff on the fly if you could spare the CPU resources to do so, but that's only for stuff you're not updating on the fly, and that 16k in the SMS gets eaten up pretty quickly with 4-bit sprite/tile data -actually, the SMS probably would have been better off in general for the time if it had supported 2-bit or maybe even 1-bit graphics data -outside of the TMS9918 mode- but with much more flexibility of indexed colors than the NES -an NES with 8x8 attribute cells and double the subpalettes probably could have looked better than the SMS in many cases -let alone with the bandwidth needed for more 3-color sprites per line -sort of like the Game Boy Color vs the Game Gear, except with ROM being far more expensive and limiting than in those cases; by the mid 90s, the flexibility of 16 color subpalettes with the GG would have been much more significant) -
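To put some rough numbers on the bit-depth tradeoff above, here's a back-of-envelope sketch (my own illustration; the ~4.5 KB reserved for the SMS name table and sprite attribute table is an assumed figure, not an official one):

```python
# Hypothetical tile-storage math: bytes per tile scale linearly with
# bits-per-pixel, which is why 4bpp data eats ROM and VRAM twice as
# fast as 2bpp data.
def tile_bytes(bpp, width=8, height=8):
    """Bytes needed to store one width x height tile at a given bpp."""
    return width * height * bpp // 8

sms_tile = tile_bytes(4)  # SMS tiles are 4bpp -> 32 bytes each
nes_tile = tile_bytes(2)  # NES tiles are 2bpp -> 16 bytes each

# Patterns that fit in the SMS's 16 KB VRAM after reserving ~4.5 KB
# for the name table and sprite attribute table (assumed figure):
vram_free = 16 * 1024 - 4608
print(vram_free // sms_tile)  # 368 tiles at 4bpp
print(vram_free // nes_tile)  # 736 tiles at a hypothetical 2bpp
```

Halving the bit depth doubles the unique patterns that fit in the same 16 KB (and the same cartridge ROM), which is the tradeoff the comparison above is getting at.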
I think now I understand why the NES beat the 7800
kool kitty89 replied to Atari Joe's topic in Atari 7800
There's no "NES excuse" though there is the very real issue of Nintendo blocking competition with unprecedented anticompetitive licensing policies. (obviously Atari still wouldn't have gotten more support than Nintendo if they hadn't done anything else different, but that's not the issue: the issue is Atari getting much more support than they themselves did historically -or Sega for that matter, and it's a cumulative thing too: better support, better funding, cyclical, more and more as time went on with the real potential being the leap to the next generation -something they already had a big opportunity for, but could have been even bigger) Yes, I too would like to know more about just what Atari's licensing policies were and how 3rd party development was managed in general. I'm not sure how Osborne's self-defeating prophecy has anything to do with the video game market of the time at all. Given that Katz didn't even have any commissions from European developers, I think it may have been more of an oversight from focusing on US developers. But for native EU dev support, there wasn't a whole lot on the NES or SMS either as far as Euro publishers were concerned (especially prior to 1989), and both of those sold a lot better than the 7800 in Europe. (actually, most of the European published NES and SMS games are from the early 90s) Maybe there was simply too much interest in the computer game market (and the totally free licensing, low-cost media, etc native to that) to attract publishers to consoles, especially in the late 80s. (all the more reason Atari would have had to commission games as such and/or use especially favorable licensing policies -like totally free/open licensing other than the cost of development tools and possibly reviewing games for objectionable material or such) I need to look further, but I can't seem to find any European published NES or SMS games (not sure about 2600) prior to the early 90s. 
That's one more reason to argue Atari going all computers back in '84/85, or rather having a fully compatible consolized version of the 600XL released at the time. (no delay issues a la 7800, fewer platforms to support for software or hardware, lots of existing software and development support, getting around Nintendo's licensing with a "game computer", etc, etc) Going with the XEGS in '87 was a bad move, but something like that in '84/85 could have been a very different story. (let alone doing that instead of the 5200 back in '82 -especially since the 16k 400 was already being cut to 5200 level prices in late 1982) It seems odd that they hadn't been pushing for more 7800 support right out of the gate (ie by late 1985, when Katz was preparing things), unless the above is incorrect. (haven't seen Curt/Marty make any statement one way or the other on that issue) By late 1985, the 2600 should have been mainly aimed at old games with re-releases and compilations and a low priority for new games, at least for 1st party stuff. (just like any late gen console -like the NES from 1991 onward, SNES from 1996 onward, PSX from ~2001 onward, etc) If any 3rd parties wanted to publish for it, fine, but don't add to the encouragement over the 7800 (or computers). Lockout was there, yes, but all that meant was that Atari had control over how to manage 3rd party support. It would have been up to them to allow free development, very low licensing fees to be competitive, etc. (under the circumstances, they probably should have had free licensing, or maybe low-risk contracts that only charged royalties above a certain volume sold) Yes, the XEGS was a bad move, at least in the way it was released, after the fact of the 7800 rather than instead of it. (also it was oddly expensive, some $200 when the 65XE had been 1/2 that previously) Again, a 600XL derived console in late '84 or '85 would have been another matter entirely with advantages over the 7800 route. 
(let alone back in 1982 instead of the 5200, but there's even more alternatives there too) WTF? What would having a diverse company have against publishing for their game system? Anyway, Nintendo didn't have many of those advantages until well after the fact in the US. They had Japan locked-in, and that was their main advantage until they could otherwise establish themselves in 1986/87. (I'd imagine US developers were taking a "wait and see" attitude across the board) Of course, Nintendo's position would have generally meant better support across the board regardless, but it wouldn't have meant what they got, and that's blocking the competition almost completely. If Japanese and US developers had been totally free to publish for any platform they wanted without any catches, that would have made the market a very different place. No one's saying that Nintendo alone forced Atari from a dominant position in the market, but the issue is that they DID keep them from being competitive. (ie Nintendo would have had a lead, but not nearly as big a lead over Sega or Atari in Japan or the US as they did historically) What's interesting is that Nintendo was somehow able to hold down their licensing policies in Japan with absolutely no hardware lockout. (maybe more to do with the nature of the Japanese market and honor or something, or maybe Nintendo had other underhanded tactics like compelling retailers to not stock unlicensed games) The Famicom had no more lockout in Japan than the VCS did in the US; the only physical/business reasons to go licensed were for the development tools and for the branding of their games as Nintendo licensed. Atari had limited funds for such licensing, and as Nintendo pushed more and more, they had an even bigger advantage for in-house development, let alone the even bigger 3rd party stuff. Again, Nintendo wasn't the only problem, but they were a huge one. 
It would have been in their interest though, but not if publishing on the 7800 would have meant not publishing on the NES in the context of the time. (for the 7800 to get support, it would have had to be the system with the absolute most profit potential and not just another outlet for their games -had Morgan's Atari Inc stayed, that might have happened by the time the NES was released, but in spite of the limitations of Atari Corp's position the 3rd parties still would have had lots of incentive to publish for a less popular system, just at lower priority -plus the 7800 would have been much closer in popularity without Nintendo's licensing) In the US, Atari had a stronger name for consumers and retailers up through 1986; of course the entire industry was still shaky up through 1986 and coming back from the bottom of the crash in '84. (again, a major issue with "wait and see") Nintendo had Japan to work with, with no crash (indeed a boom when the US crashed), but Atari Corp had no such advantage of overseas success. They also had a growing brand name in Europe, especially with the increasingly popular ST making Atari a true household name across Europe in a different context than they had been in the US (obviously associated with games too given the huge computer game market for the ST). That followed for the Lynx and is the reason Atari's brand name was far stronger in Europe in 1993 than in the US. (and one of the big reasons not pushing the Jag predominantly in Europe was a major mistake -Jack got it right when he was willing to allow shortages in the US to maintain their European presence, but Sam apparently didn't see that) Hmm, I don't know about that. I was under the impression Nintendo had been rather successful with the arcade, pong consoles, and Game & Watch line in Japan. A small company, but a successful one. 
Now, they did risk bankruptcy with their gutsy move with the Famicom, investing in a 3 million unit initial contract with Ricoh to keep per-unit costs down. Had the FC failed, Nintendo may have been no more, but it took off so fast that even the recalls with early hardware problems didn't hurt them too badly. (they were more or less like Atari with the VCS in the US, except rising a bit faster - more or less accomplishing in 1983-1985 what Atari did in 1977-1980, though given the nature of Japan, that's hardly surprising -much denser population) True, they had to work hard to get into the market, probably one of the major reasons they managed to get marketing right in '86/87 when Sega was floundering in spite of comparable (or greater) advertising budgets. Not so much the no-risk thing, since many others had done that too (or not quite as low risk, but full options for retailers to make returns -one of Atari Inc's big problems was the massive returns of games/hardware from retailers and distributors; had they made the distributors/retailers liable, Atari wouldn't have had nearly as much debt to deal with). Talking about just after the launch of the NES: There was no "safe bet" for developers; the industry had just crashed. The NES "success" was not instant, and it was not instantly a "safe bet." It hadn't just crashed, it had crashed some 3 years earlier and had been recovering rapidly from mid 1985 onward (when 2600 sales picked up). Nintendo's success in Japan also would have given some indication, but Atari's brand name would as well. On the whole, all western developers seem to have taken a "wait and see" approach as such, while Nintendo already had Japanese companies locked-in so Atari couldn't even license/commission games if they had the money to do so, let alone expect independent or licensed publishing. Thus, once Nintendo became the most attractive to publish for, western developers were also willing to make the sacrifice of exclusivity for Nintendo. 
(at least for a time, though many eventually got fed up with the BS, more than just lock-in but many other limits on publishing for Nintendo) That's the same thing that happened in Japan, there was notable competition, but Nintendo made it big enough to make developers willing to go exclusive when they otherwise would have published cross-platform. (which of course widened the gap further for future licensees, and so on until NEC's PC Engine tore into the market strong enough to change that) I don't think anyone's arguing that Nintendo wouldn't have still been the most successful under similar circumstances but with unlimited licensing. The point is that other platforms would have done much better (if still been behind Nintendo overall for other reasons), and that goes for Sega's SG-1000 and Mk.III (and some other competition for that matter) in Japan as well as Sega and Atari in the west. (actually, Sega might have risen above Nintendo, though given how they messed up their marketing in spite of funding and good software, I'm not sure that would have worked out either in the US) -
Jaguar games that get a bad rep, maybe not so bad?
kool kitty89 replied to slackerwithin's topic in Atari Jaguar
The main problem is the control scheme. Instead of simple fixed rate controls (on/off type steering), they tried to add some inertia-like elements to it with accelerated turning the longer you held the button down. However, not only does that break the game for simple d-pad controls, but it doesn't make sense for analog controls or in logical real-world terms either, since inertia/acceleration doesn't work like that. (unless they were trying to simulate analog controls by using time as a control for the x axis, but that's really not a good idea -it would be like setting the throttle/accelerator to get pushed harder and harder as you held the button down rather than going to full instantly as digital button based controls normally do) I'm not sure why they didn't catch that before the release; it's not something that should have been hard to fix (ie just lock turning to 1 fixed rate like other d-pad games -and use tapping to get intermediate rates). It's definitely a case of over-thinking things where "less is more". The oddly sensitive spin-outs, framerate issues, and weak AI for other cars on the track would be the major problems after the control scheme. (and I mean how the cars fail to avoid running into you, not how easy they make it -they seem to have dialed down the difficulty to address the control problems oddly enough) MN12BIRD recently posted a review that gives a fair overview of the problems from an open-minded perspective, and most of those common problems crop-up. http://www.youtube.com/watch?v=NubFwOMnsWg I can get used to the controls to the point of being reasonably playable, but it's still a major issue. Plus, it ruins you for normal racing games and you have to un-learn the controls again. (playing in Project Tempest avoids the framerate issues though, but the control is still a big problem) It depends on your taste. I just like that primitive 3D stuff (or 3D in general). 
I can have fun with F-22 on the Genesis (main problem is the lack of rudder control IMO -a common issue for console flight sims), or LHX, among others, but there's a line where it just gets to be a chore, as with Steel Talons on the Genesis. (the framerate hangs around 2-3 FPS for much of the game; that seems to have been done to manage a fairly high on-screen polygon count, but it just wasn't playable enough to really be worth it -probably could have been OK on the Sega CD, but oddly that didn't seem to get much support for those games in general -it should have been several times faster at 3D than the Genesis due to the separate/faster 68000 and the blitter to accelerate drawing the polygons, etc) I have a lot of fun with Virtua Racing on the Genesis, and it's actually one of the bigger reasons I started collecting for that system about 2 years ago. (the first Genesis game I emulated in Fusion and the first I bought after I got my Genesis in spring of 2009) It's sort of like how I got so interested in playing the 2600 back around 1999/2000 when I'd never seen one in action before. I wasn't disappointed either; it was fascinating to see those old games. (hell, I didn't even notice how mediocre Space Chase was at the time, granted I was only about 10 or 11) But the 3D stuff goes further since many of my favorite genres are 3D-specific or at least better in 3D (like space/flight sims, rail shooters, action-adventure, or 3D platformers). X-Wing for DOS was so awesome back then, and that's one of the games I can admit nostalgia for (and one I have kept going back to since we got it in the mid 90s -longest gaps in play were cases of incompatibility). Hell, the DOS CD versions of X-Wing and Tie Fighter are significantly better than the graphically upgraded Win9x conversions using the XvT engine. 
(mainly due to how they removed the awesome dynamic iMUSE MIDI in favor of a generic looping compilation of John Williams' theatrical soundtracks -X-Wing Alliance got it right with fully dynamic derivatives of the theatrical soundtracks though -a real shame that the updated X-Wing and Tie Fighter didn't get a treatment more like Wing Commander I/II for the Kilrathi Saga release -awesome revamped General MIDI, digitized SFX, and the original 2D graphics with a new DirectX engine with no speed-sensitivity and super fast/smooth scaling) We didn't get an SNES until late 1996, but I still thought Star Fox was awesome and played the hell out of it (hadn't heard of it before we ended up getting it with our used SNES). It was so cool to find out about Star Fox 2 years later (around 2005) and download the patched game to actually play. (the first game I ever used an emulator for, unless you count Bleem or VDMSound) Cybermorph isn't so much of a "pop in and play" sort of arcade-fun game though; it's a more complex game in general with the roaming nature and some rudimentary mission requirements. That's one of the problems it has in terms of catering to a broad audience: regardless of being a polished game, an arcade-ish railshooter (like Star Fox) is a much more pick-up and play sort of game with a broader audience in general. (Cybermorph isn't as complex as a sim like X-Wing or such, but it is more complex than some others, and that complexity also made it far less foolproof in terms of appeal and polishing -in some respects, it's closer to Rogue Squadron, but obviously far simpler and less polished -one important thing is that Rogue Squadron holds your hand a bit more at times and makes it easier to know what you're supposed to do -the Starglider games are more complex in that regard too) The lack of in-game music hurt too, of course, and the bland art design (especially the lack of 2D backgrounds), so a number of issues. 
(let alone how neat it would have looked if they'd opted for a Comanche-like voxel engine for the terrain) Honestly, I think they should have pushed for a streamlined arcade-style railshooter, or multiple ones for that matter. It's a real shame that they didn't, given the potential for high-speed scaling based shooters (like Sega's and some others) as well as polygonal stuff (and a mix of the 2). One big thing about railshooters for 3D is that they allow extreme optimizing/clipping and simplified AI/logic for some of the most impressive/detailed 3D on-screen that a system can churn out. (let alone a fully optimized polygon/ray-cast/voxel/scaled 2D hybrid using all the strengths of the system) On the 2D side of things, a derivative of Blue Lightning would have been good for an early release. -
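To make the Checkered Flag control complaint from earlier in the post concrete, here is a minimal sketch (entirely my own illustration with made-up rate constants, not the game's actual code) contrasting conventional fixed-rate d-pad steering with the time-accelerated ramp described above:

```python
# Assumed constants for illustration; units are arbitrary degrees/frame.
FIXED_RATE = 3.0   # conventional digital steering: constant turn rate
ACCEL = 0.5        # ramp scheme: rate gained per frame the button is held
MAX_RATE = 9.0     # ramp scheme: cap on turn rate

def fixed_rate_turn(held_frames):
    """Conventional d-pad steering: full rate the instant the button is down."""
    return FIXED_RATE if held_frames > 0 else 0.0

def accelerated_turn(held_frames):
    """Checkered-Flag-style ramp: turn rate grows the longer the button is held."""
    if held_frames <= 0:
        return 0.0
    return min(ACCEL * held_frames, MAX_RATE)

# Tapping the d-pad (1 frame) barely turns the car under the ramp scheme,
# which is why quick corrections feel unresponsive:
print(fixed_rate_turn(1))    # 3.0
print(accelerated_turn(1))   # 0.5
print(accelerated_turn(18))  # 9.0 (saturated after holding 18 frames)
```

The fix suggested above (lock turning to one fixed rate and let the player tap for intermediate amounts) is just replacing the second function with the first.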
A pretty major reimagining of the Jaguar - why not just fix the h/w bugs, drop the 68k and release a CD only machine in 93 That comment was in the context of someone other than Atari pushing the Jag chipset (TOM and JERRY) as a high-end console or arcade machine more like the PSX or Saturn in terms of component/production costs. In the context of Atari Corp, I made a separate statement on that in the last 2 paragraphs: Unless TOM wouldn't mesh with EDO DRAM without modification, in which case there were still other options for different configurations with FPM DRAM to consider. If they'd realized that the market was mainly using simple sample based audio (the likes of the SNES barely did more than 8 channel MODs with compression, interpolation, and occasional use of reverb; the Jaguar mainly used straight-up Amiga MOD -sometimes 8 channel- without), they could have stripped the DSP out of JERRY to make it a much more basic sound ASIC, maybe with more hardware DMA channels or just relying more on the CPU (or GPU for that matter) to drive audio. (having an ASIC including the Falcon's sound hardware, controller I/O ports, and a UART would have made sense, maybe with an on-chip SRAM buffer to allow the DMA sound to read that and stay off the main bus) Then either the 68k, or a better alternative (be it on a 16 or 32-bit bus) along with the dual bank interleaving to reduce overhead without other major redesigns to allow better bus sharing. 
(a bottom barrel 16 MHz rated 386SX might have been OK, but a CPU with a cache would help a lot regardless though and the 486SLC and 68EC020 were probably the cheapest examples of that, though you could go the other way and look for a higher performance low-cost CPU that lacked a cache like an ARM60 or even a plain ARM2 -specifically the newer CMOS ARM2as- which definitely would have been cheaper to manufacture -much smaller die, even smaller than a 68000- but I'm not sure if the volumes/competition would have actually priced it cheaper than a 16 MHz 68EC020 or 486SLC -or a 25 or 33 MHz rated 386SX for that matter) Or you could widen the hypothetical context and change other things like release date, add more funding for Atari, etc, they weren't going to have the hardware bugs fixed in time. Hell, if you solved all the bugs in TOM and fixed the memory interface for JERRY, all that would be left was to tweak the system to make it bootable via one of the RISCs and you'd have a pretty damn powerful system even without caching. (the DSP would be just as usable as a CPU other than being limited to a 32-bit bus) You'd still want to have dual banks to allow separate source and destination for textures though, unless you added a word buffer for the blitter (even without a line buffer -and only 175 ns performance in DRAM- that would be much faster than unbuffered fast page rendering, especially for 8-bit source and 16-bit destination -moreso if they'd added support for 4-bit source with 16-color indexed textures) Hmm, maybe another hack instead of a whole added 256kx16-bit DRAM bank (or 512kx8-bit if you wanted to mainly focus on 8-bit textures and cut some cost) could have added a single 35 ns 32kx8-bit SRAM chip as texture buffer of sorts (aimed at 8bpp textures specifically) and allowing 26.6 MHz reads (so as fast as any textures buffered into TOM's SRAM) to allow boosted performance of the most used textures. 
(and with a bit of added work, it could be managed on the fly as sort of a rudimentary texture cache) That would be less attractive in a case where you still had another CPU, especially one without a cache. (in that case, there's a lot more benefit from having a full 2nd DRAM bank that the CPU could also work in) Perhaps having the cart bus configured as a separate bank and supporting faster ROM to allow code to be run directly from ROM at full speed could have been useful, especially if made optional with support for a range of ROM speeds -Kskunk gave me the impression it's locked at 375 ns, otherwise you could make it pretty damn fast for modern homebrew stuff - and maybe it would have been reasonably cost effective to use a smaller fast ROM chip for code and another bulk storage chip for compressed data to load into RAM -and make that optional as well, with cheaper games using only slow ROM, that is unless it was CD based from the start, then you'd just want a nice slot with support for RAM expansion -hell, then you could even add a 2nd bank after the fact ) Also, cutting out the "helper/manager" CPU in general is also a significant design change that went against what Flare had designed the system as from the start (albeit, unlike the older Flare 1, the Jag actually had a custom RISC processor embedded in the fundamental chipset that could be used as a fairly decent CPU -even if not optimized as such- vs the DSP in the Slipstream/Flare 1 which obviously wouldn't be able to fill that role) There's plenty of other hypothetical stuff that could have put Atari in a much better position in general, and hardware wasn't Atari's biggest problem anyway, as I mentioned in the final paragraph in my last post. (but having fewer hardware problems definitely could have gone a good way towards alleviating some of the other problems -or potentially even helping to resolve them . . . 
without buggy RISCs, Atari's development tools would have been much more useful too, and that compiler probably wouldn't have been so buggy or useless) They were really in a bad position in general though, and Kskunk had a good point when he said they were doomed after 1991. (they had some chance of recovering, but 1991 marks the point where things really started falling apart -missing the 4th-generation console market on top of the screw-ups on the computer side of things took its toll on the company and exacerbated the issues with declining management following Jack's retirement at the end of 1988) This is going further off topic, but on the computer side of things, it's interesting to note that Atari started to get things right while CBM was still stagnating. (the TT's video was a step in the right direction, but they didn't extend that to lower-end systems or upgrade the blitter -or maybe even hack in the Lynx's blitter- to be faster and more efficient using 8-bit packed pixels, or even to work directly on the 64-bit TT video bus; the Falcon got more things right -the blitter was still weak though, and the Falcon was the opposite of the TT in being limited to the low-end range- and it was especially progressive compared to what Amiga did with AGA -except CBM kept up a range of machines, with the 1200 at the lower end and the high-end 4000 with workstation-class configurations as well; a shame that didn't work out with the Falcon though -Atari also ended up pulling out of computers just before CBM collapsed, so there's another missed window -albeit with PCs coming into Europe at that time) I wonder if Atari could have done better with their PC line too, or if it would have been wise to transition to PC/compatibles in Europe rather than extending the ST line in the early 90s. (especially after the previous mistakes there and missed opportunities to maintain a dominant position in Europe)
-
I think now I understand why the NES beat the 7800
kool kitty89 replied to Atari Joe's topic in Atari 7800
Because they had brand recognition, a significant market share, etc., etc. (and that market share would have been considerably higher with open 3rd-party support, without the walls put up by Nintendo) I mean, in spite of the budget/support issues, Atari DID have a substantial brand name at the time and DID sell 3.77 million units from '86-'90. (nearly 3 million in '87 and '88 alone, the best 2 years for the system) They were also well ahead of Sega in the US in spite of Sega's better funding and software resources. A better question is why 3rd parties supported the Jaguar as much as they did in spite of the sales. (not talking about the Atari-commissioned/published stuff, but the fair amount of actual 3rd-party publishing the console saw) Except I wonder why the 7800 didn't at least get a little European publishing, given the general lack of Nintendo-bound publishers and the wealth of computer game developers there at the time. (and the 7800 apparently being decently popular in Europe, if not actually selling better than in the US -it did have a better overall market share in the US due to Nintendo being far less dominant there, but it was also behind Sega and had the low-end computers to contend with) That, and I again wonder why Katz didn't push for directly licensed/commissioned computer games from Europe. (a lot more on the table than in the US and probably at more competitive prices) After the 7800's release it would have been hard to push, but there were other reasons not to release it there in general. OTOH they could have released a keyboard/computer module expansion for the 7800 too. 
(maybe something like a scaled-back 7800 XM with added SRAM or DRAM -Epyx carts were already pushing 32k SRAM chips in '87, to put that in perspective- and POKEY, with a minimalistic membrane keyboard integrated with it and added peripheral/expansion support using POKEY's I/O: a full-quality external keyboard a la XEGS, an SIO port, and maybe added controller ports using the POT lines for analog and/or hacked as digital I/O ports -except you'd need more than 8 I/O lines if you wanted more than 1 added controller port, so that would mean hacking the key inputs or just limiting it to 1 added controller port, possibly with POKEY POT-scanning support for analog joystick(s) and paddles) That probably would have been a better investment than the XEGS by 1987. (and that also could have meant no carts using onboard RAM or POKEY -just require the expansion- and possibly offering those games as tapes or disks to load into the module's onboard RAM; 1987 was also the first really big year for the 7800, with over 1 million units sold) Hell, they could even have released a "7800 Plus" with the expansion module's features totally built in. That also may have made it more attractive for the European budget market. (probably best to add a built-in analog cassette interface in that case too, rather than relying on SIO-based drives -or at least release upgraded drives with high-speed FSK decoders supporting at least 1500 baud -preferably double that or more) Not really right; I already addressed this though. Microsoft had very little to do with it other than scaring Sega with added competition. The Windows CE deal was highly favorable to Sega, and it was Sega of Japan who ultimately decided on the PowerVR chipset (and it had been they who fostered the conflicting dual-platform development, with neither team knowing of the other's existence, which led to some of the later problems). The PowerVR chipset was more cost effective and ended up having awesome development tools. 
Otherwise, I think I summarized Sega's problems pretty well. The DC had sold poorly in Europe and Japan but done well in the US; Sega had not been conservative enough for their dire financial situation (wasting money on the free modem, hefty rebates, unnecessary price drops, and a lot more); the piracy exploit certainly didn't help; and then there's Sony's hype. (and a lot of other stuff I addressed in much more detail in my previous post) If Sega had managed to get a better foothold in Japan, built an even stronger following in Europe than in the US (the former hardcore Sega market, and one where the Saturn didn't crash as hard or as quickly as in the US), tempered their spending (though the biggest thing they couldn't afford to drop in the US was the ad campaigns -those are critical in the US, and their absence went a long way towards weakening the Saturn on top of Sega's many other problems), prevented the piracy exploits (GD-ROM ripping and CD-R booting), and invested in PC publishing a hell of a lot more (especially once the Saturn was failing in '97), Sega might have been able to remain profitable in the home console hardware business. Agreed in full. However, does anyone know if Namco ever expressed an interest in this? Maybe Warner Atari didn't want Namco getting too big for their britches, so to speak... Japan was the second-largest economy in the world behind the US, with 120 million-ish consumers... Maybe, but maintaining a good partner relationship would have facilitated avoiding such conflicts. (and in hindsight, they'd probably have been a lot better off regardless) Yeah, that palm-held Epyx stick is nice. Interestingly, Konix distributed a nearly identical joystick as the Speedking. It would have been perfect if there had been a version with thumb-operated fire buttons. (perhaps in addition to redundant trigger buttons)
