
It's 1993, you're in charge of the Jag, what do you do?


A_Gorilla


If I was in charge, I would have changed the company name to Nintega and called it the Meganes, since by that time nobody really wanted to buy Atari; the name reminded people of Space Invaders and the 2600, which is what your dad would have played (hardly good for selling to the next generation of kids). Oh, and hope not to get sued for too much!

Edited by Crude Dude

An ST-derived game system would probably have been best for '88 or '89 though
I'm not sure about that. Computer-derived consoles weren't successful before the first Xbox -- remember the fate of the Amiga CD32, Amiga CDTV, Apple/Bandai Pippin and Amstrad GX4000 (OK, the last one was already quite obsolete when it was introduced).

 

The fact that they were computer derivatives wasn't the problem; it was other issues: the XEGS, C64GS, and CDTV were all released when the hardware was fairly dated, and that's just to start. The XEGS was released only a year after the 7800 and was neither here nor there as a system, being marketed as a game system with computing capabilities when it was actually a full 65XE computer in a different form factor. (the 5200 was the right time to bring out such a console, but they screwed that up) I think the XEGS was at least profitable though, if confusing to their consumer base.

The C64GS came far too late and was flawed in that it was supposed to be directly compatible with the C64 yet lacked even a rudimentary keyboard. ('85/86 would have been the time for a C64-derived game system; it should not have been made directly compatible, and should have featured a lockout system like the 7800/NES/SMS if they wanted to compete in the same market with the same business model)

The CDTV was using aging hardware and was rather pricey with its CD drive and rather large chunk of RAM, plus it was aimed as a multimedia machine rather than a pure gaming machine. (an Amiga-derived game console stripped to the bare minimum and optimized with a consolidated board might have been a good idea ~1989, but it was late and not cost optimized)

 

The CD32 is probably the best of that bunch and supposedly had a fair amount of popularity early on, but Commodore was failing and really got screwed by being locked out of the US -- not for lack of a market, but with all the machines already manufactured in the Philippines sitting in warehouses. It was a bit outdated at the time too, but should at least have been able to cash in on FMV stuff and some primitive 3D (some raycasting-type games seemed to do fairly well too). It still would have been crushed by the competition later on, but Commodore was in bad shape like Atari at the time; Atari just seemed to have better luck. (and better hardware)

 

BTW, where did you find that quote? Is it from a book or magazine article, and is it available online?

Same site you linked, coincidentally!

http://www.konixmultisystem.co.uk/index.php?id=interviews&content=martin

Heh, that makes the contradiction in the other article I linked to even bigger. ;)

 

Such a system would need to come out in 1988 to have a chance. By 1989, it would be compared directly to the Genesis which is launching world-wide. Compared to the Genesis, the ST blitter isn't fast enough to do overlapping backgrounds at 60 FPS, nor can it trivially support 64-183 colors on screen -- these are all standard features in the Genesis, available without fancy programming.

Hmm, I suppose they'd better beat Sega to market at least, though it took a while for the Genesis's marketing to really kick in anyway; not until the SNES was released did it really start moving. (Michael Katz had predicted that, commenting that he thought a lot of people were holding back until they could get a look at the new competition)

Again, the EU could have been the more likely market to get established in, with Nintendo not having such a hold, Atari's name tied to the popular ST, and the MD and SNES being released later. (though Sega was also stronger in Europe --interestingly in some of the same regions Atari was, like the UK)

 

As for the graphics, the MD only has 4 15-color palettes (plus transparent and a solid far-BG color) to work with per frame, shared between sprites and BG tiles and indexed from 9-bit RGB. Reloading the palette mid-screen (between scanlines) causes artifacting (garbage pixels), and in cases where this is used, flickering sprites are often used to hide it. (usually for water-level effects, like in Sonic 2 and 3)

Plus, you have to consider other genres as well: racing games were quite popular, and these required a large portion of the screen to be updated (Out Run, Super Hang-On, Lotus, Super Monaco GP, etc); this, combined with the VDP's DMA limitations, meant rather limited framerates for such games (15-20 FPS), same for 3D games or any others using a pseudo-framebuffer (a major limit for Virtua Racing and the Sega CD's ASIC, and even the limiting factor on some software-rendered games on the Genesis) --due to the single bank of VRAM only accepting large updates during VBLANK (transfers during active display are severely limited). Clipping the screen to provide more VBLANK space helps this, and 50 Hz PAL is far less limiting due to the much larger number of vblank lines.

So games like that, and probably pseudo-3D scaling-type games, would on average be better on the ST+Blitter.
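
To put rough numbers on that VBLANK budget, here's a back-of-the-envelope model in C. The constants are approximations (the commonly quoted ~205 bytes per vblank line in H40 mode, and a 320x224 4bpp tile "framebuffer" as an assumed workload), not measured values:

```c
/* Back-of-the-envelope model of the Genesis VDP's VBLANK-only DMA
 * budget. All constants are approximations, not measured values. */
#include <stdio.h>

int main(void) {
    const double bytes_per_line = 205.0;  /* approx. DMA rate per vblank line */
    const double vblank_ntsc    = 38.0;   /* ~262 total - 224 active lines    */
    const double vblank_pal     = 88.0;   /* ~312 total - 224 active lines    */
    const double framebuffer    = 320.0 * 224.0 / 2.0;  /* 35840 bytes, 4bpp */

    double ntsc_frames = framebuffer / (bytes_per_line * vblank_ntsc);
    double pal_frames  = framebuffer / (bytes_per_line * vblank_pal);

    printf("NTSC: full redraw every %.1f frames (~%.0f FPS)\n",
           ntsc_frames, 60.0 / ntsc_frames);
    printf("PAL:  full redraw every %.1f frames (~%.0f FPS)\n",
           pal_frames, 50.0 / pal_frames);
    return 0;
}
```

Under those assumptions a full redraw takes ~4.6 NTSC frames (~13 FPS) but only ~2 PAL frames (~25 FPS), which lines up with both the 15-20 FPS figure and the PAL advantage.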

 

With the way most art is done and the limited 4 shared subpalettes, you normally end up with far fewer than 61 colors onscreen, as several indexed colors are going to be redundant across the palettes. (the Amiga's 32 colors, let alone halfbrite, more than likely being advantageous much of the time, and the ST probably at least being able to manage a rough approximation)

The SNES and TG-16, OTOH, have many more subpalettes: the SNES has 2x8 15-color palettes split between sprites and BG (as well as rarely used 256-color palettized modes -- the 16x16-color palettes often quoted as "256 colors" the way the MD is quoted as "64 colors"), and the TG-16 has 2x16 16-color palettes split like the SNES, for a total of 512 indexes and a practical maximum of 481 colors, though it's usually limited to far fewer due to redundant indexed values and, of course, the limited 9-bit RGB master palette. (hence many TG-16 games having smoother gradients than SNES ones due to the greater number of palettes, but a more limited range of color due to the 9-bit vs 15-bit RGB master palettes)
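
Those per-frame maxima are easy to sanity-check (15 usable entries per subpalette plus one backdrop color, ignoring mid-screen reloads); a quick sketch:

```c
/* Sanity check of the theoretical per-frame color maxima discussed
 * above: N subpalettes of 15 usable entries plus 1 backdrop color. */
#include <stdio.h>

static int max_colors(int subpalettes) {
    return subpalettes * 15 + 1;   /* 15 usable entries each + backdrop */
}

int main(void) {
    printf("Genesis (4 subpalettes): %d colors\n", max_colors(4));  /*  61 */
    printf("SNES    (8+8, 16-color): %d colors\n", max_colors(16)); /* 241 */
    printf("TG-16   (16+16):         %d colors\n", max_colors(32)); /* 481 */
    return 0;
}
```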

The ST of course uses 9-bit RGB as well, unless you're talking about the STe, but for '88 it would have to be a standard ST+Blitter as with the MEGA STs. (no DMA sound either)

 

Hell, the Neo Geo supported 256 4-bit indexed subpalettes from a 16-bit master palette, hence the "4,096 colors onscreen" claim, yet few games even exceed 256 colors on any frame. (usually a fair bit less than that, going by some comparisons made in an old thread on Sega-16) Some exceed that by a significant margin, but it's not the norm.

 

Price wars were a big deal too. The Genesis was very cheap to make, in part because of its tiny memory and very specialized architecture. The ST's architecture is more general and (typically) uses quite a bit of memory. For example, typical games would want two frame buffers -- the Genesis does away with such costly overhead.

In the case of a consolidated board, that could be achieved by omitting any unnecessary support chips and expansion ports; just a cartridge slot and AV/RF/power ports.

On RAM, remember that the ST uses cheap DRAM, and a baseline unit facilitating cross-platform development with the computer could perhaps use 512 kB (possibly even make do with 256 kB, since you can run things from ROM rather than loading everything into RAM). The MD, on the other hand, used a mix of the then quite exotic PSRAM (68k and Z80 memory) and VRAM (dual-ported DRAM) for video memory, so that's proportionally MUCH more expensive than the ST's memory, probably rather close to the cost of 512 kB of DRAM if not more. (eventually economy of scale kicks in though -probably aided by similar components being used on Sega arcade boards) --In fact, it was suggested in several discussions over at Sega-16 that the MD could have been a lot better off using DRAM for main memory, facilitating a bit more memory and still saving on cost.

 

On top of that, the limited memory eliminated the possibility of several games which did get ported to contemporaries, like Wing Commander and Wolf3D (the former was ported to the Sega CD, and both were amazingly cut down to work in the SNES's scant 128 kB).

 

In the SNES's case, it did use DRAM for main memory, and only 128 kB at that, but it used another 128 kB of SRAM (64 kB each for video and sound), and that's definitely more expensive. Then there's the cost of the rather advanced sound system with the SPC700 CPU plus the DSP, the dual PPUs, etc. (the CPU and main RAM were cost savings, as would be the 8-bit data lines) The MD, by comparison, simply has the TMS-derived PSG embedded in the VDP and the YM2612 for sound, plus the Z80 coprocessor with 8 kB of work RAM. (oddly, the YM2612's interrupt line is connected to neither the 68k nor the Z80, making PCM playback much more CPU intensive and tricky)

 

With the unnecessary chips removed, I think the ST could be consolidated onto a board comparable to the MD. (and further consolidated later on)

Had it launched in '88, it would probably have been more expensive than the Genesis launched in '89, depending on some factors, but given that Nintendo and Sega were selling hardware at a profit, competition shouldn't have been that tough; the bigger problem being advertising budget. (again, Europe is much more attractive for working with viral marketing)

 

Don't forget that the 7800 came out in 1986 and the Lynx came out in 1989. That's a lot of consoles to be juggling at one time.

Don't forget the XEGS... In fact, I think it would have made the most sense to go for an STGS rather than the XEGS by that point. Their plate was a bit full, but the 7800 (and 2600, finally) were declining steeply in the late 80s (the 7800 peaked in '87/88) and were pretty much dead in the mainstream market by 1990 (slightly later in Europe). At that, they always had a relatively small market share, with very limited software releases and even fewer 3rd-party releases. They could have struck while the iron was still hot in '88, with the 7800 selling fairly well; the STGS could have taken advantage of that, along with being truly technically superior to others on the market. --Prior to the BLiTTER, it would have been a no-go though, but such plans should have been established as soon as the BLiTTER was on its way (which would have been '87 at least, with an intended release in the MEGA 1, though that didn't happen)

By Marty's statements, it sounds like an STGS was planned ~1988, but they switched to working on the Panther around that time.

 

What happened with the Lynx was rather unfortunate though: they somehow managed to get misguided responses from their consumer study groups (I wonder how much they included kids and young adults), and the expensive, battery-hungry implementation and the release coinciding with Nintendo's really hurt it. (IMO the Lynx and GG would both have been better off without backlighting, or at least with a cheaper/smaller non-lit model offered alongside a "deluxe" version) The way the Lynx had to load into RAM without direct access to ROM didn't help optimization either. (apparently due to the original cassette-based architecture --it might have been better to cut RAM to 32 kB and split the address space with cart ROM, if they couldn't implement bank switching or an MMU in time)

 

If Atari had a SNES beater in 1990 or something, maybe it's got a shot... But neither the Panther nor the ST were SNES beaters. The Jaguar was, but... this thread is getting pretty circular. ;)

Yes, though if they were going to release anything ~1989/1990, an ST derivative would at least have been better than the Panther; possibly rather mediocre if released after '89. (the Flare 1/Multisystem might have been closer to such a killer; perhaps doable too, if there had been focus on a conventional cart-based console rather than the mess the Konix machine evolved into)

 

Even launched in ~'89/90, such an ST system might have been modestly profitable and at least managed a market footprint along the lines of the 7800 (over 3 million in 3 years in the US market alone isn't bad; in fact, that's probably the 2nd-best-selling Atari-branded console -not sure about Pong). They might have been in better shape both financially (with the computers dead) and in terms of brand recognition and developer relations by the time the Jaguar was ready.

It might have been a case of being seen as a great company fallen to lower status, possibly with a reputation of more mediocre hardware (really depending on the quality of software released) rather than a dinosaur with nothing but a dead (formerly niche) computer line, a modestly successful and somewhat obscure competitor to the NES, and maker of an even more obscure handheld system. (and romantically looked upon for their classics by a minority)

 

Of course, it could have failed spectacularly and killed Atari off before the Jaguar reached fruition, though that's a rather extreme case; more likely it would have had so little impact on the market that it didn't much matter, while at least making a slim profit.

 

The Amiga was also better suited to compete with the MD and SNES, but Commodore tried that too late and in a confused fashion, plus they had almost no name in the mainstream North American console game market at the time. (Europe was, again, a different case, but there the Amiga itself was already quite popular as a game platform, almost in direct competition with the MD and SNES --probably not quite as directly as the markets had competed in the mid/late 80s --computers and consoles dividing a bit more into the early 90s)

Edited by kool kitty89

  • 3 weeks later...

In 1993, the Jaguar DID support 2-bank interleaving, but you needed twice the DRAM chips to use it (i.e., either 1MB or 4MB). With bank interleaving, you get to keep those 75ns accesses as long as each master accesses its own bank. In this mode, the 68K (and other processors) can share the bus more fairly as long as you allocate memory carefully.
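To illustrate why that helps when two masters alternate on the bus, here's a toy cycle-count model; the page-hit and page-break costs are made-up round numbers, not the Jaguar's actual DRAM timings:

```c
/* Toy model of 2-bank interleaving: two bus masters (say, the 68K
 * and the OP) strictly alternate accesses. Costs are illustrative. */
#include <stdio.h>

#define PAGE_HIT    1   /* cycles for an access within the open page  */
#define PAGE_BREAK  4   /* extra cycles to open a different DRAM page */

int main(void) {
    const long accesses = 1000;  /* alternating A, B, A, B... */

    /* One bank: the two masters live in different pages of the same
     * bank, so every alternation forces a page break. */
    long one_bank = accesses * (PAGE_HIT + PAGE_BREAK);

    /* Two banks: each master's page stays open in its own bank, so
     * the alternating accesses are all page hits. */
    long two_banks = accesses * PAGE_HIT;

    printf("1 bank : %ld cycles\n", one_bank);
    printf("2 banks: %ld cycles (%.1fx faster)\n",
           two_banks, (double)one_bank / (double)two_banks);
    return 0;
}
```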

 

A Jaguar in this configuration has much better texture mapping performance as well. The downside is that it's not cheap.

The bank sizes can be different. Theoretically it works to have one 16-bit bank (i.e., 5 DRAM chips, 2.5MB). That's an interesting idea that could help with texture mapping and relieve some of the 68K burden. Never thought of that one.

 

For the 2 banks, other than the interleaving support, can the 2 banks be treated as separate (parallel) buses as well?

Could such a second, 16-bit bank facilitate the DSP's bus accessing, or would TOM retain priority for both banks?

 

Another 512 kB chip could add to cost, but it seems like the benefits could be significant. (perhaps not as cost-effective a change as the least expensive 68EC020 variant would have been, though: with the added RAM you use more board space and keep the 68k; with the 68EC020 you use no additional board space and lose the 68k)

 

Crazyace commented on another matter earlier in the discussion:

I was just thinking that the only part of the system that really efficiently uses the 64 bit bus is the OP. Sure the blitter is ok for clears and copies to/from internal memories. But for general memory copies the page breaks end up limiting the process.

It might have been quicker to actually only use 32-bit memory, but have 2 banks - so that blits wouldn't page break all the time.

That would also free up pins on Tom to allow a separate data bus for the 68k/Jerry and the DRAM.

Interestingly, OP-heavy games could have the 68k (or GPU/DSP) running on a separate bank to the OP.

Though the bank interleaving feature wasn't expressly mentioned. (perhaps implied)

 

However, could it have been practical to set-up the system so that it could be used as either 2x 32-bit DRAM banks, or 1x 64-bit bank? (catering to different games: those heavily relying on the object processor likely using the 64-bit configuration)

 

Also, how would the peak bandwidth compare for 2x 32-bit banks interleaved compared to the single 64-bit bank? (the full 64-bit width is only useful for some parts of the system anyway, namely the OP and some blitter operations -the GPU is limited to a 32-bit bus for almost everything and Jerry is 16-bit, and a faster bus wouldn't matter for the 68k, though separate buses would)

Even if the configuration was fixed after booting, having that option would seem a significant advantage.

 

 

One other thing: Gorf seems to state (or imply) in several instances that Flare was forced to design the system with a separate "warm and fuzzy" general-purpose CPU; that the Tramiels (or Atari Corp. management in general) forced it into the design, and that Flare had originally considered using the RISC chips alone. However, given their previous designs, it seems likely that they'd always wanted a common, general-purpose CPU to be used, like in Flare 1 (Z80, later 8088/86), though the Flare 1 design didn't have general-purpose RISC processors like the Jag, so the context is a bit different. (Flare 1/Multisystem needed a general-purpose CPU; its DSP couldn't have substituted for one the way the Jag's RISCs potentially could)

 

Maybe part of that assumption comes from the Jaguar II, as they certainly took that approach there.

 

Also, I noticed here: http://robmoody.files.wordpress.com/2009/10/insidethejaguar.jpg that TOM and JERRY were manufactured by Motorola (I thought Toshiba had been the chip vendor, though I suppose they could have used both). Also, while JERRY is labeled "DSP" TOM is labeled "CPU" which is a bit odd, unless it was simply a misprinted G and not a C.

 

Another post from a couple pages back:

We know the chip designers were really good, but they had plenty of free time to work on Jerry while the software effort was being utterly, hopelessly, mismanaged.

 

I think the software development problem had more to do with clueless management at Atari (or "incorrect assumptions about game development trends", to be polite), not so much a lack of resources.

 

I wonder how much was poor management and how much was due to the funding issues at the time, or how the latter impacted the former... The Tramiels also had a fair amount of other assets to fall back on; digging into those to help support Atari would have been riskier -going back to the whole "betting the farm" thing. (except doing so in moderation wouldn't have been betting the entire farm, so to speak)

 

Even if they made the wrong assumptions about high-level programming, the bugs still hindered assembly programming significantly. (so if not putting more work into compilers, finding good work-arounds for assembly programming would still have been significant)

Edited by kool kitty89

For the 2 banks, other than the interleaving support, can the 2 banks be treated as separate (parallel) buses as well?

Could such a second, 16-bit bank facilitate the DSP's bus accessing, or would TOM retain priority for both banks?

In the actual Jaguar, all memory is Tom-mastered. In a hypothetical world, any arrangement is possible of course. But in understanding the Jaguar's architecture, it's helpful to understand the reasoning of the designers.

 

As you know, the designers did intend to add a second bus to Puck (Jerry II), but that's in large part due to the addition of a "real" CPU (the RCPU). The original DSP was not designed to do very much memory access -- and when used mainly as an audio synthesizer, it doesn't.

 

If you're interested in dual bus designs, it's normal for consoles to have one dedicated graphics bus (i.e., for Tom) and one dedicated CPU bus (i.e., for 68K, Jerry, etc). This would benefit certain scenarios but not the ones the Jaguar designers had in mind.

 

But the design tradeoffs are really complicated. There's never a clear winner -- even today we have Xbox 360 with a unified bus and the PS3 with a dual bus. Both have different kinds of advantages and the Jaguar designers were in the unified bus camp.

 

So to clarify, the Jaguar has ONE bus, but it can have two banks on it.

 

you use no additional board space and lose the 68k with the 68EC020

Not exactly. A lot of the board space in the Jaguar is spent on routing traces. If you use wider busses everywhere (to the 020 and the DSP), you have more traces to route and the board gets bigger. Sorry to be nitpicky but once again we're in a huge nest of subtle tradeoffs.

 

For general memory copies the page breaks end up limiting the process.

Again the weak spot we're all talking about is "texture mapping" and/or rotated sprites -- it's really the same feature, copying from sprite/texture memory to framebuffer memory. For standard and scaled sprites you can use the OP. For smooth shading and/or z-buffering you can use the blitter. (Crazyace didn't mention this but it's the main intended use case for the blitter and it's REALLY fast.) In both cases the bus utilization is very good -- these are the cases the designers focused on and they work close to optimally.

 

If you were designing the Jaguar and you knew texture mapping would be important, there are endless ways to rearrange the architecture to support it. Going to dual buses is one brute-force approach, this is exactly how the 3DO handled it. Using small on-chip buffers with the Jaguar's unified 64-bit bus is another approach that is cheaper/faster/more flexible but requires more design work.

 

However, could it have been practical to set-up the system so that it could be used as either 2x 32-bit DRAM banks, or 1x 64-bit bank?

Yes, this is practical if you mean to use 1 bus and selectable 1/2 banks. It's a very modest change to the existing controller and doesn't require any additional signals (since there are already two pairs of bank control signals for two banks).

 

Let's analyze performance: If you only care about 2D sprites and smooth shading, splitting banks like this can ONLY hurt performance. This is because you can't keep both banks busy all the time, half of your bus will usually sit idle -- whereas the blitter/OP can keep all memory at full utilization. On the other hand, if you KNOW texture mapping performance is critical, just optimize the blitter to use a single bank at full utilization!
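A crude way to put numbers on that streaming case (illustrative cycle counts, assuming one transfer per cycle per bank and perfect utilization):

```c
/* Illustration of the streaming argument above: for one big blit at
 * full utilization, a split bank can't win, because only one 32-bit
 * bank does useful work at a time while the other half sits idle. */
#include <stdio.h>

int main(void) {
    const long bytes = 1L << 20;       /* a 1 MB streaming blit     */
    long wide  = bytes / 8;            /* 1x64-bit bank, fully busy */
    long split = bytes / 4;            /* 1x32-bit bank busy, the   */
                                       /* other half of the bus idle */
    printf("1x64-bit bank : %ld cycles\n", wide);
    printf("2x32-bit banks: %ld cycles (twice as long)\n", split);
    return 0;
}
```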

 

Now, if you realize at the last minute that you made some major mistakes in the Jaguar design and texture mapping and C performance is key, well... that goes back to the earlier ideas of an extra 512KB or a 68020 or whatever you can hack in at the last minute. If you know the requirements from the beginning, you can meet those needs on a 1 bus, 1 bank, design.

 

Also, I noticed here: http://robmoody.files.wordpress.com/2009/10/insidethejaguar.jpg that TOM and JERRY were manufactured by Motorola (I thought Toshiba had been the chip vendor, though I suppose they could have used both). Also, while JERRY is labeled "DSP" TOM is labeled "CPU" which is a bit odd, unless it was simply a misprinted G and not a C.

My understanding is that they were designed and prototyped with Toshiba, and Motorola was the second source. I don't know the actual ratio of Toshiba to Motorola chips, but the netlists are clearly designed to work on both fabs.

 

I wonder how much was poor management and how much was due to the funding issues at the time

Well, to me, they're the same thing! Funding is never infinite... except in defense-related projects! ;) Management's job is to decide what's important and allocate funding appropriately. Clearly they thought the chips were VERY important and software not at all. You could go with dumber chips and better tools as an alternative (see 3DO for exactly that). Those are the management decisions.

 

I still think the project was wildly successful against all odds. Any time a company gets a product out into a market dominated by players literally 100 times their size, it's a success!

 

- KS


Also, I noticed here: http://robmoody.file...dethejaguar.jpg that TOM and JERRY were manufactured by Motorola (I thought Toshiba had been the chip vendor, though I suppose they could have used both). Also, while JERRY is labeled "DSP" TOM is labeled "CPU" which is a bit odd, unless it was simply a misprinted G and not a C.
I doubt it. The designers made it pretty clear that they considered the "real" CPU (as in "central processing") in the Jaguar to be the RISC GPU, and that the 68k was to be used only as a bootstrap.

I doubt it. The designers made it pretty clear that they considered the "real" CPU (as in "central processing") in the Jaguar to be the RISC GPU, and that the 68k was to be used only as a bootstrap.

I always wanted to know the story behind this one! Atari was QUITE clear that Tom was the CPU and the 68K was just there to 'read the joysticks'.

 

But the actual netlists refer to the 68K as the CPU and Tom's processing system as the GPU. I always found that surprising. I wonder if marketing helped push the Tom-as-CPU idea or if that's something the chip makers felt all along.

 

The dev manuals are somewhere in the middle... they usually call the 68K the CPU, but there's a couple of paragraphs downplaying its use.

 

I always got mixed messages from the dev manuals. At times it seems like the GPU was designed mainly for transform, lighting, and rasterization -- most of the discussion of the GPU is in the context of how to make it do that. Every now and then they mention in vague terms that you could do other things with it, like a real CPU.

 

There's almost a complete lack of discussion about game logic in the dev manuals -- it's all about graphical effects. That may be part of why so many people used the 68K for game logic -- there's no help and few hints in the manual about how you could run the system any other way.

 

- KS


Yes, the dev manuals raise almost as many questions as they answer, technically and otherwise.

 

I'd love to know what Mathieson and the other people involved could tell us about the history of the Jaguar. I know someone managed to contact him several years ago to get the Jagref V8 PDF file...

Edited by Zerosquare

you use no additional board space and lose the 68k with the 68EC020

Not exactly. A lot of the board space in the Jaguar is spent on routing traces. If you use wider busses everywhere (to the 020 and the DSP), you have more traces to route and the board gets bigger. Sorry to be nitpicky but once again we're in a huge nest of subtle tradeoffs.

Right, I'm not sure why I overlooked that; I was thinking about just that in the context of an additional DRAM chip as well. (actually, the number of additional traces needed for another 256Kx16 DRAM chip might not be much different from replacing the 68k with a 68EC020, perhaps even fewer for the RAM -I'm not sure on the exact pinouts)

 

Let's analyze performance: If you only care about 2D sprites and smooth shading, splitting banks like this can ONLY hurt performance. This is because you can't keep both banks busy all the time, half of your bus will usually sit idle -- whereas the blitter/OP can keep all memory at full utilization. On the other hand, if you KNOW texture mapping performance is critical, just optimize the blitter to use a single bank at full utilization!

The GPU would benefit as well, wouldn't it? (since it almost always accesses at 32-bits) And wouldn't it benefit the DSP somewhat as well?

 

Now, if you realize at the last minute that you made some major mistakes in the Jaguar design and texture mapping and C performance is key, well... that goes back to the earlier ideas of an extra 512KB or a 68020 or whatever you can hack in at the last minute. If you know the requirements from the beginning, you can meet those needs on a 1 bus, 1 bank, design.

Those changes would impact cost a lot more than supporting split/single-bank configurations though. Would such a modification to the bus controller really not have been practical to make later in the design cycle?

 

 

Clearly they thought the chips were VERY important and software not at all. You could go with dumber chips and better tools as an alternative (see 3DO for exactly that). Those are the management decisions.

The 3DO hardware was also much more expensive by comparison (especially in terms of cost relative to performance -- the Jaguar itself has a good performance-to-cost ratio in general). I'm sure Flare's design philosophy came into play for the chips though, so it wasn't entirely driven by requirements set by management. (Flare had proposed the design, not been commissioned to make it to Atari's spec, right? -unlike with the Panther)

Sony of course did both to some extent -- actually much like 3DO, but with newer hardware, more optimization, and a totally different business model. (exceeding even the normal razor-and-blades marketing of contemporaries, dumping the price rather than selling for slim profits or at cost)

 

I still think the project was wildly successful against all odds. Any time a company gets a product out into a market dominated by players literally 100 times their size, it's a success!

Atari Corp had already pulled off some similar things before though; granted, most of those that were more successful did so under better circumstances. (less stiff competition, a stronger brand name, etc)

 

 

Edit:

I doubt it. The designers made it pretty clear that they considered the "real" CPU (as in "central processing") in the Jaguar to be the RISC GPU, and that the 68k was to be used only as a bootstrap.

I always wanted to know the story behind this one! Atari was QUITE clear that Tom was the CPU and the 68K was just there to 'read the joysticks'.

If Flare had really intended (or strongly considered) having the GPU act as the system's primary CPU that would certainly tie into Gorf's comments about Atari pushing the "warm and fuzzy" thing, though I think you already mentioned there doesn't seem to be an indication that TOM was ever intended to boot the system.

Using a 68k seems like an awful waste simply to boot the system; even if it was used as a shortcut to save time on the custom chips, there should have been cheaper alternatives... (though looking at things, most cheaper, poorer-performing alternatives had other limitations, like addressing on the 8086 -if the booting CPU imposed such limitations on the entire system- not to mention 8-bit microprocessors, which would further hinder Jerry's bus accessing)

 

 

There's almost a complete lack of discussion about game logic in the dev manuals -- it's all about graphical effects. That may be part of why so many people used the 68K for game logic -- there's no help and few hints in the manual about how you could run the system any other way.

Perhaps they had planned on using the RISCs more, but the bugs prohibited it? (and they didn't invest enough in software tools to counteract that with work-arounds)

Edited by kool kitty89

One other question on Jerry though: was there anything stopping Flare from re-using the DSP from the Flare 1/Multisystem design (or a similar derivative) rather than the RISC core?

 

Had there been legal issues about ownership of the chipset (on the part of the companies Konix sold it to -and other licensees), I suppose that would have stopped things up... Though that would apply mostly to directly copying the DSP in the Slipstream (or Flare 1); at some point, modifications to the design would at least blur things legality-wise. (depending on the exact conditions of the contract with Konix too)

 

Or barring all of that, they could have gone with a new DSP design, separate from the original and the RISC MPU core in the Jaguar. (it should have made for a simpler -less flexible- chip, probably a fair bit cheaper than Jerry, though it could mean more design time -opposed to reworking things already done in TOM) A more DSP-like chip could be more optimized for dedicated functions as well, faster for certain things. (in addition to sound, things like plotting 3D coordinates, similar dual-purpose as the DSP in the multisystem)

 

That's of course, assuming licensing a DSP core, or using an off-the-shelf DSP were off the table. (or using a combination of off the shelf and custom hardware -or re-using some previous components, like from the Falcon, as mentioned earlier) If they could have gone that route, using a Motorola part might have made sense. (possibly even the same DSP as used in the Falcon -though that may not have been the best choice)

Edited by kool kitty89

The GPU would benefit as well, wouldn't it? (since it almost always accesses at 32-bits) And wouldn't it benefit the DSP somewhat as well?

I don't think the GPU would benefit much. The only benefit of two banks is when you have interleaved accesses. That is, when the system must change tasks on each memory cycle. (I.e., read from one bank, write to the other, repeat.) The Jaguar isn't generally meant to work that way except in the (non-optimized) 68K execution and texture mapping cases.

 

Normally, bus masters trade off. I.e., the OP runs for 100 transactions, then the blitter, then the GPU, etc. In this mode, having double the bandwidth allows bus masters to finish sooner. Having two banks ONLY eliminates the overhead of switching banks. It doesn't let you do two different things at once.

 

Maybe you're thinking of 2 buses (allowing two different memory transactions at once), but that's a different architecture and a big change.

 

If Flare had really intended (or strongly considered) having the GPU act as the system's primary CPU that would certainly tie into Gorf's comments about Atari pushing the "warm and fuzzy" thing, though I think you already mentioned there doesn't seem to be an indication that TOM was ever intended to boot the system.

Yeah, they never made a Tom chip that could boot. We've been over the 68K ground a lot. If it weren't for the 68K I don't know how long the Jaguar could have made it. As it is, games like AvP were massively delayed -- without the 68K could it have ever come out? It's doubtful we'd even have Tempest 2K. It just seems like a really bad plan.

 

The 68K was a crutch, but the system was crippled. It needed crutches.

 

Perhaps they had planned on using the RISCs more, but the bugs prohibited it? (and they didn't invest enough in software tools to counteract that with work-arounds)

There's a difference between designing something while "hoping" it will work out for future purposes you haven't thought through, and actually designing something for specific purposes. In my opinion, the Flare team focused on 3D smooth shaded graphics as a specific purpose, and hoped it would be good at general purpose CPU stuff. For graphics, it is pretty great. For all other uses, it's pretty crippled, but if you're really really good you can bend it to your will.

 

I feel like this thread is just one continuous rehash... maybe it will never die... ;)

 

One other question on Jerry though: was there anything stopping Flare from re-using the DSP from the Flare 1/Multisystem design (or a similar derivative) rather than the RISC core?

I think the main reason is that it's really not a very good DSP. It's certainly much more limited than Jerry's JRISC and slower too.

 

Honestly, the Jaguar has plenty of computing power. Aside from the bugs, the biggest performance limiters are system issues, especially the way the bus is optimized ONLY for cases that turned out not to be very important.

 

For example, Jerry's DSP can do anything you like, but the way it fits into the system makes it quite slow for anything but music synthesis and streaming data from CD. That was probably all they ever designed it to do, and anything else it can do is a hoped-for "bonus".

 

- KS

Edited by kskunk

The 68K was a crutch, but the system was crippled. It needed crutches.

:D

 

There's a difference between designing something while "hoping" it will work out for future purposes you haven't thought through, and actually designing something for specific purposes. In my opinion, the Flare team focused on 3D smooth shaded graphics as a specific purpose, and hoped it would be good at general purpose CPU stuff. For graphics, it is pretty great. For all other uses, it's pretty crippled, but if you're really really good you can bend it to your will.

I was mainly mentioning that in the context of claims that the design (at some point) had been intended to (or at least considered) use the RISC processor as the main CPU and that Atari forced Flare to do otherwise. (I wasn't suggesting that the 68k in and of itself was good or bad)

 

 

I think the main reason is that it's really not a very good DSP. It's certainly much more limited than Jerry's JRISC and slower too.

 

Honestly, the Jaguar has plenty of computing power. Aside from the bugs, the biggest performance limiters are system issues, especially the way the bus is optimized ONLY for cases that turned out not to be very important.

 

For example, Jerry's DSP can do anything you like, but the way it fits into the system makes it quite slow for anything but music synthesis and streaming data from CD. That was probably all they ever designed it to do, and anything else it can do is a hoped-for "bonus".

 

If not related to the Flare 1 DSP, couldn't a less general-purpose, more DSP-like processor have been more useful for dedicated sound tasks, or intended for both sound and possibly helping with 3D math as well? (or would any other processor suffer the same bus issues as Jerry?)

If you wouldn't mind humoring my other comment: would a 56000 be at all suitable for handling sound (and/or 3D math) in the system? (in place of Jerry) Though that particular chip may have had other disadvantages. (I think it was coupled with 3x 32 kB SRAM chips in the Falcon)

That in part ties into the previous comment on possibly using the Falcon I/O and sound hardware and avoiding JERRY entirely. (the DSP coupled with the DMA sound channels, rather than the DMA channels alone) Or, short of that, a DSP driving sound directly to DACs, possibly onboard a much simpler ASIC (along with the I/O hardware) compared to Jerry. (perhaps like Jerry without the RISC)

 

Other than that Motorola DSP, there may have been other, more powerful and cost-effective options (though the association with Motorola is a plus). Sega used a Samsung SSP-1601 DSP in the SVP coprocessor in Virtua Racing for the Genesis. (in that case used for not just calculating the vertices, but also rasterizing the polygons and converting the frame into the MD VDP's tile-based format -rather like the Super FX, except that was a custom RISC microprocessor; Sega licensed an off-the-shelf DSP instead, with very fast 16-bit fixed-point arithmetic)

 

 

Not to drag this on, but going back to Jerry's use primarily for sound: at least in hindsight, it seems a bit of a waste. Most, if not all, Jaguar music is MOD/tracker-based (I think there are maybe 1 or 2 cases of FM synthesis), such that the Falcon's sound hardware would have fit quite well (in several cases there's a lack of music altogether in-game, be it due to rushed development, bus limitations, or both). Or, rather than use the Falcon's hardware directly, possibly design JERRY to contain the I/O hardware and DMA audio hardware. (you mentioned uncompressed samples being a waste of cart space, but you could store compressed samples and decompress them into RAM -- granted, that means more RAM being eaten up too, depending on sample size/quality -- while the current Jerry can decode compressed samples on the fly)

Of course, one could go with FM synthesis as well as limited sample-playback support (like 1 or 2 channels with stereo), but I think most of the market was wooed more by sample-based music. (be it as complex as the SNES or wavetable sound cards, or DMA-based audio like the Amiga's)

I would not have minded the use of FM synthesis personally, with a variety of lower-cost Yamaha chips available (and licensable too, in some cases at least -Sega integrated the YM2612 into the MD's VDP by late 1992). A pair of YM2612s would have been nice (12 4-op FM channels and direct access to 2 8-bit DACs, though cutting FM to 10 channels when used, all with hard L/R/center panning), or a YMF262 like the Sound Blaster Pro 2.0/16 (and compatibles) used: 18 2-op FM channels, each with hard panning, and support for pairing to 4-op for up to 6 4-op channels and 6 2-op channels. (in that case, it requires a separate DAC, and separate digital channels) The SB-16 was the industry-standard PC sound card in the early 90s; you had others coming in, but it was the universal standard (and in some cases was preferable to the wavetable cards -in several cases I find the Ultrasound rather distasteful; Sound Canvas-based GM is usually better, though SB is still nicer in some cases IMO, or if not the real SB-16, due to its crap analog circuitry, higher-quality compatibles)

The dual YM2612 set-up is particularly nice though, and probably preferable if properly taken advantage of. (probably superior to the OPL3)

 

However, all that is rather moot if the larger portion of the mass market is more interested in sample-based music... (or if the sound engines for the FM chips were poor -which happened frequently on the Genesis, particularly with the 1st-party GEMS engine, though others, like the ones EA used, were similarly poor)

Edited by kool kitty89

If not related to the Flare 1 DSP, couldn't a less general-purpose, more DSP-like processor have been more useful for dedicated sound tasks

My point is that the way Jerry fits into the system makes it useless for most general purpose functions. The system is the bottleneck, not the specific DSP model.

 

Jerry's DSP is pretty good compared to its contemporaries. It's close to a TMS320C25 or a 56000 and better than a SuperFX or SVP. For Atari, it's obviously much cheaper than any of the above, since they own the IP already. So from that perspective it was a perfectly good idea.

 

The only advantage I can see of using some other DSP is that it might come with solid tools. The bugs in the JagRISC could have been mitigated with better tools, but Atari didn't spend much money or time on tool development.

 

And again, once you put in a new DSP, you still need to redesign the system architecture to give Jerry sufficient bandwidth for the tasks you think are important. In the Jaguar, that bandwidth is not provided because they weren't thinking about the use cases you are.

 

Not to drag this on, but going back to Jerry's use primarily for sound, at least in hindsight, it seems a bit of a waste.

I agree. They probably overestimated the value of DSP-like sound processing. They heavily marketed QSound and 3D Sound technology, but little of that was implemented.

 

At the same time, you could also say that all the abilities of the Jaguar were a "waste" because most people used 16-bit era tools, assets, code -- and most of all, 16-bit budgets. The budgets on leading "32-bit era" games by 1994 were many times what Atari was able or willing to pay, so you can't expect people to be experimenting with advanced sound and music even though the capabilities were there. 16-bit era music was much more fitting with the Jaguar's 16-bit budgets, sadly.

 

In my opinion, the Jaguar has a fairly cohesive architecture. All the parts fit together like puzzle pieces. It's not easy to put in something radically different, with the exception of the main CPU. They specifically intended that. Otherwise, they didn't leave a lot of space for alternatives.

 

Given that most of the Jaguar architecture was dreamed up in 1989, right before a huge transformation in the way games were made, it's hard to imagine why they would do things differently than they did. From a 1989 perspective, Jaguar rocks. :D

 

- KS


I doubt it. The designers made it pretty clear that they considered the "real" CPU (as in "central processing") in the Jaguar to be the RISC GPU, and that the 68k was to be used only as a bootstrap.

I always wanted to know the story behind this one! Atari was QUITE clear that Tom was the CPU and the 68K was just there to 'read the joysticks'.

 

But the actual netlists refer to the 68K as the CPU and Tom's processing system as the GPU. I always found that surprising. I wonder if marketing helped push the Tom-as-CPU idea or if that's something the chip makers felt all along.

 

The dev manuals are somewhere in the middle... they usually call the 68K the CPU, but there's a couple of paragraphs downplaying its use.

 

I always got mixed messages from the dev manuals. At times it seems like the GPU was designed mainly for transform, lighting, and rasterization -- most of the discussion of the GPU is in the context of how to make it do that. Every now and then they mention in vague terms that you could do other things with it, like a real CPU.

 

There's almost a complete lack of discussion about game logic in the dev manuals -- it's all about graphical effects. That may be part of why so many people used the 68K for game logic -- there's no help and few hints in the manual about how you could run the system any other way.

 

- KS

 

It was also my assumption that the GPU was meant to be the central processing unit... Wasn't it Atari who first suggested turning the M68K off in some document out there? They may have said that after the fact, once they discovered how much havoc the M68K wreaked on the system bus.


It was also my assumption that the GPU was meant to be the central processing unit... Wasn't it Atari who first suggested turning the M68K off in some document out there? They may have said that after the fact, once they discovered how much havoc the M68K wreaked on the system bus.

Most everything from Atari or Flare refers to the 68K as the "manager" or "conductor" or "coordinator". The idea is that the 68K performs an important (one might even say Central) processing role, but does very little of the actual work. The co-processors do all the heavy lifting and the 68K is sleeping 95% of the time.

 

This is a good idea in theory! In practice, some heavy lifting just doesn't fit anywhere in the Jaguar BUT on the 68K. This often includes "game tasks" such as AI and scene management.

 

Tom, also named the "GRAPHICS Processing Unit" by Flare, is great at offloading GRAPHICS tasks. It's not very good at offloading other kinds of tasks.

 

It's easy for me, never having implemented a particular style of game, to just assume that the whole program can fit on the GPU with some creative use of overlays. This is how Flare felt. They even use phrases like "software visible caching" to describe how you're supposed to break up your code into little bite-sized overlays in GPU assembly.
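For what it's worth, here's a host-side sketch of that overlay pattern; everything in it (names, sizes, phases) is hypothetical, and on real hardware the code bytes themselves, not just the data, would be blitted into the GPU's SRAM:

```c
/* Sketch of the overlay pattern described above: only one phase of
 * the program occupies the GPU's small local SRAM at a time, so code
 * is broken into bite-sized pieces swapped in on demand. Here each
 * overlay is a C function plus a working-set blob copied into a fixed
 * "local RAM" buffer. All names are hypothetical. */
#include <stdio.h>
#include <string.h>

#define LOCAL_RAM_SIZE 4096                /* the GPU SRAM budget */

static unsigned char local_ram[LOCAL_RAM_SIZE];

typedef void (*overlay_fn)(unsigned char *local);

struct overlay {
    const char *name;
    overlay_fn  entry;
    const void *data;                      /* working set in main RAM */
    size_t      len;                       /* must fit in local SRAM  */
};

static void load_and_run(const struct overlay *ov) {
    /* On the Jaguar this copy would be a blitter/DMA transfer the GPU
     * waits on before jumping into the freshly loaded code. */
    memcpy(local_ram, ov->data, ov->len);
    printf("overlay '%s' loaded (%zu bytes)\n", ov->name, ov->len);
    ov->entry(local_ram);
}

static void transform_phase(unsigned char *local) { (void)local; /* vertex math  */ }
static void rasterize_phase(unsigned char *local) { (void)local; /* span drawing */ }

int main(void) {
    static const int matrices[64];         /* stand-in working sets */
    static const int edges[256];
    const struct overlay frame[] = {
        { "transform", transform_phase, matrices, sizeof matrices },
        { "rasterize", rasterize_phase, edges,    sizeof edges    },
    };
    for (int i = 0; i < 2; i++)            /* one frame = each phase in turn */
        load_and_run(&frame[i]);
    return 0;
}
```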

 

There's a difference between believing, hoping, and engineering. I think the Tom-as-CPU was more of a belief/hope/desire, and Flare didn't go too far in engineering it to be that.

 

In any case, nobody at Flare and (probably?) very few people at Atari had any experience writing 90s era videogames OR building general purpose CPUs. So neither perspective was available to them. The guys at Flare WERE very good at making graphics chips and that's what they did. The RISC GPU they created resembles a conventional CPU in some ways, but many key features are neglected. I think the lack of required experience shows -- Tom is not a very good CPU.

 

Of course, anything is possible -- there are always top 1% engineers out there that can make ANY hardware, no matter how broken, sing and dance. John Carmack is one of those guys... there are few architectures more broken than DOS-based PCs of 1991. ;) But there aren't many guys like John and they aren't cheap.

 

Still, you'd have to ask the guys at Flare for the whole story.

 

- KS

Edited by kskunk

Jerry's DSP is pretty good compared to its contemporaries. It's close to a TMS320C25 or a 56000 and better than a SuperFX or SVP.

Do you know much about the Super FX GSU? I know it's a bit off topic, but I've been wondering about that for a while: was it more DSP-like, or a more general-purpose RISC MPU with some application-specific emphasis like the J-RISC?

And is the Samsung SSP-1601 really less powerful for such tasks (even for dedicated fixed-point matrix calculations, like Sony's GTE)? That's the context Sega used it in -- that and rasterization, nothing sound related.

 

The only advantage I can see of using some other DSP is that it might come with solid tools. The bugs in the JagRISC could have been mitigated with better tools, but Atari didn't spend much money or time on tool development.

That is a bit odd, and it doesn't seem to have changed even after funding improved and the importance became more apparent. (granted they had advertising to consider too)

It seems like some 3rd parties (like id) put more effort into developing software tools... (weren't they working on their own RISC compiler?)

 

I agree. They probably overestimated the value of DSP-like sound processing. They heavily marketed QSound and 3D Sound technology, but little of that was implemented.

 

At the same time, you could also say that all the abilities of the Jaguar were a "waste" because most people used 16-bit era tools, assets, code -- and most of all, 16-bit budgets. The budgets on leading "32-bit era" games by 1994 were many times what Atari was able or willing to pay, so you can't expect people to be experimenting with advanced sound and music even though the capabilities were there. 16-bit era music was much more fitting with the Jaguar's 16-bit budgets, sadly.

Yes, but it seems there were mixed goals there... On one hand, it was intended as an SNES killer and to be as low cost as possible to appeal to the mass market, as well as to include some advanced features for things like 3D rendering and smooth shading. On the sound side of things, it seems (even without hindsight) like it would have been reasonable to aim for sound that competed reasonably with the SNES's, possibly with some trade-offs (the Falcon's DMA sound did support higher sample rates and lacked forced interpolation and filtering -which can go both ways; I personally think the interpolation may have hurt as much as helped. Going for more than 8 channels would be another thing).

Plus, for the mass market, stereo would still have been the most marketable; almost no 5th-generation games supported anything more (a few N64 games support Dolby Surround I think, but that was in the late 90s/early 2000s), and plenty of lower-budget users wouldn't even be using stereo, but mono. (so specific mono support is important -either in software, in most cases, or with a dedicated mono amp, which was used for RF, of course; Sega used that in the Genesis AV port too, as games tended not to support it in software)

 

Perhaps they really thought such new high-end sound systems were going to hit it big? That may be partly hindsight, but given how low-end-consumer (and low-cost) oriented Atari's push was, things like that really don't seem that important... S-video and RGB (for EU users), yes, as contemporaries already offered those (the SNES both, and the Genesis RGB+composite).

 

If they went with a DMA-driven sound system, perhaps it would have merited a dedicated slow bank of sound RAM too (they did add that to the Jag II) to keep it off the main bus, though with the single-bus emphasis, perhaps buffering would have been utilized instead. (allowing full 64-bit bus access with DMA) Again, that's not just with re-using the Falcon's sound hardware per se, but alternatively designing their own sound+I/O ASIC.

 

Some of the other areas may not have been used to their full potential, but they were still very useful for the overall design. They could even have opted for analog inputs for sound expansion (and mixing of CD-DA, as Sega and NEC did), and not bothered with digital audio output either.

 

Given that most of the Jaguar architecture was dreamed up in 1989, right before a huge transformation in the way games were made, it's hard to imagine why they would do things differently than they did. From a 1989 perspective, Jaguar rocks. :D

Like Flare 1 rocks for 1986? Actually, that's still pretty nice for '89/90 (when it was ready). I'm repeating myself now, but it seems a shame Konix went the route they did... Atari could have really used something like that (as opposed to the Panther), or Amstrad rather than the GX4000, which was more outdated than an STe-based console would have been in '90 or '91. (plus, wasn't Flare 1 being worked on at Amstrad at some point? It predated the CPC+ too by a good margin... it would have been interesting if the Loki/Flare project had been implemented in an enhanced CPC compatible -as opposed to the originally planned Spectrum- in addition to the GX4000, which was a CPC+ derivative)

Edited by kool kitty89

I forgot one other thing... you wouldn't necessarily need a separate block of RAM for samples to have such an advantage, but a sample ROM with a reasonably broad selection of default instruments and sounds could be good, anything else being in main RAM.

That reminds me: the way Jerry is now, could you have it generate sound using its sample ROM alone (and possibly some additional samples in local RAM) without hitting the main bus at all?

 

Also, is Jerry's sample ROM storing compressed samples? (otherwise it would seem to be really tiny, only 4 kB as I recall)


That reminds me: the way Jerry is now, could you have it generate sound using its sample ROM alone (and possibly some additional samples in local RAM) without hitting the main bus at all? Also, is Jerry's sample ROM storing compressed samples? (otherwise it would seem to be really tiny, only 4 kB as I recall)

They're not really samples, just lookup tables useful for music synthesis. And yes, they're intended to allow sound and music generation without main memory access. Naturally Jerry can directly synthesize a wide variety of sounds using FM, additive, subtractive, or even speech synthesis.
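For a sense of how simple that kind of synthesis is, here's a minimal two-operator FM sketch (generic C with arbitrary parameters, writing a second of raw 16-bit PCM on a host machine; nothing Jaguar-specific about it):

```c
/* Minimal two-operator FM synthesis sketch: the kind of table-driven
 * math a DSP can run without touching main memory. Parameters are
 * arbitrary; output is raw 16-bit PCM on stdout. */
#include <math.h>
#include <stdio.h>

int main(void) {
    const double PI   = 3.14159265358979323846;
    const double rate = 32000.0;   /* output sample rate (Hz) */
    const double fc   = 440.0;     /* carrier frequency       */
    const double fm   = 220.0;     /* modulator frequency     */
    const double beta = 2.0;       /* modulation index        */

    for (int n = 0; n < (int)rate; n++) {
        double t = n / rate;
        /* classic FM: the carrier's phase is modulated by a 2nd sine */
        double s = sin(2 * PI * fc * t + beta * sin(2 * PI * fm * t));
        short pcm = (short)(s * 32000.0);
        fwrite(&pcm, sizeof pcm, 1, stdout);
    }
    return 0;
}
```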

 

Although sample-based synthesis is more bandwidth intensive, it still doesn't use much bandwidth. 4 channel MODs need less than 0.5% of available bandwidth. If that's too much for you, you can also do on-the-fly ADPCM decompression to cut that in half and save some cart space.
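That figure checks out under plain assumptions (4 channels of 8-bit samples fetched at up to ~32 kHz each, against a ~26.6 MHz x 64-bit peak bus):

```c
/* Checking the "less than 0.5%" figure. The bus clock and sample
 * rates are assumed round numbers, and peak bandwidth ignores page
 * breaks and refresh, so the real share would be a bit higher. */
#include <stdio.h>

int main(void) {
    const double bus_bytes_per_sec = 26.6e6 * 8.0;   /* ~213 MB/s peak */
    const double mod_bytes_per_sec = 4.0 * 32000.0;  /* 4 ch x 8-bit   */

    printf("MOD fetches: %.0f B/s = %.3f%% of peak bus bandwidth\n",
           mod_bytes_per_sec,
           100.0 * mod_bytes_per_sec / bus_bytes_per_sec);
    return 0;
}
```

That comes out around 0.06%, comfortably under the 0.5% figure even before ADPCM.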

 

Music and sound is just not a big deal for Jerry -- it was meant to do that and it does it well. It's the non-music stuff that gets tricky.

 

- KS

Edited by kskunk

Hmm, I wonder why there are quite a few games without in-game music then... (but with music elsewhere in the game, like in Doom and Cybermorph) Perhaps Jerry was already being monopolized for other purposes in those cases. Given the inefficiency of Jerry's bus accesses, it really seems like most tasks would be better loaded onto the GPU, even if it gets overworked and slows things down. (unless sacrificing 6x the main bus bandwidth and offloading that to the DSP is still preferable)

 

Was the relatively simplistic use of the sound system in most cases simply tied to limited software tools and limited budgets? (or to time spent focusing on graphics/logic difficulties rather than sound)

 

It would have been really neat to have seen some less common synthesis methods used -- not just sample-based synthesis (MOD seems to have been used most often), or FM synthesis for that matter (which did have some support in the SDK, as I recall) -- but something like what the MT-32 does would have been really amazing. (well, technically that's partially sample-based, but also a lot more flexible, using subtractive synthesis in particular)

Edited by kool kitty89

As far as last-minute changes go, would it have been more cost-effective to switch to a 68EC020 (the least expensive 16.7 MHz chip) or to add 512 kB as the 2nd bank? (assuming interleaving works properly in that configuration)

 

Obviously investing more into software tools would have helped a lot, but management wasn't thinking like that, and it's not foolproof either. Not providing better support on the assembly side of things at least seems a bit odd, even if high-level stuff was thought to be less important. (A hardware change like that should be relatively foolproof as long as it was made before the board design was finalized, and especially before PCB production was set up; stocking ICs wouldn't be so much of an issue.)

Edited by kool kitty89
Link to comment
Share on other sites

Obviously investing more into software tools would have helped a lot, but management wasn't thinking like that and it's not foolproof either -not providing better support on the assembly side of things at least seems a bit odd

Atari was a hardware company. They were never good at managing software development. Almost every major piece of Atari software was contracted out, from all their OSes -- 8-bit and ST and so on -- to all their tools -- 8-bit BASIC, ST linkers, assemblers, and compilers.

 

The Jaguar was no exception -- the tools were all contracted. The very best contractors in the world will take a perfect spec and deliver a working product. If your spec is flawed, the product will be flawed. That's the best you get -- most contractors do far worse. Contractors don't care about your success, they care about making money and the good ones care about meeting your spec.

 

Now look at the spec for the Jaguar RISC. It's very brief, just a list of instructions. The early specs don't mention any bugs. Atari wasn't even finding bugs. They were relying on 3rd parties to write software for the Jaguar, so the 3rd parties found the bugs. The 3rd parties didn't know what to make of them and misreported them or didn't report them.

 

The contractors who wrote the tools took the spec and created a working assembler. They got paid and went on their way.

 

In theory, this style of management seems like a great idea -- it saves lots of time and money. In practice, as soon as something goes wrong, your contractors and 3rd parties don't communicate well. Flaws are unresolved and fester. Projects slip and quality drops.

 

Good managers know this stuff. Average managers get to learn the hard way.

 

I don't mean to rag too much on Atari management. It's almost impossible for me to imagine a way they could have kept the company alive past 1996. It's amazing they lasted past 1993, it really is.

 

Hindsight makes all mistakes very clear: One mistake was not correctly guessing how gaming technology would COMPLETELY CHANGE from 1989-1995 -- and who could really? (Except for armchair managers in 2010, of course.)

 

But to many of us, it is harder to forgive the "more obvious" management mistakes, like contracting out things of core importance, such as game and tool development. Atari got away with it before but that's because their computers were plain-jane. The Jaguar was not and needed in-house software resources. The lack of in-house software talent shows in the tools, in the games, and even in the quality and bugs of the final chips.

 

Finally (to really show my ambivalent feelings about this), you could still excuse management for not knowing better: A big part of that "gaming technology revolution" of the early 90s was that tools and libraries went from "occasionally useful" to "completely essential". Software-focused companies like 3DO and Sony totally got it. Sega learned the hard way with the 32X and Saturn. And by the time of the N64 and Dreamcast everybody understood it crystal clear.

 

- KS

Edited by kskunk
  • Like 2
Link to comment
Share on other sites

Obviously investing more into software tools would have helped a lot, but management wasn't thinking like that and it's not foolproof either -not providing better support on the assembly side of things at least seems a bit odd

Atari was a hardware company. They were never good at managing software development. Almost every major piece of Atari software was contracted out, from all their OSes -- 8-bit and ST and so on -- to all their tools -- 8-bit BASIC, ST linkers, assemblers, and compilers.

Well, Atari Inc was a different case (at the very least in terms of game development), but that's not really the point, as it's a different company. (Incidentally, I found out my dad was involved in writing BASIC for the ST while at Metacomco -Amiga too, iirc- and he was actually in the background of a UK televised interview with Jack Tramiel, one of the engineers sitting in front of the computers :))

 

I don't mean to rag too much on Atari management. It's almost impossible for me to imagine a way they could have kept the company alive past 1996. It's amazing they lasted past 1993, it really is.

 

Hindsight makes all mistakes very clear: One mistake was not correctly guessing how gaming technology would COMPLETELY CHANGE from 1989-1995 -- and who could really? (Except for armchair managers in 2010, of course.)

 

But to many of us, it is harder to forgive the "more obvious" management mistakes, like contracting out things of core importance, such as game and tool development. Atari got away with it before but that's because their computers were plain-jane. The Jaguar was not and needed in-house software resources. The lack of in-house software talent shows in the tools, in the games, and even in the quality and bugs of the final chips.

 

One thing is, though, that some 3rd parties (or at least one) took it upon themselves to develop better tools, namely id... I wonder (in hindsight at least) if contracting id to produce the tool set would have been a good idea. ;) (or buying/licensing the tools they did produce -apparently they'd already developed a better-working C compiler and were working on another for the planned Quake port)

 

 

Regardless of the tools themselves, one thing they could have done was offer the dev tools freely to 3rd party developers. (that's obviously been suggested before)

 

 

 

When it comes down to it though, you're right: it's pretty impressive that they made it as far as they did. It could probably have gone a bit better, but also far worse. (Hardware improvements would come at other costs. A CD drive would attract "multimedia" titles and developers drawn to the much lower risk of cheap CD media, but then you've got a $300-400 console. If nothing else, I think pushing the test/pre-release to Europe -even if it meant initially excluding the US- might have been a good move, given that they couldn't wait 'till spring of '94 -or weren't willing to dig into private funds.) Though, in theory, the 68EC020 could have greatly facilitated game development. (Given they weren't aware of all the bugs by that point, it's harder to say they'd even know how important it might have been.)

 

To reference a previous quote:

And to really return to the thread topic, I think even radically improved technology wouldn't have saved the Jag I, but I've said as much in this thread already!

 

It pretty much comes down to the market at the time... if it wasn't for Sony's massive, aggressive powerhouse of marketing, software, and hardware (already established in Japan, a region Atari never had a remote chance at), Atari might have had a chance... (That, and Sega would have been much better off for sure -Sony forcing Sega's hand with their price dumping, exacerbating Sega's internal problems and mistakes.) But that's another ball game altogether. ;)

 

 

Actually, that mention of Sony's initial build-up in Japan before a western release reminds me again of Europe. Japanese video game companies have a tendency to release their products in Japan first, not simply because it's their native region, but because it makes a much better test market than huge markets like North America. Europe (especially a few select pro-Atari countries) could have served a similar role for Atari: an initial build-up in interest, possibly attracting some prominent EU/UK developers in the process and making for a stronger, healthier US launch a few months later. (One thing Sauron mentioned very early in this thread was that a later release would also mean Atari wouldn't have to share the spotlight with the 3DO -and the 32X didn't come until late fall of '94.)

 

They could have focused marketing more on relatively small regions (both the UK and Germany come to mind for popularity of Atari Corp products -especially the UK in terms of games; not sure about France -and numerous other countries, of course). The viral marketing strategy would also be much more viable in that region. With smaller territories also come fewer issues due to initially limited supplies of hardware. The UK seems to have been the strongest region for Atari Corp, and apparently one of the few where the Lynx had significant market penetration. (ahead of the Game Gear apparently, but still far behind the Game Boy) I'd say if they had to limit things to a single country, it would probably be the UK.

 

Atari Corp did the exact opposite though: they'd initially announced London and Paris as being included in the 1993 release, but later pulled out and didn't end up releasing the Jag in Europe until much later. (late 1994, it seems)

 

If possible, it seems it would have been better to wait at least a few more weeks and ship a more polished Cybermorph and an actually complete (or reasonably complete) Crescent Galaxy.

 

 

In that respect, though, there was the CD32 to contend with, so something to consider at least. (The Jaguar was far cheaper, and the EU/UK market was known for being rather cost-sensitive as well. Plus, the CD32's library relied fairly heavily on Amiga ports -often 16-bit ones, not AGA-specific- though granted, a fair amount of Jag titles were 16-bit ports too.)

 

 

 

 

OK, one more hardware comment: in addition to the suggestion to buffer the blitter for texture mapping in phrase mode, it was suggested (by gorf) to add a blitter command cache. (or is that referring to the same thing?)

Edited by kool kitty89
Link to comment
Share on other sites

It's also worth mentioning that in the UK we still had a stream of Atari-loyal dealers like Silica, who had actually just got into Debenhams, one of our largest department store chains, which would have been a great place to sell to mum and dad.

 

I can remember being asked on almost an hourly basis when we were getting Jags in by expectant customers. British people also tend to be very brand loyal and near enough all the people I knew who owned a Lynx also owned a 2600, 7800 or an ST.

 

Funny thing about the CD32 is that it either sold to Amiga owners who didn't want to fork out for an A1200, or it sold to people who simply didn't want to wait for the other consoles or fork out for a more expensive machine. I can remember when we got the 3DO: in my whole time at Game I sold one console, and that's it, because it was just far too expensive. In fact, the one unit I sold was to a footballer called Dean Austin (ex-Tottenham Hotspur) who bought it on the strength of FIFA and certainly wasn't short of a few bob.

 

Getting the Jag versions of games like Sensible Soccer, Kick Off 3 (unreleased but finished & lost), Graham Gooch Cricket (unreleased), and Jack Nicklaus (unreleased), and getting Championship Manager off Gremlin (who were already doing Zool 2), would have helped a great deal in selling the machine.

Link to comment
Share on other sites

They could have focused marketing more on relatively small regions (both UK and Germany come to mind for popularity of Atari Corp products -especially UK in terms of games, not sure about France -and numerous other countries, of course). The viral marketing strategy would also be much more viable in that region.

I wonder if they considered the idea of a Europe test launch. It really seems like a good idea, so I'm trying to figure out what made them "go big" in the US instead.

 

One possibility is that they were VERY focused on raising investor capital in 1993. They were just about to run out of money.

 

Maybe the financial situation forced their hand -- they had to look like they were going to make a billion to get investors excited. They did create that impression and they did get the money.

 

Could a slower/European "viral" launch still boost investor confidence enough? Their investors and creditors were mostly in the US, for better or worse.

 

OK, one more hardware comment: in addition to the suggestion to buffer the blitter for texture mapping in phrase mode, it was suggested (by gorf) to add a blitter command cache. (or is that referring to the same thing?)

They are different things. Gorf is right, command buffering helps too. They added that feature in the Jag II. (The Jag II also addressed texture mapping performance using several kilobytes of buffer RAM -- this provides more performance than a few 64-bit buffers, but is too expensive on 1993-era 0.5 micron chips.)

 

Command buffering is not related specifically to texture mapping, but to all polygons. Because the blitter can't draw a whole shaded/textured polygon, only a line at a time, you must start a new command for each line. Jag I has no command buffer, so there is a small setup delay at the start of each line.

 

With large polygons, the lines are also large, so the small setup delay doesn't add up to much proportionally. But with small polygons, the lines are short, so now a large percentage of your time is "wasted" in those small setup delays.
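
To illustrate with toy numbers (the setup delay below is invented purely for illustration, not a measured Jag I figure):

/* Toy model of per-command setup overhead. Only the proportional
   effect matters, not the absolute cycle counts. */
#include <stdio.h>

static double overhead_pct(int setup_cycles, int pixels_per_line)
{
    /* Assume, purely for illustration, one cycle per pixel drawn. */
    return 100.0 * setup_cycles / (double)(setup_cycles + pixels_per_line);
}

int main(void)
{
    int setup = 10;  /* hypothetical setup delay per blitter command */
    printf("200-pixel line: %4.1f%% of time lost to setup\n",
           overhead_pct(setup, 200));
    printf("  8-pixel line: %4.1f%% of time lost to setup\n",
           overhead_pct(setup, 8));
    /* ~4.8% vs ~55.6%: the same fixed delay dominates once polygons
       (and hence their scanlines) get small. */
    return 0;
}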

 

For the same reason, command buffering is less important if the blitter is upgraded to draw polygons instead of lines.

 

On the Jag I, you can eliminate the setup delay by letting other masters use the bus during setup. For example, it's a good time to let the object processor read the frame buffer.

 

- KS

Link to comment
Share on other sites

Funny thing about the CD32 is that it either sold to Amiga owners who didn't want to fork out for an A1200, or it sold to people who simply didn't want to wait for the other consoles or fork out for a more expensive machine. I can remember when we got the 3DO: in my whole time at Game I sold one console, and that's it, because it was just far too expensive. In fact, the one unit I sold was to a footballer called Dean Austin (ex-Tottenham Hotspur) who bought it on the strength of FIFA and certainly wasn't short of a few bob.

The 3DO also had its European (and worldwide) release a fair bit later than its US one, though still before the Jaguar, I believe, or around the same time.

 

The CD32 was already out in 1993, but given that Commodore went bankrupt the following year, that pretty much put a halt to it; being barred from the US market certainly didn't help either.

Link to comment
Share on other sites

I wonder if they considered the idea of a Europe test launch. It really seems like a good idea, so I'm trying to figure out what made them "go big" in the US instead.

Well, they had announced including London and Paris in the 1993 release previously:

http://www.atariage.com/forums/topic/132973-comprehensive-atari-jaguar-timeline-1991-2008/page__st__25

August, 1993 - Jaguar was unveiled to worldwide press. Atari announced that 50,000 units would be sold in New York, San Francisco, Paris, and London in October. With a worldwide release in 1994 & an MSRP of $200.

They ended up dropping the latter two locations. (and I don't think they had stocked anywhere near that many units by that point -and of course the price ended up being $250, with Cybermorph as the pack-in)

 

One possibility is that they were VERY focused on raising investor capital in 1993. They were just about to run out of money.

Yes, I was thinking of that too, but weren't there any British or European investors they could have attracted? (and might not a positive UK/EU release also entice some potential US investors?)

 

Maybe the financial situation forced their hand -- they had to look like they were going to make a billion to get investors excited. They did create that impression and they did get the money.

Atarian63 made a point a while back about roping in investors, and while the suggestion to launch a spin-off "Jag Corp" (public, taking advantage of the strong stock speculation at the time) may not have been entirely feasible, one other suggestion -promoting plans for internet connectivity- could have had a dramatic effect. (another thing being heavily speculated upon at the time)

Or was Atari already promoting plans for a modem or such? (I know they had the networking, but I don't recall any plans -or announcements- for internet capabilities)

Again, it didn't matter so much whether they actually went through with pushing such a feature as a key element of the system; it could still have enticed prospective investors. (If they did manage to actually pull it off -particularly given the buggy UART, unless they opted for a simple parallel connection via the cart slot like X-Band- it could have been useful for some games, particularly Doom.)

 

Could a slower/European "viral" launch still boost investor confidence enough? Their investors and creditors were mostly in the US, for better or worse.

Were there no possible EU/UK investors they could seek?

 

 

 

 

 

On hardware again: in terms of mass storage, CD drives were expensive (but CDs were cheap), HD floppies were too low-capacity (and prone to piracy, if not proprietary), and ZIP hadn't yet come out either (SuperDisk later still), but might the Floptical format have been a possibility? (using a proprietary form factor and/or file format)

I'm not entirely sure of the reliability issues, but if they persisted into the mid 90s and were write-related in nature like ZIP's, a read-only format may have been an option (read-only drives might have been cheaper too); any save data would then need to live in onboard memory and/or on memory cards. (likely EEPROM)

 

Or would a Floptical drive have still been too expensive? (Games could have had a much higher profit margin, pack-ins would be less costly, capacity would be far less limited -even room for a bit of streaming/compressed video- and distribution to reviewers would be easier. Those are all arguments for CD as well, but other than holding tons of streaming video and Red Book audio, the CD medium was nowhere near being used to capacity by games of the time, so ~20 MB wouldn't be bad at all -and compression is still an option.)

Link to comment
Share on other sites
