
It's 1993, you're in charge of the Jag, what do you do?


A_Gorilla


I think I would have made it a super hot *2D* games machine. It might not have had mass appeal, but like the Neo Geo it would have had a cult following.

 

I think that was the initial plan - but we know what happens to the best laid plans ;)

Yes the mice eat them. :lol: jk

 

What I would do would be to first make a deal with Jaguar so everyone who purchases a Jag gets a Jag free. :lol:

 

-Darren-


  • 2 weeks later...

A complete reorg wouldn't be a Jaguar, though :) - that's why I settled on the idea of just fixing the bugs and letting the machine run more as a dual-processor RISC machine with a sleeping 68000 :) - or even no 68000 at all.

Yeah, not to mention they probably wouldn't have had the time/resources for a full reorganization once the market started showing different trends. Really though, it seems like Flare must have been forced into implementing the design in an impractical manner; the 68k doesn't mesh well with the chipset in a single-bus configuration, so I can't imagine that they planned to use such a configuration originally. With a dual-bus design it would make at least some sense, but that wasn't the case. (Besides, there are cost trade-offs with a dual-bus layout vs. just using a more suitable CPU.)

Does the Jaguar even support bus interleaving with the 68k in a manner like the Amiga?

 

OK, I started thinking on this last point again. What I meant to ask referred to the Amiga's DMA set-up with its chipset able to use 50% of the bus time without the 68k taking a hit. (due to the way the 68k accesses memory)

Did the Jaguar have any such DMA control, even just for interleaving the 68k?

 

 

 

One other thing I was thinking about has to do with the Jaguar possibly being released EARLIER... The Panther was obviously a no-go, but might it have been possible to get something derived from Flare's hardware by 1992, maybe? For example, if the RISC chips had been avoided initially, work had focused primarily on completing the blitter and object processor, and an off-the-shelf sound chip had been used (as the Panther seemed to have planned, or maybe carrying over the Falcon's DMA audio hardware), perhaps cutting some other features too.

Most 3D would be out (polygon-based at least), though there's plenty of "3D" (pseudo-3D) stuff that might have worked well, like raycasting-type rendering.

It would still have been kick-ass for 2D stuff, color depth, etc., and great for scaling-type pseudo-3D games.

 

However, they'd have been steeped in the heat of the 16-bit console war with the Genesis and SNES in the North American market... Granted, that was still fairly true for late 1993 too, and the next-generation consoles followed soon after; there was little opening other than Sega's decline, and that was quickly filled by Sony anyway.

Then again, Europe (or at least certain countries) might have been more viable due to the marketing methods used there and the popularity of the Atari ST line, which gave the brand a more modern reputation than it had in the North American market. (Plus the SNES didn't launch in Europe until 1992.)


What I meant to ask referred to the Amiga's DMA set-up with its chipset able to use 50% of the bus time without the 68k taking a hit. (due to the way the 68k accesses memory)

 

Did the Jaguar have any such DMA control, even just for interleaving the 68k?

The Jaguar's bus sharing is already more efficient than the Amiga's simple 50/50 split.

 

Interleaving is actually detrimental to the Jaguar's performance. This is unfortunate, because with 5 processors, naive programming can very easily cause tons of interleaving.

 

On the Amiga, a random 16-bit DRAM access takes 280ns. At most, the Amiga's 68K wants to do an access every 560ns. So, in between each 68K access, there's time for 1 more chip access. And so we have a 50/50 split.

 

On the Jaguar, a random 16 or 64-bit DRAM access takes 188ns. The Jaguar's 68K wants to do an access every 300ns. So already, you should see that the 68K leaves no time for ANY bus sharing without slowing down.

 

The Jaguar custom chips can also take advantage of "page mode". In page mode, the first 64-bit access takes 188ns, but all following accesses take only 75ns.

 

Page mode ONLY applies if there is no interruption between accesses! In other words, interleaving disables page mode!

 

Let's say you're running the blitter alone. In some modes, the blitter can do a 64-bit transfer almost every 75ns.

 

Now let's say the blitter and 68K are splitting the bus. Each time the blitter does one 64-bit transfer (188ns), the 68K is ready to interrupt it with a 16-bit transfer (188ns). That means the blitter can only do one 64-bit transfer per 376ns (188ns * 2) -- exactly 5 times slower than the 75ns example above!
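To make the arithmetic above concrete, here's a quick back-of-the-envelope sketch (in Python) using the timing figures quoted in this post -- the numbers are taken as given, and the script is only an illustration, not a model of the real bus arbiter:

```python
# Rough bus-throughput sketch using the timings quoted above (in ns).
PAGE_MODE_ACCESS = 75      # 64-bit access while page mode is maintained
RANDOM_ACCESS = 188        # any access after page mode has been broken

# Blitter running alone, staying in page mode:
alone_mb_per_s = 8 * (1000 / PAGE_MODE_ACCESS)       # ~107 MB/s

# Blitter interleaved with the 68K: every blitter access is followed by a
# 68K access, so both become random accesses (page mode is broken each time).
interleaved_cycle = RANDOM_ACCESS + RANDOM_ACCESS    # 376 ns per 64-bit transfer
shared_mb_per_s = 8 * (1000 / interleaved_cycle)     # ~21 MB/s

print(f"blitter alone : {alone_mb_per_s:.0f} MB/s")
print(f"sharing w/ 68K: {shared_mb_per_s:.0f} MB/s "
      f"(~{alone_mb_per_s / shared_mb_per_s:.1f}x slower)")
```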

 

What have you gained by cutting the blitter performance to 20%? Not much. The 68K is running at an effective 10MHz, which is about 5% of the performance of the RISCs found in the Saturn or PSX.

 

Of course, my example is a sort of best case for the 68K... in real Jaguar games there are far more demands on the system and 68K performance is much worse.

 

- KS

Edited by kskunk

Why does it take 188 ns for an access and not 150 ns? Clock for clock, that's greater latency than the Amiga, isn't it?

 

The Amiga is at 7.16 MHz (NTSC), right? So that's ~140 ns cycles, and with the 68k taking 4 cycles, that matches the 560 ns figure (and 2 cycles is 280 ns). For the 13.3 MHz Jaguar bus, that should be 75 ns per cycle and 300 ns for the 68k's 4 cycles, so 150 ns for a 2-cycle access if it were set up like the Amiga...
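(Just to sanity-check those numbers, here's a trivial sketch; the clock figures are the ones quoted in this thread and everything is rounded:)

```python
# Cycle-time check for the bus clocks discussed above (approximate).
for name, mhz in [("Amiga bus", 7.16), ("Jaguar bus", 13.3)]:
    cycle_ns = 1000.0 / mhz                      # one bus clock, in ns
    print(f"{name}: 1 cycle = {cycle_ns:.0f} ns, "
          f"2 cycles = {2 * cycle_ns:.0f} ns, "
          f"4 cycles (one 68k access) = {4 * cycle_ns:.0f} ns")
```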

 

Does it have to do with page mode operation, or the way all 5 chips have to access the bus?

 

With the Amiga, the chips can access the bus more often than every 280 ns, but that starts cutting in on the 68k's time, right? While the 68k is halted, the chipset can access every 140 ns, can't it?


Why does it take 188 ns for an access and not 150 ns? Clock for clock, that's greater latency than the Amiga, isn't it?

Yeah, but DRAM is not logic. It has always been slow and still is. Doesn't matter how fast you clock the DRAM. (Due to the laws of physics and semiconductor processes.)

 

For example, here in 2010, we have GDDR5 DRAMs that move data at 450x the speed of the Jaguar's DRAM -- 0.16ns per transfer! And yet the random access (active to active) time is 45ns. In other words, access time barely changed in 17 years -- likewise, it had barely changed in the 8 years between the Amiga and Jaguar.
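(Plugging in the numbers quoted here makes the bandwidth-vs-latency point obvious -- a rough sketch, with the exact ratios depending on what you count as one transfer:)

```python
# Bandwidth vs. latency, using the approximate figures quoted above (ns).
jag_transfer, jag_random = 75, 188        # Jaguar: per 64-bit transfer / random access
gddr5_transfer, gddr5_random = 0.16, 45   # GDDR5:  per transfer / active-to-active

print(f"transfer time improved ~{jag_transfer / gddr5_transfer:.0f}x")   # roughly the ~450x quoted
print(f"random access improved ~{jag_random / gddr5_random:.1f}x")       # only ~4x
```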

 

Obviously modern systems use all kinds of tricks to cope with that slowness, including caching, sequential transfers (like "page mode"), and another trick, bank interleaving. Today, it's normal to have 32 banks per chip.

 

In 1993, the Jaguar DID support 2-bank interleaving, but you needed twice the DRAM chips to use it (i.e., either 1MB or 4MB). With bank interleaving, you get to keep those 75ns accesses as long as each master accesses its own bank. In this mode, the 68K (and other processors) can share the bus more fairly as long as you allocate memory carefully.
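Here's a toy model of why that allocation matters (not real Jaguar code; the 75/188 ns figures are the ones from the earlier post, and the "one master per bank keeps page mode" rule is just the simplification described above):

```python
# Toy model: one bus master doing a run of accesses while another master is active.
PAGE, RANDOM = 75, 188   # ns, figures quoted earlier in the thread

def run_time(accesses, shares_bank_with_other_master):
    if shares_bank_with_other_master:
        # Interleaved accesses to the same bank keep breaking page mode.
        return accesses * RANDOM
    # Each master in its own bank: only the first access pays the full penalty.
    return RANDOM + (accesses - 1) * PAGE

print("blitter, 100 accesses, sharing a bank with the 68K:", run_time(100, True), "ns")
print("blitter, 100 accesses, in its own bank:           ", run_time(100, False), "ns")
```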

 

A Jaguar in this configuration has much better texture mapping performance as well. The downside is that it's not cheap.

 

There's really no reason to do such expensive things: The cheapest solution is buffering and caching, and the Jaguar has quite a bit of that, just in the "wrong places" for certain types of games (namely the kind popular in 1995).

 

With the Amiga, the chips can access the bus more often than every 280 ns, but that starts cutting in on the 68k's time, right? While the 68k is halted, the chipset can access every 140 ns, can't it?

Are you saying the classic Amigas can use page mode? That doesn't seem right to me, especially since video memory is non-sequential in the Amiga (it's broken into planes, so each subsequent video or blitter access MUST be a random access). Maybe an Amiga expert can pipe up.

 

- KS

Edited by kskunk

Congratulations to this thread for lasting longer than the commercial life of the Jaguar... :P

It's the ultimate armchair quarterback thread. :D

 

But the title may as well be, 'It's 1993, despite 5 strokes and 3 heart attacks, you had a pretty good 100th birthday! Even though you're confined to a wheelchair, you can still move your left hand slightly. So, how do you plan to win the Super Bowl?'

 

In all seriousness, Atari deserves a ton of credit for making it all the way to the kickoff line before keeling over.

 

- KS

Edited by kskunk

Why does it take 188 ns for an access and not 150 ns? Clock for clock, that's greater latency than the Amiga, isn't it?

For example, here in 2010, we have GDDR5 DRAMs that move data at 450x the speed of the Jaguar's DRAM -- 0.16ns per transfer! And yet the random access (active to active) time is 45ns. In other words, access time barely changed in 17 years -- likewise, it had barely changed in the 8 years between the Amiga and Jaguar.

Oh, so it's just a bit of coincidence that the Amiga's DRAM access times mesh with the interleaving timing for the 68k? (280 ns)

 

In 1993, the Jaguar DID support 2-bank interleaving, but you needed twice the DRAM chips to use it (i.e., either 1MB or 4MB). With bank interleaving, you get to keep those 75ns accesses as long as each master accesses its own bank. In this mode, the 68K (and other processors) can share the bus more fairly as long as you allocate memory carefully.

That sounds like it might have been useful in the Jag II (on top of the additional caching). Hmm, did the CoJag use that feature with DRAM+VRAM, or was that set-up differently?

Does it require equal-sized banks, or could you have one 512 kB bank (4x 128 kB 16-bit chips) plus the 2 MB? (Obviously, interleaving would be limited to the 2x 512 kB range.)

 

The downside is that it's not cheap.

Unless you opted for only 1 MB of RAM, but then that adds all kinds of other compromises, especially for multiplatform releases; many PC games had to be whittled down and optimized to work with the limited RAM on contemporary game consoles as it was, and even then there were cases of the game engine being completely redone (like Quake on the Saturn)... though it's still a better case than the 32X. ;) (skimpy 256 kB, plus 2x 128 kB for the frame buffers)

 

There's really no reason to do such expensive things: The cheapest solution is buffering and caching, and the Jaguar has quite a bit of that, just in the "wrong places" for certain types of games (namely the kind popular in 1995).

Would gouraud shading be one of those "wrong places"?

Again, it's completely understandable that Flare wouldn't have been able to predict the ideal optimizations in the timeframe the Jag was designed, especially considering its starting point. Honestly, it could probably have been a lot worse in that respect, i.e. had it been less flexible and more optimized for those "wrong" areas. Even so, I think we've established in this discussion that some options were likely not possible given the timeframe and budget limitations, or at least Flare seemed to think so. (like using one of the RISCs as the system host and implementing a cache for it)

Didn't the texture mapping feature of the blitter get added somewhat late in the design, or at least enhanced over what it originally was?

 

I am a bit curious why ARM wasn't one of the supported CPU architectures though. x86 and 68k make sense given their popularity (and the use of both in past Flare projects), and MIPS makes sense to include too, but it really seems like ARM would have been an attractive choice to support given its low-cost orientation. (And it had been around for a while, albeit not quite as long as MIPS.) Perhaps the architecture was more difficult to support alongside the other choices, but if anything it would make more sense than MIPS in the context in which the Jag was being developed.

 

I can't seem to find the quote right now, but I know Carmack made some suggestions on the Jaguar's weak points, including one specific comment on buffering the blitter.

 

With the Amiga, the chips can access the bus more often than every 280 ns, but that starts cutting in on the 68k's time, right? While the 68k is halted, the chipset can access every 140 ns, can't it?

Are you saying the classic Amigas can use page mode? That doesn't seem right to me, especially since video memory is non-sequential in the Amiga (it's broken into planes, so each subsequent video or blitter access MUST be a random access). Maybe an Amiga expert can pipe up.

Oh... at some point I'd gotten the impression that the Amiga (and ST) were using faster DRAM than that... but if it's just 280 ns (or 250 ns for the ST), I was mistaken.

 

 

 

 

Congratulations to this thread for lasting longer than the commercial life of the Jaguar... :P

It's the ultimate armchair quarterback thread. :D

 

But the title may as well be, 'It's 1993, despite 5 strokes and 3 heart attacks, you had a pretty good 100th birthday! Even though you're confined to a wheelchair, you can still move your left hand slightly. So, how do you plan to win the Super Bowl?'

 

In all seriousness, Atari deserves a ton of credit for making it all the way to the kickoff line before keeling over.

 

:) Nice analogy there. Heh.

 

I know some comments have been made about management under Sam Tramiel in general though, not about the Jaguar specifically, going back to Jack's retirement and the decline of the ST thereafter... Obviously the North American market was going to be niche for any non-PC platform, and Commodore had their own management/marketing issues too, but it seems like Europe didn't really start pushing towards the PC until after the ST and Amiga started their decline.

 

But the only real bearing that had on the Jaguar was that Atari had no supporting product to maintain funding, which explains some of the desperation and questionable decisions made with the Jag. Though I still think it should have been apparent that pushing in Europe would be advantageous - reputation/brand recognition of the ST (and Lynx in some cases), better suited to viral marketing, etc.

 

The computers were dying off, and Atari hadn't had big video game sales since the 7800 and VCS died off around 1990 (slightly later in Europe); the Lynx was moderately successful in a few regions, but not nearly enough to be the main product. The Panther was obviously a no-go as well. An ST-derived gaming platform might have been good in the late 80s (a 1990 release or later would probably have been too late, at least in North America), but any plans for that seem to have been abandoned, perhaps in part due to plans for the Panther.

 

 

 

However, there's still that one other thing I mentioned a couple of posts ago. I know we've already been through second-guessing Flare, as they were, in most respects (other than pure hindsight), much more familiar with the engineering trade-offs, time constraints, and practical options at the time, but I do wonder if something between the Panther and the Jaguar might have been possible.

The part of the Jaguar that traces back to the Panther design is the object processor, but the Jaguar's object processor alone (much more practical thanks to its ability to work with DRAM) would probably have been a bit weak at the time, or at least a bit specialized.

 

The heart of the Jaguar design is the blitter and object processor, so again, the minimum usefulness of the system hinged on the completion of those parts. So IF, and that's a big IF, just those parts could have been completed significantly earlier than the full Jaguar (perhaps without all the features the blitter had in 1993), that might have been something - just the object processor, blitter, and a host CPU (almost certainly a 68000). Some separate I/O hardware and sound hardware should have been fine. (For the latter, the Falcon's 8-channel DMA sound hardware would probably be nice, and possibly some of the same I/O hardware used in the ST/Falcon too.)

 

Again, 1992 would be pretty late into the 4th-generation console market, less so in Europe, but the Jaguar still had pretty impressive hardware for the time, and it seems like a lot of the 2D and pseudo-3D games could have been done fairly well without the RISCs (obviously not the advanced voxel/raycasting stuff, or at least not nearly as fast or advanced). Perhaps some interesting vector-line (wireframe) based stuff could have been done, or mixed with 2D rendering too.

 

One important thing would have been Atari not being in quite as bad financial shape at the time... Then again, any other reasonably successful product would have helped similarly, especially a video game related one, to get the name back into the North American mainstream game market. (Given the time, an ST-based system was probably the only reasonable option, and probably a possibility once the BLiTTER was available; plenty of other options for sound if the STe sound hardware wasn't available.)

Edited by kool kitty89

Oh, so it's just a bit of coincidence that the Amiga's DRAM access times mesh with the interleaving timing for the 68k? (280 ns)

It was a design tradeoff. They knew contemporary DRAM could go that fast, and they knew that 280ns is a convenient time interval -- it is also known as a "color clock" -- part of the NTSC-centric timing in the Amiga. They could have bought faster or slower DRAM for a higher or lower price if needed.

 

That sounds like it might have been useful in the Jag II (on top of the additional caching). Hmm, did the CoJag use that feature with DRAM+VRAM, or was that set-up differently?

The feature still existed in the Jag II. CoJag uses both banks.

 

The bank sizes can be different. Theoretically it works to have one 16-bit bank (i.e., 5 DRAM chips, 2.5MB). That's an interesting idea that could help with texture mapping and relieve some of the 68K burden. Never thought of that one.

 

The reason you thought Amiga DRAM was faster is marketing: DRAM chips are marked by the duration of the first part of a cycle, not a full cycle. In fact, a so-called "150ns" DRAM really needs 280ns to complete a random access cycle.

 

Again, it's completely understandable that Flare wouldn't have been able to predict the ideal optimizations...

I am a bit curious why ARM wasn't one of the supported CPU architectures though...

I think you just answered your own question! In 2D games, CPU is just NOT that important. The 68K was a logical choice, and probably the cheapest that seemed to meet requirements.

 

In the type of game that dominated in 1996, CPU is important for a lot of reasons. The Flare team guessed some of those reasons (geometry transform, scan conversion) but not all of them (collisions, scene management, AI, etc). The GPU is pretty good at offloading the 68K in the cases the Flare guys knew about in 1990.

 

Didn't the texture mapping feature of the blitter get added somewhat late in the design, or at least enhanced over what it originally was?

It was there from the beginning, but it was intended for rotating sprites or drawing occasional textured billboards, NOT for drawing everything on the screen. The guys at Flare thought texture mapping was just a spice to be used sparingly, not the main course.

 

The last minute change added gouraud shaded textured polys, but that was even slower.

 

John Carmack's statement was that the blitter needed buffering. My interpretation was that a 64-bit destination buffer was needed. With such a buffer, texturing would be 2.5x faster. Not Playstation fast, but a lot more usable.

 

More buffering, such as a 64-bit source and 256-bit destination buffer, could achieve Playstation texturing performance. This would cost far less in transistors and complexity than, say, the z-buffering logic. But they instead spent all their time and energy on z-buffering, a feature the Playstation didn't even have, without considering the importance of texture mapping.

 

The hardware is very good, it's just good at different things!

 

- KS

Edited by kskunk

Oh, so it's just a bit of coincidence that the Amiga's DRAM access times mesh with the interleaving timing for the 68k? (280 ns)

It was a design tradeoff. They knew contemporary DRAM could go that fast, and they knew that 280ns is a convenient time interval -- it is also known as a "color clock" -- part of the NTSC-centric timing in the Amiga. They could have bought faster or slower DRAM for a higher or lower price if needed.

Yeah, I know it's built around the colorburst signal, just as a lot of other platforms were, especially game consoles (even the original IBM PC had its CPU clocked at 4/3 the NTSC color clock). Making systems mesh with both the PAL and NTSC color signals tends to be a bit trickier though; given the PAL Amiga's CPU speed, I'd guess that it uses a 35.47 MHz master clock (8x the PAL colorburst), corresponding to 35.8 MHz (10x the NTSC colorburst). The Master System and Mega Drive used 53.7 MHz (15x NTSC) and 53.2 MHz (12x PAL), hence the MD's slightly faster 68k. (7.67/7.6 MHz)

I think Sega's Saturn may have been the last console to conform to this. (and the PAL Saturn got stuck with 26.6 MHz CPUs to NTSC's 28.64 MHz due to the set-up they used)
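(The colorburst arithmetic is easy to check -- approximate standard burst frequencies, with the multiples being the ones mentioned in the post above:)

```python
NTSC_BURST = 3.579545   # MHz
PAL_BURST = 4.433619    # MHz

print(f"1 NTSC color clock = {1000 / NTSC_BURST:.0f} ns")                # ~279 ns, the "280 ns" slot
print(f"15 x NTSC burst = {15 * NTSC_BURST:.2f} MHz (Mega Drive, NTSC)")
print(f"12 x PAL burst  = {12 * PAL_BURST:.2f} MHz (Mega Drive, PAL)")
print(f"Mega Drive 68k  = {15 * NTSC_BURST / 7:.2f} / {12 * PAL_BURST / 7:.2f} MHz")
```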

 

The bank sizes can be different. Theoretically it works to have one 16-bit bank (i.e., 5 DRAM chips, 2.5MB). That's an interesting idea that could help with texture mapping and relieve some of the 68K burden. Never thought of that one.

I thought about that a while back, actually soon after you mentioned the Jag's 8 MB of RAM space being mapped as 2x 4 MB banks, but I didn't get around to commenting on it. ;)

In that case, the added 16-bit bank would have been most useful for the DSP and 68k to work in, wouldn't it? (it wouldn't allow the proper interleaving that 2 64-bit banks would)

In that case, you could even go down to a 128 kB DRAM chip for the 16-bit 2nd bank.

 

The reason you thought Amiga DRAM was faster is marketing: DRAM chips are marked by the duration of the first part of a cycle, not a full cycle. In fact, a so-called "150ns" DRAM really needs 280ns to complete a random access cycle.

Ah, that makes sense. For SRAM, is it as fast "as labeled" so to speak?

 

I think you just answered your own question! In 2D games, CPU is just NOT that important. The 68K was a logical choice, and probably the cheapest that seemed to meet requirements.

Yes, but: they included the custom RISC GPU, which seems to suggest that the chipset was planned to handle a fair amount of heavier, non-2D rendering. On top of that, they supported MIPS, which I think was more expensive than ARM even in the early period of the Jag's development. (Though perhaps it was more readily available from third parties, since ARM only came to licensing later.) Or perhaps it was simply easier to include MIPS alongside 68k and x86 than ARM would have been.

 

In the type of game that dominated in 1996, CPU is important for a lot of reasons. The Flare team guessed some of those reasons (geometry transform, scan conversion) but not all of them (collisions, scene management, AI, etc). The GPU is pretty good at offloading the 68K in the cases the Flare guys knew about in 1990.

1990 was the very beginning of the Jaguar design, wasn't it, or almost (probably some crossover with the Panther)? In light of that comment though, the inclusion of MIPS support seems the most contrary, though perhaps it was tied to alternate uses of the Jaguar that contributed to that decision. (Perhaps plans for use in MIPS-based workstations; of course it came in handy for the CoJag too.)

 

It was there from the beginning, but it was intended for rotating sprites or drawing occasional textured billboards, NOT for drawing everything on the screen. The guys at Flare thought texture mapping was just a spice to be used sparingly, not the main course.
That was certainly the case for a lot of polygon-based games up until '94/95 (in some cases, no textures at all, just decals, as in X-Wing and TIE Fighter, the latter a 1994 release). For non-polygon "3D" games (namely raycasting), though, texture mapping was a prominent feature; Catacomb 3D would probably be the earliest example of that (with Wolf 3D being the first big success), and Doom took that to another level. Still, for those purposes, the Jag's texture mapping seemed apt as well.

 

Interesting that what you just described matches what the ASIC in the Sega CD was intended for almost exactly. ;) (Scaling, rotation, and warping, the latter most commonly used for a textured layer similar to the SNES's Mode 7.) In that case the ASIC was hindered by the Genesis's VDP though. (DMA and VRAM limitations, plus the color depth/palette limits.) That, of course, had no shading support, though I think it could draw vector lines too. (I think Starblade used that.)

 

The last minute change added gouraud shaded textured polys, but that was even slower.

Oh, so texture mapping in CRY mode with simultaneous shading support? (otherwise the mixed RGB/CRY mode would have at least allowed 15-bit RGB textures and gouraud shaded CRY surfaces)

 

More buffering, such as a 64-bit source and 256-bit destination buffer, could achieve Playstation texturing performance. This would cost far less in transistors and complexity than, say, the z-buffering logic. But they instead spent all their time and energy on z-buffering, a feature the Playstation didn't even have, without considering the importance of texture mapping.

Did any Jag games even take advantage of Z-buffering? I know Atari Owl commented that using Z-buffering puts a damper on a lot of overall performance, especially if you want any decent amount of textures. I think the N64 used a fair amount of Z-buffering though. (Some games also seem to have used the Z-buffer representations - profiles - for low-detail objects in the distance, at the edge of the fog.)

 

It seems like Z-buffering might have been one of the more unnecessary features in the Jaguar.

 

Did the Playstation end up handling object sorting in software, or did the GPU handle it in a different way?

 

 

I wonder how much consideration Flare gave to 16/15-bit RGB gouraud shading over their custom CRY format (indexed/interpreted 24-bit RGB), though I suppose doing that would have reduced the need for 24-bit RGB support at all - well, perhaps not for 2D stuff. (Though that takes up a ton of ROM space due to the lack of index support for 24-bit RGB textures; not sure if it's any different for the object processor's sprites.)

 

Hey, I don't remember if I already got the answer to this, but can the blitter gouraud shade in 24-bit mode? (I know it can't for 15-bit RGB, and I know texture mapping isn't practical for 24-bit RGB.)

Edited by kool kitty89

In that case, the added 16-bit bank would have been most useful for the DSP and 68k to work in, wouldn't it?

It's great for texture mapping since the blitter only does 8 or 16-bit accesses in that mode.

 

Yes, but: they included the custom RISC GPU

Like I say, the GPU is very useful for the stuff they thought about -- transform, lighting, and scan conversion.

 

For the rest of the game, they probably assumed it would be like 2D games -- just "reading joysticks" and updating positions, trivial stuff. The 68K is fine for the sort of 3D game they envisioned -- not the sort that actually became popular.

 

on top of that, they supported MIPS

I don't think this was on purpose. They supported both common CPU bus standards (Motorola and Intel) or a combination. Lots of other processors use either one or the other or some combination, including MIPS.

 

1990 was the very beginning of the Jaguar design, wasn't it

The architecture must have been finalized by mid-1990, because all of the core logic files, including the GPU and blitter, are dated November 1990. The architecture in Nov 1990 is identical to the one on store shelves (shelf?) in Dec 1993. By May 1991, every major component of Tom, as we know it today, was fully implemented in logic.

 

After May 1991, the logic design slows to a crawl, with only a few bug fixes here and there. I assume they started simulation and physical layout, and probably began work on Jerry. From July-October 1992, they make a few more bug fixes -- these are the fixes for the second revision of Tom (also known as the final production chip).

 

Going by those dates, the absolute last chance to make any big design changes in Tom must have been mid-1990 or so. Minor changes like new buffers might have fit in mid-1991. After that you only get bug fixes, and only if they're trivial.

 

Did any Jag games even take advantage of Z-buffering?

Did the Playstation end up handling object sorting in software, or did the GPU handle it in a different way?

I'm pretty sure Cybermorph does. And the Playstation's Z-sorting is software-only. Turns out that's fine for most games.

 

- KS

Edited by kskunk

For the rest of the game, they probably assumed it would be like 2D games -- just "reading joysticks" and updating positions, trivial stuff. The 68K is fine for the sort of 3D game they envisioned -- not the sort that actually became popular.

Wouldn't flight sims have been some of the predicted types of games? (Though I suppose games of that genre in 1990 conformed to the limitations of contemporary systems too, and/or ran very slowly.)

 

I don't think this was on purpose. They supported both common CPU bus standards (Motorola and Intel) or a combination. Lots of other processors use either one or the other or some combination, including MIPS.

Does that mean some other architectures might have worked as well? (is ARM close enough to have possibly worked? -I'd imagine PPC would have worked -though certainly not cheap)

 

The architecture must have been finalized by mid-1990, because all of the core logic files, including the GPU and blitter, are dated November 1990. The architecture in Nov 1990 is identical to the one on store shelves (shelf?) in Dec 1993. By May 1991, every major component of Tom, as we know it today, was fully implemented in logic.

 

After May 1991, the logic design slows to a crawl, with only a few bug fixes here and there. I assume they started simulation and physical layout, and probably began work on Jerry. From July-October 1992, they make a few more bug fixes -- these are the fixes for the second revision of Tom (also known as the final production chip).

Wow, OK, so there was a lot of time where minimal work was done... That kind of stuff would be nice to include in the timeline thread from a while ago: http://www.atariage.com/forums/topic/132973-comprehensive-atari-jaguar-timeline-1991-2008/page__st__25

 

I wonder if more fixes (or better software tools) were simply limited by available funding, and/or if the Jag had been planned for an earlier release. I hadn't realized the design really dated back that far in its entirety; I'd thought a lot of work was still being done through 1992 at least, not just Jerry. I wonder if Jerry was actually planned from the beginning. (And not plans to use separate sound and I/O hardware as on the Panther.)

 

Do you know when the first silicon was produced?

 

Going by those dates, the absolute last chance to make any big design changes in Tom must have been mid-1990 or so. Minor changes like new buffers might have fit in mid-1991. After that you only get bug fixes, and only if they're trivial.

Would the MMU bugs be one of those non-trivial cases?

 

 

In light of this, it kind of adds to my rambling about a simpler Jaguar being released earlier... except that the GPU was already implemented in 1990 (and in logic by '91). It seems like they might have managed a timely 1992 release if they'd dropped Jerry entirely and used other I/O and sound hardware. (Especially if that used in the STe/Falcon would have been compatible; I'd imagine the I/O hardware used on the STe/Falcon was compatible at least, as the same logic was used for the enhanced joystick ports as on the Panther/Jaguar.)

Though if they'd aimed at a '92 release, some other compromises would likely have had to be made, like fewer bug fixes (which have various implications), not quite as high a clock speed, or omitting certain portions of the design from the actual silicon (for cost reasons too, as they likely wouldn't have been using 0.5 micron chips). If they had implemented the complete TOM, it might have been more expensive due to the added silicon required. (Though they'd also be missing JERRY entirely.)

RAM costs shouldn't have been much different.


Wouldn't flight sims have been some of the predicted types of games?

I'm not sure how to sum this up without starting a new thread. In 1989, 3D games were simple and static, with few moving objects and very simple, wide-open, scenes. Think of games like STUN Runner or Steel Talons (a flight sim), state of the art when the Jaguar project started. There is never more than a handful of moving objects on screen and they interact in very simple ways. Both games use a 68000 at 7MHz to run all game logic, and that's STILL more CPU than required!

 

Games like Crash Bandicoot, Metal Gear Solid, NiGHTs -- they don't fit this model very well. The game logic is sprawling and complex, because of the number of interactive objects and complexity of interaction -- the AI. Scene management was not well understood yet -- big open environments are easy to render, but you need TONS of CPU to organize dense environments with tons of overlapping layers behind every corner.

 

Wow, OK, so there was a lot of time where minimal work was done...

That's the wrong way to look at it. Layout and verification IS the lion's share of the work. It's a bit like saying, 'Wow, the architect was done 10 years before the skyscraper was finished... I guess minimal work was done for ten years!'

 

Like a skyscraper, you don't decide to change the shape of the building when you're 2 years into construction. You are stuck with a lot of decisions you make in the beginning, or you have to knock it over and start again.

 

Also like a skyscraper, if you have armies of people and billions of dollars you can build the chip faster, or even afford to knock over part of what you've got and rebuild from the tenth floor up. But you will run WAY over budget.

 

Do you know when the first silicon was produced?

Would the MMU bugs be one of those non-trivial cases?

Not sure, but I'm certain that the "blueprints" were finished around May 1991. Given normal development time frames I'd guess first silicon was spring or summer 1992? That is consistent with the flurry of bug fixes that started that July...

 

If you mean the RISC memory controller prefetch bug, they tried to fix that for Rev 2 and the fix had its own bug. They could have taken another six month hit or just shipped what they had at that point. As engineers say, "Stuff Happens".

 

It seems like they might have managed a timely 1992 release if they'd dropped Jerry entirely and used other I/O and sound hardware.

You have to look at the whole context of the Jaguar project. Would any software be ready? As it is, there wasn't much on launch day, and at that point developers had chips for about 18 months. How could games be ready with only 6 months of dev time?

 

They would absolutely need to target an older process, which means a major performance cut -- typically half the clock speed and half the transistors. The GPU would be a hard fit, since it is the single largest component in Tom. But without a GPU, 3D isn't even possible in Tom's architecture. Cutting scratchpad RAM does no good -- RAM is much, much, smaller than logic.

 

There's just no way to fit anything like the Jaguar design in less than 0.5 microns.

 

Luckily, the guys at Flare were naive and didn't realize how much they were biting off! Even INTEL released the PENTIUM (flagship project, army of engineers, etc) at 0.8 microns in 1993.

 

I know Flare optimistically predicted an earlier release... I also think it was just impossible. I even doubt they would have undertaken such an ambitious project so early if they had any idea how crazy they were being. ;)

 

- KS

Edited by kskunk

I'm not sure how to sum this up without starting a new thread. In 1989, 3D games were simple and static, with few moving objects and very simple, wide-open, scenes. Think of games like STUN Runner or Steel Talons (a flight sim), state of the art when the Jaguar project started. There is never more than a handful of moving objects on screen and they interact in very simple ways. Both games use a 68000 at 7MHz to run all game logic, and that's STILL more CPU than required!

Yeah, I know about those (STUN Runner and Steel Talons both used the Hard Drivin' boards), and Talons didn't come out until '91 (based on System-16.com's page, which also lists a 68010, not a 68000), as well as the contemporary 3D games on home platforms; I guess I was thinking a bit more in terms of some later games ('92-94). By '90 some of the most prominent polygon-based games were things like A-10 Tank Killer, F-15 Strike Eagle II, LHX, MechWarrior, Stellar 7, Starglider 2, etc. (plus earlier wireframe-with-hidden-line-removal ones like Elite and Starglider, with the former getting filled polygons on the Amiga/Archimedes).

So, yeah, mostly open areas with a few objects and enemies to interact with. I suppose some later games of similar genres were still somewhat like that, though games like X-Wing had missions with very tightly populated areas, and a lot of those objects were enemy craft which required AI, especially the combat craft. (And a 33 MHz 386DX minimum for acceptable playability in 1993.)

The Jaguar did indeed handle an excellent game in that category too, though late and very optimized. (That's not to say a simpler game in a similar genre couldn't have been managed much earlier than BattleSphere.)

 

Though there were other "3D" games at the time which didn't use polygons, like some Lucasfilm WWII combat flight sims (Battlehawks, Their Finest Hour, Secret Weapons of the Luftwaffe) along with the famous Wing Commander. Those types of games, with scaled/rotated bitmap objects, would probably have been ideal for the Jaguar.

 

Games like Crash Bandicoot, Metal Gear Solid, NiGHTs -- they don't fit this model very well. The game logic is sprawling and complex, because of the number of interactive objects and complexity of interaction -- the AI. Scene management was not well understood yet -- big open environments are easy to render, but you need TONS of CPU to organize dense environments with tons of overlapping layers behind every corner.

Actually, I wasn't really thinking of those kinds of games, they were just starting to get popular in '95/96 (System Shock, Tomb Raider, and Fade to Black would be some early examples).

Games with open space environments (like X-Wing) or restrictions in general, like rail shooters, would have been good examples, though both of those can also be done well with non-polygon-based graphics. (A matter of taste, style, and certain game mechanics.) I've mentioned it before, but a rail shooter in the vein of Star Fox would probably have been a better idea overall than a more complex (and problematic) game like Cybermorph. (In addition to Cybermorph itself being a bit unpolished, it's the kind of game with a steeper learning curve and less broad appeal than a simpler game like Star Fox - the latter would also have meant better framerate and draw distance.)

 

That's the wrong way to look at it. Layout and verification IS the lion's share of the work. It's a bit like saying, 'Wow, the architect was done 10 years before the skyscraper was finished... I guess minimal work was done for ten years!'

Ah, OK. :D

 

You have to look at the whole context of the Jaguar project. Would any software be ready? As it is, there wasn't much on launch day, and at that point developers had chips for about 18 months. How could games be ready with only 6 months of dev time?

Well, in that respect, it could also have meant a full release in '93 with more launch titles and such, rather than the desperate test release the Jaguar got (lacking even the promised London/Paris launches). Possibly getting some better development tools out in that time too.

In that case, the 0.5 micron thing wouldn't be an issue either, as it's still pretty much the same release date as the original. (Possibly starting production slightly earlier to allow for a decent number stocked for launch.)

 

 

How much might it have helped things if they'd never bothered with Jerry? The DSP is a lot slower than the GPU at doing anything, due to the bus access problems and its 16-bit bus, and the Falcon's 8x 16-bit DMA sound channels seem like they'd match what the Jag did in many cases, if not often exceed it. Either some more fine-tuning of TOM (maybe some additional buffering or bug fixes) or really good development tools seem like they could have been better trade-offs than building Jerry. Hell, if using the Falcon/STe's hardware ended up saving some costs, it might even have facilitated the use of a 68EC020 rather than the 68k. (Or an additional 16-bit DRAM on the second bank.)

Edited by kool kitty89

How much might it have helped things if they'd never bothered with Jerry? The DSP is a lot slower than the GPU at doing anything, due to the bus access problems and its 16-bit bus, and the Falcon's 8x 16-bit DMA sound channels seem like they'd match what the Jag did in many cases, if not often exceed it. Either some more fine-tuning of TOM (maybe some additional buffering or bug fixes) or really good development tools seem like they could have been better trade-offs than building Jerry. Hell, if using the Falcon/STe's hardware ended up saving some costs, it might even have facilitated the use of a 68EC020 rather than the 68k. (Or an additional 16-bit DRAM on the second bank.)

I wonder about the justification for Jerry. On the one hand it is very powerful on paper. On the other hand it's so hamstrung by performance bugs, it's almost impossible to tap that power -- except when synthesizing sound.

 

How many games use Jerry for non-sound, non-CD, related tasks? I thought somebody said Jerry was used in Cinepak to improve framerates, but I don't know this first hand.

 

Maybe the bugs were a surprise -- in theory Jerry can handle nearly 6x the bandwidth it actually does, but all of the required features to do this are broken or unimplemented in the Jaguar.

 

Or maybe they just believed that sound synthesis was a very important feature on its own. After all, the SNES had quite good sound synthesis and that requires DSP-like power to do. Plain old DMA audio mixing, ala Amiga/Falcon, is really limited in comparison. You can forget 3D processing and Q-Sound and Dolby and the other kind of stuff that was pushed at the time. Jerry was capable of all of that.

 

DMA audio mixing is also a pretty big waste of cartridge space. No biggie for CD games but they may have wanted something "smarter" (like the SNES) to save cart space.

 

Pretty much all the discussion of the last days seems to sum up with, "Well, they thought XXX (z-buffer, audio, etc) was important and didn't realize YYY would make a bigger difference." Alas, that's what engineering a cutting edge product is like -- it's really hard to see technology trends 4 years in the future.

 

In any case, you can't just take the chip designers working on Jerry and put them to work designing an SDK. They're totally different skills. We know the chip designers were really good, but they had plenty of free time to work on Jerry while the software effort was being utterly, hopelessly, mismanaged.

 

I think the software development problem had more to do with clueless management at Atari (or "incorrect assumptions about game development trends", to be polite), not so much a lack of resources.

 

- KS

Edited by kskunk

I wonder about the justification for Jerry. On the one hand it is very powerful on paper. On the other hand it's so hamstrung by performance bugs, it's almost impossible to tap that power -- except when synthesizing sound.

 

How many games use Jerry for non-sound, non-CD, related tasks? I thought somebody said Jerry was used in Cinepak to improve framerates, but I don't know this first hand.

 

Maybe the bugs were a surprise -- in theory Jerry can handle nearly 6x the bandwidth it actually does, but all of the required features to do this are broken or unimplemented in the Jaguar.

 

Or maybe they just believed that sound synthesis was a very important feature on its own. After all, the SNES had quite good sound synthesis and that requires DSP-like power to do. Plain old DMA audio mixing, ala Amiga/Falcon, is really limited in comparison. You can forget 3D processing and Q-Sound and Dolby and the other kind of stuff that was pushed at the time. Jerry was capable of all of that.

 

DMA audio mixing is also a pretty big waste of cartridge space. No biggie for CD games but they may have wanted something "smarter" (like the SNES) to save cart space.

- KS

 

Not to go off topic here, but I have to say the SPC format for the Super Nintendo really was a great sound format that yielded a lot of great music and sound effects. I know it was really task-specific to the SNES sound chip, but the concept behind SPC is solid, kind of similar to how the "Casio Tone Bank" keyboards work, where a small chunk of a looping sound wave is stretched and modified in realtime to make long musical notes. If only Jerry had more internal RAM to hold more data. I would like to hear more than 8 channels of sound, where only 4 are used for music and the other 4 are used for sound effects. I always felt that the Jag could do more than 8 channels of sound, considering that Jerry is a little more flexible, with more instructions than the SNES sound chip. Didn't mean to go off topic here, I just had to throw that out there. :lust:


I wonder about the justification for Jerry. On the one hand it is very powerful on paper. On the other hand it's so hamstrung by performance bugs, it's almost impossible to tap that power -- except when synthesizing sound.

And a lot of the time it's just messing with samples (playback at varying volumes and frequencies, sometimes with additional effects), though I think there might be one or two cases where software FM synthesis was used. Realtime decompression of samples and such would be useful though, especially if going for more advanced compression than ADPCM (though the SNES and Playstation used that; in fact, I believe the PSX's sound system is a derivative of the SNES's). I believe some games used the Amiga MOD format, which should hardly be taxing.
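For anyone wondering what "messing with samples" actually involves, here's a minimal software-mixing sketch -- resample a stored waveform at a variable rate, scale its volume, and accumulate it into an output buffer. It's just an illustration in Python (not Jaguar DSP code), and all of the names are made up:

```python
def mix_voice(output, sample, pitch_step, volume):
    """Accumulate one voice into `output`, resampling by `pitch_step`
    (1.0 = original rate) and scaling by `volume` (0.0 to 1.0)."""
    pos = 0.0
    for i in range(len(output)):
        if pos >= len(sample):
            break                                  # one-shot sample finished
        output[i] += sample[int(pos)] * volume     # nearest-neighbour resampling
        pos += pitch_step

# Tiny usage example: two voices mixed into a 16-sample output buffer.
out = [0.0] * 16
mix_voice(out, [0.0, 1.0, 0.0, -1.0] * 8, pitch_step=1.5, volume=0.5)
mix_voice(out, [1.0, -1.0] * 16, pitch_step=0.75, volume=0.25)
print([round(v, 2) for v in out])
```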

 

 

How many games use Jerry for non-sound, non-CD, related tasks? I thought somebody said Jerry was used in Cinepak to improve framerates, but I don't know this first hand.

I'm not sure how much of an indication it is, but a fair number of games will crash with the DSP disabled in emulation. (Project Tempest)

 

I believe Doom used the DSP for something other than sound, hence the lack of music in-game. (in spite of most/all of the music being included and played on the score screens) I think Cybermorph may use it as well. I'd imagine it would be most useful relative to the GPU for any tasks with minimal main bus accessing. (things that could fit in the scratchpad)

Otherwise, the hits to the bus are so severe that loading everything onto the GPU would seem more efficient. (Even in the case of polygons, it seems like it could be faster to have the GPU rasterize the polygons and halt the blitter while point plotting.)

 

Maybe the bugs were a surprise -- in theory Jerry can handle nearly 6x the bandwidth it actually does, but all of the required features to do this are broken or unimplemented in the Jaguar.

And they wouldn't have realized that until after it was in logic, or silicon?

 

Or maybe they just believed that sound synthesis was a very important feature on its own. After all, the SNES had quite good sound synthesis and that requires DSP-like power to do. Plain old DMA audio mixing, ala Amiga/Falcon, is really limited in comparison. You can forget 3D processing and Q-Sound and Dolby and the other kind of stuff that was pushed at the time. Jerry was capable of all of that.

The SNES's sound also had the severe limitation of having everything in its 64 kB block of RAM with no access to ROM, plus the forced interpolation and filtering (not adjustable, nor even something you could switch off). I've heard a lot of complaints about both aspects from some programmers/sound artists and critics.

Plus, there are additional effects possible with such DMA sound systems with a bit of software work. Average listeners still seem to prefer Amiga stuff to contemporary FM synthesis (there are exceptions, myself included, but not for the masses). On top of that, several games used the Amiga MOD format on the Jaguar, or something similar. (If I'm not mistaken.)

With the Falcon's sound, you've got 8 channels to the Amiga's 4, and up to 16-bit resolution and 50 kHz sample rate. (not sure about stereo capabilities, I'd think they'd offer hard panning and full pan with pairing rather than the hard wired channels of the Amiga)

 

In one such case, an MD sound artist (and programmer) complained that even with compression and splitting up banks of instruments to optimize per track, the SNES's 64 kB for audio was still too small for the sample rates he wanted to use; he also commented that the forced interpolation ruins a lot of higher-sample-rate stuff (like 22 kHz or greater). The former issue was mainly down to the SPC's inability to access ROM directly; for his MD work he had 512 kB of space dedicated to uncompressed samples, almost exclusively percussion (other instruments being done in FM synthesis).

In the context of the early/mid 90s, it probably would have been quite acceptable. They ended up facing CD audio anyway, and even the N64's sound system couldn't compete with that. (albeit the N64's was similar to the Jaguar in some respects, being CPU controlled at least)

 

DMA audio mixing is also a pretty big waste of cartridge space. No biggie for CD games but they may have wanted something "smarter" (like the SNES) to save cart space.

You mean with the compressed samples? (ADPCM cutting the size to 1/3.) Samples could be decompressed and loaded into RAM (so as to avoid performance hits from on-the-fly decompression). Otherwise you're stuck with lower-quality samples to compensate for the lack of compression. (Low sample rate/resolution.)

 

Pretty much all the discussion of the last days seems to sum up with, "Well, they thought XXX (z-buffer, audio, etc) was important and didn't realize YYY would make a bigger difference." Alas, that's what engineering a cutting edge product is like -- it's really hard to see technology trends 4 years in the future.

I can certainly see the former issue, of compromises that only look unfortunate in hindsight, but on the specific point of not having foresight into future trends:

The Jaguar didn't have a huge emphasis on 3D (especially polygonal 3D), but that makes the addition of the Z-buffer a little odd; I suppose they might not have considered software-based Z-sorting, but that would seem a bit strange given that's pretty much how all 3D games had to be done in the early 90s anyway.

And again, for the audio, it seems like they made some odd compromises, or were attempting to be more bleeding-edge than necessary. The trend at the time (late 80s) was split between sample-based and FM synthesis, with the exception of unique cases like the MT-32, and consumer response seems to have heavily favored sample-based sound (simple DMA playback driven by MODs like the Amiga, software MOD replay, or full wavetable synthesizers, like the emerging General MIDI and similar sample-based sound cards such as the Ultrasound).

 

I know a lot of people who thought the Amiga sounded "just as good" as or better than the SNES.

 

At one point they seemed interested in Ensoniq (OTIS for the Panther), but I'd imagine that would have been a more expensive option as well; plus, going back to Flare 1, they'd opted for a custom "RISC DSP" to drive audio. Perhaps they also weren't aware of what other departments at Atari Corp were developing (the Falcon being released in '92, and the limited DMA sound of the STe really would have been too weak, unless they did an array of something like 4 of what the STe used) - something they might not have discovered unless actively seeking alternate sound hardware.

 

In any case, you can't just take the chip designers working on Jerry and put them to work designing an SDK. They're totally different skills. We know the chip designers were really good, but they had plenty of free time to work on Jerry while the software effort was being utterly, hopelessly, mismanaged.

 

I think the software development problem had more to do with clueless management at Atari (or "incorrect assumptions about game development trends", to be polite), not so much a lack of resources.

Yeah, I suppose, but what was the point of including such an inefficient compiler on top of that? (was that a last minute kind of thing)

You mentioned most games being programmed in assembly through the previous generation of consoles (and home computers), but in those cases they were almost always dealing with a common architecture (Z80, 650x, 68k, etc.), or at least something fairly similar. Flare's RISC architecture was completely foreign at the time by comparison, though the GPU was intended specifically for rendering purposes, not for running game logic and such. (Otherwise, a less specialized core could have been implemented, perhaps a licensed third-party one available in 1990 - maybe something like the ARM3, though I'm not sure ARM was being licensed yet, and MIPS may not have been either.)

 

I wonder if the RISC design in the Jaguar has any relation to the custom RISC "DSP" mentioned with the Flare 1/Konix Multisystem.


I wonder if the RISC design in the Jaguar has any relation to the custom RISC "DSP" mentioned with the Flare 1/Konix Multisystem.

The instruction sets are totally different. The "DSP" in the Konix is much more microcoded/DSP-like, and much more primitive and limited. It's like programming the Reality Signal Processor in the N64. It's not for the faint of heart.

 

The JagRISCs are very general purpose, like contemporary RISCs. If it weren't for the bugs, I think the JagRISCs would be a cakewalk for any assembly programmer.

 

I think one reason people get so pissed at the JagRISCs is that they have the appearance of very stable, well-thought-out, simple and easy to use processors. It's a shock when you find that your perfectly reasonable-looking code just doesn't work, due to a dozen different timing and pipeline bugs. With microcode, nobody has those expectations. ;)

 

- KS

Edited by kskunk

I think one reason people get so pissed at the JagRISCs is that they have the appearance of very stable, well-thought-out, simple and easy to use processors. It's a shock when you find that your perfectly reasonable-looking code just doesn't work, due to a dozen different timing and pipeline bugs. With microcode, nobody has those expectations. ;)

 

I think this is a point that people tend to miss.

 

When people say the Jag is difficult to code for, they don't mean that it's hard to learn the instructions - indeed, anybody should be able to learn those quickly - it's that once you start to push the Jaguar it begins to behave oddly. There is perfectly acceptable code which just doesn't work under some circumstances, for which sometimes there seems to be no reasonable explanation. It just means that often things take a lot longer than they should, and that bug hunting can be torturous.

 

Of course with the 68k this is not a problem as it was a well thought-out chip.


Huh, wow, I was reading around about the Konix Multisystem and Flare and found something really interesting: Ben Cheese of the original Flare 1 team ended up leaving Flare after the Multisystem and designed the RISC core for Argonaut Software which became the Super FX GSU. In the meantime, John Mathieson and Martin Brennan went on to form Flare 2, designing the Panther and Jaguar chipsets. Heh, small world, but that kind of thing seems to happen a lot in the industry, or at least it did in the 80s and early 90s.

 

I was again wondering if the core used for the Super FX was at all related to the Flare 1 DSP, but given that it's often described as a more general-purpose RISC microprocessor and, indeed, was (supposedly) used to run a lot of the game logic in a couple of games, that seems rather unlikely as well. (Sega's SVP chip, on the other hand, was a licensed Samsung DSP.)

That core ended up spawning a spin-off company from Argonaut Software selling embedded RISC microprocessors (ARC International), apparently starting with direct derivatives of the core used by Nintendo.

 

 

Was the Jaguar chipset designed (engineered) by just Brennan and Mathieson, or were there other members of the Flare II team? (for the original architectural chip design)

 

 

 

In learning a bit more about Flare 1, the Panther seems a rather odd route to take following it. Flare 1 seems much more akin to the Jaguar than to the Panther (albeit the Jaguar does share some architecture with the Panther in the form of the Object Processor).

Flare 1 was designed around a blitter working in a framebuffer, with an indexed palette of 256 or 16 colors from 12-bit RGB (apparently the blitter was most efficient in 8-bit pixel mode), a RISC DSP coprocessor intended for sound and fast math calculations (as in 3D), a decent chunk of DRAM (it seems 128 kB for the framebuffer and 128 kB of main memory), plus a common CISC CPU to facilitate programming the game logic (originally a Z80, later switched to an 8086/8088).
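(Rough numbers, assuming the commonly quoted 256x200 8-bit screen mode - I haven't double-checked the exact resolution:

    256 x 200 pixels x 1 byte = 51,200 bytes (50 kB) per 8-bit frame
    x 2 for double buffering  = 100 kB, comfortably inside 128 kB of cheap DRAM
)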

That sounds much more in line with contemporary designs (in terms of ease of porting and cross-platform development) than the Panther, and also lacks the fatal flaw of SRAM dependence.

It seems like much more realistic hardware to bring to market than the Panther; it probably could have been a lot more useful for Atari than the Panther design was, and probably more useful to Atari than it was for Konix too. :P

 

In light of that comparison, the Panther really does seem a bit odd: going from a blitter-based system rendering to a 16- or 256-color framebuffer in cheap DRAM, with provisions for some 3D capability, to a design based around an object processor aimed at heavily sprite-oriented 2D graphics, relying on a limited amount of fast and expensive SRAM (which limited colors and left far too little RAM for a framebuffer), and using an architecture which didn't match contemporary consoles, home computers, or popular arcade systems.

 

It almost seems like they decided to take things in a drastically different direction, but then went back on track with the Jaguar... (the Jaguar differed in having a general-purpose RISC rather than the DSP, in including the Object Processor, and in using a single bus with lots of buffering on the custom chips, as opposed to the separate video bus on the Multisystem)

 

The only things tying all 3 designs together are the emphasis on low cost and the fact that all 3 had heavy limitations due to contention in shared memory.

Link to comment
Share on other sites

In learning a bit more about Flare 1, the Panther seems a rather odd route to take following it.

The Panther was designed at Atari before Flare was even involved. Flare was only brought in to finish implementation after the design was complete. Here's a quote from an interview with Martin Brennan about Panther:

 

The design and specification (of Panther) had already been started, and they said "somebody's left - here's the concept" and it was only the video part of the chip - there was no sound.

 

I wasn't keen on it, but I finished it and the chip was built.

 

But while I was over in California in '89, I actually convinced the bosses at Atari that 3D was the way to go, with the experience we'd gained on Flare one - if you didn't just do flat rendering, but shaded rendering you got a 3D appearance.

As you noted, the Jaguar is similar to an upgraded Flare 1 joined with an upgraded Panther-style Object Processor. This probably pleased Atari since it seemed to have the strengths of both console designs.

 

- KS

Edited by kskunk
Link to comment
Share on other sites

Oh, wow, that makes a whole lot more sense. ;)

There sure seems to be a lot of mixed information (or misinformation) regarding the Panther; a lot of articles claim that Flare "designed" or "developed" the system (the latter being closer to the truth).

Including this detailed article here on the multisystem: http://www.konixmultisystem.co.uk/index.php?id=simons_analysis

Interesting that another article on that site more accurately lists the progression from Flare 1 to Konix Multisystem to Jaguar: http://www.konixmultisystem.co.uk/index.php?id=quickfacts

 

 

BTW, where did you find that quote? Is it from a book or a magazine article, and is it available online?

 

 

It's a bit tough to get a clear timeline of what was going on, but it sounds like Flare had stopped working with Konix by '89 (or earlier), formed Flare II (with Ben Cheese leaving), and partnered with Atari. (Konix had purchased Flare's hardware outright, I believe, so by '89 the hardware must have been completed, or was being fine-tuned by other engineers.)

 

 

A previous post by Marty seems to imply that the Panther came from a 3rd party rather than in-house development, so (unless I'm misinterpreting him) I wonder who that was.

They were working on one back then, actually. That morphed into the Panther, which morphed into the Jaguar.

 

That would mean plans for an ST/STE-derived game system were abandoned in favor of the Panther, then the Jaguar, right? (with the Panther having nothing to do with the ST)

 

Concept-wise - as a more advanced 68000-based game system - it did; hardware-wise, no, they had nothing to do with each other. Curt has the memos: they were working on a new ST-based "Super XE" game system, i.e. a 68000-based game system. The idea of using the ST was dropped when the Panther design came aboard, which of course was then dropped in favor of the Jaguar design.

 

 

 

In spite of its age, the ST seems like a far better candidate for the basis of a game system, even in 1990, than the Panther - obviously an STe derivative by that point: stripped down, consolidated board, no OS, etc. (perhaps with some improvement on sound). It would obviously have been at a disadvantage color-wise (16 indexed colors, though the STe did have a broader master palette than the Genesis). Still, it would at least have been a reasonable architecture to work with, and it would have had a decent amount of RAM, unlike the Panther. Again, it might have had a better chance in Europe too.

 

An ST-derived game system would probably have been best for '88 or '89 though, before the competition became established and when the hardware was more competitive. The BLiTTER was available by '88, and the DMA sound system of the ca. 1989 STe wasn't absolutely necessary (an off-the-shelf Yamaha FM synthesis chip would probably have been apt).

Edited by kool kitty89
Link to comment
Share on other sites

An ST-derived game system would probably have been best for '88 or '89 though
I'm not sure about that. Computer-derived consoles weren't successful before the first Xbox -- remember the fate of the Amiga CD32, Amiga CDTV, Apple/Bandai Pippin and Amstrad GX4000 (OK, the last one was already quite obsolete when it was introduced).
Link to comment
Share on other sites

BTW, where did you find that quote? Is it from a book or a magazine article, and is it available online?

Same site you linked, coincidentally!

http://www.konixmultisystem.co.uk/index.php?id=interviews&content=martin

 

An ST-derived game system would probably have been best for '88 or '89 though, before the competition became established and when the hardware was more competitive. The BLiTTER was available by '88, and the DMA sound system of the ca. 1989 STe wasn't absolutely necessary (an off-the-shelf Yamaha FM synthesis chip would probably have been apt).

Such a system would need to come out in 1988 to have a chance. By 1989, it would be compared directly to the Genesis, which is launching world-wide. Compared to the Genesis, the ST blitter isn't fast enough to do overlapping backgrounds at 60 FPS, nor can it trivially support 64-183 colors on screen -- these are all standard features in the Genesis, available without fancy programming.

 

Price wars were a big deal too. The Genesis was very cheap to make, in part because of its tiny memory and very specialized architecture. The ST's architecture is more general and (typically) uses quite a bit of memory. For example, typical games would want two frame buffers -- the Genesis does away with such costly overhead.
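To put rough numbers on that (back-of-the-envelope, assuming the ST's standard 320x200 16-color mode):

    320 x 200 pixels x 4 bits = 32,000 bytes per frame buffer
    x 2 for double buffering  = 64,000 bytes of main RAM just for the display

The Genesis, by contrast, builds the whole screen out of tile maps and sprite tables held in its 64K of dedicated VRAM - there's no full frame buffer anywhere in the machine.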

 

Don't forget that the 7800 came out in 1986 and the Lynx came out in 1989. That's a lot of consoles to be juggling at one time.

 

If Atari had a SNES beater in 1990 or something, maybe it's got a shot... But neither the Panther nor the ST were SNES beaters. The Jaguar was, but... this thread is getting pretty circular. ;)

 

- KS

Link to comment
Share on other sites
