shadow460

How many "bits" do you consider the Jag and N64 to be?

Recommended Posts

The Xbox is a 32 bit system. Hell, the Xbox 360 is a 32 bit system.

Actually, the 360 is 64-bit; it uses Power970s (AKA G5s), which are 64-bit native processors.

In reality, the Jaguar was a 64 bit system. It doesn't matter what "the public" thinks. If they think that means it is way more powerful, they need to realize...

 

The Xbox is a 32 bit system. Hell, the Xbox 360 is a 32 bit system.

Sorry, the accepted definition among computer engineers is very precise, and we refer to the general-purpose CPU (in this case the 68000), not the graphics chips or specialty processors.

 

So either the Jag fanboys are living in an alternate reality or they've changed the definition.

Are we all on the same page about what "bits" means for a system, as in, the width of a processor word?

*is not insulting anyone, just making sure*

Most definitely so...

 

Before ca. 1993, this was the definition and there was no confusion. But then Atari came out with a 16-bit machine (under the accepted definition) but imprinted "64-bit" on it anyway because of a graphics chip. So instead of questioning Atari, Jaguar fanboys redefined "bits" to make Atari seem correct.

I see 2 problems with your version of history.

 

 

1. Hudson and SNK screwed with it before Atari did, with the TurboGrafX16 and NeoGeo.

Note that the TurboGrafX predates even the Sega Genesis, which is where bits are commonly recognized as becoming a major marketing point for the post-crash market, AND refers to the graphics chip. Your "accepted definition" was being ignored before it was even accepted.

 

 

 

2. The Jaguar actually doesn't HAVE a central processor, or a dedicated graphics chip. It's all processor, and all 5 processors are equals.

 

There IS a 68000 in there, which is the chip commonly identified as the CPU by people itching to define it as a 16-bit system. Officially, it was intended to boot the system and handle mundane tasks like controller polling.

 

There are also two 32-bit processors, one of which was the officially designated general processor, the other of which was officially the sound hardware.

 

And 2 64-bit ones, the "object processor" and the "blitter." Both were intended as graphics chips, but are not limited to or even required for graphics work.

 

All 5 chips, despite having official uses, are fully programmable, have no pre-defined tasks in the silicon, and can do anything.

You can literally run an entire game on the blitter, if you so desire. Or on the sound chip.

If you're particularly mad, you can even run your graphics from the 68k, your sound from the blitter, your main program from the sound chip, and do the mundane stuff with the object processor, while the "main processor" sits idle.

 

Given the system HAS NO CPU, it's rather difficult to define bittage by the CPU. Hence the Jaguar argument typically falls to system bus, as it's the one constant across the system.

This rule also gets applied to conventional hardware designs sometimes. The original IBM PC is often cited as an 8-bit machine. While the processor was 16-bit, the rest of the system was very much 8-bit.
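A minimal C sketch of that width mismatch (illustrative only; the memory contents, function names, and cycle counting are made up, not actual IBM PC code):

```c
/* Sketch: a CPU with 16-bit registers behind an 8-bit data bus, so every
 * 16-bit value costs two bus transfers. Values and names are illustrative. */
#include <stdint.h>
#include <stdio.h>

/* pretend "bus read": only 8 bits cross the bus per cycle */
static uint8_t bus_read8(const uint8_t *mem, unsigned addr, int *cycles) {
    (*cycles)++;
    return mem[addr];
}

int main(void) {
    uint8_t memory[2] = { 0x34, 0x12 };   /* little-endian 0x1234 */
    int cycles = 0;

    /* loading one 16-bit register takes two 8-bit bus cycles */
    uint16_t ax = bus_read8(memory, 0, &cycles)
                | (uint16_t)(bus_read8(memory, 1, &cycles) << 8);

    printf("AX = %04X after %d bus cycles\n", (unsigned)ax, cycles);
    return 0;
}
```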

 

If you choose to (arbitrarily) designate a Jaguar processor as the CPU and measure that, it should be the Atari-sanctioned main processor, which is a 32-bit core.

Are we all on the same page about what "bits" means for a system, as in, the width of a processor word?

*is not insulting anyone, just making sure*

Most definitely so...

 

Before ca. 1993, this was the definition and there was no confusion. But then Atari came out with a 16-bit machine (under the accepted definition) but imprinted "64-bit" on it anyway because of a graphics chip. So instead of questioning Atari, Jaguar fanboys redefined "bits" to make Atari seem correct.

I see 2 problems with your version of history.

 

 

1. Hudson and SNK screwed with it before Atari did, with the TurboGrafX16 and NeoGeo.

Note that the TurboGrafX predates even the Sega Genesis, which is where bits are commonly recognized as becoming a major marketing point for the post-crash market, AND refers to the graphics chip. Your "accepted definition" was being ignored before it was even accepted.

OK, maybe I don't know the full history of console wars, but the definition was accepted way before this. Probably even before you were born, if you were born after 1971.

 

If you choose to (arbitrarily) designate a Jaguar processor as the CPU and measure that, it should be the Atari-sanctioned main processor, which is a 32-bit core.

I'll give you 32-bit, since one is a general-purpose chip. But 64-bit is still bogus. Most modern processors, whether they be GPUs, FPUs, or even DSPs, can do general computation, as it really doesn't take much to fit the definition. You can do any computation with just 3 operations: var=var+1, var=var-1, and a conditional branch if var=0.

 

For example you could, in theory, do matrix multiplication on Intel's network processor, or you could make a working video game system out of the 2600's TIA, the RIOT and an Intel 80487 FPU. But you wouldn't want to. For that matter, Intel's FPUs are 80-bit, so by "Jaguar definitions" we shouldn't be calling our P4s 32-bit anymore.
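A tiny sketch of that three-operation claim (a counter-machine style argument; the C below uses only increment, decrement, and a branch on zero, and the function names are just for illustration):

```c
/* Addition and multiplication built from nothing but "var = var + 1",
 * "var = var - 1", and a conditional branch on zero. */
#include <stdio.h>

unsigned add(unsigned a, unsigned b) {
    while (b != 0) {      /* conditional branch on zero */
        a = a + 1;        /* var = var + 1  */
        b = b - 1;        /* var = var - 1  */
    }
    return a;
}

unsigned mul(unsigned a, unsigned b) {
    unsigned result = 0;
    while (b != 0) {
        unsigned t = a;
        while (t != 0) {
            result = result + 1;
            t = t - 1;
        }
        b = b - 1;
    }
    return result;
}

int main(void) {
    printf("3 + 4 = %u\n", add(3, 4));   /* prints 7  */
    printf("3 * 4 = %u\n", mul(3, 4));   /* prints 12 */
    return 0;
}
```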

Given the system HAS NO CPU, it's rather difficult to define bittage by the CPU. Hence the Jaguar argument typically falls to system bus, as it's the one constant across the system.

What is the bit-width of the Jag system bus?

Given the system HAS NO CPU, it's rather difficult to define bittage by the CPU. Hence the Jaguar argument typically falls to system bus, as it's the one constant across the system.

What is the bit-width of the Jag system bus?

 

It's a 64-bit* non-argument. The bus width only matters for transmitting data. Most PCs today have at least 128 bits of bus width. That doesn't make their 32-bit processors 128-bit. Neither does the 128-bit math and 256-bit bus of the GPU. What *does* make a processor 64-bit is the size of its registers and the size of its ALU. A machine as a whole isn't really 32-bit or 64-bit; the label refers to the Central Processing Unit of the machine.
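A small sketch of that register/ALU point (hypothetical, not any particular machine's code): one 64-bit bus word handled by a CPU whose registers and ALU are only 32 bits wide.

```c
/* A 64-bit value arrives in one bus transfer, but a 32-bit CPU still has to
 * hold it in two registers and add with two 32-bit ALU operations plus a
 * carry. Wide bus, narrow processor. Values are illustrative only. */
#include <stdint.h>
#include <stdio.h>

int main(void) {
    uint64_t bus_word = 0x0123456789ABCDEFULL;        /* one 64-bit bus transfer */

    uint32_t lo = (uint32_t)(bus_word & 0xFFFFFFFFu); /* register 1 */
    uint32_t hi = (uint32_t)(bus_word >> 32);         /* register 2 */

    /* 64-bit increment done as two 32-bit ALU steps with a carry */
    uint32_t add_lo = lo + 1u;
    uint32_t carry  = (add_lo < lo) ? 1u : 0u;
    uint32_t add_hi = hi + carry;

    printf("hi:lo = %08X:%08X -> +1 = %08X:%08X\n",
           (unsigned)hi, (unsigned)lo, (unsigned)add_hi, (unsigned)add_lo);
    return 0;
}
```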

 

The central processor of the Jaguar is a 32-bit Motorola 68000 processor which is complemented by a DSP and several graphical processing units. Ergo, the Jaguar is a 32-bit machine. But for marketing purposes, Atari chose to play up the 64-bit-ness of the system, which (while rather dubious) was somewhat acceptable for its time. There really weren't any major competitors with the same sort of technology. Then again, I remember some gaming system from the time (it was either the Jaguar or the TGFX-16) advertising "4-D Gaming". So Atari was not the only one selling a bag of baloney.

 

In modern machines the "bit-ness" of a CPU is becoming less and less clear as time goes on. With machines moving to SIMD architectures with 128-bit registers for DSP-like streaming of computations, the number of bits has been redefined to refer to the memory addressable by the processor. Thus a 32-bit processor can only address 4 GB of memory, while a 64-bit processor can address 2^64 bytes (several exabytes) of RAM. The Jaguar REALLY fails under that definition.
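The arithmetic behind that addressing claim, as a quick sketch (just the 2^32 vs. 2^64 numbers, nothing machine-specific):

```c
/* 2^32 bytes is 4 GiB; 2^64 bytes is 16 EiB (on the order of exabytes). */
#include <stdio.h>

int main(void) {
    double bytes32 = 4294967296.0;            /* 2^32 */
    double bytes64 = 18446744073709551616.0;  /* 2^64 */
    double gib = 1024.0 * 1024.0 * 1024.0;
    double eib = gib * 1024.0 * 1024.0 * 1024.0;

    printf("32-bit address space: %.0f bytes = %.0f GiB\n", bytes32, bytes32 / gib);
    printf("64-bit address space: %.0f bytes = %.0f EiB\n", bytes64, bytes64 / eib);
    return 0;
}
```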

 

Honestly, I don't see any reason why people feel a need to call the Jaguar the "first 64 bit processor". It was what it was. Atari advertised it as 64-bit, but the reality was more complex. I don't see why that's hard to accept.

 

* The Jaguar's bus varied from component to component. As a result, it was as little as 16 bits in some places, and as large as 64 bits in others.

In reality, the Jaguar was a 64 bit system. It doesn't matter what "the public" thinks. If they think that means it is way more powerful, they need to realize...

 

The Xbox is a 32 bit system. Hell, the Xbox 360 is a 32 bit system.

Sorry, the accepted definition among computer engineers is very precise, and we refer to the general-purpose CPU (in this case the 68000), not the graphics chips or specialty processors.

 

So either the Jag fanboys are living in an alternate reality or they've changed the definition.

 

Well, considering experienced Jag programmers use the 68k on startup only to get the RISCs going and then have the 68k shut down, that doesn't make much sense.

 

Also can you post where these definitions have been put up by the official Engineering association? Or whatever it is?

Edited by JagChris

Are we all on the same page about what "bits" means for a system, as in, the width of a processor word?

*is not insulting anyone, just making sure*

Most definitely so...

 

Before ca. 1993, this was the definition and there was no confusion. But then Atari came out with a 16-bit machine (under the accepted definition) but imprinted "64-bit" on it anyway because of a graphics chip. So instead of questioning Atari, Jaguar fanboys redefined "bits" to make Atari seem correct.

I see 2 problems with your version of history.

 

 

1. Hudson and SNK screwed with it before Atari did, with the TurboGrafX16 and NeoGeo.

Note that the TurboGrafX predates even the Sega Genesis, which is where bits are commonly recognized as becoming a major marketing point for the post-crash market, AND refers to the graphics chip. Your "accepted definition" was being ignored before it was even accepted.

OK, maybe I don't know the full history of console wars, but the definition was accepted way before this. Probably even before you were born, if you were born after 1971.

The game industry never accepted that definition, which is what I thought we were talking about.

 

If you choose to (arbitrarily) designate a Jaguar processor as the CPU and measure that, it should be the Atari-sanctioned main processor, which is a 32-bit core.

I'll give you 32-bit, since one is a general-purpose chip.

Actually, both are. They're identical cores.

 

But 64-bit is still bogus. Most modern processors, whether they be GPUs, FPUs, or even DSPs, can do general computation, as it really doesn't take much to fit the definition. You can do any computation with just 3 operations: var=var+1, var=var-1, and a conditional branch if var=0.
But the Jaguar's "AV hardware" didn't have ANY preprogrammed functions, as I understand things.

Most modern AV hardware, while capable of general computation, ALSO has hard-coded audio or video functions.

 

And no, I don't really consider this a good thing. As I understand the issue, it made the Jaguar something of a pain to work with until developers got a good set of libraries going so they didn't have to recode their basic IO handling for every game.

 

For example you could, in theory, do matrix multiplication on Intel's network processor, or you could make a working video game system out of the 2600's TIA, the RIOT and an Intel 80487 FPU. But you wouldn't want to.

True. They're much less flexible than the Jaguar parts are, though.

 

For that matter, Intel's FPUs are 80-bit, so by "Jaguar definitions" we shouldn't be calling our P4s 32-bit anymore.

Except the P4's floating-point unit can ONLY do floating-point math, and IS subservient to the rest of the P4. If it was actually a FULL processor on equal footing, it'd be a valid comparison.

 

And by SNK definitions, a dual-core P4 is actually a 64-bit part, because 32+32=64. Unless it's a 64-bit P4. Then it's 128-bit.

 

By NEC's definitions, it's dependent on the video chipset. I believe most of those today are 256-bit?

 

The central processor of the Jaguar is a 32 bit Motorola 68000 processor which is complemented by a DSP and several graphical processing units.

Again, the 68k wasn't the intended main processor.

There was a RISC core inside the "graphics chip" that was intended as the main processor. The DSP was actually a clone of this core, but the sound chip could only access half of the system bus at a time, making it a less desirable "CPU."

 

Ergo, the Jaguar is a 32-bit machine. But for marketing purposes, Atari chose to play up the 64-bit-ness of the system, which (while rather dubious) was somewhat acceptable for its time. There really weren't any major competitors with the same sort of technology. Then again, I remember some gaming system from the time (it was either the Jaguar or the TGFX-16) advertising "4-D Gaming". So Atari was not the only one selling a bag of baloney.

NEC's TurboGrafX16 set the "video chip bittage" standard.

 

Atari was probably pushing 4D gaming, if I had to guess. 3D at the time implied scaling and rotation of bitmapped objects, so being able to do polygon graphics would logically be the next step up, or 4D.

I heard Sony was claiming the PS3 was 4D at this year's E3. So that particular pile of bull feces hasn't quite died yet.

 

The TG16 wasn't even capable of 3D (in either that period's sense or the modern sense), so it's highly unlikely they would claim 4D.

 

In modern machines the "bit-ness" of a CPU is becoming less and less clear as time goes on. With machines moving to SIMD architectures with 128-bit registers for DSP-like streaming of computations, the number of bits has been redefined to refer to the memory addressable by the processor. Thus a 32-bit processor can only address 4 GB of memory, while a 64-bit processor can address 2^64 bytes (several exabytes) of RAM. The Jaguar REALLY fails under that definition.
If I recall, the designated main processor CAN address all 64 bits of the main bus, as can the other 2 processors on that chip.

 

Honestly, I don't see any reason why people feel a need to call the Jaguar the "first 64 bit processor". It was what it was. Atari advertised it as 64-bit, but the reality was more complex. I don't see why that's hard to accept.

Because people like simple answers. And the Jaguar is a nightmarish mess that defies logic in several respects.


And the Atari 8-bit line wasn't truly 8-bit either. Did any of those games for the A400 or A800 look as good as Super Mario 3... NOOOOOOOOO. So let's argue over that.

 

What you cannot argue is that the Jaguar's games sucked, were not 64-bit, and could not EVER compete with 32-bit systems like the Saturn and PS1; hell, let's throw the 3DO in there too.

 

I don't care if it had a 1,000,000 bit chip in its crap field shell; there's not one game on the Jag that I've seen that couldn't just as easily have been a SNES game, which was 16-bit. Atari played on the minds of people thinking that because they slapped 64 bit on the front, people would think the system was the shiznit, but when people saw the actual graphic capabilities and the games, they resisted, and look where it got Atari, a whole lot of bankrupt.

Edited by JagFan422

So either the Jag fanboys are living in an alternate reality or they've changed the definition.

I think we can safely guess which one of those is the case. ^_^

 

The Jag wasn't that good, guys. Yes, it had potential, and yes, it wasn't used, but the bitwidth of the ALU and system registers cannot make the games (and thus the system) not suck. With the exception of Tempest 2k.

The game industry never accepted that definition, which is what I thought we were talking about.

 

The game industry doesn't get a say in these things. Hardware is hardware is hardware. I could start calling all consoles cheese muffins because "the professional computing industry never accepted the definition of consoles", but that doesn't mean that anyone will take me seriously.

 

(MMmmmmm.... cheese muffin.....)

 

Except the P4's floating-point unit can ONLY do floating-point math, and IS subservient to the rest of the P4. If it was actually a FULL processor on equal footing, it'd be a valid comparison.

Actually, this gets back to what I was saying about modern processors. Things have gotten pretty darn fuzzy thanks to the march of modern technology. As a result, 64-bit is used to refer to memory addressing more than math or register sizes.

 

And by SNK definitions, a dual-core P4 is actually a 64-bit part, because 32+32=64. Unless it's a 64-bit P4. Then it's 128-bit.

I thought that was Atari's line?

 

32 bit + 32 bit = 64 bit "Do the Math"

 

By NEC's definitions, it's dependent on the video chipset. I believe most of those today are 256-bit?

The memory bus is usually between 256 and 512 bits for the high-end cards, but the processors themselves (AFAIK) don't go beyond 128-bit floating point precision.

 

Again, the 68k wasn't the intended main processor.

There was a RISC core inside the "graphics chip" that was intended as the main processor. The DSP was actually a clone of this core, but the sound chip could only access half of the system bus at a time, making it a less desirable "CPU."

Alright. If we accept that the embedded processor was a clone of the DSP, then you've still got a 32-bit processor. ("Jerry" was a 32-bit RISC.)

 

If I recall, the designated main processor CAN address all 64 bits of the main bus, as can the other 2 processors on that chip.

(raises eyebrow) Can you find a reference for that? I very much doubt that Atari would have wasted silicon on 64 bit memory addressing. Even the largest ROM wouldn't have needed more than 24-bit addressing, much less 32-bit addressing. 64-bit addressing would have been like lighting off a 300 megaton nuclear hydrogen bomb just to kill that fly you couldn't catch with your chopsticks.

 

Because people like simple answers.

Simple answers like that are for advertising. They have very little place in any technical argument. For those purposes, the "was it 64-bit?" question is neither here nor there. It was what it was, and its games did what they did.

 

And the Jaguar is a nightmarish mess that defies logic in several respects.

Mmm... well we agree on one point. :)

 

So basically, the Jaguar was the Intellivision of its day?

 

Not really. The Intellivision was unconventional, to be sure, but I don't see anything odd about its design. The Jaguar, on the other hand, seems like an exercise in "how many microprocessors can we get away with putting in a single system?" I imagine that Flare thought that programmers would appreciate all the general-purpose processing power they were given, but they seem to have failed to realize how incredible a burden they placed on those same developers. You'll notice that when 3D co-processors became popular, they all came with a standardized API for programming them instead of allowing the programmer to go willy-nilly with the GPU's functions. This actually improves performance on modern 3D hardware, since the driver is responsible for turning graphics commands into highly optimized GPU code.

 

Interestingly, pixel shaders may have brought us into a full circle. Whereas GPUs have traditionally been glorified Digital Signal Processors, they're now becoming more complex processors that allow arbitrary code to run onboard. It's difficult to say if this is a good thing. The complexity of the modern GPU has resulted in exponential cost increases in creating a new video game. Each game can easily require a budget the size of a blockbuster movie. And the situation is only getting worse, not better.

 

IMHO, we're just about ready for a technology change-over. Using ray-tracing technology would greatly simplify rendering engines and allow graphics designers to create scenes much more naturally than most game engines allow today. The types of polygon limitations we see today wouldn't exist in a ray-tracing system (at least not in the same way), and the Lightwave 3D/3D MAX/Bryce/Poser models could go directly into the game rather than using something more complex like a Milkshape model and custom level editor.

In reality, the Jaguar was a 64 bit system. It doesn't matter what "the public" thinks. If they think that means it is way more powerful, they need to realize...

 

The Xbox is a 32 bit system. Hell, the Xbox 360 is a 32 bit system.

Sorry, the accepted definition among computer engineers is very precise, and we refer to the general-purpose CPU (in this case the 68000), not the graphics chips or specialty processors.

 

So either the Jag fanboys are living in an alternate reality or they've changed the definition.

 

Well, considering experienced Jag programmers use the 68k on startup only to get the RISCs going and then have the 68k shut down, that doesn't make much sense.

 

Also can you post where these definitions have been put up by the official Engineering association? Or whatever it is?

According to Hennessy and Patterson, who literally wrote the book on computer architecture, to be 64-bit a system needs a designated general-purpose CPU with 64-bit registers, data pathways, ALU, and instructions that work with data 64 bits at a time.

 

I wonder why consoles don't specify bitness anymore? Is it because most people learned that their "definitions" were bogus?

There's not one game on the Jag that I've seen that couldn't just as easily have been a SNES game, which was 16-bit. Atari played on the minds of people thinking that because they slapped 64 bit on the front, people would think the system was the shiznit, but when people saw the actual graphic capabilities and the games, they resisted, and look where it got Atari, a whole lot of bankrupt.
True, but IIRC, games like Doom on the SNES required the Super FX chip on the cart, but the Jag could handle Doom without additional hardware.

The Intellivision was unconventional, to be sure, but I don't see anything odd about its design.

Then you haven't read enough about it.

 

FUN FACT: The least frequently asked question of the Blue Sky Rangers was from the guy who opened up his Intellivision and noticed that there is no data bus connection between the CPU and the GROM chip. How then, he asked, does the processor execute the EXEC instructions that are located in the GROM chip? The answer is that those instructions, when needed, are loaded from GROM into the BACKTAB locations of System RAM. The processor then executes them from RAM. Since BACKTAB normally defines what's on screen, the STIC is set to black out the video display while the instructions execute. These instructions are used to load picture definitions from the game cartridge into GRAM, which normally occurs only at RESET. This is why the screen goes black for a second when you press RESET, and why you occasionally see a flash of strange characters.

http://www.intellivisionlives.com/bluesky/...telli_tech.html

 

Yikes.

 

 

I don't care if it had a 1,000,000 bit chip in its crap field shell; there's not one game on the Jag that I've seen that couldn't just as easily have been a SNES game, which was 16-bit.

Remind us again why you called yourself "JagFan"? :ponder:

Edited by ZylonBane

FUN FACT: The least frequently asked question of the Blue Sky Rangers was from the guy who opened up his Intellivision and noticed that there is no data bus connection between the CPU and the GROM chip. How then, he asked, does the processor execute the EXEC instructions that are located in the GROM chip? The answer is that those instructions, when needed, are loaded from GROM into the BACKTAB locations of System RAM. The processor then executes them from RAM. Since BACKTAB normally defines what's on screen, the STIC is set to black out the video display while the instructions execute. These instructions are used to load picture definitions from the game cartridge into GRAM, which normally occurs only at RESET. This is why the screen goes black for a second when you press RESET, and why you occasionally see a flash of strange characters.

You call that odd? I dunno, it made a lot of sense to me. Memory was extremely limited in those old consoles, and the Intellivision just happened to be an incredibly expensive one. Reusing busses and memory was a very useful idea.

 

A similar concept was used on IBM PCs to get around the 640K limit. If a programmer needed some extra memory, he could easily overwrite the video locations in the upper memory area. Of course, this was made easier by the fact that the IBM PC had two video memory locations rather than one. (A000 was graphics and B000 was text.)

I don't care if it had a 1,000,000 bit chip in its crap field shell; there's not one game on the Jag that I've seen that couldn't just as easily have been a SNES game, which was 16-bit.

 

When you happen to write Rayman, Zero 5, Iron Soldier, Iron Soldier 2, BattleSphere, Hover Strike, Hover Strike CD, AvP, Black Ice/White Noise, World Tour Racing, CyberMorph, BattleMorph, I-War, Native or even Checkered Flag on the SNES and make them look the same or better, with higher frame rates, using the same number of polygons, shading effects and colors (and you can even use the SuperFX chip), then I'll believe you know what you're talking about. Of course, your bias towards the Jag is obvious, so I don't see why anyone should listen to anything you have to say on the subject.

 

As for it being a 64-bit system: the two processors that are 64-bit deal with drawing the screen, the RAM is 64-bit, and it has a couple of 64-bit buses on the board. They are 64-bit, so I don't see what the problem is.

the RAM is 64-bit, and it has a couple of 64-bit buses on the board. They are 64-bit, so I don't see what the problem is.

* jbanes buries his face in his hands and sighs

 

If you'll excuse me, I need to go get another cup of coffee. This is going to be a loooonnng thread.

the RAM is 64-bit, and it has a couple of 64-bit buses on the board. They are 64-bit, so I don't see what the problem is.

* jbanes buries his face in his hands and sighs

 

If you'll excuse me, I need to go get another cup of coffee. This is going to be a loooonnng thread.

 

Well really, you guys are getting too uptight about this whole thing. There is no definition of what a 64-bit game looks like. That really isn't important. Why can't we just accept that certain parts of the Jaguar system are 64-bit and just move on? It's all supposed to be about the games in the end anyways.

the RAM is 64-bit, and it has a couple of 64-bit buses on the board. They are 64-bit, so I don't see what the problem is.

* jbanes buries his face in his hands and sighs

 

If you'll excuse me, I need to go get another cup of coffee. This is going to be a loooonnng thread.

 

LOL! It's funny you didn't respond this way to this brilliant remark.

 

And the Atari 8-bit line wasn't truly 8-bit either. Did any of those games for the A400 or A800 look as good as Super Mario 3... NOOOOOOOOO. So let's argue over that.

And the Atari 8-bit line wasn't truly 8-bit either. Did any of those games for the A400 or A800 look as good as Super Mario 3... NOOOOOOOOO. So let's argue over that.

 

What you cannot argue is that the Jaguar's games sucked, were not 64-bit, and could not EVER compete with 32-bit systems like the Saturn and PS1; hell, let's throw the 3DO in there too.

"Bittage" has nothing whatsoever to do with quality.

And the Jag actually has some decent games. It's just a rather diminutive library, so one bad game is a lot more significant.

 

The Jag had some bad games, and some good ones.

So did the PS1 and Saturn. The same quite likely holds true for the 3DO; I'm just not familiar at all with its library.

 

 

I don't care if it had a 1,000,000 bit chip in its crap field shell; there's not one game on the Jag that I've seen that couldn't just as easily have been a SNES game,
*laughs*

Most of the games I've seen, rancid or not, COULDN'T be done on the SNES or Genesis.

Like... anything with shaded or texture-mapped polygons. The SNES couldn't even do FLAT polygons at a playable frame rate without assistance.

 

Atari played on the minds of people thinking that because they slapped 64 bit on the front, people would think the system was the shiznit, but when people saw the actual graphic capabilities and the games, they resisted, and look where it got Atari, a whole lot of bankrupt.
Actually, people WERE impressed by the graphics. This was an era where StarFox was the greatest thing since sliced bread. The Jag's PACK-IN blew StarFox away.

 

The gameplay was what drove people from the Jag. There just weren't enough compelling reasons to own one, especially not when the number of good games on the SNES or Genesis was larger than the Jaguar's entire library, and both systems were getting more new games than the Jag.

 

 

The game industry doesn't get a say in these things. Hardware is hardware is hardware. I could start calling all consoles cheese muffins because "the professional computing industry never accepted the definition of consoles", but that doesn't mean that anyone will take me seriously.

But people DID take the game industry seriously when they started bragging about their "bittage." Up through the Dreamcast, I believe.

 

Actually, this gets back to what I was saying about modern processors. Things have gotten pretty darn fuzzy thanks to the march of modern technology. As a result, 64-bit is used to refer to memory addressing more than math or register sizes.
Ah.

 

I thought that was Atari's line?

32 bit + 32 bit = 64 bit "Do the Math"

I didn't recall Atari arguing they were a dual-proc 32-bit system.

The "do the math" ads to me seemed to be claiming that because it was "64-bit", the Jaguar was twice as good as the 32-bit PlayStation and Saturn.

 

Atari picked the widest part to measure. SNK added the "bittage" of the individual processors (1 68k CPU + 1 Z80 sound processor).

If Atari had gone SNK's route, the Jag would be a 32+32+64+64+16 = 208-bit system. Or, if we discount the object processor and blitter, 32+32+16 = 80-bit.

 

By NEC's definitions, it's dependent on the video chipset. I believe most of those today are 256-bit?

The memory bus is usually between 256 and 512 bits for the high-end cards, but the processors themselves (AFAIK) don't go beyond 128-bit floating point precision.

M'kay. I haven't paid very close attention to what they're doing lately.

Last I remember bit-wise was the original GeForce bragging about being 256-bit something-or-other.

 

If I recall, the designated main processor CAN address all 64 bits of the main bus, as can the other 2 processors on that chip.

(raises eyebrow) Can you find a reference for that? I very much doubt that Atari would have wasted silicon on 64 bit memory addressing. Even the largest ROM wouldn't have needed more than 24-bit addressing, much less 32-bit addressing. 64-bit addressing would have been like lighting off a 300 megaton nuclear hydrogen bomb just to kill that fly you couldn't catch with your chopsticks.

Let me see...

http://db.gamefaqs.com/console/jaguar/file/atari_jaguar.txt

Copy/paste, just like they did from usenet way back when.

 

Everything in Tom has a 64-bit-wide bus.

But it uses 24-bit memory addressing. My memory was a bit fuzzy on that point, so sorry if I misrepresented it.

 

According to Jaguar designer John Mathieson, "Jaguar has a 64-bit memory interface to get a high bandwidth out of cheap DRAM. ..."

So it IS a 64-bit bus, but that was so they could use cheap RAM chips.
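A back-of-envelope illustration of why a wide bus buys bandwidth from slow, cheap DRAM (the clock figure below is a placeholder, not the Jaguar's actual rate):

```c
/* Peak transfer rate scales with bus width at a fixed clock, which is the
 * whole point of the 64-bit memory interface quoted above. Hypothetical clock. */
#include <stdio.h>

int main(void) {
    double clock_hz = 25.0e6;              /* placeholder bus clock */
    int widths[] = { 16, 32, 64 };

    for (int i = 0; i < 3; i++) {
        double peak = clock_hz * (widths[i] / 8.0);   /* bytes per second */
        printf("%2d-bit bus @ %.0f MHz -> %.0f MB/s peak\n",
               widths[i], clock_hz / 1e6, peak / 1e6);
    }
    return 0;
}
```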

 

Because people like simple answers.

Simple answers like that are for advertising. They have very little place in any technical argument. For those purposes, the "was it 64-bit?" question is neither here nor there. It was what it was, and its games did what they did.

I agree. But the technical arguments crop up because people want to know.

They've seen it has a 68000, like the Genesis. They know the 68000 is considered a 16-bit chip. So they either want to know if they missed something, or they argue that the Jaguar is actually a 16-bit machine because their mis-identified CPU is 16-bit.

 

 

Personally, I'd argue for a 32-bit design goal, and a functional 16-bit if the programmer used the 68k as the main processor (which many did, because they already knew it well, and probably also because it made ports from the Genesis easy).

 

But I can see the reasons they argued 64-bit. NEC had made it an acceptable paradigm within the game industry, and it let them smack the 3DO around (which never actually became relevant, but...).

 

The Jaguar, on the other hand, seems like an exercise in "how many microprocessors can we get away with putting in a single system?" I imagine that Flare thought that programmers would appreciate all the general-purpose processing power they were given, but they seem to have failed to realize how incredible a burden they placed on those same developers. You'll notice that when 3D co-processors became popular, they all came with a standardized API for programming them instead of allowing the programmer to go willy-nilly with the GPU's functions. This actually improves performance on modern 3D hardware, since the driver is responsible for turning graphics commands into highly optimized GPU code.
I think it was intended as "future-proofing."

The Jaguar was thrown out during a lot of paradigm transitions.

They'd just seen the SNES and Genesis become dated. Sprite scaling and rotation had become a major feature, and neither system could do it in hardware. The SNES could do bitmap scaling and rotation, but only to one background layer and at the expense of all other background layers.

At design time, bitmap scaling and rotation was fading in favor of polygon rendering.

 

Who knew what was coming next?

So they could design a graphics chip with hard-coded features, and risk it becoming obsolete within a year or 2 of hitting the market, or use a pair of "high-performance" processors and let the programmer put the functions he needed in the hardware.

 

Of course, things stabilized after polygons hit. We went from flat polys to shaded polys to textured polys and sat for a while.

The Jag used shaded polys at launch, and texture-mapping could probably be seen as a relevant feature. That would've put it on a more or less even footing with the PS1 and Saturn, feature-wise.

 

 

Even then, they could've hard-coded SOME functions and left hooks to "add" functions in RAM later. That way it wouldn't have been nearly as big a pain to work with (and there'd be clearly-defined uses in the hardware for all the processors) AND they'd have future-proofing.

 

Interestingly, pixel shaders may have brought us into a full circle. Whereas GPUs have traditionally been glorified Digital Signal Processors, they're now becoming more complex processors that allow arbitrary code to run onboard. It's difficult to say if this is a good thing. The complexity of the modern GPU has resulted in exponential cost increases in creating a new video game. Each game can easily require a budget the size of a blockbuster movie. And the situation is only getting worse, not better.

Yeah. I think the Radeon demo where someone ran Frogger entirely in pixel shaders was where things got visibly out of hand.

 

 

It IS kind of funny that the modern PC hardware is emulating the Jaguar, though.

Aside from the video cards exploding with the introduction of pixel shaders, Creative's got the X-Fi, which makes the sound card into a separate computer; and Ageia just rolled out a physics processor board, which adds ANOTHER subcomputer.

 

Of course, the PC architecture makes them all distinctly subservient, though video cards get special treatment (at least, they did under AGP; I'm not sure about PCIe graphics cards).

The Jaguar, on the other hand, seems like an exercise in "how many microprocessors can we get away with putting in a single system?"

 

Oppressor said that BattleSphere may have looked prettier on the PlayStation, but the gameplay would have suffered. The multiple processors on the Jaguar allowed him to do stuff with physics and AI that weren't possible on the original PlayStation. He says he didn't feel utterly outgunned until Halo came out.

 

And the Jag's data bus is 64 bits.

 

Edit: sorry, I didn't notice my typos until recently. Whoops.

Edited by JagChris


Good grief. This argument has been done to death many times over.

 

Whether the Jag is "a 64-bit console" (whatever that means) or not is neither here nor there. It is a matter of definition - semantics, nothing more. It's like arguing about whether the 7800 is really a "ProSystem", or the 5200 is a "SuperSystem".

There IS a 68000 in there, which is the chip commonly identified as the CPU by people itching to define it as a 16-bit system. Officially, it was intended to boot the system and handle mundane tasks like controller polling.

 

Actually, it's quite often used as the main logic CPU. Dedicated routines in the GPU and DSP are then triggered when needed. You see better performance the less you use the 68000, but you increase the development time with custom code modules that must be swapped.

 

Anyway, here is the definitive answer: The Jaguar is a hybrid system that uses a central 64-bit data bus. It is not a pure 64-bit system, but is rather a good illustration of how meaningless these marketing designations are.

 

-Bry

Edited by Bryan


Which bit generation would I clump the Jag in? It goes into a generation that wasn't defined by bits as much as it was by failure: the tweener gen. So, assuming I actually still had the system when classifying it to a friend, I would just forgo the bits and tell them it was a tweener.

Edited by sega saturn x

