
Elegant Architecture or Speed?


rpiguy9907


I used to be an adamant defender of the 68000 architecture and as a kid could rattle off all the ways in which it was superior, but now that I am much older I see that it lacked something potentially more important - speed.

 

The 68000 had just terrible IPC: long instructions took several cycles to fetch over the 16-bit bus, and most instructions needed eight cycles or more to complete. It did not have to be this way; Motorola could have come up with a more efficient instruction encoding.

 

Yes it was much faster than the 8088, but on 16-bit code it was barely the same speed as an 8086, and consistently fell behind the 286. Yes the 286 was five years newer and had twice the transistors, but the 68000 was just picking up steam when the 286 came out and they competed directly.

 

It was even worse for the 68000 if you needed floating point, as there was no official FPU until the 68881.

 

You would think the 68000 would always win on 32-bit integer math, but the 286 had a hardware multiplier and divider that negated that advantage, often by a lot.
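To put that in concrete terms: neither chip could do a full 32x32 multiply in one instruction back then (the 68000's MULU and the 286's MUL both take 16-bit operands), so compilers had to stitch 32-bit multiplies together out of 16-bit pieces on both machines, roughly like this C sketch (illustrative only, not either vendor's actual library code):

```c
#include <stdint.h>

/* Illustrative sketch: building a 32x32 -> 32-bit multiply out of the
 * 16x16 -> 32 multiplies that both the 68000 (MULU) and the 286 (MUL)
 * actually provide.
 *   a*b mod 2^32 = a_lo*b_lo + ((a_lo*b_hi + a_hi*b_lo) << 16)
 * The a_hi*b_hi term falls entirely above bit 31 and is dropped. */
uint32_t mul32_from_16(uint32_t a, uint32_t b)
{
    uint32_t a_lo = a & 0xFFFFu, a_hi = a >> 16;
    uint32_t b_lo = b & 0xFFFFu, b_hi = b >> 16;

    uint32_t low = a_lo * b_lo;                 /* 16x16 -> 32 */
    uint32_t mid = a_lo * b_hi + a_hi * b_lo;   /* cross terms */

    return low + (mid << 16);
}
```

So the "32-bit advantage" came down to how fast each chip could grind through those 16-bit multiplies, which is where the 286's quicker multiplier could pay off.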

 

I get it, the 68000 was elegant to the programmer, but the end user didn't get all that much from it unless they were running the narrow set of applications that favored the 68000.

 

I guess that was a lot of words to say that I bought into a lot of hype and marketing material as a kid. It used to puzzle me how anyone could prefer "dogs" like the 8086 and 286, never knowing that they were as fast as or faster than my super-elegant 16/32-bit 68000.

 

 


The x86 was anything but simple and elegant. The programming model, with segments and paragraphs and all those fucked-up memory models (tiny, small, compact, medium, large, huge), was terrible, simply terrible.
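For anyone who never had to live with it, a rough C sketch of what "segments and paragraphs" means in real mode: a segment register just picks a 16-byte paragraph boundary, the linear address is segment*16 + offset, and the same byte therefore has thousands of valid segment:offset names, which is part of why far pointers and those memory models got so hairy:

```c
#include <stdint.h>
#include <stdio.h>

/* Real-mode 8086 address arithmetic: one "paragraph" is 16 bytes, and the
 * 20-bit linear address is segment * 16 + offset.  Because segments overlap
 * every 16 bytes, the same byte can be reached through many different
 * segment:offset pairs. */
static uint32_t linear(uint16_t seg, uint16_t off)
{
    return ((uint32_t)seg << 4) + off;
}

int main(void)
{
    /* Two different far pointers that name the very same byte
     * (0xB8000, the CGA text buffer). */
    printf("B800:0000 -> %05lX\n", (unsigned long)linear(0xB800, 0x0000));
    printf("B000:8000 -> %05lX\n", (unsigned long)linear(0xB000, 0x8000));
    return 0;
}
```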

The memory map was a minefield; to this day modern kernels have to put a black hole over the first megabyte or so to avoid tripping over legacy crap.

 

We are now seeing the result of decades-old battles for computing supremacy, and we are all looking at it with survivor bias. The x86 won for pretty much the same reason VHS won: it was NOT superior by any means, it was just ubiquitous and somewhat good enough for the '80s. Many people wanted to be able to work at home, and that was the way to do it; in the office it won against a fragmented and pretty pathetic horde of incompatible CP/M machines that were unable to evolve.

 

As for the current most successful computing platform: with two billion Android devices and a billion iOS devices, it's ARM that's leading the pack. I am not ready to give up my i7 yet, but it's gonna happen, give it time. In my household alone there are 3 Intel machines (1 PC, 1 Mac, 1 XB-ONE) and 8+ ARM-based machines (3 phones, 1 Roku, 1 Googlecast, 3 Google Wifi, 1 Amazon Echo, 1 RPi3+, 1 RPi3, 1 Super Retrocade, 1 Nintendo 3DS, 1 Nintendo DS, 1 Nintendo Game Boy Advance ...), so yeah ....


When I was shopping for my first computer in the late 1980s, my primary concern was which would give me more hardware choices and better access to software. It wasn't the platform that had the better architecture. Twenty years later the same thing happened when I was shopping for a smart phone.

Edited by mr_me

Speed and simplicity in an architecture are elegance itself. And a proven winner time and time again.

 

This shows itself not only in chip design, but in overall system design too.

 

As I get older I definitely agree with this. I did not appreciate the Apple // or the Atari ST when I was younger, but I really admire their simple, elegant architectures. The Apple // in particular is stunning in what it achieved using off-the-shelf components.


We are now seeing the result of decades old battles for computing supremacy, and we are all looking at it with survivor bias, the x86 won for pretty much the same reason the VHS won, it was NOT superior by any mean

 

It was faster and cheaper, which I was arguing was more important than elegance. This mirrors VHS as well, where cheap won the day.


BITD there needed to be some metric for judging which computer was better. Speed in MHz became the universal benchmark, and shortly thereafter came all the synthetic benchmarks, as companies tried to monetize and capitalize on the MHz race.

 

Speed was easily understood by everyone. Bus width, memory width, or the number of gates in the accumulator was too esoteric and required a background in computer science to understand.


Intel hired away several of Motorola's chip designers back during the 68K vs x86 wars.
Motorola sued, but I never saw the outcome.
I'm sure that's part of why Motorola fell behind.
Then they threw in with the PowerPC group and ditched the 68K path.
The Vampire accelerators show what really could have been done with the 68K line.


 

It was faster and cheaper, which I was arguing was more important than elegance. This mirrors VHS as well, where cheap won the day.

Cheaper ... lol

https://en.wikipedia.org/wiki/IBM_Personal_Computer/AT

 

It was only cheaper once the "clones" stepped in, and even then it took a while to get to affordable prices.

 

The first PC/AT compatible (286-based) in 1985 was $4,500 (without an RGB monitor):

https://www.atarimagazines.com/creative/v11n7/25_Kaypro_286i_the_first_PC.php

 

In 1985 an Amiga 1000 plus monitor was under $2,000.

 

So please let's not rewrite history just because we need to rationalize what happened.

I repeat: it had nothing to do with speed or cost until well into the 386/486 era, with SVGA and cheaper IDE HDDs.

Before that it was just the way to work from home: it had the business aspect (like a VT220 terminal or the like), it was built like a tank, businesses could expense/amortize it, and it had many business apps to foster the mighty circle.


I repeat: it had nothing to do with speed or cost until well into the 386/486 era, with SVGA and cheaper IDE HDDs.

Before that it was just the way to work from home: it had the business aspect (like a VT220 terminal or the like), it was built like a tank, businesses could expense/amortize it, and it had many business apps to foster the mighty circle.

 

Naturally. That's when the x86 architecture started pulling ahead of other micros in speed: 16MHz, 25MHz, 33MHz, 50MHz, 66MHz and beyond. The 68000 architecture stagnated.


 

Naturally. That's when the x86 architecture started pulling ahead of other micros in speed: 16MHz, 25MHz, 33MHz, 50MHz, 66MHz and beyond. The 68000 architecture stagnated.

It did not; the 68030 reached 50 MHz as well:

https://en.wikipedia.org/wiki/Motorola_68030#Technical_data

The 486 reached 50 MHz with the DX2 25/50 and 33/66 in 1993.

It competed with a 68040 that had overheating issues but did reach 40 MHz and, clock for clock, was faster.

 

By 1993/94, though, prices were down considerably, the army of cheap clones got very, very interesting, Win3.11 was warts-and-all but acceptable, and Win95 was just around the corner, etc. Aside from the Mac, the 030/040 were afterthoughts on the "other platforms" (excluding Sun with the Sun-3x; not sure about other Unix vendors), and by that I mean the Atari ST and Commodore Amiga.

The ATI Mach 64 was released in '94 as well, and it was a hell of a graphics card:

https://en.wikipedia.org/wiki/ATI_Mach#Mach_64

 

None of this was known at the time of the 8086/286, so attributing anything at all to that period is really nonsense. They made the system out of off-the-shelf parts, boring as hell, because business didn't care about the pizzazz of fancy custom chips it had no use for (outside of DTP, which was dominated by the Mac; music, which was for a long time an Atari ST domain thanks to integrated MIDI and the software to go with it [Cubase, anyone?]; and AV titling with the Amiga genlock).

 

The P4 almost ruined Intel; they had to be saved by their Israel team with the design of the mobile P3 (Pentium M) that formed the basis for the new "Core" line.

Again, stop reinterpreting history to suit a narrative.

The reasons why the Wintel duo ended up dominating the market are complex and not always related to any particular winning metric; in particular, none of the competitors could evolve their platforms, for whatever reason. BTW, many of the early PC games are unplayable on later machines because they ran in an open loop with no way to synchronize with anything known (no VSync, no advertised clock speed), but because business apps kept working faster and faster, it was all fine and dandy for the business side.


The P4 almost ruined Intel; they had to be saved by their Israel team with the design of the mobile P3 (Pentium M) that formed the basis for the new "Core" line.

 

That it did. The P4 sucked. Heat. Long pipeline. A double-pumped integer ALU (2x the core clock) running at 6.8GHz in the Extreme Edition, and 7.6GHz in the later Prescott chips. They had to do something to get the piss-poor architecture up to speed. No pun intended. And to make matters worse, they made laptops with these monstrosities.

 

OTOH, the Pentium M was a gem of a processor. At 2.1GHz it maintained pace with or outdid the P4-EE, and at low power. I still run a couple of these in an ultra-low-voltage configuration, something like 0.5 or 0.6 V.

 

 

 

The reasons why the Wintel duo ended up dominating the market are complex and not always related to any particular winning metric; in particular, none of the competitors could evolve their platforms, for whatever reason. BTW, many of the early PC games are unplayable on later machines because they ran in an open loop with no way to synchronize with anything known (no VSync, no advertised clock speed), but because business apps kept working faster and faster, it was all fine and dandy for the business side.

 

Agreed, there was a lot going on in market dynamics at the time. I've mentioned it before: I think the custom chips played a bigger role in holding back platforms than most realize. Sure, they're a one-trick solution to a problem. But to expand you have to evolve them along with the CPU, and that's more cost and complexity.

 

Whereas with Wintel you could swap CPUs two or three times over the life of the machine. I did exactly that with my BX-based rig: 266MHz, 350, 450, 850, 1.4GHz.

 

Nowadays we've drifted away from doing that. But we're also very close to having the industry accept the Southbridge (PCH) being integrated into the processor package.

 

Anyways, the PC architecture could be upgraded piecemeal. My BX rig has played host to several graphics architectures - each quite different from the others: Voodoo2, Riva 128, TNT2 Ultra, Rendition Verite, Cirrus Logic, and some GeForce parts. Same for HDD, memory, sound, and other bits and pieces.

 

In fact, there was a time when I temporarily cannibalized my 486 to get my P2-266 up and running, till I could afford the proper parts. The PC architecture was generic enough to allow me to make a bridge with the stuff I had lying around.

 

Had I had to purchase a new machine to experience all that new tech, I would have given up long ago. So I see subsystem modularity as a huge advantage. Now I'm at the point where having all the hardware in a tiny NUC box is satisfying enough, and software becomes my area of interest. The box is cheap and disposable, and I purchase them with the understanding that I may replace them in two years' time.


So please let's not rewrite history just because we need to rationalize what happened.

I repeat: it had nothing to do with speed or cost until well into the 386/486 era, with SVGA and cheaper IDE HDDs.

Before that it was just the way to work from home: it had the business aspect (like a VT220 terminal or the like), it was built like a tank, businesses could expense/amortize it, and it had many business apps to foster the mighty circle.

 

 

The prices of x86 chips were much lower, particularly around the time IBM was shopping for a processor for its PC. I was pointing out that the x86 has higher IPC than the 68K and outperforms it in most cases that do not involve 32-bit math.

 

You don't have to look at an AT to get performance comparable to a 68000, which ran even with an 8086. The 68K was also far slower than the 8086/8087 on floating point. Since the 68K had no available FPU, the x86 ran the killer app of the '80s (the spreadsheet) much faster.

 

For all its elegant architecture, the 68K did not offer much to the end user. It was much more attractive to the programmer, of course, but the person cranking away on a spreadsheet doesn't care about a flat memory space if the spreadsheet runs slower because you don't have an FPU.

Edited by rpiguy9907

My understanding is that the 68000 CPU would have been the natural choice for IBM's new PC, but it just wasn't ready; specifically, the support chips weren't ready. The only other options were the 8086 and a 16-bit processor from Texas Instruments. IBM had no choice but to go with Intel's processor. I've also read that they didn't want to make the IBM PC too powerful, fearing that other IBM departments might have the project shut down.

 

...

The reasons why the Wintel duo ended up dominating the market are complex and not always related to any particular winning metric; in particular, none of the competitors could evolve their platforms, for whatever reason. BTW, many of the early PC games are unplayable on later machines because they ran in an open loop with no way to synchronize with anything known (no VSync, no advertised clock speed), but because business apps kept working faster and faster, it was all fine and dandy for the business side.

There were ways for programmers to keep game timing correct. When I had my 386 it ran most of the old games perfectly fine. There was only one game I remember where I had to use the turbo button. Going forward, there was one game I remember that was fine on the 386 but ran too fast on quicker computers. Most were programmed to keep time. There was definitely a timing issue with the gameport, which is why every game had a controller calibration routine.
Edited by mr_me
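For the curious, "programmed to keep time" boiled down to a fixed timestep driven by a real timer instead of "one update per loop iteration". A minimal C sketch of the idea, using the standard clock() call as a stand-in for whatever real-time source the platform provides (an actual DOS game would typically have hooked or reprogrammed the 8253/8254 timer chip instead):

```c
#include <time.h>

#define TICKS_PER_SEC 30   /* game logic rate, independent of CPU speed */

static void update_game(void)  { /* move sprites, run physics, ... */ }
static void render_frame(void) { /* draw the current state */ }

/* Speed-independent main loop: the simulation advances a fixed number of
 * ticks per real second, however fast the CPU is.  Open-loop games instead
 * updated once per iteration, which is why they became unplayable on
 * faster machines. */
void game_loop(void)
{
    clock_t next = clock();

    for (;;) {                       /* a real game would also check for quit */
        while (clock() >= next) {    /* run the updates we owe to wall-clock time */
            update_game();
            next += CLOCKS_PER_SEC / TICKS_PER_SEC;
        }
        render_frame();              /* frame rate floats, game speed doesn't */
    }
}
```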

 

 

...Since the 68K had no available FPU, the x86 ran the killer app of the '80s (the spreadsheet) much faster...

The first spreadsheet to have FPU support was SuperCalc 3, around '85 (the PC-XT shipped in '81), and Lotus 1-2-3 had to catch up on that one.

It was up to 5x faster than without any FPU at all (best-case scenario, mind you).

 

And again, killer app ... for businesses that didn't mind spending the money on overpriced hardware to get their job done; none of the Atari ST/Amiga machines of the time catered to that kind of business at all.

Making the leap that the FPU itself was the differentiating factor is, I think, wrong; no one I knew at the time (businesses included) could invest in an FPU-equipped PC until much, much later ... that is, until Intel made the very smart move of incorporating it into the 486. Now that made a hell of a difference.

 

The 68K was in the Unix workstation domain, which was not geared toward small business (the PC-XT was) and commanded high prices (the OS itself was darn expensive, but multitasking) ...

 

NOTE: I really don't care about the 68K or the x86 or ARM or Alpha or MIPS or SPARC ... what I care about is that we stop trying to explain backwards how the x86 platform got to be the behemoth it is now (even just with respect to the 68K) based on only one or two "simple" factors ... the only sensible answer is exactly what an overweight person would give you if you asked how he got overweight: "day by day". He didn't eat a hippo the day before, he didn't plan for it; things aligned in certain ways, some he controlled and can take credit/blame for, some he didn't control but that played toward making him what he is.

 

EDIT:

Sure, Intel cared so much about the original 8086 as to put a full-on virtual-8086 mode on the 386 and above after the fiasco of the 286 in that regard; Motorola did not ... maybe that was not a wise choice, but again, it is one of many factors. Just keep in mind that the Compaq 386 (the first 386 on the market) was a $7K beast ... not cheap by any stretch of the imagination.


NOTE: I really don't care about the 68K or the x86 or ARM or Alpha or MIPS or SPARC ... what I care about is that we stop trying to explain backwards how the x86 platform got to be the behemoth it is now (even just with respect to the 68K) based on only one or two "simple" factors ... the only sensible answer is exactly what an overweight person would give you if you asked how he got overweight: "day by day". He didn't eat a hippo the day before, he didn't plan for it; things aligned in certain ways, some he controlled and can take credit/blame for, some he didn't control but that played toward making him what he is.

 

EDIT:

Sure, Intel cared so much about the original 8086 as to put a full-on virtual-8086 mode on the 386 and above after the fiasco of the 286 in that regard; Motorola did not ... maybe that was not a wise choice, but again, it is one of many factors. Just keep in mind that the Compaq 386 (the first 386 on the market) was a $7K beast ... not cheap by any stretch of the imagination.

But I wasn't saying that at all. I wasn't saying x86 won because of speed, only that the architectural advantages of the 68K did not translate into speed. There were indeed many factors.


But I wasn't saying that at all. I wasn't saying x86 won because of speed, only that the architectural advantages of the 68K did not translate into speed. There were indeed many factors.

The architectural advantages of the 68K didn't fully materialize until the 68020 (the original 68K's internal ALU is 16-bit); in any case:

https://tech-insider.org/unix/research/1986/0219.html

 

Check the Dhrystone 1.0 benchmarks ... the 68K and 8088/8086 are all over the place.


The architectural advantages of the 68K didn't fully materialize until the 68020 (the original 68K's internal ALU is 16-bit); in any case:

https://tech-insider.org/unix/research/1986/0219.html

 

Check the Dhrystone 1.0 benchmarks ... the 68K and 8088/8086 are all over the place.

Very true. However, the 68020 held almost no speed advantage over the 80386. Byte magazine and others were shocked when the two faced off in what was essentially a dead heat. The major advantage of the 68020 was that existing apps could run and take advantage of the speed and memory, whereas legacy x86 apps forced the 386 into real mode or had to be virtualized.
Edited by rpiguy9907

Very true. However, the 68020 held almost no speed advantage over the 80386. Byte magazine and others were shocked when the two faced off in what was essentially a dead heat. The major advantage of the 68020 was that existing apps could run and take advantage of the speed and memory, whereas legacy x86 apps forced the 386 into real mode or had to be virtualized.

 

Real-world users didn't care much about real or virtual mode or memory models. MHz and marketing were the huge factors.


My understanding is that the 68000 CPU would have been the natural choice for IBM's new PC, but it just wasn't ready; specifically, the support chips weren't ready. The only other options were the 8086 and a 16-bit processor from Texas Instruments. IBM had no choice but to go with Intel's processor. I've also read that they didn't want to make the IBM PC too powerful, fearing that other IBM departments might have the project shut down.

 

..and Atari was to design the IBM PC. Considering the shithole both companies became, it's a good thing Atari got laughed out of the conference and that Motorola wasn't ready with the 68000.


 

..and Atari was to design the IBM PC. Considering the shithole both companies became, it's a good thing Atari got laughed out of the conference and that Motorola wasn't ready with the 68000.

This is a biased analysis, because what you know now might not have happened had IBM chosen to go with Motorola.

Possibly, instead of the IBM PC, we would all be using a totally different system; I don't know. But hindsight is of no use here.

 

If it weren't for a gross oversight by IBM, clones would not have existed at all. They thought so little of their PC that they only copyrighted the portion of the BIOS that interfaced with tapes (as in cassettes), held no patents on the IBM PC design at all, and, what's more, allowed Microsoft to ship MS-DOS outside of IBM-made machines (on which it was called PC-DOS) as a way to foster the counter-CP/M ecosystem ... once the cassette didn't catch on, that portion of the ROM was useless, GW-BASIC didn't need it (it was developed for the Compaq), alternative BIOSes just ditched it, etc. Also, rumor has it IBM wanted to follow the Apple ][ philosophy of being open and expandable [but in control], given the base machine was really lacking on sound, graphics, and a horde of other features (it did have a DMA controller, which many home computers lacked, though). Even the ISA bus was a dog once higher-end graphics cards started to appear.

 

It's hard to tell what could have been; for sure Intel defended its advantage tooth and nail, but if history is any indication ...

 

Intel (small CPUs) dethroned the mainframe (big CPUs) and grew big.

ARM (small CPUs) is dethroning Intel (big CPUs, mostly), at least in phones, tablets, SBCs, hackerboards, etc., and they are growing into more powerful chips.

 

I don't have a crystal ball, but "little things grow, and once they are ubiquitous they become de facto standards". We are waiting for datacenter-level ARM chips; if those get competitive, it's anyone's guess how long Intel can really hold the fort.

 

Today's news: about 50% of Azure VMs are running Linux ... so one part of the Wintel duo (Windows Server) is already losing the datacenter scene on its own home turf ... Windows itself has been ported to run on ARM (Intel was not happy, that's for sure) ... Linux can support all manner of ARMs already ... give it five years.


Most users wouldn't know the difference if the IBM PC had a Motorola CPU inside. Would it have been slow? Things sure could have been different for these companies. Would Microsoft have survived without IBM? An IBM PC powered by a 32-bit 68K CPU running Digital Research operating systems sounds okay.

 

The IBM PC used an open-standard architecture to keep costs down and get to market quickly; otherwise it wouldn't have competed with Apple/Commodore/Tandy/Atari. The IBM PC BIOS was protected by copyright. It had to be reverse engineered and rewritten for clones to happen. Early IBM clones were not 100% software compatible without that BIOS. ARM computers running Windows are still not compatible with x86 Windows software. Interesting about Azure; they do have to provide the software their customers are asking for, after all.

Edited by mr_me
