
Would the Acorn Archimedes family have done well in the US by competing with the Amiga as a media and gaming computer?


Recommended Posts

Acorn had a limited (and unofficial?) release in the US, mostly due to concerns about gaining a foothold. But the Amiga had a decent-sized foothold in the US as a professional media and games machine, and the Acorn Arc computers were arguably better media and game machines, especially the latter.

 

While the Amiga 1200 or the later CD32 struggled with complex 3D or post-SNES-style 2D games, the Acorn unit launched around the same time could run games like Star Fighter. Something that, IIRC, even the strongest, most expensive Amiga setup couldn't do. Or at least not anywhere near as well.

 

Acorn even kicked off modern "complex" 3D games on home computers, starting with 1987's Zarch, which was ported to the ST, Amiga, and others as Virus in 1988, replacing the more basic methods that came before.

 

It had good sound capabilities and good media software (though not as polished as the Amiga's), and ARM was a major advantage over the Amiga's custom chipset for performance and efficiency, especially after 1989.

 

I think if they had launched the Arc in the US as a media (for professionals) and gaming computer, it might have done well, or at least passably, selling 1-3 million units.

 

ARM might also have been used more widely across electronics, instead of its adoption only really accelerating with post-2000 mobile electronics.

Edited by Leeroy ST
  • Like 1

No CD-ROM until the Risc PC around 1994. You could get one as an external device, but the Risc PC was the first one to come with a drive bay in the case.

 

I actually sold my Amiga 500 to get an Acorn A5000 and it was quite an upgrade.

 

But the true answer? It depends what you want out of a computer. The Amiga for video production was unmatched for the price, but everything else would depend on the software. I think RiscOS was far better than AmigaOS. The video output on the Archie was easier to use with VGA monitors, and the raw power of the Archie was greater. I never missed my Amiga except for its larger base of gaming software. The productivity software was so much better on my A5000 as well. For me at least it was a better computer in all aspects. RiscOS felt like such a great leap forward that when I had to move over to PC Windows 3.1 for a job, it felt like a horrible step back.

 

Glad it's still around, and I have a modern RiscOS machine I like to use from time to time. But I still miss my old A5000.

Edited by Arnuphis
  • Like 3

3 hours ago, Arnuphis said:

But the true answer? It depends what you want out of a computer. The Amiga for video production was unmatched for the price, but everything else would depend on the software. I think RiscOS was far better than AmigaOS. The video output on the Archie was easier to use with VGA monitors, and the raw power of the Archie was greater. I never missed my Amiga except for its larger base of gaming software. The productivity software was so much better on my A5000 as well. For me at least it was a better computer in all aspects. RiscOS felt like such a great leap forward that when I had to move over to PC Windows 3.1 for a job, it felt like a horrible step back.

 

Glad it's still around, and I have a modern RiscOS machine I like to use from time to time. But I still miss my old A5000.

Slightly off from the question, but I agree on the productivity software. It was very ahead of its time, and most people I know who used Arcs used them for business purposes initially.

 

I also sympathize with you on using Windows 3.1; it was an unmitigated disaster.

 

I didn't know, however, that RiscOS was still around. I assume, like AmigaOS, it's being handled by individual contributors?

  • Like 1

On 8/28/2021 at 4:47 PM, Leeroy ST said:

Slightly off from the question, but I agree on the productivity software. It was very ahead of its time, and most people I know who used Arcs used them for business purposes initially.

 

I also sympathize with you on using Windows 3.1; it was an unmitigated disaster.

 

I didn't know, however, that RiscOS was still around. I assume, like AmigaOS, it's being handled by individual contributors?

Oh yes. RiscOS is very much alive, and if you have a Raspberry Pi you can download it and have fun.

 

https://www.riscosopen.org/content/

 

 


On 8/28/2021 at 5:08 PM, Leeroy ST said:

[...] and the Acorn Arc computers were arguably better media and game machines, especially the latter.

 

While the Amiga 1200 or the later CD32 struggled with complex 3D or post-SNES-style 2D games, the Acorn unit launched around the same time could run games like Star Fighter. Something that, IIRC, even the strongest, most expensive Amiga setup couldn't do. Or at least not anywhere near as well.

Have to disagree here. The Amiga was a 2D powerhouse. It could do things with little effort that other systems could only do with ten times the CPU power. And the Archimedes was one of those other systems. It was able to do better 3D because of its powerful CPU, but it lacked 2D acceleration features entirely. It had a frame buffer, and that was pretty much it; no sprites either. So for 2D games, it was as badly off as the Atari ST: the CPU had to draw every pixel you wanted on screen into the frame buffer. And sound mixing had to be done in software as well, with no four hardware sound channels as on the Amiga.

 

Star Fighter doesn't count. Yes, this game later showed what the Archimedes could do, but it's a late 1994 game; the Amiga was pretty much done at that point. And let's be fair: unless you had a StrongARM-equipped machine, the frame rate of Star Fighter was nothing to write home about. The Amiga got some stunning 3D games in the mid-90s as well, which all required a faster CPU. The reason 3D on the Amiga mostly sucked is that, while the Amiga still had a large enough market, all games were made to run on a stock machine. We all know the Amiga was able to run games like Wing Commander with no issues, as the CD32 version later proved, but still they f*cked it up trying to make it run on an A500.

 

btw, I own many Amigas as well as an Acorn A3020.

Edited by derSammler
  • Like 1

4 hours ago, derSammler said:

The reason 3D on the Amiga mostly sucked is that, while the Amiga still had a large enough market, all games were made to run on a stock machine. We all know the Amiga was able to run games like Wing Commander with no issues, as the CD32 version later proved, but still they f*cked it up trying to make it run on an A500.

While this is partially true and I agree, even in the early years the Arc could run 3D better than the Amiga; in fact, the Amiga did worse at 3D than nearly every other relevant computer platform.

 

But I do agree with you on 2D; though Acorn had some impressive titles, yes, the Amiga handled that very well.

 

The problem is that in the US in the '90s, FMV, high-res adventure games, and polygonal 3D were seen as progressions of power and were major reasons game players adopted computers. So in the US, where this was a bigger deal, I'd argue Acorn would have been seen as the better option for games.

 

In the US, people likely already had a SNES or MD for 2D.

  • Like 1

Sure, but I don't see a fair comparison when it comes to 3D. When 3D was slowly becoming a thing in the '90s, the Amiga was already struggling in the market, because most users still had an A500 and Commodore kept releasing outdated hardware like the A600 in 1992, not allowing more complex 3D games. Most companies moved to the PC and left the Amiga behind.

 

Yes, the 68000@7MHz-based Amigas sucked at 3D, no question, but that was 1985 technology. The first Archimedes hit the market in mid-1987, hence it should be compared to an A2620-equipped A2000. The CD32 (also an 020@14MHz, like the A2620) could do a Star Fox-like game at a better frame rate than the SNES (that game is "Guardian", in case someone is not into the CD32). So it wasn't that bad. My A600 with a 68030@40MHz can even run DOOM at a frame rate similar to a fast 386. Problem is, too few people had accelerator cards back then, and games were tailored to run on a stock A500 to sell more copies, which in turn held people back from spending money on accelerator cards. Catch-22.

 

Quote

in fact, the Amiga did worse at 3D than nearly every other relevant computer platform.

Yes, that is true as well. It's because of its hardware design, completely centered around fast 2D and low memory usage. The Amiga uses bitplanes instead of a frame buffer with packed pixels. This makes 3D very slow, as you cannot simply write a 3D-rendered scene pixel by pixel to the frame buffer.
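As an aside, here's a rough sketch in C (generic, illustrative code only, not actual Amiga or Archimedes source) of why packed pixels suit software 3D and bitplanes don't: a chunky frame buffer takes one write per pixel, while a planar display needs a read-modify-write in every bitplane.

#include <stdint.h>

/* Chunky/packed frame buffer (Archimedes-style, 8 bpp):
 * one pixel is one byte, so plotting it is a single write. */
static void put_pixel_chunky(uint8_t *fb, int pitch, int x, int y, uint8_t colour)
{
    fb[y * pitch + x] = colour;
}

/* Planar frame buffer (Amiga-style): each pixel's colour bits are
 * scattered across `depth` separate bitplanes, so plotting one pixel
 * means a read-modify-write in every plane. */
static void put_pixel_planar(uint8_t *planes[], int depth, int pitch,
                             int x, int y, uint8_t colour)
{
    int offset = y * pitch + (x >> 3);          /* byte holding this pixel */
    uint8_t mask = (uint8_t)(0x80 >> (x & 7));  /* bit within that byte    */

    for (int p = 0; p < depth; p++) {
        if (colour & (1u << p))
            planes[p][offset] |= mask;   /* set this plane's bit   */
        else
            planes[p][offset] &= ~mask;  /* clear this plane's bit */
    }
}

For a 32-colour screen (5 bitplanes) that's five dependent memory operations per pixel instead of one, which is why CPU-rendered 3D scenes favoured chunky modes.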

Edited by derSammler
  • Like 2

7 hours ago, derSammler said:

The CD32 (also an 020@14MHz, like the A2620) could do a Star Fox-like game at a better frame rate than the SNES (that game is "Guardian", in case someone is not into the CD32). So it wasn't that bad.

Oh, I'm not saying the Amiga's 3D was bad, but it was clearly behind, and per the thread title, the American public's interest in polygonal 3D and FMV was accelerating from 1990 onward.

 

Also, for some reason that eludes me, in high-res graphical adventure titles like Myst (though that was just a year before the Amiga's death, IIRC).

 

The perception in the US was that the Amiga was outdated, and later games like Alien Breed 3D or Guardian didn't really do anything to remedy that.

 

SNES Star Fox and others got a pass because the standards for consoles were different. Star Fox likely wouldn't have been half as successful on any computer platform as it was on the SNES.

  • Like 1

  • 2 weeks later...
1 hour ago, zzip said:

Probably not. The market space the Amiga and ST were in was ever-shrinking as people moved to the PC, and Acorn was not an established brand, so it would have been an uphill battle.

Possibly it could have taken the Mac's spot, as the Mac had been slowly dying off for years. It had enough 3D power to be a niche gaming device.

 

The later Mac never really got anywhere; by the time of the Pippin era it was behind and on its last legs.


In the US, there was a different "culture" in regard to games. Consumers were more interested in cartridge-based, dedicated game consoles, like the Atari VCS or the NES. Games came on cartridges, were durable goods, and did not have a lot of muss or fuss in setup and use (compared to a computer).

 

Computers, on the other hand, were for "adults", for "adulting" (you know, VisiCalc and pals). Computers that did not do "adulting" were not well regarded in the market, no matter what they were actually capable of. This aura of "adultness" and "For Business!!" dominated, which is one of the reasons the IBM PC (despite being inferior in oh so many ways at its introduction to the market) really dominated in the US.

 

This is vastly different from Europe, where the primary game experience was on home micros, like the C64, the Sinclair ZX Spectrum, MSX, and pals.

 

Kids in Europe were used to sharing cassettes with each other (as floppy disk drives were expensive, and these were inexpensive home micros), and many got so involved that they started making their own games (with the built-in BASIC implementations) and sharing them on cassettes in like fashion. This was a vastly different cultural dynamic from the US, where kids were mostly passive consumers (which parents seemingly encouraged, as they did not want to field questions about technology they did not know, and thus favored the solid-state game experiences for their children instead).

 

That is not to say that there was no overlap -- there most certainly was, and in both directions -- this is just an observation of the prevailing trends.

 

 

This is why Commodore marketed the Amiga in the US as a graphics workstation for ADULT THINGS, LIKE CGI FOR MOVIES (and not as an epic graphics-and-sound platform for consumer games, like in Europe), and subsequently why the market here was much more barren for it.

 

The same would have happened to the Acorn Archimedes and its descendants.

 

 

Edited by wierd_w
  • Like 1

6 hours ago, wierd_w said:

Computers, on the other hand, were for "adults", for "adulting" (you know, VisiCalc and pals). Computers that did not do "adulting" were not well regarded in the market, no matter what they were actually capable of. This aura of "adultness" and "For Business!!" dominated, which is one of the reasons the IBM PC (despite being inferior in oh so many ways at its introduction to the market) really dominated in the US.

That was true for the high end of the market; the cheap computers may have been sneered at by those people, but they still sold widely and were adopted by the "kids". I wonder how many of today's tech professionals got their start because they learned to code on a C64, Atari, or TI?

 

6 hours ago, wierd_w said:

Kids in Europe were used to sharing cassettes with each other (as floppy disk drives were expensive, and these were inexpensive home micros), and many got so involved that they started making their own games (with the built-in BASIC implementations) and sharing them on cassettes in like fashion. This was a vastly different cultural dynamic from the US, where kids were mostly passive consumers (which parents seemingly encouraged, as they did not want to field questions about technology they did not know, and thus favored the solid-state game experiences for their children instead).

The same culture existed in the US, except with disks; I was part of it. But yeah, it does look like it was more widespread in Europe. I'm curious how that came about.

 

My observation in the US was that the rise of computers coincided with the crash. Some people made the jump from video games to computers, but a lot of people dropped out of the video game scene altogether and didn't return until after the NES was established. Since the crash didn't happen in Europe, perhaps that's how computers ended up becoming more widespread?


7 hours ago, zzip said:

That was true for the high end of the market; the cheap computers may have been sneered at by those people, but they still sold widely and were adopted by the "kids". I wonder how many of today's tech professionals got their start because they learned to code on a C64, Atari, or TI?

 

The same culture existed in the US, except with disks; I was part of it. But yeah, it does look like it was more widespread in Europe. I'm curious how that came about.

 

My observation in the US was that the rise of computers coincided with the crash. Some people made the jump from video games to computers, but a lot of people dropped out of the video game scene altogether and didn't return until after the NES was established. Since the crash didn't happen in Europe, perhaps that's how computers ended up becoming more widespread?

The crash had nothing to do with the rise of computers in the US. Nor was there anything more than marginal disinterest in video games after it. There goes that NES myth again.

 

That aside, the price wars were what temporarily drove micro adoption in the US (which pissed off retailers), and mostly just for the C64. By the time the Amiga and ST came out, micros were falling out of favor. The ST did OK for a while but faltered, and the Amiga was driven into a niche. The PC and its clones, which almost never participated in the price wars, were selling at higher rates at higher prices. That is where the real rise in computer adoption took place.

 

The PC and clones were growing at ten times the rate the C64 ever could. One could argue micros in general were a fad and a placeholder in the US market.

 

In Europe, top manufacturers were able to produce incredibly affordable consumer and professional micros early on (starting in 1980). In the same time frame, there were no comparable machines in the US, so in Europe those machines spawned companies pushing micros at all levels of the market. In the US there really wasn't a storm of affordable machines targeting multiple demographics like in Europe; cheap micros only arrived starting in early '83, when the snowball effect began after TI started the race to the bottom. And even that barely lasted two years.

 

The most successful micro in the US was the C64, followed by Tandy's machines. Atari never did that well comparatively, and also never made any money.

 

It also helps that cheaper storage formats were more prominent in Europe, really early on.

 

 

13 hours ago, wierd_w said:

 

The same would have happened to the Acorn Archimedes and its descendants.

 

The Amiga would have made money as a game and media machine, but bad decisions wiped out whatever profits were made. It was also becoming more and more outdated after 1990.

 

At least an Acorn machine would have been competitive in those areas until about 1994. It might have been a profitable niche. Just look at Star Fighter as an example.

 

Of course it wouldn't have won, but IMO it would have done better than the Commodore Amiga did.

 

 

 

 

  • Haha 1

  • 2 weeks later...
On 8/28/2021 at 11:08 AM, Leeroy ST said:

Acorn had a limited (and unofficial?) release in the US, mostly due to concerns about gaining a foothold. But the Amiga had a decent-sized foothold in the US as a professional media and games machine, and the Acorn Arc computers were arguably better media and game machines, especially the latter.

 

While the Amiga 1200 or the later CD32 struggled with complex 3D or post-SNES-style 2D games, the Acorn unit launched around the same time could run games like Star Fighter. Something that, IIRC, even the strongest, most expensive Amiga setup couldn't do. Or at least not anywhere near as well.

 

Acorn even kicked off modern "complex" 3D games on home computers, starting with 1987's Zarch, which was ported to the ST, Amiga, and others as Virus in 1988, replacing the more basic methods that came before.

 

It had good sound capabilities and good media software (though not as polished as the Amiga's), and ARM was a major advantage over the Amiga's custom chipset for performance and efficiency, especially after 1989.

 

I think if they had launched the Arc in the US as a media (for professionals) and gaming computer, it might have done well, or at least passably, selling 1-3 million units.

 

ARM might also have been used more widely across electronics, instead of its adoption only really accelerating with post-2000 mobile electronics.

No, because the Amiga had the Video Toaster and Penn & Teller.

  • Like 1

  • 1 year later...
On 8/30/2021 at 10:55 AM, derSammler said:

Have to disagree here. The Amiga was a 2D powerhouse. It could do things with little effort that other systems could only do with ten times the CPU power. And the Archimedes was one of those other systems. It was able to do better 3D because of its powerful CPU, but it lacked 2D acceleration features entirely. It had a frame buffer, and that was pretty much it; no sprites either. So for 2D games, it was as badly off as the Atari ST: the CPU had to draw every pixel you wanted on screen into the frame buffer. And sound mixing had to be done in software as well, with no four hardware sound channels as on the Amiga.

 

Star Fighter doesn't count. Yes, this game later showed what the Archimedes could do, but it's a late 1994 game; the Amiga was pretty much done at that point. And let's be fair: unless you had a StrongARM-equipped machine, the frame rate of Star Fighter was nothing to write home about. The Amiga got some stunning 3D games in the mid-90s as well, which all required a faster CPU. The reason 3D on the Amiga mostly sucked is that, while the Amiga still had a large enough market, all games were made to run on a stock machine. We all know the Amiga was able to run games like Wing Commander with no issues, as the CD32 version later proved, but still they f*cked it up trying to make it run on an A500.

 

btw, I own many Amigas as well as an Acorn A3020.

Well, I am going to politely disagree here.

The Acorn Archimedes does have a hardware sprite.

It can be up to 32 pixels wide, and the whole height of your chosen screen mode.

It's only 3 colours (from 4096), but now, thanks to RasterMan, you can change them every line and position the sprite at any x position, per line.

It's not been much used in games, and that's a pity.

 

For 3D games, there's no need to point to SF3000.

Games like Aldebaran (much more impressive than Zarch) or Chocks Away demonstrate very well what an Archie can offer.

You also have Black Angels and Guile, and don't forget Air Supremacy, Conqueror, or Saloon Cars; these games appeared early in the machine's life.

 

Sound mixing is done in software, yes; so what? It will take at most 12.5% of the CPU cycles available per VBL to play a 4-channel MOD at 20.833 kHz, which leaves quite a lot of power for everything else (rough numbers in the sketch below).

A base Archie at 8 MHz can output the same MOD at up to 62 kHz if you wish; there's no need for an accelerator for this.

With a tracker like the Matrix Tracker, an Archie at 8 MHz can play S3M files with up to 16 channels at more than decent quality.

Of course more powerful Archies can do even better.
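To put rough numbers on those figures: a software mixer is just a fetch-and-accumulate loop per output sample. Here's a minimal C sketch of the idea (generic code, not actual Archimedes source; the struct and names are made up for illustration).

#include <stdint.h>

#define CHANNELS 4

struct voice {
    const int8_t *sample;  /* signed 8-bit sample data             */
    uint32_t pos;          /* playback position, 16.16 fixed point */
    uint32_t step;         /* pitch step, 16.16 fixed point        */
};

/* Mix CHANNELS voices into a signed 8-bit output buffer.
 * Budget implied by the figures above: 8 MHz * 12.5% = 1,000,000
 * cycles/s, while 20,833 output samples/s * 4 channels = ~83,000
 * channel-samples/s, i.e. about 12 CPU cycles per channel per sample. */
static void mix_block(struct voice v[CHANNELS], int8_t *out, int n)
{
    for (int i = 0; i < n; i++) {
        int acc = 0;
        for (int c = 0; c < CHANNELS; c++) {
            acc += v[c].sample[v[c].pos >> 16];  /* fetch and accumulate         */
            v[c].pos += v[c].step;               /* advance at the voice's pitch */
        }
        out[i] = (int8_t)(acc / CHANNELS);       /* rescale to 8-bit range       */
    }
}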

 

What is a pity is that most games exclusive to the platform were coded by bedroom coders.

The question is: what would it have been like if there had been real teams trying to get the most out of the machine?

It's also interesting to note that some 'features' of the ARM chip (with its memory controller) have only recently been discovered, like faster memory transfers, and saving a cycle if you place a particular instruction at a specific spot in your code sequence (which is possible every 4 instructions).

 

 

 

  • Like 1

I forgot to add that, OK, SF3000 isn't very fast on an ARM2 Archie at 8 MHz from 1987 (the launch year), but its frame rate is excellent if the machine is equipped with an ARM3 (available as an upgrade in 1989) or an ARM250.

There's no need at all for a Risc PC with a StrongARM to get a great frame rate.

You can check the videos available on YT if you don't believe me.

  • Like 1

Here in the Midwest, it seems the Amiga was indeed marketed as a video effects machine, a CGI thing for Hollywood effects and all that. To the average consumer, no less. What a non sequitur state of affairs. To do any of that, you needed accelerators and genlocks and Toasters, all of it firmly priced outside the average consumer's budget. The Amiga didn't even output color composite to begin with.

 

I would (dubiously) argue that that crash thing happened because the market got flooded with garbage at about the same time people were experiencing gaming fatigue. Computers, doing both games and productivity, were able to continue on after gaming was sidelined. In my part of town, it wasn't a crash, but more of a turn in the road of evolution.

  • Like 1

  • 1 year later...
On 10/24/2022 at 2:04 PM, Xavier Louis Tardy said:

I forgot to add that, OK, SF3000 isn't very fast on an ARM2 Archie at 8 MHz from 1987 (the launch year), but its frame rate is excellent if the machine is equipped with an ARM3 (available as an upgrade in 1989) or an ARM250.

There's no need at all for a Risc PC with a StrongARM to get a great frame rate.

You can check the videos available on YT if you don't believe me.

Still, though: I only recently heard that the Acorn Archimedes was released in 1987, and I was blown away, since it contains a 32-bit ARM CPU. I thought the 32-bit ARM CPU was still at the prototype stage at the time (heck, I'm amazed the ARM CPU had even been prototyped by then, let alone put into use). Yes, the 386 CPU was already developed in 1985, and I suppose it was used in a PC from 1989, but still.

This is simply amazing; we are talking about the jump from 8-bit to 32-bit machines in the '80s (I thought a 16-bit computer such as the Amiga 500 or Atari ST was the most advanced PC you could have asked for in the late '80s, and nothing else). Just incredible. Heck, even Nintendo was still stuck with its 8-bit Game Boy from the late '80s until the early 2000s. So it's really just incredible how fast technology moved forward back in the '80s, because for a long time I thought technology back then still moved sluggishly, but I was clearly wrong about that. It's amazing that the ARM CPUs developed in the '80s are used (albeit upgraded and sped up) in today's mobile phones, tablets, and whatnot.


  • 3 weeks later...
On 12/17/2023 at 7:21 AM, johannesmutlu said:

Still, though: I only recently heard that the Acorn Archimedes was released in 1987, and I was blown away, since it contains a 32-bit ARM CPU. I thought the 32-bit ARM CPU was still at the prototype stage at the time (heck, I'm amazed the ARM CPU had even been prototyped by then, let alone put into use). Yes, the 386 CPU was already developed in 1985, and I suppose it was used in a PC from 1989, but still.

This is simply amazing; we are talking about the jump from 8-bit to 32-bit machines in the '80s (I thought a 16-bit computer such as the Amiga 500 or Atari ST was the most advanced PC you could have asked for in the late '80s, and nothing else). Just incredible. Heck, even Nintendo was still stuck with its 8-bit Game Boy from the late '80s until the early 2000s. So it's really just incredible how fast technology moved forward back in the '80s, because for a long time I thought technology back then still moved sluggishly, but I was clearly wrong about that. It's amazing that the ARM CPUs developed in the '80s are used (albeit upgraded and sped up) in today's mobile phones, tablets, and whatnot.

To be fair, the 68000 was a 16/32-bit processor, which is why the ST was named the ST (Sixteen/Thirty-two). Then again, the 68000 was also released in 1979. Jay Miner wanted the successor to the Atari 8-bit line to use it way back then, but Atari fumbled about, so he took off with some others and made the Amiga... If that hadn't happened, it's possible the Amiga would have launched a year or two earlier and been an Atari computer, and... well, it probably still would have died, since Commodore and Atari kept driving each other's prices down so far they couldn't make proper profit margins.

If Acorn had jumped into the fray, maybe they would have been able to compete more with Apple and take their spot? Atari and Commodore were already on some weird quest to kill each other off, especially after Jack was kicked out of Commodore and bought the bleeding Goliath from Warner...

  • Like 1
