
Amiga 1200 vs Atari ST.



All use of these systems at this point is for nostalgic reasons. A clearly faster computer came out from Compaq in '86 with an 80386 processor, a better sound machine was introduced in '86 with the Apple ][GS, and better graphics were available in '84 with EGA offering 640x350 in 16 colours.

 

Like whichever one you want the most, but any superiority is only true inside one's head.

I have to disagree about EGA being better. It had a smaller color palette, and not every EGA card supported 640x350 @ 16 colors; some only supported 640x350 @ 2 colors. Also, being ISA-based, it was very slow.

 

The first graphics standard that was clearly better was MCGA in '87 (and the better-known VGA from the same year). But the speed problem wasn't resolved until later bus architectures with higher bandwidth, like VESA Local Bus and PCI.


I see your point. As you said, I think "clearly better" is what VGA represents.

 

How about "the least bad of what was available for a high-res colour display" as what EGA would represent in '85, compared to the Mac, ST, and Amiga?

EGA was definitely not a superset of the capabilities of the others.

 

I think I am pushing it a little, but I am trying to imagine '85 again (talk about nostalgia... heh heh heh).


VGA is where graphics on the PC started to be really good and competent. But in the early years it was very slow in full-colour mode. I had a 1MB Trident ISA VGA card around 1992. In true-colour mode it took some 5 seconds just to fill the entire screen, and not at some large resolution (I don't remember what it was exactly). That card could have been faster with better RAM, but then it would have cost much more. As I said, it's all mostly about RAM prices and speed.

Modern cards have more RAM than a full truckload of Amigas and Ataris.

It's easy now to say that the PC is superior. But the truth is that they all learned from each other. The crucial point was when manufacturers (mostly in Taiwan) started making custom chipsets for PC motherboards. That was exactly in the years when Atari and Commodore went out of business. People wanted more standard machines, peripherals, compatible software, and speed. Add to that, Motorola started to lose against Intel in those years too. And I say that as someone who never liked Intel CPU coding.


Not one developer ever liked Intel programming, especially the stack and addressing/memory-map related items. And. Yet. The ENTIRE INDUSTRY embraced the model and supported it at every turn. The 286 was loathed and everyone wanted the 6809, 68000, or PowerPC. And.. yet.. x86 rules the roost. So someone is bullshitting someone and hating just to hate, because bandwagon.


Not one developer ever liked Intel programming, especially the stack and addressing/memory-map related items. And. Yet. The ENTIRE INDUSTRY embraced the model and supported it at every turn. The 286 was loathed and everyone wanted the 6809, 68000, or PowerPC. And.. yet.. x86 rules the roost. So someone is bullshitting someone and hating just to hate, because bandwagon.

Simple. Developers don't code for what they like, they code for where the sales are.

 

Anyway, I think relatively few developers had to touch x86 assembly, as the PC has always had a broad selection of high-level languages.

 

And the 286 did suck for assembly language (it's what I learned on). The 386 improved things somewhat.


I suppose so. I remember that in x86 class back in the day, the most distracting things were the groans and the frequent F*** YOUs, most of them surrounding memory and segments.

 

But someone had to like the platform and recognize something good in it for it to rise above the other 16-bit rigs. What was the force that overcame the crap memory model? I always thought it was "standards" and consistency, stuff partly enforced by the BIOS. For example, all video cards answered the same interrupt vectors, and that didn't change from Compaq to Packard Bell to Gateway. All PC XT/AT standards.
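To make that "standards enforced by the BIOS" point concrete, here's a minimal sketch, assuming a DOS-era Borland/Turbo C toolchain (where dos.h and its int86() helper are available): the same INT 10h BIOS call set the video mode no matter whose clone it ran on.

#include <dos.h>

/* INT 10h is the BIOS video service; function 00h in AH sets the video mode.
   Because every PC clone implemented the same interrupt interface, this code
   did not care whose hardware it ran on. */
void set_video_mode(unsigned char mode)
{
    union REGS regs;
    regs.h.ah = 0x00;            /* function 00h: set video mode                    */
    regs.h.al = mode;            /* e.g. 0x03 = 80x25 text, 0x13 = 320x200x256      */
    int86(0x10, &regs, &regs);   /* raise the software interrupt                    */
}

int main(void)
{
    set_video_mode(0x13);        /* hypothetical demo: switch to MCGA/VGA mode 13h  */
    /* ...draw something here... */
    set_video_mode(0x03);        /* and back to the standard text mode              */
    return 0;
}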

 

Perhaps it was the highly professional marketing and no references to kiddie games?


I suppose so. I remember that in x86 class back in the day, the most distracting things were the groans and the frequent F*** YOUs, most of them surrounding memory and segments.

Yup, I never touched x86 assembly after my assembly class

 

 

But someone had to like the platform and recognize something good in it for it to rise above the other 16-bit rigs. What was the force that overcame the crap memory model? I always thought it was "standards" and consistency, stuff partly enforced by the BIOS. For example, all video cards answered the same interrupt vectors, and that didn't change from Compaq to Packard Bell to Gateway. All PC XT/AT standards.

 

Perhaps it was the highly professional marketing and no references to kiddie games?

Well consider the audience for the Amiga/ST/Mac:

 

* multimedia enthusiasts (Amiga)

* Musicians (ST)

* Designers and desktop publishers (Mac)

* Gamers who weren't smitten by the NES

 

contrast that to the PC audience:

 

* business apps

 

When was the last time your boss sent you a spreadsheet or word doc? When was the last time he/she sent you a MIDI file or CAD drawing? The former was and still is a much more common use of computers. People wanted to bring their work home and they wanted 100% compatibility with their PC at work. If they couldn't afford $4000 for a genuine IBM, maybe they could swing $1000 for a clone.

 

What people don't seem to realize is that this type of buyer wasn't wowed by the slick multimedia demos that the Amiga/ST could produce, because they were not computer enthusiasts. They simply saw the computer as a means to an end. The number of sound channels/colors/MHz/RAM wasn't as important to them as whether it could run Lotus 1-2-3.

 

So I think it was a combination of the PC having all the serious apps and being an open architecture that led to its domination. The open architecture led to competition and innovation. Atari truly offered "Power without the Price" in 1985, but by the early '90s that had flipped and PC hardware was cheaper.

 

Also, the horrible memory model affected asm programmers, but if you were developing in C/C++, the PC had better compilers and development tools for that. And after the 386, you could code in C and not have to worry about memory segmentation in your code. After Windows 95, the TSRs were gone, and the weird memory model didn't affect end-users anymore either.


Not one developer ever liked Intel programming, especially the stack and addressing/memory-map related items. And. Yet. The ENTIRE INDUSTRY embraced the model and supported it at every turn. The 286 was loathed and everyone wanted the 6809, 68000, or PowerPC. And.. yet.. x86 rules the roost. So someone is bullshitting someone and hating just to hate, because bandwagon.

 

Segmented memory was the devil. Six different memory models in Borland C...
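For anyone who never suffered through it, here's a tiny illustrative sketch, in plain portable C rather than real-mode code, of the segment:offset arithmetic that those Borland memory models (tiny, small, medium, compact, large, huge) existed to paper over:

#include <stdio.h>

/* In 8086 real mode a physical address is built as segment * 16 + offset,
   producing a 20-bit address from two 16-bit values. */
static unsigned long linear_address(unsigned short segment, unsigned short offset)
{
    return ((unsigned long)segment << 4) + (unsigned long)offset;
}

int main(void)
{
    /* Two different segment:offset pairs that hit the same physical byte.
       This aliasing is why comparing or normalizing far pointers was such
       a reliable source of bugs. */
    printf("1000:0010 -> %05lX\n", linear_address(0x1000, 0x0010));
    printf("1001:0000 -> %05lX\n", linear_address(0x1001, 0x0000));
    return 0;
}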


This is oddly relevant as I did this.

 

Quick personal history: I had an ST back in the day for college work, BBSs, and of course gaming. I modded the shit out of it; it was the first machine I ever went to town on. It was in a tower case, had an internal SCSI HDD and a 16MHz processor, and was pretty silly. My brother later got an Amiga 500, so we had one of those too. Despite personally having way more history with the ST, even at the time the superiority of the Amiga was pretty obvious.

 

Fast forward to last year and I'm on my nostalgia trip. I land myself an A1200 and am just amazed at how nicely it all still works even today. Its handling of hard drives and software is still slick and impressive, and the system is a joy to use. I also picked up an ST (well, 3...) and proceeded to try and recreate something along the lines of my original beastie, and after the Amiga build I just found it painfully clunky. In the end the ST I kept was a simple 1040STF with a Gotek. It plays the few early games that were better than the versions ported to the Amiga, and that's about all I use it for.

 

The A1200, however, sits on my desk next to my current main computer. It's just bloody lovely. The later games are way better than the ST equivalents. As for compatibility, the A1200 ain't no STE: just about everything works, and there are also a lot more enhanced games for the AGA machines. Running from internal CF is a doddle.

 

It's not all perfect; there are some quirks. If you want to run WHDLoad software you're gonna need a 4MB expansion for the trapdoor, and they're quite expensive. You could put an 8MB one in there, but that'll stop the PCMCIA port working (which you can use for Ethernet or a CF card reader to transfer stuff to it). They're also prone to leaky caps (as are the A600, A4000, and CD32), so a cap job is becoming pretty much mandatory. Finally, they're getting damned expensive!

 

But otherwise I think the performance of the machine is pretty much nailed-on perfect for most games. Its speed bump over an A500 is really nice for the games that benefit from it, but it's not so big that it makes other games unplayable. Adding a CF HDD to it costs buttons too, and it makes it a joy to use when you're done.

 

Basically it's my favourite computer of all time, and I wish I could have afforded an A1200 back in the day. The fact that it's still so much fun now says a lot about just how awesome it is. The ST lives next to it mostly for nostalgia reasons: Super Sprint and Oids. That's pretty much it.

I am the opposite: I mostly game on my STE, and the A500 mostly gathers dust except for a few games now and then.


When I said that I did not like x86 ASM programming, I did not mean because of segmented memory; I never wrote anything that large. It is just that the 68000's registers and memory addressing modes are better structured. The 68000 is true 16-bit, while x86 has some 8-bit characteristics. A big part of the possible 16-bit combinations are valid opcodes, and that may be the reason why the 68000 was not easily expandable later: there was simply not enough space to add new registers and many new operations. Intel managed to keep its basic register model and expand it. Motorola was forced to go the RISC way (PowerPC). But today's technology allows RISC speed with CISC CPUs.

 

I don't agree that the target users for PCs were only business people; surely not after 1990. It became good for gaming when VGA, sound cards, and faster CPUs arrived. Prices of monitors went down too.

Real multimedia came a little later, with CD drives, and faster CPUs made quality video playback possible... I remember a talk with some Amiga fan around 2000, who bragged about how his accelerated Amiga 2000 (or something like that) could play video clips well. I asked at what resolution, and he said something about 160x100. I said 'sure, that was worth giving over a thousand DM for an accelerator'. By that time we could play DivX/DVDs on the PC just fine.


I'm fairly confident CISC achieved RISC speeds many years ago, and it continues to be the architecture style of choice. It has to be. Since speed and density are hitting walls today, the way forward is more capable instruction sets and more parallelism.

 

It should also be noted that a modern Intel microprocessor is really a mix of RISC and CISC if you adhere to the traditional (and outdated) definitions. There's a microprocessor within the microprocessor executing uops. The definitions of RISC/CISC have become bloated in an attempt to cover the rich mix of features and styles in today's microprocessors, so I don't even think they're an applicable way of describing the architectures. It's become like discussing a 3-bladed vs a 4-bladed propeller. In the jet age.

 

The 1990s were the decade of "Multi-Media"! All those sound card and CD-ROM bundle kits. There was a HUGE push into the consumer space. Fer'chrissakes, even my local drugstore had them for sale!

 

I, too, remember that postage-stamp video stuff. While the PC had it too, the PC showed signs of evolving and increasing its capabilities. And it did. Whereas the Amiga had stagnated. Yes, a 1000DM accelerator for video, bwahahahaha!!


I don't agree that the target users for PCs were only business people; surely not after 1990. It became good for gaming when VGA, sound cards, and faster CPUs arrived.

After 1990, yes... but the PC would never have gotten to that point if it hadn't been for all the people who bought them in the '80s mostly to run business apps. If you look at most PC games from the '80s, they were inferior to the other ports. It certainly wasn't gamers buying them in the '80s.


I meant the term RISC regarding execution times, or better said, cycle counts for instructions. The latest CPUs are actually faster than classic RISC CPUs and can even execute multiple instructions in one clock cycle. That of course needs a lot of logic, but there are millions and millions of gates for it.

 

Surely, the PC was not targeting gamers in the beginning, and IBM did not have big ambitions for the whole project. But it went in that direction. I guess the market was what made it so; gamers are good customers :)


Surely, the PC was not targeting gamers in the beginning, and IBM did not have big ambitions for the whole project. But it went in that direction. I guess the market was what made it so; gamers are good customers :)

It went that way because IBM left the architecture open. That meant graphics card companies could compete with each other for better tech, and the same with sound. That's not something that can happen easily with closed systems. This led to tech innovating faster in the PC space, and the economies of scale drove prices down.

 

By the '90s, PCs had better and cheaper game tech. Makers of closed computer systems could not compete (except Apple, barely).


 

Then I am confused why people answered the thread at all. Only to ignore the OP and answer questions that weren't asked at all... for 4 pages?

 

I answered his question because I happened to have done the same thing recently (being more invested in the games than the platforms running them). I didn't think it was stupid or provocative.

 

This is nothing. Do a search for the infamous A8 vs C64 thread in the A8 forum hahah

 

That goes on for dozens of pages.

 

Quite amusing when you realize people are arguing angrily for pages about two machines that have been dead for 30 or so years


 

This is nothing. Do a search for the infamous A8 vs C64 thread in the A8 forum hahah

 

That goes on for dozens of pages.

 

Quite amusing when you realize people are arguing angrily for pages about two machines that have been dead for 30 or so years

 

And does this death make arguing any less fun?


This is nothing. Do a search for the infamous A8 vs C64 thread in the A8 forum hahah

 

That goes on for dozens of pages.

 

Quite amusing when you realize people are arguing angrily for pages about two machines that have been dead for 30 or so years

Well after 30 years, you'd think some people would finally admit defeat and realize the other computer RULEZ ALWAYS AND FOREVER!!

 

but noooo..... :P


 

That would be part of the speed difference.

 

My understanding is that even in 16-colour 320x200, the video processor halts the 68000 of the Amiga to get the screen drawn.

(and this increases to a worst case with the dual scrolling mode)

 

This doesn't make the Amiga bad, but it means a busy Amiga's 68000 ends up more than the bare 8 MHz / 7.16 MHz clock ratio slower than an ST's 68000.
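As a rough back-of-the-envelope sketch of that claim (the 10% DMA-steal figure below is an assumed number for a busy display, not a measurement), the clock difference alone is only about 12%, and it's the chip RAM contention that widens the gap:

#include <stdio.h>

/* Effective throughput = clock speed minus whatever fraction of bus cycles
   the display/blitter DMA takes away from the CPU. */
static double effective_mhz(double clock_mhz, double dma_steal_fraction)
{
    return clock_mhz * (1.0 - dma_steal_fraction);
}

int main(void)
{
    double st    = effective_mhz(8.00, 0.00);   /* ST 68000: bus to itself            */
    double amiga = effective_mhz(7.16, 0.10);   /* Amiga 68000: assumed 10% stolen    */

    printf("Clock ratio alone : %.3f\n", 8.00 / 7.16);   /* ~1.117 */
    printf("With assumed steal: %.3f\n", st / amiga);    /* ~1.241 */
    return 0;
}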

 

 

Saying the Amiga does better than the ST for games is mostly down to the eventual popularity of horizontally scrolling games in the later '80s (driven strongly by arcade games and NES games).

 

If polygon games had become the "in thing" instead, the ST would have been considered the better games player now.

 

 

Both are fun though!

 

The GPU hogging the bus on the Amiga is an option. You can switch it to an interrupt mode instead, let the CPU take bus cycles, and still run software; it's up to the programmer. That, and the blitter and 68000 are cycle-interleaved.

https://archive.org/stream/byte-magazine-1985-12_201502/BYTE-1985-11_Vol%2010-12_Graphics_Hardware#page/n189/mode/2up


It went that way because IBM left the architecture open. That meant graphics card companies could compete with each other for better tech, and the same with sound. That's not something that can happen easily with closed systems. This led to tech innovating faster in the PC space, and the economies of scale drove prices down.

 

By the '90s, PCs had better and cheaper game tech. Makers of closed computer systems could not compete (except Apple, barely).

 

I would argue with "companies could compete with each other for better tech".

 

It can be said in another way:

 

--== it took 5 YEARS for the _open_ PC to catch up with the _closed_ Atari, Amiga, and Mac... ==--

 

The superiority of non-PC computers can be seen in the software made for them: the majority of today's high-end programs and software packages started their life on non-PC computers (http://www.atari-forum.com/viewtopic.php?t=22856).

 

And one more note:

even if you are prone to a capitalist view of the world, you cannot celebrate the PC as something "great", since the PC KILLED all the alternatives (it was a slow process, but eventually it happened).

Link to comment
Share on other sites

The successful programs of today had to get away from the platforms they started on. The Atari, Amiga, and other systems of the time were stagnating and stillborn, with minimal upgrade paths, if any. Held back by custom chips.

 

 

Of course, but why were these programs not developed on the PC in the first place?

Because the PC sux, and it took at least five years for the PC to catch up with the Atari, Amiga, and Mac...


Things can change to the opposite in a few years. The problem with the PC was that it was just an expensive concept: a large motherboard with slots. That costs money. But that is what a really open architecture is.

And that was partially done with the Mega ST, Mega STE, and TT. Even the Falcon has an expansion slot, but it is limited because of the lack of space in a keyboard computer.

What was a disadvantage around 1983-89 became an advantage in later years. People wanted to expand their machines, and that was just problematic and expensive on machines without expansion slots. The Sinclair Spectrum made the 'wise' choice of doing expansion without a connector: why spend money on a slot when the connector can be on the expansion itself? Never mind that it then has to be on every expansion... Good for them, not for the customer.

And who had a number of expansion slots in their computers? The two survivors: the PC and Apple.

I mentioned that the 68000 CPU had a very well designed instruction set and register architecture. And that was probably the reason why it was discontinued: it was not easily expandable. In fact, it's the same as the Atari ST: a very optimized design, powerful, not expensive, but expansion was always a problem and needed extra riser boards, extra power supplies, and the like.

Unfortunately, later models like the Mega ST and Mega STE did not really go the way of easy expansion; they used different expansion slots, for instance. VME was just too expensive. I must say here that the Amiga did better with its Zorro (or whatever it was called) universal expansion slot.


 

 

Of course, but why were these programs not developed on the PC in the first place?

Because the PC sux, and it took at least five years for the PC to catch up with the Atari, Amiga, and Mac...

You may repeat that the PC sux a zillion times, and other things, but that only indicates that you are biased. It's all about what the user can get out of his computer. Don't like Windows? Use another OS.

 

It's not true that the PC killed the alternatives; it is just that manufacturing in large numbers makes prices lower. Atari also killed many smaller manufacturers at the time, ones which had launched maybe even better-designed 16-bit machines but could not compete. Remember, Atari is not from communist East Germany but from the USA, plus also from non-communist Taiwan.

Competition is a good thing. Not ideal, and it must be controlled, but still much better than what we had here in the years up to 1991, which was practically nothing in computer development.

 

In the beginning there was simply no real competition between the Atari, Amiga, and PC: different target customers, different target software. The Mac was closest to the PC, with the first Macs oriented mostly towards serious software.

From 1985 Atari wanted to attract people to buy its machines instead of the PC and Mac, but it failed, mostly right in the USA. Just because of misjudging the market, and because of the slow launching of new models. Then came some panic decisions; the worst was probably launching the Atari PC, which was like admitting that the war was lost...

