
Atari and Microsoft and the ST



From what I know, PC software started catering to non-standard (or non-uniform, rather) hardware configurations pretty early on to address the growing clone market

You're going to get pounced on by us old-timers (am I really so old?) who worked on DOS software.

 

A few manufacturers had non-clones, and there are a few sad tales of "90% clones" being marketed as clones, but that all cleared up by 1985.

 

By 1985, everything calling itself a PC clone was 100% compatible with the IBM PC at a memory map and register level. Sure, some hardware had extra features, and some software could use them, but the core was exactly the same.

 

The PC clones had to be perfect, because every last PC register was directly accessed by at least some popular software. Early on, a few programs tried to 'cater' to non-clones by using DOS/BIOS for everything, which produced sluggish, horrible experiences. Once those guys were slaughtered by hardware-banging competitors like 1-2-3 and WordPerfect, the lesson was learned. Direct hardware access became the industry standard.
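For the curious, here's roughly what that difference looked like in practice -a minimal sketch, assuming a Borland-style 16-bit real-mode DOS compiler (MK_FP and int86 from <dos.h>); the routine names are just illustrative:

/* Sketch: BIOS text output vs. poking video RAM directly.
   Assumes a Borland-style 16-bit real-mode compiler where MK_FP builds a
   far pointer; 0xB800:0000 is the CGA/EGA/VGA color text buffer. */
#include <dos.h>

#define TEXT_COLS 80

/* Slow, "portable" path: one BIOS call (INT 10h, AH=0Eh) per character. */
void bios_putchar(char c)
{
    union REGS r;
    r.h.ah = 0x0E;      /* teletype output */
    r.h.al = c;
    r.h.bh = 0;         /* display page 0 */
    int86(0x10, &r, &r);
}

/* Fast path: write character+attribute words straight into video RAM,
   the kind of trick the hardware-banging programs relied on. */
void direct_puts(int row, int col, const char *s, unsigned char attr)
{
    unsigned int far *vram = (unsigned int far *)MK_FP(0xB800, 0);
    unsigned int off = row * TEXT_COLS + col;
    while (*s)
        vram[off++] = ((unsigned int)attr << 8) | (unsigned char)*s++;
}

int main(void)
{
    direct_puts(0, 0, "Hello from video RAM", 0x1F);   /* bright white on blue */
    bios_putchar('!');
    return 0;
}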

 

Even as late as 1995, you could buy Pentiums with 'turbo' buttons on the front panel, intended to help users with software that still had hardcoded loops. And don't get me started on the A20 gate, still present in the newest Sandy Bridge!
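The 'hardcoded loops' in question were usually nothing more exotic than this kind of thing -a minimal sketch, with an invented constant of the sort tuned by eye on a 4.77 MHz 8088; on anything faster the delay collapses, which is exactly what switching the turbo button off papered over:

/* Sketch of a hardcoded timing loop. The constant is a made-up "magic number"
   like those calibrated on the original 4.77 MHz PC; on a faster CPU the same
   loop finishes in a fraction of the time, so animation speed and timing break. */

#define LOOPS_PER_TICK 10000L   /* illustrative value, "tuned" on a 4.77 MHz 8088 */

static void crude_delay(int ticks)
{
    volatile long i;            /* volatile so the compiler can't optimize the loop away */
    while (ticks--)
        for (i = 0; i < LOOPS_PER_TICK; i++)
            ;                   /* burn cycles: the duration depends entirely on CPU speed */
}

int main(void)
{
    crude_delay(18);            /* "about one second"... but only on the machine it was tuned on */
    return 0;
}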

 

I'm trying to think of a PC register obscure enough not to be accessed directly by a lot of popular software, and I'm coming up blank.

 

Plenty of major titles directly accessed the floppy disk controller (and thus DMA system) to implement copy protection routines. The BIOS could only play one note, so any other sound/music required register access to the timer/counter. BIOS serial and parallel routines were too slow and inflexible, so everyone directly programmed those, and of course the associated interrupt controller. Games tended to directly access the keyboard registers and even use their own mouse drivers. And Doom famously used undocumented (probably unintended) register combinations in the VGA controller, and this worked on everybody's PCs in the 90s.
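As a concrete example of the timer/counter bit -a minimal sketch of the usual PC speaker tone routine, assuming Borland-style outportb()/inportb() from <dos.h>; the function names are just illustrative:

/* Sketch: register-level PC speaker sound, since the BIOS offered no audio
   services. Programs 8253/8254 PIT channel 2 as a square-wave generator and
   gates it onto the speaker through port 61h. */
#include <dos.h>

#define PIT_CLOCK 1193182UL     /* PIT input clock in Hz */

void speaker_on(unsigned int hz)
{
    unsigned int divisor = (unsigned int)(PIT_CLOCK / hz);

    outportb(0x43, 0xB6);                   /* select channel 2, lo/hi byte, mode 3 (square wave) */
    outportb(0x42, divisor & 0xFF);         /* divisor low byte */
    outportb(0x42, (divisor >> 8) & 0xFF);  /* divisor high byte */
    outportb(0x61, inportb(0x61) | 0x03);   /* enable speaker gate and data bits */
}

void speaker_off(void)
{
    outportb(0x61, inportb(0x61) & ~0x03);  /* silence */
}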

 

That's how slavishly compatible PCs had to be.

 

Even today, if you go buy a brand new $3,000 Mac, those VGA registers are there. They're being emulated in firmware (system management mode), but the emulation is 100% accurate, especially the combinations that make Doom work. ;)

 

- KS

Edited by kskunk

I missed this before:

 

But to write a firmware/hardware based PC emulator running on that x86 is simple enough

It's only simple to emulate PC hardware in software if you have hardware support for PC register emulation. The 80386 was the first x86 chip to include this (Virtual 86 mode).

 

Some people added their own "virtual 86" hardware to 8088 and 286 based machines. I had an old 8086-based Wang PC with a hardware "emulator card" that worked this way. When IBM PC registers were addressed, it would detect the access and trip an NMI that allowed software emulation of the missing hardware. (The Gravis Ultrasound also emulated Sound Blaster that way.)
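Conceptually, both the Wang card's NMI trick and the later Virtual 8086 approach boil down to trap-and-emulate: the access to the missing device gets trapped, a monitor decodes it, and a software model of the device responds. A minimal host-side sketch of that idea (all names invented, and ordinary C rather than a real trap handler):

/* Conceptual sketch of trap-and-emulate. When the guest program touches a PC
   port that isn't physically there, the monitor gets control (via NMI on the
   Wang card, via a fault in Virtual 8086 mode), hands the decoded access to a
   software model of the device, and then resumes the guest. */
#include <stdint.h>
#include <stdio.h>

typedef struct {
    uint8_t latch_low;          /* toy state for one emulated device register */
} fake_timer_t;

static fake_timer_t timer;

/* Called by the (hypothetical) trap handler with the decoded port and value. */
static void emulate_port_write(uint16_t port, uint8_t value)
{
    switch (port) {
    case 0x42:                  /* PIT channel 2 data port */
        timer.latch_low = value;
        printf("emulated PIT write: %02X\n", value);
        break;
    default:                    /* "100% compatible" means this case never fires */
        printf("unhandled port %04X\n", port);
        break;
    }
}

int main(void)
{
    /* Pretend the guest executed OUT 42h,AL with AL=55h and we trapped it. */
    emulate_port_write(0x42, 0x55);
    return 0;
}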

 

That emulator card also hardware-remapped memory into PC-compatible locations. PC software hardcodes access to video memory somewhere between 640KB and 768KB. If it's not at the right byte in the right format, you get no screen.

 

Before the 80386, you had to design your hardware with PC compatibility in mind from the start. This is one of the many reasons people call PC/DOS programming a big 'kludge'. :)

 

Compare how easy it is to run Mac software on any 68K machine...

OK, this explains why I got the wrong impression . . . I hadn't realized the 386 (onward) had that sort of hardware support built-in. (I wonder how much silicon went into that)

 

Also, it's interesting to note that while some programs have allowed virtualization for emulation of older PC hardware/OSs with relatively limited CPU overhead (VDMSound comes to mind), DOSBox definitely does not: it's total software emulation of the entire system (ie it gains no advantage from running on x86 compared to a PowerPC-based machine), and the programmer has specifically said there are no plans to ever support virtualization to improve performance. (Hence why you can barely get fast-486 or slow-Pentium level performance on a modern 2 GHz system -and I'm not sure if there's decent multi-core support yet either.)

 

a number of programs that bypassed DOS and used the hardware directly . . . some bypassing the BIOS even, but that eventually dissipated.

It only dissipated because all hardware became 100% PC compatible. Professional DOS programmers bypassed the BIOS as a rule, at least for text, graphics, serial, and printing. The BIOS was just too crappy and slow. Sound never got BIOS support, so you just banged the registers until it worked.

 

In the late 90s, I helped a company move their program from DOS to Windows. Their DOS code was a nightmare. It never touched the BIOS. It had special 'quick scroll' code that abused the CGA video registers, and even read the keyboard using register access to the underlying scan codes. It didn't allocate memory -- it just 'knew' what areas would be safe to overwrite. Hopefully your emulator (or TSR for that matter) wasn't in one of those areas.
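For reference, 'reading the keyboard using register access' meant something like this -a minimal sketch assuming Borland-style inportb(); real programs usually hooked IRQ 1 (INT 9) and read the port inside the handler rather than polling:

/* Sketch: bypassing the BIOS keyboard services and reading raw scan codes
   straight from the keyboard controller ports. A scan code with bit 7 set
   means key release. */
#include <dos.h>

#define KBD_DATA   0x60   /* raw scan code appears here */
#define KBD_STATUS 0x64   /* bit 0 set = a byte is waiting (AT-class controller) */

unsigned char read_scan_code(void)
{
    while (!(inportb(KBD_STATUS) & 0x01))
        ;                               /* busy-wait until the controller has a byte */
    return inportb(KBD_DATA);
}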

 

Windows emulated all of that perfectly... which makes me think most DOS programmers never learned to stop kludging!

OK, so prior to the 386 (or special hardware register emulation), you either had 100% memory map/register level compatibility, or you'd lose a lot of compatibility with DOS software in general?

 

And the PCjr and Tandy 1000 are register-level compatible with the 5150/XT/AT/etc? I think this is part of what threw me off on the topic: http://www.classiccmp.org/dunfield/pc/index.htm "The Tandy 1000 series was a very early line of IBM compatibles which didn't try "too hard" to be fully IBM compatible."

 

 

Hmm, the 386 would also have been pretty convenient for bridging the differences between NEC's PC-9801 and IBM compatibles (aim at a PC-9801-compatible derivative using the 386 to greatly simplify making the machine DOS/PC compatible) . . . and it was the early 90s when DOS compatibles displaced NEC's monopoly in Japan. (The main reason cited for that is Japanese text getting support in DOS/V, but I wonder if the 386's growing popularity in lower-end machines around that time was also a factor.)

 

In that case, I was still generally correct in my argument against oky's comment about a DOS compatible machine, but not quite for the right reasons . . . or maybe for the right reasons, but I was thinking of "100% PC compatible" in the wrong sense. (If you need register-level compatibility of the video hardware -which was what I was assuming- it wouldn't make sense not to also assume register-level compatibility of the I/O, interrupt controller logic, timers, RAM, etc. The exception I had in mind was certain expansion devices -like sound cards- that aren't mapped to a fixed/specific I/O or DMA address/channel and need any program using them to be installed/set up with the corresponding information. I was thinking more in the context of what you have to deal with when installing DOS software, especially games.)

 

That would also totally explain why the 80186's embedded logic is useless for a PC compatible, but it still wouldn't explain why a 186 wouldn't make for decent middle ground between the 8086 and 80286 on price/performance. (Or why Intel didn't release a version of the 186 that removed the unnecessary logic -or at least the external pins/traces for those features- in an 8086-compatible pinout/package. Or, if Intel didn't want to compromise 286 sales by doing that, why 3rd-party 186 manufacturers didn't introduce a lower-cost version. Granted, the V20/V30 may be just that, though I'm not sure whether it's related to the 186 or an independent enhancement of the 8086.)

 

 

So in that sense, if you WERE to build a "special" PC clone/compatible, you'd have to start with the PC-compatible memory map and the lowest-common-denominator standards for display/audio/peripheral interface logic. (Which would also mean my musing on going a step further with Tandy-like extended CGA would probably be the most cost-effective option . . . if you built directly on CGA, you'd still have to deal with the odd interleaved screen addressing, but at least you'd have the more CPU-blit-friendly 4-bit packed pixels rather than planar graphics. Granted, you'd still have to move graphics in a byte-addressed manner to get any decent speed -no bit-manipulation stuff- but working with pairs of pixels on a virtual 160x200 grid is a lot more flexible than working with 8 pixels at 40x200; not to mention the benefit of hardware scrolling on top of that.)
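For reference, the "odd interleaved screen addressing" amounts to something like this -a minimal sketch of a pixel plot for the standard CGA 320x200 4-color mode, assuming a Borland-style compiler with MK_FP; Tandy's 16-color mode applies the same idea across four 8 KB banks with 2 pixels per byte:

/* Sketch of CGA's interleaved framebuffer: in 320x200 4-color mode, even
   scanlines start at B800:0000 and odd scanlines at B800:2000, 80 bytes per
   row, 4 pixels per byte, leftmost pixel in the high bits. */
#include <dos.h>

void cga_plot(int x, int y, unsigned char color)    /* color: 0..3 */
{
    unsigned char far *vram = (unsigned char far *)MK_FP(0xB800, 0);
    unsigned int offset = (y & 1) * 0x2000u + (y >> 1) * 80 + (x >> 2);
    int shift = (3 - (x & 3)) * 2;                  /* position of this pixel within the byte */

    vram[offset] = (vram[offset] & ~(0x03 << shift)) | ((color & 0x03) << shift);
}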

 

 

 

 

On the other hand, that would also mean that, for the case of the "simple" x86 emulator board, you'd either need some embedded hardware register-emulation logic (assuming this is before you could get cheap 386SXs), non-embedded PC hardware on that emulator board, or an extremely tight software emulation solution on the ST/68k end handling register emulation. (Then again, it should still be better than trying to do a full PC emulator in software on the 68000 -they did have that for the ST in the late 80s iirc, but it was extremely slow -like less than 1/4 the speed of a PC/XT.)

Or perhaps they could rig up a hardware emulator board that only used the most problematic/necessary components, and did the rest in software on the 68k end. (and you'd need some level of software bridging emulation to make use of the ST's peripheral interfacing anyway)

 

Though, again, the most productive option for PC "compatibility" on the ST probably would have been an effort by Atari to get as much cross-platform file/format-level compatibility as possible between ST and PC applications. (Of course, Atari themselves could only do so much to promote that . . . but as it was, they didn't push that angle at all. And of the 68k platforms of the time, the ST definitely would have been the most straightforward for PC file-format compatibility: the disk formatting was already basically compatible on the hardware end. Early on you'd have had a high percentage of 5.25"-specific PC software against Atari's single-sided 3.5" disks, but by the end of the 80s both platforms were commonly supporting double-sided 3.5" disks.)

 

The ST wouldn't even need to get actual multiplatform versions of all the major PC applications, just applications that could read/write files in compatible standard/de facto standard formats to what was being used on the PC.

 

 

 

A few manufacturers had non-clones, and there are a few sad tales of "90% clones" being marketed as clones, but that all cleared up by 1985.

OK, as I said I missed your previous post (and I think that pretty much cleared up my misconceptions)

 

The Tandy 2000 would be among those unfortunate "90% clones", right?

 

By 1985, everything calling itself a PC clone was 100% compatible with the IBM PC at a memory map and register level. Sure, some hardware had extra features, and some software could use them, but the core was exactly the same.

OK, so the only exceptions might be a few very odd cases where software was critically timing-sensitive to the 4.77 MHz CPU speed (albeit a lot of early 8088/8086-based clones had a 4.77 MHz mode for further compatibility -iirc both Tandy's and Atari's 7/8 MHz 808x machines supported that), and maybe a few other really odd cases, but that's separate from the context of general register-level compatibility with the PC.

 

The PC clones had to be perfect, because every last PC register was directly accessed by at least some popular software. Early on, a few programs tried to 'cater' to non-clones by using DOS/BIOS for everything, which produced sluggish, horrible experiences. Once those guys were slaughtered by hardware-banging competitors like 1-2-3 and WordPerfect, the lesson was learned. Direct hardware access became the industry standard.

There were pretty few OSs at that time that were remotely efficient enough to really make high-level programming attractive. (Wasn't the BBC Micro among the exceptions? I seem to recall reading that much of its software was programmed against Acorn MOS thanks to its extremely efficient and feature-rich API for the time; the fact that it had a 2 MHz 6502 with no wait states also made it pretty powerful for its day.)

 

Even as late as 1995, you could buy Pentiums with 'turbo' buttons on the front panel, intended to help users with software that still had hardcoded loops. And don't get me started on the A20 gate, still present in the newest Sandy Bridge!

Those turbo buttons would do what exactly? (I'd think hardcoded loops would most problematically manifest as highly timing-sensitive programs -ie ones that run way too fast on CPUs with high clock speeds and/or short cycle times. I know that was a persistent problem in some DOS games into the early 90s . . . and even a number of Windows games, though I'd think some of those are for different reasons, especially odd cases like Wipeout XL, where the 256-color software renderer plays fine on a 2-3 GHz machine, but the high-color software renderer and the hardware renderer run way too fast even on an 800 MHz system.)

 

Plenty of major titles directly accessed the floppy disk controller (and thus DMA system) to implement copy protection routines. The BIOS could only play one note, so any other sound/music required register access to the timer/counter. BIOS serial and parallel routines were too slow and inflexible, so everyone directly programmed those, and of course the associated interrupt controller. Games tended to directly access the keyboard registers and even use their own mouse drivers. And Doom famously used undocumented (probably unintended) register combinations in the VGA controller, and this worked on everybody's PCs in the 90s.

 

That's how slavishly compatible PCs had to be.

Heh, so aggressive cost reduction would all have to be done via licensed/reverse-engineered consolidation of those discrete components of early PCs (be it embedded by the motherboard manufacturer, graphics card maker, etc), and then you'd have to manage volumes high enough to make that newly integrated hardware competitive on the mass market . . . especially as the PC clone market really exploded. (Again, Atari supposedly gave up trying to push custom, consolidated, single-board PCs in favor of off-the-shelf ones because of that cost issue -granted, they started with the PC-1 in 1987, so the existing manufacturers already had a huge advantage. Tandy had started back in 1984 and continued to push a number of custom board/computer designs at least through the end of the 80s. They were also possibly the only PC vendor in the US to release console-form-factor machines, with the EX and HX, albeit not nearly as sleek or compact as Amstrad's PC-200 series -though Amstrad's systems were basic CGA machines which lacked any space for internal expansion and required you to let a naked ISA card stick out of the top. Perhaps if Tandy had licensed their format to Amstrad -and Amstrad hadn't pulled out of the low-end PC market during the DRAM crisis- Europe might have actually had a more significant home PC presence in the late 80s ;) Granted, the Tandy 1000 was still no Atari ST, but it was the only PC-compatible format with decent gaming support -or any sort of sound chip- until 1988, when EGA+Adlib started getting support.)

 

Hmm, actually, for Europe (especially for Amstrad), it might have made more sense to introduce a low-end PC compatible using a NEC V20 (native Z80 compatible) and including the Spectrum ULA and support for Spectrum compatibility modes. (You could do that for the CPC too, and that had a significantly better graphics architecture to build on -especially since CGA color artifacting won't work in PAL- but the Speccy was so much more popular and the Speccy ULA would have been cheaper.)

That, or they could do something closer to what OKY was suggesting, but with full PC compatibility (on a basic Sinclair PC200 level) and the Flare 1 chips tacked on for support of software specific to that platform. (After all, Amstrad had been involved with the Flare people briefly after Sinclair's computer business was bought out, they must have been aware of their ongoing development efforts, and the timing was right for the PC-20/200's 1988 launch -actually the first version of the Slipstream ASIC would have been nearing production at that point.)

It probably wouldn't have been price competitive with the ST or Amiga in '88, but might have still been attractive. (probably better off than trying to market the vanilla PC clones that Amstrad was pushing at the time)

 

 

Even today, if you go buy a brand new $3,000 Mac, those VGA registers are there. They're being emulated in firmware (system management mode), but the emulation is 100% accurate, especially the combinations that make Doom work. ;)

Hmm, would that mean you could still successfully install DOS (and run DOS applications) on a modern x86 machine? (Obviously, modern OSs have largely dropped DOS support -I think 64-bit Vista/7 dropped everything, but the 32-bit versions at least have DOS shell support for some DOS executables, just not full-screen mode . . . and I'm thankful for that, since I don't have to run QBASIC through DOSBox on my laptop ;))

 

 

 

 

 

 

 

 

 

 

 

 

None of this discussion has anything to do with Microsoft anymore. ;) . . . well except my last comment about OS compatibility . . . and the earlier implication of how DOS was too inefficient to be worthwhile for any significant percentage of software to be purely built around OS routines. (albeit, there wasn't much software that did that on the ST or Amiga either -I think the Archimedes may have been mainly high-level based, but that seems to be among the rare exceptions as with Acorn's previous machine -if that's even true about the Beeb)

Edited by kool kitty89

But to write a firmware/hardware based PC emulator running on that x86 is simple enough

It's only simple to emulate PC hardware in software if you have hardware support for PC register emulation. The 80386 was the first x86 chip to include this (Virtual 86 mode).

 


The ST had an 8086 and a 286 hardware + software combo that plugged a daughterboard into the CPU socket, so it was simple enough to do that; it would be even simpler to get DOS running on the 8086-based Atari machine a la the Konix console :)


Still, certain areas seemed to be "no-nos" on the hardware end, like using an 80186 (and making use of the onboard peripheral logic rather than conforming to IBM's standard configuration of timers/interrupt control/DMA logic). Though it would seem like using a 186 would still have been worthwhile with those features unused. (faster per-clock performance than the 8086 with some added instructions; not nearly as good as the 286, but much cheaper)

I actually had a 6 MHz 80186 XT accelerator card :D

 

PC software went right to the hardware well into the Windows era, long after the ST was introduced.

Yes, but with installation to account for varying memory maps, hardware changes, etc, etc. (ie not assuming the exact configuration of the original 5150 or PC-XT ;))

 

Actually, the opposite... you had to be register (including support chips), address and memory *exact* or stuff would break. Some software would explicitly check for extra hardware (a la Tandy 1000), but the core stuff had to be dead on.

 

That's more of a cost vs price issue. A lot of PC hardware didn't COST that much to manufacture, but it was priced for a high-end market with huge margins. (that and a lack of emphasis on tight, low-cost engineered designs) That's the same thing as the Mac, more or less. (the Mac128k in 1985 had a nominal price point roughly 4x that of the 520ST, but was almost certainly significantly cheaper to manufacture)

 

No, and sorry, no. In fact it DID cost a lot to manufacture PC hardware. The economies of scale that you see today simply did not exist back then. No (big) GALs, no all-in-one VLSI motherboard chipsets. Mad crazy expensive discrete memory chips and TTL everywhere. All this stuff cost big bucks... to produce and to purchase.

 

As for the Mac... I loved my ST, but the Mac had way more engineering than the ST ever saw, a built in display (which was NOT cheap in 1984), 'proper' support chips (serial, SCSI, etc)... that all costs money....


I hadn't realized the 386 (onward) had that sort of hardware support built-in.

Yeah, virtualization was complex, but demanded by Intel's customers. Because the only reliable way to multitask and/or isolate DOS programs was to emulate 100% of the underlying hardware.

 

You mentioned VDMSound as an example. Wikipedia defines VDM: "Virtual DOS machines rely on the Virtual 8086 mode of the Intel 80386 processor..." It's a handy feature!

 

a hardware emulator board that only used the most problematic/necessary components, and did the rest in software on the 68k

This is what PC/AT-Speed was. It looks like it freezes the x86 on any hardware access. While the x86 is frozen, the 68K swoops in to emulate the missing hardware. When the emulation is complete, the x86 is unfrozen. The x86 doesn't even know that time has passed.

 

This is a more elegant approach than even Virtual 86 hardware, but is more expensive since you need two CPUs.

 

The ST had an 8086 and 286 hardware + software combo that plugged a daughterboard into the CPU socket so it was simple enough to do that

Do you mean the PC-Speed? It wasn't that simple! It wasn't widely available until 1990, and it took a number of updates to improve the emulation quality to an acceptable level. Even then, it didn't do EGA or VGA, which were very popular by 1990!

 

it would be even simpler to get DOS running on the 8086 based Atari machine a la Konix console :)

Well, sure, everything is simple with 20/20 hindsight! :)

 

If Jack Tramiel wanted to build a PC compatible, his team would have to start with that intention from day one. It's expensive to add-on after a machine is already designed. Jack was cheap. (So were the PC clones.)

 

The first PC chipset didn't come out until after the ST was already shipping. (Anyone remember CHIPS?) Before PC chipsets, you had three choices for compatibility: massive engineering in custom ASICs (Tandy), or paying tons per unit for off-the-shelf chips (IBM), or relying on complex/buggy emulation to get 90% there (Wang).

 

Jack didn't have money for engineering, and wanted ST to be cheap. The timing just wasn't right for PC compatibility.

 

- KS

Edited by kskunk

I hadn't realized the 386 (onward) had that sort of hardware support built-in.

Yeah, virtualization was complex, but demanded by Intel's customers. Because the only reliable way to multitask and/or isolate DOS programs was to emulate 100% of the underlying hardware.

 

You mentioned VDMSound as an example. Wikipedia defines VDM: "Virtual DOS machines rely on the Virtual 8086 mode of the Intel 80386 processor..." It's a handy feature!

There are several other virtual machines that use that too iirc (albeit I hadn't realized that included PC register emulation; I'd assumed the "virtualization" mentioned was just in terms of running x86 code natively on the CPU, rather than the full software emulation of the entire system that DOSBox and many other console/computer emulators use). I think Microsoft's own (now free) Virtual PC also supports that.

 

a hardware emulator board that only used the most problematic/necessary components, and did the rest in software on the 68k

This is what PC/AT-Speed was. It looks like it freezes the x86 on any hardware access. While the x86 is frozen, the 68K swoops in to emulate the missing hardware. When the emulation is complete, the x86 is unfrozen. The x86 doesn't even know that time has passed.

 

This is a more elegant approach than even Virtual 86 hardware, but is more expensive since you need two CPUs.

In the ST's (or Amiga's) case, you've already got that added CPU . . . and you're adding the x86 CPU.

 

Didn't it take a long time for an actual x86 CPU board to be released for the ST or Amiga? (I remember seeing a clip from a 1989 episode of Computer Chronicles that made note of a full software-based DOS/PC emulator for the ST and an actual x86 CPU board that would come later on -or maybe that was just in the context of a specific emulator company and not in general. The same clip had Spectre being demonstrated with the plug-in Mac OS ROM cart for the ST, and they made note of how it ran Mac software about 20% faster than an actual Mac/Mac Plus ;))

 

The ST had an 8086 and 286 hardware + software combo that plugged a daughterboard into the CPU socket so it was simple enough to do that

Do you mean the PC-Speed? It wasn't that simple! It wasn't widely available until 1990, and it took a number of updates to improve the emulation quality to an acceptable level. Even then, it didn't do EGA or VGA, which were very popular by 1990!

Ok, so it WAS that late . . . though they'd been talking about x86 emulator boards on the ST and Amiga since '85/86.

 

Being that late pretty much mitigates oky's premise of using PC compatibility to boost initial software support. ;) (The ST didn't seem to have that much trouble getting early software support anyway, even in the US . . . actually it seems like a lot of developers took the "safe" route of developing for the ST first and porting to the Amiga -doing the opposite would be much more difficult, since the Amiga could do most/all things the ST could -some moderately slower- but the reverse was not the case -and then there's the rest of the software market in general.)

 

Anyway, it would just be PC cross-compatibility that an emulator would really have been useful for in '89/90, and again, it probably would have been far more significant and productive for Atari (or CBM -though the Amiga's design made it more difficult in some areas than the ST) to heavily promote/encourage PC/ST cross-platform file format compatibility (and do some work themselves to that extent).

So, you'd still need to get that software support in general (be it from the same companies as on PC, or 3rd parties who were able to make compatible software), and you'd still need the consumers to buy the specific software for the ST, but you'd be able to have people take home project work files from PCs and use them with their STs and then bring that back to work and use it on a PC again. (or various other work/school/professional/educational/casual uses of that)

 

Of course, to really have anything like that in the mid 80s, you'd need to cater to 360k 40 track DSDD 5 1/4" floppy drives as well (which 3rd parties already offered for the ST, but Atari would have needed to market themselves if they'd pushed the PC compatibility angle).

 

it would be even simpler to get DOS running on the 8086 based Atari machine a la Konix console :)

Well, sure, everything is simple with 20/20 hindsight! :)

 

If Jack Tramiel wanted to build a PC compatible, his team would have to start with that intention from day one. It's expensive to add-on after a machine is already designed. Jack was cheap. (So were the PC clones.)

 

The first PC chipset didn't come out until after the ST was already shipping. (Anyone remember CHIPS?) Before PC chipsets, you had three choices for compatibility: massive engineering in custom ASICs (Tandy), or paying tons per unit for off-the-shelf chips (IBM), or relying on complex/buggy emulation to get 90% there (Wang).

Yes, and Atari might have been more successful doing that too (albeit Commodore had some of the greatest potential with their in-house engineering staff and manufacturing capabilities), but the ST obviously wouldn't have been the same otherwise . . . it would have lost some of the things that made it so significant at the time and so popular in Europe. (Even in the best case of a tight, integrated, enhanced PC compatible, you couldn't match the cost-to-performance ratio the ST offered at the time . . . granted, in the US at least, the added value of PC compatibility would have been extremely significant . . . as would the huge value of an open expansion architecture. The latter is something the ST really should have had in any case, even if just a Spectrum/Commodore/Tandy/PBI-style expansion port on the console models -obviously the desktop models should have offered proper PC/Apple-style expansion slots out of the box.)

 

Jack didn't have money for engineering, and wanted ST to be cheap. The timing just wasn't right for PC compatibility.

Yep, and hell, even in Tandy's case they were basically cloning (and correcting) the PCjr design. If Atari HAD gone the PC route, they probably would have had to focus on just consolidating cloned versions of existing PC hardware with modest enhancements at best, and Tandy had already done that. (Albeit Tandy never brought it to Europe and seemed to stick with their Radio Shack-specific distribution line without expanding to other computer dealers/outlets -which some claim may have crippled them, especially given that the average Radio Shack staff were generally less than adequate at selling computers to prospective buyers.)

 

Commodore was in a far better position to start pushing something like that at the time, especially since their in-house resources could potentially allow a cost-effective embedded PC chipset well before 3rd parties really started pushing that (and without the overhead of buying from 3rd-party vendors). And, of course, Atari Inc had been planning the 1600XL DOS/PC compatible of sorts back in '83. (It was supposedly to use a 186, so it's unclear what degree of PC compatibility it actually would have offered.)

 

It's a bit ironic that when Atari Corp finally did get into the PC business, mass-market high-volume motherboards with embedded chipsets were beginning to really compete on the market (in 1987), and Atari's own custom single-board embedded hardware (including a custom ASIC with CGA+MDA+Hercules+EGA support and 256k of dedicated video memory) in the PC-1 was produced in volumes too low to be cost effective (compared to Atari just buying off-the-shelf motherboards and parts, in spite of the overhead of 3rd-party vendors). It's also a bit ironic that Atari Corp's custom and off-the-shelf PC ranges were both broader and of generally better quality than what Commodore started offering in the late 80s. (And that Atari was offering a variety of 8 MHz 8088, 16 MHz 286, 16 MHz 386SX, and 20 MHz 386DX machines when they were still only offering 8 MHz STs -no faster CPU models and no 32-bit models. Not to mention all those off-the-shelf PC-3/4/5/ABC machines also offered standard flexible expandability -the PC-1 offered a single side-mounted ISA slot, somewhat closer to the Mega ST expansion slot.)

 

 

I wonder what the pricing was for Atari's line of PCs at the time, especially compared to their STs. (Curt's old pages on Atari PCs mention them being exceptional values for the time, but nothing specific)


That's more of a cost vs price issue. A lot of PC hardware didn't COST that much to manufacture, but it was priced for a high-end market with huge margins. (that and a lack of emphasis on tight, low-cost engineered designs) That's the same thing as the Mac, more or less. (the Mac128k in 1985 had a nominal price point roughly 4x that of the 520ST, but was almost certainly significantly cheaper to manufacture)

 

No, and sorry, no. In fact it DID cost a lot to manufacture PC hardware. The economies of scale that you see today simply did not exist back then. No (big) GALs, no all-in-one VLSI motherboard chipsets. Mad crazy expensive discrete memory chips and TTL everywhere. All this stuff cost big bucks... to produce and to purchase.

What time period are you talking about? Are you talking about around the ST's launch? (At that point there were only a handful of embedded chipsets, and most of those were limited to in-house systems like Tandy's 1000 -and even those were only partially consolidated, with a mix of off-the-shelf components including TTL parts mixed in.) That changed rapidly in the late 80s with an increasingly competitive high-volume PC market.

 

A lot of it was up to margins though . . . from the pricing on the base chips (especially Intel's overpriced CPUs, without strong competition early on) to the PC/clone manufacturers themselves pushing high margins (not at all unlike what Apple had been doing for most/all products, for better or worse). It took a while for the low-end/budget guys to come in and really push competition with tighter margins (and often tighter hardware -in Tandy's case, more embedded/consolidated hardware than most clones at the time), and then you saw a general expansion of the market into broad ranges of machines (in price, capability, and overall value) that hadn't really been seen before within any single compatible standard.

 

As for the Mac... I loved my ST, but the Mac had way more engineering than the ST ever saw, a built in display (which was NOT cheap in 1984), 'proper' support chips (serial, SCSI, etc)... that all costs money....

Hmm, I've seen the opposite perspective pushed more often. The Mac initially lacked any proper external DMA support for a hard drive (you could suffer with terribly slow serial drives) or the like, while the ST had a proper range of support chips including ACSI out of the box from day 1. (Plus it had a full floppy controller and signalling setup, while the Mac used a rather primitive -though cheap- low-level interface that sacrificed its 2nd/stereo DMA sound channel to drive the square wave for the drive.)

The ST had a full range of off-the-shelf and custom support chips, from the DMA chips to the parallel and serial peripheral interface logic (3 serial interfaces, a couple dozen parallel I/O lines), floppy controller, programmable interval timers, and an embedded MCU driving the keyboard (which was also the sole purpose of one of the serial channels -obvious forward thinking for use of an external keyboard . . . a bit ironic it took so long for desktop models to actually appear like that).

 

I fail to see the significance of that (tiny) built-in display as well. Several early (popular) 8-bit computers had built-in monitors, from the PET to the TRS-80 Model II and III, various portables, etc, etc.

The one significant thing about the Mac display was that it wasn't using the standard 15.7 kHz TV scan rate, but 22 kHz for higher resolution. (albeit still weaker than what MDA had already been pushing on PCs -or Hercules for proper graphics, let alone EGA . . . or the ST's 30 kHz 70 Hz 640x400 display)


Thinking more on that Mac comparison . . . it wasn't until 1986 (with the Mac Plus) that Apple had SCSI, so some 8 months after the Atari ST launched with ACSI out of the box. The ST had a standard RS-232 port, a Centronics/PC/Amiga-compatible 8-bit parallel port, the 8-bit ACSI DMA port, a floppy drive parallel/DMA port, 2 joystick/mouse ports, and a MIDI interface.

 

The Mac 128 had the external floppy port, 2 non-standard RS-422 serial ports (for "printer" and "modem"), and the mouse port.

 

The ST achieved those interfaces with a mix of off the shelf and custom chips with relatively little use of discrete logic (mostly off the shelf peripheral interface LSI chips and custom ASICs -presumably using cheap/commodity ULAs and gate arrays), plus off the shelf RAM and CPU/MCU chips, of course.

 

The Mac also used a mix of off-the-shelf parts and custom ASICs . . . though they also seem to have rebranded some off-the-shelf components as custom Apple parts. (They even did that for some of the later CPUs in Macs, oddly enough -and unlike some other manufacturers who did that to discourage cloning -like various arcade boards, sound cards, video cards, some PC motherboard chipsets, etc- it was pretty obvious what most of those rebranded chips were . . . which makes it a bit odd.)

Looking at the original Mac motherboard, there also seems to be a fair chunk of discrete logic in there. There actually seems to be fewer custom chips (or custom logic in general) used in the initial Macs than the ST used. (granted, the hardware is considerably more bare bones than the ST)

 

That discrete logic in place of heavier use of consolidated logic would have meant higher costs to an extent (and lower R&D/engineering costs), but it's still a pretty cheap/basic system. (if Apple produced in low volumes with crappy/unfavorable business contracts for their chip vendors and manufacturing, then yes, it could have been more expensive . . . but that's just saying Apple was bad at business management :P)

Volumes and limited (and delayed) consolidation obviously had an impact on the Apple II line as well (albeit the IIgs was pretty tight for its time), but the main thing was just very high margins on hardware compared to contemporaries. (Albeit the Apple II's initial high price was partially due to wanting to fund strong advertising without needing investment capital and deficit spending -something that has some merits, especially when you don't have especially good credit or resources to sustain deficit spending, but arguably made less sense after Apple started to get really big . . . then they eventually settled into marketing products at extremely high margins and making good profits relative to the niche market they catered to -granted, they totally missed out on the persisting mass-market standard that Woz's original Apple II design had the potential to be.)

 

Everything I've read/heard about the Mac is the same thing: cheap/basic hardware at high margins, just like most Apple products since. (Albeit in the Mac's case -like a few others- they ended up struggling for quite a while, using the otherwise-neglected Apple II market as a crutch during the Mac's weak early support, and kept pouring in more funding until "it worked". I can't help but wonder what sort of company Apple might have become if they'd actually focused resources continually on supporting and building onto the Apple II line . . . of course, that also implies Apple had smarter marketing/management personnel who actually respected the prospect of a unified market-standard machine emphasizing open-ended expandability and compatibility.)

Of course, now we're back to this discussion:

http://www.atariage.com/forums/topic/179246-apple-ii-in-low-end-market/page__st__25 ;)

 

 

 

 

 

As far as the ST end of things goes, the Mac was more impressive in the same area that most successful platforms are: software support (nothing to do with the quality of the design or engineering skill). And that was only impressive from the US perspective; it was the opposite in Europe.

Of course, the PC was the most impressive by far in that respect. ;)

 

 

That, and more intensive engineering tends to make a product MORE cost effective, not less. A product using integrated custom chips (produced by a competitively priced chip vendor at high volumes) would be significantly cheaper than even the best cases of "shopping around" for deals on off-the-shelf parts. (That hardly seems to be the case for the Mac, so it wasn't THAT cheap to make, but the sheer simplicity of the design can't be ignored.) Tighter, more custom-optimized designs are also riskier because of the requisite R&D time/investment, including tooling costs for manufacturing the custom chips -more so for the more advanced techniques: ULAs/gate arrays would be relatively low risk, but really tight full-custom chips are another story. Full custom is the most heavily integrated/optimized option, but it requires high volumes to really be worthwhile -albeit it was the only option before masked ULA/gate-array chips became available in the late 70s. (I'm not exactly sure when ULAs became common, but definitely by the beginning of the 1980s.)

 

If you can't manage to produce your custom chips at competitively high volumes, you're often better off going off the shelf (for any components widely available on the open market), and the ST's design shows a good amount of compromise: high-volume, competitively priced off-the-shelf components in most areas where those were available, and custom chips where off-the-shelf parts were not suitable (be it not available at all in LSI form, or not competitively priced).

 

Atari's initial PC venture would be a good example of these general trade-offs as well: a significant amount of investment in a consolidated custom motherboard with a mix of custom and off-the-shelf chips (rather like the ST), but not produced in volumes high enough to be competitive with generic off-the-shelf motherboards and support hardware in 1987 -in spite of those off-the-shelf parts being generally less cost efficient. (Had they pushed such a design even a year earlier, it might have been soon enough to actually have a cost advantage and to get volumes up to competitive levels by the time cheap generic motherboards became widely available . . . 1985 much more so. And in 1984 it was obviously a big advantage for Tandy, when most of the competition -IBM and clones- hadn't bothered investing in lower-cost designs since they were managing fine margins at premium prices anyway -sort of like how long it took Apple to bother consolidating to the IIe level, though that was also partially due to management's lack of interest in the Apple II line.)

 

That is, unless Curt's info on Atari's decision to drop the PC-1 in favor of off the shelf PCs is incorrect. (there's some outdated stuff on his site, but in that case, it seems pretty logical: Atari Corp was constantly negotiating for the best deals on various components -off the shelf and otherwise- so it wouldn't have made sense for them to switch to generic motherboards and cases if that wasn't advantageous on the cost end)

Albeit, Curt's page also claims that Atari's PC-1 was "too well made" and thus cost too much compared to the competition, but that seems to be a rather vague oversimplification of the real issue. (volume production regardless of "quality" standards -though from the accounts I've seen, Atari's PC-1 was pretty well made, similar to the MEGA ST if not better in build quality in some areas -as well as having a great feature set)

 

Or maybe it was actually cost competitive for a short while after its release in '87, and the highly competitive mass-produced motherboards/hardware arrived slightly later (with Atari still not having managed volumes high enough to make their design worthwhile). Albeit, the general number of consolidated chipsets available by early '87, and kskunk's comment that the first PC chipsets arrived after the ST started shipping (he doesn't say exactly when, but the comment seems to imply it wasn't too long after the summer of 1985), suggests that even 1986 might not have been especially favorable for Atari's PC-1 compared to mass-market options. (Then again, maybe the PC-1 DOES largely use off-the-shelf LSI support chips from the time, just on a custom, consolidated motherboard -in which case it would have been the introduction of high-volume off-the-shelf motherboards alone that killed it.)

 

It would seem like re-using at least some of the custom hardware would have made sense though (if it was actually custom), like putting that custom graphics ASIC on an ISA card to plug into the generic motherboards they were getting. (AFAIK they didn't do that, and used generic 3rd-party EGA+CGA cards for the bottom-end PC-3 instead, though some apparently had VGA -the PC-4/5/ABC had VGA or SVGA.) Of course, that's assuming Atari actually invested in designing (or licensing) that ASIC and wasn't buying it from a 3rd-party vendor. (ATi had a custom CGA+Hercules+EGA-compatible chip for their EGA Wonder card released in the spring of 1987, so the timing was right.) If it was off the shelf, it would make sense to just buy ATi's ISA cards in bulk after the switch to generic PC hardware (or just shop around for whatever was the best deal -though ATi seems to have been a pretty cost-competitive graphics card vendor for the time).

Edited by kool kitty89

Proof that PC in the 80s couldn't compete for home sales.

Anecdotally, almost everybody I knew in the 80s used a PC for a home computer. I was sort of a freak for having a ST. Of course, I'm speaking from the US. I know it was different in other countries.

 

Tandy 1000s outsold Atari STs in the US, and those were not business-oriented machines.

 

The PC200 (which you linked to) suffered from being expensive and wimpy, competing against an ocean of cheap, powerful, clones. Although I really love the ST-ripoff design, who wanted CGA graphics for Christmas 1988? And no chance of fitting standard PC cards, no hope of adding a hard disk?

 

-KS

Edited by kskunk

Are you sure about that? I never had or saw any Tandy 1000s or much Tandy stuff in general throughout the 80s. Our retail store sold most brands and took trades on most anything in computers; if they were that prevalent we would certainly have seen many in trade. PCs really didn't get going for the average consumer until late '87/early '88, then ramped up with the introduction of VGA. The first ones we sold were only 8-bit ISA.

Mid 80s it was ST and Amiga, A8 and C64. (Yes, and Apple...)


[Image: Sinclair PC200 system]

 

Proof that PC in the 80s couldn't compete for home sales.

That's not really proof of anything. :lol: Aside from the fact that even a stylish case couldn't save a poorly timed and poorly priced low-end PC compatible in Europe (where the PC wasn't already the dominant de facto standard).

 

Also, that the DRAM shortage made that particularly problematic and led to Amstrad pulling the line from the market.

 

 

The PC200 (which you linked to) suffered from being expensive and wimpy, competing against an ocean of cheap, powerful, clones. Although I really love the ST-ripoff design, who wanted CGA graphics for Christmas 1988? And no chance of fitting standard PC cards, no hope of adding a hard disk?

Yeah, maybe if the PC-200 was on the level of the Tandy 1000 EX (or perhaps HX), being priced similarly aggressively, and released a good bit earlier it would have had some chance in Europe. (or for that matter, perhaps if Amstrad had been pushing a PC compatible line more like the Tandy 1000 in general -if not some actual licensed agreement from Tandy to distribute in Europe, Amstrad might have had more of a chance getting PCs established in the European market)

 

The Tandy-1000 wasn't great compared to the ST or Amiga, but it was a great value for a PC in the early/mid 80s with a feature set well suited to a general purpose home computer with moderate business/work capabilities along with casual/educational/entertainment stuff. (with a low cost/price emphasis as well; especially well suited for the European market given those qualities)

 

However, it was still not a good value compared to the ST or Amiga outside of the PC compatibility and expandability. (and once you got strong 3rd party support on the ST, the PC compatibility wasn't a big attraction in that region -ie where PCs weren't dominant, so the advantages would be more modest)

Still, it would at least have had a better chance than what Amstrad was offering.

 

 

The case definitely looks good though. (it would have been neat if Atari had released a black version of the ST)

 

 

 

 

 

Proof that PC in the 80s couldn't compete for home sales.

Anecdotally, almost everybody I knew in the 80s used a PC for a home computer. I was sort of a freak for having a ST. Of course, I'm speaking from the US. I know it was different in other countries.

 

Tandy 1000s outsold Atari STs in the US, and those were not business-oriented machines.

 

The PC200 (which you linked to) suffered from being expensive and wimpy, competing against an ocean of cheap, powerful, clones. Although I really love the ST-ripoff design, who wanted CGA graphics for Christmas 1988? And no chance of fitting standard PC cards, no hope of adding a hard disk?

 

-KS

Are you sure about that? I never had or saw any Tandy 1000s or much Tandy stuff in general throughout the 80s. Our retail store sold most brands and took trades on most anything in computers; if they were that prevalent we would certainly have seen many in trade. PCs really didn't get going for the average consumer until late '87/early '88, then ramped up with the introduction of VGA. The first ones we sold were only 8-bit ISA.

Mid 80s it was ST and Amiga, A8 and C64. (Yes, and Apple...)

It probably varied a lot by region too; the US is a huge country with a massive range of regional interests (you might as well treat it as several separate markets divided by region) . . . that's also part of why some people experienced the video game crash far more severely than others. (Or some didn't even realize it ever happened -I've gotten the impression that a large portion of California experienced the "crash" rather mildly, but that's largely anecdotal, not based on actual sales figures or activity.)

 

Also, wasn't the Apple II still fairly big in the mid 80s?

 

 

It would be nice to have some at least moderately accurate sales figures for various computers in the 80s. This site: http://jeremyreimer.com/postman/node/329 would be ideal if it weren't for the grossly erroneous figures. (and total lack of international sales information . . . as well as lacking in even listing some rather significant platforms)

 

 

 

On another note, Atarian, did you ever distribute Atari's line of PC compatibles? (and do you have any idea of the pricing of those machines?)


No, we didn't, though we would have if we'd had them available to us. I never saw them in Atari's pricing or through distribution. We probably could have sold many to Atari/Amiga folks making the switch to PCs, or just as a second PC alongside their existing Amiga/ST.

Link to comment
Share on other sites

Are you sure about? Never had or saw any Tandy 1000's or much tandy stuff in general throughout the 80's.

I'm not sure, it's mostly anecdotal. I based that comment off Tandy's advertising, which said it was the best-selling PC in America from 1985 to 1989 -- note their use of "PC", which was probably meant to exclude the C64 but could exclude the ST too.

 

One comparison I found is 1985, where Tandy sold 450,000 computers in America while Atari sold 100,000 worldwide. Another source I found was an Ars Technica article that shows Atari ST peaking at 300,000 computers worldwide in 1987, and yet another source states that they were making 75% of their sales overseas at that time.

 

This matches stories about Atari's US distribution problems: Radio Shack was everywhere and very organized, while Atari's dealer network was small and patchy. So if you were in an area where your Atari dealer had things together, your town might be an ST town. My dealer was more Doc Brown than Gordon Gekko, sadly.

 

Sorry I couldn't find better data! But from the data I could find, Tandy seemed to be pretty dominant here in America.

 

It does make some sense: By the mid-80s, Brand Name Software was going mainstream in the US. People were starting to buy the software first, making the computer just an accessory. (You want a faster WordPerfect? More columns in 1-2-3? Try this computer!) It wasn't about searching for a word processor with the right features for you, it was about buying WordPerfect.

 

I was just looking at an old InfoWorld from 1985 (they're free on Google Books, check 'em out), and even peripheral manufacturers were in on this. No-name printers would say, 'Full support for WordPerfect and 1-2-3!' in bigger type than their technical specifications.

 

Just a few years before, people in the US were buying computers first, and looking for software later. This helped the C64 dominate -- features and price and availability mattered most, and then you'd go read Byte to learn the names of some word processors for your new computer.

 

One sad footnote is that a few people saw the shift coming. I read about one Commodore market research project in 1984 which concluded that PC-compatibility was all customers cared about. I can just imagine the engineers bristling at the thought of building another crap clone instead of something leaps ahead...

 

- KS

Edited by kskunk
Link to comment
Share on other sites

No we didn't though we would had we had them available to us. Never saw it in Atari's pricing or through distribution. We probably could have sold many to folks Atari/ Amiga folks making the switch to PC's or just as a second PC along with their existing amiga/st.

Hmm, Atari didn't sell their PCs in the US, or just didn't distribute to your region? It would seem rather odd if Atari specifically aimed its PC clones at Europe. (where the ST was really big and the PC was relatively weak)

 

 

 

 

 

 

I'm not sure, it's mostly anecdotal. I based that comment off Tandy's advertising, which said it was the best-selling PC in America from 1985 to 1989 -- note that their use of "PC", which was probably to exclude the C64 but could exclude the ST too.

They could have been using "PC" to mean IBM compatibles specifically and been totally exclusive. ;) Even so, the best selling of any IBM compatible would still be a pretty significant number for that period.

 

One comparison I found is 1985, where Tandy sold 450,000 computers in America while Atari sold 100,000 worldwide. Another source I found was an Ars Technica article that shows Atari ST peaking at 300,000 computers worldwide in 1987, and yet another source states that they were making 75% of their sales overseas at that time.

1985 isn't really a good year to compare for Atari (even if those figures are accurate): the A8 had weakened significantly and the ST was just getting started (I don't think it had a wide release in Europe until '86). Plus, Tandy had had their 1000 out since '84, and three other computer lines to work with as well. (CoCo, Model 1 compatible TRS-80s, Model 2 compatible TRS-80s)

 

IIRC, Atari Corp was also establishing manufacturing in '85 and had some volume problems with some products. (including the 2600, resulting in shortages with the strong surge in video game sales in summer/fall of that year)

 

This matches stories about Atari's US distribution problems: Radio Shack was everywhere and very organized, while Atari's dealer network was small and patchy. So if you were in an area where your Atari dealer had things together, your town might be an ST town. My dealer was more Doc Brown than Gordon Gekko, sadly.

That could be a double edged sword though, couldn't it? (limiting distribution to Radio Shack specifically)

Radio Shack offered a decent compromise between the mass merchants Commodore tended towards and dedicated computer dealers (at a time when Fry's was still just a grocery store, places like Best Buy or CompUSA didn't exist, and Circuit City hadn't expanded to computers, let alone the nationwide superstores they established later on).

 

Having their Radio Shack outlets was certainly an asset, but I don't see why they wouldn't have wanted the best of both worlds with "serious" computer dealers and distributors as well. (or even introducing the CoCo to discount/department stores in line with the C64 -and technically the A8 had sported that from day 1 thanks to Atari's relationship with Sears, albeit a more prestigious outlet than Kmart ;))

 

Sorry I couldn't find better data! But from the data I could find, Tandy seemed to be pretty dominant here in America.

I hadn't realized Tandy had actually been the dominant PC clone manufacturer in the mid/late 80s (I'd assumed they were more of a long-running niche player in the lower-end small business and home market), but it really would make sense given that they seemed to be the only low-priced PC option until the late 80s. (build quality seemed pretty decent as well . . . not to mention the proprietary graphics and sound features beyond other PCs until 1988, when EGA and AdLib support started to emerge -and EGA games were really no better than 320x200 Tandy games, if not worse due to the use of planar graphics vs Tandy's packed pixels)

 

That makes me wonder even more why there weren't low-cost 3rd party upgrade cards oriented around Tandy video/sound. (if not directly compatible, at least functionally compatible to allow easy ports of Tandy-specific software) There actually was the Plantronics ColorPlus on the market (basically PCjr/Tandy extended CGA graphics), but AFAIK it didn't get much specific support (in spite of ATI also releasing a line of cross-compatible cards in their Wonder series).

And on the sound end, there was absolutely nothing (professional or consumer level) for PCs until AdLib in 1987 (except Covox-type parallel port DACs): no basic SN76489 board, no AY8910 board (the latter would also have been a nice option for including built-in Atari-style digital joystick ports -especially at a time when IBM analog game ports were poorly supported, unreliable, and excessively resource intensive; see the rough sketch below).
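For anyone who hasn't poked at it, here is roughly why reading the IBM analog game port ate so much CPU time: you fire a one-shot timer and then busy-wait until each axis bit drops. This is only a minimal sketch, assuming a DOS compiler with Borland-style inportb()/outportb() from <dos.h>; the bail-out constant is arbitrary.

/* Rough sketch: polling joystick axis 0 (X) on the standard game port at 0x201. */
#include <dos.h>

#define GAME_PORT 0x201

/* Returns a raw count proportional to the pot position -- and to CPU speed. */
unsigned int read_axis0(void)
{
    unsigned int count = 0;

    outportb(GAME_PORT, 0xFF);            /* any write fires the one-shot timers   */
    while (inportb(GAME_PORT) & 0x01) {   /* bit 0 stays high until the timer ends */
        if (++count == 0xFFFF)            /* bail out if no stick is connected     */
            break;
    }
    return count;
}

The whole measurement is a busy-wait, it has to be repeated for every axis every time you sample the stick, and the result depends on the CPU clock -- which is why the port was slow, inconsistent across machines, and unpopular next to simple digital switch inputs.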

But imagine that: just after AdLib launched, you saw the release of the MT-32, CMS/Game Blaster, an obscure SID-based card, and Covox's Sound Master, with Creative's Sound Blaster released in '89. (the Sound Master included Atari-type digital joystick ports and used an AY8930 . . . but Creative's SB standard obviously won favor with its AdLib compatibility and competitive pricing -had Covox come out with something back in '86, it might have been another story ;))

 

Just a few years before, people in the US were buying computers first, and looking for software later. This helped the C64 dominate -- features and price and availability mattered most, and then you'd go read Byte to learn the names of some word processors for your new computer.

If you bought for (business/educational) software in the early days, you probably went Tandy or Apple (the PET had a niche for a short time, but CBM failed to maintain it . . . Atari had a lot of potential, but generally squandered it on the management/marketing end -awesome games, of course).

 

For features and price alone, Atari had a lot of advantages over contemporaries in '82-84: a competitive (but not the lowest) price, good performance and flexibility, easy-to-install SIO peripherals, etc.

One huge thing on the marketing end could have been comparing Atari's floppy drives to the slugs Commodore were using with the VIC and C64 (some of those recent cell phone/network speed commercials come to mind ;)). They may not have been quite as fast as Apple II drives (not sure on the exact comparison there), but they were light years faster than Commodore drives. (tapes were also 2x as fast as CBM tapes using normal loading speeds, though about 1/2 that of the Apple II and poorer still compared to the CoCo or Spectrum)

 

One sad footnote is that a few people saw the shift coming. I read about one Commodore market research project in 1984 which concluded that PC-compatibility was all customers cared about. I can just imagine the engineers bristling at the thought of building another crap clone instead of something leaps ahead...

Yes, but, then again, you COULD do both "PC compatibility" and a clean all-new hardware design . . . but limit that compatibility to the software/file formatting end as I already mused on above. (granted, success of that would greatly depend on software support facilitating real cross-platform support)

 

Heh, that and it might not have even been "PC compatibility" that people wanted in the mid 80s if things had gone differently at Apple. (the Apple II had real potential as a PC-like standard -from the open expansion architecture to the software support to easy to clone hardware, it was very much like the PC, but Apple didn't push it like IBM did the PC -granted, what IBM did wasn't ideal either, but it did expand upon the ideas of flexible compatibility and expandability . . . with exceptions to that with the PCJr and PS/2)

Actually, I wonder if Wozniak had a vision of a long-running intercompatible/evolutionary design with the Apple II, or if that prospect was more coincidental. (I haven't seen any comments from him on that topic, though he definitely seemed to be pushing in that direction -and continued to push hard in spite of opposition at Apple, to the point that they actually managed to get the IIgs out on the market, albeit with some crippling concessions)

It's certainly interesting to think what the Apple II might have evolved into if it had gone the way of the PC. (and what the 650x architecture would have become with a PC-like market driving its evolution)

 

Of course, part of that would also include Apple II compatibles appearing at competitive prices. ;)

Edited by kool kitty89
Link to comment
Share on other sites

Proof that PC in the 80s couldn't compete for home sales.

Anecdotally, almost everybody I knew in the 80s used a PC for a home computer. I was sort of a freak for having a ST. Of course, I'm speaking from the US. I know it was different in other countries.

 

Tandy 1000s outsold Atari STs in the US, and those were not business-oriented machines.

 

The PC200 (which you linked to) suffered from being expensive and wimpy, competing against an ocean of cheap, powerful, clones. Although I really love the ST-ripoff design, who wanted CGA graphics for Christmas 1988? And no chance of fitting standard PC cards, no hope of adding a hard disk?

 

-KS

 

The difference between the USA and EU: only the odd ex-Acorn BBC Micro user bought a PC for home use . . . AKA idiots with money to burn. Even in 1990, EGA was the de facto standard for home PCs.

 

The PC200 was a terrible design, but it could have ISA cards inserted... they just stuck out the top. The biggest problem was the 8086 CPU at the time.

 

At the end of the day, pre-1991, the best version of just about every game released in the EU was the Amiga one for 2D gaming. 3D gaming, however, was faster on a proper 386 than on the ST/Amiga. But then those did cost over £1000, so..... unless you liked rubbish American games like Sierra adventures etc. or other boring crap, most games were EGA only.

Link to comment
Share on other sites


Difference between USA and EU, only the odd ex Acorn BBC Micro user bought a PC for home use....AKA idiots with money to burn. Even in 1990 EGA was de-facto for home PCs.

The thing is, they weren't just idiots . . . there's a REASON the Apple II and PC got a lot of attention and respect from the mass market: 1. flexible open-architecture expandability (important for hobbyists and general consumers alike, for forwards compatibility and flexible upgrades), and 2. the damn things just got good software support. In IBM's case, they also had a MASSIVE brand name to gain instant respect from the market. (the high build quality was also a major selling point -the high quality keyboard was a very significant factor for business use) For a time, the TRS-80 line also shared some of that, but that ended up falling apart later on. (as did the Apple II, though it survived surprisingly long in spite of Apple virtually trying to kill it off from 1980 onward -Commodore had a place in the business/professional market early on too, but theirs faded fastest by FAR, with the PET languishing rapidly in the early 80s -they also, unlike Apple and Tandy -or IBM- intentionally limited expandability and aimed at an "appliance computer" format more like the Macintosh and Atari 8-bit -the ST was rather like that too)

Gaining respect from the market ensured good software support as well as respected use from certain high-end/professional niches.

 

The cost was high relative to raw performance, but there was a TON of value beyond raw hardware capabilities. (that value ranges from tangible things like software and peripheral availability or expandability to things like brand prestige)

 

The BBC Micro had much of that too, except it was also an exceptionally capable computer for its time (2 MHz 6502 without wait states, OK graphics -plenty useful for business purposes and OK for games- and pretty good sound for the time, plus one of the most efficient Operating systems on the market -one of the few functional enough to get a high degree of high-level software support when most other systems totally bypassed the OS and went for hardware alone).

Add flexible expansion slots, and you'd have an Apple II killer if marketed right in the US. ;)

 

Price point was the only thing that really limited all of those machines in the mass market, in part due to manufacturers not aiming at that market or screwing up when they did attempt to go lower-end. (few to none seemed to understand the potential of supporting a broad range of intercompatible machines spanning the low-end consumer to mid-range/small business/education, to professional business/science, to workstation class machines) You saw some cases with potential, like the PCJr and Acorn Electron, but most/all of those were botched . . . albeit Tandy managed to do the PCjr right with a much more mainstream line of machines (and if Kskunk's recent comments are accurate, Tandy actually brought the PC to the lower-end/mid-range mass market, and played a huge part in market saturation of PC clones). It would have been interesting to see how the Acorn Electron would have turned out had it not been so gimped. (ie if it had been a REAL BBC Micro, but just at lower cost from the new LSI chips -not quite as cheap, but a much better value)

 

The value of those "serious" machines obviously allowed them to be sold at very high margins and still be successful, but without lower priced models, they really couldn't penetrate the mass market. (something that was obviously a much bigger problem for the extremely price sensitive Euro market . . . not to mention the greater gaming demands of that market)

 

Games drove all computer markets to some degree, but the US (and Japan) were never like Europe (especially at its peak in the 80s). Both the US and Japan were consoles first, with a few niche gaming computers alongside the dominant, more "serious" computers (Tandy, Apple II, PC, PC-8800 and 9800, etc). The C64 (and to a lesser extent VIC and A8) boom in the early/mid 80s was short lived, and even at its height it never came close to the market saturation of computer gaming in Europe. (consoles stayed very significant on the whole, and it was only really 1984 and 1985 that saw a real shift toward computer support -and some market regions were already shifting back towards consoles by late 1985)

From what I understand, the late 80s saw the shift to a new niche PC (and Amiga and ST) gaming market, with the PC becoming the dominant computer gaming platform by the late 80s. (and by 1990 -with a significant number of games supporting VGA and AdLib, and full Sound Blaster support soon following- the PC was not only getting the dominant amount of software support but was also becoming a seriously capable gaming machine: a fast 286 with 512-640k, a 16-bit VGA card, and an 8-bit Sound Blaster made a pretty nice game machine for the time. VGA really helped on the graphics end -well beyond just the added color, you had hardware scrolling, some primitive raster-op acceleration, and 8-bit packed pixels, which are fast/easy to manipulate for software blitting, so a huge step over dealing with EGA's bitplanes. At least the Tandy had 4-bit packed pixels, so you could easily do software blits on byte boundaries without being too choppy; working on byte boundaries with 1bpp graphics or bitplanes means 8-pixel steps, which is a big part of why you see a lot of games moving objects/BGs at choppy 8-pixel intervals on the ST and Spectrum -see the rough sketch below)
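To make the packed-vs-planar point concrete, here's a minimal sketch in C (the buffer layout and function names are just illustrative, not any particular game's code) of why 8bpp packed pixels let a software blit land on any x coordinate with a plain row copy, while whole-byte copies into a 1bpp/planar buffer force 8-pixel steps unless you add per-byte shifting and masking.

/* Illustrative sketch only: packed 8bpp vs 1bpp byte-aligned software blits. */
#include <string.h>

#define SCREEN_W 320

/* 8bpp packed pixels (e.g. VGA mode 13h): one byte per pixel, so a sprite
   row can start at ANY x with a straight copy. */
void blit_8bpp(unsigned char *screen, const unsigned char *sprite,
               int x, int y, int w, int h)
{
    int row;
    for (row = 0; row < h; row++)
        memcpy(screen + (y + row) * SCREEN_W + x, sprite + row * w, w);
}

/* 1bpp (or one plane of a planar mode): 8 pixels share each byte. Copying
   whole bytes is cheap but only lands on multiples of 8 pixels -- the choppy
   8-pixel steps seen in many ST/Spectrum games. Arbitrary x needs shifts and
   masks applied to every byte instead. */
void blit_1bpp_byte_aligned(unsigned char *plane, const unsigned char *sprite,
                            int x, int y, int w_bytes, int h)
{
    int row;
    int byte_x = x / 8;    /* exact only when x is a multiple of 8 */
    for (row = 0; row < h; row++)
        memcpy(plane + (y + row) * (SCREEN_W / 8) + byte_x,
               sprite + row * w_bytes, w_bytes);
}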

 

Of course, by that point, the "cheap" end of the PC market had expanded well beyond Tandy and had almost reached the point where users could easily build their own machines from off the shelf parts. (I think that wasn't quite there in 1990, but definitely became a reality in the early 90s)

 

 

 

I'm getting off topic here though.

In the context of catering both to "serious" business (and science) and to the consumer/casual level, there were numerous machines that COULD have done that prior to the PC going mainstream, but none had been managed or marketed that way. (the Apple II had huge potential for a broad range of evolving high- to low-end machines, Tandy had a lot of potential there too, and, of course, Atari had some real potential with the A8 -in fact, engineers had originally wanted to include Apple II-like expansion . . . had they done that AND offered lower-end/home oriented models -and promoted a wide range of software from games to serious business and academic applications- they might have ended up with a winning mass market machine -of course, additional tailoring would have been necessary to cater to Europe for both business/professional and home use)

One could argue the C64 had potential for that even, but it came a bit too late to really push that side of things, and the way it managed its market penetration didn't really cater to pushing into the high-end market after the fact. (perhaps if they'd designed a high-end/serious incarnation of the C64 in parallel with the breadbox model, it could have worked . . . CBM definitely screwed over their business and education market in the early 80s; it's not like they stopped trying to push there too, but they didn't address some of the fatal shortcomings of the original PET line and tended to make generally less attractive successors -incompatible, no more flexible for expansion, less attractive price point, etc -the B128 would be among those failures as well . . . OTOH it also may have made sense for CBM to make a hybrid C64+PC compatible machine -probably would have been better than the C128, then again they should have just been pushing the Amiga and C64 at that point anyway -if they didn't have the Amiga, going PC could have made the most sense, but otherwise they could have molded the Amiga into a wide range of machines catering to the mid-range consumer, small business, serious business, etc, etc)

 

The PC200 was a terrible design but it could have ISA cards inserted...they just stuck out the top. Biggest problem was 8086 CPU at the time.

An 8086 as opposed to what? A 286 wouldn't have really worked with a cheap/low-end design, though perhaps an NEC V30 would have. Maybe they could have gotten a good deal on an Am286 (or perhaps a 186), and either made a high-volume order for an embedded PC chipset or invested in making their own ASICs.

But really, in 1988 it was too late to push that; going off the shelf was cheaper by that point, as Atari learned.

 

Now, maybe if Amstrad had been really aggressive in pushing PCs and had gotten a license to distribute/localize the Tandy 1000 (or a more embedded derivative thereof) for the European market back in the mid 80s, they might have had something. (Tandy had an embedded PC chipset with decent graphics/sound capabilities on top of that)

That would have meant getting into the market before the ST was established and possibly making the PC mainstream in Europe a decade before it was historically. (or it might have still fallen behind the ST and ended up as more niche competition)

 

Pushing a higher-end Amiga (or Flare) like chipset would have compromised the cost effectiveness of such a design, so a Tandy-like machine (especially one fully Tandy compatible) would probably have been the best option. (maybe, just maybe, they could have collaborated with Tandy to further expand their standard to have more comprehensive features -Tandy didn't end up doing that though; AFAIK they also didn't end up investing in custom EGA/VGA compatible embedded ASICs either, just off-the-shelf video cards -a best case would be trying to comply with mass market standards while minimizing cost . . . and rather than continuously investing in in-house R&D, it might be preferable to outsource or license -ATI was pushing a lot of consolidated graphics solutions, so perhaps they could have partnered with ATI to get an embedded VGA compatible chipset that also complied with Tandy video -ATI was already offering cards with Plantronics support, and that was nearly identical to TGA- plus ATI cloned the IBM 8514 blitter/accelerator and released it in the Mach 8 in 1991, so that could have possibly been rolled into the standard as well -piggybacking on that standard would make a lot more sense than going with a new custom blitter, especially if they could get it at a reasonable price -having that as a standard feature could mean a lot more support too -vs games at the time just using plain unaccelerated VGA, aside from a few not even supporting that)

 

End of the day pre 1991 every game released the best version in the EU was the Amiga for 2D gaming. 3D gaming however on a proper 386 was faster than ST/Amiga. But then they did cost over £1000 so.....unless you liked rubbish American games like Sierra adventures etc or other boring crap most games were EGA only

WTF? Those awesome classic graphic adventures RUBBISH??? RUBBISH??? (not to mention LucasArts' stuff, or Origin, or EA)

 

 

 

 

 

 

 

Anyway, much of this topic comes down to business and non-technical factors. Powerful features and good cost effectiveness can play a significant role, but they're only a small part of what makes a successful mass market product (especially in the US). Less powerful features with greater support and compatibility can be much more valuable (to developers/publishers and consumers) than sheer capabilities/power without strong support.

And then there are the many non-technical issues: sheer breadth of software support (which can be gained in a number of ways -ease of programming helps, but having a strong brand name -especially a reputation as a stable and successful company- can be huge, and -in the long run- market share is obviously a factor, though actually GETTING market share is the real issue; it's the result, not the cause ;)), consumer confidence in the product, marketing, price point, perceived value, sheer monetary funding and credit, and good overall business management -all critical.

 

A (technically) weaker product with good funding and management is often much better off than an amazing product with crap management and support (or just crap funding . . . or all of the above -good/efficient management can do more with less funding, and massive funds are hardly foolproof -as with NEC with the TG-16).

In the video game industry, Sony's introduction of the PSX was more or less a perfect storm (pretty clean engineering, massive advantages in vertical integration, a good feature set, excellent development tools for the time, tons of cache, a good reputation -albeit no real history in the game market, and NEC showed how that was hardly foolproof- massive spending showing how serious they were -proportionally conservative investment was one of NEC's failings too- and good management on top of all that, not to mention the competition making serious mistakes)

 

The PS2 is a better example of how the technical end isn't the most important thing . . . the PS2 was a relatively efficient design, but a tough architecture to work with (compared to market standards), with rather weak SDKs from Sony (so not like the Saturn in terms of poor cost effectiveness, but perhaps closer to the Jaguar in that sense). They had so many other advantages that developers flocked to the platform anyway and eventually built some pretty good in-house SDKs. (I think Carmack may have been the only one to really come close to that for the Jaguar -rather surprising given just how small the Jag's market share was)

 

The Dreamcast is another example of how technical brilliance can't win alone . . . the hardware feature set was great, the cost effectiveness was great, the SDKs were the best on the market at the time (possibly the best of any system at launch ever), and even the marketing was pretty good (in the US), but Sega's overall cash flow and management problems and Sony's massive competition hurt it (along with poor management/marketing in Europe). SoJ didn't feel the machine could be viable in the US market alone (which was the only region where it was really selling at mainstream volumes), so they cut it prematurely. (it's arguable whether they'd have been better or worse off had they kept pushing the DC -certainly not as clear cut as Atari pulling out in '96 where that was pretty much the smartest move possible)

 

 

 

 

 

Marketing of the ST and Amiga may have been one of the biggest weak points of those machines in the US as well. Advertising alone was part of that, but other things like form factor (especially the ST's lack of a desktop format until '87) and some other marketing areas weakened the machines. (including general market positioning . . . CBM made much bigger mistakes given they had much more funding and market share to build on; Atari did pretty damn well under their severe limitations -things consistently got better and better for Atari Corp up to 1989, when the consistent decline began -they'd gone from deep in debt to a Fortune 500 company within 4 years, but then it began falling apart)

Lack of expandability could also be considered a marketing failure (especially if a closed box form factor was chosen for business and not technical reasons -ie to force users to buy a new system rather than upgrading internally; a terrible idea by that time).

 

Pushing for PC cross compatibility on the software/file format end would also be a marketing dependent area.

 

 

 

Hell, many of the technical mistakes with the A8 were more business/management related. (like the "appliance computer" concept, pushing that further with the 1200 XL, not catering to the needs of the European market -and not catering to many important areas in the US market either)

Same thing for CBM deciding to push out the C128, Plus/4, C-16, etc. and crowd the market with unnecessary hardware. (the C128 was OK, but focusing purely on the C64 and Amiga -and compatible derivatives thereof- would have filled all major market sectors with far less confusion -the lack of compatibility from PET to VIC to C64 to Amiga was bad enough, but throwing the rest in made a real mess -they could have finally done it right with the Amiga and established a base standard with a wide range of machines tied together by an open-ended, highly expandable architecture -of course, Atari Corp should have done that with the ST too . . . actually a lower-end ST derivative probably would have made more business sense than the 130XE)

Edited by kool kitty89
Link to comment
Share on other sites

End of the day pre 1991 every game released the best version in the EU was the Amiga for 2D gaming. 3D gaming however on a proper 386 was faster than ST/Amiga. But then they did cost over £1000 so.....unless you liked rubbish American games like Sierra adventures etc or other boring crap most games were EGA only

I always wondered what I missed out on by being in the US.

 

By 1987 or so, my friends were all done with home computer gaming. The C64s were in the closet and everybody was working their way through Zelda. (Sorry Kool Kitty, but King's Quest was not exactly considered cool next to Zelda...)

 

Shortly after, it was all about the Genesis, and Gameboys. I don't remember computer gaming being anything popular over here until Wolfenstein 3D.

 

I know this is all anecdotal, but I felt like the US (or maybe just my part) missed out on the 16-bit era of computer gaming. I was in love with Dungeon Master and Populous, but my friends thought computer games were for nerds, and all the cool stuff was on consoles.

 

- KS

Edited by kskunk
Link to comment
Share on other sites

APOGEE put the PC on the map for gaming. That's when most of my peers began reconsidering the PC for gaming.

 

Most were using consoles, or a "home computer" for that purpose. C64 / Atari machines were a great deal against the NES for a while, because one could get a ton of free games! Lots of people did that, with huge copy parties going on all over the place.

 

Apples were here and there, usually made available due to a PC replacement. That's how I got my first very well equipped ][+

Link to comment
Share on other sites

End of the day pre 1991 every game released the best version in the EU was the Amiga for 2D gaming. 3D gaming however on a proper 386 was faster than ST/Amiga. But then they did cost over £1000 so.....unless you liked rubbish American games like Sierra adventures etc or other boring crap most games were EGA only

I always wondered what I missed out on by being in the US.

 

By 1987 or so, my friends were all done with home computer gaming. The C64s were in the closet and everybody was working their way through Zelda. (Sorry Kool Kitty, but King's Quest was not exactly considered cool next to Zelda...)

I know that, PC games were special and niche . . . they had different genres, often ones that appealed to a more narrow crowd. I wasn't around for that time, obviously, but I was there for the early 90s with PC games alongside console stuff.

 

Actually, it was pretty much my dad who was the gamer at home. He was the one who got the NES (either as a gift or at friends' suggestion), and before that it was mostly computer games. (a few games on his TRS-80 Model 3, but mainly stuff on the ST or Amiga at work -he worked at Metacomco and did a fair amount of work on both platforms in the late 80s -IIRC they did a fair amount of work on the ST's OS and on BASIC for both systems, though Atari didn't end up using their BASIC IIRC; need to ask him again. I think he may have done a little bit of PC stuff before he got an NES in 1990, but the real PC stuff started in the early 90s -especially after he built his own PC- a lot of flight sims and graphic adventure games in particular. Then he built an odd hybrid, multimedia-capable -but with a grayscale monitor until 1995- shared/family PC where I started getting into PC games, and then he started getting into Quake 2 and Unreal, and then there was Grim Fandango, some later graphic adventures, etc, etc ;) -in between that late 90s period and the NES, the SNES got used a good deal too -but in the mid 90s, X-Wing and Return to Zork really hang in my mind -X-Wing I played a lot; Return to Zork was more of a back-seat gaming thing, like a lot of games I watched my dad play through)

He spent a lot of time on the NES: Xexyz, Air Fortress, Zelda, Rolling Thunder, Treasure Island Dizzy, Super Robin Hood -good times. ;) Then in the mid 90s (Christmas of '96 I believe) we got our SNES, and he spent a lot of time on that too (I especially remember him playing Link to the Past and Yoshi's Island). And then we got the N64 at the end of 1999. (he spent the most time in the 2 Zelda games IIRC, but a good chunk in Mario 64 and Shadows of the Empire too -I hadn't actually realized the latter had been on PC at the time; we tended to get the PC versions of such games as they were often cheaper and looked/played better -and we had a good gaming PC years before an N64 -though I also got my own PC just after the N64; before that it was all the shared family PC -initially rebuilt from the existing family PC with the Rage Pro and a 500 MHz K6-2 IIRC, with some Sound Blaster 16 compatible card -might have been better than that, but we almost always used the FM synth MIDI options for some reason- all put together into an old PC/AT era case with a big DIN keyboard connector ;) -still have that case . . . actually I think that same old motherboard and processor ended up coming back together in that case as -what was supposed to be- a diagnostic computer in our garage -though I don't think it's ever been used for that purpose, or used much at all since it went in there ;))

 

Shortly after, it was all about the Genesis, and Gameboys. I don't remember computer gaming being anything popular over here until Wolfenstein 3D.

Oh yes, it took quite a bit to get PC gaming really mainstream. It was really still niche (though growing) until Windows 95 brought things together on the mass market, and PC gaming got huge in the late 90s (rivaling the console market in the US) before settling down in the early 2000s to around where it is today. (not niche, but not nearly as big; more of a secondary market filled with cross-platform games and a few exclusives)

 

Actually, for us, it was still all about the NES in the early 90s; we started getting into PC gaming well before anything else. It's a bit odd, but most friends my age that I knew were also mainly (or at least heavily) playing the NES in the mid 90s, and it wasn't until the N64 that I really saw a shift (the N64 seemed significantly more popular in my area than the PSX or Saturn -I saw more Saturns than PSXs early on, though a ton of PSXs by the early 2000s- though maybe it was also the age group -kids my age were the prime market for the N64). Anyone I knew who had an SNES or Genesis was also still playing (and trading/loaning) NES games heavily (one friend had an NES 2 as well as a Genesis and SNES at home, so he obviously got an NES very late anyway).

 

My family ended up a bit odd with consoles and computers . . . we've almost always gotten consoles late in the generation (usually used, sometimes new -the Wii and NES are the only ones we got new IIRC, maybe the GC since it was on sale with a special bundle deal in 2004), and we almost always bought used/on-sale games (we ended up with some rather obscure titles too, and lacked some common ones like Mario 2, 3, all the Mega Man games, Contra, Castlevania, etc -but had things like Xexyz, Air Fortress, Rolling Thunder, Quattro Adventure, etc). And then our first family PC was that odd mix of rather powerful base hardware (a fast 486-class CPU -probably AMD- a CD-ROM drive, an SB-16 compatible sound card, etc) but still a used (and slightly beaten up) grayscale monitor, all in an old PC/AT case. (he shopped around for the best deals on parts too, be it used or new on sale -he frequented a few local used computer warehouses doing that)

But by the mid/late 90s, we had a pretty sweet gaming/multimedia PC set-up (and he had an even better one in his office), and that pretty much stayed the case through the late 90s and into the early 2000s -from software renderers to the PCI Rage Pro -which he hacked to play DVDs with some beta drivers in spite of the lack of official firmware for the PCI version; very few DVDs actually hit bandwidth high enough to cause trouble- and he had a Voodoo3 in his office workstation at one point IIRC, then a Radeon 9500 Pro with his dual Athlon MP workstation, and the family PC went up to a Pentium III in the late 90s and then a 1.3 GHz Celeron. Then we went Athlon XP for my computer, my brother's, and the family one, with a mix of Radeon and GeForce cards, but then things stopped -we've still got an Athlon XP PC for our shared computer, and I'm using my newer -but gaming-poor- laptop. (I was still using that Athlon XP machine -the slowest one in the house, with an old 180 nm 1600+- until I switched to my laptop in late 2009)

 

Again, my dad didn't really get into PC gaming until around 1992/93 (when he was still very into the NES too), and by the time I actually started playing stuff (initially mostly edutainment, but then stuff like X-Wing by '94/95), I'd missed out on a lot of the classics of the day . . . actually I continued to miss out on some awesome games that appeared in the mid/late 90s (the entire Wing Commander franchise as well as FreeSpace -I'm a huge space sim fan, and I've finally gotten all the WC games but haven't played through any yet -I really want to build a good DOS/9x gaming PC to play them right, at least III and IV -and I could go straight to V, but I kind of wanted to play them in order).

Then there are some games I knew about because of my elementary and middle school computer labs (Jazz Jackrabbit 1 and 2, Descent, and some others -lots of fun with LAN play in Jazz 2, and later on Half-Life in our middle school "computer club").

 

Oh, and of course, at home in the mid/late 90s we had quite a few Sega PC games too. ;)

 

 

 

 

Anyway, yes, I totally understand the computer gaming market was niche; that was a point I mentioned earlier. It was the same in Japan: even the MSX was a small/niche machine, and the PC-8801/9801 (which were much more heavily supported -like PCs came to be) were also a niche game market, very much like PC games in the US. The X68000 and FM Towns were somewhat like the ST and Amiga for the Japanese market. (totally niche too, but the best of the computer gaming end of things)

I've actually seen some who claim the Amiga to have been the definitive game computer in the US in the late 80s up into the early 90s, not just in terms of quality and people "in the know", but in general quantity as well. (ie there were more actual gamers playing games consistently on the Amiga than the PC -though that could also be a regional thing, and if that was ever true in the US, it was probably between 1987 and 1991)

 

I know this is all anecdotal, but I felt like the US (or maybe just my part) missed out on the 16-bit era of computer gaming. I was in love with Dungeon Master and Populous, but my friends thought computer games were for nerds, and all the cool stuff was on consoles.

That's pretty true though, most of the neat PC (or computer) games of the time WERE for the nerds so to speak. ;) (some exceptions, but many of the niche genres were best suited to those)

Even more so for some early 80s computer games . . . especially text adventures. ;) (I know my dad was into several of those, including the original Zork series -I myself had a bit of fun with those, though I never actually completed them ;))

 

Anyone who really liked gaming in the US at that point, but liked what was on the PC too, almost certainly would have had both an NES and a PC (or possibly another computer).

 

Albeit, anyone REALLY into video games beyond a casual basis was probably a nerd too. (cool nerd or not) My dad definitely fell into that category. (probably the cool nerd case knowing my dad) :P

 

 

I'm also really glad some of those genres are making a comeback, like what Telltale Games has been coming out with recently.

Edited by kool kitty89
Link to comment
Share on other sites

If you check sites like Home of the Underdogs you will see that up to around 1990/91 VGA versions of most PC games didn't exist, and early VGA games used Amiga 32 colour graphics anyway, therefore the best version of a game was either on the Amiga or the Sega Genesis/Megadrive. I really don't understand why yanks were buying PCs for home use. If you want to play games from 1985-91 in their best incarnation you will have to source the Amiga versions for use on an emulator. Like I said, in the EU the home PC was nowhere until Commodore went bust, really.

 

The less said about the pathetic addiction to that pathetic NES the better. Again in the EU the NES was nowhere, and in the UK a complete and utter flop because we weren't stupid enough to buy such overpriced low tech hardware and games.

Link to comment
Share on other sites

Proof that PC in the 80s couldn't compete for home sales.

Anecdotally, almost everybody I knew in the 80s used a PC for a home computer. I was sort of a freak for having a ST. Of course, I'm speaking from the US. I know it was different in other countries.

 

Tandy 1000s outsold Atari STs in the US, and those were not business-oriented machines.

 

The PC200 (which you linked to) suffered from being expensive and wimpy, competing against an ocean of cheap, powerful, clones. Although I really love the ST-ripoff design, who wanted CGA graphics for Christmas 1988? And no chance of fitting standard PC cards, no hope of adding a hard disk?

 

-KS

 

Difference between USA and EU, only the odd ex Acorn BBC Micro user bought a PC for home use....AKA idiots with money to burn. Even in 1990 EGA was de-facto for home PCs.

The thing is, they weren't just idiots . . . there's a REASON the Apple II and PC got a lot of attention and respect from the mass market: 1. flexible open architecture expandability (important for hobbyists and general consumers due to forwards compatibility and flexible expandability), and 2. they damn things just got good software support. In IBM's case, they also had a MASSIVE brand name to gain instant respect from the market. (the high build quality was also a major selling point -the high quality keyboard was a very significant factor for business use) For a time, the TRS-80 line also shared some of that, but that ended up falling apart later on. (as did the Apple II, though it survived surprisingly long in spite of Apple virtually trying to kill it off from 1980 onward -Commodore had a place in the business/professional market early on 2, but their's faded fastest by FAR with the PET languishing rapidly in the early 80s -they also, unlike Apple and Tandy -or IBM- intentionally limited expandability and aimed at an "appliance computer" format more like the Macintosh and Atari 8-bit -the ST was rather like that too)

Gaining respect from the market ensured good software support as well as respected use from certain high-end/professional niches.

 

The cost was high relative to raw performance, but there was a TON of value beyond raw hardware capabilities. (that value includes tangible things like software and peripheral availability or expandability, to things like brand prestige)

 

The BBC Micro had much of that too, except it was also an exceptionally capable computer for its time (2 MHz 6502 without wait states, OK graphics -plenty useful for business purposes and OK for games- and pretty good sound for the time, plus one of the most efficient Operating systems on the market -one of the few functional enough to get a high degree of high-level software support when most other systems totally bypassed the OS and went for hardware alone).

Add flexible expansion slots, and you'd have an Apple II killer if marketed right in the US. ;)

 

Price point was the only thing that really limited all of those machines in the mass market, in part due to manufacturers not aiming at that market or screwing up when they did attempt to go lower-end. (few to none seemed to understand the potential of supporting a broad range of intercompatible machines spanning the low-end consumer to mid-range/small business/education, to professional business/science, to workstation class machines) You saw some cases with potential, like the PCJr and Acorn Electron, but most/all of those were botched . . . albeit Tandy managed to do the PCjr right with a much more mainstream line of machines (and if Kskunk's recent comments are accurate, Tandy actually brought the PC to the lower-end/mid-range mass market, and played a huge part in market saturation of PC clones). It would have been interesting to see how the Acorn Electron would have turned out had it not been so gimped. (ie if it had been a REAL BBC Micro, but just at lower cost from the new LSI chips -not quite as cheap, but a much better value)

 

The value of those "serious" machines obviously allowed them to be sold at very high margins and still be successful, but without lower priced models, they really couldn't penetrate the mass market. (something that was obviously a much bigger problem for the extremely price sensitive Euro market . . . not to mention the greater gaming demands of that market)

 

Games drove all computer markets to some degree, but the US (and Japan) were never like Europe (especially their peak in the 80s). Both the US and Japan were limited to consoles first and a few niche gaming computers along with the dominant more "serious" computers (tandy, apple II, PC, PC8800 and 9800, etc). The C64 (and to lesser extent VIC and A8) boom in the early/mid 80s was short lived and even at its height, it never came close to the market saturation of computer gaming in Europe. (consoles stayed very significant on the whole, and it was only really 1984 and 1985 that saw a real shift toward computer support -and some market regions were already shifting back towards consoles by late 1985)

From what I understand, the late 80s saw the shift to a new niche PC (and Amiga and ST) gaming market with PC becoming the dominant computer gaming platform by the late 80s. (and by 1990 -with a significant number of games supporting VGA and Adlib with full Soundblaster support soon following, and the PC not only getting the dominant amount of support on the software end, but also becoming a seriously capable gaming machine -a fast 286 with 512-640k, 16-bit VGA card, and 8-bit sound blaster made a pretty nice game machine for the time -VGA really helped on the graphics end, well beyond just the added color you had hardware scrolling, some primitive raster op accleration, and 8-bit packed pixels -fast/easy to manipulate for software blitting, so a huge step over dealing with EGA's bitplanes -though at least the Tandy had 4-bit packed pixels, so you coudl easily do software blits on byte boundars without being to choppy -working with byte boundaries with 1bpp graphics or bitplanes means 8 pixel wide -a big part of why you see a lot of games moving objects/BGs at choppy 8-pixel wide intervals on the ST and Spectrum)

 

Of course, by that point, the "cheap" end of the PC market had expanded well beyond Tandy and had almost reached the point where users could easily build their own machines from off the shelf parts. (I think that wasn't quite there in 1990, but definitely became a reality in the early 90s)

 

 

 

I'm getting off topic here though.

In context of bother catering to "serious" business (and science) and the consumer/casual level, there were numerous machines that COULD have done that prior to the PC going mainstream, but none had been managed or marketed that way. (Apple II had huge potential for a broad range of evolving high to low end machines, Tandy had a lot of potential there too, and, of course, Atari had some real potential with the A8 -in fact, engineers had originally wanted to include Apple II like expansion . . . had they done that AND offered lower-end/home oriented models -and promoted a wide range of software from games to serious business and academic applications- they might have ended up with a winning mass market machine -of course, additional tailoring would have been necessary to cater to Europe for both business/professional and home use)

One could argue the C64 had potential for that even, but it came a bit too late to really push that side of things, and the way it managed its market penetration didn't really cater to pushing into the high-end market after the fact. (perhaps if they'd designed a high-end/serious incarnation of the C64 in parallel with the breadbox model, it could have worked . . . CBM definitely screwed over their business and education market in the early 80s, it's not like they stopped trying to push there too, but they didn't address some fatal the shortcomings of the original PET line and tended to make generally less attractive successors -incompatible, no more flexible for expansion, less attractive price point, etc -the B128 would be among those failures as well . . . OTOH it also may have made sense for CBM to make a hybrid C64+PC compatible machine -probably would have been better than the C128, then again they should have just been pushing the Amiga and C64 at that point anyway -if they didn't have the Amiga, going PC could have made the most sense, but otherwise they could have molded the Amiga into a wide range of machines catering to mid-range consumer to small business to serious business, etc, etc)

 

The PC200 was a terrible design but it could have ISA cards inserted...they just stuck out the top. Biggest problem was 8086 CPU at the time.

An 8086 as apposed to what? 286s wouldn't have really worked with a cheap/low-end design, though perhaps an NEC V30 would have worked. Maybe they could have gotten a good deal on AM286 (or perhaps a 186), and either made a high-volume order for an embedded PC chipset (or invested in making their own ASICs).

But really, in 1988 it was too late to push that, going off the shelf was cheaper by that point as Atari learned.

 

Now, maybe if Amstrad had been really aggressive in pushing PCs and got a license to distribute/localize the Tandy 1000 (or a more embeded derivative thereof) for the European back in the mid 80s, they might have had something. (Tandy had an embedded PC chipset with decent graphics/sound capabilities on top of that)

That would have meant getting into the market before the ST was established and possibly making the PC mainstream in Europe a decade before it was historically. (or it might have still fallen behind the ST and ended up as more niche competition)

 

Pushing a higher-end Amiga (or Flare) like chipset would have compromised the cost effectiveness of such a design, so a Tandy-like machine (especially fully Tandy compatible) would probably have been the best option. (maybe, just maybe, they could have collaborated with Tandy to further expand their standard to have more comprehensive features -Tandy didn't end up doing that though, AFIK they also didn't end up investing in custom EGA/VGA compatible embedded ASICs either, just off the shelf video cards -a best case would be trying to comply with mass market standards and minimizing cost . . . and rather than continuously investing in in-house R&D, it might be preferable to outsource or license -ATi was pushing a lot of consolidated graphics solutions, so perhaps they could have partnered with ATI to get an embedded VGA compatible chipset that also complied with Tandy video -ATi was already offering cards with Plantronics support, and that was nearly identical to TGA- plus ATi cloned the IBM 8514 blitter/accelerator and released it in the Mach 8 in 1991, so that could have possibly been rolled into the standard as well -piggybacking on that standard would make a lot more sense than going with a new custom blitter, especially if they could get it at a reasonable price -havign such as a standard feature could mean a lot more support too -vs all games using just plain VGA acceleration at the time, aside for a few not even supporting that)

 

End of the day pre 1991 every game released the best version in the EU was the Amiga for 2D gaming. 3D gaming however on a proper 386 was faster than ST/Amiga. But then they did cost over £1000 so.....unless you liked rubbish American games like Sierra adventures etc or other boring crap most games were EGA only

WTF? Those awesome classic graphics adventures RUBBISH??? RUBBISH??? (not to mention Lucas Art's stuff, or Origin, or EA)

 

 

 

 

 

 

 

Anyway, much of this topic is down to business and non-technical. Powerful features and good cost effectiveness can play a significant role, but are only a small part of what makes a successful mass market product (especially in the US). Less powerful features with greater support and compatibility can be much more valuable (to developers/publishers and consumers) than sheer capabilities/power without strong support.

And then the many non-technical issues, like sheer breadth software support (which can be gained in a number of ways -ease of programming helps, but having a strong brand name -especially a reputation as a stable and successful company- can be huge, and -in the long run- market share is obviously a factor -actually GETTING market share is the real issue, it's the result, not the cause ;)), consumer confidence of the product, marketing, price point, perceived value, sheer monetary funding and credit, and good overall business management are all critical.

 

A (technically) weaker product with good funding and management is often much better off than an amazing product with crap management and support (or just crap funding . . . or all of the above -good/efficient management can do more with less funding, and massive funds are hardly foolproof -as with NEC with the TG-16).

In the video Game industry, Sony's introduction of the PSX was more or less a perfect storm (pretty clean engineering, massive advantages in vertical integration, good feature set, excellent development tools -for the time, tons of cache, a good reputation -albeit no real history in the game market and NEC showed how that was hardly foolproof, massive spending -showing how serious they were, proportionally conservative investment was one of NEC's failings too, and good management on top of all that -not to mention competition making serious mistakes)

 

The PS2 is more of a good example on how the technical end isn't the most important . . . the PS2 was a relatively efficient design, but a tough architecture to work with (compared to market standards) with rather weak SDKs from Sony (so not like the Saturn in terms of poor cost effectiveness, but perhaps closer to the Jaguar in that sense). They had some many other advantages that developers flocked to the platform anyway and eventually built some pretty good in-house SDKs. (I think Carmak may have been the only one to really come close to that for the Jaguar -rather surprising given just how small the Jag's market share was)

 

The Dreamcast is another example of how technical brilliance can't win alone . . . the hardware feature set was great, the cost effectiveness was great, the SDKs were the best on the market at the time (possibly the best of any system at launch ever), and even the marketing was pretty good (in the US), but Sega's overall cash flow and management problems and Sony's massive competition hurt it (along with poor management/marketing in Europe). SoJ didn't feel the machine could be viable in the US market alone (which was the only region where it was really selling at mainstream volumes), so they cut it prematurely. (it's arguable whether they'd have been better or worse off had they kept pushing the DC -certainly not as clear cut as Atari pulling out in '96 where that was pretty much the smartest move possible)


Marketing of the ST and Amiga may have been one of the biggest weak points of those machines in the US as well. Advertising alone was part of that, but other things like form factor (especially the ST's lack of a desktop format until '87) and some other marketing areas weakened the machines. (including general market positioning . . . CBM made much bigger mistakes given they had much more funding and market share to build on; Atari did pretty damn well under their severe limitations -things consistently got better and better for Atari Corp up to 1989, when the consistent decline began -they'd gone from deep in debt to a Fortune 500 company within 4 years, but then it began falling apart)

Lack of expandability could also be considered a marketing failure (especially if a closed-box form factor was chosen for business and not technical reasons -i.e. to force users to buy a new system rather than upgrading internally; a terrible idea by that time).

 

Pushing for PC cross-compatibility on the software/file-format end would also be a marketing-dependent area.


Hell, many of the technical mistakes with the A8 were more business/management related. (like the "appliance computer" concept, pushing that further with the 1200 XL, not catering to the needs of the European market -and not catering to many important areas in the US market either)

Same thing for CBM deciding to push out the C128, Plus/4, C16, etc. and crowd the market with unnecessary hardware. (the C128 was OK, but focusing purely on the C64 and Amiga -and compatible derivatives thereof- would have filled all major market sectors with far less confusion -the lack of compatibility from PET to VIC to C64 to Amiga was bad enough, but throwing the rest in made a real mess -they could have finally done it right with the Amiga and established a base standard with a wide range of machines tied together by an open-ended, highly expandable architecture -of course, Atari Corp should have done that with the ST too . . . actually a lower-end ST derivative probably would have made more business sense than the 130XE)

Even with 8-bit VGA and a basic AdLib card or early Sound Blaster, a PC could not even play a basic shoot 'em up arcade game properly for many years. People were excited by the picture-perfect photos and basic games, and it sold a lot of PCs; others like myself were just "meh" about it. No arcade games, no sale. (Not to mention having to get used to an analog joystick.)

Later on, with the arrival of titles like The 7th Guest, these really made PCs mainstream for gaming and cut into those who still had an ST or Amiga.

Also, around the same time frame, a PC could not compete at all with even the simplest desktop publishers on the ST (e.g. Publishing Partner) or Amiga. Word processing, though windowed, still had many DOS functions, and it felt like you were using some old POS for a word processor. I am sure the old lady secretaries of the day would disagree.

Edited by atarian63

If you check sites like Home of the Underdogs you will see that up to around 1990/91 VGA versions of most PC games didn't exist, and early VGA games used Amiga 32-colour graphics anyway, therefore the best version of a game was either on the Amiga or the Sega Genesis/Megadrive.

Actually, they weren't. Starting around 1989, you started getting some high-end games with VGA support, many of which would never be ported to non-PC platforms, or were ported years later and often with poor quality.

 

Wing Commander is one of the definitive examples; it (like many Origin games) was pushing the limits of the hardware at the time and was designed around VGA . . . it had EGA support, but that required a special installation with software conversion of all the animation from VGA to EGA format.
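
To give a rough idea of what that kind of conversion involves (this is just an illustrative sketch in C, not Origin's actual installer code, and the palette/frame inputs are hypothetical): each 256-colour VGA frame gets remapped to the fixed 16-colour EGA palette by a nearest-colour match, which is a big part of why the EGA version looks so washed out.

/* Minimal sketch (NOT Origin's actual converter): remap a 256-colour
 * VGA frame onto the fixed 16-colour EGA palette by nearest colour. */
#include <stdint.h>
#include <stddef.h>

/* The 16 standard EGA colours as 8-bit RGB triples. */
static const uint8_t ega_pal[16][3] = {
    {0,0,0},     {0,0,170},   {0,170,0},   {0,170,170},
    {170,0,0},   {170,0,170}, {170,85,0},  {170,170,170},
    {85,85,85},  {85,85,255}, {85,255,85}, {85,255,255},
    {255,85,85}, {255,85,255},{255,255,85},{255,255,255}
};

/* Return the EGA colour index with the smallest squared RGB distance. */
static uint8_t nearest_ega(uint8_t r, uint8_t g, uint8_t b)
{
    uint32_t best = 0xFFFFFFFFu;
    uint8_t best_i = 0;
    for (int i = 0; i < 16; i++) {
        int dr = r - ega_pal[i][0];
        int dg = g - ega_pal[i][1];
        int db = b - ega_pal[i][2];
        uint32_t d = (uint32_t)(dr*dr + dg*dg + db*db);
        if (d < best) { best = d; best_i = (uint8_t)i; }
    }
    return best_i;
}

/* Convert one frame: 'src' holds 8-bit indices into 'vga_pal' (256 RGB
 * entries); 'dst' receives EGA indices, one per byte for simplicity. */
void frame_vga_to_ega(const uint8_t *src, uint8_t *dst, size_t pixels,
                      const uint8_t vga_pal[256][3])
{
    for (size_t i = 0; i < pixels; i++) {
        const uint8_t *rgb = vga_pal[src[i]];
        dst[i] = nearest_ega(rgb[0], rgb[1], rgb[2]);
    }
}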

Wing Commander on the Amiga was crap (unless you had a fast 68030), whereas it was at least decent on a 12-16 MHz 286 PC, pretty damn good on a 20 MHz 286 or 386SX from what I understand, and at its limit with a 33 MHz CPU -any faster and the game would play too fast, making it one of the few later-gen timing-sensitive PC games . . . albeit some Windows games continued to suffer from that.
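
That kind of speed sensitivity usually comes from pacing the game by how quickly the CPU gets through the main loop rather than by a real clock. Just as an illustration (generic C, not Wing Commander's actual code; a real DOS game would typically read the BIOS tick counter or reprogram the PIT instead of calling clock()), the difference looks roughly like this:

#include <stdio.h>
#include <time.h>

/* Speed-sensitive pacing: the game advances once per loop iteration,
 * so it runs faster on every faster CPU (why "turbo" buttons existed). */
void run_cpu_paced(int frames)
{
    for (int f = 0; f < frames; f++) {
        /* update_world(); draw_frame();  -- hypothetical game work */
    }
}

/* Clock-paced: only advance the simulation when enough real time has
 * elapsed, so the game plays at the same speed on a 286 or a Pentium. */
void run_clock_paced(int frames, double target_fps)
{
    const double step = 1.0 / target_fps;   /* seconds per game tick */
    clock_t start = clock();
    double next = step;
    for (int f = 0; f < frames; ) {
        double elapsed = (double)(clock() - start) / CLOCKS_PER_SEC;
        if (elapsed >= next) {
            /* update_world(); draw_frame();  -- hypothetical game work */
            f++;
            next += step;
        }
    }
}

int main(void)
{
    run_clock_paced(60, 18.2);   /* ~18.2 Hz, the default PC BIOS tick rate */
    printf("done\n");
    return 0;
}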

Monkey Island is another game that fully benefited from VGA. (it supported EGA and CGA too, but didn't look nearly as good -the ST and Amiga versions had optimized graphics too; the ST definitely had optimized graphics converted from VGA and not just the crappy EGA colors ported over)

 

I really don't understand why yanks were buying PCs for home use. If you want to play games from 1985-91 in their best incarnation you will have to source the Amiga versions for use on an emulator.

They were buying PCs because that's what you needed if you wanted any decent compatibility with mass-market software. Availability of good Commodore or Atari dealers was spotty and hit or miss, but PC vendors were widespread, and Radio Shack in particular was across the country (not the best salesmen on average, but still better than having good sales reps in some areas and a total absence in others -which seems to have been the case from anecdotes I've heard).

 

Like I said in the EU the home PC was nowhere until Commodore went bust really.

Yes, it failed to dig in early on, as there wasn't the massive infrastructure demanding high-end "serious" business computers, let alone the likes of Tandy and such pulling that tech into the range of the average consumer (and making it more attractive in general). (or the prestige of IBM, sensationalist marketing, etc., etc. -you also have a dense population and a stronger magazine culture to keep BS marketing in check with realistic reviews and criticism -well, better than in the US at least . . . at least back then ;))

 

The less said about the pathetic addiction to that pathetic NES the better. Again in the EU the NES was nowhere, and in the UK a complete and utter flop because we weren't stupid enough to buy such overpriced low tech hardware and games.

It's all about marketing and appeal for the mass market; there's a ton of stuff in Europe (and Japan) that would have flopped (or did flop) in the US market due to clashing market demands, or (more often) sheer weak advertising/marketing and/or funding. :P (With the Master System, it seems it was pure bad marketing given Sega's highly competitive marketing budget -yet they got outsold by the underfunded Atari with much weaker software. Of course, Nintendo's Japanese and subsequent US software exclusivity hurt, but Sega's in-house software was so strong that it should have at least managed a decent 2nd place.)

Home computer gaming died; though the C64 hung on for a little while against the consoles (the Atari 8-bit to a lesser extent), that quickly fell apart towards the end of the 80s. (1987 was when the NES really took off big time; 1988 is when they totally solidified a monopoly, with market share averaging between the high 70s and low 80% range; 1989 saw it peak at 90% -over 93% for the quarterly peak- and nearly that high for 1990 as well; by 1991 it had declined a fair bit, but was still ahead of the Genesis and SNES in sheer volume market share; in 1992 it dropped far behind in the figures I've seen, but was still very notable up through 1994, enough to keep Nintendo dominant on the market if you compare composite sales -they were losing the 16-bit market for much of the generation, but persisting NES sales pushed that back up considerably.)

 

I'm not going to start another NES defense argument (especially since I'll end up playing devil's advocate with myself), so I'll leave it at that. ;) (I already started going into the business end of things . . . I'm not going to start on the actual technical or "quality" or "good games" side of things again -I had enough of that with Underball :P )

 

Hell, you complain about the US, but what about Japan??? (FAR, FAR more absolute NES dominance, a small niche MSX/PC8801 game market, a pitiful market for Sega's consoles prior to the MD, and no real competition to Nintendo until the PC Engine in 1987. Hell, in Japan Nintendo managed to secure exclusive publishing contracts without even establishing any sort of lockout -it was as open to unlicensed development as the VCS/Intellivision/CV/etc., but Nintendo somehow managed to enforce some of the most restrictive and anticompetitive licensing contracts in the history of the video game or computer industry.)


1993: Star Wars X-Wing comes out.

 

Moved from ST to PC.

Yep, that's one of those milestone events. (though for me, we ended up getting X-Wing in '94 or '95 with the updated CD-ROM version . . . and a PC fast enough to play it at full speed at max detail ;))

 

Though as I said, that didn't stop us (my dad and me) from still playing on the NES, and later on the SNES from the end of 1996 on. (he'd played ST and Amiga at work prior to getting the NES in 1990 -as a gift from some friends)

 

For some people it was Wing Commander, for others X-Wing or TIE Fighter, or Wolf 3D, Doom, Duke 3D, Quake, etc., etc., but eventually you had a huge shift of PC gaming into the mainstream by the mid-1990s.

Edited by kool kitty89
