
Atari and Microsoft and the ST



Allegedly, while the ST was still in development, Tramiel was looking at possibly getting something like Microsoft Windows on the ST as the default OS.

 

The story goes that Microsoft couldn't get a properly working version of Windows running on the 68k platform within Atari's/Tramiel's projected timescale.

 

Why couldn't Tramiel have just loaned MS some of Atari's own crack coders/programmers to work with MS and port Windows over to the 68k platform? After all, didn't Atari's crack team of coders/programmers do something very similar with DRI's GEM OS, namely porting the x86-based OS to the 68k platform?

 

 

Then again, since the MS-based OS was increasingly dominant in the upscale computer market, would the US and EU authorities have allowed Atari and MS to engage in this sort of tie-up, since it would have given MS an unfair advantage and position within that market?

 

And given the development of Windows subsequent to legacy Windows (probably 3.x), what would the possibilities of an Atari version of, say, Windows 9x, ME, 2000/NT or XP look like, especially on higher-specced ST platforms like the TT/Falcon etc.?

 

The other thing being: would Tramiel/Atari have been forced into updating the ST spec (and similarly the TT/Falcon spec) to take into consideration the future development of the Atari Windows platform, by way of constantly upgrading the hardware, i.e. better graphics/sound chips etc.?

Edited by carmel_andrews

Remember that when the ST was in development, Windows 1 was not ready for release on any platform, and when it was released it was a rather poor DOS shell which could only tile its windows!

 

GEM was a far better alternative right up to Windows 3.x, where the two reached a similar level of technical capability and functionality, and that was the point when Windows started to be taken seriously by the world.

 

If you have ever had the misfortune to use Windows 1, you would wonder how Microsoft ever managed to become dominant in anything!

 

 


Edited by AtariMusicNet

Oh, boy! I had Microsoft Write on the ST and it was the worst program ever made. Ever!

It was the *only* program that wrote loads of temporary files to the floppy without cleaning up when closing. It was a joke of a program, and a testament to how good the coders at Microsoft are.


As far as I can remember, Atari was 100% committed to TOS/GEM in the belief that it was better than anything coming from Microsoft (at the time).


Madness.

Windows was a complete pile of puke until 3.0.

 

Actually, it wasn't much good either.

3.1 was OK.

 

Timeframe: Windows 3.0 was released in May 1990, 3.1 in March 1992.

 

By 1992 the ST was in its death throes.

 

I used GEM on the PC back in the day - it was far superior to the equivalent early Win OSes.


You had OS/2 in the late 80s as well. ;)


Oh, boy! I had Microsoft Write on the ST and it has been the worst program ever made. Ever!

 

Used every program ever made, have you? :P However did you find the time? :D

 

Jokes aside, was Write ever on anything else? Was it some weird sister to Word, or an early version of it? I never understood that product, and there's not much information about it out there.

Edited by DracIsBack

Hehe, I've used a pretty large amount of software (on Atari, on Linux, on Mac and on Windoze), and a lot of editing software in particular (1st Word Plus, Redacteur, Papyrus, MS Word from the DOS version onwards, Easywrite, Easy, WordStar, Neo/Open/Star/LibreOffice, and others). But NO Atari program *ever* clogged my floppy or hard disk with temporary files. Just MS Write.

 

Also, Microsoft Write has been on every Windows version up to now. I assume it started as a minor sister to Word for Windows, and later became an enhanced 'Notepad'.


Also, Microsoft Write has been on every Windows version up to now. I assume it started as a minor sister to Word for Windows, and later became an enhanced 'Notepad'.

Write was replaced by WordPad from Win9x onward (a basic rich text editor, with no real word-processing features beyond that; not even paragraph formatting or double spacing).

Looking at the wiki page, Write actually had some features (or at least one) that WordPad lacks, namely automatic pagination, versus WordPad just displaying a continuous column of text until you go to print it; you'd need to use print preview to see how it would look before printing.


I thought CP/M was Z80, not 8086, code. Anyway, prior to Win95, Windows was a load of cock used by clueless twats. Win95 was the first proper GUI without all that Program and File Manager bollox.

 

Had the ST had a simple DOS emulator built in, just as AmigaOS 4 integrates legacy Amiga software emulation and Apple's 68k emulation ran on PowerPC Macs, then this would have helped. The 68000 goes up to 20 MHz models, I think, for some arcade motherboards.

 

Actually, if Jack wanted MS rubbish, which I doubt he did, why not just use an 8086 + custom chips? Amiga-style performance boost and native DOS/Win/GEM compatibility. The Konix console was superior to the Amiga using just that kind of hardware, after all ;)


Sorry, I keep calling WordPad 'Write'...

However, both are only about as good as Notepad as far as writing text goes (now that Notepad has shed that 32k file-size limit).

Being a half-*ssed word processor doesn't make it any better than a text editor.

(And, by the way, even so they have a long way to go to reach features like auto-indent, syntax highlighting and other features that Atari text editors have had since, well, forever.) :-)


I thought CP/M was Z80, not 8086, code. Anyway, prior to Win95, Windows was a load of cock used by clueless twats. Win95 was the first proper GUI without all that Program and File Manager bollox.

It was initially 8080/Z80, but was ported to x86 soon after the introduction of that architecture. (8086/88 assembly was designed so that 8080/8085 source could be mechanically translated to it, so porting the code over would have been rather straightforward.)

Of course, a simple source port wouldn't have been as optimized as a true rewrite of CP/M for x86.
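As an aside, that "straightforward port" worked because Intel deliberately defined the 8086 so that 8080 source could be translated mechanically (they shipped a source translator for exactly this). Here's a toy Python sketch of the idea; the register/mnemonic tables follow the conventional 8080-to-8086 mapping, but the translator itself is a made-up illustration that only handles trivially formatted lines:

```python
# Toy sketch of mechanical 8080 -> 8086 source translation.
# The register and mnemonic correspondences are the conventional ones
# (A->AL, HL->BX, so "M" -- memory addressed via HL -- becomes [BX]);
# the parser is deliberately naive and only handles "OP arg, arg" lines.
REG_MAP = {"A": "AL", "B": "CH", "C": "CL", "D": "DH",
           "E": "DL", "H": "BH", "L": "BL", "M": "[BX]"}
OP_MAP = {"MOV": "MOV", "MVI": "MOV", "ADD": "ADD", "INR": "INC",
          "DCR": "DEC", "JMP": "JMP", "CALL": "CALL", "RET": "RET"}

def translate(line_8080: str) -> str:
    """Translate one line of 8080 assembly into 8086-style assembly."""
    parts = line_8080.replace(",", " ").split()
    op, args = parts[0], parts[1:]
    new_args = [REG_MAP.get(a, a) for a in args]  # labels pass through
    out = OP_MAP[op]
    if new_args:
        out += " " + ", ".join(new_args)
    return out

print(translate("MOV A, M"))  # 8080 "load A from (HL)" -> MOV AL, [BX]
print(translate("INR B"))     # -> INC CH
```

The point is that every 8080 register and instruction has a direct 8086 counterpart, so a port is a table lookup rather than a redesign; optimization for the new architecture is exactly what such a 1:1 translation leaves on the table.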

 

There was a much longer delay for CP/M-68K, and it was rather problematic when first released. (Plus, by the time the ST was coming to market, CP/M itself was aging somewhat and really needed updating to be useful going forward. It was still better than MS/PC DOS at that point, I believe, which of course was originally just a licensed copy of QDOS, which MS later bought exclusive rights to, and which in turn was inspired by/derived from CP/M.)

Heh, if IBM had done their business research better, they could have bought the rights to QDOS directly and not dealt with Microsoft (plus they could have had exclusive rights to the OS rather than sharing with MS and eventually falling behind ;)). Of course, if DRI hadn't been reluctant (or their legal counsel specifically), they'd have had CP/M on PCs and would have had the weight of IBM to throw at Apple over GEM. (The same weight that made it moot to attempt suing over DOS being a CP/M rip-off.)

 

Had the ST had a simple DOS emulator built in, just as AmigaOS 4 integrates legacy Amiga software emulation and Apple's 68k emulation ran on PowerPC Macs, then this would have helped. The 68000 goes up to 20 MHz models, I think, for some arcade motherboards.

The ST did get a slow DOS emulator in the late 80s, but I'm not sure it ever got a hardware emulator. (The Mac emulator was the real point of interest, given its ability to outperform contemporary 68k Macs by 20-30%; I assume due to a lack of CPU wait states and a slightly higher clock speed.)

 

As to the 68k's speed, the fastest models I know of were used in computer accelerator boards. 20 MHz was/is the fastest "official" rating manufactured, but a number of 3rd parties regraded chips into the 25-30 MHz range (not sure if any pushed beyond that).

I'm rather surprised that various 68k manufacturers didn't push it to considerably faster speeds, especially after Motorola tightened up licensing and fixed their prices (in collaboration with Toshiba) with the 68020 and later chips. (Many other second sources of 68ks attempted legal action over that issue, but why not just compete with faster 68ks, let alone take the initiative to engineer their own extensions to the architecture like NEC, Cyrix, AMD, etc. did with x86? ;)) Hell, x86 had much more competitive licensing/development of newer designs, yet you had 8086, 286, 386, and 486-compatible derivatives from 3rd parties that pushed well beyond the clock speeds offered by Intel (16 MHz 808x chips, 25 MHz 8018x, 25 MHz 286s, 40 MHz 386s, plus Cyrix's 486SLC/DLC range, 120 MHz 486s, etc.; OTOH Motorola themselves offered 20 MHz 68000s, IIRC).

 

Actually, I've been thinking about starting a discussion on this very topic (68k development), but I wasn't sure if it should go in the ST or Classic Computer General forums. (probably the latter, but maybe with a linked tag thread in the ST forum to get more people interested . . . unless the MODs don't want me cluttering the boards like that . . . they don't seem to mind with other people though ;))

 

Actually, if Jack wanted MS rubbish, which I doubt he did, why not just use an 8086 + custom chips? Amiga-style performance boost and native DOS/Win/GEM compatibility. The Konix console was superior to the Amiga using just that kind of hardware, after all ;)

IBM sort of did that with the PCjr, and Tandy did it right with the Tandy 1000, though without nearly as advanced custom hardware as the Amiga, and even weaker than the ST (but better than any other mid-range/low-end PC clone of the time). The problem was a lack of standards: the PCjr/Tandy video never became an open-standard PC upgrade (never rolled into EGA and VGA either), though that probably wouldn't have been the case if IBM had done the PCjr like Tandy did. ;) Tandy's DeskMate was no GEM either, though it seems to have been a decent rudimentary DOS shell program with some specific application support from Tandy.

 

Going x86 with a "better than, but compatible with, the PC" sort of system would have been interesting to see. It would have undoubtedly been more popular in the US, but it's hard to say if it would have lasted. (Europe would be a bit of a crapshoot, though if it did catch on, it would have meant much weaker Amiga support early on; the ST's architecture really helped the Amiga get support.)

 

Going x86 wouldn't mean going with that MS rubbish (as you so deftly put it) either; it would have meant having hardware supporting IBM/MS OS and hardware standards, but more than that too. (They could have gone with GEM for PC well before Windows was any good, or before OS/2... but PC GEM ended up getting most of the legal wrath of Apple while TOS somehow remained unscathed. It wasn't totally crippling, but it was a road block for GEM on the PC for sure.)

 

Also, if they wanted to compete with PC clones using custom, embedded hardware, they'd have had to be really aggressive with high-volume production of custom chips at razor-edge pricing on those components. (Atari started off with custom, embedded PC hardware with their PC-1 in 1987, but that ended up being too costly against generic gray-market PC clones.)


oky2000,

 

It's also a bit ironic that Atari started selling 16 MHz 286, 386SX, and 20 MHz 386DX PC-4/5/ABC machines in the late 80s (along with the bottom-end PC-3 with an 8 MHz 8088) when they weren't offering anything but 8 MHz 68k STs, even with the MEGA line. ;) (No 10, 12/12.5, or 16/16.67 MHz models aside from 3rd-party accelerators, and no '020 or '030 models at all, low-end or high-end... until the TT, which didn't cater to Atari's general market range at all due to the high price, and was neither here nor there as a workstation: too expensive to be a mid-range machine, but not capable enough to be a really competitive high-end workstation. It had advantages over the Amiga 3000 on the hardware end, but I think the pricing was worse, and the software support definitely was.)

 

It's also a bit funny that Atari's sleek 8 MHz 8088 PC-1, with its custom motherboard and embedded CGA+EGA+Hercules-compatible graphics ASIC, ended up being generally less capable/flexible than its off-the-shelf desktop-box successor, the PC-3. (The PC-1 was more visually attractive, but the PC-3 was a proper big-box PC with internal ISA slots, rather than the PC-1's single expansion slot requiring an additional expansion module to go beyond that.)

So that's one more strike against the ST line too: the MEGA line just barely added PC-1-like expansion (maybe slightly better... but obviously less supported than ISA), while Atari's cheaper off-the-shelf successor to the PC-1, the PC-3 (let alone the PC-4 or 5, or the later ABC), had all the flexibility for expansion and market-standard support that came with other PCs, while the ST never did get any models with that sort of flexible inbuilt expansion. (The Amiga got a couple of high-end models close to that with the 2000 and such.)

 

Atari also never extended that expansion-slot architecture to the lower-end console-form-factor STs (i.e. a simple edge expansion port for plug-in modules, or a full expansion module with additional slots, like Atari Inc. was doing with the PBI and Tandy did with the CoCo's cart slot; hell, even the Spectrum, VIC-20, and C64 had rather flexible expansion via the cart slot, or the expansion port on the Spectrum, and the Amiga 500 had the rather flexible trapdoor expansion port too, in addition to CPU-socket piggyback add-ons).

That's something they should have done from day 1 on the ST, but definitely should have corrected after the fact. (Had they done it from day 1, keeping flexible forward/backwards compatibility would have been much more straightforward, as many upgrades could use the expansion port: blitter, sound expansion, perhaps even a CPU upgrade.)

 

 

 

 

The main advantage of WordPad over Notepad is that it's an actual rich-text editing/viewing program with proper support for various fonts (and font sizes) and some limited page formatting (or at least a reasonable way to manually set margins, spacing, paragraph formatting, etc.).

I got by writing a few papers in middle school with just WordPad... manually double-spacing got annoying though. OpenOffice all the way now. ;) (Aside from school computers with Word instead, but I save as .doc files anyway, so it's cross-compatible.)

 

Actually, it's probably more useful as a rich-text viewing program than an editor. (It can read text files that were formatted by proper word processors so long as they were saved as rich text; you can lose some stuff from .doc files and such.)
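Part of why rich text travels so well is that WordPad's native RTF format is just plain ASCII control words in braces. A bare-bones sketch: the control words used (`\rtf1`, `\ansi`, `\b`) are real RTF, but `make_rtf` is a made-up minimal helper for illustration, not a complete RTF writer.

```python
# Minimal RTF generation: an RTF document is plain ASCII, with
# formatting expressed as control words inside braces.
def make_rtf(body: str, bold: bool = False) -> str:
    # Escape the three characters RTF treats specially: \ { }
    text = (body.replace("\\", "\\\\")
                .replace("{", "\\{")
                .replace("}", "\\}"))
    if bold:
        text = "{\\b " + text + "}"   # bold group
    return "{\\rtf1\\ansi " + text + "}"

doc = make_rtf("Hello from the ST era", bold=True)
print(doc)  # save this string as e.g. hello.rtf and WordPad opens it
```

Because the whole format is readable text, a viewer that doesn't understand a control word can simply skip it, which is exactly why rich text degrades more gracefully than binary .doc files.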


Commodore made PCs too. PCs are mostly a reference design with off-the-shelf parts. The Atari PC-1 was attractive, though, and quite a bit smaller than the tank-like 8086/286 machines of the time. I'd love a PC-1, but I've never seen one for sale. No cross-marketing is possible.

 

The A500 and A1000 had an edge connector too. It is possible to have a 286 PC bridgeboard + 68030 + 16 MB + HD + CD on it all at once, so it's pretty powerful.


I heard that the Atari engineers did most of the conversion for DR from their less than perfect source files, maybe the source was 8088 and not Z80. It was on that Dadhacker blog thing.

 

GEM was one of three windowed OS solutions; Windows was another, and IBM had one of sorts too, well before OS/2. It's in either the Amiga 1000 review or the Atari ST/C=128 preview issue of PCW Magazine.

 

Jack Tramiel was the only person in history to tell Gates to shove his royalties and keep the rights to what he sold; compared to his typical deals, Gates got royally screwed and lost millions on VIC-20/PET/C64 sales.

 

Yes, the MC68000P20 was an official product, and if it had ever been used it would probably have had to be stretched to 24 MHz to keep the system timing on the ST simple. In 1985, though, even a slow DOS + CGA emulator would have done them a lot of good; a lot more good than the ST running Windows 1.0, that's for sure. The kings at the time were WordPerfect Corporation, Lotus (1-2-3 and Symphony) and dBase. Beyond DOS, MS didn't count for much.

 

Nope, what I meant was: if the ST had used an 8086 (or preferably an NEC V30, as it is about 20% faster MHz-for-MHz than the 8086) plus custom chips, A8/C64-style, then it would still have been fine as a proprietary games system, since the 8086 is just a controller for the DMA on the custom chips; BUT, having an x86 CPU, Atari could have run DOS natively and written a custom VESA-style BIOS for EGA/CGA to run PC stuff at full speed.

 

The Konix console had an 8086 + custom chipset (the Flair 1 chipset, I think... Flair 2 = the Jaguar's Tom and Jerry) and was better than the Amiga in quite a few ways, really, so it's not a restriction. Hell, the NEC TurboGrafx had a 6502 + custom chips and still had better games than the ST or Amiga in many cases (Out Run, Power Drift, SF2, etc.).

 

Any machine that allowed you to run that PC rubbish from the office when doing work at home in the evening AND gave you a technically superior machine for creative/leisure software would have worked really well. Commodore missed this trick too; the OS was quite capable of even multitasking a software DOS emulator, so it would have been most convenient. AmigaOS 2.0 integrates reading PC-formatted disks as a module in the OS for that duality; both should have been in the Amiga 1000 and 520ST from day one, IMO.


I heard that the Atari engineers did most of the conversion for DR from their less than perfect source files, maybe the source was 8088 and not Z80. It was on that Dadhacker blog thing.

They started with CP/M-68K, but then abandoned it in favor of DRDOS+GEM and worked on porting that to the 68k. (Not sure who ended up doing the programming work for Atari on that conversion; I've seen some mention of it being in-house, but I think it may have been largely outsourced. It may have been MetaComCo... I know they did a lot of software work for both the ST and Amiga -BASIC, various work on the OSs, etc.- but I'm not sure they did the initial conversion work for TOS+GEM.)

 

GEM was one of three windowed OS solutions; Windows was another, and IBM had one of sorts too, well before OS/2. It's in either the Amiga 1000 review or the Atari ST/C=128 preview issue of PCW Magazine.

I think there were more than that, but some others never made it to market. (And there were some more primitive window-like DOS shells appearing in the mid 80s; again, Tandy had their primitive DeskMate back in '84.)

 

Jack Tramiel was the only person in history to tell Gates to shove his royalties and keep the rights to what he sold; compared to his typical deals, Gates got royally screwed and lost millions on VIC-20/PET/C64 sales.

You mean for not using MS BASIC? That ended up happening on a lot of platforms... Atari actually wanted to use MS BASIC, but ended up having to do their own in-house to fit in 8k (apparently they didn't want to go to a 16k cart). Tandy initially used a different source for BASIC, but went to MS for their Level II BASIC on the TRS-80. (The CoCo used MS BASIC too; apparently they'd managed to get it down to 8k by 1980.)

 

Yes, the MC68000P20 was an official product, and if it had ever been used it would probably have had to be stretched to 24 MHz to keep the system timing on the ST simple. In 1985, though, even a slow DOS + CGA emulator would have done them a lot of good; a lot more good than the ST running Windows 1.0, that's for sure. The kings at the time were WordPerfect Corporation, Lotus (1-2-3 and Symphony) and dBase. Beyond DOS, MS didn't count for much.

Keeping the timing simple wouldn't be that important... the main issue would just be the cost of clock-generation circuitry (oscillators and divide/multiply circuits for given frequencies; preferably you'd want one shared oscillator with all the clocks derived from it... but in the ST's case you already need at least two if you're going to have composite/RF output).

Early-model STs would have depended on the interleaved memory configuration (like the Amiga), and retaining an 8 MHz mode for full backwards compatibility could have been wise; but otherwise you'd throw all that out the window for faster speeds and use serial bus sharing with wait states and fast-page-mode support. The Amiga already allowed the custom chips to steal the bus for higher bandwidth, but without page-mode support, so it was limited to 3.58 MHz accesses, or 7.16 MB/s peak bandwidth.

Alternatively, you could keep the slow/inefficient interleaved random-access memory configuration and add wait states on top of that, but you still wouldn't need the CPU clock to be a multiple of 8 MHz; specific speeds would minimize the number of actual wait states (namely anything that's a multiple of 2 MHz), but it wouldn't have to be a multiple of 8. Really, though, they should have shifted towards fast-page-mode support with efficient serial bus sharing, especially to allow higher video bandwidth for scanning the framebuffer, among other things. A fastRAM bus would also have been another option, with different trade-offs; you'd still want to start adding fast-page support on both buses though.

The Amiga also missed out on that: it's a shame Miner and the Amiga team didn't put an emphasis on efficient FPM DRAM support (perhaps with wider buses) rather than that impractical use of VRAM in the Ranger chipset. The AGA chipset was rather weak in that sense too, but it was at least a lot more practical than the Ranger set had been, or still would have been in '92. Sort of ironic that the Lynx chipset, largely designed by Amiga engineers, was heavily optimized for efficient fast-page DRAM accesses.
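The 7.16 MB/s peak figure above is just bus width times access rate; a quick sanity check of the arithmetic (the 4 MHz ST figure is my own reading of the 50/50 interleave giving each side half of an 8 MHz bus, not a quoted spec):

```python
# Peak bandwidth = (million accesses per second) * (bytes per access).
# Both machines have 16-bit (2-byte) memory buses.
def bandwidth_mb_s(access_mhz: float, bus_bytes: int = 2) -> float:
    return access_mhz * bus_bytes

print(bandwidth_mb_s(3.58))  # Amiga chip bus at the 3.58 MHz colorburst rate
print(bandwidth_mb_s(4.0))   # each side of the ST's 50/50 split of an 8 MHz bus
```

This is also why page-mode support matters so much: it raises the effective access rate (the first factor) without touching the bus width.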

 

One area where careful timing could be really important is if you wanted to actually speed up the interleaved accesses. That's still going to be pretty limited, as random access speeds (full RAS+CAS times) don't improve dramatically with faster DRAM grades (CAS time speeds up much faster than RAS, hence why page mode accesses became so important), but it probably would have been practical to bump it to work with a 10 MHz 68k with that basic 50/50 split (or perhaps 8.95 MHz in the Amiga's case, keeping with NTSC+PAL compatible clock rates -like using a 35.8/35.47 MHz master clock generator). It would depend on just how fast the RAM needed to be. (not sure how fast 120 ns CAS DRAM random accesses are, so it might have needed 100 ns chips, and those weren't really in the lower-end/cheap range in the late 80s iirc -they dropped to that at the beginning of the 90s, I think)

It really may have made more sense for overall cost/complexity/compatibility to never do something like that and just use faster CPUs with wait states. (be it sticking with the 50/50 split, or disabling interleaving and adding a more complex wait state system -or doing both, starting with the simple option and then going beyond interleaving, with interleaved modes for compatibility; with interleaving forced, it would greatly simplify wait state generation for the CPU rather than giving it free rein on the bus and forcing waits when the SHIFTER/FDD/etc needed to grab the bus, sort of like the Amiga does when the chips dig into 68k bandwidth -not sure if that uses waits or just halts the CPU -the latter is simple but can waste a lot more CPU resource -and then there are more options if a fastRAM bus was added)

 

Nope, what I meant was: if the ST had used an 8086 (or preferably an NEC V30, as it's about 20% faster clock-for-clock than the 8086) and custom chips like the A8/C64, then it would have been OK as a proprietary games system still, as the 8086 is just a controller for the DMA on the custom chips, BUT having an x86 CPU, Atari could have run DOS natively and written a custom VESA-style BIOS for EGA/CGA to run PC stuff at full speed.

That's pretty much what I said (well, I addressed a number of scenarios, from a more ST-like option to the general notion of using more powerful custom chips . . . or just using consolidated custom chips to include off-the-shelf features -like the custom ASIC with Hercules+CGA+MDA+EGA support that Atari had in the PC-1). There would be major trade-offs in price competitiveness with custom hardware. (albeit if they pushed hard in 1985, that could have been really significant . . . actually having EGA compatibility with extended features -like full use of the palette in lower res modes, hardware scrolling, or maybe a proper blitter, not to mention sound hardware- could have been really interesting; the clone market was still pretty young in '85, so there was a lot more to dig into than when Atari finally jumped in in '87 -albeit Atari Inc had plans for a PC compatible years earlier -I think that was among the projects shelved during Morgan's reform of the company in '83/84)

 

Also, if you weren't going for true native low-level PC compatibility (ie not direct register/memory map/BIOS level compatibility with the PC/PCXT), an 80186/88 could have been a very attractive option due to the embedded peripheral support and the fact that it also has a similar performance boost to the V20/30. (not sure if the V20/30 actually cloned the 186's enhancements or did different ones -the 186 may actually have had better overall performance, the V20/30 was a good option in place of a normal 808x -and was highly price competitive iirc) The advantages compared to a V20/30 would depend on the cost of the alternative peripheral logic used and the actual pricing of the 8018x vs V20/30.

Another advantage of using the V20/30 or 186 over plain 8086/88s would be the max clock speeds offered. (NEC's chips went up to 16 MHz and the 186 eventually had versions up to 25 MHz -not sure if any 3rd party 8086 offerings went beyond 10 MHz)

The V20/30 also have native Z80/8080 compatibility, but that's not really of use here. (it does make it an interesting option for an upgrade to other Z80 based systems -especially given how obscure/unpopular the Z800/280 was and how common/standardized x86 became)

 

The Konix Console had 8086+custom chipset (Flair1 chipset I think...Flair2=Jaguar Tom and Jerry) and was better than the Amiga in quite a few ways really so it's not a restriction.

Actually, the Flare 1 started with a 6 MHz Z80 (a natural option given it grew out of the Loki Spectrum project), and the use of the Z80 also made the 8088/86 a natural extension of that (similar bus architecture and such). It was actually Konix who pushed to go beyond a Z80 iirc; the prototype demonstration and development systems used 8088s, with the production models planned to use 8086s.

 

And yes, the CPU is a generally separate issue from overall system performance, but it really depends on the task and overall context. (on a game console, you've got a massive gap from the VCS to the NES, but the CPU itself is basically the same but only 50% faster)

Granted, with good use the Flare 1 chipset could be used to accelerate a number of "serious" computer applications too, and great for graphics/multimedia stuff.

However, it also would have been no good as the basis for a DOS/PC compatible. (lack of compatible graphics modes, Amiga-style floppy format, etc)

 

Any DOS/PC compatible system would need to include the baseline standard graphics/IO/expansion features, and as such, it would also be most efficient to build any special features around those compatible areas. (like extended EGA support I mentioned above, added hardware acceleration, sound etc . . . actually if you just took the Tandy route of extending CGA with the full 16 color modes, but added hardware scrolling as well, that would have been pretty significant -it's 4-bit packed pixels too, so software blits would be more flexible/practical than 1-bit planar stuff) To actually have something better than the ST, you'd probably need to go beyond that though . . . or maybe take that extended CGA support and add 9-bit RGB indexing modes. (so Atari ST quality color at 160x200 and 320x200, but packed pixels, so much easier to software render with . . . much more so with added hardware graphics acceleration -from simple raster op support to a proper blitter- but even just having scrolling on top of packed pixels would be a big boost over the ST let alone PCs of the time -unless I'm mistaken and EGA actually included hardware scrolling)
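The packed-vs-planar point above is worth making concrete. This is an illustrative sketch only (plain linear buffers, not the ST's actual interleaved-word plane layout or EGA's plane-select registers): writing one 16-color pixel takes a single read-modify-write in a 4-bit packed framebuffer, but four of them across separate bitplanes.

```python
# Illustrative only: why 4bpp packed pixels are friendlier to software
# rendering than 4 separate bitplanes. Buffer layouts here are
# simplified linear ones, not real ST/EGA memory organization.

WIDTH, HEIGHT = 320, 200

def set_pixel_packed(fb, x, y, color):
    """4bpp packed: two pixels per byte, one read-modify-write."""
    i = (y * WIDTH + x) // 2
    if x & 1:
        fb[i] = (fb[i] & 0xF0) | color          # low nibble = odd pixel
    else:
        fb[i] = (fb[i] & 0x0F) | (color << 4)   # high nibble = even pixel

def set_pixel_planar(planes, x, y, color):
    """4 bitplanes: one read-modify-write per plane (4 total)."""
    i = (y * WIDTH + x) // 8
    mask = 0x80 >> (x & 7)
    for p in range(4):                          # one pass per color bit
        if color & (1 << p):
            planes[p][i] |= mask
        else:
            planes[p][i] &= ~mask & 0xFF

packed = bytearray(WIDTH * HEIGHT // 2)
planes = [bytearray(WIDTH * HEIGHT // 8) for _ in range(4)]
set_pixel_packed(packed, 3, 0, 0x9)   # one memory access
set_pixel_planar(planes, 3, 0, 0x9)   # four memory accesses
```

The same 4x cost multiplier hits software blits and fills, which is why a packed extended-CGA mode plus hardware scrolling would have been such a practical combination.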

 

Hell the NEC Turbografx had a 6502+custom chips and still had better games than the ST or Amiga in many cases (Outrun, Powerdrift, SF2, etc).

Umm, that's a really terrible example. The PCE's CPU was one of its strongest points: a very powerful, fast, and efficient enhanced R65C02 derivative with zero wait state ROM/RAM (both at 140 ns). It was that CPU that gave it a huge advantage over the SNES in any CPU intensive games, and there were many cases where it was better than the MD as well. (a lot of cases where the 68k's powerful but slow instructions are at a disadvantage, especially anywhere things can be done efficiently in mostly 8-bit operations . . . plus the PCE's CPU has almost double the memory bandwidth of the 68k in the MD)

Then there's the extremely fast/efficient interrupts of the 650x that allow things like interrupt driven sound. (you could do a 7 kHz interrupt driven PCM playback system with just ~5% of CPU resource; no way you'd manage that on the MD -you've got the Z80 on its own bus to help there, except it must use software-timed code loops, as it has no timed interrupt source other than 50/60 Hz vblank. Cycle-timed code is more efficient but far less foolproof for sloppy programmers, and a huge chunk of MD games show this with distorted PCM playback -a good coder could take advantage of it to push things well beyond what the PCE could practically do without eating up a lot of CPU time, but few commercial games came close to doing that -be it due to incompetence or just lack of emphasis on PCM sound quality . . . some engines made life easier by totally dedicating the Z80 to sample playback and using the 68k to manage the sound engine -some of the best sounding early engines work that way, like Vapor Trail and Atomic Runner . . . but Capcom also used a sound engine with that set-up and still had crap PCM -and they chose to decrease sound quality further with Super SFII on both the MD and SNES: even lower sample rates, super muffled, and that's compared to the 4 kHz PCM with distorted playback in SCE)
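The "7 kHz for ~5% of CPU" claim above checks out on paper. The handler cycle count below is my own rough guess for a tight 650x ISR (interrupt entry, fetch sample, write to the sound register, exit), not a measured figure:

```python
# Back-of-envelope check of the "7 kHz PCM with ~5% CPU" claim.
# Assumptions (mine): a PC Engine-class 65C02 derivative at ~7.16 MHz
# and roughly 50 cycles per interrupt for a minimal sample-output ISR.

CPU_HZ = 7_160_000
SAMPLE_RATE = 7_000
HANDLER_CYCLES = 50            # rough guess for a tight 650x handler

cycles_per_sample = CPU_HZ / SAMPLE_RATE        # ~1023 cycles between samples
cpu_load = HANDLER_CYCLES / cycles_per_sample   # fraction spent in the ISR
print(f"{cpu_load:.1%} of CPU time")            # ~4.9%
```

Even doubling the guessed handler cost only pushes it to ~10%, which is why cheap timed interrupts made interrupt-driven PCM so practical on the 650x family.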

 

 

Any machine that allowed you to run that PC rubbish from the office when doing work at home in the evening AND allowed you to have a technically superior machine for creative/leisure software would have worked really well.

No, it wouldn't if it didn't get the specific software support needed and wasn't cost effective. Building a PC compatible machine alone wasn't cheap, let alone adding custom chips on top of that. (again, you couldn't just use an x86 based machine with a bunch of custom/optimized video/coprocessor hardware, you needed to cater to existing PC standards and either build on those or tack-on added hardware -both would be expensive, but building on would be the better option)

That, or you could just include the minimum of compatibility (like CGA video) and build beyond that. (like the PCJr/Tandy machines, but built up a bit more)

 

It was the ST's relative capabilities and cost that made it attractive in Europe . . . it would have been tough to make a PC compatible machine that was still that attractive overall. (perhaps just extended CGA with hardware scrolling and a larger palette plus an off the shelf sound chip or basic DMA sound)

 

 

Commodore missed this trick too, the OS was quite capable of even multitasking a software DOS emulator so would have been most convenient. Amiga OS 2.0 integrates reading the PC formatted disks as a module in the OS for duality....both should have been in the Amiga 1000 and 520ST from day one IMO.

You know, the best possible idea for the time might have been to push for an all-new (clean/unencumbered) system (like the ST or Amiga) that also aimed at promoting cross compatible software/file formatting. That would require sufficient cross-platform software support, but the ST was already somewhat close to that with PC compatible floppy formatting, so cross-compatible PC/DOS file reading/writing and formatting could have gone a long way even without any actual DOS emulation.

All you need is a native version of the same (or a compatible formatting) application on the ST and PC to allow straightforward cross-compatibility. (that would heavily depend on 3rd party support, of course)

 

Part of that would be promoting 5.25" floppy drives more. (not as the standard, but as an accessory for business-oriented systems) Double sided 3.5" drives would also have been important for full cross compatibility with PCs.

 

Macs eventually got that functionality (and advertised it heavily), but can you imagine the impact for the ST (or Amiga) if such a feature had been promoted/emphasized from day 1? (especially in the US market)

 

One more thing Atari could have done shouldn't even have required any real change in release date or R&D investment (on the hardware end at least): offering professional quality desktop form factor machines (having only the console models for the first 3 years was crippling from a marketing standpoint, especially for the US business sector), and addressing the lack of expandability until the MEGA. (and even then it wasn't the full array of internal expansion slots like ISA offered -or as the Apple II brought back in '77- the low-end console models should have had a basic/low-cost expansion slot/connector, with desktop models including full internal expansion -and the console models offering similar with the purchase of an expansion bay)

It's somewhat ironic that Atari Inc finally got the message to really push expandability and brought out PBI and the planned 1090XL, but then Atari Corp fell right back down to a closed box design. (except even more closed box than the 400/800, VIC, or C64 -actually the VIC and C64 offered generally more flexible expansion through the cart ports, as did the CoCo -which even had an expansion module somewhat akin to the 1090XL)

Hardware scroll registers would have been the simplest important hardware addition. (probably worth even a moderate delay in release or slight cost increase for the time -granted, a flexible expansion port would also have made that less necessary as a blitter upgrade could be easily added by the user with such a port -something that could have dramatically boosted the install base and such software support for the blitter for games, art, and business applications)


I think you misunderstand me. The x86 CPU alone on the motherboard is the only part related to PC. No ISA bus or anything else PC XT. But to write a firmware/hardware based PC emulator running on that x86 is simple enough and allows PC software compatibility which is all people needed to test the water on a new machine.


I think you misunderstand me. The x86 CPU alone on the motherboard is the only part related to PC. No ISA bus or anything else PC XT. But to write a firmware/hardware based PC emulator running on that x86 is simple enough and allows PC software compatibility which is all people needed to test the water on a new machine.

There's not much PC/DOS software that would be usable without MDA/CGA compatibility, and a good high-level DOS emulator with virtualization would be a long way away. (maybe you could have minimalistic CGA/MDA emulation . . . or rather 80x25 text mode emulation -as many programs could use MDA or CGA text either way -but with low res text in CGA obviously- along with software emulation of CGA graphics and color text; but even that could have taken a significant R&D effort to achieve -plus performance hits from overhead of the emulation -again, without hardware emulation support of CGA or MDA, let alone EGA)

 

Hell, early PC compatibles struggled if they weren't totally low-level IBM compatibles (a number of programs that bypassed DOS and used the hardware directly . . . some bypassing the BIOS even), but that eventually dissipated. (that issue was virtually gone by the time the ST was launched)

 

You're also assuming PC standards wouldn't continue to take over and make proprietary features obsolete . . . unless those features gained enough support to become true defacto mass-market standards. (and that generally requires an open market with upgrade paths to directly compete with PCs)

 

And here's the thing: in the US, any DOS compatibility in 1985 with zero actual PC hardware compatibility would have been pretty weak (so much so that it probably wouldn't be worth the loss of the 68k) and it would have done almost nothing for the European market given the relatively weak PC market there. (it may have done more harm than good in that sense . . . unless it could still remain extremely cost effective and flexible for business/graphics/games for the time -in which case it may have destroyed the Amiga's foundation of cross platform ST software and actually been stronger in general)

 

 

There really were much better alternatives for the time. The best long-term competitive solution would be to aim at software cross-compatibility on the file/formatting end (mainly dependent on getting the necessary software application support for cross compatible formats -either actual versions of programs on the PC, or programs with compatible file format standards -like rich text, Microsoft Word documents, Lotus or Excel spreadsheets, etc). Atari didn't even try to do that, so it's totally up to speculation how well that would have worked out. (again, supporting 5.25" DSDD drives at affordable prices would be part of that, as would DS 3.5" drives)

 

In the short run, a simple x86 hardware emulator board could have been used for direct DOS compatibility (with the ST/68k end handling additional emulation of text/graphics and such). That would definitely be something facilitated by a flexible/comprehensive expansion interface in place of the cart slot. (and if all went well on gaining cross-platform file support, the need for such emulation would largely disappear -as would the need for 5.25" accessory drives once both STs and PCs commonly used double sided 3.5" drives)

 

 

Hell, with that sort of support, we might still have derivatives of the ST as a minor (or perhaps no so minor) alternative platform to PC to this day. ;)

The ST's relatively simple hardware makes it relatively attractive to clone . . . so if it had gotten really popular in a sustained manner -especially in the US- it may have ended up spreading as a defacto 3rd party standard even without Atari opening up licensing. The smart thing for them to do would have been to open up licensing, though, since then there would be less incentive for cloning and more profits for Atari from those licenses. (plus promoting a general standard to increase competition)

Not pushing competitive/open licensing may also have been a major reason the 68k architecture declined compared to x86. (though it's also somewhat up to luck, as Intel also attempted to clamp down on 3rd party competition, but did so late enough that the market had already become established beyond licensed manufacturing or even clones -ie actual independent extensions to the x86 architecture rather than using Intel chips directly . . . albeit there actually WAS that potential for 68k too, even with Motorola cutting things off at the '020; it just turned out that competing 3rd parties didn't push their own 68k developments to compete in the '020/030/040/etc range -not even high speed 68000s, oddly enough; no contemporaries to what AMD, Cyrix, NEC, etc were doing with x86 -albeit NEC abandoned x86 before going 32-bit, while Cyrix didn't actually start until 32-bit, with their 486SLC/DLC design)

 

 

Regardless of plans to push for cross compatibility, the ST lacking any sort of flexible expansion support or professional quality/looking desktop box models were far bigger issues than DOS compatibility overall. (with both of those 2 things, it could have been far more competitive in the US market . . . the Atari brand name still wasn't especially strong in the mainstream business/computing sector -mainly due to the . . . less than ideal management of the A8 by Atari Inc- but having the right look/form factor and apparent quality standards -that would include using high quality keyboards- would have gone a long way in the US computer market, let alone the value of expandability and the flexibility that would give for forward compatibility of hardware upgrades later -within limits, just like PC expandability, eventually you'll need to get a new machine to meet current standards -though even then, some old upgrades might be re-usable in the new system)

Having a hardware DOS emulator board on the market and an emphasis on cross-compatible PC format support on top of those things would have been all the better. (and again, expandability would facilitate an x86 emulation board)

 

 

 

The US market is also pretty heavily advertising driven, and that's one area where Atari was fairly strapped early on. (not enough funds to push massive ad campaigns -a problem for all of their products, games or computers . . . Europe was FAR better in every respect: cheaper to run effective ad campaigns in general due to the high population densities, with that density along with a strong magazine culture making viral marketing extremely effective on top of that -that word of mouth advertising and magazine support also acted as a powerful counter-hype mechanism, partially mitigating false/sensationalistic advertisements -the market was also far more price sensitive than the US at the time)

 

 

 

 

 

On another note, one other thing that might have helped the ST early on (probably more so in Europe) would have been an even lower end model than the base 520 (or 260). I'm thinking something along the lines of the Amiga 600 compared to the 500 (except more cut down in some areas, and released early on, when it was actually important). Something like a 520 with the keypad removed (but overall keyboard functionality otherwise intact), many of the less necessary peripheral ports removed (MIDI, ACSI, maybe even RS232 -probably retaining the parallel port at very least, and obviously the floppy drive port), and much of the associated peripheral interface logic for those ports removed as well. (given the 68901 actually handled RS232 directly, it might make more sense to leave the serial port too . . . unless they swapped duties with one of the ACIAs and allowed the keyboard to directly interface with the 68901, with RS232 and MIDI handled by ACIAs) Any of that missing functionality could be added via an expansion module (assuming Atari opted to include a flexible expansion interface), making such a low-end model more attractive in general.

Such a low-end system could have allowed the ST to push into the lower-end home/game market significantly sooner than it did with the 520STM/STFM (having RF and composite video from day 1 would also be important), and having such a model on the market during the '88 DRAM crisis might have been critical. (keeping a cheaper model than the Amiga . . . albeit there are a lot of other things that could have made the ST more competitive through that period -the expansion port alone would have been a huge factor, allowing existing ST users to easily add more RAM, blitter, sound, etc, while also making such upgrades better supported in general and making the system a much better overall value)

 

 

And now we're back to expansion. ;) The value added by flexible expandability really can't be overstated, but it was certainly (and continuously) underestimated by companies from the 70s through the 80s. (Apple management was totally oblivious to the value and potential of what the Apple II offered as a long-term platform . . . let alone the idea of using that expansion to make backwards compatible successors even more attractive; IBM took Wozniak's innovation and ran with it, clones followed, and that's the PC today ;) -in a grossly oversimplified summary)

Atari engineers saw the merit in the Apple II's expansion and wanted to add such features to the 8-bit computers from the start, but upper management (namely Kassar from what I understand) wanted to push the machine as a closed-box appliance computer with smart peripherals that would be easy and virtually idiot proof to install (and plans for color coordinated machines . . . and those things sound a lot like what happened with the iMac in the late 90s -of course, a niche product and nearly 2 decades later).

The initial A8 had included some moderate expansion support (aside from SIO), but mainly just for RAM (pretty much anything else was a hack, including some memory expansions). The 1200 XL removed even that with the built-in 64k eliminating any apparent need for further expansion . . . though interestingly, the canceled 600 of 1982 had featured the PBI in the prototypes. (presumably that was just for RAM, but obviously would have come in handy for general purpose expansion as well)

Atari finally got it right with the 800XL and 1090XL expansion system. (albeit that ended up falling apart too -delayed from reorganization in late '83 and then canceled with liquidation in mid '84)

 

The ST didn't even support RAM expansion . . . at least without hacking and voiding the warranty. (ie the piggyback RAM modification, not offered by Atari service centers AFAIK)

Edited by kool kitty89

But to write a firmware/hardware based PC emulator running on that x86 is simple enough

It's only simple to emulate PC hardware in software if you have hardware support for PC register emulation. The 80386 was the first x86 chip to include this (Virtual 86 mode).

 

Some people added their own "virtual 86" hardware to 8088 and 286 based machines. I had an old 8086-based Wang PC with a hardware "emulator card" that worked this way. When IBM PC registers were addressed, it would detect the access and trip an NMI that allowed software emulation of the missing hardware. (The Gravis Ultrasound also emulated Sound Blaster that way.)
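The trap-and-emulate scheme kskunk describes can be sketched in miniature. This toy model (port numbers and device behavior are purely illustrative, not the Wang card's actual design) shows the core idea: accesses to "missing" PC I/O ports trip a trap -the NMI on that card- and a software handler updates emulated device state, while untrapped ports pass through to real hardware:

```python
# Toy model of trap-and-emulate for missing PC hardware, as described
# above. Port numbers and behavior are illustrative only.

EMULATED_PORTS = {0x3D8, 0x3D9}   # e.g. CGA mode/color-select registers

class EmulatedCGA:
    """Stands in for the NMI handler's software device model."""
    def __init__(self):
        self.regs = {}

    def trap_write(self, port, value):
        # What the trap handler would do: update emulated device state
        # instead of touching the real (absent) hardware.
        self.regs[port] = value

def out_byte(device, port, value, real_io):
    """Model of an OUT instruction: trapped ports go to the emulator."""
    if port in EMULATED_PORTS:
        device.trap_write(port, value)   # the 'NMI' path
    else:
        real_io[port] = value            # untrapped: real hardware

cga = EmulatedCGA()
real_io = {}
out_byte(cga, 0x3D8, 0x09, real_io)  # trapped: emulated CGA register
out_byte(cga, 0x60, 0x1C, real_io)   # untrapped: passes through
```

This is essentially what the 386's Virtual-8086 mode later provided in the CPU itself (via the I/O permission bitmap and faults), and what the Gravis Ultrasound's Sound Blaster emulation did with NMIs.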

 

That emulator card also hardware-remapped memory into PC-compatible locations. PC software hardcodes access to video memory somewhere between 640KB and 768KB. If it's not at the right byte in the right format, you get no screen.

 

Before the 80386, you had to design your hardware with PC compatibility in mind from the start. This is one of the many reasons people call PC/DOS programming a big 'kludge'. :)

 

Compare how easy it is to run Mac software on any 68K machine...

 

a number of programs that bypassed DOS and used the hardware directly . . . some bypassing the BIOS even, but that eventually dissipated.

It only dissipated because all hardware became 100% PC compatible. Professional DOS programmers bypassed the BIOS as a rule, at least for text, graphics, serial, and printing. The BIOS was just too crappy and slow. Sound never got BIOS support, so you just banged the registers until it worked.

 

In the late 90s, I helped a company move their program from DOS to Windows. Their DOS code was a nightmare. It never touched the BIOS. It had special 'quick scroll' code that abused the CGA video registers, and even read the keyboard using register access to the underlying scan codes. It didn't allocate memory -- it just 'knew' what areas would be safe to overwrite. Hopefully your emulator (or TSR for that matter) wasn't in one of those areas.

 

Windows emulated all of that perfectly... which makes me think most DOS programmers never learned to stop kludging!

 

- KS

Edited by kskunk

Hell, early PC compatibles struggled if they weren't totally low-level IBM compatibles (a number of programs that bypassed DOS and used the hardware directly . . . some bypassing the BIOS even), but that eventually dissipated. (that issue was virtually gone by the time the ST was launched)

 

You're also assuming PC standards wouldn't continue to take over and make proprietary features obsolete . . . unless those features gained enough support to become true defacto mass-market standards. (and that generally requires an open market with upgrade paths to directly compete with PCs)

 

And here's the thing: in the US, any DOS compatibility in 1985 with zero actual PC hardware compatibility would have been pretty weak (so much so that it probably wouldn't be worth the loss of the 68k) a

 

Not sure if you were around "back in the day", but, as someone who was, I can assure you that there were *many* "almost" compatibles in the late 80s and early 90s, the Zenith Z100 and its descendants and the DEC Rainbow line being two notable examples.

 

PC software went right to the hardware well into the Windows era, long after the ST was introduced.

 

A problem with this whole thread is that PC hardware was crazy expensive late 80s... You could easily drop $3K on mid range system. You can't easily bolt that kind of hardware onto a $500 computer and expect it to fly.


Not sure if you were around "back in the day", but, as someone who was, I can assure you that there were *many* "almost" compatibles in the late 80s and early 90s, the Zenith Z100 and its descendants and the DEC Rainbow line being two notable examples.

No I wasn't . . . but it was my impression that the biggest problems with less-than-full PC (as opposed to "just" DOS) compatibility were from '82-85. And there was a lot of middle ground, with some PC compatibles close enough on the hardware level to be acceptable and others that weren't. (especially if the incompatibility was limited only to areas that could relatively easily be addressed through the installation/configuration process for a new program -so obviously not any "PC booter" type games/programs assuming a fixed/standard configuration, but those were only common in the early 80s)

 

From what I know, PC software started catering to non-standard (or non-uniform, rather) hardware configurations pretty early on to address the growing clone market, and a big part of that was requiring installation of software rather than it being "plug and play" . . . and having similar issues with hardware expansions requiring additional manual configuration and tweaks. (albeit that was only partially due to being non-uniform)

 

Still, certain areas seemed to be no-nos on the hardware end, like using an 80186 (and making use of the onboard peripheral logic rather than conforming to IBM's standard configuration of timer/interrupt control/DMA logic). Though it would seem like using a 186 would still have been worthwhile with those features unused. (faster per clock performance than the 8086 with some added instructions -not nearly as good as the 286, but much cheaper)

 

PC software went right to the hardware well into the Windows era, long after the ST was introduced.

Yes, but with installation to account for varying memory maps, hardware changes, etc, etc. (ie not assuming the exact configuration of the original 5150 or PC-XT ;))

 

A problem with this whole thread is that PC hardware was crazy expensive late 80s... You could easily drop $3K on mid range system. You can't easily bolt that kind of hardware onto a $500 computer and expect it to fly.

That's more of a cost vs price issue. A lot of PC hardware didn't COST that much to manufacture, but it was priced for a high-end market with huge margins. (that and a lack of emphasis on tight, low-cost engineered designs) That's the same thing as the Mac, more or less. (the Mac128k in 1985 had a nominal price point roughly 4x that of the 520ST, but was almost certainly significantly cheaper to manufacture)

 

However, that had already changed by the mid 80s, with some rather low-cost, high-performance (relative to price) systems on the market. Tandy had the Tandy 1000 out in 1984, a machine better than a baseline PC-XT and a corrected version of IBM's own poorly executed PCJr. (iirc, at launch in late 1984, the baseline Tandy 1000 was in the $1200 range with a 4.77 MHz 8088, 128k RAM, one DSDD 40 track 5.25" drive, and I believe a parallel port -plus built-in 8-bit ISA slots, unlike the PCJr; I think RS232 may have required an expansion card, but I'm not positive) As time went on, Tandy increased the baseline to 256k and added low-end specific models. (including the console form factor HX and EX lines -the EX was priced at $800 with color monitor included by 1987)

 

That's just looking at one specific line too, not comparing the range of various clone machines available. (from Compaq to HP to Packard Bell to Atari etc -and Amstrad in Europe)

It would be interesting to see a timeline of pricing on Atari's own PCs from the late 80s. (especially comparing their single board PC-1 to the supposedly cheaper off the shelf PC-3 -while such a design would be nominally more expensive compared to the single board PC-1 with single expansion slot, the sheer volumes of competing off the shelf motherboards and components trumped any such advantage by the late 80s)

 

 

 

 

However, this is really going beside the point: the context was a minimalistic 8088/86 hardware emulator board (either with added RAM or an interface to share ST memory) along with some software emulation assistance on the ST's end for proper graphics/text/peripheral interfacing.

Such an emulation board should have been very cheap to manufacture in the late 80s (especially if you avoided much onboard RAM), but there would obviously be the issue of R&D investment to actually get it working well enough. (Plus you'd need a hard drive for the many applications requiring installation . . . and software emulation routines on the ST's end to allow that to function properly.)

 

But again, even the utility of that would be limited overall . . . in the long run it would have made far more sense to focus on promoting cross-compatible work file formats, so you could move a project between PC and ST seamlessly, like you eventually could between Mac and PC. (And the ST started off a lot closer to practically realizing that in '85 than the Mac did.)

Well, there's that context, and there's OKY2000's suggestion of a DOS/PC compatible with C64/Amiga-like custom chips at a competitive price, which is something of an oxymoron. You could certainly build a cost-competitive x86-based design with embedded custom chips, and you could facilitate basic DOS compatibility. But including proper low-level PC hardware compatibility (even on the level of the Tandy 1000) would be much more limiting, and not really better than what would be possible through a simple x86 add-on emulator board as above; it might even be worse, if the ST emulator could pair really good simulated compatibility on the 68k/ST end with that x86 board. If you tried to include compatibility with all the common graphics standards of the time, you'd use a lot of silicon just to handle Hercules/MDA, CGA, and EGA. Again, a best case might be targeting CGA compatibility alone and then aiming the enhanced custom chips to build on the CGA logic: something like Tandy's TGA, but with 9-bit RGB and hardware scrolling. Even then, you'd have to hope software developers would target the custom standard rather than EGA, or resort to using CGA graphics/text modes.

Edited by kool kitty89