
"Who Killed the Atari ST?" 1989 article on Atari computers


pacman000


Jack was in such a hurry to get a machine out that he didn't do enough homework to see what was out there and what might be happening in the near future. Sure, he (Atari) was cash-strapped, but doing things in a hurry usually never ends well. He could have run a couple of quick focus-group studies through any of a number of third-party firms that do that; he would have seen that the name Atari means games to the general public. He was a smart guy; he probably would have spun off a subsidiary company to market the ST under. (TBM Systems - Tramiel Business Machines??) Next, he should have used his C= experience, put real documentation in every box, and given free development stuff to anyone who asked. Next, use his 'personality' to get the machines into computer stores; the 8-bits could stay in Kmart and Toys-R-Us. Then write a schedule for incremental machine upgrade releases, like Intel's tick-tock, possibly making changes to that schedule based on focus-group and in-house EE input.

His revenge rush to beat C= to market with a 68000 machine is what would eventually kill it.

IMO

 

In the early-to-mid 80s, the thinking was that videogames were just a stepping stone to computers. The popular idea that "if you are a videogame company, you can't make serious computers" didn't emerge until the late 80s.

 

The 8-bits were cheap, priced similarly to a game console, so they were easy to sell. But the 16-bits cost significantly more... and that's where this mindset emerged: "If I'm going to drop serious money on a computer, it's going to be from a name I trust."

 

Using a different name might sound like a good idea, except that the new name starts off with zero name recognition.

 

computer shopper: "TBM? Who are they, some IBM knockoff?"

dealer: "actually they are a sister company to Atari"

shopper: "Atari, the game company?"

dealer: "correct"

shopper: "No thanks, I want a real computer not a toy!"

 

So I don't think using a different brand name would have helped much. Atari had far more name recognition; they just needed to overcome the "game company" reputation. To do that, I think they needed to show the important business apps earlier, build a better dealer/service network, and offer higher-build-quality professional models earlier (the Mega ST line).

 

Microsoft entering the videogame console market showed that you could play in both worlds.


Maybe people calling the PC a "Windows PC" is part of why M$ took over. What about using a PC without Windows or MS-DOS? It was possible then, and it's possible today. Thanks to the great work of some people, today you can boot a PC straight into an Atari TOS environment - look for Beekey.

 

Very interesting point... PCs running MS operating systems were all about conformity, because they were targeted at businesses rather than computer hobbyists. They later entered the mainstream when they gained "multimedia" abilities and got onto the Internet. The Atari ST was one of many platforms that appealed to non-conformists but was ultimately no longer supported by its parent company. Apple is the only exception still around, of course, but they never marketed towards PC users anyway.

 

I read an article years ago that pointed out that although PC hardware is standardized, it can still run alternative operating systems like Linux, and even ST software through emulation. It helped me transition over to the PC platform without being 'completely' stuck with Windows (which I used for games only at the time).


And what form of communication would you rather have? Message board forums work.

 

 

Hm. Can you imagine a better solution for communication than this forum (or any other: vBulletin, phpBB...)?

 

I am prone to mark people who are limited, fearful, and shallow as the main reason for the PC's success (and the failure of all the others).

Most people are limited, fearful, and shallow (unfortunately). They would rather buy IBM than a potentially better non-IBM computer.

Edited by calimero

How about a series of 2D flow charts? That way, when conversations get off topic, the new topic and the original topic could continue as separate, organized, on-topic threads. (Not entirely serious.)

It's an interesting idea, and it could be added as a feature to forum software: mods could designate a post as the start of a "spin-off topic" and the thread would split there, as in the sketch below.
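Data-model-wise, such a split is nearly trivial. Here is a minimal sketch in Python (the Post and Thread classes and the split_thread helper are hypothetical, not any real forum package's API):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Post:
    author: str
    body: str

@dataclass
class Thread:
    title: str
    posts: List[Post] = field(default_factory=list)

def split_thread(thread: Thread, start: int, new_title: str) -> Thread:
    """Move posts[start:] into a new spin-off thread.

    `start` is the index of the first off-topic post, as picked by a mod.
    """
    spin_off = Thread(title=new_title, posts=thread.posts[start:])
    del thread.posts[start:]  # the original thread keeps only the on-topic posts
    return spin_off
```

The hard parts in real forum software would be permissions, redirect links, and notifications; the data side really is just slicing a list.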


Awwww... What's the fun in that? Let's use statistics to examine post content! Then use some sort of clustering algorithm to put similar posts together! We can market it as the next big step in AI; "This isn't the same tech Excite used in the 90's; it's not how your email's spam filter works; it's new, new, new!" ;)
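Joking aside, the "old tech" being lampooned really is about this simple. A toy sketch in Python, assuming scikit-learn is installed (the post strings are made up for illustration):

```python
# Group similar posts: TF-IDF features plus k-means clustering.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

posts = [  # hypothetical post bodies
    "Jack rushed the ST to market to beat Commodore.",
    "The 68000 was a great CPU for its day.",
    "Usenet blows away every web forum out there.",
    "Web forums are painful without keyboard navigation.",
]

vectors = TfidfVectorizer(stop_words="english").fit_transform(posts)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)

for label, post in sorted(zip(labels, posts), key=lambda pair: pair[0]):
    print(label, post)
```

Whether you call that AI or just counting words is exactly the point made in the next post.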


^

Yes... there is no AI, just big data.

 

It is interesting how people try to stick the term "AI" onto anything.

Just look at the 2002 article by Uta Priss, "Alternatives to the 'Semantic Web': multi-strategy knowledge representation." It is one of my favorite articles; it sparked my curiosity to research and understand how today's computing world took shape (and what went terribly wrong: how we got to this "fake news" phase...).

The article also exposed me to names like Ted Nelson and Douglas Engelbart... (I am too young, so I missed all these great names of the 20th century) and later to Jaron Lanier and Paul Otlet...

 

---

I will answer Keatah in the next post regarding communication.


This is like politics (and we are in the middle of an election campaign here at the moment): the opposition talks up great things and makes great promises. Those in power (the current government, whom I really don't like) usually make no big promises, not even ten days before the election. Why? Because they would then be in a hard position when the promises had to be realized. The opposition can promise whatever it wants; it will never need to explain why something is not possible, because it will not be in power (most of them surely never will be). They just want votes and attention. And it is the same with the various Ted Nelsons and others: they talk about unreal things. A.I. is still at a very low level, and will surely stay there for the next 50 years. Look at a relatively simple task: translation. Even mega-companies like Google cannot provide a decent translator.

If someone came up with something really good and efficient, a new way, it would succeed. The market is the best judge. M$ has had its failures too.

 

My conclusion on this topic would be: Atari had its five minutes of fame. Even IBM left PC manufacturing completely, first desktops, then notebooks too. Actually, Apple is the only manufacturer from that era still here, and they made really serious changes; we can say that today's Macs are much more similar to PCs than to the old Macs. The real money is not in hardware but in software.


Again I ask, which form of communication would you rather have?

 

Usenet blows away every web forum out there. Unfortunately, there isn't an HTML5 equivalent of tin.

 

I hate visiting web forums so much that I try to avoid them at all costs. AA is the only forum I visit, and even it is a pain. For example, I haven't found a way to navigate threads with the keyboard.

 

Soup and Yarn under OS/2 were amazing.

Edited by gozar

  • 4 weeks later...

Atari would have had to get Adobe, Claris, and Microsoft on their side. Without those, it's tough to compete with the Mac.

 

And actually the Mac was a little backwards. :-D

 

There was a version of Microsoft Write for the ST, but it was based on an old version for the Mac and required GDOS for the fonts. And we all know how well Atari Corp. sold things...

 

Thankfully there was Mac emulation, which was how most STs got sold in the late 80s.


That is one of the biggest issues with the Mac lineup through the years:

 

You've got thousands of dollars' worth of 68k software? Tough, we're on PowerPC now. Oops, that turned out to be a dead end too, and now your one-year-old G5 is in la-la land. That's extreme, and they did try with "fat" binaries, but before the machine even got slow it was a doorstop for new software.

 

Meanwhile, if I use an IDE drive or a SATA drive in legacy mode, I can boot MS-DOS 5 on an i7. And right now in the garage I have a 2003-era Pentium M (which is a Pentium III with dynamic clock scaling) almost done installing Ubuntu 16.04 LTS, dual-booting Windows 98 SE (hell, it runs Windows 7, but there aren't any drivers for the ATI 7050M for Windows 7)...

 

Because the PC was so bland and not reliant on custom chips, it can be anything you want it to be: monster gaming rig, basic office box, CAD workstation, or server. And it can change on a whim (i.e., my old gaming rig, after a card swap from a graphics card to a PCIe RAID controller, is now my home server).

 

I'm typing this on my main machine, a 2008 Mac Pro. It's 10 years old and still happily running all up-to-date software. Yet I've also got Snow Leopard installed on here, which allows me to run quite a large amount of PPC software via Rosetta. I also have Windows 7 and Windows XP installed on it, all fully supported by the hardware. Now while this is not a cutting-edge machine by any means, it's still more than quick enough to run some of the power-hungry apps I use.

 

The PC is a slave to its legacy and variance, which is a blessing and a curse: yes, in theory you can run everything, but in practice a tonne of old stuff simply will not run on newer kit. It's the same with everything. There's no such thing as future-proof, and there's no such thing as 100% backwards compatible when the platforms have changed so much over the years.


Yes, there is no 100% backward compatibility, but there were solutions that ensured a good level of it over the years, almost decades. And in recent years there is a trend, more than before, to make things intentionally incompatible. You can't use a slightly older graphics card just because they make Windows refuse older drivers, and of course there are no new drivers from the manufacturer.

The PC was surely the most future-proof architecture, partly because of the slots, the easy expansion, and the whole concept, and partly because the CPU family avoided removing old instructions for over three decades.

Motorola was not that good at this, we must admit. The exception stack frame change was one of the bad moves; it caused problems for some software and for OS coders. Later they even removed some instructions, like MOVEP, which was actually used quite a lot in software. Most problems with 8086-based CPUs came from higher speeds, which made some software stop working properly.
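For anyone who never wrote 68K assembly: MOVEP existed to feed 8-bit peripheral chips wired to one byte lane of the 16-bit bus, transferring a register's bytes to every other address in a single instruction. A rough Python simulation of the store form, only to illustrate the semantics that later chips dropped (the function name and memory layout are made up):

```python
def movep_l_store(mem: bytearray, addr: int, value: int) -> None:
    """Mimic 68K MOVEP.L Dn,(d16,An): write the four bytes of a 32-bit
    value to alternating byte addresses, most significant byte first."""
    for i in range(4):
        mem[addr + 2 * i] = (value >> (8 * (3 - i))) & 0xFF

mem = bytearray(8)
movep_l_store(mem, 0, 0x12345678)
# Bytes land at offsets 0, 2, 4, 6; the odd bytes stay untouched.
assert list(mem) == [0x12, 0x00, 0x34, 0x00, 0x56, 0x00, 0x78, 0x00]
```

Software that leaned on the real instruction had to be patched, or trapped and emulated, once it was gone.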

 

Considering the speed of a ten-year-old computer and modern software - hmm, the truth is that it is not only about speed. I tried Mozilla Firefox on a ten-year-old PC (an older version, of course), and there were lots of problems with most websites. The speed was not so bad, but it was not possible to install a newer version because there was no SSE in the older CPU. A better example would be multimedia: you need a quad-core CPU at a minimum of 3 GHz to watch movies at high resolution with an efficient codec, or, for instance, to watch some sport on TV at 50/60 fps in full HD. Not to mention encoding.
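The SSE wall is easy to check for yourself. A small sketch, assuming Linux, that just looks for the feature flag in /proc/cpuinfo (on a pre-Pentium III x86 the "sse" flag is simply absent, which is why SSE-compiled builds refuse to install or run there):

```python
# Linux-only: parse /proc/cpuinfo and look for a given CPU feature flag.
def has_cpu_flag(flag: str) -> bool:
    with open("/proc/cpuinfo") as f:
        for line in f:
            if line.startswith("flags"):
                return flag in line.split(":", 1)[1].split()
    return False

print("SSE supported: ", has_cpu_flag("sse"))
print("SSE2 supported:", has_cpu_flag("sse2"))
```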

People want speed, and software wants speed. That will always be one of the most important things, and there will always be software that benefits from it.


Motorola was pretty dumb, dare I say idiotic, with how they evolved the 68K architecture.

 

 

The level of forward and backward compatibility in PCs used to be quite good. Now that the platform has won the race to the bottom, thanks to monetization and an increased focus on premature obsolescence, it's not so good anymore.

 

A real-life example is the Windows 10 Wi-Fi card issue. With one small stroke of the driver brush, they caused an uptick in PC sales. It was a test to see how consumers would react. Most said fuck it, didn't troubleshoot the issue, and threw away perfectly good machines that were five years old.

 

In the old days people threw away machines because of naturally evolving speed. One year we had 33MHz 386s, the next 66MHz 486s, and the next, the Pentium 60. That was good for sales and low-hanging fruit for the marketing drones.

 

Once we hit the 2010-2013 timeframe, people stopped upgrading for speed; entry-level processors had become fast enough to avoid churn. Companies felt the downturn in sales, and to try to maintain sales they started breaking compatibility. They used "security" as a reason to upgrade, and once that started dying down they started intentionally breaking drivers. This shows up in "gamer" hardware first, like graphics cards. They also axed the BIOS and moved you to a subscription/service OS.

 

So yes, even legendary compatibility standards are no match against the need to monetize everything.


Once we hit the 2010-2013 timeframe, people stopped upgrading for speed; entry-level processors had become fast enough to avoid churn.

That was around the time people started buying flashy new smartphones and tablets and realized they didn't need their PCs as much, so they postponed upgrading them.


I wonder if a start-up could make a living writing drivers for popular PC add-ons (in cases where the original equipment manufacturer refuses to).

 

Back in the day, some people would write their own device drivers on an as-needed basis...

 

It would be a cool way to escape the manipulation of the market through driver availability (or lack thereof).


Would there be enough information available to make it happen in a timely manner? PC parts can be rather complex, and manufacturers won't be making the important details available either.

 

I would tend to think that modifying or interfacing existing drivers could be possible.


Based on what I've seen people do in the past, I think it's possible too.

 

I can see video cards being difficult.

 

However, things like audio interfaces, input devices, printers, scanners, etc... should be doable.

 

I still recall a printer driver I needed when the color Deskjet printers first arrived on the scene. Some dude in Germany wrote one and posted it on an FTP server. It worked flawlessly.


It's not only that there is no new driver version for a slightly older card; for instance, Win 10 will not allow installing an older driver, for some silly reason. That happened two years ago with my Radeon card: Win 10 installed a driver by itself, from its own installation package, and it worked well, except that it was not possible to set some things, like brightness and contrast. It said to set them on the monitor, which was pretty stupid and annoying. So I played around a little with the files from a Win 7/8 Radeon driver for that card. The main installer program would not run, but I found that I could run its separate parts, so I was able to install CCC (Catalyst Control Center), and the problem was solved. Well, not the problem called Win 10 and its constant updates, which resulted in my abandoning it after some eight months.

