Keatah

Computers and the videogame crash of the '80s.


5 hours ago, elmer said:

Anyway ... I believe that the answer has really already been given, and it's not about some technological superiority of computers-with-slots (which doesn't seem to have stopped the iPhone/iPad), it is about corporate fear and consumer herd-mentality.

 

Commodore and Atari tried to make something different, and gave it their best try, but everyone else jumped on to the IBM clone bandwagon, and the sheer number of the alternatives made their success seem both desirable and inevitable to both consumers and businesses ... which then became a self-fulfilling prophecy.

 

As was said earlier, in business "Nobody ever got fired for buying IBM", and by the late 1980s most buyers didn't want to make a "mistake" and so buying an IBM clone was the "safe" bet.

The idea of computers not being controlled by one brand and easily expandable using cheap components is not something anybody could defeat. That's why PCs have dominated the market and everyone else faded away. It has nothing to do with herd mentality, and if it's a bandwagon, then it's one I was very happy to jump on. Apple has only survived because of the specific niche it occupies and its "boutique" airs.

6 hours ago, elmer said:

Anyway ... I believe that the answer has really already been given, and it's not about some technological superiority of computers-with-slots (which doesn't seem to have stopped the iPhone/iPad), it is about corporate fear and consumer herd-mentality.

Having slots doesn't make a computer technologically superior. Slots allow for great creativity and versatility. And whole businesses (even industries) were built around products that plugged into those ISA slots.

 

The PC was never even considered technologically superior to other contemporary systems, especially in sound, graphics, and gaming, as anyone will tell you. It was only in the later '90s, when Sound Blaster cards and 3D graphics chips got going, that its gaming prowess became legendary.

 

The iPhone is a self-contained communications device. Its versatility, purpose, and practical usage are made manifest via connectivity, software, and sensors. There's no need for modularity or slots.

 

Herd mentality is a good thing. It promotes standards and devices that work relatively trouble-free. Everyone knows how to use them. They can go about their business without babysitting the technology. I've saved tons of money since I started going with established industry standards as opposed to some whiz-bang one-off product. It's like those light-field cameras and how they were supposed to revolutionize photography. Fast-forward 10 years and we're seeing that uber-high-tech stuff show up in devices that babies dribble drool on.

 

6 hours ago, elmer said:

Commodore and Atari tried to make something different, and gave it their best try, but everyone else jumped on to the IBM clone bandwagon, and the sheer number of the alternatives made their success seem both desirable and innevitable to both consumers and businesses ... which then became a self-fulfilling prophecy.

That's ok. Nothing wrong there. If it wasn't the PC that became prevalent, it would have been something else. And we'd be talking about that something else in the same style and manner.

 

6 hours ago, elmer said:

As was said earlier, in business "Nobody ever got fired for buying IBM", and by the late 1980s most buyers didn't want to make a "mistake" and so buying an IBM clone was the "safe" bet.

I learned that lesson after dumping over $1,500 into an Amiga that couldn't be expanded beyond a certain point. I wish I had put that into a PC earlier on, or saved it. But it all worked out in the end.

 

I simply got tired of sitting on the sidelines. It was harder to share my work with anyone else. And it was twice as hard to watch changes, upgrades, and compatibility arrive and not get to benefit from them.


Any computer you bought in the later 1980s would have been obsolete within a few years. Software was being held back by the hardware, so changes were happening quickly. If you bought a no-name IBM compatible, the only thing you might have saved was the case. Name-brand PCs often had proprietary designs, making board swaps impractical. An affordable IBM PC compatible in the mid-1980s would have meant monochrome graphics and awful sound. I don't know how many people had CGA hooked up to their TV. In hindsight, a Tandy 1000 would have been the way to go, but Radio Shack didn't have a reputation for quality. The IBM PC and compatibles were dominating market share, but that was mostly in the office. They didn't become the dominant home computer until the internet arrived in the mid-1990s.

 

I was looking for a computer in the mid-1980s, and the Amiga was at the top of the list. I considered an IBM PC compatible, as well as older technology in the C128, but in the end it didn't feel right buying anything. I waited until 1989, when I bought an IBM PC compatible. I paid more than double what an Amiga cost and then constantly upgraded components for the next twenty years. I don't think I saved much, if anything, in hardware over buying an Amiga a couple of years earlier. The question with the Amiga would have been where to go to pirate software.

13 hours ago, Bill Loguidice said:

Well again, I don't know how much "middle ground" there could be. The 520ST was around $800 with a monochrome monitor, and with a color monitor it was around $1,000. That was in 1985. (I do believe the price actually increased a bit not too long after launch, but let's just go with those numbers for now.)

I don't think it increased, because a few months later they announced the 1040STf mono system for $1,000.

 

13 hours ago, Bill Loguidice said:

Now, the Commodore 128 was around $300 in 1985. Add in a disk drive and monitor and you're looking at around $600 or so. So, really, that was your mid-range, like with the CoCo 3 (which came out a bit later). The "mid-range" ended up being really close to the pricing of the entry-level range of the Atari ST and Amiga 500

The 128 could use your old monitors and keep your old peripherals. A lot of people back then were looking for an upgrade path and didn't want to start fresh the way the Amiga and ST forced you to. The window for mid-range systems would have been '85-'87; '87 is when the Amiga 500 came out, and when the STfm line came out with an RF modulator, and you could buy it without a monitor for about $400. Two years might not seem like a long time, but considering how fast these companies introduced new models back then (between 1982 and 1985, Atari's flagship model went from the 800 to the 1200XL to the 800XL to the 130XE), it really wasn't.

 

Another factor that I don't think these companies fully appreciated back then was that the 8-bit computers were the game consoles of the mid-'80s, especially the C64. When the 16-bits came, they were too expensive for many of these people, which gave an opportunity for dedicated consoles like the NES to return. Mid-range models may have helped keep those gamers around.

4 hours ago, youxia said:

The idea of computers not being controlled by one brand and easily expandable using cheap components is not something anybody could defeat. That's why PCs have dominated the market and everyone else faded away. It has nothing to do with herd mentality, and if it's a bandwagon, then it's one I was very happy to jump on. Apple has only survived because of the specific niche it occupies and its "boutique" airs.

Being open and not owned by a single company is not enough; there have been many attempts at similar things that failed to take off. The fact that IBM controlled it in the beginning is what brought people in. The fact that you could clone it at a fraction of the price is what kept it going.

 

3 hours ago, Keatah said:

The PC was never even considered technologically superior to other contemporary systems, especially in sound, graphics, and gaming, as anyone will tell you.

And speed! Try using a 4.77 MHz 8088 PC. It feels much slower than the 6502-based systems!
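The clock numbers alone roughly back this up. A quick sketch of effective instruction throughput (the clock rates are real; the average cycles-per-instruction figures below are ballpark guesses for typical code, not measurements):

```python
# Rough effective-speed comparison. The average cycles-per-instruction
# (CPI) values are illustrative estimates, not measured figures.
machines = {
    "6502 @ 1.02 MHz": (1_020_000, 3),   # short, simple instructions
    "8088 @ 4.77 MHz": (4_770_000, 15),  # microcoded, 8-bit bus-limited
}

for name, (clock_hz, avg_cpi) in machines.items():
    ips = clock_hz // avg_cpi
    print(f"{name}: ~{ips:,} instructions/sec")
```

Under those assumptions the 8088's nearly 5x clock advantage mostly evaporates, which matches the "feels much slower" impression.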

5 minutes ago, zzip said:

The 128 could use your old monitors and keep your old peripherals. A lot of people back then were looking for an upgrade path and didn't want to start fresh the way the Amiga and ST forced you to.

Yes, the same with the CoCo 3, but the point was that to make them "mid-range" upgrades over the C-64 or CoCo 1/2, respectively, you needed to add a few things, like a feature-specific monitor.

10 hours ago, elmer said:

 

Yes, that was rather my point ... there was a rapid pace of change, and regular price drops made *all* machines more affordable as time went by, both at the top and the bottom of the ranges, and at any theoretical "middle" range.

 

The mid-range option that I'm missing in the US market is precisely what I just pointed out ... some theoretical US machine that could compete with the Amstrad CPC range, because that was what was *affordable* to build in 1984/1985. A cheap computer with an 80-column text mode, and 16 colors in chunky-pixel 160x224, or 4 colors at 320x224.
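Those two modes, notably, work out to the same amount of video RAM, which is part of what made such a machine cheap to build. A quick sketch of the arithmetic (pixel counts taken from the figures above; color counts assumed to be powers of two):

```python
# Video RAM needed for the two hypothetical modes described above.
# 16 colors -> 4 bits per pixel; 4 colors -> 2 bits per pixel.
def framebuffer_bytes(width, height, colors):
    bits_per_pixel = (colors - 1).bit_length()  # assumes power-of-two colors
    return width * height * bits_per_pixel // 8

chunky = framebuffer_bytes(160, 224, 16)  # 16-color chunky-pixel mode
hires  = framebuffer_bytes(320, 224, 4)   # 4-color higher-res mode

print(chunky, hires)  # both come out to 17920 bytes (~17.5 KB)
```

Same RAM footprint either way, so the designer only trades resolution against colors, not memory cost.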

 

To put it in terms that might make sense to an American ... let's call it the computer equivalent of what Nintendo was designing in 1982 ... the hardware technology that knocked American companies out of the ring as home-video-game manufacturers.

 

Unless I'm missing something, American computer companies didn't release anything significantly *new* (I do NOT include cost-reduced versions of existing machines) for the home market in 1983/1984/1985.

 

The Coleco Adam comes the closest ... but since it was just a retread of 1982's ColecoVision, with its 1979 graphics chip, and based on the same Texas Instruments reference design that gave us the Spectravideo, the Tatung Einstein, the Memotech MTX, and the original MSX1 ... it hardly counts as either uniquely interesting or as a technical upgrade.

 

The Commodore 128 in 1985 offered far too few improvements to be worth discussing, and while the CoCo 3 was a lovely upgrade to the CoCo 2 in 1986, it was still too little, too late.

 

Anyway ... I believe that the answer has really already been given, and it's not about some technological superiority of computers-with-slots (which doesn't seem to have stopped the iPhone/iPad), it is about corporate fear and consumer herd-mentality.

 

Commodore and Atari tried to make something different, and gave it their best try, but everyone else jumped on to the IBM clone bandwagon, and the sheer number of the alternatives made their success seem both desirable and inevitable to both consumers and businesses ... which then became a self-fulfilling prophecy.

 

As was said earlier, in business "Nobody ever got fired for buying IBM", and by the late 1980s most buyers didn't want to make a "mistake" and so buying an IBM clone was the "safe" bet.

As a US person who has owned various UK Amstrad CPC-based systems, and even the stillborn console, there is no way I'd consider it the hypothetical "mid-range" option. It was slightly better than a C-64 in some ways and worse in others, in my opinion.

 

We *did* have new low-end computers and even pseudo mid-range options here during the period you stated, 1983-1985 (not counting the Atari ST and Commodore Amiga at the tail end). There was the Mattel Aquarius, Timex Sinclair 2068, Commodore 128, Spectravideo 318/328, Sinclair QL, Coleco Adam, Yamaha (MSX 1), etc., but none save for the Coleco Adam (at least for a short time) and Commodore 128 sold worth a damn, because there were close to half a dozen already entrenched solutions like the Apple II, Atari 8-bit, CoCo, C-64, IBM PC, TRS-80, etc., and not enough of an appreciable difference or compelling reason to choose one of the new platforms over what was already out. Naturally, with the Atari ST and Amiga (and Macintosh to a lesser degree), you had real reasons to want to upgrade. The technical gulf was easily noticeable versus more of the same (or less, or compromised) with the other stuff.

 

In terms of IBM PCs and Compatibles, I tend to give them more credit these days than I've given them in the past. They were expensive and clunky early on, and not appreciably better than some of the other computer systems out there for the relative extra cost (and yes, in some ways, much worse), but there were relatively easy upgrades and definitely a slow but steady evolution. It's the story of the tortoise and the hare, really, and it eventually came down to what it usually comes down to: software. The platform stuck around long enough to get the majority of the best and most compelling software (and, being business-centric first, didn't have to worry as much about the whims of the home market), so it eventually became a matter of what the other platforms, even the Atari ST and Amiga, were missing. The Amiga especially was a great game machine, but was not a great pure productivity machine when it came to robust word processing or spreadsheets. Once the technical gulf was bridged, especially in terms of audio-visuals on the PC side, it was naturally game over for anything else.

 

The bottom line is that markets tend to stabilize and standardize. In many ways, having more choices is a negative. We really didn't need a dozen (and prior to that, several dozen) different and incompatible computer platforms. It's fun for us, but for the average person who just wants something that works, something that runs what they want, and something that has great support thanks to its ubiquity, more options are really not a good thing. You can see the same homogenization that happened with computers happen with consoles, with smartphones and tablets, and many other things. It's an inevitability and, again, something I'd argue is not necessarily a bad thing in terms of accessibility.

3 hours ago, Keatah said:

I learned that lesson after dumping over $1,500 into an Amiga that couldn't be expanded beyond a certain point. I wish I had put that into a PC earlier on, or saved it. But it all worked out in the end.

 

I simply got tired of sitting on the sidelines. It was harder to share my work with anyone else. And it was twice as hard to watch changes, upgrades, and compatibility arrive and not get to benefit from them.

This. I was a HUGE Amiga platform advocate back in the day, but eventually, and inevitably, it started to show its limitations. Moving to the Amiga 1200 was simply not going to be a good option for me, especially once I got a taste of the PC side of things with a 386 SX-20 we also had in the household. It was hardly a beast, but its VGA graphics and sound-card audio were still pretty impressive when used right, and I loved games like Wolfenstein 3D on it, which in many ways was a match for all but the best-designed Amiga OCS/ECS software. The Amiga simply no longer had the "wow" factor, and my next computer was a brilliant Pentium 90 Gateway 2000 with a Trinitron monitor. That ran Doom like silk. I never went back (other than what I do now with all classic systems, which is play with and use them on a hobby basis).

On a side note, I remember as a kid advocating for the VIC-20 (before I moved on to a C-64), then the C-128, then an Amiga. In each of those cases where I convinced someone to take action, the person who got the computer was disappointed. In the case of the VIC-20, the C-64 was the better choice; in the case of the C-128, the person was envious of PC stuff; and in the case of the Amiga, he got an Amiga 2000 and was disappointed that it couldn't display GIFs like a PC (as just one example). Over time, I came to understand that I can't give advice to someone based on my own biases. What's right for me (or what I love) is not right for them. Ever since, I've always tried to understand all of the needs and present all of the possible options, then leave it up to the person to make the decision (kind of like the whole iOS versus Android thing when someone asks me). That's part of why my perspective on platforms like the IBM PC and Compatibles has evolved over the years. Once I removed my biases of what I KNEW was better and isolated what I knew was better for ME, I came to appreciate more what owning one meant even back then.

 

And one last point on the whole PC versus Macintosh, ST, and Amiga debate: using older versions of those platforms in unexpanded ways, you kind of appreciate how much faster the text-based OS (DOS) could be than the early GUI-based stuff. I enjoy my Mac SE as part of my collection now, but it's not exactly a speed demon. It's the same thing on the Amiga. As brutal as the text-based interfaces on the PC could be, they were almost always really fast, if not exactly friendly. Even the first truly usable versions of Windows, starting with 3.0, had a certain speed to them with the right system running them.

1 hour ago, zzip said:

Another factor that I don't think these companies fully appreciated back then was that the 8-bit computers were the game consoles of the mid-'80s, especially the C64. When the 16-bits came, they were too expensive for many of these people, which gave an opportunity for dedicated consoles like the NES to return. Mid-range models may have helped keep those gamers around.

100% agree with you here. Moreover, the NES was eventually able to undercut gaming 8-bit computers, like the C64, which really helped propel it forward.

18 minutes ago, Bill Loguidice said:

That's part of why my perspective on platforms like the IBM PC and Compatibles has evolved over the years. Once I removed my biases of what I KNEW was better and isolated what I knew was better for ME, I came to appreciate more what owning one meant even back then.

A lot of us kids were enamored with graphics and sound. And if a machine was powerful enough to have great graphics, then obviously it was powerful enough to handle everything. I couldn't appreciate why the spreadsheet and word-processing apps on my machine weren't good enough; why did it have to be the overpriced name-brand apps from Lotus or WordPerfect? I thought when people said "I don't want to learn a new app" they were just being lazy. Now that I work, I understand you don't always have time to learn something new when you're on a deadline. I also didn't realize that I was a hobbyist who liked to explore what a computer could do, while a lot of other people were just looking at computers as tools to get tasks done.

 

1 hour ago, Bill Loguidice said:

 

And one last point on the whole PC versus Macintosh, ST, and Amiga debate: using older versions of those platforms in unexpanded ways, you kind of appreciate how much faster the text-based OS (DOS) could be than the early GUI-based stuff. I enjoy my Mac SE as part of my collection now, but it's not exactly a speed demon. It's the same thing on the Amiga. As brutal as the text-based interfaces on the PC could be, they were almost always really fast, if not exactly friendly. Even the first truly usable versions of Windows, starting with 3.0, had a certain speed to them with the right system running them.

I remember how the computer press touted these new "all bitmap / no text mode" displays as an innovation that rendered text mode obsolete. But yes, a vanilla ST without a blitter is pretty brutal for text speed. The blitter helps, and using a screen-accelerator app like Quick ST helps even more. Those two things make the machine feel so much faster, but it still can't keep up with text mode on PCs.

 

I just installed Win 3.1 on an old 486, and it's super snappy, even before I installed the custom graphics drivers. There's an old joke that what Intel gives, Microsoft takes away. But later versions of Windows on newer hardware feel much slower than Win 3.1 on a 486.


When Windows 3.x came out, a 486 was a high-end workstation. The 386 was just becoming affordable. At that time I had a 386 and struggled to make room on my 40MB hard disk for Windows 3.x. I gave up. By the time I upgraded my computer sufficiently, it was time for Windows 95. I pretty much skipped Windows 3.x.

 

------------

 

I think a consolized Amiga priced at around $350 or less may have worked in 1985-87. It would have sat at the high end of video games, coexisting with the NES at the low end. It's too bad Amiga was bought by a computer company, but video game companies were disappearing in North America at the time.

2 hours ago, Bill Loguidice said:

... Over time, I came to understand that I can't give advice to someone based on my own biases. What's right for me (or what I love) is not right for them. ...

This pretty much sums up the responses to the recent "what retro computer would you recommend..." thread.
People gave responses based on their own biases and what features are/were important to them.
But if you look at the responses, some people do list what they base their suggestion on.
 

 

6 hours ago, Keatah said:

...

Herd mentality is a good thing. It promotes standards and devices that work relatively trouble-free. Everyone knows how to use them. They can go about their business without babysitting the technology. I've saved tons of money since I started going with established industry standards as opposed to some whiz-bang one-off product. It's like those light-field cameras and how they were supposed to revolutionize photography. Fast-forward 10 years and we're seeing that uber-high-tech stuff show up in devices that babies dribble drool on.

 

That's ok. Nothing wrong there. If it wasn't the PC that became prevalent, it would have been something else. And we'd be talking about that something else in the same style and manner.

...

When it comes to the herd/standardization movement... my one complaint is that computers hadn't had enough time to evolve before we settled on "standard" machines that were supported.
I think that actually slowed the development of new solutions, alternate CPUs, better operating systems, etc., because there wasn't as much competition.
Can you imagine how much faster machines would have developed if we had waited a decade?


In spite of the fact that I used my Amiga 3000 through several generations of PCs, and I still consider its programming environment the most productive I've ever worked in, I would have been better off getting a PC because it was the chosen standard.
 

Edited by JamesD

The one thing that stood out on the PC for me was RAM and a big CPU address space.

 

Even the original ones, which were pretty slow by other 8-bit computing standards, could take a lot of RAM, and that matters a lot on bigger tasks. Back in the day we would have fun with a CoCo or Atari, Apple, whatever, beating the PC at a variety of things. But the moment it became something like "Hey, let's repaginate that book," or "Let's put the entire operations budget into Lotus," maybe even "Hey, let's develop a CNC program for that mill," RAM was all that mattered. The rest is nice to have, but totally secondary.
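The RAM arithmetic behind "put the entire operations budget into Lotus" is easy to sketch. A toy model (the bytes-per-cell figure is an illustrative guess, not any real spreadsheet's storage format):

```python
# Why big tasks buried 64 KB machines: naive worksheet memory needs.
# 16 bytes per occupied cell is an illustrative assumption.
def worksheet_bytes(rows, cols, bytes_per_cell=16):
    return rows * cols * bytes_per_cell

small  = worksheet_bytes(100, 10)    # 16,000 bytes: fits a 64 KB machine
budget = worksheet_bytes(2000, 30)   # 960,000 bytes: needs PC-class RAM
print(small, budget)
```

A hobby-sized sheet fits anywhere; a real operations budget blows past 64 KB (and even 640 KB) almost immediately, which is the whole point about RAM mattering most.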

 

That observation has stuck with me over the years and has held true.  If you have a choice?  Max out RAM, then CPU, then storage capacity and speed, then other stuff as makes sense.  The only exception is tasks that will bury the RAM no matter what.  Then you gotta move storage speed up above CPU for an optimal experience and time to task complete or iterate.

 

Some Apple II users had megabytes installed. Boot it, let it grind away loading up all that RAM, then go do whatever it is quickly. Battery-back that RAM for a repeat experience that's even quicker.

 

A PC equipped with a lot of RAM was slow, but not that slow when it came to those larger-data tasks. CAD is something I made a career out of, and back then, being able to load a system and actually process a respectable drawing took either a ton of RAM or a ton of TIME. Or it just wasn't possible at all, depending.

 

A larger address space with similar amounts of RAM tells a tale. At the time, an Apple spent more time managing 64K than the PC spent eating cycles for no good reason. 'Nuff said; the path was clear the day I saw that dynamic. I figured they would clean up the cycles, and they did. Intel knew something a lot of companies didn't: raw cycle count was not going to be all that important given the increased CPU speeds that were coming. Many of our favorite machines ran in short clocks, hitting RAM fast. That's what makes the magic happen.

 

All that BS the 8086 and 8088 did was sloooooow. But the '286 clocked well above RAM. Remember wait states? Yeah. And suddenly all that BS, caches, and other advanced things mattered a whole lot. Intel knew it, designed for it, and with the IBM deal learned they could literally turn math into heat: the better the cooling system, the faster it goes.

 

ARM, by the way, is headed down that same path, but it spent a lot more time on efficiency. The Intel game has topped out. It will be interesting to see where the ARM-type philosophy goes. It might get us farther. ARM plus custom chips? I think it definitely will. And custom silicon is coming around again. Just watch. The big players are already doing it.

 

The moment the PC dropped, the scopes of minicomputing and microcomputing overlapped, and it all got much bigger very quickly. Graphics, sound, all that stuff lagged, of course.

 

It was just getting bigger address spaces and storage to match that really took things forward, IMHO.

 

LOL, and my first PC was a $20 Amstrad POS. But what did it have? RAM and storage. Enough for me to make thousands and get some real machines. A 20MB Hard Card, an 8088, CGA, later an 8087, and boom. I could do big CAD. Sloooooowly. But I could do it for a $20 spot. That's what the PC really did, and how it changed things.

 

Ordinary people could get into bigger tasks, and they did.  And it didn't take long for getting into bigger tasks to be cheap.  No stopping that train.  And it's fine.  I enjoy the distinctive machines and always have, always will.

 

Started getting real work done on an Apple, because RAM and storage.  Continued getting real work done on a PC.  Was gonna happen that way no matter what, IMHO.

 

Because real work always trumps having fun.  

 

(Unless you can have fun at work, which is what I often did.  WOLF3D, DOOM, Serious Sam, and many others were all played on higher-end "work" machines, which I later brought home when they got cheap or I had an opportunity.)

 

Really, for me the only exception was SGI.  I ended up with those at home because of the work I did.  AWESOME.  Sure wish we had got a taste of what they were doing in the home somehow, and earlier.  Those of us lucky enough to get a taste of that were gaming in 3D, yelling at one another, giving the bird over webcams in the early 90's...  Super $$$$$$$ though.

 

 

 

 

Edited by potatohead

Share this post


Link to post
Share on other sites
22 minutes ago, potatohead said:

A PC equipped with a lot of RAM was slow, but not that slow when it came to those larger data-type tasks.  CAD is something I made a career out of, and back then, being able to load a system and actually process a respectable drawing took either a ton of RAM or a ton of TIME.  Or it just wasn't possible, depending.

They had RAM all right, but coding for that segmented memory model was no fun.  Managing memory for TSRs in DOS was a PITA.  I still can't explain the difference between expanded memory and extended memory.  Thankfully the 386 cleaned up that mess.
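For anyone who never suffered it: real-mode 8086 addresses were a 16-bit segment and a 16-bit offset combined into a 20-bit linear address, so many different segment:offset pairs aliased the same byte.  (And for the record: expanded memory, EMS, was bank-switched into a 64K page frame below 1MB, while extended memory, XMS, sits above 1MB and needs protected mode or a driver like HIMEM.SYS to reach.)  A minimal sketch of the addressing math:

```python
# Real-mode 8086 segment:offset addressing: linear = segment * 16 + offset,
# masked to the 20-bit address bus.  A sketch of the scheme, not an emulator.

def linear(segment, offset):
    """Combine a segment and offset into a 20-bit linear address."""
    return ((segment << 4) + offset) & 0xFFFFF

# Aliasing: two different segment:offset pairs, same physical byte.
assert linear(0x1000, 0x0000) == linear(0x0FFF, 0x0010)

# Each segment gives a 64K window, so pointers came in "near" (offset only)
# and "far" (segment + offset) flavors, and any object over 64K needed
# manual segment arithmetic.  The 386's flat 32-bit mode ended all that.
print(hex(linear(0xA000, 0x0000)))  # 0xa0000, the start of VGA memory
```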


People are still mixing up the Internet and the WWW, I see.

 

Anyway, one thing that I believe gets people in a fit over the PC is that, unlike in other industries, the PC is the ONLY ecosystem available.  As all the alternatives died, PC compatibles were ALL that was left.  Instead of there being another unified ecosystem to compete against, at least ONE alternative should have been a thing, but that never happened.  It's basically an ecosystem-based monopoly, with the only "competition" being the operating systems, which, let's be honest, is also basically a monopoly in all but name unless you are an enthusiast.

 

Modern Macs were still PCs, with the only difference being the OS, but the format and traditional pricing of modern Macs for the last decade and change place them out of the mass market, which is still dominated by Windows.  ChromeOS is a slapped-on interface over a web browser that's also available on Windows, but it may be the closest thing to a "fake" competitor to PCs, since compatibility ends there.

 

Most other technological industries have two or more competing standards, so I get why many people make books, blogs, or multi-part video series about how Microsoft and other companies basically used unethical methods to knock off the competition.  Sure, I can see that, but you have to place some blame on the computer companies of the time, like Commodore, Atari, and Sinclair.

7 minutes ago, zzip said:

They had RAM all right, but coding for that segmented memory model was no fun.  Managing memory for TSRs in DOS was a PITA.  I still can't explain the difference between expanded memory and extended memory.  Thankfully the 386 cleaned up that mess.

Yeah, but it was less difficult and more efficient overall than banking in smaller address spaces.  I did not actually buy into a PC, other than the $20 POS I got going with, until the 386.  Felt the same way.
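The "banking in smaller address spaces" being compared here works like this: an 8-bit machine with a 64K address space maps a window of that space onto one of several banks of a larger memory, and a single register write swaps which bank the window shows.  A toy sketch (the window base, size, and bank count are illustrative assumptions; real schemes like the C64's or EMS boards differ in detail):

```python
# Toy model of 8-bit bank switching: a 16K window at 0x8000 inside a 64K
# address space can be pointed at any bank of a larger memory.

WINDOW_BASE, WINDOW_SIZE = 0x8000, 0x4000  # assumed layout, for illustration

class BankedMemory:
    def __init__(self, banks):
        self.banks = banks      # list of 16K byte arrays
        self.current = 0        # which bank the window currently shows

    def select(self, n):
        self.current = n        # one register write swaps 16K at once

    def read(self, addr):
        if WINDOW_BASE <= addr < WINDOW_BASE + WINDOW_SIZE:
            return self.banks[self.current][addr - WINDOW_BASE]
        raise ValueError("outside the banked window in this toy model")

# Eight 16K banks, each filled with its own bank number for visibility.
mem = BankedMemory([bytes([i]) * WINDOW_SIZE for i in range(8)])
mem.select(3)
assert mem.read(0x8000) == 3    # same address...
mem.select(5)
assert mem.read(0x8000) == 5    # ...different data after a bank switch
```

The pain point is that the *same address* means different things depending on bank state, so code and data straddling a bank boundary need careful juggling; segments were ugly too, but at least every far pointer named its own window.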

8 minutes ago, Leeroy ST said:

 

 

Anyway, one thing that I believe gets people in a fit over the PC is that, unlike in other industries, the PC is the ONLY ecosystem available.  As all the alternatives died, PC compatibles were ALL that was left.  Instead of there being another unified ecosystem to compete against, at least ONE alternative should have been a thing, but that never happened.  It's basically an ecosystem-based monopoly, with the only "competition" being the operating systems, which, let's be honest, is also basically a monopoly in all but name unless you are an enthusiast.

 

Modern Macs were still PCs, with the only difference being the OS, but the format and traditional pricing of modern Macs for the last decade and change place them out of the mass market, which is still dominated by Windows.  ChromeOS is a slapped-on interface over a web browser that's also available on Windows, but it may be the closest thing to a "fake" competitor to PCs, since compatibility ends there.

 

Most other technological industries have two or more competing standards, so I get why many people make books, blogs, or multi-part video series about how Microsoft and other companies basically used unethical methods to knock off the competition.  Sure, I can see that, but you have to place some blame on the computer companies of the time, like Commodore, Atari, and Sinclair.

 

Yes, the early era was colorful and distinctive.  Fun ride!

 

I think we are headed toward another one, frankly.  Apple is moving to its own silicon.  Some big players are making custom CPUs maximized for given tasks.  ARM devices are everywhere.

 

Now, the downside is open vs. closed computing.  That's going to come up again right soon, and already is.  Just booting a machine is already an issue, and it's going to get worse.  Custom silicon will prove to be effective at further reducing power while maximizing task efficiency and performance too.

 

For a taste of the old era, people can explore microcontrollers.  Some are quite advanced, yet remain fairly accessible in the way our favorite machines were.  It's a pretty fun scene right now, and people are even making games!  Some of those are new, many are ports, and others are emulation-type deals.

 

We may soon see a simpler computer rise out of all that.  Capable, in the fun way, but not so much in the way many of us use PCs and mobile today.  The nice thing, should that happen, is a smaller ecosystem that's not on the update treadmill.  Ordinary people can build skill and then actually get stuff done, because the investment in time will endure long enough for it all to pay off.

 

As for blame... maybe.  Everyone made mistakes and/or bad calls.  How could they not, given the times?

 

Frankly, it was all set to go boom!  Set piece in my view looking back.  Just having general connectivity changed things in crazy ways!

 

Many of us are the last people to live and work sans Internet.  It's really different now.  I do find it interesting how retro continues to attract younger people.  They, just like us, want to play, understand, game in simple ways, make shit.  It's cool.  Maybe our favorite era will always be cool, like a moment in time popular to explore.

 

 

 

4 minutes ago, potatohead said:

Many of us are the last people to live and work sans Internet.  

 

 

 

You mean the web; also, it hasn't been around that long, lol.  It'll be another 40 years before the last of the people who grew up (with enough years) before the web and reliable internet start dying off.  Unless you were in your 20s or 30s when the VCS came out.

5 minutes ago, potatohead said:

Yeah, but it was less difficult and more efficient overall than banking in smaller address spaces.  Yeah, I did not actually buy into a PC, other than the $20 POS I got going with, until the 386.  Felt the same way.

Similar problem, I agree.    It may have been more efficient than bank switching, but I don't think it was necessarily easier to program lol!   

2 minutes ago, Leeroy ST said:

You mean the web; also, it hasn't been around that long, lol.  It'll be another 40 years before the last of the people who grew up (with enough years) before the web and reliable internet start dying off.  Unless you were in your 20s or 30s when the VCS came out.

Just having an Internet in general.

 

What I mean, though, is that people growing up since roughly '10, for some earlier, will have grown up very differently.  Yeah, the big die-off is a ways off.  But the storytelling is happening now.  I've a 4-year-old granddaughter.  So many things are different.  It's a lot of fun, and I'm showing her some older things.  She LOVES VHS, for example.  "Papa, it gave it back to me, do I tell it thanks?"  (And she does.)

 

 

1 minute ago, potatohead said:

Just having an Internet in general.

 

What I mean, though, is that people growing up since roughly '10, for some earlier, will have grown up very differently.  Yeah, the big die-off is a ways off.  But the storytelling is happening now.  I've a 4-year-old granddaughter.  So many things are different.  It's a lot of fun, and I'm showing her some older things.  She LOVES VHS, for example.  "Papa, it gave it back to me, do I tell it thanks?"  (And she does.)

 

 

I think video media will be an issue before we worry about pre-internet-age stuff.  It's been over 15 years since the original Ultrabooks came out, and nearly every PC or laptop lacks a disc drive; tablets, of course, don't have one at all.  Talking about HD DVD is already something that blows people's minds.  Heck, people who were around at the time have forgotten what it is.  I still have my collection.

3 hours ago, potatohead said:

 

Yes, the early era was colorful and distinctive.  Fun ride!

 

I think we are headed toward another one, frankly.  Apple is moving to their own silicon.  Some big players are making custom CPU's maximized for given tasks.  ARM devices everywhere.

...

 

Funny how a company that dropped out of the computer market produced one of the most successful computer products to this day.


It is.  They took the longer road.  I like to think they saw the more subtle things and just worked them through.

 

Apple coming on board helped.  A.  Lot.


In reading all this stuff, I now wonder which computer has been more inspirational to me: the Apple II or the IBM PC and clones.

 

Early on, I was extraordinarily inspired in science and tech and electronics by the Apple II.  Those of you who've read my rants would know.  The Apple II had an advantage straight away because kids are typically impressionable, and the II was there at the right time.  A time of model rockets, electronic project kits, ham radio, Space Shuttles, space colonies, Star Wars, the microchip coming of age, the arcades, and so much more.  I tried in many ways to imaginatively tie the Apple II into most of that stuff, either make-believe or for real.

 

Come the early 1990s, my Apple II activities were winding down, and the PC was coming into focus, albeit slowly.  After having observed the phenomenal growth of the industry to date, which would only increase later, I started getting antsy, wanting to keep up with it all.  The II series had clearly run its course, and the Amiga wasn't nearly as upgradable as I was led to believe.  And there was no one around with whom to pirate software for it anyway.  It was a dead end.

 

Between 1988 and 1992, I had begun observing software for the PC.  Not out of choice at first; stuff was thrust at me by magazines at the local grocery store.  It was becoming clear that software development was in high gear for the PC and only picking up speed.  I had seen all kinds of cool astronomy programs and simulations with real hi-res graphics.  Real star charts.  Games were increasing in detail and complexity.  Flight simulators were becoming more realistic.  There were tools and utilities and business programs for everything under the Sun, and stuff I couldn't have imagined.  However, emulators were but farts in a dorm closet.  It would be another couple of years before they were to the point of being playable.

 

And the hardware was simply astounding!  PCs had microprocessors running at 16, 20, and even 33 or 40MHz.  There seemed to be an endless sea of expansion options: cards, peripherals, everything!  And "multimedia" hadn't even gotten underway.

 

SoundBlaster, 3D graphics, advanced DOS gaming, the Windows GUI.  Advanced hard disks that seemed to defy the laws of physics.  Chips with millions of transistors instead of the ~3,500 in the Apple //e's 65C02.  More memory than my local library.  And more!  All of it mine to discover!

Share this post


Link to post
Share on other sites
