
Apple II in low-end Market?


kool kitty89


Wow! Reading that article makes me really hate Apple now. I loved the II and did all my BBSing and warez and phreaking on it, in addition to more productive things.

 

By killing off the II series, it would seem they single-handedly handed the market to the IBM PC.

 

Maybe we are better off now for it, as there are applications on the PC that are not on the Mac.

 

And yes, I believe the IBM PC took a lot of what made the II successful and continued with it, the major part being the slots: processor up top and close to them, ROM sockets about midway down, and support chips and RAM below (and slightly to the right).

 

It's interesting to see that this general layout more or less continued until the ATX format rolled around.

 

(attached thumbnails: post-4806-0-94245700-1301587966, post-4806-0-35385900-1301587966)


  • 3 weeks later...

It could be that Apple didn't want to lose sales on the more pricey II series by offering a cheaper version. If they had had a cheaper computer, a lot of people may have bought it instead.

 

Or it could be that they would have increased sales by having a cheaper version, because people who wouldn't/couldn't buy a II or a Mac could then buy the cheaper computer.

Exactly, even if they did lose some of their high-end customers to the low-end models, they could have gained far more with others who wouldn't have bought anything at all.

Plus, they'd also be profiting from all the peripheral sales for the low-end models (they'd be expandable for sure, as I originally mentioned). And, again, they logically would have positioned the full multi-slot expansion module to be more expensive (with the price of the low end II included) than a standalone model with built-in slots.

 

And, again, not only could they have pushed for lower-cost models, but they could have repackaged the system in a form factor even more competitive against PCs (a proper modular desktop box).

I wouldn't cry too hard for Apple over II series sales.

They sold over 6 million of them if I remember right and they had the highest profit margin in the industry.

I believe they outsold Atari... just not Commodore.

 

A 2MHz Apple II would have made a lot of sense instead of the III. The problem with the III is it was too different from the II.

Perhaps integrate an 80 column card, a DAC or AY chip for sound, and remove the weird video memory map but keep the same resolution and color artifacting. It would have been very easy to port software to, and developers would probably write versions for it before a standard II/II+ due to the ease of programming.

 

*IF* they had done that then the IIe could have been a highly integrated version of that machine with added higher res graphics.

Yes, regardless of broadening their range in the market, they definitely should have focused on extending/evolving/supporting the Apple II. It was a simple machine, but very flexible and expandable, the same things that allowed PCs to dominate the mass market. (Actually, the simplicity was also a big part of that: easy to clone... had the ST been an open architecture design from the start, it probably would have had a lot more potential to expand similarly, if not in the US, at least in Europe.)

 

Atari engineers had initially wanted to push the 800 with Apple II-like expansion, but upper management forced it to be an "appliance computer". (In reality, they could have done both: make the 800 a proper Apple II-like system, with more integration and cost effectiveness, and pushed more limited lower-end models with the 400 and in between; better still if the 400 had 1090XL-type expansion support too.)

Expandability and compatibility is the name of the game, and a few companies got that right, but oddly, many of them ended up screwing that up later on. (Tandy did continue support with TRS-80 compatible machines, but they didn't do a good job of supporting it as an open standard -though it had certainly started as such- and in particular, they introduced the totally incompatible CoCo overlapping with the original TRS-80 market sector -granted, the Model II had also been largely incompatible with the Model I, but you could argue they could have managed those 2 lines in parallel for a mid range/low-end and more high-end/business oriented machine -or even merge the standards later on as they were fairly similar overall)

 

 

At any point in time, Apple could have dropped prices or created a low cost machine. But why bother when you have one of the top selling machines with a much higher markup?

To expand the overall market and offer a wide range of machines. (and expand into Europe where low-end machines dominated) Again, I'm not saying they should have dropped the higher-end stuff (they should have pushed harder into the top end with the II family), but I am saying that they should have broadened the line into a full array of compatible models that could cater to most of the mass market in general.

The problem with the III is it was too different from the II.

 

Woz disliked the III. And he didn't even like the Macintosh. He said "let's develop a GUI for the II" but was shot down. The IIc had color but the original Macs were black and white. And the IIc was a good looking machine. I can see why Woz wanted to develop a GUI for it instead of going with the Mac line. Development of the Mac was started in about '82 when the II was still a huge seller.

Heh, the Apple II had color back in 1977. ;)

 

Woz had a lot of good ideas that ended up getting crushed by the idiots running the company (especially Jobs). A shame Woz couldn't somehow stage a coup to drive them out of the company.

 

Actually, Jobs and Woz are a bit like Nolan Bushnell and Ted Dabney; both stood in the limelight and took credit that wasn't theirs (or not theirs alone). Hell, Jobs even learned a lot of his sensationalist PR BS from Nolan while at Atari.

Of course, one major difference is that Dabney was forced out of Atari pretty early on while Woz is still with Apple (more or less).

 

 

A shame that Jobs couldn't get his head out of his ass long enough to realize how successful the company could be if they combined his PR skills (and general marketing) with Woz's extremely insightful engineering vision and capabilities.

 

People like to think of Jobs as a visionary, but if anything, Wozniak had the best foresight of anyone in the company (if not some of the best of anyone ever in the industry). The Apple II was probably the most innovative and forward-thinking design to ever sport the Apple brand name, a shame they ended up running it into the ground. (rather deliberately too, especially demonstrated in that article)

Yes, a lot of potential there too, and multiple routes to go for evolving the platform. It took until the IIc Plus to get a faster CPU, but they at least could have bumped it to 2 MHz with interleaving in faster RAM (like the BBC Micro) or 3 or 4 MHz even (but then it would make more sense to switch to a wait state mechanism more like the A8's DMA rather than plain interleaving in fast RAM -or going to a dual bus design, but that's more costly).
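To put rough numbers on that trade-off (purely illustrative figures, not measurements from any real machine): with BBC Micro-style interleaving in fast RAM, CPU and video alternate bus slots and the CPU loses nothing, while an A8-style wait-state scheme stalls the CPU for whatever fraction of cycles video DMA steals. A quick sketch:

```python
def effective_mhz(clock_mhz, dma_stolen_fraction):
    """CPU throughput when video DMA halts the CPU for a fraction of cycles."""
    return clock_mhz * (1.0 - dma_stolen_fraction)

def interleaved_mhz(clock_mhz):
    """With fast RAM and alternating CPU/video bus slots, nothing is stolen."""
    return clock_mhz

# Hypothetical 2 MHz clock, assuming (for illustration only) that video
# DMA steals 30% of cycles; the real figure varies with display mode:
print(interleaved_mhz(2.0))     # full 2.0 MHz effective
print(effective_mhz(2.0, 0.3))  # 1.4 MHz effective
```

The point of the comparison is just that interleaving keeps the full clock, while cycle stealing scales throughput down linearly with however much the video system takes.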

The Zip Chip, which is the exact route they took for the IIc+.

Yeah, some years later, by 3rd parties (and finally with the IIc+).

 

They could have been pushing 2-3 (perhaps 4) MHz NMOS 6502s years earlier.

Aside from the previously mentioned wait state and faster DRAM options, they also could have implemented a small local SRAM buffer or "cache" (not a real dynamic cache, but similar in some respects) for the CPU to run at full speed off the main bus. (you could use that RAM for zero page registers and small amounts of high-speed code and data which could be updated as needed in software -sort of a software managed cache) They could have made the cache optional and expandable as well. (either via a special card slot or a DIP socket; if they introduced that in the early 80s, it would probably be most cost effective to start with a single 2kx8-bit SRAM chip with provisions for more and perhaps for 8k chips as well -using a slot would have been the most flexible and taken the least board space)
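The scratchpad idea can be sketched in a toy model (my own illustration, not anything Apple shipped; the cycle costs are made up): a small fast SRAM holds zero page plus whatever hot code/data software chooses to copy in, while everything else pays the shared-bus penalty.

```python
class ToyBus:
    """Toy model of a 650x bus with a small fast scratchpad SRAM.

    Cycle costs are invented for illustration: scratchpad hits run at
    full speed, main-DRAM accesses pay one extra (wait-state) cycle.
    """
    SRAM_SIZE = 2 * 1024          # e.g. a single 2Kx8 SRAM chip

    def __init__(self):
        self.sram = bytearray(self.SRAM_SIZE)
        self.dram = bytearray(64 * 1024)
        self.cycles = 0

    def read(self, addr):
        if addr < self.SRAM_SIZE:  # zero page/stack/hot region live in SRAM
            self.cycles += 1
            return self.sram[addr]
        self.cycles += 2           # shared main bus: one wait state
        return self.dram[addr]

    def load_hot(self, dest, data):
        """Software-managed 'cache': copy hot code/data into the SRAM."""
        self.sram[dest:dest + len(data)] = data
```

The "software managed" part is exactly the `load_hot` step: nothing is automatic, the program decides what lives in the fast region and refreshes it as needed.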

 

 

There's a lot of middle ground for the IIe to the IIGS in feature set. Maybe just a true 16 color bitmap mode (even if only 140x192),

The DHGR mode could be treated as 140x192x16 (using the same 16 colors as the 40x48 mode); I think at least one video card also did so.

 

You could have a GUI running in monochrome 560x192, and probably decently fast at Apple IIc Plus speeds.

(image: a2open.gif)

Yeah, like that. ;)

 

They could also have opted for relegating the 6502 as a coprocessor on a separate bus with all the audio, video, and I/O and add a 68k as the new CPU (sort of an Apple II/Mac hybrid that uses the old hardware relatively efficiently).

Something like the //e card for the Mac LC line?

No, that was just tacked onto the Mac; I'm talking about a ground-up design that makes use of the old hardware reasonably efficiently as an integral part of the system. (then again, there's plenty of argument for never going with the 68k at all and sticking with 650x compatible chips -you could even push custom or off the shelf coprocessor logic to address some shortcomings of the 650x, and tons of other potential for hardware acceleration for graphics and/or sound -then again, if the 650x had higher demand, perhaps it would have been extended further in general, perhaps with additional FPU coprocessors as well)

 

But beyond true architectural upgrades, there's form factor, and it may have been significant to offer the Apple II in an IBM-like desktop box+separate keyboard form factor with bays for internal disk drives, etc, and a keyboard more competitive with IBM's. (at least a keypad added)

Well, the //e Platinum (which I own) had a keypad, and there was a prototype IIgs with internal drives.

Yeah, but I'm talking about a full PC-like form factor with market positioning to match.

"Yes, but high volume low-margins could be just as profitable, if not more."

 

Well, I've not seen that. I've seen lots of money made that way, but I've also seen a lot of costs and few real opportunities to add value.

 

That's one thing about the Apple ][ that a lot of people don't get. It was expensive because it had a lot of value added. Great documentation, ROM, case, slots, device cards, etc... It was serious, and expandable, and that made it very capable. Games were weak, but on business / productivity / academic, Apples were very good. The other 8 bitters really didn't compare.

 

Those things don't happen with a lower end machine.

 

Apple would have only lost with a lower price offering.

They could have broadened the market in general though, cater to lower AND higher end than they were at the time.

 

The problem is that no company even tried that, and those that came close totally f*cked it up in one way or another (usually several ways). IBM almost managed it with the PCJr, but totally botched that... Tandy got it right and DID manage to offer a pretty wide range of machines at that.

One thing I hated about the II line was the memory banking introduced for 128K. The memory wasn't in one location like the language card; it also switched other areas of memory, which makes memory management more difficult.

Hmm, so it's not just a fixed location that gets banked to? (like the 130XE scheme -I think the CoCo III used that sort of scheme too)

Apple in the low-end market... not really... they weren't 'cheap and cheerful' like Commodore/Sinclair etc.

 

Commodore/Sinclair weren't much interested in innovation (or quality components, etc.), just getting current tech out of the door at an RBP.

Actually, from a hardware perspective, the Apple II was cheap, but it never got the positioning to be "cheerful" or cheap on the consumer end.

 

And AFAIK, Sinclair was ALL about invention and innovation; hell, the engineers at MOS/CBM ended up being pretty damn innovative too (just not marketed quite as such). A shame that most of Sinclair's projects ended up being market failures (the ZX80/81/Spectrum were the main exceptions).

I've often wondered if Steve Jobs wanted the Lisa/Mac to take off and the II to die as a visionary thing (it's the future), or was it because those were his baby and Steve had to share the limelight with Woz on the II?

Honestly, I think that once the Apple II had served its purpose (in Jobs' mind), he couldn't have cared less about it. What's fascinating about him is that he doesn't have a shred of nostalgia for past achievements. He'll be thrilled to have his company popularize the next big paradigm shift in popular electronics, and will then gleefully bury it as soon as he can for the next big shift.

 

http://webcache.googleusercontent.com/search?q=cache:QgPhaEoL_jAJ:www.netherworld.com/~mgabrys/clock/weak5.html&hl=en&gl=us&strip=1

 

This paints a pretty clear picture of what Apple management's mindset was regarding the Apple II, and it actually didn't depend on Jobs' presence.

That link isn't working, so, here: http://replay.web.archive.org/20080820003428/http://www.netherworld.com/~mgabrys/clock/weak5.html

 

That's really sad, but definitely an amazing insight into Apple.

Woz (and his supporters -and the Apple II guys in general) had it right, but Jobs/marketing ruined it with the disasters that were the III, Lisa, Mac, etc. (and all the while downplaying the Apple II as obsolete crap)

 

 

With that in mind, my whole supposition on expanding the II to low (and higher) end machines seems rather insignificant. They just needed to push the Apple II more in general, regardless of broadening the market model. (hell, a broader range would have been inevitable with more support in general)

 

It still boggles my mind that a company would devote as little attention as possible to the product that was generating most of its revenue. That's just lunacy. But if you look at other microcomputer companies from back then, they always assumed that there was no way a single product would last more than a few years. They didn't seem to understand the concept or implications of a persistent legacy architecture - one could forgive them for this, since the microcomputer industry was still extremely young, but there was a ton of precedent for it from the mainframe and minicomputer industries. Making that assumption would have been the riskier bet, though.

To be fair, there WERE several companies who actively pushed legacy support, though many in an awkward manner. Atari Inc kept pushing the A8 (and were finally addressing some of the long-held limitations of the system with the 1090XL and such, albeit something engineers had wanted since day 1), and Tandy extended the TRS-80 with the Model III and 4, though they ended up limiting their market in general with rather modest enhancements and the incompatible CoCo in the low end. (The Model II and related line was incompatible too, of course, though probably close enough to merge with the Model I/III line if they'd wanted to later on.)

 

 

Hell, Bushnell (and some others at Atari) had wanted to kill off the 2600 back in '78 when it was slipping (and had intended it to be a short-lived machine), but Warner management ended up pushing on with the system, and it ended up getting absolutely massive in the following years and was still marketable through the late '80s. Warner made plenty of other mistakes, but keeping the 2600 going certainly wasn't one of them; limiting the 8-bit to an "appliance computer" was one of those problems, as was the fatally flawed distribution network. (And there's plenty of other areas where Warner Atari management was significantly better than Bushnell Atari had been, like better business sense in general. A shame they didn't have someone like James Morgan in there back in '79, with more effort to limit Warner interference on top of that.)


Cool thread! Whether this belongs here or not - What I found interesting was that AppleWorks was originally written for the /// series computer. And then re-worked slightly for the //e and //c with MouseText. I believe it was Woz that pushed that issue.

 

As we all know, AppleWorks was one of the premier applications at the time, and Woz was concerned that the ///'s reliability would give a bad image to a stellar piece of software.

 

It was technical decisions like that which made Woz famous, aside from his frugality when working with IC components and yet still producing amazingly versatile circuitry that could be prodded to do much more than it was originally intended.

 

You don't see many engineers that have that style today, either when working with hardware or software!

 

On a side note, you don't see many other early computer companies with as colorful and vibrant a history as Atari, Commodore, and Apple.

 

RS/Tandy seemed a staple of offices and business and some electronic technician/engineer hobbyists. And ESPECIALLY CB/HAM radio operators. It had something to do with the way the letters "Z-80" looked. Or something. Yes! It's absolutely true!

 

And TI, well I don't know where the hell they fit in.

Edited by Keatah

Expandability and compatibility is the name of the game, and a few companies got that right, but oddly, many of them ended up screwing that up later on. (Tandy did continue support with TRS-80 compatible machines, but they didn't do a good job of supporting it as an open standard -though it had certainly started as such- and in particular, they introduced the totally incompatible CoCo overlapping with the original TRS-80 market sector -granted, the Model II had also been largely incompatible with the Model I, but you could argue they could have managed those 2 lines in parallel for a mid range/low-end and more high-end/business oriented machine -or even merge the standards later on as they were fairly similar overall)

The CoCo may not have been compatible with the I/III, but it was Tandy's top seller for pretty much its entire life.

 

What's strange, is that the Laser/VZ 110/200/300 machines were actually TRS-80 clones with a 6847 and hacked ROMs. Sort of a Z80 CoCo equivalent if you will but minus a few features of the CoCo.

It would have been interesting to see what would have happened if Tandy had gone that route for a Color Computer.

It still would have required software to be rewritten, but I found everything but graphics to be fairly easy to port to the CoCo anyway. Ok, so there is assembly as well but I've ported from Z80 to 6809 and it wasn't that bad.

 

One bad thing, if Tandy had done that, would be that we wouldn't have any major 6809-based machines to play with, or the CoCo's 6803-based little brother, the MC-10.

 

One thing I hated about the II line was the memory banking introduced for 128K. The memory wasn't in one location like the language card; it also switched other areas of memory, which makes memory management more difficult.

Hmm, so it's not just a fixed location that gets banked to? (like the 130XE scheme -I think the CoCo III used that sort of scheme too)

The CoCo 1/2 replaced the ROM area with RAM (32K).

The CoCo3 does that and has an MMU that lets you mess with the memory layout for the rest of the expansion RAM.

 

The first IIe's just switched RAM in place of ROM like the CoCo 1/2. It was basically the equivalent of the II+ with a language card. Link

When the enhanced IIe, upgradeable to 128K, was introduced, the new banks were all at different addresses with the exception of added RAM in the language card area.

This banking layout did have the advantage of giving you an auxiliary stack and page zero... but I'm not sure if anyone actually took advantage of it.

Link
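The complaint is easier to see in a toy model. This is a deliberate simplification of the real IIe scheme (RAMRD/RAMWRT/ALTZP do correspond to real soft switches at $C002-$C009, but the real hardware has more special cases, e.g. 80STORE and the language-card area, which I'm ignoring): reads and writes can be banked independently, and zero page/stack follow a third switch.

```python
class AuxBanking:
    """Simplified model of IIe 128K auxiliary-memory switching.

    RAMRD/RAMWRT/ALTZP mirror the real soft switches at $C002-$C009;
    address-range handling is simplified (no 80STORE, no language card).
    """
    def __init__(self):
        self.main = bytearray(64 * 1024)
        self.aux = bytearray(64 * 1024)
        self.ramrd = False    # True: reads come from aux RAM
        self.ramwrt = False   # True: writes go to aux RAM
        self.altzp = False    # True: zero page and stack use aux RAM

    def _bank(self, addr, use_aux):
        if addr < 0x0200:                 # zero page + stack follow ALTZP
            return self.aux if self.altzp else self.main
        return self.aux if use_aux else self.main

    def read(self, addr):
        return self._bank(addr, self.ramrd)[addr]

    def write(self, addr, value):
        self._bank(addr, self.ramwrt)[addr] = value
```

Because RAMRD and RAMWRT can point at different banks, a copy loop can read main and write aux without toggling anything per byte, but it also means software has to track three switches instead of one fixed banking window.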

 

 

With that in mind, my whole supposition on expanding the II to low (and higher) end machines seems rather insignificant. They just needed to push the Apple II more in general, regardless of broadening the market model. (hell, a broader range would have been inevitable with more support in general)

What support are you looking for here?

It was the 2nd biggest seller as it was. The only computer that rivaled the II in software was the C64 and that didn't have near the business software of the II.

The II series still had advertising even after the III and Mac were introduced so it's not like there was no support at all. Could they have pushed it and got more of everything... probably, but I don't see it overtaking the C64 unless they revamped it at some point which is why I suggested what I did. The revamp would have had to have been early in the II's life to make much difference.

 

The way I see the II series doing better is if they had jumped from 1MHz to 2MHz (ok, maybe 1.76 MHz) and dropped the video memory interleave. Adding something like a DAC for sound could have been done at any time and would have been really cheap to add. As it was, double hi-res graphics weren't introduced until the Apple IIe B motherboard and they were even worse to program. It wasn't until the IIgs that the nightmare screen layout went away, but only in its new graphics modes.
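For anyone who hasn't fought it, the "nightmare screen layout" is the non-linear scanline addressing: consecutive hi-res lines are interleaved through memory rather than stored sequentially. The well-known address formula for page 1 (based at $2000) can be written as a quick sketch:

```python
def hires_addr(y, base=0x2000):
    """Base address of hi-res scanline y (0-191), page 1 at $2000.

    Three interleave levels: lines step by 0x400, groups of 8 lines
    by 0x80, and thirds of the screen by 0x28 (40 bytes per line).
    """
    assert 0 <= y <= 191
    return base + (y % 8) * 0x400 + ((y // 8) % 8) * 0x80 + (y // 64) * 0x28

# Adjacent scanlines land far apart in memory:
print(hex(hires_addr(0)))   # 0x2000
print(hex(hires_addr(1)))   # 0x2400
print(hex(hires_addr(2)))   # 0x2800
```

So stepping down one pixel row means recomputing (or table-looking-up) a scattered address rather than just adding 40, which is a big part of why hi-res code leaned so heavily on lookup tables.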

 

If you look at how many companies dropped out of the 8 bit computer race due to reduced profits from price drops, I think it's obvious the only option was to add features for the money or enhance what was already there.

They did that but screwed up by wasting time on the III rather than upgrading the II+. Frankly, WOZ should have made more improvements when he designed the II+.

Edited by JamesD

Expandability and compatibility is the name of the game, and a few companies got that right, but oddly, many of them ended up screwing that up later on. (Tandy did continue support with TRS-80 compatible machines, but they didn't do a good job of supporting it as an open standard -though it had certainly started as such- and in particular, they introduced the totally incompatible CoCo overlapping with the original TRS-80 market sector -granted, the Model II had also been largely incompatible with the Model I, but you could argue they could have managed those 2 lines in parallel for a mid range/low-end and more high-end/business oriented machine -or even merge the standards later on as they were fairly similar overall)

The CoCo may not have been compatible with the I/III, but it was Tandy's top seller for pretty much its entire life.

 

What's strange, is that the Laser/VZ 110/200/300 machines were actually TRS-80 clones with a 6847 and hacked ROMs. Sort of a Z80 CoCo equivalent if you will but minus a few features of the CoCo.

It would have been interesting to see what would have happened if Tandy had gone that route for a Color Computer.

It still would have required software to be rewritten, but I found everything but graphics to be fairly easy to port to the CoCo anyway. Ok, so there is assembly as well but I've ported from Z80 to 6809 and it wasn't that bad.

Yeah, I was just thinking of the potential for Tandy to have pushed a general, flexible, standard range of machines that built on the Model I and II, from low-end home computing to hobby to education to business. (I think the monitors used on the Model I and Model II were using TV sync rates too, so no direct conflict there either, other than RF possibly making 80-column text impractical; composite with colorburst disabled should have been fine though.)

 

Maybe they could have even invested in their own graphics ULA that combined the Model I/II text mode logic and added color capabilities and maybe bitmap graphics. (allowing RAM defined text characters could have been more useful though -and not that hard to hack as bitmap graphics either)

 

One bad thing, if Tandy had done that, would be that we wouldn't have any major 6809-based machines to play with, or the CoCo's 6803-based little brother, the MC-10.

Yes, at least outside of Japan. (where you had the neat FM-7 with dual 2 MHz 6809s -one dedicated to manipulating graphics in the 640x200 3-bit RGB framebuffer)

Then again, without the CoCo, Fujitsu might not have gone in that direction either.

 

Having more 6809 based machines in general would have been neat (I wonder if IBM ever considered that when developing the PC). Then again, there were a lot of cheaper (if not more cost effective performance-wise) alternatives on the market; similar to how the 6800 wasn't as popular as cheaper contemporaries with reasonably comparable performance. (Of course, Motorola never pushed hard for competitive pricing with the Z80 or 6502 AFAIK, so that was a major factor.)

 

The first IIe's just switched RAM in place of ROM like the CoCo 1/2. It was basically the equivalent of the II+ with a language card. Link

When the enhanced IIe, upgradeable to 128K, was introduced, the new banks were all at different addresses with the exception of added RAM in the language card area.

This banking layout did have the advantage of giving you an auxiliary stack and page zero... but I'm not sure if anyone actually took advantage of it.

Link

I wonder why they didn't just add more banks to the same address range as the language card RAM disk.

 

With that in mind, my whole supposition on expanding the II to low (and higher) end machines seems rather insignificant. They just needed to push the Apple II more in general, regardless of broadening the market model. (hell, a broader range would have been inevitable with more support in general)

What support are you looking for here?

It was the 2nd biggest seller as it was. The only computer that rivaled the II in software was the C64 and that didn't have near the business software of the II.

The II series still had advertising even after the III and Mac were introduced so it's not like there was no support at all. Could they have pushed it and got more of everything... probably, but I don't see it overtaking the C64 unless they revamped it at some point which is why I suggested what I did. The revamp would have had to have been early in the II's life to make much difference.

I'm thinking more of the Apple II missing out on the prospect of becoming a persisting market standard like IBM ended up doing. (except IBM -and 3rd parties- kept it going with evolutionary/compatible models rather than doing weird things like Apple had)

 

Of course, IBM also failed to go into the lower-end market (they almost did that with the PCJr, but they screwed that up -Tandy shows more what it could have been, albeit without the same sort of push IBM could have had). Actually . . . IBM also started screwing up the "compatibility/expandability" aspect when they tried to set up proprietary standards retroactively (one of the main problems with the PCJr and, of course, the PS/2 line).

 

The way I see the II series doing better is if they had jumped from 1MHz to 2MHz (ok, maybe 1.76 MHz) and dropped the video memory interleave. Adding something like a DAC for sound could have been done at any time and would have been really cheap to add. As it was, double hi-res graphics weren't introduced until the Apple IIe B motherboard and they were even worse to program. It wasn't until the IIgs that the nightmare screen layout went away, but only in its new graphics modes.

 

If you look at how many companies dropped out of the 8 bit computer race due to reduced profits from price drops, I think it's obvious the only option was to add features for the money or enhance what was already there.

They did that but screwed up by wasting time on the III rather than upgrading the II+. Frankly, WOZ should have made more improvements when he designed the II+.

They could have pushed a lot harder in general, kept the simpler baseline models in the lower-end range (continually consolidated for lower cost), and expanded models in the mid-range to higher end market. (eventually having a full successor for the next generation built onto the same architecture)


Yeah, I was just thinking of the potential for Tandy to have pushed a general, flexible, standard range of machines that built on the Model I and II, from low-end home computing to hobby to education to business. (I think the monitors used on the Model I and Model II were using TV sync rates too, so no direct conflict there either, other than RF possibly making 80-column text impractical; composite with colorburst disabled should have been fine though.)

 

Maybe they could have even invested in their own graphics ULA that combined the Model I/II text mode logic and added color capabilities and maybe bitmap graphics. (allowing RAM defined text characters could have been more useful though -and not that hard to hack as bitmap graphics either)

Ever hear of an LNW-80? It was a Model I clone with high-res graphics, and it could produce color. It also had a faster CPU. I really wanted one of those when I first started looking at computers, but it was way more than my parents could afford. It's probably what the III should have been, hardware-wise.

 

Having more 6809 based machines in general would have been neat (I wonder if IBM ever considered that when developing the PC). Then again, there were a lot of cheaper (if not more cost effective performance-wise) alternatives on the market; similar to how the 6800 wasn't as popular as cheaper contemporaries with reasonably comparable performance. (Of course, Motorola never pushed hard for competitive pricing with the Z80 or 6502 AFAIK, so that was a major factor.)

I think Tandy had special pricing on the CoCo chips. The only other major 6809 machine was a French one from Thomson. They were a second source for the 6809 for military applications, so it was an obvious choice for them.

 

I wonder why they didn't just add more banks to the same address range as the language card RAM disk.

It probably saved chips in WOZ's design. :roll:

I think WOZ is brilliant but I think he spent too much time trying to be clever. A little less clever and a couple more chips might have made the machine much more capable. But then I'm not even sure WOZ designed the IIe.

 

I'm thinking more of the Apple II missing out on the prospect of becoming a persisting market standard like IBM ended up doing. (except IBM -and 3rd parties- kept it going with evolutionary/compatible models rather than doing weird things like Apple had)

Without a faster CPU the II had nowhere to go. The IIgs could have been enhanced a bit more but the 6502 was a dead end and Apple had another machine that clearly had a future. After all, we still have Macs today.

 

Of course, IBM also failed to go into the lower-end market (they almost did that with the PCJr, but they screwed that up -Tandy shows more what it could have been, albeit without the same sort of push IBM could have had). Actually . . . IBM also started screwing up the "compatibility/expandability" aspect when they tried to set up proprietary standards retroactively (one of the main problems with the PCJr and, of course, the PS/2 line).

Tandy did all right in the PC market for a while but as the margins slipped away and the machines got more complex (3D graphics) their niche evaporated and they quit.

 

They could have pushed a lot harder in general, kept the simpler baseline models in the lower-end range (continually consolidated for lower cost), and expanded models in the mid-range to higher end market. (eventually having a full successor for the next generation built onto the same architecture)

And again, what you are suggesting would have questionable results.

Take a look at Commodore. Sure, they outsold everyone... but they didn't make as much money as Apple.

And like I said... no faster 6502 = dead end.

They had a II card that plugged into a Mac, and once software emulation hit, even the card was done.

And Apple does have a standard that exists to this day... it's called Macintosh.

I think the II series could have done better with a major push from Apple, but cheaper wasn't necessarily better and it was destined to disappear.

Link to comment
Share on other sites

After reading that link above, it would have been really interesting for Apple to go ahead and provide good expansion hardware for the ][, and software to bridge the gap to the Mac.

 

True, the 6502 was a dead end compared to other chips.

 

However, the ][ could easily manage another CPU, as it did with the CP/M card, carrying a lot of software forward, while also offering better multi-media capabilities.

 

The very simple and open design of the ][ had exactly the same characteristics as the PC, and could have easily displaced it more than it did.

 

Apple later demonstrated that a user base could be migrated across very significant changes, as that is exactly what was done with the Mac multiple times.

 

The ][ was a very serious computer with a lot of capability. Wasn't cheap, but that didn't always matter. PCs were not cheap either. The capability mattered to business, which is why the ][ did as well as it did. Apple proved out how to do general purpose computing with the ][, and the success of the PC vetted that nicely.

 

Jobs really was ahead of the tech, and his own fetish cost a lot of money then. IMHO, those issues cost users, and he knew that and was perfectly willing to exploit them.

 

Now, with tech at a state where elegant can be realized, Apple makes very good margins on its Mac hardware. The ][ could have stunted the PC's growth, perhaps getting Apple farther along earlier than they achieved otherwise, and the user base would have seen more value for their investment; they were the real losers in that mess.

 

Both Woz and Jobs are interesting people to me. Woz wanted it to work, and to be built upon, and generally made trade-offs that software could account for. I believe strongly in that idea, and it still often plays out well in the embedded space. Doing the most in software isn't always as sexy, but it is very robust, and robust is worth quite a bit in general purpose computing.

 

Jobs valued things differently, and his inability to leverage both, for fear that "good enough" would trump "elegant" led him to do what he did.

 

Today, Apple enjoys considerable margin, and has a lot of money in the bank despite never reaching a dominant share. Jobs is spot on about the value of things, and the most interesting observation to me is that users, who value their time, are perfectly willing to be exploited in return for value added that saves them that time. There will always be a sizable, but not dominant fraction of us, who will value things "elegant", and they will pay for that.

 

I find it interesting that a $2000 Mac Book Pro won't come close to the computing performance a $2000 Lenovo does. (yes, I have both right now for work, and get to evaluate both in daily use for some proof of concept type work that is cross platform, OS X, Windows, Linux) But, the Lenovo machine is not "elegant" in the way the Mac is, running hotter, requiring more fiddling, etc... where the Mac is this sexy thing that, for the most part, just works, so long as the things I want to do are accounted for by Apple.

 

As soon as I leave that safety bubble, the value of the Mac drops considerably. As a Unix machine, it's a bit fiddly, and as a Windows or Linux machine, it's underpowered.

 

Interestingly, I've run OS X on my existing Lenovo, and it's a bit poor for that OS too.

 

Jobs knows that a holistic design can nail specific niches, and that's where the money is, which is why he never could accept growth in the ][. Woz understood that a great foundation, where value can be added in lots of ways, can be adapted to do well at most things, and there is money in that too.

 

Two polar opposites, and a very interesting tech story!


A Zip Chip and a business expansion or two would have done nicely to impact the first wave of PCs, which were not all that quick relative to the Apple. Or...

 

 

The GS upgrade board fit into a //e, and could have been made available much sooner, for example. That would have looked very favorable to the PC then, because the base of users software, peripherals, etc... would have added a ton of value.

 

Even today, Apple shows clearly that the package can sell very nicely against the performance. Would have done well back then too. Some of the people moving off of CP/M would have picked Apple, with the majority still going for the PC, but Apple would have made a ton of money taking that path, able to fund the Mac, and grow the ][ for a while.

 

 

I still think the PC would have won the day. But there was more in the // than we saw, and the GS would have done a hell of a lot better had that been done. Perhaps Jobs would not have been seen as the enemy to revenue, and the Mac would have advanced better, faster as well.

 

As it was, choosing the PC was mostly a no brainer then. The //e with GS upgrade, released earlier as it could have been, would have changed that equation for a lot of people.


The GS upgrade board fit into a //e, and could have been made available much sooner, for example. That would have looked very favorable to the PC then, because the base of users software, peripherals, etc... would have added a ton of value.

It's pretty hard to make a GS upgrade when the 65816 was late coming out and the first batch or two of CPUs Apple received didn't even work.

And you can't do much development and testing of new hardware, and a new OS without the CPU.

Apple could have released it sooner but not by a wide margin.


Didn't know that.

 

Well, scratch that idea.

 

Fall back on the second one. The ][ could very easily have seen some expansion and software to do lots of small business productivity stuff, and it would have been selected in the market just for sheer inertia, if nothing else.

 

The story above is stunning really. Million bucks a week on press for the Mac, built on the back of the very profitable ][.

 

Now I want a nicely equipped //e even more! Won't get it right away, but I will eventually, when I have a bit of room. Then I'll build a card for it, and stick a Propeller in there! Yeah, nobody will give a shit, but I'll be happy on that day.

 

Just read today that 40 percent of Apple revenue comes from the iPhone. Gadgets is what Jobs really wanted to sell. Elegance, function, form. Gotta respect that vision, because now it makes a lot of sense. Didn't then though. Tech wasn't there yet.

 

IMHO, he's a lucky guy really. Given how badly he managed the cash cow, the fact that it paid anyway is just amazing, and probably the reason we actually see the Apple stuff we do today.

 

(none of which I'll ever bite on, BTW)


You know there is another parallel here, and that's SGI.

 

Back in the higher-end UNIX days, SGI and its IRIX (which took a while to really get solid -6.3 and 6.5 rocked) executed a lot like Jobs would have. The desire for extremes of form wasn't there, but elegance, power, and sophistication were.

 

An Indy or O2 was a great multi-media machine. Did lots of video and audio work on those things, along with some high-end CAD. Best damn computing experience I ever had, bar none!

 

The others -HP, Sun, IBM- made somewhat faster machines, but they didn't produce the stuff that SGI did, and they didn't add value like SGI did; many of those same touches show up in the Apple line today, by the way.

 

OS X is a Unix done right for the ordinary person. It's a bit funky, if you are used to Unixes, but otherwise excellent. Been running a Mac Book Pro for a while, and I have to say, it corners very well while still gunning hard on the straight stretches, to use a car analogy.

 

Back then, in the early 90's, one could get more compute and solid graphics for less on the other boxes, but you had to know your stuff more than you did on the SGI computers. And the extras brought in a lot of margin, with people paying very well for good, sexy tech.


Ever hear of an LNW-80? It was a Model I clone with high-res graphics and it could produce color. It also had a faster CPU. I really wanted one of those when I first started looking at computers, but it was way more than my parents could afford. It's probably what the Model III should have been, hardware-wise.

Interesting, but that's a couple years later than I was thinking . . . more like the follow-on cross-compatible model that encompassed the Model I/II/Color systems.

I was thinking of a more rudimentary upgrade in the same timeline and price range (at least for bottom-end models) as the CoCo (1980), including TV output along with backwards compatibility with Model I modes.

 

Hell, if they were still going to go with the cheaper 2.5 MHz Z80 (rather than the Z80A), they could at least bump it up to 2.38 MHz from the 1.79 MHz used in the Model I. (using a 7.16 MHz master clock -for commonality with the NTSC color clock) Though I think Z80As were pretty cheap by that point too. (after all, Sinclair was using them in the super low-end ZX80 at the time)
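For anyone checking the arithmetic, those figures all fall out of the NTSC colorburst frequency; a quick Python sketch (the /4 and /3 dividers are just the ones implied above):

```python
# NTSC colorburst is defined as 315/88 MHz (~3.579545 MHz).
NTSC_COLORBURST_HZ = 315e6 / 88

master = 2 * NTSC_COLORBURST_HZ   # ~7.159 MHz master clock (the "7.16 MHz")
model_i_ish = master / 4          # ~1.79 MHz, the Model I figure above
bumped = master / 3               # ~2.386 MHz, the "2.38 MHz" bump

print(round(master / 1e6, 3))       # 7.159
print(round(model_i_ish / 1e6, 3))  # 1.79
print(round(bumped / 1e6, 3))       # 2.386
```

So the "2.38 MHz" option is just a different tap off the same color-friendly master clock.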

 

I think Tandy had special pricing on the CoCo chips. The only other major 6809 machine was a French one from Thomson, who were a 2nd source for the 6809 for military applications, so it was an obvious choice for them.

Yes, but the question is whether it would have been cheaper to use a generic ULA (probably with some discrete logic -at least initially) and common Z80 CPUs.

 

I wonder why they didn't just add more banks to the same address range as the language card RAM disk.

It probably saved chips in WOZ's design. :roll:

I think WOZ is brilliant but I think he spent too much time trying to be clever. A little less clever and a couple more chips might have made the machine much more capable. But then I'm not even sure WOZ designed the IIe.

Couldn't they also have invested more into custom chips (or cheaper -but more limited- ULAs/PLAs/etc) to save on overall chip count, board space, and long-term cost?

 

Without a faster CPU the II had nowhere to go. The IIgs could have been enhanced a bit more but the 6502 was a dead end and Apple had another machine that clearly had a future. After all, we still have Macs today.

The 6502 alone wasn't a dead end though; there were plenty of options for faster CPUs (even the NMOS 6502 went up to 4 MHz, though I think 3 MHz was the highest-rated version to be relatively common), plus the 65C02 (and the further-extended R65C02) and the 65816. (though the advantages of that over faster C02s are relatively limited)

Then you've got the fact that demand for the 6502 declined (especially in terms of pushing for higher performance models), so there wasn't much incentive to push the architecture further. (imagine what would have happened to x86 if IBM hadn't adopted it ;))

 

However, once you did hit a wall with the architectural limitations, you could invest in a custom derivative of the design (especially feasible with the low licensing costs of the 650x series), and that would also be attractive prior to really needing to extend the architecture significantly. (doing things like Hudson did with integrated MMU/banking logic, I/O, sound, and some added instructions)
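On the banking side, the general idea is just a writable register that selects which RAM bank appears in a fixed CPU address window. A toy Python model of that (the window size, addresses, and names are made up for illustration, not the actual Apple II language-card map):

```python
WINDOW_BASE = 0xC000   # start of the banked window (illustrative)
WINDOW_SIZE = 0x4000   # 16 KB visible at a time

class BankedMemory:
    """64 KB CPU address space with switchable banks in the top 16 KB."""
    def __init__(self, n_banks):
        self.low = bytearray(WINDOW_BASE)                        # fixed RAM
        self.banks = [bytearray(WINDOW_SIZE) for _ in range(n_banks)]
        self.current = 0                                         # "soft switch"

    def select(self, bank):
        self.current = bank

    def read(self, addr):
        if addr < WINDOW_BASE:
            return self.low[addr]
        return self.banks[self.current][addr - WINDOW_BASE]

    def write(self, addr, value):
        if addr < WINDOW_BASE:
            self.low[addr] = value
        else:
            self.banks[self.current][addr - WINDOW_BASE] = value

mem = BankedMemory(4)
mem.select(0); mem.write(0xD000, 0xAA)   # same CPU address...
mem.select(1); mem.write(0xD000, 0xBB)   # ...different physical bytes
mem.select(0)
print(hex(mem.read(0xD000)))             # 0xaa
```

The point is that the CPU never sees more than 64 KB at once; the extra RAM costs only the bank register and some decode logic.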

Or you could opt to switch architectures entirely and only provide support via emulation (much easier for high-level/OS-driven programs), or tack on the old hardware for compatibility and auxiliary processing (again, facilitated by the low-cost licensing of the 650x chips). A replacement CPU could be off the shelf, or totally custom and in-house . . . like Acorn did. ;)

 

And aside from pure CPU performance, you could address some limitations with off the shelf and/or custom coprocessors. (ALU/multiply/divide units, perhaps floating point, DSPs, blitters, etc, etc)

 

One problem was that other off-the-shelf architectures were limited: MIPS was still pretty expensive in the late 80s, ARM hadn't opened to 3rd parties yet, and that left x86 and 68k. (68k seems the better choice for the time, but in hindsight you'd hit the obvious wall again in the mid 90s, where you'd need another architectural shift)

In that respect, it probably would have been safest to milk 650x for as long as possible before making a transition. (pushing for higher and higher clock speeds as available -and as memory speeds allowed, transition to the '816, probably implementing some sort of fast SRAM buffer/cache to allow faster operation than DRAM of the time allowed -at least outside of fast page accesses, and filling in the rest with coprocessors)

 

 

They could have pushed a lot harder in general, kept the simpler baseline models in the lower-end range (continually consolidated for lower cost), and expanded models in the mid-range to higher end market. (eventually having a full successor for the next generation built onto the same architecture)

And again, what you are suggesting would have questionable results.

Take a look at Commodore. Sure, they outsold everyone... but they didn't make as much money as Apple.

Commodore didn't do what I was suggesting either, and they had horrible management problems as well (possibly worse than Apple, or Apple was luckier). If you look at CBM from 1981-1984 compared with Apple in that timeframe, I'd suspect CBM was making a lot more (taking investment spending into account of course -ie money CBM was using to build up capital rather than retain it as liquid assets). The biggest caveat would be the brief period where CBM was selling at a loss (at least after rebates) in 1983, but I doubt that was enough to make it up, especially if you take Europe into account. (again, one of Apple's gaping weak points that a low-cost model could have corrected)

 

CBM made a mess of things, more so after Tramiel left (with some exceptions, but mostly worse). They managed to lose their business/education market moving forward from the PET, jumped to the low-cost consumer market, then made a mess with a bunch of unnecessary and overlapping products, brought in the Amiga, and finally had a broad range of compatible/expandable machines with the later gen Amiga line, but that took some 5 years after the Amiga's launch. (the Amiga's marketing and market positioning was pretty screwed up too)

 

And like I said... no faster 6502 = dead end.

There were plenty of options for faster 6502s though, and other options for acceleration as mentioned above. (and, of course, they could have reasonably been pushing 2-3 MHz models back in the early 80s)

 

They had a II card that plugged into a Mac and once software emulation hit, the card was even done.

And Apple does have a standard that exists to this day... it's called Macintosh.

I think the II series could have done better with a major push from Apple, but cheaper wasn't necessarily better and it was destined to disappear.

What about having the II evolve and expand like the PC?

The GS upgrade board fit into a //e, and could have been made available much sooner, for example. That would have looked very favorable to the PC then, because the base of users software, peripherals, etc... would have added a ton of value.

It's pretty hard to make a GS upgrade when the 65816 was late coming out and the first batch or two of CPUs Apple received didn't even work.

And you can't do much development and testing of new hardware, and a new OS without the CPU.

Apple could have released it sooner but not by a wide margin.

Why not ditch the '816 for the time being and go straight for faster 6502s/C02s earlier on? (the IIC Plus came really late)

Edited by kool kitty89

After reading that link above, it would have been really interesting for Apple to go ahead and provide good expansion hardware for the ][, and software to bridge the gap to the Mac.

Or continue to evolve the II to a system comparable to the mac, but fully backwards compatible. ;) (and potentially better than the mac in other areas too, depending on the design philosophy)

 

True, the 6502 was a dead end compared to other chips.

Not really any more so than the Z80; the 68k was a dead end in the long run too, it just took longer. ;)

 

However, the ][ could easily manage another CPU, as it did with the CP/M card, carrying a lot of software forward, while also offering better multi-media capabilities.

Yes, a "tacked on" multi-CPU route would be one option, but using more powerful 650x models would make more sense up to the late 80s at least. (let alone other coprocessing)

 

In hindsight, prolonging architectural transitions for as long as possible would be the safest route given how many architectures ended up falling out of favor. That, and who knows what might have happened to 650x if it had continued to be popular on machines that penetrated the market like x86/PC did. (x86 wasn't particularly good, but the demand drove investment in extending the architecture)

 

The very simple and open design of the ][ had exactly the same characteristics as the PC, and could have easily displaced it more than it did.

Yes, if not displaced it entirely. (ie expanded the 1st-party market wide enough -and with an expanded product range of higher and lower end models- that you got clones on the level of PCs and 3rd-party interest over PCs ;))

 

Apple later demonstrated that a user base could be migrated across very significant changes, as that is exactly what was done with the Mac multiple times.

Yes, though that was also a niche market and those transitions were often somewhat sloppy in nature.

In the context of the Apple II being REALLY big, you'd have to think more in the context of what IBM/PC manufacturers and MS did with hardware and software extensions (technically I believe even modern PC hardware is fundamentally compatible on a low-level with much older hardware, but the software/high-level side of things is a bit of a different story).

 

And, again, the very fact of being a market-driving force could have reshaped things to the extent of not having to make such drastic changes. The Mac had to make changes because it had a tiny market share and wasn't driving the mass market (and also used certain parts that didn't get strong mass-market support -plus Motorola's odd decision to clamp down on 2nd sourcing/licensing from the 68020 onward), while the PC architecture became so popular (on top of being a fundamentally evolving architecture -unlike the C64) that it drove x86 to be extended far beyond what it ever would have been if used only as Intel had envisioned.

 

 

Of course, if the Apple II went the way of the PC, it might have meant Apple being pushed out of the business or restricted to certain sectors of the hardware market (albeit IBM themselves could have done a lot better than they did with the PC market if they'd taken the "if you can't beat 'em, join 'em" philosophy -ie rather than trying to crush clone manufacturers, compete directly with them using in-house advantages).

And of course, unlike IBM, Apple was also the main/only source for the OS on the computers, so they could have gone the Microsoft route instead and dropped hardware. ;)

 

The ][ was a very serious computer with a lot of capability. Wasn't cheap, but that didn't always matter. PCs were not cheap either. The capability mattered to business, which is why the ][ did as well as it did. Apple proved out how to do general purpose computing with the ][, and the success of the PC vetted that nicely.

PCs ended up dropping into the lower-end cost category too, and it was after that happened that the market really solidified. The PC was obviously not the cheapest, nor the most capable for a given price, but there were lower-cost models with greater and greater cost/flexibility options as time went on. By the early 90s you could build your own machine to get the best cost/performance deal and a machine well suited to you (at least if you had the know-how, or friends/family who did); paying to have a custom machine built could end up being cheaper too, and used/surplus parts warehouses also became significant for low-cost custom/homebrew machines.

 

Jobs really was ahead of the tech, and his own fetish cost a lot of money then. IMHO, those issues cost users, and he knew that and was perfectly willing to exploit them.

I think he ended up getting lucky eventually. If you keep trying, regardless of how high the failure to success ratio is (or how much it costs), you'll eventually manage something.

 

That, and he certainly is good at manipulation and hype/PR. (again, like his "teacher" Nolan Bushnell ;))

 

Now, with tech at a state where elegant can be realized, Apple makes very good margins on its Mac hardware. The ][ could have stunted the PC's growth, perhaps getting Apple farther along earlier than they achieved otherwise, and the user base would have seen more value for their investment; they were the real losers in that mess.

Margins are important, but so are volumes. Neither tell the story overall though, you need to compare net revenue, profits, spending, etc, etc. (and if you do compare profits, you also have to take investment spending into account -you could have a healthy company making deficits some quarters to facilitate growth, it's all a matter of context -deficits/debt on top of a shrinking market share is not good though)

 

Both Woz and Jobs are interesting people to me. Woz wanted it to work, and to be built upon, and generally made trade-offs that software could account for. I believe strongly in that idea, and it still often plays out well in the embedded space. Doing the most in software isn't always as sexy, but it is very robust, and robust is worth quite a bit in general purpose computing.

One of the problems with the Mac is that it did too much in software too. Pushing that philosophy back in 1976/77 made a lot of sense, but by the early 80s you had a LOT more potential for custom coprocessor chips. (and technically, the Apple II did more in hardware than some machines; I think the ZX80/81 manages video with BIOS routines run from the CPU and only allows work to be done with the screen turned off or in vblank -the latter only in the 81, and I think TIA required a lot of CPU assistance too)

 

Jobs valued things differently, and his inability to leverage both, for fear that "good enough" would trump "elegant" led him to do what he did.

I'd argue the 64/128k Mac (among others) was pretty damn inelegant, both aesthetically and technically.

 

And more important than "attractive" aesthetics would be "appealing" aesthetics for various markets. The "serious" business market in particular demanded many things the Mac severely lacked. (a large, high-res monitor; color -by '84, color was getting more significant on PCs and EGA had arrived; a professional-looking desktop form factor with a full-sized, fully-functional keyboard; modular/expandable hardware; etc)

 

I find it interesting that a $2000 Mac Book Pro won't come close to the computing performance a $2000 Lenovo does. (yes, I have both right now for work, and get to evaluate both in daily use for some proof of concept type work that is cross platform, OS X, Windows, Linux) But, the Lenovo machine is not "elegant" in the way the Mac is, running hotter, requiring more fiddling, etc... where the Mac is this sexy thing that, for the most part, just works, so long as the things I want to do are accounted for by Apple.

One thing important about Macbook pros is the video hardware they offer. (at least in some models) No (or almost no) other manufacturers provide laptops with reasonably powerful hardware graphics acceleration, that's pretty significant for a number of applications.

 

The only Macs that are really useful and cost-effective on a technical level are the really high-end models (Mac Pros and Macbook Pros), for cases where you really need that performance. (it will either be about as expensive on similar PC workstations, or not available at all)

 

 

 

Jobs knows that a holistic design can nail specific niches, and that's where the money is, which is why he never could accept growth in the ][. Woz understood that a great foundation, where value can be added in lots of ways, can be adapted to do well at most things, and there is money in that too.

 

Two polar opposites, and a very interesting tech story!

And Jobs seemed completely (if not intentionally) blind to his own logical errors and Woz's obvious proof for real-world success.

 

Granted, Jobs' niche market approach does work in a handful of cases (and eventually worked for the Mac -though almost failed several times -and isn't their main product today either), but it's far more risky and wasteful overall.

In fact, catering to a niche market is better used as a back-up route for a mass market product that fails to latch on to consumer interest as intended. (like the Amiga did with professional graphics and video editing, and the ST with music -of course, both became true mainstream platforms in Europe)

Didn't know that.

 

Well, scratch that idea.

Again, plenty of other options prior to the '816. ;) (and lots of middleground from the II/IIe to the GS as well -could add more moderately improved graphics, especially with a true 16 color bitmap mode at reasonable resolutions and with a linear framebuffer -perhaps lower color modes with normal linear framebuffer as well, DMA sound or at least a bare DAC or off the shelf sound chip, a nice set of programmable interval timers would be nice too)
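For scale, the memory cost of a linear 16-color bitmap is easy to work out at 4 bits per pixel (Python; the resolutions are just example figures, one Apple II-like and one typical of mid-80s machines):

```python
def framebuffer_bytes(width, height, bits_per_pixel):
    """Size of a packed linear framebuffer in bytes."""
    return width * height * bits_per_pixel // 8

print(framebuffer_bytes(280, 192, 4))  # 26880 bytes for an Apple II-like mode
print(framebuffer_bytes(320, 200, 4))  # 32000 bytes for a 320x200 mode
```

Either way it fits comfortably in a 64K-128K machine, so RAM cost wouldn't have ruled such a mode out.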

 

 

The story above is stunning really. Million bucks a week on press for the Mac, built on the back of the very profitable ][.

Not just the profits, but the image of the Apple II and the name it had made for the company.

Even with the same amount of funding, Apple would have been FAR more hard pressed to push the Mac without the existing market of the Apple II.

 

 

Just read today that 40 percent of Apple revenue comes from the iPhone. Gadgets is what Jobs really wanted to sell. Elegance, function, form. Gotta respect that vision, because now it makes a lot of sense. Didn't then though. Tech wasn't there yet.

That was one of Ray Kassar's problems with the 8-bit line, he had a vision that was strikingly similar to what was done with the IMacs in the late 90s. (very user friendly, "smart" plug-and-play peripherals, an "appliance computer", and plans for color coordinated models to cater to different tastes -especially to make it more attractive to women ;))

 

IMHO, he's a lucky guy really. Given how badly he managed the cash cow, the fact that it paid anyway is just amazing, and probably the reason we actually see the Apple stuff we do today.

That's how I feel too, though if you look at what he did at NeXT, it doesn't completely match his other examples outside of the "elegant" form factor. (I'll admit many of the NeXT machines do look pretty cool)

 

It also makes you wonder how the market would have done without his influence. How would smartphones, MP3 players, or tablet PCs have turned out? (similar, better, or worse?)

 

 

Or for that matter, what if Atari Corp or CBM had been "luckier" back in the late 80s? ;)

Edited by kool kitty89

 

One thing important about Macbook pros is the video hardware they offer. (at least in some models) No (or almost no) other manufacturers provide laptops with reasonably powerful hardware graphics acceleration, that's pretty significant for a number of applications.

 

No longer true. The Apple appears to have significantly better graphics power management, actually power management throughout the machine, but...

 

In terms of high end graphics, professional, serious stuff, it's not the top end anymore.

 

The new Lenovo I have coming in is a nice, fairly open, compatible box. I actually can run XP, Win 7, Linux, and Mac OS X on the thing, if I want to. The Mac Book can do pretty much those same things. Interesting that is.

 

But, the Lenovo will come with a high speed i7, 8 core CPU, 2 GB (jesus that's big) nVidia Quadro 2K, etc... The Mac Book comes with a lower clocked i7, 2 or maybe 4 cores, if hyperthreading is in there, etc...

 

Bus speeds and other things play out too.

 

Mac is very elegant, corners well, has off the charts good battery life, great UI, etc...

 

The Lenovo is going to smoke it on higher end tasks. CAD, Virtual machines, database, etc... But it won't look as pretty, nor run as long.

 

I think a look at the whole package shows very good trade-offs on the Mac, good for nearly everybody. But, if you want to nail it, IMHO I'm not impressed.

 

They cost a very similar amount within a coupla three hundred bucks.

Edited by potatohead

Re: Luck...

 

Well, Jobs was lucky because Woz nailed it. The other vendors didn't have such a solid vision of a home computer.

 

I love Atari machines, for example, but they were missing some key basics Apples had. If you really wanted to get basic business / productivity done, a ][ was the shit.


Why not ditch the '816 for the time being and go straight for faster 6502s/C02s earlier on? (the IIC Plus came really late)

Ok, what year was the 65C02 released? Before you make such a suggestion, make sure it's possible. The C02 also came in the enhanced IIe.

 

I think the key to greater Apple II series popularity and longevity was the transition from the II to the II+.

As I said before...

If you are going to make any significant change it has to be before the software library is huge.

Dumping that stupid screen memory layout would have done wonders, and a 2MHz clock would have made it one of the fastest 6502 platforms out there.

A 4MHz 6502 wasn't really practical early on. You still need to go with the 65816 eventually. Sadly, Apple never supported a faster IIgs. The accelerated IIgs's out there are pretty decent for the time.

 

Just think what would have happened to the market if the II+ had been 2MHz. Then a lot of later systems would have tried to follow Apple to compete.
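On the "stupid screen memory layout" point: the hi-res rows are interleaved in memory rather than sequential. If I have the standard row-address formula right (base $2000 for page 1), it works out like this in Python:

```python
def hires_row_addr(y):
    """Start address of Apple II hi-res row y (0-191), page 1 at $2000."""
    return 0x2000 + 0x400 * (y % 8) + 0x80 * ((y // 8) % 8) + 0x28 * (y // 64)

for y in (0, 1, 2, 8, 64, 191):
    print(y, hex(hires_row_addr(y)))
# Rows 0, 1, 2 land at $2000, $2400, $2800 -- 1 KB apart, not 40 bytes apart --
# so software can't just step a pointer by the row width like on a linear layout.
```

Every hi-res routine had to do (or table-look-up) that address math, which is exactly the overhead a sane linear layout would have avoided.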


 

One thing important about Macbook pros is the video hardware they offer. (at least in some models) No (or almost no) other manufacturers provide laptops with reasonably powerful hardware graphics acceleration, that's pretty significant for a number of applications.

 

No longer true. The Apple appears to have significantly better graphics power management, actually power management throughout the machine, but...

 

In terms of high end graphics, professional, serious stuff, it's not the top end anymore.

 

The new Lenovo I have coming in is a nice, fairly open, compatible box. I actually can run XP, Win 7, Linux, and Mac OS X on the thing, if I want to. The Mac Book can do pretty much those same things. Interesting that is.

 

But, the Lenovo will come with a high speed i7, 8 core CPU, 2 GB (jesus that's big) nVidia Quadro 2K, etc... The Mac Book comes with a lower clocked i7, 2 or maybe 4 cores, if hyperthreading is in there, etc...

 

Bus speeds and other things play out too.

 

Mac is very elegant, corners well, has off the charts good battery life, great UI, etc...

 

The Lenovo is going to smoke it on higher end tasks. CAD, Virtual machines, database, etc... But it won't look as pretty, nor run as long.

 

I think a look at the whole package shows very good trade-offs on the Mac, good for nearly everybody. But, if you want to nail it, IMHO I'm not impressed.

 

They cost a very similar amount within a coupla three hundred bucks.

For the "elegant" side of things . . . I've only occasionally found apple products to be aesthetically pleasing. Case in point: I like the look of most Lenovo (or older IBM) thinkpads better than Macbooks, and I like the look of several other laptops more as well. (I like the silver/black/dark gray scheme several manufacturers are using, including a lot of the HP cases)

I really dislike the keyboards on most macbooks too (and then they went and started using them for imacs and mac pros as well). I think some (if not all) of the newer Mac Pros have some nice keyboards though.

 

And for desktop/tower systems, you've got a massive range of aesthetic possibilities for PCs to chose from and customize. (a lot for monitors and speakers as well)

 

 

 

Good to know about the high-end laptops on the market.

 

 

 

Re: Luck...

 

Well, Jobs was lucky because Woz nailed it. The other vendors didn't have such a solid vision of a home computer.

Yes, Jobs got lucky many, many times in that respect. (granted, if he'd pushed a bit further and actually gotten the Apple II formally discontinued back in the early 80s, Apple would have been ruined -or if they'd treated the Apple II division so badly that they wouldn't put up with it any more, with most key staff -including Wozniak- leaving and causing the division to fall apart)

 

Actually, I wonder how the Apple II guys (and especially Woz) could have ended up if they'd left back in 1980 (when things started going off with the Apple III and management continued to downplay the importance of the II) and started up a small research company to work on a new product more tuned to the original design concept. (if not attempting a true follow-on to the Apple II with a copyright/patent friendly clone Apple II system -of course, even if legal, you could get railroaded in court without the funding for good defense lawyers)

 

I love Atari machines, for example, but they were missing some key basics Apples had. If you really wanted to get basic business / productivity done, a ][ was the shit.

Yes, Atari made some key mistakes: forcing the engineers to remove the expansion slots, not supporting/encouraging 3rd party software support better, not investing more in in-house business applications, etc.

Hell, they were finally correcting that in 1983, but the 1090 XL got delayed due to reorganization and then canceled after liquidation. (plus there were the plans for a PC compatible for '83, also delayed with reorganization and lost in liquidation)

 

And, of course, a whole different list of problems with managing the European market. (or competing against CBM in the low-end in the US for that matter, though even then there were major differences in Europe -emphasis on 3rd party software support, tape media, etc, etc)

 

 

 

 

 

 

Why not ditch the '816 for the time being and go straight for faster 6502s/C02s earlier on? (the IIC Plus came really late)

Ok, what year was the 65C02 released? Before you make such a suggestion, make sure it's possible. The C02 also came in the enhanced IIe.

The 65C02 came in early 1984 at the latest.

 

However, I was speaking in the context of initially using 2 (possibly 3, maybe 4) MHz NMOS 6502s before moving on to the 'C02. Hell, they could have even invested in cost saving options like integrated halt logic (as Atari did) among other things. (many, many cases where some R&D investment for greater consolidation would pay dividends in the long run)

 

With the "too late" comment, I was mainly speaking of the IIC Plus's 4 MHz CPU, with no middle ground from 1 MHz prior to that. (and 4 MHz should have been available years earlier than the 1988 C+ anyway -my '88, they probably should have had an 8 MHz option)

 

I think the key to greater Apple II series popularity and longevity was the transition from the II to the II+.

As I said before...

If you are going to make any significant change it has to be before the software library is huge.

With relatively straightforward backwards compatibility and consistently offered upgrades, a progressive, evolutionary standard should have worked well.

 

Dumping that stupid screen memory layout would have done wonders, and a 2MHz clock would have made it one of the fastest 6502 platforms out there.

A 4MHz 6502 wasn't really practical early on. You still need to go with the 65816 eventually. Sadly, Apple never supported a faster IIgs. The accelerated IIgs's out there are pretty decent for the time.

Clock speed alone isn't all that matters either; what counts is how much CPU time is actually available (not even considering coprocessors to offload work onto).

If you retained a single bus design without speeding up RAM enough to allow faster interleaving (the BBC Micro ran its 2 MHz CPU with interleaving like the Apple's, just twice as fast), you'd have to implement wait states (or allow double speed only in vblank), cutting nominal performance considerably.

The A8's 1.79 MHz CPU is nominally only ~1.2 MHz because of refresh and video DMA stealing cycles.
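A back-of-the-envelope way to see that effect (the one-third figure here is illustrative of a busy display list, not a measured number):

```python
def effective_mhz(clock_mhz: float, stolen_fraction: float) -> float:
    """Effective CPU throughput once video DMA and RAM refresh steal cycles."""
    return clock_mhz * (1.0 - stolen_fraction)

# With roughly a third of bus cycles lost to ANTIC DMA and refresh,
# a nominal 1.79 MHz 6502 nets about 1.2 MHz of actual work.
print(round(effective_mhz(1.79, 0.33), 2))  # 1.2
```

The same arithmetic is why a "2 MHz" upgrade without matching RAM bandwidth buys far less than the clock number suggests.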

 

Of course, there's also the option I mentioned with a small buffer or "cache" (again, obviously not a dynamic cache) of fast SRAM to allow the CPU to run at full speed (especially if treated as local memory and not part of the main bus).

As such, you could also have the CPU run faster in SRAM than it practically could in DRAM (especially lower-cost DRAM), even without contention (so even more of a gap considering contention issues).

Edited by kool kitty89

The cost savings from integration came with the IIe design. It was clearly late due to the III, which should just have been skipped. With a 2MHz II+ the III wouldn't have been very attractive at all from a marketing standpoint. Maybe release a II+ that has an enhanced keyboard like the platinum IIe with an 80 column card built in and 128K. That would have given Apple the business system they wanted so badly and would have eliminated Franklin's advantages over the II. Then the IIe would just serve to offer higher integration of parts and bring the 80 column card and 128K as an option to everyone else.

Apple never did come out with a larger memory standard for the II, so that was another thing they could have explored. The IIgs of course used the 65816's 24 bit addressing... though Apple didn't fully take advantage of it. Another mistake.

 

I really don't think 4MHz would have done much to increase sales over 2MHz. Only a small % of owners bought a Zip chip or IIc+ and a lot of the bottleneck at that point is the storage media of the day.

I think CPU speed upgrades were more popular on the IIgs. The GUI and games were more CPU intensive and a greater % of hardcore II users bought the IIgs. Really, it needed to be in the 8 MHz range from day one.

The IIc+ was faster than the IIgs out of the box, and can be upgraded to go even faster by replacing the CPU and cache RAM.

Now, if they had introduced a IIc model with IIgs capabilities, that would have been nice!

Once you get up around 14-16MHz the 65816's speed is topped out so the II series would have been gone anyway.

 

Frankly, the 680x0 series is way nicer to program than the 65816 and I think developers didn't really want to do much for the IIgs. Even with its enhancements the 65816 couldn't keep up with a world going to high level languages, and the "Terbium" (32 bit version) wasn't going to change that either.

 

<edit>

FWIW, the Franklin Ace was a II+ clone with the language card (on the 1000) built onto the motherboard, it had a larger power supply, numeric keypad, optional 80 column card, and it was offered with several business applications. Sort of the business machine people really wanted from Apple instead of the III.

Edited by JamesD

The cost savings from integration came with the IIe design.

I meant continuous consolidation though. How consolidated was the IIe? (at some point they should have had all the video, I/O, memory interface, refresh, etc, etc logic all in one ASIC, perhaps merging the CPU on chip once it came time for a true successor at the end of the 80s -assuming they switched architectures at that point)

 

I really don't think 4MHz would have done much to increase sales over 2MHz. Only a small % of owners bought a Zip chip or IIc+ and a lot of the bottleneck at that point is the storage media of the day.

I was thinking of 3-4 MHz as time went on and you had more CPU intensive applications. (you could have a step between the IIe and GS as well -like similar graphics modes as the GS, but perhaps limited to the old II palette or maybe 6-bit RGB, basic DMA sound, enhanced OS with GUI, etc)

 

I think CPU speed upgrades were more popular on the IIgs. The GUI and games were more CPU intensive and a greater % of hardcore II users bought the IIgs. Really, it needed to be in the 8 MHz range from day one.

A faster C02 (better with the R65C02 instructions) could have been easier early on (and for lower-end models). Plus, then they could implement a bank switching scheme preferable to the '816's. (granted, you could bypass the 24-bit mode of the '816 as well for external banking . . . just as you could ignore the 808x's segmentation with a custom mapping scheme ;))

They could have invested in licensing the 6502 and merging that custom MMU logic on-chip later on. (the 650x chips were generally cheap to license, so rather practical to do that with)
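To make the bank-switching idea concrete, here's a toy model in Python: a fixed window in the 6502's 64K space is redirected to one of several physical banks by writing a latch. The window address, bank size, and class name here are all hypothetical, chosen purely for illustration of the general scheme:

```python
class BankedMemory:
    """Toy bank-switched memory: a 16K window at $C000 selects one of
    several 16K banks via a latch write. Sizes/addresses are hypothetical."""
    WINDOW_BASE = 0xC000
    BANK_SIZE = 0x4000

    def __init__(self, n_banks: int):
        self.banks = [bytearray(self.BANK_SIZE) for _ in range(n_banks)]
        self.fixed = bytearray(self.WINDOW_BASE)   # low 48K, never banked
        self.current = 0

    def select(self, bank: int) -> None:
        self.current = bank                        # the MMU latch write

    def read(self, addr: int) -> int:
        if addr < self.WINDOW_BASE:
            return self.fixed[addr]
        return self.banks[self.current][addr - self.WINDOW_BASE]
```

Usage: write `mem.banks[2][5] = 0xEA`, call `mem.select(2)`, and `mem.read(0xC005)` now returns that byte; selecting bank 0 makes the same address read a different physical location. The appeal over the '816's 24-bit mode is that the CPU itself never sees more than 16-bit addresses, so plain 6502/'C02 code keeps working unchanged.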

 

The '816 was faster per clock than the R65C02 though: code compatible, but quicker at some operations. (so you'd eventually want to push more for that)

 

The IIc+ was faster than the IIgs out of the box, and can be upgraded to go even faster by replacing the CPU and cache RAM.

Now, if they had introduced a IIc model with IIgs capabilities, that would have been nice!

Once you get up around 14-16MHz the 65816's speed is topped out so the II series would have been gone anyway.

Well, that's assuming a massive market share of the 650x didn't spur development of more advanced 650x derivatives (certainly happened with x86). Or if not more advanced chips, maybe at least higher clocked versions of the old chips. (using newer processes with smaller and smaller die space -maybe adding on-chip RAM at some point)

What would have been really nice is a derivative of the 65816 that added logic to allow a 32-bit flat physical address space (like the 386's protected mode), 16 and/or 32-bit wide data bus, and perhaps dynamic caching support. (even if they didn't make much change to the existing R65C02/816 ISA, there could have been a lot of added logic for faster execution of various operations, features to make better use of a wider data bus, etc).

 

As it was, there simply wasn't a market for a high-end 650x compatible design to drive such development.

 

 

Plus, Apple could have pushed more and more for hardware acceleration as well. (blitter, math coprocessors, etc, etc -either custom or off the shelf depending on the context)

 

If no better 650x CPUs arrived by the late 80s, they definitely would have needed to make preparations to transition to another architecture (which the Mac has done twice). At the time, it would seem like ARM could have been the best choice. (went open market in 1990 with the ARM 2 and 3 -including the lower cost/power CMOS ARM2as, generally very cost effective design, etc) The 650x core could be embedded in a coprocessor/interface ASIC for compatibility and some slave duties in the system, at least until the ARM CPUs got fast enough to emulate that in software. (assuming you needed low-level compatibility; again, cheap to license the 650x too and a similar ASIC would have been used in late model 650x based Apples too)

 

Frankly, the 680x0 series is way nicer to program than the 65816 and I think developers didn't really want to do much for the IIgs. Even with its enhancements the 65816 couldn't keep up with a world going to high level languages, and the "Terbium" (32 bit version) wasn't going to change that either.

True, but we're not talking about elegance of programming, but market dominance and compatibility. Hence x86's dominance with the PC. ;)


  • 2 weeks later...

Apple has always had good quality products, but overpriced. I had an Apple //e, and later a IIgs, and they were great machines. The IIgs was purposely given a low 2.8 MHz clock so as not to compete with Macs. So the IIgs was sort of a dead end machine. I would have preferred it if Apple had continued the Apple II series, without any Lisa/Macs in the picture.

 

Many IIgs friends jumped over to the Amiga, which was a great machine. However following the 16 bit era, computers do not have personality any longer. The 8 and 16 bit computers were the best.


Apple has always had good quality products, but overpriced. I had an Apple //e, and later a IIgs, and they were great machines. The IIgs was purposely given a low 2.8 MHz clock so as not to compete with Macs. So the IIgs was sort of a dead end machine. I would have preferred it if Apple had continued the Apple II series, without any Lisa/Macs in the picture.

 

Many IIgs friends jumped over to the Amiga, which was a great machine. However following the 16 bit era, computers do not have personality any longer. The 8 and 16 bit computers were the best.

 

Truer words na'er bein spoken m'man!

The Amiga and ST were the last computers to have a real "personality" - as us retrogamers and enthusiasts use the term.

 

I believe it is compatibility and pervasiveness throughout the market that enables any one standard to succeed :roll: Compatibility will trump performance when it comes to needing a large number of cheap installations. Compatibility is important for getting something adopted throughout a culture.

 

The "PC" could handle so many file formats, and hardware seemed to be everywhere, and it could interface to everything. Some of those points existed in the II series, but were never really really pushed or developed. Had Apple really really marketed the II with vigor, it would be the "PC" of today.

 

I sort of wanted an ST, off and on. But it was too expensive at the time considering I was spending money on the Amiga. Now, for all its hardware strengths and whiz-a-bang graphics the Amiga almost had a special appeal. But... I found it frustrating to use, and "too much" got in the way of what I wanted to do at the time. And that was simple word processing. And I mean simple. A task which today is handled by Notepad/Word with aplomb.

 

I always felt the Amiga was full of hot air and grandiose promises. The ST to a lesser extent. So many cool things to do, but everything was so conditional; you had to have this, this, and this.

 

Now with the ][+ and //e -- granted, you needed to get a printer to print and an interface to make a connection, and a cable and perhaps a mounting bracket and connector/strain relief hardware kit on the II -- it certainly didn't feel like I was nickel and dimed to death. Because, once I got those parts, everything worked! From the MagicWindows word processor, to listing programs, to using print-shop, not to mention dumping a hi-res screen image!

 

It seemed with the Amiga I would need to have spent something extra-in-addition to get that level of versatility and functionality. Either buying a hardware converter, memory expansion, buffer, especially extra software. Yes, that's it, the extra software. Extra drivers. It was not an easy task. Despite the apparent crudeness, and yet simplicity, of the Apple II, it worked.

 

What else was nice, real nice, was before I got a real word processor program. I was able to use a text editor built into either Ascii-Express, or wait, no, it was Pro-Term I think, yes. I could do basic WP on that. That was nice. I still have original text files we wrote up on that!

 

Transferring files from the II to the Amiga was simple though. I gotta give the Amiga PD software, at the time, some credit. It just sucked up my documents left and right!

 

The Amiga was good for static painting and artwork. Very good. And I learned a lot about graphics images and electronic painting and stuff. But when it came to video, well that was hell, for me. I got that Digi-View thing. And rented a Vidicon or Saticon camera to digitize some photos or something. Well I had to spend MORE money and get the right cables, a gender-changer, a stand, lights, hardware for mounting a color wheel, extra memory, another disk drive, power supply for the lights, flat glass plate to flatten the picture. All that stuff, plus it took like 4 minutes to grab an image. Ridiculous even back then in the 90's! And even more money had to be spent on getting an image from the VCR to disk. That is what pissed me off about the Amiga. It made the promises, but in actuality it was a huge pain in the ass to work with. Tedious, slow. I'd rather have waited another 3 years or so when digital cameras started coming to the consumer level. Far superior in every way. But from the Amiga marketing and advertising material they made things look so easy! Bullshit.

 

Now that I think about it, it was the Amiga (and some multi-media PC experiences of the time) that turned me off from being an early adopter. With the PC, it was the promise of 3-D accelerators and soulless FMV cut-scenes. Only now, yes, only now are we seeing GPUs worthy of even the name "GPU".

 

Let's boil it down further: I also strongly believe it was the DOS 3.3 (and later ProDOS) filing systems that really made a huge difference in today's PC environment. Not forgetting the later versions of MS-DOS, like 5.0 and 6.22. Despite both the hidden inherent (and outward) differences between Apple's DOS and MS-DOS, the style of how they transferred data from memory to CPU to disk is incredibly similar, and the software that got loaded into RAM did an amazing amount of work when it came to controlling the drive. The Apple II disk sub-system is one of the best engineered storage devices; so much was done with so little hardware, and a lot was under main CPU control and RWTS/DOS.
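The numbers behind that sub-system are small enough to check by hand. A quick Python sketch of the standard DOS 3.3 geometry and the linear DOS-order .dsk image layout (the function name is mine):

```python
TRACKS, SECTORS, SECTOR_BYTES = 35, 16, 256   # standard DOS 3.3 5.25" geometry

def sector_offset(track: int, sector: int) -> int:
    """Byte offset of (track, sector) in a linear DOS-order .dsk image."""
    return (track * SECTORS + sector) * SECTOR_BYTES

print(TRACKS * SECTORS * SECTOR_BYTES)  # 143360 -- the familiar 140K
print(sector_offset(17, 0))             # 69632 -- the VTOC's default home, track 17
```

That directness is the point of the original post: with the whole map being one multiply and an add, patching the VTOC location or sector-editing a disk was something an owner could actually do.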

 

With the Amiga I always felt the disk was not under my control, I couldn't easily change things around, like re-assign the VTOC to a different track or change some things to drastically speed stuff up. No easy trick nybble editing of sectors or anything. The Amiga disk sub-system had a mind of its own and was terribly far far removed from application control. There seemed to be un-necessary complexity in the file system. And some strange hardware control seemed to rule the day. :roll: .. way too much overhead. And the disks were horribly slow.

 

One more thing with the Amiga: for all its purported expandability claims, it felt like a closed-off system. Any expansion seemingly needed to be developed with far more resources than what could be done on the //e. Though I think that became prevalent on a lot of the 16-bit systems of the time. Perhaps, but what about the complexity of the 1541 drives? Didn't those have RAM and ROM and a CPU going? Why?

 

In summary, I think the (then) success of the Apple II and (today) PC can be attributed to things like a good solid file system, fast and quick relative to the amount of data it could push around. Next comes the expandability. That is a topic which could span hundreds of pages, too, but you get the idea.


  • 2 weeks later...

Going back to what kool kitty 89 said in post #28 - The Atari 8-bit units had far more integration than the Apple II.

The sound and graphics chips alone, on the 400/800, were totally optional on the Apple II, II+, & //e. The Atari units had SIO and plenty of input options with the 4 joyports. That would be 8 A/D converters + 20 switches for 4 joysticks. RF modulator too.

 

All that was optional on the // series, at great cost!

