
Blessing and Curse of Atari and Amiga computer designs


calimero

Recommended Posts

10 minutes ago, oky2000 said:

People just assumed ALL PC DOS software was better back then (more fool them).  

When I left the Military, I went freelance programmer (mug's game :) ) and had to buy a PC as that's what every business had.

MS-DOS, Borland C, etc. After using an ST, mainly programming in C, I had some big compile/run failures due to the PC's archaic memory configuration: having to use NEAR and FAR etc. instead of just letting the machine take care of linear memory.

Plus all those memory drivers (EMM386 etc.) installed during boot to manage anything over 640K.

  • Like 1
Link to comment
Share on other sites

On 3/21/2022 at 8:44 PM, TGB1718 said:

When I left the Military, I went freelance programmer (mug's game :) ) and had to buy a PC as that's what every business had.

MS-DOS, Borland C, etc. After using an ST, mainly programming in C, I had some big compile/run failures due to the PC's archaic memory configuration: having to use NEAR and FAR etc. instead of just letting the machine take care of linear memory.

Plus all those memory drivers (EMM386 etc.) installed during boot to manage anything over 640K.

I did high-level programming for DOS as a final-year project. I wrote the same video rental store management system in FAST BASIC (or STOS), and it was identical: it did everything a top-of-the-range PC system would do in 1986. Small businesses really needed to be more open-minded.

 

If I had actually built my estate agent management suite on the Amiga, with multimedia database capabilities (interlaced HAM images for properties and hi-res laced mono copy for mono printouts) along with all the usual guff you get with bespoke software in the 80s for cheap-ass PC setups in the UK, nobody would have bought it anyway, despite it being one or two DECADES ahead of what you could actually get on a PC off the shelf. Hell, with 9 MB of RAM you could even have had animated transitions going from one room to another and back out again, plus a 360-degree rotation from a spot at the front of the house and from the centre of the garden.

 

Problem is people generally go with the masses. For all that Apple has done over the decades, they are only still around today because three quarters of the non-Wintel competition self-imploded by the late 1990s (Acorn, Atari, Commodore). In the case of OS/2 v2.0 to v4.0, IBM just priced themselves out of the x86 OS wars.

 

Point is, you could do a lot more with an Amiga or an ST than with any $600 bottom-of-the-barrel 512K XT-class PC clone running DOS or Windows in the late 80s, but business users make 'safe' bets. Only when it is too late do they realise this so-called 'business-class support' was a pile of useless pre-recycled paper with lots of get-out clauses hidden in the contract or user licence :).

  • Like 2

When something is ahead of its time, no one knows what to do with it. There are few real-world needs and applications, and it doesn't penetrate the public consciousness. The developers and others who do recognize the advanced capabilities are often left dumbfounded, wondering why.

 

With something like computers, there have to be standards, otherwise one is left sitting alone on a solitary island. Advanced computers often need new and untested standards to accommodate those spiffy features. And that means less compatibility.

Edited by Keatah
  • Like 2

10 hours ago, oky2000 said:

Only when it is too late do they realise this so called 'business class support' was a pile of useless pre-recycled paper with lots of get-out clauses hidden in any contract or user licence :).

It's all about covering your arse in your purchasing decisions. If you're a mid-level manager and buy a software package from a small software house running on Amigas and it all goes sideways, then upper management is going to start questioning your purchasing decisions and you will take the blame. If you bought from a well-known name like HP or Oracle and it fails just as badly, well, it's not your fault. Of course, the flip side is that when you are dealing with a company bigger than you, they aren't going to prioritize getting your system back up and running as much as a smaller company would (they need your business more). That's why they used to say "Nobody ever got fired for buying IBM".

 

There are also other shenanigans that go on. I'll give an example I experienced. We were looking to replace a critical software product because the company producing the existing one had been purchased and the product line was being discontinued by the new owners. We screened alternatives and narrowed it down to two:

 

Company A - a fairly well known name in the computer industry

Company B - a VERY well known name in the computer industry

 

We brought both in to do demos.  

 

Company A sent a very knowledgeable guy who answered all our questions; the product did everything we needed and was similar enough to the one we were replacing that switching wouldn't have been a huge effort.

 

Company B sent a team of 8 to 12 people to do their demo, all dressed to the nines. The demo was a complete disaster: they demoed a different product than the one they were selling us and couldn't answer half of our questions satisfactorily. The product they wanted to sell us was incomplete and missing features. It went so badly that our CIO yelled at them halfway through the sessions. The software also operated so differently from what we were replacing that it would require significant training for us to learn it.

 

Guess which product we ended up buying?

 

Well, Company B of course! Even though we all hated it. They were so slimy that they were going to raise our support costs for other products of theirs that we used if we didn't buy it, and that's solely what influenced the purchasing decision.

 

So that's another factor in why people go with big names even when the software is inferior: they can pull shady crap to keep you locked in and get away with it.

  • Like 2
  • Sad 1

30 minutes ago, zzip said:

Well Company B of course!  Even though we all hated it,

Something sort of similar: I moved to a new department that was using IEEE-488 controllers to develop in-house automated test systems. I was in charge of a new development, and we were using some systems that had been chosen to replace an older system (can't remember the name of the system, but it ran BASIC09).

I found out that this new system did not meet the initial specification in many important areas and as such would not be suitable for my project, nor for any of the others we were developing.

I got some guys from HP to come and show us their wares, and we ended up buying a load of controllers (HP5000 systems, I think) and dumping the others.

Why were the other systems purchased in the first place? The guy who did the evaluation left and joined that company :) ... draw your own conclusion.

 

  • Like 1
  • Haha 1

In the home/family "IT" arena, supporting the immediate family and some crotchety neighbors, I just do what needs to be done. And that's it. If they don't like it they can call Best Buy tech and pay $200 for them to come out, and then keep coming out again and again and again.

 


  • 2 weeks later...

Small businesses like video rental stores wouldn't be getting any support worth mentioning on a PC; they would have got zero support worth mentioning for using Windows either (and Windows was a disaster; only idiots used that rubbish for any business). People just bought anything that ran on a PC, assuming it must be the best for business situations.

 

It's like fashionistas who bought Bang & Olufsen TVs instead of saving a thousand bucks and getting an identical TV with a Philips badge etc. :)


10 hours ago, oky2000 said:

It's like fashionistas who bought Bang & Olufsen TVs instead of saving a thousand bucks and getting an identical TV with a Philips badge etc. :)

Not sure I quite agree. I bought a B&O back in the 80's and there was nothing quite like it from other makes. Yes, they are overpriced (I got mine at a huge discount, else I wouldn't have bought it), but the quality was exceptional, the service the few times I needed it was exemplary, and the built-in sound system put a lot of HiFi systems to shame.


On 4/2/2022 at 1:56 PM, TGB1718 said:

Not sure I quite agree. I bought a B&O back in the 80's and there was nothing quite like it from other makes. Yes, they are overpriced (I got mine at a huge discount, else I wouldn't have bought it), but the quality was exceptional, the service the few times I needed it was exemplary, and the built-in sound system put a lot of HiFi systems to shame.

The CRT tube and the driving electronics, and therefore the picture quality, were a stock rehoused Philips chassis most of the time, from what I remember seeing. I was a CRT TV engineer and fixed many eighties TVs in my time, so I know most brands from an engineer's point of view. The kings of eighties CRT displays were Trinitron and Quintrix (Panasonic's alternative), but Philips had a much warmer, fuzzier, late-seventies type of image, like everybody else in that era. I can appreciate the importance of style choices, hence my term 'fashionistas'.

With CRT TVs it is mostly down to the alignment accuracy of the electron gun and shadow mask, and how dark a grey the CRT tube is when turned off; there are practically no improvements you can make to those by changing the motherboard :) I personally like the Nordmende (?) eighties TVs that looked like something out of The Jetsons, but today I know the actual picture quality was not really any better than our Ferguson TX TV from the early eighties (which had bass/treble/balance, stereo speakers, a pseudo 'surround sound' mode for audio, and SCART RGB for my ST and Amiga, so it really was just a style thing; technically we didn't miss out on that much).

 

I would have to ask a friend, as I have not worked on B&O audio gear myself, so that may be pretty good and actually top-of-the-line quality. They do look interesting, but I have no idea what off-the-shelf parts/base machine they use as the basis for the audio products. Audio equipment has huge potential for improvement via minor, slightly more costly component changes, or even just circuit layout and component placement. Tape hiss, underlying hum from the turntable etc. are all quite possible to improve, and a better needle cartridge than the one supplied as standard with the base turntable makes quite a difference too, assuming B&O don't make a 100% bespoke turntable chassis.

 

You can't make an early-1990s Goldstar TV image look as good as a Mitsubishi Diamondtron if you start with an identical CRT tube inside the Goldstar, even though they both cost the same. I had thought Philips and B&O were from the same country, hence the TV parts they use (in fact Philips is Dutch and B&O is Danish, so near neighbours at least).

 

The point is, as long as you accept the technical limitations and are happy to pay that much more for something that looks a particular way, that's cool. If it's down to design, I would go with the twin fold-out-speaker Nordmende TV that looks like something out of The Jetsons or 2001, but as an engineer and home movie buff of the 80s I know the money would have been better spent on a 28" Sony for the living room AND a 14" Sony portable for me upstairs (if we'd had that sort of money in the mid 80s!).

 

 


I think the main difference with B&O was all the little extras B&O put in, like the interface between the TV and VCR: all VCR programming was done through the TV system, and I think you could also connect your (B&O) HiFi to the TV.

One thing (more of a novelty) was an RS-232 port on the TV which you could connect to a printer to print Teletext pages. It worked, too: I borrowed a printer with an RS-232 interface and it printed just fine.

  • Like 1

On 3/3/2022 at 10:38 AM, Goochman said:

Adspeed showed the ST should've come with the 16MHz 68K as standard. Like Darklord stated, everything seemed smoother and snappier. I can only think of one weird issue when using 16MHz mode: when launching Spectre 128 software in 16MHz mode it would scramble my settings, so I had to start in 8MHz and flip the switch to 16MHz after launch.

 

I dunno... I'd probably vote for an 8MHz+ 68010 from the start instead of the 68000 [at any speed], so that the platform's software would start out accustomed to the virtual-memory support, stricter privileged modes, and the other changes the 68010 introduced. Those same changes (making MOVE from SR a privileged instruction, for one) are what hampered 68010 compatibility with some games on the Atari ST, the Amiga, and the Sega Genesis/Mega Drive.

 

And more importantly, cutting a deal with Atari Games to use the custom graphics chip from their Atari System 1 in the ST or the systems that followed. I mean, that sucker could display 256 colors on screen at once out of a 1,000+ color palette, with a load of sprites, at resolutions surpassing both the ST and the Amiga. That would've been a game changer... no pun intended. It would've made the game conversions easier too [the arcade boards used 68010s], aside from not having the YM2151, a 1x/2x/4x POKEY, and the TI speech synthesis chip(s).


On 3/14/2022 at 8:07 AM, zzip said:

 

 

For scalable fonts. One issue was that GDOS came late and was kind of awkward to use, so the ST didn't handle scalable fonts as well as it should have.

 

The real issue was Atari Corp failing to provide GDOS integrated into the TOS ROM, as they originally promised once DRI completed it; that would've standardized it better. They also promised an easy Blitter upgrade for all STs once they supposedly had the manufacturing problems resolved, and failed to deliver on that promise as well.


On 3/1/2022 at 11:50 AM, oky2000 said:

People like Dave Haynie didn't have a clue, nor did Bil Herd. Nice people, but their hardware designs were shit and way below the cutting edge of the preceding machines they worked on. The A500 is a cock-up: zero improvements, and the so-called cost cutting could have been pretty much matched just by replacing the 192K RAM daughterboard for the Kickstart OS with a single ROM socket on the A1000 motherboard. (The fall in RAM prices between spring '85 and spring '87, the 192K reduction in total RAM needed to make a "512K" Amiga, and the complexity of a motherboard-plus-daughterboard design are what made the A1000 so expensive.)

 

 

 

I find Dave Haynie's comments about the Atari ST in his presentations very off-putting. But then again, he is playing to the fans that booked him.

 

The C128 is just odd. Bil Herd made a few videos on YouTube explaining why it was designed as such. The Z80 ended up in it not only to provide a CP/M mode but also to get around incompatibilities with certain third-party C64 cartridges that wouldn't work at initial startup in the C128 prototypes, which didn't have the Z80 to start the machine with. Why they didn't improve the C64 by just adding extra RAM instead of segmenting it had to be for reasons other than backwards compatibility. I'd say the Atari 8-bits handled RAM upgrades far better; well, at least they received better support from the software coders. Commodore's REUs were interesting but weren't supported well outside of GEOS. Apparently they had a mapper chip inside to support the added RAM, plus supposedly some blitter capabilities. Commodore didn't offer "Commodore" BASIC beyond 2.0 for the C64, unlike Atari Inc/Corp upgrading Atari BASIC in the XL and XE lines. But Commodore does get props for providing a built-in 80-column mode in the C128. Apple did 80 columns better with the Apple II line than anyone else in the 8-bit world, but it also came with Apple's pricing.


52 minutes ago, Lynxpro said:

Apple did 80 Columns better with the Apple II line than anyone else in the 8-Bit world but it also came with Apple's pricing. 

Like, totally k-rad, man! When looking at the Apple-branded 64K/80-column cards I was rather impressed by the simplicity: just a few latches and gates, plus memory if you were getting the 64K upgrade too.

 

The text was crisp and fast and didn't feel like sucking through a straw; seemingly no bandwidth issues. Coupled with MouseText, it allowed for a rather simplistic (but responsive) GUI. IDK, I thought the 64K/80-column upgrade along with the Enhancement Kit were extremely well thought out and properly positioned. Those upgrades didn't bloat the machine or slow it down either; they didn't step outside what was practical/usable at the time. The only option I wanted to see was colored text, like on the PC. But I didn't complain.

 

Back in the day I just got used to high prices being high; it was part of the hobby. That was that. And that helped when transitioning into the PC world, where it was possible to get a machine piecemeal. Separate PC parts were high-priced and required: you couldn't do without a graphics card or a multi-I/O board. Simply said: no sticker shock.

Edited by Keatah

8 hours ago, Lynxpro said:

Commodore didn't offer "Commodore" BASIC beyond 2.0 for the C64, unlike Atari Inc/Corp upgrading Atari BASIC in the XL and XE lines.

I think Atari only fixed bugs in revision B and C BASIC? No significant upgrades beyond that, I don't think.


On 4/4/2022 at 12:19 AM, oky2000 said:

The CRT tube and the driving electronics, and therefore the picture quality, was stock rehoused Philips chassis most of the time from what I remember seeing.

 

 

IMHO B&O were the Apple of audio/video tech in the 80's. Some great models, most being middle of the road, but with a really attractive package that made them seem better than they really were. And just as with Apple today, they were a pain to take apart and repair with those space-age designs. Give me a good old 80's Marantz instead...


On 3/23/2022 at 10:02 AM, zzip said:

It's all about covering your arse in your purchasing decisions.  If you're a mid-level manager and buy a software package from a small software house running on Amigas and it all goes sideways then upper management is going to start questioning your purchasing decisions and you will take the blame.   If you bought from a well known name like HP or Oracle or someone and it fails just as bad then well it's not your fault.

In the corporate world, we call this "One Throat to Choke".  That is LITERALLY what it is called.


36 minutes ago, pixelmischief said:

In the corporate world, we call this "One Throat to Choke".  That is LITERALLY what it is called.

Pre-IBM 5150, the desktop computer business was chaotic, and plenty of people got burned by adopting systems that couldn't keep up with the times. IBM introduced order into that chaos by going with an open platform that was easily cloned and thus available to a greater market. Once they took over the business world, all the others were pretty much done for. Apple, Atari and Commodore also made a mistake by hitching their systems to Motorola. Only Apple survived, more due to the iPod and iPhone than to their computer line.

 

 


  • 2 weeks later...
On 4/12/2022 at 8:43 AM, Tuxon86 said:

Pre-IBM 5150, the desktop computer business was chaotic, and plenty of people got burned by adopting systems that couldn't keep up with the times. IBM introduced order into that chaos by going with an open platform that was easily cloned and thus available to a greater market. Once they took over the business world, all the others were pretty much done for. Apple, Atari and Commodore also made a mistake by hitching their systems to Motorola. Only Apple survived, more due to the iPod and iPhone than to their computer line.

 

 

 

That wasn't IBM's intent. They went with the "open" system because it allowed for a fast development process to get the product on the shelf, after they decided not to acquire another company (Atari Inc was one of the candidates) to design their first "PC". They were also still under antitrust investigation by the US DOJ, which had dragged on since the early 1970s, so that was reason enough not to use an IBM semiconductor design for the CPU or have the IBM software groups design the OS. They chose Intel because Motorola couldn't guarantee the volumes IBM expected, either directly or from second-source suppliers, for either the 6809 or the 68000. And then they thought nobody would dare clone their BIOS, which is why they didn't care about Microsoft retaining the rights to sell or license DOS.

  • Like 1

7 minutes ago, Lynxpro said:

That wasn't IBM's intent. They went with the "open" system because it allowed for a fast development process to getting the product on the shelf after they decided not to acquire another company - and Atari Inc was one of them - to design their first "PC".

I have read that the IBM execs at the time didn't really believe in the personal computer space all that much. It was a passing fad to them; the big-iron mainframes were still their bread and butter. They just wanted to have a product in the newish personal computer space, and they didn't think the open architecture through. Once they saw the clone market becoming a threat, they tried to make the PC more proprietary with the PS/2 line, but it was too late.

  • Like 1

1 hour ago, zzip said:

I have read that the IBM execs at the time didn't really believe in the personal computer space all that much.

They didn't. They didn't see everyone needing or wanting a machine at home. And look at today. Granted, that was 40 years ago!

 

1 hour ago, zzip said:

They just wanted to have a product in the newish personal computer space and they didn't think the open architecture through.

Not sure what could've been done better, aside from the segmented memory and the 640K limitation... Much of that was processor-dependent anyway. At the same time something, anything, had to be marketed. The industry was strapping on its first diaper.

 

Not thinking a totally brand-new product through isn't always bad. I think it's sometimes worse when it's thought through too much. And in computers the difference between "not at all" and "too much" can itself be too much.

 

An overdesigned computer doesn't let you do anything without jumping through technical hoops. The Apple II and IBM PC gave you slots and their specs and said, "here, do what you want, we don't care." And that's backed up by the failure of MCA and the PS/2.

 

Case in point: the Amiga. All the engineers did was think, think, think and design something really interesting and novel. The design teams weren't conservative enough and went down rabbit holes. The novelty and interestingness kept the platform from being accepted in the mainstream; only people and companies that thought the same way had any hope of making successful add-ons. The hardware was just too eccentric. A PC, by contrast, could be fully functional and customized with either straightforward basic hardware or eclectic hardware, at many price points, fitting many needs.

Edited by Keatah

14 minutes ago, Keatah said:

Not sure what could've been done better - aside from the segmented memory and 640K limitations.. Much of that was processor dependent anyways. At the same time something, anything.. had to be marketed. The industry was strapping on its first diaper.

Had they gone with the Motorola 68000, as they considered, the segmented-memory and 640K issues wouldn't have existed. Of course, they did mostly correct them by the 386.


1 minute ago, zzip said:

Of course they did mostly correct it by the 386.

Sure. I never thought the segmentation was a problem, even when learning x86 assembly. It was just how it was done. It might even be thought of as a strength, since many workarounds and a whole cottage industry of memory management software and add-in boards formed around it. Quite funny to watch people bitch and moan about it, citing how other architectures that didn't have that "problem" were so superior. Superior my ass. Computers are generally fast enough to work around such limits.

 

Additionally, people like and enjoy spending a little bit of money here and there to overcome limitations in products; it makes them feel like they leveled up. Especially individuals and hobbyists. Maybe less so with a faceless corporation, but it's still there. The beancounters and IT decision-makers can get boss approval and brownie points by solving an issue on the cheap. It's of course just one factor in the sea of versatility that made the PC so great and omnipresent.

 

I was always confused and confounded by CHIP RAM, SLOW RAM, FAST RAM. What does it all mean? How do I upgrade it? What amount can I upgrade to? What software would make use of it? These were all practical questions that had no answers unless one dived into the chipset and OS, and then a user found answers that were too limiting, where nothing worked or compatibility broke.

 

On the PC, you would plug in more memory, and if it wasn't available or didn't work, you would get a memory manager software package from your local supermarket/computer store, and suddenly everything would work. (Or the hobbyist geek might configure EMM386 by hand.) Either way, there were several options to get going.

 

The way I learned it in the Apple II days, RAM was RAM: a simple memory map I could print out and tape to my wall. And when I got BIG memory cards, like a 64K or 256K card, I learned about bank switching to bring memory pages and segments into and out of the processor's window of visibility. Easy concept.

 

When I put my big-boy pants on and moved into the PC, it was easy to understand that video RAM was framebuffer RAM and had the sole purpose of holding the image. It made sense to put it on the graphics card; more sense than sharing it between custom chips, processor, blitter, and general storage. Too homogeneous to be practical.

 

That homogeneity promoted weird programming styles (like on the 2600), weird enough that they were locked to one machine and one setup by nature. But the Amiga isn't a 2600 with one fixed set of hardware specs; a computer could be all over the spectrum.

 

So this PC memory segmentation was a grounding, an anchor into a common framework shared by millions of machines, with software working across millions of variants. I could get game A or application B to work regardless of whether I was shadowing the BIOS or reclaiming the monochrome area.


14 minutes ago, Keatah said:

Sure. I never thought the segmentation was a problem. Even when learning x86 assembly. It was just how it was done. Might even be thought of as a strength as there were many workarounds and a cottage industry of memory management software + add-in boards formed. Quite funny to watch people bitch and moan about it. Citing how other architectures that didn't have that "problem" were so superior. Superior my ass. Computers are generally fast enough to work around such limits.

I first learned assembly on the 6502 and also dabbled in 68000 assembly, then had to learn x86 assembly as part of a college program. So I was familiar with the basics of various assembly languages; the thing that made x86 assembler uniquely frustrating was the segmentation. I never quite knew when to use "near" and when to use "far": how am I supposed to know when two routines will be in the same memory segment and when they won't? Whenever my code failed, it was almost always because I didn't handle segmentation properly. And it's not just assembly language; you had to use near and far in C code as well in that era.

 

Granted, working on a 6502 you have to deal with memory management as well, because a 6502 can only address 64K of RAM, and if your machine has 128K, 256K or some other amount, memory has to be swapped in and out, so you end up with a memory scheme not dissimilar to PC segmentation. But 6502 assembly didn't have semi-abstracted keywords like near and far either; if you wanted to use extended memory, you had to manage it yourself. Or you could simply opt out of using it.

 

But I wouldn't suggest PCs should have been built on a 6502 either. It just wasn't up to the task.

 

By contrast, 68000 assembly was elegant compared to both.  If that had been the basis,  I think early PCs would have been more pleasant to work with.

 

39 minutes ago, Keatah said:

I was always confused and confounded by CHIP RAM, SLOW RAM, FAST RAM. What does it all mean? How do I upgrade it? What amount can I upgrade it to? What software would make use of it? These were all practical questions that had no answers unless one dived into the chipset and OS - and then a user found answers that were too limiting. Where nothing worked. Or broke compatibility.

The Motorola architecture would have had growing pains like Intel did as well; eventually it would have standardized on one type of RAM. But it's not dissimilar to the PC, where you had memory on the mainboard, usually faster memory on the video card (which could sometimes be upgraded), and even some sound cards with memory slots the user could populate.

 

42 minutes ago, Keatah said:

That homogeneity promoted weird programming styles (like on the 2600). Weird enough they were locked to one machine and one setup by nature. But the Amiga isn't a 2600 with one set of hardware specs. A computer could be all over the spectrum.

 

So this PC memory segmentation was a grounding. An anchor into some sort of common framework shared by millions of machines. With software working on millions of variants. I could get game A or application B to work regardless if I was shadowing BIOS or reclaiming the Monochrome area or not.

One difference: while the weird architecture of the 2600 and the extended memory on 6502-based systems made things tougher on programmers, they didn't affect end users. On a 2600 you plug in a cart and it works. On a 6502 your program either uses the extra memory or it doesn't.

 

DOS PCs made memory segmentation not just a programmer's problem but also an end user's problem. "Do I need EMS, XMS or both? How should it be configured?" "Which TSRs can I LH? Can I free up enough of the bottom 640K to get my new game to run?" I never had to think about such things on any other system I used.
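For anyone who never had to do that juggling, it looked roughly like this in a DOS 5/6-era CONFIG.SYS and AUTOEXEC.BAT. HIMEM.SYS provided XMS (memory above 1 MB), EMM386 provided EMS emulation and upper memory blocks, and DEVICEHIGH/LH (LOADHIGH) pushed drivers and TSRs out of the bottom 640K. This is a generic sketch, not any particular machine's working config, and the driver paths are illustrative:

```
REM --- CONFIG.SYS ---
DEVICE=C:\DOS\HIMEM.SYS
REM HIMEM provides XMS, the extended memory above 1MB
DEVICE=C:\DOS\EMM386.EXE RAM
REM EMM386 provides EMS emulation plus upper memory blocks (UMBs)
DOS=HIGH,UMB
REM load most of DOS itself outside the bottom 640K
DEVICEHIGH=C:\MOUSE\MOUSE.SYS

REM --- AUTOEXEC.BAT ---
LH C:\DOS\SMARTDRV.EXE
REM LH (LOADHIGH) pushes a TSR into a UMB to spare conventional memory
```

Whether a given game then ran depended on exactly which mix of EMS, XMS and free conventional memory it demanded, which is why every DOS gamer ended up with a collection of boot floppies.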

  • Like 1
