bluejay

The Intel 80186?


I didn't even know something called the 80186 existed until the last time I re-watched The 8-Bit Guy's video on the Mindset computer. And apparently it's about as fast as the 80286, but it wasn't very compatible with the 8088 and therefore wasn't used in too many computers. What are some relatively well known computers that run on a 186 processor? Was it really that bad? What's the difference between it and the 286?


sigh.

 

*Deep-breath*

 

The 80186 was an evolution of the 8086 that added an 8089-compatible I/O controller, a built-in interrupt controller, timers, a wait-state generator, and chip-select lines. The 80188 is the same chip, except built around an 8088 core, with a similar reduction in the data-bus lines.

 

It was intended to be a solid cost-reduction of the 8086 that could be used in embedded applications, as it reduced the number of chips needed for a complete system implementation. 

 

However, since the 80186 design started immediately after the 8086, before the IBM PC (or its predecessor, the DataMaster/23) had even made it to market, Intel still thought that the instruction-trap design it was adopting for chips like the 8087 co-processor could also be used for things like the interrupt, DMA, and I/O controllers.

 

But the IBM PC didn't go that route. It used a cut and paste of the DataMaster/23 I/O architecture, built around the 8-bit 8255 (GPIO), 8259 (Interrupt), and 8237 (DMA) controller chips.

 

Because the 80186 used a completely different I/O map (and calling conventions) for all of these consolidated components from what was established in the IBM PC, there was no way to make any 80186 or 80188 design compatible with the IBM PC. There were some MS-DOS compatible machines that utilized this chip (e.g. the MAD and the Mindset), but they were not, and could not be, completely PC compatible. So the chips were essentially used in embedded designs (e.g. the Chroma Polaris synthesizer was built around an 80188).
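The mismatch is easy to see in numbers. The sketch below compares the PC's port map with the 80186's integrated Peripheral Control Block; the addresses are recalled from the respective datasheets and the grouping is simplified, so treat them as illustrative rather than authoritative:

```python
# Illustrative sketch: where equivalent functions live on an IBM PC
# versus inside the 80186's integrated Peripheral Control Block (PCB).
# Addresses recalled from the datasheets; treat as approximate.

IBM_PC_PORTS = {
    "DMA controller (8237)":       range(0x00, 0x10),
    "Interrupt controller (8259)": range(0x20, 0x22),
    "Timer (8253)":                range(0x40, 0x44),
    "GPIO/keyboard (8255)":        range(0x60, 0x64),
}

# The 80186's on-chip peripherals live in a 256-byte control block that
# defaults to the top of I/O space (base 0xFF00) and is *relocatable*
# via a register near the end of the block.
PCB_BASE = 0xFF00
I80186_PCB_OFFSETS = {
    "Interrupt controller": range(0x20, 0x40),
    "Timers":               range(0x50, 0x68),
    "Chip selects":         range(0xA0, 0xAA),
    "DMA controller":       range(0xC0, 0xE0),
    "Relocation register":  range(0xFE, 0x100),
}

def i80186_ports(base=PCB_BASE):
    """Absolute I/O addresses of the 80186 on-chip peripherals."""
    return {name: range(base + r.start, base + r.stop)
            for name, r in I80186_PCB_OFFSETS.items()}

# No overlap: a PC program banging on port 0x20 to talk to the 8259
# hits nothing on an 80186 design, and vice versa.
pc = {p for r in IBM_PC_PORTS.values() for p in r}
i186 = {p for r in i80186_ports().values() for p in r}
assert pc.isdisjoint(i186)
```

And because the control block is relocatable, two different 80186 designs didn't even have to agree with *each other* on where the peripherals sat.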

 

The 80286 was a different design. It started from the 8086, increased the address-bus size (while still maintaining a segmented memory model), demultiplexed the address and data buses, and added memory protection, but it did not incorporate the chip consolidation that had occurred with the 80186. A compatible hardware architecture could therefore be built around it, similar to the original IBM PC, and thus it was selected by IBM for the PC/AT. The rest is history.

 

15 minutes ago, tschak909 said:

So Intel made the 80186 in hopes that the PC would use it to simplify the motherboard and reduce the chip count, but IBM decided to use the inferior 8088 to cut costs? Did the 80188 come before the PC as well?

To clarify, the 80186 was an 8086 but with a bunch of features built in to reduce the chip count on the motherboard, but was not compatible with the 8086 and 8088 at all. Right?

Then Intel realized that the 80186 went horribly wrong so decided to make an enhanced 8086 that would still be compatible with the 8086 yet more powerful, then called it the 80286?

7 minutes ago, bluejay said:

Intel did NOT intend the 80286 to be used in personal computers. They had intended for it to be used in multi-user minicomputer setups (contemporaneously, this would have been something like the Altos multi-user terminal systems that were commonly being sold by various VARs at the time). It wasn't until IBM selected it for the IBM AT in 1984 that Intel drastically shifted more effort toward extending the 8086 design. (In fact, until 1984 Intel had been hoping that the future of large-scale computing would be the 8800, aka the iAPX 432 chip family.)

 

-Thom

 

(FWIW, I know the 8-bit guy, and the Mindset in that video was loaned to me, by its owner Rob Ivy, to write new software for. I wrote the first new pieces of software for that machine in over 35 years.)

 

-Thom

1 minute ago, tschak909 said:

Seriously? A multiuser minicomputer in 1982? I mean, I guess it kinda makes sense, but I'd have imagined by 1982 home computers were becoming more and more popular and minicomputers were becoming less and less popular. So Intel designed it for use on a powerful minicomputer, but then IBM decided they'd use it on their PC/AT. Then Intel started to focus on developing their processors for use on home computers?

Then... did Intel make a revised version of the 80286, or did Intel's new home-computer-oriented microprocessors start with the 80386?


The 80386 was a from-the-ground-up, 32-bit-wide redesign that established a page-table memory model. It also took an exploit that some software engineers had been using on the 80286 (the LOADALL instruction, which basically bypassed any and all memory protection contexts) and made a working virtual 8086 (VM86) mode for compatibility with 8086 CPUs (the 80386 boots up in real mode).

 

(There was also an embedded form of the 80386, the 80376, which lacked the VM86 mode)

 

-Thom

 


Huh, then the 80386 was an enhanced 32-bit 80186 that also happened to be compatible with the 8086?


/me facepalms.

 

The 80386 was the 80286 architecture extended out to 32 bits, with explicit memory segmentation lifted and a working virtual 8086 mode that allowed existing 8086 software to run in a virtual machine (hence the trick you always saw in Windows, of an MS-DOS prompt running alongside Windows, just working).

 

-Thom

18 minutes ago, bluejay said:

Seriously? A multiuser minicomputer in 1982? I mean, I guess it kinda makes sense, but I'd have imagined by 1982 home computers were becoming more and more popular and minicomputers were becoming less and less popular.

Sure...for some applications. The computer camp that I attended in 1982 and 1983 was part of a regional college that utilized VAX minis. There was a rank of VIC-20s (and later, C64s) for the kids, but I didn't bother with those. People forget how balky, unreliable, and EXPENSIVE connectivity was in the early '80s, and programs that relied upon it, like e-mail and the PLATO programs ("Moria" received heavy play), didn't run well on home computers (or were just prohibitively expensive to use). Minis didn't have this problem. BBSing was too expensive for me until about 1987, the cost of using CompuServe was always prohibitive, and you could just forget about long-distance calling until well into the late '90s. And this was in Canada's biggest market! No, I can see why minis stuck around for a while.


Curiously, most of the standalone computers you found the 80186 in were various school computers not compatible with each other. Many probably ran CP/M-86, as mentioned, and many seem to have chosen the 80186 in order to NOT be PC compatible, for various reasons.


In high school we had a computer called the "Icon": an 80186-based computer with a Unix-like OS, networked to a file server. They were amazing — decent graphics, a built-in trackball, and speech synthesis.

 

https://en.m.wikipedia.org/wiki/ICON_(microcomputer)

 

 

The Tandy 2000 also used the 80186 CPU. It came out before the Tandy 1000.


Neat.  I have only ever seen the 80186 in modems and some old router.

6 hours ago, tschak909 said:

virtual 8086 mode that allowed for existing 8086 software to run in a virtual machine (and thus the trick you always saw in Windows of an MS-DOS prompt that would run alongside Windows, just working)

This leads to a pet peeve of mine, that so many people still insist that the Windows 9x kernel is just a DOS "shell" when, in fact, as 9x boots it "elevates" DOS to a virtual environment.  This has been published in Microsoft technical documents for a couple of decades, and Mark Russinovich has explained in great detail, but people still throw that old myth around.

6 hours ago, tschak909 said:


 

The 80386 was the 80286 architecture, extended out to 32 bits, explicit memory segmentation lifted, and a working virtual 8086 mode that allowed for existing 8086 software to run in a virtual machine (and thus the trick you always saw in Windows of an MS-DOS prompt that would run alongside Windows, just working)

 


Then, the 80286 was compatible with the 8086 by default, but the 80386 needed a virtual 8086 mode to be compatible with it?


 

1 minute ago, bluejay said:


Not all 8086 instructions could be executed in protected mode.

-Thom

1 minute ago, tschak909 said:

 


On the 80286?


Correct, same as on all future chips. Basically IBM _tried_ to implement a virtual 8086 mode on the 80286 (initially during the development cycle of TopView, and later for OS/2), and (short answer) it didn't work all that well. Different steppings (revisions) of the chip exhibited slightly different behavior for LOADALL (and other undocumented instructions).

 

-Thom


Yup.  The specific problem was that there was no (official, proper) way to LEAVE protected mode and return to real mode once you entered it.  The LOADALL instruction could be used to get access to real mode again, "kinda sorta," but it was a barrel of worms.  This is the real reason why the protected mode of the 286 is not really supported by much of anything, and nearly all period software treats the 286 like a really fast 8086.

 

There were some exceptions to the rule, like certain memory managers written specifically for the 286, such as "EMM286.exe", which leveraged the extended memory above 1 MB without actually leaving real mode. (It abused LOADALL to gain access to extended memory and simulate EMS memory for applications that used it, without entering proper protected mode. Since you could not return to real mode once you did, this was necessary to keep using DOS software.) There were XMS managers as well that did similar memory-access shenanigans.

 

Intel fixed this issue in the 386 by allowing the CPU to leave protected mode (and by adding v86 mode), and as previously mentioned, this is why you can run a DOS application in a window, do the preemptive/cooperative multitasking thing, and don't have to hunt for obscure memory managers to make it all nice and happy.  The 386 implemented a proper MMU and a functioning v86 mode.

 


My TRS-80 Model 2000 had a 186.  It was so close to being a compatible, yet so far away.  Graphics were the biggest issue as it wasn't CGA or EGA compatible.  Floppy drives were also 720K each instead of the standard 180K or 360K.

It was a Radio Shack corporate computer that had every single option in the catalog added to it.  The RS in my local mall sold it to me for $100 bucks...hi res color monitor and modem included too.  At one point I added everything up and it totaled $10,000+, and it wasn't even a discontinued system yet.

If it was being used in a business environment or for an engineer using CAD, it was the ultimate system (640x400 color graphics in 1984).  I was a teenager at the time...and well, it didn't run games.  My entry-level 1000EX was 1000 times better because it ran everything and all software was free on local pirate BBSs.

The 2000 was like a Lamborghini crippled with skinny bias ply tires.  All that power with no good way to make use of it due to limited software.

 

If I still had it today, I'm sure I would appreciate it much more.  I let my best friend use it so we could play Trade Wars on the local BBSs and never bothered to get it back.  He thinks it's still stashed away in his parents' house.  Might be cool to get it back one day and experience it again in my 50s.

3 hours ago, Turbo-Torch said:


If I recall correctly, the Model 2000 was supposed to be a whole new computer, one that happened to run MS-DOS and was better than the IBM PC. It was also more expensive than the Tandy 1000, as it was more powerful.


The Tandy 2000 came out before the Tandy 1000 - so really the statement should be “The Tandy 1000 was cheaper because it wasn’t as powerful as the 2000” - But they really were 2 different markets.  The 2000 was to be the ultimate business machine.  The 1000 was a reaction to (and a decent clone of) the IBM PCjr.

On 9/22/2020 at 10:05 PM, bluejay said:

Then, the 80286 was compatible with the 8086 by default, but the 80386 needed a virtual 8086 mode to be compatible with it?

Pretty much all x86 chips, from 8086 even to Core, begin in "real mode".  That's the 8086 proper mode.  1MB address range.  No page tables, nothing special. 386 and up allowed 32-bit register access, but memory access was still limited to 1MB.

80286 added "protected mode".  This adds security rings for OS enforcement of user access, and replaces direct segmentation with a concept of "selectors".  Each segment "selector" is assigned to a specific start address and has a specific length, up to 64K per selector.  (It still has 16-bit registers and instructions).  However, the selectors could be placed anywhere within 16MB of address space, so you at least get more memory in this mode.  Protected mode can only be exited by reset, which is why it has to run a completely different OS that is not DOS compatible.
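The difference between the two addressing models can be sketched in a few lines. This is a toy model — no rings, no descriptor flags, and a made-up one-entry descriptor table — just to show segment*16+offset versus selector-to-descriptor lookup:

```python
# Toy model of 8086 real-mode addressing vs. 286-style protected-mode
# "selectors". Simplified for illustration: no privilege rings or
# descriptor flags, just base + limit.

def real_mode_addr(segment, offset):
    """8086 real mode: physical = segment*16 + offset, in a 20-bit space."""
    return ((segment << 4) + offset) & 0xFFFFF  # wraps at 1 MB

# In protected mode a segment register holds a *selector*: an index
# into a descriptor table, which supplies the base and limit.
DESCRIPTOR_TABLE = {
    0x08: {"base": 0x110000, "limit": 0xFFFF},  # above 1 MB, in the 16 MB space
}

def protected_mode_addr(selector, offset):
    d = DESCRIPTOR_TABLE[selector]
    if offset > d["limit"]:
        raise MemoryError("general protection fault")
    return d["base"] + offset

# Real mode: the same physical byte is reachable via many segment:offset pairs.
assert real_mode_addr(0xB800, 0x0000) == 0xB8000
assert real_mode_addr(0xB000, 0x8000) == 0xB8000

# Protected mode: a descriptor can point above 1 MB, but access past
# the limit faults instead of silently wrapping.
assert protected_mode_addr(0x08, 0x10) == 0x110010
```

The descriptor values here are invented for illustration; the point is only that protected-mode segment registers stop being raw paragraph addresses and become table indices, which is exactly why real-mode code that does segment arithmetic breaks there.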

 

80386 added 32-bit instructions, 4GB addressing, virtual memory and port mapping, and virtual 8086 mode.  It starts in REAL 8086 mode.  Virtual 8086 mode is a special hybrid of Protected Mode and 8086: interrupt service routines in virtual 8086 mode return the chip to full Protected Mode to service the interrupt.  Windows or other v8086-supporting systems will typically return the system to virtual 8086 mode at the location specified in Real Mode for the same interrupt, but that is not always the case.  For instance, Windows will run its own disk service routines instead of letting the real-mode BIOS attempt to manipulate the disk drive.  The 80386 could also enter or leave real 8086 mode, protected mode, or virtual 8086 mode at will.

One trick used in early DOS games was to enter protected mode, set the segment registers for huge swathes of memory, then return to Real Mode.  This exposed an oversight where the processor would continue to allow access to a segment's full range even after the switch to Real Mode, but only until you changed that segment register again.  This was dubbed "Flat Mode", though it seems to be officially entered into The Wiki as "Unreal Mode".
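The "Flat Mode"/"Unreal Mode" trick hinges on the hidden descriptor cache: each segment register carries a cached base and limit that are refreshed only when the register is reloaded. A toy model of just that behavior (the class and method names are made up):

```python
# Toy model of the 386 "unreal mode" trick: the CPU keeps a hidden
# descriptor cache per segment register, refreshed only on reload.

class SegReg:
    def __init__(self):
        self.value = 0
        self.cached_limit = 0xFFFF       # power-on real-mode limit: 64 KB

    def load_real_mode(self, value):
        # Reloading the register in real mode snaps the cached limit
        # back to the 64 KB real-mode default.
        self.value, self.cached_limit = value, 0xFFFF

    def load_protected(self, limit):
        # Reloading in protected mode pulls the limit from a descriptor.
        self.cached_limit = limit

ds = SegReg()
ds.load_protected(0xFFFFFFFF)        # in protected mode, set a 4 GB limit
# ...switch back to real mode WITHOUT touching DS...
assert ds.cached_limit == 0xFFFFFFFF # still 4 GB: "unreal mode"

ds.load_real_mode(0x1000)            # any reload ends the party
assert ds.cached_limit == 0xFFFF
```

This is why the trick survived only "until you changed that segment register again", as described above.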

Whether you understand all of this or not, the upshot is that since every chip starts in real 8086 mode, they all begin compatible with the 8086.  But each one provides a superset of its predecessor, which adds new features.


Earlier this evening I was working on cleaning out some stuff I never use. In amongst the reference books is an Intel "Microsystem Components Handbook: Volume I" ca. 1984. Many moons ago, I first heard of the '186 from that book. 

It's looking for a new home. PM me soon if you're interested in adopting it.


WRT the 80186, Wikipedia seems alright:

 

https://en.wikipedia.org/wiki/Intel_80186

 

And to be clear, the 80186 is an 8086 (with a few added instructions) plus peripherals in the same package (like a microcontroller). So it can run 8086 machine code, but it is not a "PC-DOS"-compatible replacement, because of those very same integrated peripherals.

If you were to write 8086 code that did not touch any peripheral — say, to compute a math formula entirely from/to memory — then as long as the code is written "PC-relative" (relocatable), it would run unchanged on the 80186.


Aside from programs that wanted to talk directly to hardware (like games talking to video cards), the DOS interrupt-vector-table methodology should have allowed any random old bit of peripheral to be used just fine, as long as nothing was sitting in the spot used for the vector table, and as long as the 1 MB memory map remained consistent.

 

That this did not happen suggests that the low 1 MB memory area was not conserved.  I don't rightly know that, but I am open to correction.

 

Otherwise, for the same reason you can slap in a SCSI card that has its own controller BIOS (the controller BIOS drops in a vector-table entry that intercepts INT 13h calls and tells the system how to use the card through the baked-in controller BIOS via the INT 13h interface, etc.), it should have been possible to run MS-DOS on the system with a special system driver (one that sets up the vector table and handlers).
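That vector-chaining mechanism can be sketched as a table of handlers, where an option ROM hooks INT 13h by saving the old vector and chaining to it. A rough model (the handler names and request strings here are made up for illustration):

```python
# Rough model of BIOS interrupt vectoring: a table of vectors, where an
# option ROM (e.g. a SCSI controller BIOS) hooks INT 13h by saving the
# old vector and installing its own handler in front of it.

INT_13H = 0x13
vectors = {}

def motherboard_bios_disk(request):
    # The handler the system BIOS installs at boot.
    return f"motherboard BIOS handled {request}"

vectors[INT_13H] = motherboard_bios_disk

def install_scsi_rom():
    old = vectors[INT_13H]                 # save the previous vector
    def scsi_handler(request):
        if request.startswith("scsi:"):
            return f"scsi card handled {request}"
        return old(request)                # chain to the old handler
    vectors[INT_13H] = scsi_handler        # hook the vector

install_scsi_rom()

# Software that goes through the vector table sees both devices...
assert vectors[INT_13H]("scsi:read sector 0") == "scsi card handled scsi:read sector 0"
assert vectors[INT_13H]("floppy read") == "motherboard BIOS handled floppy read"
# ...while software that bangs on bare metal bypasses the chain entirely.
```

Which is exactly the problem described next: the chain only helps software that actually calls through it.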

 

The issue was that basically everyone was trying to talk to bare metal back then, for performance reasons.  You can't talk to bare metal that is not there, even if there is a syscall handler that facilitates the same kind of thing with different metal underneath.

 

 

There is a similar kind of issue in the wild today, concerning Chromebook hardware.  These systems are typically not intended to run Windows, and as such are not constructed with a proper PCI bus architecture, instead using a loosely defined variant of an ACPI bus specification.  Things like MrChromebox's custom firmware literally EMULATE a PCI bus, using the system-management controller baked into modern CPUs, to facilitate compatibility with these operating systems.  It's a testament to how far HAL technology has come since the '80s that these devices are able to run modern software and are not limited to just ChromeOS.

 

 

 

