DrVenkman

1200XL Mysteries (Continued)


I had one of my inquiries come back.

 

I asked someone for their personal perspective which might be suitable to reprint (or, that failing, then for off-the-record information which I could use as background material) on the combined topic of Atari/Apple/Sweet-16/WDC/6502/65816. I provided some straightforward background material, I shared my suspicions about some of the interactions between the various parties, and I asked for his own thoughts/insight/perspective on the topic. The reply was from Bill Mensch, who was part of the team at MOS (led by Chuck Peddle) that created the 6502. He's also the designer of the 65816, its 16-bit successor.

 

He really didn't have much engagement with Apple or Atari, and he certainly didn't have any details about this topic in particular (although he was familiar with Steve Wozniak's "Sweet 16" 16-bit virtual extension for the 6502). Normally, if I received a reply that didn't give a strong confirmation or denial, I'd keep it in my back pocket for a while, but Bill's response just struck me as being too good not to share. In fact, I feel a bit guilty over what a great effort he put into his response. It was such a finely crafted and detailed perspective from an historically significant individual in microcomputing; it would have felt wrong to keep it for myself.

 

I'm going to see if I can hunt down someone who can take Bill's response and do some justice with it. (Suggestions are welcome and appreciated!) If anyone would like to republish this (or you know someone who would), please be my guest.

 

Bill's email is below.

 

Hi Josh,

Thanks for the request to clarify some of your story below.

First of all, I do remember Woz’s “Sweet 16” project. That said, I never looked at that spec and Apple never promoted that spec to me in our discussions back in the day. I recently briefly looked at Woz’s Sweet 16 spec and came away scratching my head about the intentions behind his design.

The 6502 and its variants, like the 6507 that Atari used in its games, were in fact not what Chuck Peddle had in mind when he specified the 6501. Chuck is not a semiconductor engineer but rather a marketing type with a strong systems background. Chuck wanted a PDP-11-type processor with the pinout of the 6800 so he could cannibalize the marketing and sales efforts of Motorola and the DEC system expertise. This made sense to Chuck and Rod Orgill. Rod’s strength was logic design, and he was my mentor in that regard.

It was my idea to create the 6502, which combined the 6501 with a single-phase clock input. The 6501 required a two-phase non-overlapping clock generator like what the 6800 and Intel 8080 needed at that time. On the surface this is a simple change. That said, others in the industry had extreme difficulties making a single-phase clock input with non-overlapping internal clocks. Because I chose all of the transistor sizes and cell designs for the registers and decode logic, I knew the timing. So this actually was a simple change for me.

So why am I bringing this up in response to your story and questions about the history of the Sweet 16 designs at Apple and Atari? The W65C816 and W65C802 follow the same basic logic as adding a clock generator to the 6501 to create the 6502. The basic design criterion for the ‘816/802 was to create a larger memory map for a 6502-compatible CPU. The ‘802 was basically a 16-bit 6502 with added instructions and added register size. I received no input from either Apple or Atari. They had their own interests, some of which you have stated below.

My interest for the ‘816 was to create a “virtual memory” processor along the lines of the Prime Computer virtual memory processors in the Prime 300 and Prime 599 series. I purchased a Prime 550 for simulation of my chips using a couple of programs I had, called Logic and Circuit, for logic simulation and circuit simulation. This was before Verilog and Spice existed.

OK, so the Data General Nova and Data General Eclipse systems were what I used to design the masks for the W65C02 and W65C816 CPUs.

The Nova had 16K bytes of main memory with a 5 megabyte hard drive to run the Calma GDS1 mask design software. When I upgraded to the Eclipse for the Calma GDSII mask design tools I then had a CPU with 256K bytes of main memory and a 25 megabyte hard drive. Back when I was designing the ‘816, 16 megabytes of main memory was larger than the minicomputer system I was designing with, in fact it was equivalent to the hard drive size.

To handle the virtual memory concepts I added the ABORTB interrupt input so the ‘816 could repair the physical memory needs of the system by aborting the instruction without losing register contents in the process, and then rerunning the instruction when physical memory was placed in the memory map. By the way, ARM engineers were the only people that ever asked me about ABORTB, which they emulated in their ARM CPUs back in the day.
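The abort-and-rerun sequence described here can be sketched in miniature. This is a hypothetical Python illustration of the pattern only (the class, names, and fault model are invented for the sketch; it is not WDC code): a faulting instruction leaves registers untouched, a handler maps the missing page, and the same instruction is re-executed.

```python
# Hypothetical sketch of the ABORTB abort-and-rerun pattern: a faulting
# instruction preserves register state, the handler maps the missing
# page, and the CPU reruns the same instruction.

class PageFault(Exception):
    def __init__(self, page):
        self.page = page

class TinyCPU:
    def __init__(self):
        self.a = 0           # accumulator
        self.pc = 0          # program counter
        self.mapped = set()  # pages present in physical memory
        self.memory = {}     # page -> value, for illustration

    def load(self, page):
        """LDA-style load; faults if the page is not mapped."""
        if page not in self.mapped:
            raise PageFault(page)     # abort: registers unchanged
        self.a = self.memory.get(page, 0)

    def run_instruction(self, page):
        saved_pc = self.pc            # state preserved across the abort
        try:
            self.load(page)
            self.pc += 1
        except PageFault as fault:
            self.mapped.add(fault.page)  # handler: map the page in...
            self.pc = saved_pc           # ...then rerun the instruction
            self.load(page)
            self.pc += 1

cpu = TinyCPU()
cpu.memory[3] = 42
cpu.run_instruction(3)    # faults once, is rerun, then succeeds
print(cpu.a, cpu.pc)      # -> 42 1
```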

Also, I had a student and a professor at the University of Arizona help me understand more about compilers. They recommended that I strengthen stack processing, and that is where my new stack addressing modes came from. I don’t think that anyone has ever used those stack addressing modes.
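Stack-relative addressing of this kind lets a compiler reach arguments and locals as fixed offsets from the stack pointer rather than as absolute addresses. A rough Python analogy of the '816's "offset,S" style of operand (the function name and flat memory model are invented for illustration; on the real hardware the stack also grows downward):

```python
# Rough analogy of stack-relative addressing ("LDA offset,S" on the
# 65816): compiled code addresses its locals and arguments as offsets
# from the stack pointer, with no absolute addresses baked in.

def lda_stack_relative(memory, sp, offset):
    """Load the byte at address SP + offset."""
    return memory[sp + offset]

memory = [0] * 256
sp = 0xF0              # stack pointer after pushing two one-byte arguments
memory[sp + 1] = 7     # first argument
memory[sp + 2] = 9     # second argument

# The callee reads its arguments without knowing where the stack lives:
print(lda_stack_relative(memory, sp, 1) + lda_stack_relative(memory, sp, 2))  # -> 16
```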

Let’s go back to the Atari relationship I had at that time. Well, as you may or may not know, the W65C02 uses one extra cycle in the decimal mode math operations. I used an extra cycle so I would avoid the patent that MOS Technology owned at that time. This avoided infringement of my own patent. The extra cycle prevented the W65C02 from being used in the Atari games, since the cycle count had to be exactly the same; Atari did some housekeeping during the retrace time of the CRT/TV displays at that time. The Asteroids game, for example, broke up the asteroids. When I designed the W65C816 I figured out a way I could remain timing-compatible with the 6502 and not violate my own patent.
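Decimal mode here means packed BCD: each nibble of a byte holds one decimal digit, and any digit sum past 9 needs a per-nibble +6 adjustment. That adjustment step is roughly where the extra cycle went. A sketch of the arithmetic in Python (illustrative only, not cycle-accurate, and the function name is invented):

```python
# Illustration of 6502-style decimal (BCD) addition: each nibble is one
# decimal digit, and a digit sum above 9 gets a +6 correction so it
# carries into the next digit properly.

def bcd_adc(a, b, carry_in=0):
    """Add two packed-BCD bytes, returning (result_byte, carry_out)."""
    lo = (a & 0x0F) + (b & 0x0F) + carry_in
    if lo > 9:
        lo += 6                      # correct the low digit past 9
    hi = (a >> 4) + (b >> 4) + (lo >> 4)
    lo &= 0x0F
    if hi > 9:
        hi += 6                      # correct the high digit past 9
    carry_out = hi >> 4
    return ((hi & 0x0F) << 4) | lo, carry_out

print(hex(bcd_adc(0x19, 0x28)[0]))   # -> 0x47   (19 + 28 = 47)
print(bcd_adc(0x95, 0x10))           # -> (5, 1) i.e. 105 with carry set
```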

So as you may be able to detect, there is a lot more to designing a CPU than just a software emulation like Woz’s Sweet 16 or the Atari Sweet 16, from a semiconductor standpoint.
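For contrast, a software scheme like Sweet 16 is just an interpreter: bytecode that gives an 8-bit machine a set of sixteen 16-bit virtual registers. A toy Python sketch of that idea (the opcode set and encoding here are invented for illustration and are not Woz's actual format):

```python
# Toy sketch of the Sweet 16 concept: an interpreter on an 8-bit CPU
# exposing sixteen 16-bit virtual registers. Opcode names and encoding
# are invented for this example.

def sweet16(program):
    regs = [0] * 16                  # sixteen 16-bit virtual registers
    pc = 0
    while pc < len(program):
        op, *args = program[pc]
        if op == "SET":              # SET Rn, constant
            n, value = args
            regs[n] = value & 0xFFFF
        elif op == "ADD":            # ADD Rn, Rm -> Rn += Rm (mod 2^16)
            n, m = args
            regs[n] = (regs[n] + regs[m]) & 0xFFFF
        elif op == "RTN":            # return to native code
            break
        pc += 1
    return regs

regs = sweet16([
    ("SET", 1, 0x1234),
    ("SET", 2, 0xF000),
    ("ADD", 1, 2),                   # 0x1234 + 0xF000 wraps to 0x0234
    ("RTN",),
])
print(hex(regs[1]))                  # -> 0x234
```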

This commentary may lead to yet other CPU considerations, like the W65C832. I specified 5-10 different 32-bit CPUs and in the end decided not to create any of my specifications in actual silicon. The reason being, a 32-bit processor requires orders of magnitude more engineers to realize the system needs of a 32-bit machine. My example is that IBM+Apple+Motorola with the PowerPC could not dislodge Intel from the PC market. Apple eventually went to the x86 architecture.

Another interesting point concerning 32-bit and beyond CPUs is the ARM ISA. ARM engineers came to me in 1983 asking me to design their RISC processor. When I refused, for the reason I already stated, they reached out to Apple and VLSI Technology to design their RISC processor, known at the time as the Acorn RISC Machine (ARM). I had taught Xerox, Burroughs and Kodak engineers in CPU design before ARM approached me; they may have known that.

The first Internet hardware used the 6502, and a division of GTE asked me to add some features to the 6502 in the W65C02 for packet switching, for Internet support back in the day. GTE was a licensee of mine at that time. Memory Lock was one of the features I added for their systems.

So you see, Apple and Atari had their own ideas, and NOT working with me was one of those ideas. I met only once with Woz, for second sourcing. We met in their Mountain View offices, and that was the day I became the first fabless semiconductor company, so I could second-source my own microprocessor with GTE as the first source. NCR, a licensee on the W65C02 at that time, refused to manufacture the W65C816 for the Apple IIgs. Interesting footnote to the relationship that I/WDC had with Apple and Atari.

Now Nintendo is a different story. I licensed Ricoh, the camera and copier company of Japan. I gave Ricoh an exclusive license on the W65C816 processor for use in a Nintendo game system. That exclusivity meant that I would not license anyone else to manufacture a W65C816 for Nintendo: an exclusivity for only one company, Nintendo.

Also, Ricoh invited me to speak with about 100 engineers in Japan. Ricoh, who designed the chip set for the Super Nintendo, asked me what I would recommend they do to maximize the power of the W65C816 CPU. I suggested they use the W65C816 for the human interface for their games and the logic of their games, and have task processors for video and audio. The trick would be to coordinate these three basic functions. They apparently followed these ideas to create the SNES system.

Let’s circle back to Atari and the W65C816. Although the W65C816 could play all of the existing games back then, Atari never met with me in an effort to build a 16-bit game system.

I licensed a company in Taiwan on my W65C816 design that has built probably about a billion game system chips over the years, many times what Apple and Atari combined ever sold. Those systems are still being sold to this day. I received royalties off every one sold.

So you see, probably Apple could still be selling interesting education computers and Atari could be selling interesting game systems to this day, had they considered working with me. They apparently had bigger fish to fry, and the rest is history.

Now WDC is working on a technology approach that would let the system user download new CPUs with Universal Interface Adapters (UIAs) for use in a broad variety of systems. We will see what traction we get in the “connected system” world some call IoT.

Where are the Apple, CBM and Atari of old when we could use them for round II? :-)

Enjoy!!

All the Best,

-Bill

Edited by jmccorm

Outstanding insight straight from the source. Wow! Sometimes I'm a little slow, but I have a question. Does "Sweet Sixteen" WRT Atari imply a 16-bit machine? Bill's first-hand account with Atari seems not to support this (am I being slow?). If so, then the 1400XL, 1450XL and, more importantly, the 1000XL were always designed to use the 6502C SALLY. The Atari 1800XL (??) was really Lorraine (aka AMIGA), which would have been the real 16-bit Atari? Sound right?

Edited by ACML


That's what the last few posts have been all about ...

 

Short answer - one person seems to think so. Personally, I see nothing in the historical record as it exists, or in the design decisions made with regard to the XL line, that suggests this at all. My theory on the codename is that it was someone's idea of indicating that the computer line was "growing up" or maturing and evolving. If there are written records indicating Atari ever considered working any kind of 16-bit processor into the machine from the outset, or testimony from an engineer involved to that effect, I will revise my opinion.

 

Now, about that bad attempt to boost saturation at the expense of clarity in the Composite signal ... :)


 


SV 2.1 video upgrade fixes that nicely, very nicely for composite and chroma/luma. :)



 

Haha, I know how to fix it - I did the ClearPic 2002 mod on one of mine several years ago, but now have a UAV in that machine, which I like better (seems sharper to me, but I don't have any before/after pics). In fact, I have UAVs in two of my machines. :)

 

EDIT: Oh, hey - Retrobits.net is back up and running. Whew!


 


No doubt the UAV is probably sharper, but I love the color saturation I get with the fixed chroma circuit, and I'd have to see a UAV side-by-side to convince me it's better color-wise. And with SV 2.1, software 80 columns from SpartaDOS and TLWP are absolutely sharp and clear; if it can get better, I'm not sure I need it. But I'm rebuilding an 800XL for a friend (a victim of the notorious resin-brick PSU), so maybe you have convinced me to try the UAV for it, since it doesn't have a chroma boost circuit anyway. ;)

Edited by Gunstar



 

The 1090XL expansion for the 800XL and 14x0XL(D) clearly allowed for a bus-mastering 16-bit CPU. That isn't my unique assertion... go ahead and click on that link. I don't think that anyone is disputing that... right? The Sweet 16 Project is where the PBI interface was introduced into the 8-bit line. The Sweet 16 Project is what made external control with a 16-bit CPU possible on an Atari Home Computer. I don't think DrVenkman is questioning that? It's more a question of intent, it seems. Was a 16-bit upgrade actually a real goal of what they were working on, and was it an influence on the project's name? Or was Sweet 16 just a "coming of age" reference? (In that context, the "Sweet 8" 600XL seems problematic.)

 

Bill couldn't say either way. He was involved in sourcing the CPU for the Apple IIGS, but he was not involved in developing Apple's new system (and likely did not expect to be). But the IIGS was released in 1986. Atari changed hands in 1984, and the 1090XL was cancelled before they got far along in its development. It may have been too early for Atari to bring Bill in before Jack flipped the company upside-down. Or maybe Atari felt that it didn't need to work with Bill. Or it could have been a bench experiment, or nothing at all.

 

Evidence to support/refute any side of intent is pretty low. Still too many unknowns.

Edited by jmccorm


Awesome about the Retrobits site! By the way, DrVenkman, I'm well aware you know what you are doing fixing/upgrading Ataris and more... I'm still so delighted with the 1200XL version of SV 2.1, I preach it every chance I get. (Only for the 1200XL, mind you.)

Edited by Gunstar



 

No worries. No doubt something like SV 2.1 or ClearPic2K2 isn't a big deal for a well-armed hobbyist with an array of discrete components on hand and the time to do the mods. It's certainly cheaper than a $25 UAV board. :) That said, Bryan (the UAV inventor) has a long and storied history here on AtariAge doing video mods, scoping out the signals and posting the results. The UAV encapsulates all of that in a very easy-to-install package. I've got four of them in service (two in 1200XL's, one in an 800XL, one in a 4-Switch 2600) and a fifth one coming for installation in the 1088XEL I'm building.

 

Anyway, I'm just glad the Retrobits site is back up. My restored "Ugly Duckling" 1200XL may get a PBI interface one of these days and it's nice to see Bob's original article back on the web. ***scurries off to save a web-archive of the page ***


 


My Ugly Duckling is now a beautiful swan, with a real PBI port, all the fixes and upgrades, and now LED lighting. :grin: But I've always preferred the 1200XL aesthetically far more than other Ataris; it was its insides that needed beautifying. I've got two more to do yet, though.

Edited by Gunstar


But again, there is zero evidence that Atari planned ahead and was thinking of a 16-bit CPU upgrade card for the XLs, let alone one being thought of during the Z800/S16 design reviews.



 

That is correct, although I'd be surprised to hear that they hadn't anticipated that an external CPU card could be 16 bits.

 

Still, I phrased it as "evidence to support/refute any side of intent is pretty low. Still too many unknowns." The project's name is something that conjures images of 16 bits, it shares the name of a (then) well-known 16-bit virtual extension which ran on 8-bit hardware, and an external CPU on the PBI interface actually could be 16 bits. Was this a curious oddity, or intentional first steps in a migration to 16 bits? We don't have anyone who is authoritative and on the record explicitly saying that introducing 16-bit components was or was not an intent of the Sweet 16 Project, or that it had any relation to the name.

 

EDIT: The response from Bill was just an artifact of that search for someone to explain/confirm/refute these Sweet 16 questions, but it didn't change anything from where we were on the last page. I wasn't providing his response to advance or refute any significant point regarding Sweet 16. I pasted his response because I knew that other people would enjoy reading it.

 

EDIT2: Whether it supports or refutes a 16-bit connection, I'm more than happy to uncover new information and share what I learn. I'll see if I can better target a new group of people. I'm thinking Ajay Chopra (Systems Engineer) may be an excellent choice. In fact, I just grabbed his contact info. Here I go...

Edited by jmccorm


Atari was looking to the future, with talk of 16-, 24-, and 32-bit spitballing... 24-bit? Yeah, 24-bit........ wtf..... Luckily no one was writing it all down in the bathrooms, halls, cafeteria, or on workbenches; that would be weird.... 3/4 of Atari is gone forever.... sad.... very sad.... Plenty of promising ideas and stuff shit-canned, and zombie projects continuing in parallel to others that just kept going... poor management, poor chronicles, but great computers..

 

Now let's enjoy what we do have... K?

Edited by _The Doctor__
