phuzaxeman

classic battle atari 8bit vs commodore 64

Recommended Posts

The company was liquidated, so nobody really knows what the figures were. The numbers we hear of today come from fans, ex-employees, and computer historians, which explains all of the confusion.

We may not have exact figures for Atari 8-bit sales because of all the different models. I think the 800XL sold over 2 million units. Wikipedia states 4 million units for the 8-bit line.

 

 

Wait, was there a different version of capitalism in the U.K. and U.S. just a few decades ago? Granted, the ex-communist clusterfu*kistan I grew up in made sure we were shielded from things like stocks, options, ROI and dividends. Though in the last (fifth) year of my university, we had a subject covering the stock market, which was literally the only official course on the matter that could be taken without the risk of ending up on a watch list plus a permanent mark on your police record.

 

Surely, though, the people who invested in publicly traded companies were entitled to quarterly reports on the company's well-being? That stock-market thingy is well over a century old, after all.

 

In the early '90s, when CNN could officially be broadcast at home, its financial segment looked (IIRC) very similar to the present-day ones. It's not rocket science, after all: either your stock goes up or your investment goes belly up.

 

 

 

Unless one of two things happened:

1. Atari wasn't a publicly traded company

2. Quarterly reports didn't need to have those numbers (ridiculous, but very remotely probable)

 

 

Given the predominantly American population of AtariAge, surely there's a couple dozen people here who can clear this up? And no, I most certainly do not intend to look it up on the wiki or other sites serving their particular agenda.

 

Both Atari and Commodore went belly up around 1995, and Apple had a lot of problems. I know backward compatibility was mentioned here before; all three companies are guilty of this one. When moving from the 6502-based computer lines to the 68000-based 16/32-bit lines, they practically shot themselves in the foot. 16-bit versions of the 6502 were available back in the 1980s, and Commodore even controlled MOS, which manufactured the 6502. We had the 65816 and the MOS 6509. Someone could easily have built a 16-bit machine that ran native 6502 code. We know the 68000 was a powerful CPU versus a 65816, but with competition and demand, I am sure MOS or WDC would have made their chips faster and more powerful over time.

 

We know Apple did the IIGS, but that was 3 years after releasing the Macintosh, and Apple was already abandoning the 6502-based platform, so less software was being made for it.

 

Companies would have needed backward-compatible video, audio, and I/O support. Atari already went from the CTIA to the GTIA, and weren't they trying to combine ANTIC and GTIA into one chip?

 

 

Now, something I see on these threads: people are comparing the graphics and sound. But these are computers, so how useful are they? How easy are they to use? A few Atari users could not make sense of Commodore's OS, needing to memorize and enter commands to do disk functions instead of using the Atari DOS Disk Utility Program (DUP). In native Atari BASIC, you also need to use commands to access functions.

 

What software is available, at what price, on what media? What peripherals are available, at what cost? I remember Commodore drives costing more than Atari ones back in the day.

 

Do the machines have word processors, financial programs, spreadsheets, and communications software? What programming languages are available? Those are very important as well.

 

Compatibility issues again, even before switching to the 16-bit 68000. Can my Atari 800XL or 130XE run a program made for the Atari 400? It may need a translator disk. Can the Commodore 64 run something made for the VIC-20 or PET? NO!!! That was my beef with Commodore.


That is a perfect example of what I thought when I was reading the thread title for the first time... ;-)

 

http://www.8bit-slicks.com/

 

And the next thing I saw was:

 

  • A friendly community. :evil:

Can it be? A community without squabbling? Just playing the same game among themselves? Even with different machines? :lol:

Stefan

 

Both Atari and Commodore went belly up around 1995, and Apple had a lot of problems. I know backward compatibility was mentioned here before; all three companies are guilty of this one. When moving from the 6502-based computer lines to the 68000-based 16/32-bit lines, they practically shot themselves in the foot. 16-bit versions of the 6502 were available back in the 1980s, and Commodore even controlled MOS, which manufactured the 6502. We had the 65816 and the MOS 6509. Someone could easily have built a 16-bit machine that ran native 6502 code. We know the 68000 was a powerful CPU versus a 65816, but with competition and demand, I am sure MOS or WDC would have made their chips faster and more powerful over time.

 

 

The Amiga was designed to use the 68000 from the beginning (before Commodore acquired them). They had a couple of divisions working on the machine, whilst Jay Miner and his team worked on the chipset. All of this work was based around the 68000.

 

They already had a couple of demos running on it by the time Commodore purchased the technology, so I assume it would have been too late to change the chipset's specs to use a different CPU. It was a monumental task to get an OS written for the machine in time and to convert those suitcase-sized breadboards into real hardware. They were able to do that because of the MOS plant. I don't believe there were many competitors around at that time capable of performing such a task and bringing it to market at an affordable price point, especially without their own semiconductor plant.


 

The Amiga was designed to use the 68000 from the beginning (before Commodore acquired them). They had a couple of divisions working on the machine, whilst Jay Miner and his team worked on the chipset. All of this work was based around the 68000.

 

They already had a couple of demos running on it by the time Commodore purchased the technology, so I assume it would have been too late to change the chipset's specs to use a different CPU. It was a monumental task to get an OS written for the machine in time and to convert those suitcase-sized breadboards into real hardware. They were able to do that because of the MOS plant. I don't believe there were many competitors around at that time capable of performing such a task and bringing it to market at an affordable price point, especially without their own semiconductor plant.

I think Jay Miner and his team wanted to avoid Jack Tramiel, because he would have gutted the Amiga and tried to make it as cheap as possible. When he came over to Atari and made the ST, who was his target market? Jack and his brother believed video games were evil, the education industry favored Apple, and the business world favored the IBM PC. The professional graphics people probably liked the Amiga better because it was a much better machine.

 

Something that started hurting Commodore and Atari even more by 1990 was that the IBM PC became as capable as all these other machines: EGA and VGA cards for graphics, the Sound Blaster for sound, plus it could run all the business applications. One caveat is that it took Microsoft a while, over several versions of Windows, to match what the Macintosh, ST, and Amiga had been doing years earlier. Commodore and Atari would have been better off dedicating themselves to making PC clones at that point.


I couldn't see Jack buying the Amiga anyway, and knowing what he was like (always looking to reduce the chip count), he would have thought there were too many chips in the machine. Steve Jobs actually thought the same.

 

Affordability is what pushed me away from the Amiga, and that's probably true for most users. There were certain things I couldn't do with the Amiga anymore, like browse the web or play the latest games. At that point the Amiga became too expensive to upgrade and maintain. PC it was for me.


We may not have exact figures for Atari 8-bit sales because of all the different models. I think the 800XL sold over 2 million units. Wikipedia states 4 million units for the 8-bit line.

Just a reminder: just because the wiki says something doesn't make it so.

The Wiki can be edited by pretty much anyone with an account.

Sales numbers were argued at length in another thread, and people seemed to want the magic total of 5 million machines.

So if you figure 1 million non-XL machines sold, you have... 4 million XL machines!

There really wasn't any evidence to support the 5 million number.

At best, a magazine (German?) posted some number but didn't even cite a source if I remember right.

 

 

...

Do the machines have word processors, financial programs, spreadsheets, and communications software? What programming languages are available? Those are very important as well.

Just about every popular 8 bit machine had spreadsheets, database programs, word processors, and multiple programming languages.

But that doesn't mean much if you wanted a specific program.

If you wanted Visicalc, dBase, WordStar, UCSD Pascal, etc... instead of Bob's CalcBaseWordPascal, then you were probably out of luck most of the time.

 

 

Both Atari and Commodore went belly up around 1995, and Apple had a lot of problems. I know backward compatibility was mentioned here before; all three companies are guilty of this one. When moving from the 6502-based computer lines to the 68000-based 16/32-bit lines, they practically shot themselves in the foot. 16-bit versions of the 6502 were available back in the 1980s, and Commodore even controlled MOS, which manufactured the 6502. We had the 65816 and the MOS 6509. Someone could easily have built a 16-bit machine that ran native 6502 code. We know the 68000 was a powerful CPU versus a 65816, but with competition and demand, I am sure MOS or WDC would have made their chips faster and more powerful over time.

 

Apple's biggest problems were caused by introducing a Mac with too little RAM and no internal expansion, as well as superior competition (PC, ST, and Amiga).

The IIgs didn't run a few II+/IIe programs, but that was a relatively small number of titles, almost all of which have since been patched.

There is literally a handful of known programs left that don't run on the IIgs.

Some of those patched versions are recent, but many have been around since before the IIgs was discontinued.

 

Calling the 65816 or the 6509 16-bit is a stretch, IMHO.

Yeah, the marketing material calls the 65816 a 16-bit CPU, but if it is, it's a 16-bit CPU on an 8-bit data bus, with instructions that take the same amount of time as an 8-bit CPU's.

Really, it's a 6502 with some 16-bit number support, and even that isn't as good as on other 8-bit CPUs.

The other new features such as larger stack, larger index registers, and stack relative addressing are probably the most useful additions.

 

The 6509 definitely isn't 16-bit, and its RAM bank switching is really odd, though it has some potential.

Accessing large amounts of data wouldn't have been difficult with the 6509.

Just set the bank you want to access, then use the LDA/STA (indirect),Y opcodes to read from or write to the other bank of RAM.

But splitting applications across 64K banks? That looks complicated.

Without the CPU ever appearing on a popular model, I'm not sure anything supported it.

The 6509's memory banking registers even conflict with the 6510's I/O port, which by itself would preclude its use in the C64 even if you were to add the I/O port externally.
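To sketch how that cross-bank LDA/STA scheme would work, here's a toy Python model of 6509-style banking, not real 6509 code. The register addresses ($0000 for the execution bank, $0001 for the indirect bank) and the reset bank are from memory, so treat the details as assumptions:

```python
# Toy model of 6509-style banking: 16 banks of 64K (20-bit addressing).
# A register at $0000 picks the bank normal opcodes execute from; a
# register at $0001 picks the bank that LDA/STA (zp),Y accesses touch.
banks = [bytearray(0x10000) for _ in range(16)]
regs = {"exec_bank": 15, "ind_bank": 15}   # both reset to bank 15, IIRC

def lda_ind_y(ptr, y):
    """LDA (ptr),Y -- the read goes to the *indirect* bank."""
    return banks[regs["ind_bank"]][(ptr + y) & 0xFFFF]

def sta_ind_y(ptr, y, value):
    """STA (ptr),Y -- the write goes to the indirect bank."""
    banks[regs["ind_bank"]][(ptr + y) & 0xFFFF] = value

# Copy a page from bank 2 to bank 3 without leaving the execution bank:
banks[2][0x4000:0x4100] = bytes(range(256))   # pretend data in bank 2
page = bytearray(256)
regs["ind_bank"] = 2
for y in range(256):
    page[y] = lda_ind_y(0x4000, y)
regs["ind_bank"] = 3
for y in range(256):
    sta_ind_y(0x4000, y, page[y])
```

The point being: bulk data access across banks is cheap (set $0001 once, loop), but running *code* spread across banks is where it gets painful.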

 

A lot of the people designing the CPUs, and machines had experience with mini-computers and/or mainframes.

They knew that 16 and 32 bit CPUs were the future for computers.

Compilers are the norm for development on larger machines, and the 6502 just doesn't support compilers well.

You could continue to try to compete with enhanced 8 bit CPUs, or cut your losses and move on.

 

Getting out from under the control of MOS was also a good thing no matter how it was done (65C02, 65816, 68000).

 


 

There seems to be a lot of comparing of demos on here, still pictures, but how about some software people actually use? Games in action, with YouTube videos.

 

Once again...

 

 

https://youtu.be/5AmE1TLyqis?t=94

 

I should mention that it actually looks better on a real monitor (like a 1084 for example).

Edited by Nebulon

The 6509 definitely isn't 16-bit, and its RAM bank switching is really odd, though it has some potential.

 

Yes, it probably is good for data storage in BASIC, but it seems rather hopeless when it comes to machine code, depending of course on how you lay out the rest of the memory map. As far as I know, the 6509 is only present in the CBM-II business computers, which themselves were somewhat short-lived, and there is a good reason why computers with 128K or 256K of RAM still need an additional 24K RAM cartridge plugged in to easily run any machine code that wants access to the system ROMs, I/O chips, and video matrix without bank-switching back and forth ad nauseam.

 

Unless of course BlueMoon_001, who doesn't seem to be with us any longer, really meant the Motorola 6809, which of course was not within Commodore's reach anyway. For that matter, I believe the 65816 is a WDC design, so on the path from the 6502/6510 to a 16-bit-ish version of the architecture they would have needed to source the CPU from another vendor anyway, just as they of course did with the far more powerful 68000.


EDIT: The closest I could find so far (not easy to browse through the C64 prod), is this (timestamp 4:10):

https://youtu.be/zpKcw7naKkw?t=250

Effects like this are mostly done using similar tech to A8 LMS.

 

You can make the VIC chip show a different charset and screen on each raster line. Combine those screens and charsets and you can show any line of the texture on any line of the screen.

The limit is the amount of memory you can spare, and the texture gfx ends up scattered all over memory (you use something like 32 bytes from one screen/charset combo).

 

There's a good explanation here:

http://codebase64.org/doku.php?id=base:twisters_x-rotators_and_waving_carpets


The sweet spot seems to be around 112x40, which takes 48,594 cycles (29.83 fps).

That sounds like you're using something similar to lda, ora, ora, ora, sta to combine texture pixels?

 

What zoom size are you looking for? Or to put it differently, what is the ratio between the smallest and largest texture line on screen?

Don't know if you know about it, but there's a cool demo technique (from the C64) to make gfx zoom in/out between 100-150%.

Larger ranges can be made from combinations of multiple texture sizes (similar to mip-map).

 

Say your texture is 8 pixels wide: "abcd efgh" (each letter is a single pixel).

You make 4 shifted versions of it:

"abcd efgh"
"bcde fgh_"
"cdef gh__"
"defg h___"

Then for example you draw this:

"abcd fgh_" - scaled down 1 pixel
"abcd efgh" - size 1:1
"abcd defg h___" - sized up 1 pixel

By expanding/contracting each byte like this you can get many different x-sizes of the original texture.

Instead of combining 4 pixels into a single byte, you "just use" pre-existing pixel combos :)

 

Hope it makes sense, or ignore if you knew about it :)

 

ps. Works on Atari and c64 doesn't matter ;)
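A quick way to sanity-check the trick, as a Python mockup with one letter per pixel and 4 pixels per "byte", exactly as in the example above (the helper names are mine):

```python
def shifted_versions(tex, shifts=4, pad="_"):
    """Pre-shifted copies of a texture row, one per pixel offset."""
    return [tex[s:] + pad * s for s in range(shifts)]

TEX = "abcdefgh"              # 8 pixels wide, 4 pixels per "byte"
V = shifted_versions(TEX)     # V[1] == "bcdefgh_", V[3] == "defgh___", ...

def byte(shift, i):
    """Byte i (4 pixels) of the shift-th preshifted copy."""
    return V[shift][i * 4:(i + 1) * 4]

print(byte(0, 0) + byte(0, 1))               # abcdefgh      -> size 1:1
print(byte(0, 0) + byte(1, 1))               # abcdfgh_      -> 1 px narrower ('e' dropped)
print(byte(0, 0) + byte(3, 0) + byte(3, 1))  # abcddefgh___  -> 1 px wider ('d' doubled)
```

On real hardware the "bytes" would of course be pre-packed graphics bytes picked by table lookup, not string slices; this just shows how choosing a different shift per byte stretches or squeezes the row.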


I would like to see one theme running through a long demo, so that it feels connected and is a viewing experience to go through, instead of disjointed things with text that says nothing much at all.

Have you seen this one?

 

And "Zelda" is maybe this one ?

 


Unless of course BlueMoon_001, who doesn't seem to be with us any longer, really meant the Motorola 6809, which of course was not within Commodore's reach anyway. For that matter, I believe the 65816 is a WDC design, so on the path from the 6502/6510 to a 16-bit-ish version of the architecture they would have needed to source the CPU from another vendor anyway, just as they of course did with the far more powerful 68000.

Bahaha, that didn't take long. Why the ban (was this account really a shill for, or a double of, petey)?


First of all, I have no idea who this BlueMoon guy was, nor is he currently associated with anyone in my group. I have my own suspicions, and I never responded to anything this person stated.

 

I had considered porting some stuff to the Commodore 64, but later decided it was too much trouble, as I am planning to start more games for the Atari 8-bit in the next few months. I still need to finish up some PC and Atari 7800 game ports. I might consider C64 ports if I got a lot of help with the first few games, teaming up with someone.

 

Seeing that Mythos Zelda port is making me start thinking about further upgrades for Secretum Labyrinth, to make the game look better and demonstrate what the Atari 8-bit can do.


As JamesD pointed out, you really cannot go by Wikipedia for accurate sales figures for Atari hardware, as anyone with an account can edit them, and some of the arguments for why certain figures should be used are classic strawman examples.

 

I've not looked at it for months, but if the figure for Lynx units sold still exists, based on the fact it was printed in a Retro Gamer magazine article and came from a figure suggested to ex-Atari marketing's Darryl Still, who simply went along with it... then I rest my case.

 

 

We've covered the issue of sales figures before on various threads on here, and the same sources are cited: vague potential numbers stated in various magazines, Atari documents giving region-specific numbers up to a certain date, and so on.

 

The Tramiels used to sidestep giving actual numbers when interviewed, saying it was not Atari policy to give out such information, and when asked for it for magazine articles, statements about UK market share would include all the Atari consoles (2600, 7800 and XEGS) lumped together.

 

 

You still see the likes of Retro Gamer magazine having bar charts of units sold when they do features on hardware, Atari included.

 

But I very much doubt it's intended as anything but a suggested figure.


Effects like this are mostly done using similar tech to A8 LMS.

 

You can make the VIC chip show a different charset and screen on each raster line. Combine those screens and charsets and you can show any line of the texture on any line of the screen.

The limit is the amount of memory you can spare, and the texture gfx ends up scattered all over memory (you use something like 32 bytes from one screen/charset combo).

 

There's a good explanation here:

http://codebase64.org/doku.php?id=base:twisters_x-rotators_and_waving_carpets

I'm not convinced that the higher cost of rendering into characters offsets the tiny speed gain, outside of some marginal scenarios. As with everything, there's going to be an inevitable cycle threshold where it stops making sense.

 

Everybody always mentions the filling, that you can fill 32 color pixels with one STA, but I never actually saw anybody post concrete numbers on how much performance that saves.

 

In my 6502 flatshader, the filling takes only 15% of the whole pipeline, and if I really wanted, I could bring it down to 12% (I tested that already, it's just not integrated yet), but it's fast enough that it doesn't need to get faster (for now); there are other stages that need optimization.

 

But there's simply no way that rendering 3 different polygon edges with 3-4 different colors into the same char (a very common scenario all across the screen) would be faster than directly addressing the framebuffer, because of:

1. additional overhead of addressing scanlines within a char.

- my test scene from my Jaguar's StunRunner has 3 segments, each 10 triangles (5 quads) and 210 scanlines

- this overhead will get executed 210 times

 

2. additional overhead of DLI for charsets

- in 160x96 you need 4 charsets, so that's 4 DLIs

- each DLI costs you 105 cycles, so that's 4*105 = 420 cycles per frame

- let's say your scene needs 6 frames to rasterize, so that's 6*420 = 2,520c that have been burnt already

 

3. additional overhead of addressing current charset

- when you're filling a polygon, for most of them you will go through 6-10 chars till you can switch to another polygon

- this means you will have to switch the charset pointer at least two or three times per polygon

- this needs to be detected per scanline

- just this condition will get executed 210 times, plus the 2-3 switches to next charset pointer

 

4. additional overhead for the Quick Fill

- finally we're coming to the reason why we chose chars

- you must compute, for each scanline, whether you can actually form an 8-px-tall and X*4-px-wide rectangle

- that is not going to be cheap

- in my test scene (real-world game 3D scene), only 25% of polygons would have more than 1 quad to fill quickly

- so, for 75% of the scene, this [very expensive] condition will get executed, but won't bring any performance, just burn more cycles

 

So even if you find some simplistic 3D scene where it manages to offset those 15% of filling, we'll be talking about 5-7% at best. But:

- you lost the absolutely generic nature of scanline drawing

- you burnt more time implementing additional code instead of optimizing other stages of flatshader

- you introduced 2 additional indirections into your codebase, so good luck debugging that 6 months from now : )

- you really have to keep the scene simple, with as few scanlines as possible, otherwise you quickly burn through those 15%
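For what it's worth, the DLI overhead from point 2 can be tallied trivially, using only the figures quoted above:

```python
DLI_COST = 105      # cycles per DLI (figure from the post)
CHARSETS = 4        # charsets needed at 160x96 (figure from the post)
FRAMES = 6          # frames to rasterize the scene (figure from the post)

per_frame = CHARSETS * DLI_COST   # pure DLI overhead per frame
total = FRAMES * per_frame        # cycles burnt before any filling happens
print(per_frame, total)           # 420 2520
```

And that's before points 1, 3 and 4 add their per-scanline and per-polygon costs on top.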

 

 

It's not directly comparable to flatshading (apples to tractors, really), but if you are rendering just a wireframe of a 3D mesh, then chars can indeed be quick for a mostly empty screen (you gain a lot of cycles from a quick ClearScreen). I am, indeed, very curious where the performance threshold lies between drawing lines into a framebuffer vs charsets.

 

 

But now that I've experienced the difference between wireframe and flatshading, and it's just those ~15%, I don't wanna go back to simple wireframe. It's a huge visual difference if your polygons are filled while still at a very playable framerate :)

 

I recently implemented smooth color shades for my Jaguar flatshader, and while it was running in auto-play mode (I did some basic scripting so it can keep playing through the first 25 levels by itself, to confirm it's stable over the course of days), yesterday I realized (it's running in parallel on a small TV next to my main one, so I can keep watch) how that same coloring scheme could be implemented with DLIs on the 6502.

 

This is -very much- uncharted territory on the Atari...


Say your texture is 8 pixels wide: "abcd efgh" (each letter is a single pixel).

You make 4 shifted versions of it:

"abcd efgh"
"bcde fgh_"
"cdef gh__"
"defg h___"

Yeah, but I immediately discarded this option, because:

- I started with a texture 160x40, as highres as possible

- I didn't want to cheat using extended RAM, as each such texture would take huge amounts of RAM (you must store each texture's scanline at all Z depths, and if you do the math for 160x40 ... )

- This approach would make it impossible to switch between various textures at run-time. During one second, I can literally switch between 30 different textures (if they fit into RAM) and have each frame use a different one

- It would make it impossible to do render-to-texture. I can now render decals, tire tracks, or anything, really, in one frame, and discard it the next frame at no additional RAM cost

- I wanted this to run on base Atari, with 64 KB or less

- The code for texturing is definitely less than 4 KB, so if I wanted to do procedurally generated texture at load-time, the final executable would be under 8 KB, for sure

 

 

What zoom size are you looking for? Or to put it differently, what is the ratio between the smallest and largest texture line on screen?

 

I tried to keep this ratio as run-time as possible, and not hardcoded in any way. So, at load-time, you have these parameters:

- Polygon Width at max zoom

- Polygon Depth (scanlines count)

- Perspective angle: this is what defines how many pixels are on the last visible scanline; the algorithm is generic, it doesn't care if it's 2 pixels or 160 pixels

 

This computes the texture indices for each Z depth, plus XPOS offsets for each scanline, which is why this solution is generic and allows switching textures from frame to frame at no cost, as textures are not preshifted or precomputed.
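A hypothetical load-time precompute in that spirit. The function name, parameters, and the simple linear width ramp are my assumptions for illustration, not the poster's actual algorithm (which presumably uses a proper perspective curve derived from the angle):

```python
def precompute_scanlines(poly_width, depth, last_line_pixels):
    """Once at load-time: visible pixel count and centred X offset per scanline."""
    xpos, width = [], []
    for z in range(depth):
        t = z / (depth - 1)                  # 0 at the nearest line, 1 at the deepest
        w = round(poly_width + t * (last_line_pixels - poly_width))
        width.append(w)                      # texture pixels drawn on this scanline
        xpos.append((poly_width - w) // 2)   # centre the narrower line horizontally
    return xpos, width

xpos, width = precompute_scanlines(poly_width=160, depth=40, last_line_pixels=2)
print(width[0], width[-1], xpos[-1])   # 160 2 79
```

The per-frame rasterizer then only reads these tables, which is what makes swapping the texture itself free from frame to frame.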

 

For example, for a classic racing game, you would have this computation run only once per game, as there's no reason the road width or depth perspective would change. But of course the flexibility is there: say you wanted wider roads on a cross-country track and very narrow ones in a city, the code gives you that option.

 

I ended up using horizontal mirroring, so while you still have the max pixel resolution of 160 per scanline, without any texel magnification artifacts, the texture is now 80x40.

 

Thinking about extended RAM, you can:

- fit 20 different textures into 64 KB uncompressed (each texel is 1 byte for max performance, as you most certainly don't want the decompression happening per texel)

- fit 80 different textures into 64 KB compressed, and just uncompress at load-time

- even stock 130XE could have incredible visual variability
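Sanity-checking that RAM math (the roughly 4:1 compression ratio is implied by the 80-texture figure, not stated):

```python
texture_bytes = 80 * 40      # one mirrored 80x40 texture at 1 byte/texel
ext_ram = 64 * 1024          # 64 KB of extended RAM

uncompressed = ext_ram // texture_bytes   # textures that fit raw
ratio = 80 * texture_bytes / ext_ram      # compression ratio implied by "80 textures"
print(uncompressed, round(ratio, 1))      # 20 3.9
```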

 

That sounds like you're using something similar to lda, ora, ora, ora, sta to combine texture pixels?

Nope, that would be impossible without pre-shifting, and wasting lots of RAM, so:

 

- first byte of each scanline undergoes either 2 or 2+4=6 bitshifts

- remaining bytes (till the center of the line) undergo 6 bitshifts each (1 ORA plus 2 shifts per texel, except the last one, which is just ORA'd)

- this is the core strength of the 6502: each shift is just 2 cycles, and this is why the A register is called the accumulator; I literally accumulate the result of 10 different operations (without any temp storage) till I get the result to write to the framebuffer

 

Now, I'm sure that at this moment most coders are panicking: "OMFG, you MUST preshift, shifting is slooooow".

Actually, I coded that alternative. It was slower. MUCH slower than accumulating in the A register. If you think about it, it does make sense: for a 160x40 texture, out of 3,552 total onscreen pixels, there are 1,776 texels to compute (429 quads; the other 429 are mirrored), so you burn only 429*6*2 = 5,148 cycles. Anything involving preshifting requires additional indices, which thrashes your current X and Y registers, and that happens, obviously, 429 times.

 

Hence, the performance threshold for preshifting (to even begin to hope to be faster) is 5,148 / 429 = 12 cycles. But that's per byte, i.e. 12/4 = 3 cycles per texel. Since even a simple temp store to zero page is 3 cycles (and restoring it is another 3), it's thus impossible to be faster :)
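The arithmetic above, spelled out:

```python
quads = 429            # bytes composed per frame (the other 429 are mirrored)
shift_cycles = 6 * 2   # 6 shifts per byte, 2 cycles per shift on the 6502

total = quads * shift_cycles        # cycles spent shifting per frame
per_byte = total // quads           # the budget preshifting must beat, per byte
per_texel = per_byte // 4           # per texel: one zp store alone already costs this
print(total, per_byte, per_texel)   # 5148 12 3
```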

 

 

My current version of hi-res texturing does a 128x40 texture (2,656 rendered pixels) in 48,082 cycles, which is slightly over 30 fps at 160x96. That's still without loop unrolling and self-modifying code, which would make it even faster, but 30 fps is surely good enough for me. You still have 24,186 cycles for all the other game logic, and the game would still run at 20 fps. Not bad for 1.79 MHz :)

 

 

What I don't understand, however, is how come nobody did this 20-30 years ago? W.T.F.?!? It took me about 3 days to optimize it like this, which is like one good, intense coding weekend. Totally doable even while going to work/school/etc.


I say the Super IRG method works great for getting more screen colors on the Atari: the method of swapping between two character sets on each TV frame. However, the VIC chips can also swap character sets, so the Commodore 64 (and VIC-20?) can most likely do something similar. Not sure if the TED, or whatever the PET/CBM machines use, can change the character set memory pointers. The ST and Amiga use bitmap screens for text displays, so the method won't work there unless you alternate whole screens between frames. As for PC text modes, I could never find anything about changing the character set pointer and doing a custom font there.
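A rough model of why the frame-swapping buys extra colours: the eye averages the two alternating frames, so each pixel's perceived colour is roughly the mean of its colour in the two charsets. The simple 50/50 average below is my approximation; the real result depends on phosphor persistence and display gamma:

```python
def perceived(rgb_a, rgb_b):
    """Approximate colour seen when two frames alternate at 50/60 Hz."""
    return tuple((a + b) // 2 for a, b in zip(rgb_a, rgb_b))

orange, green = (216, 120, 0), (0, 160, 64)
print(perceived(orange, green))   # (108, 140, 32): a colour neither frame contains
```

It also hints at the trade-off: the blended colour is always pulled toward the midpoint, which is why these modes tend to look less saturated, and why PAL's slower 50 Hz makes the flicker more visible.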


As JamesD pointed out, you really cannot go by Wikipedia for accurate sales figures for Atari hardware, as anyone with an account can edit them, and some of the arguments for why certain figures should be used are classic strawman examples.

...

You still see the likes of Retro Gamer magazine having bar charts of units sold when they do features on hardware, Atari included.

 

But I very much doubt it's intended as anything but a suggested figure.

The unreleased sales figures are a thorn in the side of several classic computer communities.

Tandy stopped releasing #s of each machine sold and switched to just releasing profit info for their stockholders in 1980.

Someone that worked for Tandy recently mentioned how much a Model III actually cost dealers, and the markup was in the neighborhood of 500% if what he said was true.

This is why Tandy was a sales leader through most of the 80s. They had better margins than most companies and a lot of different models.

Not releasing sales numbers but dropping hints has led to a lot of speculation around the Color Computer.

People at Tandy said it was their best-selling computer, and seemed to indicate that was the case over its entire life, even in the late 80s.

If it even sold the same number of units as a Model I did per year in 1979 (before personal computer sales really took off), that's at least 400,000 per year and you end up with at least 4 million sold.

If you go by the number of Model 100s supposedly sold, those sold in the neighborhood of 7 million. So wouldn't the CoCo have sold more based on the best seller comment?

But all it takes is adding "home computer" to that best sales comment and you could end up with under 2 million because they didn't consider any other machines strictly home computers.

Trusting comments from people about what happened 30+ years ago where you don't know how much sales info they were privy to in the first place isn't going to be reliable.

Most of the managers at Tandy that might know were only interested in profit, and knew nothing about computers.

Computers were just another product to sell, and most of them (the managers) are probably dead anyway.

 

I would have a hard time believing Atari sold fewer than 3 million machines, and I wouldn't be shocked to find they sold over 5 million... but how would you prove it?

With Atari changing hands, even the people that were managers under Jack might not have had access to sales info up to that point.

Managers in Europe supposedly complained about sales numbers, but does that mean they were only selling 50,000 per year, or is that relative to the C64 and they were selling several hundred thousand per year?

Disappointing sales numbers could just mean a salesman missed the mark for a bonus.

 

Edited by JamesD


I say the Super IRG method works great for getting more screen colors on the Atari: the method of swapping between two character sets on each TV frame.

It seems to me that interlace modes have a detrimental effect on colour saturation and luminosity, and that's quite apart from the flickering exhibited on PAL systems. Interlaced modes are great for displaying photographic images with high colour depth, but Super IRG looks muted to me. I prefer to see bright, solid colours on the A8. I'm not a gamer, but Crownland is an excellent example which springs to mind.
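The color gain from that per-frame swap comes from the eye averaging the two alternating frames, so every pair of per-frame colors becomes a new perceived blend. A rough Python sketch of the arithmetic (the RGB values here are made up for illustration, not actual Atari palette entries):

```python
# Rough sketch of why Super IRG yields extra colors: the display
# alternates two frames, and the eye averages them, so each pair of
# per-frame colors becomes one perceived blend. The palette values
# below are invented RGB triples, not real GTIA output.

def blend(c1, c2):
    """Perceived color when c1 and c2 alternate at 50/60 Hz."""
    return tuple((a + b) // 2 for a, b in zip(c1, c2))

# Four playfield colors per frame (ANTIC character-mode style), invented:
frame_a = [(0, 0, 0), (200, 40, 40), (40, 200, 40), (40, 40, 200)]
frame_b = [(0, 0, 0), (220, 220, 60), (60, 220, 220), (220, 60, 220)]

perceived = {blend(a, b) for a in frame_a for b in frame_b}
print(len(perceived))  # 14 distinct blends out of the 16 possible pairs
```

The averaging is also why the blended shades look muted: a bright color paired with a dark one lands halfway between, which matches the reduced saturation described above.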

Edited by flashjazzcat

I say the Super IRG method works great for getting more screen colors on the Atari: the method of swapping between two character sets on each TV frame. However, the VIC chips can also swap character sets, so the Commodore 64 (and VIC-20?) can most likely do something similar. I'm not sure if the TED, or whatever the PET/CBM machines use, can change the character set memory pointers. The ST and Amiga use bitmap screens for text displays, so the method won't work there unless you alternate whole screens between frames. As for the PC text modes, I could never find anything about changing the character set pointer and using a custom font there.

The TED-based machines have selectable character set addresses starting on every 1K boundary.

One of those slots starts at address zero, and there is space at the top of memory reserved for hardware registers and the non-banking KERNAL, so a few addresses shouldn't be used, but there are obviously more than enough left to swap between two character sets.

I think the character set is a maximum of 128 characters.
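With 1K granularity, the character set base only needs the top six address bits. A small Python sketch of that mapping (the register bit layout here is my assumption of how a TED-style pointer would pack those bits; check an actual TED datasheet before relying on it):

```python
# Hypothetical helper: map a 1K-aligned character set base address to a
# TED-style charset pointer value. I'm assuming the top six address bits
# (A15-A10) land in register bits 7-2; verify against a real TED
# datasheet before using these numbers.

def ted_charset_reg(base):
    if base % 1024 != 0:
        raise ValueError("character set must start on a 1K boundary")
    if not 0 <= base < 0x10000:
        raise ValueError("address outside the 64K range")
    return (base >> 8) & 0xFC  # A15-A10 -> register bits 7-2

print(hex(ted_charset_reg(0x8000)))  # 0x80
print(hex(ted_charset_reg(0xD000)))  # 0xd0
```

Swapping between two sets per frame then amounts to writing two precomputed values like these on alternate vertical blanks.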


BTW, I wonder if BlueMoon_001's reference to the 6509 was actually about the MOS 6509 chip, since he's talking about 16-bit upgrades for 6502 machines.
The 6509 was clearly 8-bit.
Maybe he was referring to the "announced" Synertek part that has been called the 6509 in a few discussions due to its similarity to the 6809.
The actual name of the Synertek chip was supposedly going to be the 6516, but the company denied the chip even existed, possibly after threats of a lawsuit.
Western Design Center could get away with making the 65C02, 65802, and 65816 because it was started by one of the designers of the 6502, who probably still had some rights to it.


I know the VIC-20 and C64 had 256-character sets. The VIC-20 also had double-height characters and a 4K font that was used to emulate a bitmap graphics mode. I know the TED-based machines also had a color palette more like the Atari's, but they had no sprites and no SID chip. By the time the 16 and Plus/4 were put out, many people already had a Commodore 64, Atari, Apple, TI-99, or IBM PC, and I don't think those machines sold very well. The intention was to release another low-cost machine like the VIC-20, with one chip that did it all, but it was a flop. Many in the United States called the Plus/4 a "Minus/60" because it could not perform as well as the Commodore 64. As I've said many times, Commodore would have been better off if they had taken the 64, sold it with 16K and blank sockets, and offered the Plus/4's software on a cartridge or on internal ROM chips. Did they also update the BASIC?
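The VIC-20 bitmap trick mentioned above works because double-height characters let a modest font cover the whole screen. A quick memory-math sketch (the 160x192 layout is one commonly used configuration, an assumption on my part, not a figure from the post):

```python
# Memory math for emulating a bitmap on the VIC-20 with double-height
# (8x16) characters: fill the screen with unique characters, then draw
# pixels by writing into the font data. The 160x192 layout is one
# common configuration, assumed here for illustration.

char_w, char_h = 8, 16          # double-height character cells
screen_w, screen_h = 160, 192   # target "bitmap" resolution

cols = screen_w // char_w       # character columns across the screen
rows = screen_h // char_h       # character rows down the screen
cells = cols * rows             # unique characters needed
font_bytes = cells * char_h     # bytes of font data to draw into

print(cells, font_bytes)  # 240 unique chars, 3840 bytes
```

That 3,840 bytes is where the "4K font" figure comes from: the whole pseudo-bitmap fits in under 4K of character data, with every screen cell pointing at its own character.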

