
Apple II in low-end Market?


kool kitty89


Since the TRS-80 Model II was mentioned I thought I'd look up some prices on Tandy machines.

 

In July '82 a mail order place called Computer Plus had the following prices listed.

Keep in mind those are discounted from Tandy's normal prices.

 

Model III 16K $799

Model III 48K 2 drives & RS232 $1949

Model II 64K $3100

Model 16 128K 1 Drive (68000 machine) $4299

Model 16 128K 2 Drive $4999

A CoCo 1 with 32K and Extended Color Basic was $499

Wow, yeah, the Model II and III aren't competitively priced at all. (even compared to the PC . . . for $3100 in 1982 you should have been able to get an IBM 5150 with 64K, an MDA monitor, and a disk drive, at least at the prices I've seen)

 

On top of that, the value is further diminished by incompatibility with their other TRS-80 lines. (I'd imagine it's mainly a margins issue rather than an actual manufacturing cost one . . . even using discrete logic, it shouldn't have been THAT cost-ineffective, and investing in consolidating that logic should have been a prominent consideration as well)

 

 

It seems like the best option may have been to morph the lower-end TRS-80 Model 1 line into a CoCo-like machine: the same sort of market positioning, price point, and general capabilities, but backwards compatible with Model 1 software, with a small range of machines from bottom-end RF-only models with cheap keyboards up to mid-range models with better keyboards, a monitor, and more built-in peripheral support. The separate Model 2 range would then cover the high end, but with tighter pricing and marketing, more consolidated models, and perhaps a more PC-like desktop form factor. Preferably the features of the CoCo-alike would be merged into later Model 2s as well, making a fully cross-compatible high-end machine; graphics capabilities were also becoming more important for science and business in the early/mid 80s. They probably should have abandoned 8" floppies more rapidly too, and aimed at supporting PC-compatible disk/data formats.

The model 1/2/3/4 all used TV compatible sync rates iirc (regardless of the resolutions being impractically high for RF), so that would at least allow a broad standard to be established somewhat more easily.

 

A Z80-based alternative to the CoCo may not have been quite as cost effective as the real CoCo (from a raw performance comparison in a vacuum), but the value of compatibility (and the potential for CP/M support) would likely have been much more significant. The TRS-80 Model 1 was one of the first 3 significant home microcomputers of 1977, and it sold better than the Apple II and PET in the late 70s from the figures I've seen (though they're a bit sketchy). It may not have had as much inherent potential as the Apple II, but definitely more than the PET, which was a very closed design, to the point of removing the DIP RAM sockets on later models to prevent expansion. ;)

The model 1 was actually fairly flexible with expansion (with the module, plus the highly detailed hardware documentation with the user manual), but they really should have had a desktop version with the expansion module integrated. (especially due to the reliability issues of the module)

 

Even with that module, it still didn't have the same open-ended flexibility of the Apple II (or CoCo), though it was very hackable. (but that's far less accessible to average users, even ones with moderate tech knowledge and OK soldering skills)

 

 

In reality, most (all?) of those 32K CoCo machines had 64K. They supposedly used half-defective 64K chips in some, but in my experience there is no such thing, so I'm guessing that was all talk because they didn't want it to compete with the Model III. Plus, I think it's more expensive to find half-defective 64K chips than to buy good ones.

When I read about the 1/2-bad chips, I never thought they'd have attempted to buy only bad chips, but rather bought bulk orders of questionable batches (possibly with only a small percentage being bad, but no special testing to sort that further); any "bad" batches would probably have persisted for a short time only, and most orders would have been functional.

 

Except, why continue to use an excessive number of DRAM chips when 32Kx2-bit or 16Kx4 chips became available? (they should have been readily available by 1983 at least; Atari was using 16Kx4 chips predominantly in the 600XL iirc) Plus, up until 1983 (or perhaps late 1982), 2KB (16 kbit) density chips were cheaper to use than 8KB (64 kbit) ones (aside from the added board space), so it would have made sense to use 16 16Kx1-bit DRAMs for early 32K models. (especially since the CoCo was an older design and would necessarily have started with lower density chips -unlike the C64, which was introduced on the verge of 64 kbit chips becoming cheaper, and close enough to make the reduced board space and near-future cost reductions of the design worthwhile)
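To put the chip-count argument in concrete numbers, here's a quick sketch of the arithmetic. (the `chips_needed` helper is purely illustrative, not from any datasheet)

```python
# Back-of-envelope chip counts for the 32K/64K DRAM configurations
# discussed above, assuming an 8-bit data bus.

def chips_needed(total_bytes, depth_words, width_bits, bus_bits=8):
    """Chips required to build total_bytes from depth_words x width_bits DRAMs."""
    per_bank = bus_bits // width_bits            # chips ganged to span the bus
    bank_bytes = depth_words * bus_bits // 8     # bytes one bank provides
    banks = -(-total_bytes // bank_bytes)        # ceiling division
    return per_bank * banks

print(chips_needed(32 * 1024, 16 * 1024, 1))  # 16 chips of 16Kx1 for 32K
print(chips_needed(32 * 1024, 16 * 1024, 4))  # 4 chips of 16Kx4 for 32K
print(chips_needed(64 * 1024, 64 * 1024, 1))  # 8 chips of 64Kx1 for 64K
```

So a 16Kx4-based 32K board needs a quarter of the sockets of a 16Kx1 layout, which is the board-space saving being argued about.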

 

Plus, why use the "half bad" chips only in the 32k models and not 16k, etc?

 

 

 

 

 

 

 

 

@kool_kitty: The CoCo ran interleaved at somewhere around 1.7 MHz with a 6809.

The CoCo I and II were fixed at .89 MHz due to the limits of DRAM and (later) the SAM's capabilities. The CoCo III ran at 1.79 MHz interleaved, but that was in 1986. (probably using similar speed DRAM to the Amiga, which ran interleaved at a similar rate -the 68k just takes 4x as many clock cycles to complete an access as a 6502/6800/6809, and the Amiga's DMA set-up basically split a 3.58 MHz bus into two 1.79 MHz buses for the CPU and chipset to use, with the additional ability for the chips to steal 100% of the bus for burst DMA at full bandwidth)
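As a rough illustration of the clock arithmetic above (nominal NTSC-derived figures only, not a hardware model):

```python
# Both designs divide down the NTSC colorburst; this just shows the math.
NTSC = 3.579545e6  # Hz, the master reference crystal frequency

coco_cpu_hz = NTSC / 4     # CoCo 1/2 SAM interleave -> ~0.89 MHz CPU slots
amiga_slot_hz = NTSC / 2   # Amiga: 3.58 MHz bus split into two 1.79 MHz halves

# A 68000 bus access takes 4 clocks, a 6809 access takes 1, so the
# access rates come out closer than the raw MHz figures suggest:
m68k_access_rate = (NTSC * 2) / 4   # 7.16 MHz 68000 -> ~1.79 M accesses/s
m6809_access_rate = coco_cpu_hz     # 0.89 MHz 6809  -> ~0.89 M accesses/s

print(f"CoCo CPU slot:    {coco_cpu_hz / 1e6:.2f} MHz")
print(f"68000 accesses/s: {m68k_access_rate / 1e6:.2f} M")
```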

Actually, the CoCo I/II could run at 1.79 MHz when accessing ROM. They could do the same with RAM, but it required disabling video and RAM refresh. Someone has actually written a demo that animates more on-screen objects by using the mode that disables video/RAM refresh for a fixed number of cycles. If you wrote a game for a ROM pack you should be able to run it at 1.79 MHz except for RAM access, but I don't think anyone ever did that.

Yes, I knew that too; I was speaking of working in RAM though. (plus it also assumes the ROMs used in game carts were fast enough to allow that, which may not always have been the case)

 

As for disabling refresh, how can you do that to DRAM without getting corruption? (if not for that, it would have made perfect sense to run at 2x speed in vblank all the time)

 

 

Plus, that's all overclocking the CPU by ~80% over the rated speed, right? Not that most, if not all, 1 MHz rated 6809s weren't actually stable at those speeds . . . that seems to have been the case with a lot of old CPUs. (and it was pretty much always logic stability, not overheating or outright failure, until things like the 486 and 68040 came around) It makes you wonder why more manufacturers didn't overclock as routine. I wonder how many STs or Amigas had CPUs that would have actually been stable at 10-16 MHz, if not higher, especially late models with CMOS versions. Best case would be if yields were so high that no additional quality control was needed and the very rare failure could be handled at the consumer level; short of that, they could do factory benchmarks and downgrade any unstable systems to lower speeds, especially with a modular design using simple jumpers to switch clock rates.

 

 

Hmm, for that matter, would it have made much difference if the 6809 in the CoCo was always run at 1.79 MHz with forced wait states to allow the interleaving scheme? (not a complex wait-state system that only blocks access when video or refresh is actually using the bus, but the same simple interleaving scheme with added wait states blocking RAM accesses every other cycle) Given the memory-dependent nature of the 6809, clock doubling like that wouldn't be that helpful, but it should accelerate some things at least. (and be better off than the 6800 or 6502 with their few internal registers -though not as useful as doing the same for the 68k or similar)
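A quick model of what that forced-wait-state clock doubling would buy. (the workload mix fractions are hypothetical; this is purely illustrative arithmetic)

```python
# Model: at 0.89 MHz every CPU cycle costs one "slow" tick. At 1.79 MHz with
# RAM accesses forced into every-other-slot, a bus cycle still costs one slow
# tick (2 fast ticks), but internal (non-bus) cycles cost only half a tick.

def speedup(internal_fraction):
    """Relative speed vs. a plain 0.89 MHz 6809, given the fraction of
    cycles that don't touch the bus (hypothetical workload mix)."""
    fast_time = (1 - internal_fraction) * 1.0 + internal_fraction * 0.5
    return 1.0 / fast_time

for f in (0.0, 0.25, 0.5):
    print(f"{f:.0%} internal cycles -> {speedup(f):.2f}x")
```

A purely bus-bound program gains nothing, while a program spending half its cycles internally gains about a third, which matches the intuition that a register-rich CPU like the 68k would benefit more.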

 

 

I think the GIME in the CoCo 3

Were you going to say something else here?

 

 

It's not a question of if it were possible but how much it would cost. Does it make sense to increase the cost of a machine like that by 50% to 100%?

Yes, I mentioned cost issues several times too . . . but the actual feasibility may be in question as well. (at least if you're talking DRAM speeds of mass produced chips, not switching to SRAM or something crazy like that ;))

 

In the case of the Motorola chipset that could have been the CoCo upgrade, you could have over 1MB of RAM and it might make sense to run at 3.5MHz because it could compete with newer systems.

You'd need a 6309 to have those CPU speeds without overclocking . . . though overclocking might have been fine too. ;)

 

But the CoCo had a simple design that lent itself to advanced upgrades. Can you imagine updating all the Atari video modes to accept different CPU clocks?

What about the Atari video system would be problematic for faster clocks? There's already a wait-state system in place to block the CPU's access during video accessing or refresh, right? (so why wouldn't that same mechanism remain functional regardless of the CPU speed?)

 

Changing the clock of the video chips might be more problematic. (even the relatively "simple" upgrade of doubled-dot-clock versions of the current modes would probably be rather involved . . . well, mainly if Atari had to start over with new engineers working backwards to rework the old design ;)) Adding a new mode to allow pixel accumulation for an 8bpp mode at the same resolution as the 4-bit GTIA modes would have been another step beyond that.

 

 

Plus, if worst came to worst, you could limit boosting the CPU speed to vblank and avoid timing issues. (though you'd still need wait states for DRAM refresh)

 

 

Of course, the main issue with the Atari was the total lack of provisions for external expansion. (there had been plans for Apple II-like slots, but management halted that . . . then came PBI some 3-4 years later, but that was followed by the cancellation of the 1090XL and the general decline of the line in the mid/late 80s)

 

(just about the only thing lacking was the sound hardware . . . if they'd just added a bog-standard SN -or preferably AY/YM- PSG, that would have opened sound up nicely, with the DAC used occasionally for drums/SFX/etc. -actually the YM2413 might have been a good option if it was ready in 1986)

I don't think OPL was affordable if it was available at that time. There was an upgraded AY chip that had more control over the individual channels while maintaining backwards compatibility. Not sure what # it is.

The YM2413 was the cheapest (and crappiest ;)) FM chip Yamaha ever released . . . a cut-down OPL2 (15 presets and only 1 programmable instrument at a time) in a tiny 18-pin DIP with a built-in DAC using multiplexed/interleaved mixing (no adder). I'm not sure on the actual pricing though, and I'm not sure it was available prior to 1987. (not sure when the MSX and Sega Mk. III introduced their add-ons)

 

Otherwise, one of the sound-only AY/YM PSGs would probably be the best low-cost option (an SN would be cheaper, but its feature set is among the weakest of any sound chip ;)), including the smallest 16-pin version from Yamaha. (GI only got it down to 24 pins with the AY-3-8913)

 

There was the enhanced AY8930 you mentioned (3 envelopes rather than 1, and variable pulse width), but I'm not sure when that became available, and it was only produced by GI and only in the full 40-pin package iirc. (though that would be OK if Tandy wanted to add digital joystick support ;)) The only significant consumer use of that chip was by Covox AFAIK. They used it in their Sound Master PC sound card of 1989, which also added mono 8-bit DMA sound (mapped to allow compatibility with the parallel-port DAC) and 2 Atari-style DE-9 joyports (great at a time when the analog PC port was clumsy and rather resource intensive), but that card (like several others) came too late to be worthwhile. (AdLib had created a baseline standard and the Sound Blaster launched in '89 as well . . . had someone come out with a card like that Covox one -or an earlier incarnation- in '84-86, it might have caught on, perhaps just a plain '8910 with 2 joyports and a bare DAC to use like the parallel-port sound boards ;) -rather odd that the PC got nothing, casual or professional, for sound until 1986/87 -Covox DAC and then AdLib)

 

At least the CoCo 3 got the programmable timer that can be used to drive the DAC at an acceptable rate. Playing music in the most efficient manner generally requires large samples containing all the music, rather than taking time to mix smaller samples. But they should have at least included the 8-bit stereo DACs of the Orchestra 90C.

Yeah, a cheap PSG (even the SN76489) and an upgraded DAC would have been nice. (a small array of DACs and higher-res interval timers would have been nice . . . or even 1 DAC with higher-res timers to allow Amiga-style note sampling, if you did 1 channel per DAC)

 

The CoCo 3 got the sort of upgrade the ZX Spectrum could have used to become a truly persistent/evolutionary standard. ;) (had the 128k added that sort of video and CPU upgrade on top of the AY chip, it might have made for real competition against the 16-bit computers ;) -the later SAM Coupe obviously came too late and was still too weak with lack of hardware scrolling)

It's sad it didn't at least get the new modes from the TS2068.

Yeah, though some other clones got those TS2068 modes. ;)

Hell, with those modes (especially the higher-res mode for business applications), an upgraded Spectrum (perhaps with a 6-8 MHz Z80) could have made more sense than the QL. ;) Aiming at a CP/M-compatible machine would have made sense too: a Spectrum 128 with at least the TS2068 modes and PSG, plus a 6/7/8 MHz CPU and 80-column text rendering support. (if using the 512-wide mode, they'd need TMS9918-style 6-pixel-wide text, plus a composite video/monitor port and colorburst-disabled modes to make high-res text more usable -which the TMS9918 did do for its text mode iirc) Then all they needed to do was drop those terribly unreliable Microdrives in favor of plain cassettes and floppy disks. (maybe aiming at a BBC/Acorn-compatible format, given that was the only reasonably popular disk format in the UK/Europe at the time)

 

Better for games and better for business. (the added high res attribute mode would help with games to a fair extent, though still not great for 1984 -it would have been cheap though, and backwards compatible with the most popular home computer in the UK and much of Europe with still more room for improvement)

 

Of course, Loki would have been a truly massive leap from the Spectrum 128K (especially if it had ended up anything like the subsequent Flare 1 of 1986/87), and a hell of a lot better than the (much later) SAM Coupe or Amstrad CPC+ too. For that matter, it would have been incredible in something like the GX4000 in place of what Amstrad did, at least if it was remotely close to the Flare 1 chipset, or the Slipstream ASIC if you're talking 1990. Amstrad must have known of Flare's projects post-Sinclair and could still have chosen to license those. (the Flare 1 chipset was licensed to Bellfruit and some others in the late 80s; the Slipstream was only licensed to Konix/MSU, and given the publicity, many 3rd parties may have assumed it was an exclusive license, though it wasn't, as Konix lacked the funds for that) I wonder if Martin Brennan ever brought the Slipstream up with Atari while he was working on the Panther in '89, or when he convinced them to drop the Panther in favor of the Flare II/Jaguar project around that time. ;)

 

Yes, aside from the Apple II, none really had good expandability (the Atari would have if the engineers had had their way, and almost did again with the 1090XL . . . ), and most lacked any sort of general evolutionary development. The ST and Amiga seriously suffered from both of those problems as well: the Amiga got some boosts in expandability and both got a wider range of form factors, but it was mostly just an increase in RAM and minor tweaks to the graphics end for most of their lives. (the ST Blitter was rather significant, but limited to the high end; Amiga fastRAM was notable; but both lacked faster CPU models for a long time, and only the ST ever got a faster 68000 version with the Mega STE -none had a faster low-end model though.)

The Amiga got the 1200 as the low end machine which had the 16MHz 68EC020. To be honest, that should have come several years earlier though.

I meant none used faster 68000s . . . and the 1200 was very, very late. They should have had a 16 MHz (or 14.3 MHz) 68000 version much earlier for the low end (perhaps with the 500+), and earlier still for the high end: for the 2000, or at least 10.74 MHz for the 1986 launch of the A2000, probably with wait states for normal interleaved chipRAM accesses and full bandwidth in fastRAM. FastRAM would have made a lot more sense with a faster CPU; as it was, it only really improved bandwidth for the chipset. The Mega ST should have been 12 or 16 MHz too, at least optionally, and probably with optional fastRAM as well, plus workstation versions with FPUs, if not faster CPUs and/or fastRAM options for the console models prior to that.

 

Hmm, maybe they could have done something else besides full double buffering, maybe use dual ported SRAM on-chip so the back end of the line buffer could be filled as the active part was still being scanned out to the screen. (you'd need precise timing to ensure that you didn't overwrite something that had yet to be spit out to the screen ;))

<snip>

I worked on an embedded system that used dual port RAM in the late 80s. It was expensive stuff back then so it would have to be a pretty limited size.

I've often thought about using dual port RAM to make a replica of some old machines... it's not so expensive now.

I wasn't talking about main RAM itself, but just for the line buffer (line RAM). I was under the impression that GTIA (and MARIA -but the latter being double buffered) loaded a single scanline of video into an on-chip buffer which was then spit out to the screen . . . or was that "line RAM" actually in main memory too? (in that case, you'd need to add an external SRAM buffer to allow the scanline to be buffered off of the main bus)

Hmm, though given the amount of RAM that would actually take (especially if buffering non-indexed 8-bit pixels), that would be pretty big to have on-chip for the time. (and the 7800 used SRAM for MARIA/CPU work RAM too, so it could have allowed pretty high bandwidth and may have interleaved screen scanning with building the next scanline -either way, MARIA used burst DMA in the sense that it wasn't interleaved with the CPU and blocked the CPU's access while working)

 

In either case, you wouldn't necessarily need dual ported RAM either, but you could use SRAM fast enough to allow interleaved accesses at full speed (though if it was on-chip, that might be easier than interleaved accesses).

 

And if GTIA had the line buffered into main memory (regardless of tweaking things like adding a dedicated SRAM buffer), that would mean double buffering (a la MARIA) was just a matter of logical features of ANTIC/GTIA, and nothing to do with on-chip buffering. (again, MARIA double buffered the scanline, so it could saturate the bus and use an entire H period to load the next scanline to the back buffer as the active scanline is being clocked out to the screen)
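The MARIA-style arrangement described above can be sketched like this. (a toy model, not actual 7800 behavior; `fetch_line` stands in for the burst DMA that fills the back buffer)

```python
WIDTH = 160  # pixels per scanline in this toy model

def render_frame(fetch_line, num_lines):
    """Scan out one line buffer while 'DMA' fills the other, swapping per line."""
    front = fetch_line(0)              # prefill the buffer for line 0
    out = []
    for y in range(num_lines):
        # While 'front' is clocked out to the screen, the whole H period is
        # free to burst-load the next scanline into the back buffer.
        back = fetch_line(y + 1) if y + 1 < num_lines else None
        out.append(list(front))        # "scan out" the active buffer
        if back is not None:
            front = back               # swap buffers at horizontal blank
    return out

frame = render_frame(lambda y: [y] * WIDTH, 3)
print(len(frame), frame[1][0])
```

The point of the double buffer is visible in the loop: the fetch for line y+1 and the scan-out of line y happen in the same iteration, i.e. the same horizontal period.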

 

A trick you didn't mention is to double the data memory bus width and cache the 2nd byte(s) for the next read internally on the custom chip. For an 8-bit machine it's a practical option for cutting display memory reads in half.

CPU reads are less predictable, so you either need cache or wait states when there is a collision.

Using a 16-bit bus would also add to cost and complexity, wouldn't it? (the sheer cost of wider/more DRAM chips, a wider video bus, more logic for that buffering in the video chip, etc.)

Though that sort of cost trade-off would probably be more attractive than some others depending on the exact circumstances. (obviously not as cheap as using the same speed RAM on an 8-bit bus, but could be cheaper than using faster RAM instead of wider RAM)

 

As for an 8-bit CPU on a 16-bit bus, couldn't you add an external 16-bit latch to fetch 2 bytes which the CPU could then read consecutively? (though it might end up using only 1 of the 2 bytes depending on the case, so it would only improve shared bandwidth some of the time)


It seems like the best option may have been to morph the lower-end TRS-80 model 1 line into a CoCo-like machine (same sort of

I think this has already been discussed in this or another topic, and I really don't understand your obsession with a low-end Z80 machine instead of the CoCo.

The CoCo was Tandy's top seller over its entire life and they sold MILLIONS of them. When I bought mine, a lot of retailers were out of stock. Tandy was probably selling them as fast as they could make them.

 

Tandy didn't want to compete with their other product lines or they could have added more features to the CoCo. They even axed better versions of the CoCo that had been developed (better keyboard, hardware driven RS-232, RAM disk supported by BASIC, etc...). Why on earth would someone with that attitude create a computer that might compete even more with the Model II/III?

 

The highly integrated chipset in the CoCo made it possible. No similar commercial chipset appeared for the Z80 until MSX. The custom chip in the Speccy made it possible but it was proprietary and not available at the time the CoCo was developed. Building a Z80 machine without a highly integrated chipset made them expensive and large or very stripped down.

 

If Tandy wanted to do a cheap CoCo with a Z80, all they had to do was something like the Laser/VZ200 & 300. They were Model I clones with a 6847, but they didn't have built-in 64K, RS-232, or joysticks. Again, it comes down to the chipset. Even later versions of the VZ had custom chips.

 

The closest thing to what you propose is the CP300 from Microdigital (Brazil). It was a Model III clone that added hi-res graphics and sound -sort of a Model III with a few of the Model I hacks built in, in a smaller case with a CoCo-like chiclet keyboard. But with that number of chips it would have cost more than a CoCo, and it didn't support CP/M.

 

The memory map of the Model I did not lend itself to conversion to a CP/M machine without a major remap of memory, and you lose compatibility if you don't offer switching of the memory map. Someone did come up with a board that does just that for the Model I, along with a special version of CP/M. However, regular CP/M expects an 80-column screen, so you need special versions of all software (which defeats the purpose of using CP/M) or an 80-column display, which won't work on an RF TV. It also increases the chip count of the machine. Tandy did support CP/M in the Model IV, but it was still an expensive machine with its own monitor, and the motherboards in the III/IV are huge. Z80 CoCo or not, Tandy should have moved towards their own custom chip, but they didn't have the staff or facilities to do that... and apparently no desire to.

 

Now, it was easy to upgrade a Model I to a faster clock speed, hi-res graphics (384 x 192), upper/lower case, composite out, 48K, etc... without an expansion interface. (See 'The Custom TRS-80 and Other Mysteries')

If they had come up with FG/BG color attributes similar to the Speccy (and LNW80), they could have maintained backwards compatibility while adding color. Or the 384 x 192 mode could have generated color artifacts on NTSC without FG/BG color support. Using graphics with a proportional print routine could get a few more characters per line, maybe even 80, but it also requires more memory and you still have problems with CP/M. And again, more chips.

 

The custom chipset from Motorola made the CoCo possible, and nobody made a similar chipset for the Z80 until MSX, which was way too late. The only reason the Speccy was possible was that Sinclair could make their own chip to integrate everything. If Zilog had made a highly integrated chipset to allow cheap CP/M machines, the market would have been flooded with cheap Z80 machines. That was a missed opportunity on their part.

 

Clearly, there were options available to Tandy with the Z80 route but I don't think CP/M would have fit into a cheap unit and a low cost unit would have required a custom chip which Tandy obviously wasn't prepared to manufacture.

 

The biggest problem with the CoCo was Motorola should have made the graphics chip a little more flexible with color and they should have included a programmable timer like the CoCo 3 had for driving the D/A converter for sound. If the 6847 had included a mode that gets color for each pixel from an analog input, the CoCo could have had palette registers in the SAM. It would have been much more attractive even with a limited number of colors on screen. The timer would have been easy but when you consider when the CoCo was developed, nothing like that had been done before.

 

Yes, I knew that too; I was speaking of working in RAM though. (plus it also assumes the ROMs used in game carts were fast enough to allow that, which may not always have been the case)

People building the game carts could buy fast enough ROMs if they were using the double speed ROM mode.

There were different speed grades.

Since the CoCo didn't have sprites, it might have made quite a difference with some games.

Games like Demon Attack could have ramped up the speed on higher levels.

 

As for disabling refresh, how can you do that to DRAM without getting corruption? (if not for that, it would have made perfect sense to run at 2x speed in vblank all the time)

On the CoCo, DRAM is refreshed during the first few scan lines of the display when the screen border is being drawn. So doubling the speed during the VBLANK doesn't actually disable the refresh.

Besides, DRAMs are refreshed more often than they need to be. If you don't mind losing video, you can run full speed for number crunching and just re-enable RAM refresh periodically to keep from losing memory contents.

 

Some Dragon users and I tried messing with manually refreshing RAM periodically (just reading/writing will do it) but we didn't get it to work. I'd like to experiment with that further someday.
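The "refreshed more often than needed" point checks out numerically. Assuming 4116-class parts (128 rows to be refreshed within a 2 ms window; figures from the common part spec, used here only as an illustration):

```python
# Minimum refresh traffic for 4116-class DRAM: every one of 128 rows must be
# touched within 2 ms, so only a small fraction of bus cycles is truly needed.

rows = 128
interval_s = 2e-3
min_refreshes_per_sec = rows / interval_s     # 64,000 row touches per second

bus_hz = 0.89e6                               # CoCo-speed bus cycles per second
print(f"refresh needs only {min_refreshes_per_sec / bus_hz:.1%} of the bus")
```

So in principle well over 90% of cycles are available for number crunching if refresh is batched up and done periodically, which is exactly the manual-refresh experiment described above.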

 

Plus, that's all overclocking the CPU by ~80% over the rated speed, right? (not that most, if not all 1 MHz rated 6809s weren't actually stable at those speeds . . .

Just to clarify for people that don't know, there are actually two high speed POKEs. One (POKE65495,0) speeds up the CPU in ROM address space. The other (POKE65497,0) speeds up the CPU during RAM access.
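For reference, those two POKE addresses decode to the "set" side of the SAM's clock-rate register bits (my reading of the MC6883 register map; the bit naming is from that datasheet):

```python
# POKE address -> SAM register: the SAM decodes $FFC0-$FFDF as pairs of
# bit-clear/bit-set addresses; $FFD7 sets rate bit R0, $FFD9 sets rate bit R1.
ROM_SPEEDUP = 65495   # POKE 65495,0 -> "address-dependent" fast mode (ROM space)
ALL_SPEEDUP = 65497   # POKE 65497,0 -> full-speed mode (RAM too, video lost)

print(hex(ROM_SPEEDUP), hex(ALL_SPEEDUP))
```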

 

All white CoCos (1, 2, or 3) seem to support the high speed POKE, and so do most of the grey ones. I think pretty much all CoCos from the 'E' board on worked with it. The E board came out within a year or so of the CoCo's introduction and was what was in my first CoCo, which ran fine with the high speed POKE. The F board definitely worked with the high speed POKE and made the 64K upgrade easier; it was in the white CoCo 1s, TDP-100s, and I think later grey CoCos.

 

Most of the machines that didn't handle the high speed POKE (D board... probably the first board released for sale) would work with it if you cut some capacitors -at least an article in an early Rainbow magazine indicated that. That means it was a bus signal issue rather than heat that caused most machines not to work with the high speed POKE. A few early machines also needed a new SAM chip.

Dragons seem to have less success with the high speed POKEs than CoCos.

 

Some machines work with the ROM POKE but not the RAM one. Maybe that is the capacitor issue or maybe it's a SAM issue. I have no idea why as I've never seen it.

 

I think the GIME in the CoCo 3

Were you going to say something else here?

More likely I forgot to delete that.

Oops.

 

But the CoCo had a simple design that lent itself to advanced upgrades. Can you imagine updating all the Atari video modes to accept different CPU clocks?

What about the Atari video system would be problematic for faster clocks? There's already a wait-state system in place to block the CPU's access during video accessing or refresh, right? (so why wouldn't that same mechanism remain functional regardless of the CPU speed?)

The 6847 accessed RAM through the SAM, which took care of all the timing issues on the CoCo, but it was designed that way from the start. The GIME just expanded on what the SAM did.

The Atari chipset wasn't designed that way.

Every custom chip in the Atari is dependent on a fixed clock rate, and changing that clock would require changes to all the internal and external timing... and it would need to work at two different clock rates and different bus timings.

 

As for an 8-bit CPU on a 16-bit bus, couldn't you add an external 16-bit latch to fetch 2 bytes which the CPU could then read consecutively? (though it might end up using only 1 of the 2 bytes depending on the case, so it would only improve shared bandwidth some of the time)

It would actually be two latches, one for video and one for the CPU; the latch circuitry would be integrated into a custom chip. On the CoCo it would be in the SAM or GIME replacement.

In the case of video refresh it works very well because reads are from consecutive addresses. That is where the big gain would be, fewer collisions with the screen refresh. The latch for the video refresh alone cuts collisions in half.

It also works for the CPU until the code branches or an access isn't aligned on even bytes, so there could be a few wait states. And then you have to deal with writes, which involve a read-modify-write for every byte unless you use cache memory. The important thing is that you are running the CPU at full speed most of the time, even during screen refreshes. With a small amount of cache memory for the CPU you could almost eliminate wait states in a pretty low-cost design.

 

The biggest expenses with this type of design are a larger custom chip (8 more pins) and a more complex board layout.

You don't need 16-bit (wider) RAMs to do it, since you can place 8-bit RAMs in parallel, so the RAM should be cheaper than the same amount of faster RAM. The question is whether that makes up for the increased board and custom chip costs. I think it would have at that time: 64K RAM chips to upgrade a CoCo 1 were $75, and if faster RAM chips are even 30% more expensive, the savings probably pay for the more expensive custom chip and board changes.
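A toy simulation of the 16-bit-fetch scheme makes the "cuts display reads in half" claim concrete. (a hypothetical model of the latch behavior, not a real SAM/GIME design)

```python
def memory_cycles(addresses):
    """DRAM cycles for a stream of byte reads through a one-word latch:
    each real access fetches an aligned 2-byte word, and a read of the
    other byte of the currently latched word costs no bus cycle."""
    cycles = 0
    latched_word = None
    for a in addresses:
        word = a & ~1                  # word-align the byte address
        if word != latched_word:
            cycles += 1                # real DRAM access, latches both bytes
            latched_word = word
    return cycles

sequential = list(range(100))          # e.g. video refresh scanning a line
print(memory_cycles(sequential))       # exactly half as many bus cycles
branchy = [1, 8, 3, 6, 9]              # scattered CPU-style accesses
print(memory_cycles(branchy))          # no benefit without locality
```

Sequential reads (the video case) hit the latch on every odd byte, so collisions with the CPU really do drop by half; scattered reads gain nothing, which is why the CPU side needs the cache or wait states mentioned above.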


It seems like the best option may have been to morph the lower-end TRS-80 model 1 line into a CoCo-like machine (same sort of

I think this has already been discussed in this or another topic, and I really don't understand your obsession with a low-end Z80 machine instead of the CoCo.

The CoCo was Tandy's top seller over its entire life and they sold MILLIONS of them. When I bought mine, a lot of retailers were out of stock. Tandy was probably selling them as fast as they could make them.

 

As I said, I was thinking along the lines of a single, unified architecture or computer family offered in a range of different models, all built on top of the little old Model 1, but building on it considerably. (doing away with most/all TTL parts and investing in custom chips -be they ULAs or full-custom parts- that would include Model 1 compatibility modes as well as enhanced modes more competitive with -if not superior to- what the CoCo offered) Then you'd have higher-end models with further enhancements, but compatibility with the lower-end models as well. (offering more Apple II-like expandability would have been really significant too)

 

Basically what they did much later with their PC clones. (custom chipset, a wide range of business to "home computer" class machines, flexible expandability, etc)

 

 

They definitely could not go off the shelf since there was no off the shelf LSI hardware that could perfectly emulate the model 1 hardware, and LSI video hardware that was available for the Z80 was rather limited or exclusive. (OTOH, using some off the shelf LSI support chips would make sense, but some areas -especially video- would necessarily have been custom -things like using an AY8910 for parallel port and some control lines as well as enhanced sound would make sense)

Doing that should have been significantly simpler than the ASICs in the Tandy 1000 (be it in-house or reverse engineered from PC jr components) given how rudimentary the TRS-80 display hardware is. (even if the enhanced modes were just RAM definable character modes with per-cell color attributes, that would have been pretty good for the time, especially if they'd increased the maximum number of text lines -32/64/80x25 would have been really nice)

 

As it is, it's rather odd that Tandy didn't invest in consolidating the model 1/2/3 chipsets. (especially as ULAs and gate arrays became cheap/common in the early 80s)

 

 

The CoCo may have sold very well, but there's no reason to think that a model 1 derived system with similar (if not greater) cost effectiveness wouldn't have been even more popular and had more potential for general software and hardware add-on support. (aside from the ability to run CP/M)

 

 

Really, it's the same argument as with the Apple II, and the same thing that made the PC a big success . . . except the Apple II was already a lot closer to being that ideal machine than the TRS-80 was. (built-in expansion slots, color capabilities, etc -all it needed was the right management/marketing to expand the line and push it as an indefinitely long-term open-ended product line with a broad range of machines from the "toy-store" level up to professional business and workstation class machines -and to accomplish that, you only really needed perhaps 3 different form factor models with different RAM/peripheral/etc configurations in each of those sub-categories)

 

 

Likewise, that was also a major component lacking in both the ST and Amiga. Regardless of hardware design, they both lacked some critical marketing/management factors that really could have made those systems succeed, the one thing that kept them from being totally superior to PCs of the time: flexible expandability and intercompatibility (with marketing/management promoting that heavily), and going a step further, having a broad range of machines from day 1. (if they did it carefully, they could probably get away with just 2 or 3 motherboard designs for the entire range, but with different cases and general configurations . . . you could have 1 board for the low-end console models as well as pizzabox machines -both using an external expansion interface- and then a big-box model with a built-in array of expansion slots) And then successive evolutions of the design following that. (especially faster CPUs and added RAM)

 

By 1987 the Amiga did actually have that array of machines with the A1000, 2000, and 500, but that was 2 full years after the initial release, and still rather lacking in standardization and expandability. (the A1000 and 500 could not easily add A2000 class expansion, especially the 500) There was also a lack of models with faster CPUs (especially odd given FastRAM seemed to beg for a 14.3 MHz 68k) aside from the somewhat kludged 020/030 accelerator boards for the 2000. (no fast mid-range 68k models, no 020/030 models on the motherboard with 32-bit fastRAM until the A3000, and that on top of the flawed management/marketing)

 

It's not even an issue of cost: adding a flexible expansion slot on the ST in place of the cart slot would have had very little impact as such, it was really just a matter of a persisting flawed market model . . . and one Atari Inc management had finally been recognizing themselves in '83. (hence the 1090XL) But CBM, Atari Corp, and Apple all threw the Apple II and PC's strengths out the window and started that cycle again. (except with far more fierce competition than the Apple II had vs A8 vs C64, etc)

It would have been very important in Europe as well as the US, in spite of the ST and Amiga doing so well as it was. (if there had been easy expansion, the ST certainly would have become more deeply entrenched -rather than having to go out and buy a whole new machine, just upgrade it -it would make consumers far, far less likely to start switching over to Amiga, make upgrades for RAM/blitter/sound/etc far more prevalent -and software likewise- and directly competitive with PCs when they finally started making a real appearance on the mass market in Europe)

 

But that's really a different topic. ;) (and one I wanted to discuss separately)

 

 

 

 

Tandy didn't want to compete with their other product lines or they could have added more features to the CoCo. They even axed better versions of the CoCo that had been developed (better keyboard, hardware driven RS-232, RAM disk supported by BASIC, etc...). Why on earth would someone with that attitude create a computer that might compete even more with the Model II/III?

Exactly, and with a universal standard, there wouldn't be such contention of different product support. (and margins on the higher-end systems would be much higher thanks to the low-cost embedded chipsets they'd have to use)

 

Tandy obviously didn't want to discontinue the older product lines and focus on a co-co specific standard (ie move forwards with Model 2/3 range CoCo derivatives as well as the low-end models), and there was obviously still a market for the pre-existing designs, so why not build on that and have most of what made the CoCo great as a marketable product (ie a cheap machine, reasonably well marketed/distributed, and with decent performance and flexibility), but without the contention that made it mutually exclusive in the higher-end markets?

 

The highly integrated chipset in the CoCo made it possible. No similar commercial chipset appeared for the Z80 until MSX. The custom chip in the Speccy made it possible but it was proprietary and not available at the time the CoCo was developed. Building a Z80 machine without a highly integrated chipset made them expensive and large or very stripped down.

And why would they have to use an existing chipset? Why not engineer their own basic chipset that merged the rudimentary TTL hardware of the original design(s) and added some good features on top of that? (RAM definable characters and color support especially)

And then an AY-8910 for audio and perhaps the parallel port. (and/or use it for digital joyports -having full Atari analog/digital joyport support would have been significant . . . though given the limited number of paddle games on the CoCo, they probably could have gone all digital and saved a lot of headaches)

 

If Tandy wanted to do a cheap CoCo with a Z80, all they had to do was something like the Laser/VZ200 & 300. They were Model I clones with a 6847. But they didn't have built-in 64K, RS232, or joysticks. Again, it comes down to the chipset. Even later versions of the VZ had custom chips.

Yes, but those came too late and lacked features I mentioned.

It would be a bad idea to bother with the 6847; it would be a waste to include on top of the custom ULA for TRS-80 model 1 compatibility . . . enhancing that ULA would make a lot more sense. (be it RAM defined characters, color modes, bitmap modes, enhanced semigraphics modes, etc, just some general configuration that would work OK . . . RAM definable characters and Spectrum quality RGBI attribute cells would be pretty decent for the time -like CGA, but with RAM definable characters, and maybe with bitmap modes too -or display list interrupts to allow the character set to be changed after each row and thus allowing a fairly straightforward pseudo-bitmap mode, or a more limited set-up like the VIC used)

 

The closest thing to what you propose is the CP300 from Microdigital (Brazil). It was a Model III clone that added hi-res graphics and sound. Sort of a Model III with a few of the Model I hacks built in, and in a smaller case with a CoCo-like chiclet keyboard. But with that number of chips it would have cost more than a CoCo and it didn't support CP/M.

What I'm thinking of was a successor to the model 1 (and perhaps model 2 if they hadn't created it as part of the model 2 standard) that included all the features/hacks of the model 1 and 2 (the latter at least for high-end models) with most/all the custom TTL embedded in custom chips (ULA, gate array, or full-custom depending on the case) and then adding some features to make it more attractive in the home market. (albeit colored text would be useful for science/business as well -graphics would eventually become important as well, but that didn't really happen until the mid 80s)

 

The memory map of the Model I did not lend itself to conversion to a CP/M machine without a major remap of memory, and you lose compatibility if you don't offer switching the memory map. Someone did come up with a board that does just that for the Model I and a special version of CP/M. However, regular CP/M expects an 80 column screen, so you need special versions of all software (which defeats the purpose of using CP/M) or an 80 column display, which won't work on an RF TV. It also increases the chip count of the machine. Tandy did support CP/M in the Model IV but it was still an expensive machine with its own monitor. The motherboards in the III/IV are huge.

Yes, add a basic MMU/mapper ASIC (if not embedded in the video ULA) to remap memory for new modes, and definitely add a true 80 column text mode. (especially 80x25)

80 columns will definitely work through RF, but the question is more a matter of "will it be readable" and how high the actual resolution used is. (going with 6x8 character cells and a 10.74 MHz dot clock would probably be the most sensible -so 480x200- and it obviously would be a colorburst-disabled mode -you could have color modes in RGB)

 

It would definitely work on composite monitors, in that case you could definitely use a 14.3 MHz dot clock with CGA quality text (which is exactly what CGA supported, with RGB quality grayscale text on a composite monitor/TV -with colorburst disabled), but the question is whether it would be usable in RF. (and that would depend on the TV and the modulator used) That's the main reason I say 10.74 MHz is a bit more realistic, though if you could get good quality RF modulators for the time (Atari's seem to be pretty damn clean), doing full CGA res would have made sense. (besides, those wishing to really use the machines for business would have bought monitors anyway and thus not had any issues whatsoever)
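The dot-clock figures above can be sanity checked: both proposals are integer multiples of the NTSC colorburst frequency, and the 6-pixel-cell and 8-pixel-cell options land on the same active line time (the cell widths are the post's suggestions; the clocks are standard colorburst multiples):

```python
# 80-column dot-clock check: 3x colorburst with 6px cells vs 4x (CGA-style) with 8px.
colorburst = 3.579545e6            # NTSC color subcarrier, Hz

results = []
for mult, cell_width in [(3, 6), (4, 8)]:
    dot_clock = mult * colorburst
    pixels_per_line = 80 * cell_width
    active_us = pixels_per_line / dot_clock * 1e6
    results.append((dot_clock, pixels_per_line, active_us))
    print(f"{dot_clock/1e6:.2f} MHz, {pixels_per_line} px/line, {active_us:.1f} us active")
```

Either way the 80 columns occupy about 44.7 µs, comfortably inside the roughly 52 µs visible portion of an NTSC scanline; the real question stays where the post puts it, with modulator and TV quality.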

 

 

Z80 CoCo or not, Tandy should have moved towards their own custom chip but they didn't have the staff or facilities to do that... and apparently no desire to do that.

That's the main issue and the main point I was pushing towards . . . and something that had changed by 1984. (at least enough so that they could reverse engineer the PCJr chipset and implement it in a standard ISA motherboard -and that's assuming it is fully reverse engineered and not their own implementation with similar features)

 

It all comes down to business/management decisions, not so much the technology. Both are always important, and a good business model can be ruined by crap hardware, but there are far more examples of great engineering design with lost potential due to the business/market model -in some cases doing well for a while, but eventually falling apart. The PC could have gone the way of the Apple II if IBM had been allowed to run it into the ground, ie if clone manufacturers hadn't made it a true, persisting standard. Tandy certainly made its mark as one of the first, if not the first, to really push the PC as a realistic mid-level consumer standard -and eventually low-end and more cost effective than anything else on the market. And if Apple II clones had gotten to the state of 1985/86 PCs by 1982/83, it may have become the standard that the PC ended up taking instead. There's also still the huge difference between what IBM was pushing and Apple: even in the worst cases of the incompatibility of the PCjr or PS/2, or the lack of any decent offering for the mid-range/low end, IBM was still pushing a persisting hardware/OS standard. The PS/2 may have omitted ISA and 5.25" drives, but it still ran DOS and normal PC software -and OS/2 was pretty damn good, though eventually ruined by bureaucracy, preventing it from truly competing with MS Windows.

 

If they had come up with FG/BG color parameters similar to the Speccy (and LNW80) they could have maintained backwards compatibility while adding color. Or 384 x 192 mode could have generated color artifacts on NTSC without FG/BG color support. Using graphics with a proportional print routine could get a few more characters per line, maybe even 80, but it also requires more memory and you still have problems for CP/M. And again, more chips.

Again, using actual character graphics definable in RAM (Atari/VIC/TMS style) would have made the most sense. None of the heavy CPU overhead of managing a framebuffer or the RAM required for such a buffer, just RAM to hold the character data and tables for color attributes and character graphics. (that's something that CGA really, really should have supported . . . CGA games in 16 colors using 40x25 or -especially- 80x25 text -the latter would mean pretty smooth horizontal movement and better use of dithering)
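The RAM savings of a character mode over a framebuffer are easy to put numbers on. A rough sketch, assuming a 320x200 display split into 8x8 cells (40x25), a 4bpp bitmap for comparison, and 1bpp character patterns with per-cell attributes (all of these dimensions are illustrative assumptions, not any specific machine's layout):

```python
# RAM footprint: full 4bpp framebuffer vs RAM-definable character mode
# (Atari/VIC/TMS style name table + attributes + character set).
bitmap_bytes = 320 * 200 * 4 // 8      # full 4bpp framebuffer

name_table = 40 * 25                   # one byte per character cell
attr_table = 40 * 25                   # per-cell FG/BG color attributes
charset    = 256 * 8                   # 256 patterns x 8 bytes (1bpp 8x8 cells)
char_mode_bytes = name_table + attr_table + charset

print(f"bitmap: {bitmap_bytes} bytes, character mode: {char_mode_bytes} bytes")
```

Roughly 32K versus 4K, and moving a character only costs a one-byte write to the name table instead of blitting 32 bytes of pixels, which is the CPU-overhead point above.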

Besides, you get color clash problems in Spectrum type bitmap modes anyway, and some of the better optimized Speccy games intentionally move only on a cell basis for that reason. (it's also a lot easier to blit things that way . . . horizontally -working only on byte boundaries, same issue with the ST due to its use of planar pixels)

 

Just have the CPU manipulate arrays of 8 or 16 bit words (depending on whether the attributes and tilemap/name tables are combined or separate) and occasionally update the character set if necessary.

 

The custom chipset from Motorola for the CoCo made that machine possible and nobody was making a similar chipset for the Z80 until MSX which is way too late. The only reason the Speccy was possible was because Sinclair could make their own chip to integrate everything. If Zilog had made a highly integrated chipset to allow cheap CP/M machines, the market would have been flooded with cheap Z80 machines. That was a missed opportunity on their part.

That happened in Japan with the NEC PC8000, 8801, and Sharp X-1. ;)

 

There were tons of generic LSI support chips compatible with various CPUs . . . the main missing area was video though (plenty of parallel, serial, timer, etc support chips). There were some Intel video chips that may have been usable, but I'm not sure. (in any case they wouldn't be TRS-80 compatible)

The video portion (and sometimes the sound section) are what custom chips were generally used for, or discrete logic, but all of those custom machines had a lot of off the shelf LSI logic too. (tons of stuff from Motorola, MOS, Zilog, Intel, and various 2nd sources -the VCS had them, as did the A8, VIC, and C64 -RIOT, PIA, VIA, CIA, etc, etc -and of course the AY8910)

 

The TI chipset had been available since '79 (or '81) too, but there obviously would have been a conflict of interests for use in home computers. (but Coleco did use it in 1982, totally off the shelf and in nearly direct competition with the 99/4 -notable given how game oriented TI's market was, especially the closed software nature and emphasis on carts- then a variety of Z80 based machines arrived with that chipset in 1983, and the Sord M5 was actually out in '82) None of that was CP/M compliant though. (let alone Tandy compatible ;))

 

TTL is an option too. Sinclair did just that with the super low-end ZX80, but, of course, that was super rudimentary and simple even by TRS-80 standards. (it couldn't even display video and run code at the same time . . . the ZX81 solved that with the mode allowing the CPU to work in vblank)

 

Clearly, there were options available to Tandy with the Z80 route but I don't think CP/M would have fit into a cheap unit and a low cost unit would have required a custom chip which Tandy obviously wasn't prepared to manufacture.

Tandy wouldn't manufacture it, they'd need to engineer it though, or pay to have it designed externally. (they'd use some generic chip vendor to produce the custom chips -commodity ULAs or gate arrays would be the cheapest option and should have been available by then -at least ULAs- but full custom would require more substantial investment)

 

The biggest problem with the CoCo was Motorola should have made the graphics chip a little more flexible with color and they should have included a programmable timer like the CoCo 3 had for driving the D/A converter for sound. If the 6847 had included a mode that gets color for each pixel from an analog input, the CoCo could have had palette registers in the SAM. It would have been much more attractive even with a limited number of colors on screen. The timer would have been easy but when you consider when the CoCo was developed, nothing like that had been done before.

Yes, 4 palette registers would have been great . . . or even fewer than that with some fixed colors. (even with the limited palette of the SAM, but probably 16 colors if you had to use 4-bit color registers anyway)

 

People building the game carts could buy fast enough ROMs if they were using the double speed ROM mode.

There were different speed grades.

Since the CoCo didn't have sprites, it might have made quite a difference with some games.

Games like Demon Attack could have ramped up the speed on higher levels.

Yes, they'd have to be willing to have that higher cost for games.

 

Most of the machines that didn't handle the high speed POKE (D board... probably the first board released for sale) would work with it if you cut some capacitors. At least an article in an early Rainbow magazine indicated that. That means it was a bus signal issue rather than heat that caused most machines not to work with the high speed POKE. A few early machines also needed a new SAM chip.

Dragons seem to have less success with the high speed POKEs than CoCos.

I highly doubt heat would ever have been an issue with such chips at higher clock speeds; it's the same thing for overclocking pretty much any pre-386 (and even most 386) era CPUs. Overheating from high clock speeds was a non-issue (NMOS circuits also barely dissipate more power at high speeds than at low ones . . . but much more than similar CMOS chips at pretty much any speed ;)); the issue was logic stability, not heat. You could get overheating by pumping too much power into the chips -especially too high voltage- but that's a separate issue. ;) Likewise, you also generally won't kill old CPUs with overclocks; they just won't function at the higher speeds, but they won't burn up either. This came up in a recent discussion on Sega-16, including 100 MHz overclocking of old NMOS 68ks being harmless, but 9-12+ volts being almost instant death. (of course, the chip was nonfunctional at 100 MHz, totally unstable)

From most anecdotal accounts, pretty much all those old NMOS CPUs from late 80s/early 90s MD/Genesis consoles can be overclocked substantially; the limiting factor is always the external logic/memory and not the CPU, unless you replace that external logic. Some later models overclock better due to faster memory, but around 1992 that stops and you can't overclock anything past ~10 MHz due to the new, timing-sensitive system ASIC. The very early systems tend to be limited to ~12-13 MHz, while the late early models can go well into the high teens and still be reasonably stable. The big issue then tends to be the games; too-slow ROM will cause errors or crashing, plus there are a lot of other factors not related to the CPU's overclocking tolerance -especially the fact that it's an asynchronous overclock in a system without provisions to facilitate such, so missed reads/writes to I/O or RAM/ROM would be the prime stability issues. (synchronous overclocks require much heavier modification and are so much work that it's not worth it beyond curiosity)

In the CoCo's case, you necessarily have memory that's already fast enough to do that with, unless you really overclock the CPU beyond that 2x speed. ;)

 

As I said, it's rather odd that so many old CPUs seem to overclock with relative (or complete) stability. Maybe the areas of instability are rarely encountered in certain machines, or a high percentage of lower-rated chips really are totally stable at higher speeds but down-rated for whatever reason -be it yields from certain plants/batches and not wanting to deal with the overhead of grading each CPU individually, or just down-rating CPUs regardless, to allow the desired price hierarchy with a range of different speeds rather than just the fastest one at a low price. I'm not sure about the early CPUs, but I seem to remember mention of most/all late generation -ie early 90s onward- 68000s being capable of 16 MHz at the very least, and often more than that, but manufacturers keeping lower grades for price differentiation, plus a conflict of interests in Motorola and Toshiba's cases. (as they had higher-performance 68k successors)

 

I wonder how many CPUs end up being like that (totally fine with high yields of high speed chips, but persistent lower grading for business reasons), and how early in the life cycle that might occur. (obviously, stability at higher speeds is a very real issue, especially when a chip is new on the market)

 

I wonder why more companies didn't overclock by routine. (either test on a batch basis or per CPU depending on how confident they were at the actual speed tolerances)

Granted, if that did happen, chip manufacturers would have to figure something else out for managing their margins. ;)

 

The Atari chipset wasn't designed that way.

Every custom chip in the Atari is dependent on a fixed clock rate, and changing that clock will require changes to all the internal and external timing... and it would need to work at two different clock rates and different bus timings.

So all that wait state generation, video, CPU, and refresh timing is interdependent? (actually, that makes a lot more sense . . . otherwise you'd need dynamic management of wait states and that would be extremely complex for the time -and difficult to maximize efficiency)

 

In that sense, it would probably be easiest if only integer multiples of the original speeds was supported. (ie if you did keep the video/DMA speed the same, but doubled CPU/RAM speed, you'd need to make sure everything accepted 2 clock cycles where you had 1 before, and that all wait states responded fast enough to halt the CPU before it got an error -ideally, you'd speed up the refresh logic as well and minimize wait states there too -of course, if the video hardware was no faster, you'd get no added bandwidth for video and access at the same old slow speed)

Anything in-between 1.79 and 3.58 MHz would be funky to manage though . . . unless the video/DMA timing was changed to match. (and that would also screw up timing for NTSC colorburst -another thing simplified by integer multiplication)
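The "integer multiples only" point falls out of the clock arithmetic: the A8's CPU clock is colorburst/2, so a doubled clock stays phase-locked to the color subcarrier while an arbitrary in-between speed does not. A quick illustration (the 2.5 MHz candidate is just an arbitrary example, not a proposed speed):

```python
# Clock-ratio check: which candidate CPU speeds stay an integer multiple of
# the original colorburst-locked clock?
colorburst = 3.579545e6
cpu_clock = colorburst / 2                # original ~1.79 MHz NTSC CPU clock

for candidate in [2 * cpu_clock, 2.5e6]:  # doubled clock vs arbitrary speed
    ratio = candidate / cpu_clock
    locked = (ratio == int(ratio))        # integer multiple -> stays phase-locked
    print(f"{candidate/1e6:.3f} MHz -> {ratio:.3f}x, integer multiple: {locked}")
```

Only the 2x case divides evenly, which is why doubling everything (or nothing) is so much simpler than any fractional speed-up.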

 

If they ever did update the chipset, it probably would have made more sense to double the speed of everything in the system. (3.58 MHz CPU with effective ~2.4 MHz performance, maybe a bit more if refresh was more efficient, and double the video DMA bandwidth -and potentially double the horizontal resolution for all the current modes, which would really be significant -adding an 80 pixel 8bpp mode would be significant too)

Of course, anything like that would be especially difficult without the original engineers or at least extremely detailed low-level documentation of the hardware. (but if they did actually do that, the A8 would suddenly have become extremely capable for the time -software notwithstanding- with enough resolution for more serious applications, high res higher color depth modes, double the CPU resource, etc -160 wide 4bpp probably would have been the best for games, and not bad for software blitting on byte boundaries either -moving 2 pixels at a time in an 80x192 grid)

 

As for an 8-bit CPU on a 16-bit bus, couldn't you add an external 16-bit latch to fetch 2 bytes which the CPU could then read consecutively? (or read only end up using 1 of the 2 bytes depending on the case, so only improving shared bandwidth some of the time)

It would actually be two latches, one for video and one for the CPU, the latch circuitry would be integrated into a custom chip. On the CoCo it would be the SAM or GIME replacement.

In the case of video refresh it works very well because reads are from consecutive addresses. That is where the big gain would be, fewer collisions with the screen refresh. The latch for the video refresh alone cuts collisions in half.

It also works for the CPU until the code branches or code isn't aligned on even bytes, so there could be a few wait states. And then you have to deal with writes which involves a load and a save for every byte unless you use cache memory. The important thing is that you are running the CPU at full speed most of the time even during screen refreshes. With a small amount of cache memory for the CPU you could almost eliminate wait states for a pretty low cost design.
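The collision-halving claim for the video latch is simple to model: video refresh reads consecutive addresses, so each 16-bit bus access fills the latch with two bytes and the bus is only occupied half as often. A toy sketch (the 32 bytes/line figure is illustrative, not CoCo-accurate):

```python
# Toy model of the 16-bit video latch: wider fetches of consecutive screen
# data mean fewer bus cycles per scanline, hence fewer CPU/video collisions.
def video_bus_cycles(bytes_per_line, bus_width_bytes):
    # Each access delivers bus_width_bytes of consecutive screen data.
    return (bytes_per_line + bus_width_bytes - 1) // bus_width_bytes

narrow = video_bus_cycles(32, 1)   # plain 8-bit bus
wide   = video_bus_cycles(32, 2)   # 16-bit bus feeding the latch

print(f"8-bit: {narrow} cycles/line, 16-bit+latch: {wide} cycles/line")
```

Half the bus cycles for refresh means half the windows in which a CPU access can collide with video, exactly the gain described above; the CPU-side latch is messier because branches and odd-byte accesses break the consecutive-address assumption.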

You'd also have to weigh the cost and complexity (and performance) of a shared 16-bit bus compared to 2 dedicated 8-bit buses. (would have been the same issue for the Amiga going 32-bit with a latch for the CPU rather than adding fastRAM -though fast page mode is also an important consideration, and separate buses are much more important for that, once the bus masters get fast enough to make use of it -had the ECS added page mode playfield scanning, it could have doubled the bandwidth without using faster clocked memory -ie 140 ns FPM reads for scanning the screen, AGA did that but only for Copper/Alice, no added bandwidth for PAULA or -more importantly- the CPU -it really needed an updated serial DMA mechanism for the CPU to have burst access to chipRAM with no interleaving . . . that interleaving made it about as bad as an 8 MHz 8-bit ISA bus, or worse than ISA if you were doing 8 or 16-bit accesses and not 32-bit)

 

In the Amiga's case (or a would-be ST for that matter), dual buses was probably the right way to go for the time . . . but they didn't really pull that off effectively. (the main alternative to multiple buses would be heavy buffering to allow higher bus width and sustained fast page access -on the chipset end that would be caches or line buffers at the very least, and the CPU would need a cache too if it was to use the bus efficiently -dual buses would require less intensive R&D, and given how poorly CBM built onto the chipRAM bus, fastRAM was the only marginally foolproof option they could use ;) -the more cost effective options needed more aggressive R&D and some might not have been realistic for the time, or would require so much board space that it would be impractical -ie if the caches/buffers had to use external SRAM/logic, it would be totally unrealistic)

 

 

The biggest expenses with this type of design are a larger custom chip (8 more pins) and a more complex board layout.

You don't need 16-bit (wider) RAMs to do it since you can place 8-bit RAMs in parallel. So the RAM should be cheaper than the same amount of faster RAM. The question is whether it makes up for the increased board and custom chip costs. I think it would have at that time. 64K RAM chips to upgrade a CoCo 1 were $75. If faster RAM chips are even 30% more expensive, it probably pays for the more expensive custom chip and board changes.

Yes, but again, there's the issue of how the cost (and relative performance) would compare with a 2-bus 8-bit design. (same number and speed of DRAM chips too, same number of data pins/traces, no complex buffering, but added complexity for interfacing 2 buses with 2 DRAM controllers and an interface to link the 2 buses -be it I/O ports or DMA)

Separate buses also tend to make less efficient use of memory, so that's a consideration.

 

 

In either case, you may or may not have needed more (or wider) DRAM chips. It might have been that you already had 16-bits worth of RAM chips wired to 8 data lines (ie 16 1-bit chips, 4 4-bit chips, etc), though you might also have something like 6 4-bit chips where you couldn't do 16-bits without changing the chips used. (or obviously cases where you only have 8 data lines total, 2 4-bit chips, 4 2-bit, 8 1-bit, etc -so you might not end up with more total chips, but you'd need wider DRAMs at the very least)
