kool kitty89

Why didn't/couldn't Atari Corp use the AMY in the ST?


From memory, some of the I/O bits are ad hoc and some are used by the printer port.

 

16 bits of I/O isn't really "special", but it saves having to incorporate a PIA/CIA/whatever, and of course it's costly in terms of the number of pins you need to allocate.

 

Quad Pokey to my knowledge was never released and was really just a consolidation effort to save board space. Probably nothing special about it, and I wouldn't be surprised if Tramiel had rights to it anyway... Atari stopped using POKEYs in its later arcade games.

 

I reckon if POKEY had I/O like the YM, it probably would have landed in the ST. If they'd put in the effort they could easily have created a newer version - the way to go, IMO, would be to go dual or quad, then cut the number of POT inputs back to 4 or 2, which would free up some pins that could have gone to I/O.

 

But like so many things, chances are they had absolutely nobody onboard with the ability to develop it further.


Edit:

I missed finishing part of my last statement, so see here:

The reason the ST was chosen as their first major new machine was that the 68000 was the way to go, and 16-bit was clearly the future looking at the Mac and Amiga, really, I guess. The 8086 was a dodo as far as technicians were concerned, and in 1985 an 80286 CPU probably cost as much as a complete Amiga 1000! So there is certainly an element of truth in that, but everything was rush rush, faster faster, including the CP/M-68K conversion and the GEM 68K conversion. We don't complain about GEM; we do complain about the Spectrum/Amstrad sound chip. Can't win 'em all, and the ST was significantly cheaper than the Amiga 1000, PC XT and Mac, and wasn't the worst of the four - good value all in all for 1985.

Not so much... the ST (RBP) was in development before Atari Corp existed (or rather, under TTL before it was renamed Atari Corp). Tramiel acquired Atari Inc's consumer holdings specifically because of the ST/RBP design and because he needed the resources/name/network to bring it to market.

. . .

Atari Inc was already set on the 68000 and had been since ~1983, with several advanced chipsets in development (AMY eventually becoming part of that) along with a Unix-based OS and the "Snowcap" GUI. Warner initially took strong interest but pushed back when they decided they really wanted to focus on the consumer market more than the high end. (The 68K machines had predominantly been configured as high-end workstations, although in 1983 even something like the ST would have been high-end, and there was a range of how high-end the Atari machines were - but that was up to the implementation, NOT the chipsets themselves, like including 2 or 3 68Ks or 2 MB of RAM. ;))

But in 1984 they were aiming at the Amiga chipset with their Mickey console, and also at pushing the 16-bit OS onto that platform as it evolved into a computer (in 1985 it was to get a computer add-on and up to 128K of RAM; in '86 it could have been sold as a full computer). Amiga, however, lied to Atari, saying they couldn't get the Lorraine chipset to work, and returned Atari's $50,000 investment just one or two days before Warner split up the company and Tramiel got the consumer holdings. (Something Warner had NOT bothered to tell any of the Atari Inc senior management, including president James Morgan, making the transition a confused mess and leading to lawsuits against Atari Inc - Atari staff didn't even understand that they'd all been laid off, with Atari Inc being liquidated and Atari Corp being just TTL renamed, so it was up to Atari Corp to hire new employees from the old Atari Inc staff.) All the while they were pushing for the careful withdrawal of the 5200 and the release of the 2600 Jr and the 7800 (along with an adapter for 5200 owners) for fall of 1984, alongside the Mickey.

 

Had Atari Inc continued to exist, they likely would have sued Amiga, switched to one of their own chipsets (wgungfu mentioned Rainbow was being looked at for use in a game console), and pushed on. They might have been able to win the lawsuit against Amiga/CBM faster than Tramiel did, and the 7800 would have been out in 1984.

It wasn't Tramiel's fault but Warner's, of course, and what Morgan was doing in 1984 was extremely promising according to Curt and Marty: cutting away the bureaucratic red tape and reworking Atari Inc into a lean, clean corporation. (NATCO was part of that.)

A real shame the reorganization hadn't started a year sooner... they really could have avoided a lot of trouble had their problems been headed off when they started to become serious in 1982.


Pointless, then, investing in R&D projects if very little of them ever sees the light of day

It's not pointless for big/healthy companies. Sometimes you invent important things that you can put into real products later. Other times you get patents out of it. And it attracts really smart people that you just couldn't hire otherwise.

 

But it's true that research groups running amok can waste both resources and opportunities. There are enough famous stories of that in the valley, all the way back to Xerox PARC. If you're taking products from research groups and showing them at CES, you're probably about to miss the boat.

Yes, and had Atari Inc not run into such management issues it would have made total sense. Much of the stuff that was way ahead of its time in 1983 might have made a lot more sense by '85/'86, especially if implemented in less extravagant configurations. (i.e. unlike 3 68000s or 1 MB system RAM + 1 MB video RAM in 1983!)

 

While not originally designed for any such requirement (neither were the 16-bit chipsets), AMY became part of several (if not all) of those advanced 16-bit computers and chipsets in development (Sierra/Gaza/Silver & Gold/GUMP/Rainbow), with the exception of the Amiga-based Mickey design planned for release as a high-end game console in late 1984. (With computer expansion a year later and a full computer allowed in '86 per the contract with Amiga - Amiga defaulted on that just a couple of days before Warner rashly broke up Atari Inc.)

 


From memory, some of the I/O bits are ad hoc and some are used by the printer port.

 

16 bits of I/O isn't really "special", but it saves having to incorporate a PIA/CIA/whatever, and of course it's costly in terms of the number of pins you need to allocate.

The MOS-based I/O chips (like the RIOT and PIA) used 16 I/O lines just like the AY chip and were also 40-pin DIPs.

 

Quad Pokey to my knowledge was never released and was really just a consolidation effort to save board space. Probably nothing special about it, and I wouldn't be surprised if Tramiel had rights to it anyway... Atari stopped using POKEYs in its later arcade games.

I wasn't suggesting that per se, but hypothetical consolidation of POKEY and/or other 8-bit chips in a practical sense. (Early on using the plain chips, then later consolidating them and perhaps dropping the pin count in the interim.)

 

As for the arcade, Atari Games continued to use POKEYs through the late 80s, alongside 6502s dedicated to audio control. (In most cases POKEY was only used for timers and the 6502 was driving a YM2151 - presumably Atari Games had enough stock left over to make that attractive over other off-the-shelf parts, or Atari Corp was selling them to AGames.)

 

I reckon if POKEY had I/O like the YM, it probably would have landed in the ST. If they'd put in the effort they could easily have created a newer version - the way to go, IMO, would be to go dual or quad, then cut the number of POT inputs back to 4 or 2, which would free up some pins that could have gone to I/O.

 

But like so many things, chances are they had absolutely nobody onboard with the ability to develop it further.

That's why I was thinking in terms of what they had: rather than re-engineering anything to add capabilities, simply using existing hardware. Even if it used more board space, there were the advantages of owning the IP or license, already stocking the chips for other products (or leftover stock), and the potential to consolidate them into smaller custom chips. (The most complex part of that would be shrinking the dies and potentially cutting out vestigial portions of the chips - like if the POT or SIO lines of POKEY weren't used.) If they didn't have the necessary chip designers in-house (not tied to the actual creation of any of those ICs, but simply capable of building an ASIC out of the old hardware), they would need to outsource, though. (Cutting the pin count of the packages wouldn't require any of that - no change to the masks, even - just switching to smaller packages with some connections omitted, like the 6507 did for the 6502, or the AY8912 and AY8913 cutting the AY8910 from 40 down to 28 or 24 pins by removing one or both I/O ports. Possibly not dropping below 24 pins, due to narrow DIPs not fitting the IC without shrinking the die.)

 

I'm not sure of some possibilities, like SIO being used for MIDI or maybe RS-232 (I think the former is more likely), and timers would definitely be useful (POKEY and RIOT), but on top of all of that, the idea would be to minimize the difference in cost. (If you had a 6502 driving a POKEY, that wouldn't be the most cost-effective for the task, but it would be a decent hack for PCM playback short of custom DMA logic.) Aside from that, POKEY+RIOT (or PIA) alone might have worked with the right glue logic added. (I was thinking in terms of the 6502 being kept as an audio/I/O manager in such a case, though, with the GLUE connecting it to the 68K bus.)
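To make the PCM hack above concrete: POKEY has a "volume-only" mode (bit 4 of AUDCx set) in which the channel output simply follows the 4-bit volume field, so a CPU can play samples by rewriting that register at the sample rate. A rough sketch of the sample conversion, assuming unsigned 8-bit source data (the AUDC bit layout is the documented POKEY format; the function names are mine):

```python
# Sketch: converting unsigned 8-bit PCM to POKEY volume-only register writes.
# In volume-only mode (AUDCx bit 4 = 1) the channel output follows the 4-bit
# volume field in bits 0-3 directly, so a CPU (e.g. a dedicated 6502) can
# play PCM by storing a new AUDCx value once per sample period.

VOLUME_ONLY = 0x10  # AUDCx bit 4: "volume only" (forced output) mode

def sample_to_audc(sample8: int) -> int:
    """Convert an unsigned 8-bit sample (0-255) to an AUDCx byte."""
    volume = sample8 >> 4            # keep the top 4 bits (0-15)
    return VOLUME_ONLY | volume      # volume-only mode flag + volume

def pcm_to_audc_stream(samples):
    """Convert a whole buffer of 8-bit samples to AUDCx values."""
    return [sample_to_audc(s) for s in samples]
```

At, say, an ~7.8 kHz sample rate that works out to one register store roughly every 128 cycles on a 1 MHz 6502, which is why a dedicated 6502 (rather than the 68K) makes sense for the job.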

 

But again we've gone way off the main point and I should probably drop this or bring it over to a separate thread. ;)



It's not pointless investing in R&D even if the stuff never sees the light of day.

 

It gives those involved experience, which can be, and often is, carried over to other projects.

 

What is pointless is companies that just lose the people involved, usually due to their own mismanagement.


It's not pointless investing in R&D even if the stuff never sees the light of day.

 

It gives those involved experience, which can be, and often is, carried over to other projects.

 

What is pointless is companies that just lose the people involved, usually due to their own mismanagement.

Yes, but in the case of Atari Inc, that was in large part Warner's fault for poorly managing the transition: not notifying Atari Inc (not even James Morgan) of a possible pending sale/split of the company, with said sale being agreed upon, still with no notice, and finalized over the 4th of July weekend of all times, with James Morgan brought in to sign the contract at literally the last minute!

 

Tramiel may have had a fair bit of tunnel vision on the ST, but the situation was a total mess, with Atari Inc staff having no idea what was going on and not even understanding that Tramiel hadn't actually bought the company, just the consumer product holdings, the associated production network, and a license to the Atari name. All Atari Inc staff were technically laid off in the liquidation, and thus it was a question of who among them would be hired into Tramel Technology Ltd, henceforth renamed Atari Corporation.

 

So who knows what might have happened if Atari staff (especially Morgan) had had time to prepare for such a transition and manage it as smoothly as possible. (The 7800 conflict was an additional snag after the fact, with Warner claiming ownership and requiring Tramiel to pay GCC for all R&D costs for Maria and 10 games, which he finally relented on in early 1985 after going back and forth.)

Hell, it would even have made sense to bring over Morgan (and maybe some of the other senior management) to smooth the transition until the Tramiels got a handle on operations. (Or to keep some on longer if that proved desirable.)


It wouldn't have anything to do with a PC, though: the original AMY interface was intended to use a simple 8-bit MCU, with an 8051 specifically referenced (successor to the older 8048 line - used in the Intellivision among many other embedded applications).

 

Nope! The Intellivision uses the CP1610, which is a 16-bit CPU and nothing like an 8051.

 

Yep, true. The i8048 was used in the Odyssey 2.


It wouldn't have anything to do with a PC, though: the original AMY interface was intended to use a simple 8-bit MCU, with an 8051 specifically referenced (successor to the older 8048 line - used in the Intellivision among many other embedded applications).

 

Nope! The Intellivision uses the CP1610, which is a 16-bit CPU and nothing like an 8051.

 

Yep, true. The i8048 was used in the Odyssey 2.

I already mentioned that in post 24. ;)

 

http://www.atariage.com/forums/topic/172615-why-didntcouldnt-atari-corp-use-the-amy-in-the-st/page__p__2140366#entry2140366


Sorry for resurrecting an old thread, but I just stumbled on an engineer, John Foust, who worked at the company that bought AMY from Atari!

 

http://www.dadhacker.com/blog/?p=1000 and do a search for "AMY":

 

John Foust says:

April 16, 2008 at 6:43 am

I was part of the company that bought the AMY sound chip technology from Atari. We’d hoped to use it at the heart of new synthesizers. As a digital additive synth, our software and methods quite resembled the technology that would become MP3. We had very rich sounds compressed to an unprecedented degree.

 

 

landon says:

April 16, 2008 at 7:20 am

I remember the Amy chip; it was one of the really cool things that the Tramiels kept going. I remember a few non-working revs of the chip coming back, and that everyone was a little bit depressed about that.

 

Here’s a page on the Amy that I found. Definitely ahead of its time.

 

http://www.atarimax....org/achamy.html

 

Maybe somebody could contact John F. for more information?
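For anyone unfamiliar with what "digital additive synth" means in Foust's description: an additive voice is built by summing sine harmonics, each with its own amplitude. A minimal conceptual sketch of the technique (this illustrates the textbook idea only, not AMY's actual hardware architecture):

```python
import math

def additive_voice(f0, harmonic_amps, sample_rate=44100, duration=0.01):
    """Render a voice as a sum of sine harmonics of fundamental f0.

    harmonic_amps[k] is the amplitude of harmonic k+1 (so index 0 is
    the fundamental). Returns samples normalized to [-1, 1].
    """
    n = int(sample_rate * duration)
    out = []
    for i in range(n):
        t = i / sample_rate
        s = sum(a * math.sin(2 * math.pi * f0 * (k + 1) * t)
                for k, a in enumerate(harmonic_amps))
        out.append(s)
    peak = max(abs(x) for x in out) or 1.0   # avoid divide-by-zero on silence
    return [x / peak for x in out]
```

Varying the harmonic amplitudes over time is what gives additive synthesis its expressive (and data-hungry) character, which is presumably where the compression work Foust mentions came in.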



The only sensible upgrade to the Yamaha YM of the original ST would have been a true-stereo (i.e. per-channel panning control) 4-channel DAC with ring modulation, 4 complex filters, and synchronization.

 

The Amiga and especially the Acorn Archimedes showed us what restrictive junk the traditional MT-32/PC/SNES/Mega Drive/PC Engine choices were. The future was DACs.

 

We got some low-rent 2-channel junk only with the STE, plus the YM rubbish. What were they thinking!


The only sensible upgrade to the Yamaha YM of the original ST would have been a true-stereo (i.e. per-channel panning control) 4-channel DAC with ring modulation, 4 complex filters, and synchronization.

 

The Amiga and especially the Acorn Archimedes showed us what restrictive junk the traditional MT-32/PC/SNES/Mega Drive/PC Engine choices were. The future was DACs.

 

We got some low-rent 2-channel junk only with the STE, plus the YM rubbish. What were they thinking!

 

MT-32 junk? Sigh.


The only sensible upgrade to the Yamaha YM of the original ST would have been a true-stereo (i.e. per-channel panning control) 4-channel DAC with ring modulation, 4 complex filters, and synchronization.

 

The Amiga and especially the Acorn Archimedes showed us what restrictive junk the traditional MT-32/PC/SNES/Mega Drive/PC Engine choices were. The future was DACs.

 

We got some low-rent 2-channel junk only with the STE, plus the YM rubbish. What were they thinking!

 

MT-32 junk? Sigh.

 

Yes, JUNK. Your imagination as a musician is restricted to those crap General MIDI-type sounds. DACs mean you can make any sound possible. Sure, it was 16-bit quality for the instruments provided, but the tunes all sound like 99-buck synthesizer demo tunes.

 

Give one to a real musician and he would flush it down the toilet. Rob Hubbard said such devices were stunting his creativity when he moved to the PC-infested USA.


MT-32 junk? Sigh.

Yes, JUNK. Your imagination as a musician is restricted to those crap General MIDI-type sounds. DACs mean you can make any sound possible. Sure, it was 16-bit quality for the instruments provided, but the tunes all sound like 99-buck synthesizer demo tunes.

 

Give one to a real musician and he would flush it down the toilet. Rob Hubbard said such devices were stunting his creativity when he moved to the PC-infested USA.

 

As someone who was paid to perform music for a crowd during the heyday of the MT-32, I humbly suggest that a) that certainly qualifies me as a "real musician", and b) you have no clue what you are talking about. The MT-32 was a versatile, good-quality device used by many professional musicians, though some mods were necessary to get clean sounds out of it. I used it with an ST, an Octapad, a Poly-800 and a DX7.

 

Devices don't stunt creativity, in fact it takes creativity to get exactly what you want from older devices. Those who can't are lazy or uninspired.

 

I see multiple threads where you rag on any sound device that isn't a DAC, so I will let that bit of fanboy-ism speak for itself.

 

AMY is a fundamentally different technology than the MT-32. Trying to directly compare them is not useful.

Could Atari have chosen a better sound chip for the ST? Absolutely.


Bravo, poobah! :) That was what I was thinking. Also, it sounds like macgoo did not follow (or even live) through those old computer industry days of the late 80s. RAM and CPU power simply could not provide the horsepower to run a DAC soundtrack while the CPU was processing graphics, input devices, scrolling, etc. Having synth modules like the MT-32 offloaded the work of high-quality sound to an external device while the computer's CPU did its thing.


MT-32 junk? Sigh.

Yes, JUNK. Your imagination as a musician is restricted to those crap General MIDI-type sounds. DACs mean you can make any sound possible. Sure, it was 16-bit quality for the instruments provided, but the tunes all sound like 99-buck synthesizer demo tunes.

 

Give one to a real musician and he would flush it down the toilet. Rob Hubbard said such devices were stunting his creativity when he moved to the PC-infested USA.

 

As someone who was paid to perform music for a crowd during the heyday of the MT-32, I humbly suggest that a) that certainly qualifies me as a "real musician", and b) you have no clue what you are talking about. The MT-32 was a versatile, good-quality device used by many professional musicians, though some mods were necessary to get clean sounds out of it. I used it with an ST, an Octapad, a Poly-800 and a DX7.

 

Devices don't stunt creativity, in fact it takes creativity to get exactly what you want from older devices. Those who can't are lazy or uninspired.

 

I see multiple threads where you rag on any sound device that isn't a DAC, so I will let that bit of fanboy-ism speak for itself.

 

AMY is a fundamentally different technology than the MT-32. Trying to directly compare them is not useful.

Could Atari have chosen a better sound chip for the ST? Absolutely.

 

Bullcrap.

 

Recreate the following 5 pieces of music on an MT-32 perfectly and we will see....

 

Capella's 1990s chart hit You Got to Know

Jean-Michel Jarre's Zoolook

Ridge Racer 5 track Rare Hero

Enya's classic Boadicea including her voice humming along

A reasonable rendition of the SID soundtrack of the game The Last V8.

 

Oh dear, don't have the instruments built into your shit restricted-instrument-set sound module? Shame. Never mind, you could always make elevator-music-style General MIDI renditions of the tracks so we can all laugh.

 

A musician indeed. When you have posted a highly rated remix on Remix.Kwed.Org using your so-called awesome MT-32, I will change my mind; until then, please keep your cheesy elevator music to yourself, thanks.


Bravo, poobah! :) That was what I was thinking. Also, it sounds like macgoo did not follow (or even live) through those old computer industry days of the late 80s. RAM and CPU power simply could not provide the horsepower to run a DAC soundtrack while the CPU was processing graphics, input devices, scrolling, etc. Having synth modules like the MT-32 offloaded the work of high-quality sound to an external device while the computer's CPU did its thing.

 

Oh yeah, because the Amiga game Super Stardust AGA on the 1200 is soooo compromised running cutting-edge 3D graphics whilst playing 4 hardware channels and 2 software channels (6 channels) of DAC sound. I guess the Archimedes is also a piece of shit, even though the 1987 machine has 8 stereo DACs AND is still more impressive than an Atari Falcon from 6 years later?

 

Oh yes, that told me: old computers really could not make music with sampled instruments using DMA-driven DACs as sound chips; let's all go and buy some complete-shit-sounding MT-32 and play our bollox PC games with scratchy Sound Blaster sampled FX and dirt-shit elevator music that grates on your ears from the LAPC-I or MT-32 sound module selected in the DOS setup. A stock 14 MHz Amiga, if set to 72 Hz VGA output, can actually drive the DACs at 56 kHz through some freak of circuitry, using a 30 kHz horizontal sync for the non-interlaced VGA video output. Like I said already, the only restriction was the number of sound channels (because nobody improved Jay Miner and co.'s 1983 prototype Paula/Portia sound chip), but the 8 stereo channels of the 1987 Acorn Archimedes are more than enough to make any music of commercial quality.

 

And these are machines using CPUs probably 5x more powerful than even the Archimedes haha great stuff.

 

This, my friends, is people who played Day of the Tentacle or Lotus III on PC with such JUNK and somehow decided this is our history. These are the same kind of morons who think Nintendo had any part to play in the cutting edge of gaming technology. No, my friends, it goes from the VCS to the 8-bit machines to the Archimedes/Amiga, and then back to consoles, UNTIL PCs had multichannel DAC-based Windows sound cards with software multiplexing via the sound-device hooks and a Pentium CPU.

 

And finally, I don't think Betty Boo used an LAPC-I or MT-32 to make her demo tapes before becoming a chart-topping music sensation. No, I think you will find she made her millions using an Amiga 500 and a sampler. Now, when Poo the musician has 3 UK number-one singles rather than playing the cheesy shit wedding music that sent you to sleep in the 80s or 90s, I might bother to read his replies to me.

 

Clearly the two of you have zero understanding about sound hardware of the 80s or 90s :lol:


No, you obviously don't know what you're talking about. Amiga 1200? That's 1992, not 1988. Things generally improve in 4 years. I don't know much about the Archimedes since I'm in the US, but having 8 stereo DACs doesn't always equate to better sound; it's in the ears of the listener. I rarely saw the Archimedes for sale in the UK gaming magazines, probably because it was more expensive than an Amiga or ST.

 

An Amiga 500 was 8 MHz, not 14 MHz. Not everyone upgraded or hacked their Amigas to play games.

 

It may be your opinion that the MT-32 is junk, but it played a big role in bringing better music to games in the late 80s and 90s, when everyone was using the crappy AdLib/Sound Blaster FM synthesis found in most PCs. Starving musicians could finally afford a good-quality sound module at an affordable price compared to the other multitimbral sound modules out at the time. Sure, it may not have sounded as good as the higher-end stuff, but it was cheaper and sounded decent. You get what you pay for.

 

If anything, your post continues to show how out of touch you are with late 80s/early 90s music technology.


Also, it sounds like macgoo did not follow (or even live) through those old computer industry days of the late 80s. RAM and CPU power simply could not provide the horsepower to run a DAC soundtrack while the CPU was processing graphics, input devices, scrolling, etc.

 

You're right -- if you restrict yourself to configurations where the whole of the system is a CPU attached to a simple frame buffer and a DAC.

 

But that wasn't the only solution available in the 80s. The Amiga, for instance, had a special DMA controller that accelerated graphics, sound processing, scrolling, etc., and greatly reduced the burden on the CPU. An admittedly limited form of wavetable synthesis came almost for free on the Amiga.
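For the curious, the "almost free" part works because each Paula channel has a period register that sets the DMA fetch rate: output rate = clock / period, with a clock of about 3.546895 MHz on PAL machines (3.579545 MHz on NTSC). A sketch of the conversion, using the commonly documented clock values (function names are mine):

```python
# Sketch: Amiga Paula sample-rate <-> period-register conversion.
# Paula fetches one sample per "period" ticks of the colour clock,
# so playback rate = clock / period.

PAULA_CLOCK_PAL = 3546895   # Hz, PAL Amiga colour clock
PAULA_CLOCK_NTSC = 3579545  # Hz, NTSC

def period_for_rate(sample_rate, clock=PAULA_CLOCK_PAL):
    """Period register value that best approximates a sample rate."""
    return round(clock / sample_rate)

def rate_for_period(period, clock=PAULA_CLOCK_PAL):
    """Actual playback rate produced by a given period value."""
    return clock / period
```

The classic ProTracker default of 8287 Hz, for instance, corresponds to a period of 428 on a PAL machine; the CPU only gets involved when a new note or waveform has to be pointed at.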


An Amiga 500 was 8 MHz, not 14 MHz. Not everyone upgraded or hacked their Amigas to play games.

 

 

7.16 MHz. The Atari ST's Motorola 68000 ran at 8 MHz.


No, you obviously don't know what you're talking about. Amiga 1200? That's 1992, not 1988. Things generally improve in 4 years. I don't know much about the Archimedes since I'm in the US, but having 8 stereo DACs doesn't always equate to better sound; it's in the ears of the listener. I rarely saw the Archimedes for sale in the UK gaming magazines, probably because it was more expensive than an Amiga or ST.

 

An Amiga 500 was 8 MHz, not 14 MHz. Not everyone upgraded or hacked their Amigas to play games.

 

It may be your opinion that the MT-32 is junk, but it played a big role in bringing better music to games in the late 80s and 90s, when everyone was using the crappy AdLib/Sound Blaster FM synthesis found in most PCs. Starving musicians could finally afford a good-quality sound module at an affordable price compared to the other multitimbral sound modules out at the time. Sure, it may not have sounded as good as the higher-end stuff, but it was cheaper and sounded decent. You get what you pay for.

 

If anything, your post continues to show how out of touch you are with late 80s/early 90s music technology.

 

Amen.

 

Also, it sounds like macgoo did not follow (or even live) through those old computer industry days of the late 80s. RAM and CPU power simply could not provide the horsepower to run a DAC soundtrack while the CPU was processing graphics, input devices, scrolling, etc.

 

You're right -- if you restrict yourself to configurations where the whole of the system is a CPU attached to a simple frame buffer and a DAC.

 

But that wasn't the only solution available in the 80s. The Amiga, for instance, had a special DMA controller that accelerated graphics, sound processing, scrolling, etc., and greatly reduced the burden on the CPU. An admittedly limited form of wavetable synthesis came almost for free on the Amiga.

 

The Amiga used an 8-bit DAC with 6 bits of volume.

The MT-32 used a 16-bit Burr-Brown DAC (very high quality) and 7 bits of volume.

You get into the 90s before any computer has those specs.
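For a sense of scale, each bit of ideal DAC resolution is worth about 6 dB of dynamic range (20·log10 2 ≈ 6.02 dB), so on paper the gap between those two DACs is large:

```python
import math

def dynamic_range_db(bits):
    """Approximate dynamic range of an ideal n-bit DAC, in dB."""
    return 20 * math.log10(2 ** bits)

# 8-bit (Amiga Paula sample data): roughly 48 dB
# 16-bit (the MT-32's Burr-Brown DAC): roughly 96 dB
```

This is the idealized figure only; analog stages, mixing, and the volume resolution all eat into it in practice.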

 

At any rate, I'm done feeding the troll. I used this stuff back in the day, and it did the job masterfully.

Commenting without that experience is simply pathetic posturing.

 

A guy I used to work with had a sign over his desk that said "The person who says a thing can't be done is often interrupted by the person doing it."


Time to really blow everyone's minds...

 

In April 1984, Atari had designed an AMY expansion card for the 1090 XL expansion box. See the attached ZIP file of the actual schematic.

 

This and other evidence I'm examining... I'm not believing the old storyline that Atari Corp couldn't get it to work. Some things just don't add up, and from the Atari Inc standpoint it seems like this chip was nearly ready to go; something else happened during the Atari Corp days. I'm leaning more towards: they needed money to fund the ST design, so they decided to sell off the chip design, or license it out, or something along those lines, IMHO.

 

1090_AMY_Card17APR84.zip


I think the reason is very simple: the ST was designed as a low-cost (relative to power), general-purpose computer. Components were chosen carefully, and the price/power ratio was important. That's why they went with a 6-chip TOS instead of the simpler, less space-consuming 2-ROM-chip option - larger-capacity ROMs were much more expensive in 1985. The same goes for RAM.

So some fancy sound chip would have increased the price a lot. Then why MIDI in the ST, someone could ask. That was really low cost: one more ACIA chip, an optocoupler, a $1.25 logic chip, two cheap 5-pin DIN connectors, and a little support in software.
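For context on why the MIDI port was so cheap: MIDI is just an asynchronous serial stream at a fixed 31250 baud (8-N-1, so 10 bits per byte, about 320 µs each), which the ACIA generates directly, and the messages themselves are only a few bytes. A sketch of the standard Note On framing (helper names are mine):

```python
# Sketch: MIDI message framing, per the MIDI 1.0 spec.
# The ST's extra ACIA just shifts these bytes out at 31250 baud.

MIDI_BAUD = 31250  # fixed MIDI bit rate; 10 bits/byte incl. start+stop bits

def note_on(channel, note, velocity):
    """Build a 3-byte MIDI Note On message (channel 0-15, data 0-127)."""
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

def byte_time_us():
    """Transmission time of one MIDI byte in microseconds."""
    return 10 / MIDI_BAUD * 1_000_000
```

So a whole Note On takes under a millisecond on the wire, which is why one cheap ACIA plus an optocoupler was all the hardware the ST needed.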


Yet the Spectrum didn't use that sound chip until about 3 years later. That's another thing people keep repeating: the ST didn't have a Spectrum sound chip; it was the other way around.

Technically they're not the same sound chip either - the Spectrum used the AY variant. The off-the-shelf nonsense Amiga owners keep repeating is getting tiresome too.

 

I know this is an old topic, but the YM2149 was built under license from General Instrument, so the YM2149 was a copy of the AY-3-8910, the chip used in the Spectrum/Amstrad and also in a lot of arcade machines.

 

 


The problem was there were not many off-the-shelf sound chips at that time; many of the sound chips mentioned here only came later.

 

So personally I don't think there was anything wrong with the YM2149 - the Atari ST just had too few of them.

Gyruss (the arcade game) had five AY-3-8910s, so if the ST had just had two YM2149s, you could have had 4 channels for music and 2 for sound FX.
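On the YM2149 itself: pitch comes from a 12-bit period divider, f = clock / (16 × period), and the ST clocks the chip at 2 MHz. A sketch of the register-value math from the AY/YM datasheet formula (function names are mine):

```python
# Sketch: YM2149/AY-3-8910 tone-period calculation.
# Square-wave frequency = clock / (16 * period), period is a 12-bit value.

YM_CLOCK_ST = 2_000_000  # Hz; the ST drives the YM2149 at 2 MHz

def tone_period(freq_hz, clock=YM_CLOCK_ST):
    """12-bit tone period register value for a desired frequency."""
    period = round(clock / (16 * freq_hz))
    return max(1, min(period, 0xFFF))   # clamp to the 12-bit range

def actual_freq(period, clock=YM_CLOCK_ST):
    """Frequency actually produced by a given period value."""
    return clock / (16 * period)
```

So A440 needs a period of 284, and the chip actually produces about 440.1 Hz - close enough, though the quantization gets coarse at high pitches. A second YM2149 would just be another set of these registers at a different address.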


I think the reason is very simple: the ST was designed as a low-cost (relative to power), general-purpose computer. Components were chosen carefully, and the price/power ratio was important. That's why they went with a 6-chip TOS instead of the simpler, less space-consuming 2-ROM-chip option - larger-capacity ROMs were much more expensive in 1985. The same goes for RAM.

So some fancy sound chip would have increased the price a lot. Then why MIDI in the ST, someone could ask. That was really low cost: one more ACIA chip, an optocoupler, a $1.25 logic chip, two cheap 5-pin DIN connectors, and a little support in software.

 

And wasn't much of the ST design work done by ex-Commodore engineers before Tramiel even bought Atari? I think that's sufficient to explain why the ST used virtually nothing that was created by Atari Inc R&D.


 

And wasn't much of the ST design work done by ex-Commodore engineers before Tramiel even bought Atari? I think that's sufficient to explain why the ST used virtually nothing that was created by Atari Inc R&D.

I don't think Tramiel had even started to think about a concrete new 16-bit computer design while he was still at Commodore. Plus, the designs of the C64 and the ST, done by the same man, Shiraz Shivji, are very similar.

C64: weak CPU, many custom chips, hardware support for video, sprites... ST: most of it is done on the CPU, except DMA. Another question is what would have happened if Commodore had not taken over the Amiga design...

Why would the ST use chips created for the 8-bit, low-res, game-oriented XL family?

Why would the ST use chips created for the 8-bit, low-res, game-oriented XL family?

 

Atari didn't stop their R&D efforts after the 8-bit line was released - the AMY chip, for example.

 

But even the POKEY was a better sound chip than what they used for the ST. They also could have included SIO for peripheral backwards compatibility.

