
CollectorVision Phoenix Release Thread


Bmack36


8 minutes ago, alortegac said:

I think a "Maybe TI-99 core in the future also. "  would be a nice to have core.  Mainstream cores are certainly Atari and Intellivision.   But the original idea,  as I understand it,  is not only to be able to run  the roms from a SD card,  but to build cart adapters (like expansion modules)....is this correct?

Outside of the 2600 cartridge adapter there has been no mention of other cartridge adapters at this point. Not saying they wouldn't do more or couldn't do more, just no mention of it so far.


Some very mature posts here lately... good to see some things don't change.

 

I remember getting banned from another Phoenix thread, after pre-ordering early, for daring to ask why there was an assumption that most people would get two.  Very professional.  Saved myself a lot of money not supporting this or their games anymore.  So, though I'm but a wee single person, that hurt sales too, big picture.  If only it made a difference to something other than my wallet.

 

Let the banhammer drop again!

 


Yeesh.  As a kid, did anyone ever go over to a friend's house to find them getting a spanking?

 

I think a 7800 core would be nice to have.  I would expect it to replace the 2600 core, since it's backward compatible.  5200 would be nice, but the controllers would be difficult as they are analogue.  I would expect any number of cores could be created.  The F18A supports NES-like graphics modes, so an NES core could exist.  Game Gear, Master System... but who will create them?


I think a wiser move is to concentrate on fixing any issues in the Phoenix CV core, just making it perfect as a modern CV replacement with no issues of any kind.

I do not feel going for additional emulation cores is really a good value proposition, except perhaps for cores that come with real expansion hardware... such as the 2600.
 

 


I have to admit, considering the console race was a three-way affair during the early '80s, it would be nice to see FPGA support for all three.  I'm very much a fan of expanding the abilities of the Phoenix, but I'm also A-OK with what we've gotten so far.  If the current cores were the final updates to the console (and I'm sure they're not), I'd still be a happy camper.


I don't really care about the 2600 core. I already have a dozen ways to play 2600 games on a modern TV. Intellivision and ADAM are top priorities, and I will be stoked if the Intellivision sees a stable core on this.

 

My wish list would be the Astrocade, Spectrum, APF MP1000 Dream Machine, TI-99, Apple IIe, and other early computers, as those do not have many options for playing around using the technology of the Phoenix. C64 is not needed, as there are plenty of ways to play C64 games and emulate that computer in a way that is close to the original.

 

Also nice would be an Odyssey2 interface.

 

5200 would be a waste of time given the analog nature of the system. NES and beyond have a lot of support already from other projects.

Edited by Aracabata

21 minutes ago, eebuckeye said:

I definitely agree the Coleco core should be 100% before providing other cores.  I believe it is very close, right?

I think we may need an adapter to keep the Roller Controller and Driving Module from glitching. I've seen that a high-quality SmallyMouse2 USB-to-Atari-mouse adapter with a passive pin reassignment added works glitch-free, although using such a "USB to Atari mouse" solution results in a roller ball that is 3x more sensitive, with only one working button instead of four, and no keypad.  I know this adapter has Schmitt triggers (a digital filter, I think) and possibly Zener diodes (electrical isolation), but I don't know if that's the key, and I'm not quite sure of the appropriate part numbers. I should test a standard Atari mouse or trackball, but I'd need to add a third power line, which makes me worry about power pass-through.

 

I believe they talked about adding a way to shut off the optical pins to stop a few games from freezing with the Super Action and Edladdin controllers, and with some homemade controllers that use the Easy CV.

Edited by Swami

On 6/23/2020 at 12:48 AM, Loafer said:

retroillucid / opcode: I really wish both of you would stop with the grenade throwing approach to problem solving as it just doesn’t work.  Why can’t you guys, when you see something that upsets you like a controller project or a club membership, just send a note to the other party calmly saying “I’m not sure I get this, can we discuss ?”

I 100% agree with this.

 

Guys - All you are doing is giving people a reason to buy popcorn. Please stop.

 

Seriously, I have a ton of respect for both of you and I will always be a customer of anything you guys BOTH do.  So will most people here in this group.  Even if you both made a controller, both have a club, both have a console.  I will buy from both of you guys.

 

I'm sure I'm not the only one, right? 


6 minutes ago, TPR said:

I 100% agree with this.

 

Guys - All you are doing is giving people a reason to buy popcorn. Please stop.

 

Seriously, I have a ton of respect for both of you and I will always be a customer of anything you guys BOTH do.  So will most people here in this group.  Even if you both made a controller, both have a club, both have a console.  I will buy from both of you guys.

 

I'm sure I'm not the only one, right? 


I don't think doing the same things would serve the community very well. 
It's fun when others come up with original ideas/projects (e.g. the trading cards project by Pixelboy). 

That's why I will NOT pursue work on the controller; I'll leave it to Eduardo. 
It's more fun here when it's peaceful anyway.



 


8 minutes ago, TPR said:

I 100% agree with this.

 

Guys - All you are doing is giving people a reason to buy popcorn. Please stop.

Seconded, and if it doesn't stop, Eduardo, I am going to kick you from this thread.  There is no reason you guys should be arguing in public like this, especially if you didn't first contact CollectorVision privately to try and work out any issues.  And resorting to name calling ("jealous asshole") just weakens any legitimate arguments you may have.

 

Thank you,

 

 ..Al


On 6/23/2020 at 7:52 PM, OLD CS1 said:

Yeesh.  As a kid, did anyone ever go over to a friend's house to find them getting a spanking?

Oh yes indeed. In fact we were always getting into trouble and we were always getting spanked and yelled at. And most people in the area had no discretion including parents. So when we expected a spanking session we'd BMX on over to the "victim's" residence and watch and wait. When the yelling started we'd "re-enact" it out in the front yard.

 

Quote

I think a 7800 core would be nice to have.  I would expect it to replace the 2600 core, since it's backward compatible.  5200 would be nice, but the controllers would be difficult as they are analogue.  I would expect any number of cores could be created.  The F18A supports NES-like graphics modes, so an NES core could exist.  Game Gear, Master System... but who will create them?

I suppose literally anyone with the skills and drive to fulfill the need. Simple answer. Lame answer.

 

But that reminds me of a point, and someone correct me if I'm wrong: most FPGA implementations of classic consoles I've seen don't handle analog controls well, if they do at all. It seems like more than just the FPGA is needed: some sort of analog conditioning circuitry, or A/D and D/A converters, some way to measure resistance over a specific range and voltage, a voltage ladder of a certain size. Either the FPGAs don't have that capability, or it costs too much fabric, or the project really needs some additional (relatively costly) circuitry outside of the FPGA.

 

Another thing: if FPGA is "so accurate," as many non-techies claim, then why don't console cores simply support ALL controllers? Perhaps this is an area that will be addressed in good time.

 

BTW I say this with Atari VCS cores and paddles in mind. I've nary a clue how the Phoenix does analog in and out.

 

Edited by Keatah

On 6/23/2020 at 8:08 PM, alortegac said:

I think a wiser move is to concentrate on fixing any issues in the Phoenix CV core, just making it perfect as a modern CV replacement with no issues of any kind.

I vote this way too. It's the same deal with software emulators. And a thousand other things. I like to see bugs fixed first and then features added afterwards.

 

Or both being worked on simultaneously. But never new features first.

 

On 6/23/2020 at 8:08 PM, alortegac said:

I do not feel going for additional emulation cores is really a good value proposition, except perhaps for cores that come with real expansion hardware... such as the 2600.

Why not? Once bugs are fixed, the more cores the better.


15 hours ago, Keatah said:

Another thing: if FPGA is "so accurate," as many non-techies claim, then why don't console cores simply support ALL controllers? Perhaps this is an area that will be addressed in good time.

FPGAs are "magic", and like all magic things, they are attributed abilities that don't make sense if you just think it through.

 

Let me ramble for a moment about hardware, emulators, and FPGA reproductions. I'll oversimplify for the sake of comparison, so let's try to stay out of the weeds.

 

So what's hardware to a console? The ColecoVision is a Z80 CPU, a 9929 VDP, an SN sound chip, a replaceable game ROM, and a host of associated circuitry and switches to link it all together. We also have to remember the television, since without it we can't really see anything happening. It has a base clock of roughly 3.5MHz, which means that three and a half million times per second, something happens. Even better, every piece does its bit at the same time, with no concern for the rest of the system except through the very specific links. So we could say that 3,500,000 times per second, at least 4 things happen. (I said this was simplified ;) ).

 

What's an emulator? Well, in this case it's a piece of software meant to reproduce the behaviour of a piece of hardware. Software doesn't look anything like hardware. It operates generally as a series of numeric expressions (math, compare) examined one at a time. Technically, you can do anything in software that you can do in hardware, but while a hardware transistor can switch in nanoseconds, the software reproduction of that transistor needs a lot more time - requiring instructions to fetch the data to compare against, decide whether to switch, and activate the switch itself. Back when emulation was becoming a thing, this could reasonably take microseconds - which sounds great but is a factor of a thousand times slower. Modern top of the line hardware is pretty fast, but we're dropping back to smaller hardware like PIs again, and even top of the line stuff suffers due to complex cache systems and the worst enemy of emulation: the operating system. The OS really doesn't want to give one piece of software attention three and a half million times per second.

 

So, software emulation makes compromises. The raw circuits are not emulated, only their final result. In many cases, this greatly reduces the amount of work needed. (For instance, you can handle the sound chip at 44,000 times per second rather than counting every clock at 3.5M. The sound chip doesn't act on most of those clocks anyway!) To make the operating system happy, larger blocks of time are processed in a batch. For instance, an emulator might process an entire frame in one go - that's about 16ms and means the OS only needs to give it attention 60 times a second. The emulation can only process one thing at a time, rather than all circuits in parallel (modern multicore changes this but most emulators are still single threaded). This can produce timing issues - especially if a chip is emulated for a block of time, and then the next chip, etc. And finally - the operation of the circuits may not be fully understood. The software emulation will only be as good as the skill and understanding of the programmer who reproduces it.
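To make that frame-batching idea concrete, here is a minimal sketch of what such a main loop might look like; every name and number in it is illustrative rather than taken from any actual core:

```cpp
// Illustrative frame-batched emulation loop -- all names and numbers are
// hypothetical, not taken from any real emulator.
#include <chrono>
#include <thread>

constexpr int CPU_CLOCK_HZ     = 3579545;             // ~3.58 MHz Z80 clock
constexpr int FRAMES_PER_SEC   = 60;
constexpr int CYCLES_PER_FRAME = CPU_CLOCK_HZ / FRAMES_PER_SEC;

// Hypothetical steppers: each advances one emulated chip by a cycle budget.
void runCpu(int cycles)   { (void)cycles; /* run Z80 instructions until the budget is spent */ }
void runVdp(int cycles)   { (void)cycles; /* advance VDP state / render scanlines for this slice */ }
void runSound(int cycles) { (void)cycles; /* produce the audio samples covering this slice */ }
void presentFrame()       { /* hand the finished framebuffer to the display */ }

int main() {
    auto next = std::chrono::steady_clock::now();
    for (;;) {
        // One frame's worth of emulated time, each chip stepped in turn --
        // the serial, batched compromise described above.
        runCpu(CYCLES_PER_FRAME);
        runVdp(CYCLES_PER_FRAME);
        runSound(CYCLES_PER_FRAME);
        presentFrame();

        // Sleep until the next frame boundary, so the OS only has to give
        // the emulator attention ~60 times per second, not 3.5 million.
        next += std::chrono::microseconds(1000000 / FRAMES_PER_SEC);
        std::this_thread::sleep_until(next);
    }
}
```

The point is just the shape: big blocks of emulated time, one chip after another, and a sleep that keeps the operating system happy.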

 

What is an FPGA then? An FPGA is basically a giant switchboard. Using a description language like VHDL, input that resembles code is assembled into a circuit inside the FPGA. It's not terribly unlike those old spring-clip electronics trainers where you clip the wires to different sections of the board. Because it's a real circuit, the switching times are very fast - sometimes not as fast as the original circuit, but a lot closer than software can be. Furthermore, because it's a real circuit, everything runs at the same time, just like it's supposed to. This means that an accurate FPGA reproduction can be cycle-accurate, indistinguishable from the original.

 

But it's usually not. First, it doesn't solve the understanding of the circuit. There are few machines, even today, that are completely understood to the transistor level. The CV is definitely not one of them! Without that understanding, there is always the chance of getting some edge case wrong. FPGAs have a limited amount of circuitry - and the prices go up fast. So the designer may be forced to trim a design in order to fit in the hardware that they have. A human has to translate an understanding of those circuits to the hardware description language - meaning both misconceptions and outright mistakes can make it in there. Finally, and most relevant - modern reproductions pretty much never reproduce the original machine on purpose. Nobody is looking for a ColecoVision replacement that outputs YCrCb 15khz analog video, they want upscaled HDMI. Cartridges - nice gimmick, but we want to load from SD card. How about new controllers or USB? These modern systems need to be integrated into the old design in a way that works as close as possible to the original, but there's no prior art to work from.

 

I dunno, something like that.

 


I love mini-essays like that.

 

I understand the inherent parallelism in FPGA vs serial style software on an x86. But an x86 has multiple cores and runs pretty quick. So couldn't each subsystem or custom chip or task be assigned a core?

 

Some time ago I got into a discussion with another emulator author, and they said that multi-core chips offer no advantage and can even be a disadvantage, because each core needs to be synced with the rest of the emulated system. They told me that single-core operation was the only way to achieve coherency and preserve the timing. I'm not sure I agree with that. With 5GHz 8-core chips becoming standard, there's power and time to sync things.

 

And I do totally agree that..

37 minutes ago, Tursi said:

Finally, and most relevant - modern reproductions pretty much never reproduce the original machine on purpose. Nobody is looking for a ColecoVision replacement that outputs YCrCb 15khz analog video, they want upscaled HDMI. Cartridges - nice gimmick, but we want to load from SD card. How about new controllers or USB? These modern systems need to be integrated into the old design in a way that works as close as possible to the original, but there's no prior art to work from.

That's right. I'm not interested in composite connectors or mounds of cartridges. Not even D-sub 9 connectors, not at the moment anyway. Those were relics from a bygone era. Relics that filled a specific and practical need. They solved the technical problems of their time.

 

Composite & RF were needed to connect a console to a television display device. Cartridges were needed to swap the program on a small computer - in kid & consumer friendly manner. Presently we have superior alternatives that are far more capable than anything imaginable back in the day.

 

..and in a similar way it's why I like software emulation so much. A modern store-bought PC brings all the building blocks, the subsystems, sound, graphics, i/o, cpu, memory, storage, and display, together into one box. A modern box. A fresh box that isn't ratbaggy and threadbare. Each emulator is like a core that gets loaded on demand when you click on the icon.

 

And with software you get a rich "environment" to play in, you get a settings panel and storage management. And many other tools.


3 hours ago, Keatah said:

I love mini-essays like that.

 

I understand the inherent parallelism in FPGA vs serial style software on an x86. But an x86 has multiple cores and runs pretty quick. So couldn't each subsystem or custom chip or task be assigned a core?

 

Some time ago I got into a discussion with another emulator author, and they said that multi-core chips offer no advantage and can even be a disadvantage, because each core needs to be synced with the rest of the emulated system. They told me that single-core operation was the only way to achieve coherency and preserve the timing. I'm not sure I agree with that. With 5GHz 8-core chips becoming standard, there's power and time to sync things.

 

And I do totally agree that..

All I can speak to on this is that software needs to be specifically designed to use multiple cores, and even designed differently for four cores vs. eight; then there's the syncing issue. Maybe it's just a lot more work and trial and error for people who do this as a hobby.


On 6/23/2020 at 8:52 PM, OLD CS1 said:

I think a 7800 core would be nice to have.  I would expect it to replace the 2600 core, since it's backward compatible.

 

+1. I would also like to see this in the future (and a 7800/2600 cart adaptor).


20 hours ago, Keatah said:

Some time ago I got into a discussion with another emulator author, and they said that multi-core chips offer no advantage and can even be a disadvantage, because each core needs to be synced with the rest of the emulated system. They told me that single-core operation was the only way to achieve coherency and preserve the timing. I'm not sure I agree with that. With 5GHz 8-core chips becoming standard, there's power and time to sync things.

Yes and no... my own emulator, Classic99, has been multithreaded for probably 20 years. But, it's also emulating a computer that is slightly unusual in that its VDP is de-synchronized from the CPU, having completely separate clocks.

 

A single threaded emulator is absolutely the best way to guarantee correct sync between your components. It's certainly the easiest to code. Once again, the operating system gets in your way - even if you trigger another system to start operating "now", there's no way to know when it actually starts to. If you need to guarantee the sync of two components, a single thread is the most reliable way to do it.

 

It's kind of non-intuitive, but because most systems run their various components off a shared clock, the timing between the components is predictable. The ColecoVision Z80 and VDP will always run at the same offset, while on the TI the ratio between the CPU and the VDP can vary from machine to machine.
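As a rough illustration of that lockstep idea (the names and the 3:1 ratio below are placeholders, not the real ColecoVision clock divider), a shared-clock machine can be stepped like this:

```cpp
// Illustrative lockstep stepping from a shared master clock. The names and
// the 3:1 ratio are placeholders, not the actual ColecoVision clock divider.
struct Machine {
    static constexpr int VDP_CLOCKS_PER_CPU_CLOCK = 3;   // assumed ratio

    void stepCpuCycle() { /* advance the Z80 by one clock */ }
    void stepVdpCycle() { /* advance the VDP by one clock */ }

    // Advance the whole machine by one CPU clock. Because both chips derive
    // from the same crystal, the VDP always sits at the same predictable
    // offset from the CPU.
    void stepMasterClock() {
        stepCpuCycle();
        for (int i = 0; i < VDP_CLOCKS_PER_CPU_CLOCK; ++i)
            stepVdpCycle();
    }
};

int main() {
    Machine m;
    for (int i = 0; i < 59659; ++i)   // roughly one 60 Hz frame of CPU clocks
        m.stepMasterClock();
}
```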

 

Even with all that though, you can still take advantage of multi-core. For instance, maybe running the VDP needs to happen in lockstep with the emulation (depending on how precise you want to be), but the video display doesn't remotely resemble the television the VDP expects anymore. So you can run your display loop in a separate thread without any real heartache. Same with sound, no reason to care what phase the 48Khz audio output is in with respect to the emulated sound chip, since it didn't even have a sample rate. Offloading those things can certainly help. Dealing with OS services, menu systems, all that benefits from multicore as well.
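For instance, decoupling the display from the emulation thread can look something like this minimal sketch (hypothetical names; this is just an illustration, not Classic99's actual code):

```cpp
// Illustrative sketch of moving presentation to its own thread while the
// emulated chips stay in lockstep elsewhere. Names are hypothetical.
#include <array>
#include <atomic>
#include <chrono>
#include <cstdint>
#include <mutex>
#include <thread>

constexpr int WIDTH = 256, HEIGHT = 192;                 // TMS99xx-sized frame
using Framebuffer = std::array<uint8_t, WIDTH * HEIGHT>;

Framebuffer sharedFrame;
std::mutex frameLock;
std::atomic<bool> frameReady{false};
std::atomic<bool> running{true};

// Called from the emulation thread whenever the emulated VDP finishes a frame.
void publishFrame(const Framebuffer& finished) {
    std::lock_guard<std::mutex> lock(frameLock);
    sharedFrame = finished;
    frameReady = true;
}

// Display thread: presents the latest published frame at its own pace.
void displayLoop() {
    Framebuffer local{};
    while (running) {
        if (frameReady.exchange(false)) {
            {
                std::lock_guard<std::mutex> lock(frameLock);
                local = sharedFrame;
            }
            // drawToWindow(local);   // hypothetical platform-specific blit
        }
        std::this_thread::sleep_for(std::chrono::milliseconds(1));
    }
}

int main() {
    std::thread display(displayLoop);
    publishFrame(Framebuffer{});       // the emulation loop would do this ~60x/sec
    std::this_thread::sleep_for(std::chrono::milliseconds(10));
    running = false;
    display.join();
}
```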

 

I will always disagree with any assertion that claims there's only one way to solve a problem, but even in my emulator I've often considered dropping back to single threaded just to reclaim some sanity. But I'm stubborn. ;) 


On 6/28/2020 at 6:41 PM, Keatah said:

I understand the inherent parallelism in FPGA vs serial style software on an x86. But an x86 has multiple cores and runs pretty quick. So couldn't each subsystem or custom chip or task be assigned a core?

 

Some time ago I got into a discussion with another emulator author, and they said that multi-core chips offer no advantage and can even be a disadvantage, because each core needs to be synced with the rest of the emulated system. They told me that single-core operation was the only way to achieve coherency and preserve the timing. I'm not sure I agree with that. With 5GHz 8-core chips becoming standard, there's power and time to sync things.

I have had discussions with people who do multi-core programming, as well.  Same as you, I neither do such a thing on a daily basis nor have any experience in multi-core, multi-threaded programming.  So I speak with no more authority than you, and can only regurgitate what I have read and been told by seasoned programmers.

 

I understand that programming on a multi-core system is not as easy as just telling a program to run on a single core.  The multi-threaded nature required by system emulation, with its many disparate parts, means that you could not efficiently perform such emulation even on a 5GHz core.  Your independent parts (say video, audio, RAM/ROM access, I/O, etc.) could only run in time slices, which requires many context switches on that core, which means dumping and re-populating caches, stashing and retrieving full register sets, and, worst of all, the potential for stalled pipelines, even at 5GHz.  There are effects on other cores which I do not fully understand, and I am sure someone around here can demonstrate the math for why doubling the number of cores does not double performance.

 

You would require an operating system to manage the emulator threads, which then could be assigned core affinity, but there are some nuances to doing so, and it is best to just let a mature multi-processor/multi-core operating system manage that.  Even in the absence of an operating system, on a multi-core general-purpose CPU something must still manage threads.
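For what it's worth, assigning core affinity by hand is possible; here is a minimal Linux-only sketch using the pthread_setaffinity_np GNU extension (Windows and macOS use different APIs), though as said above it is usually better left to the scheduler:

```cpp
// Illustrative, Linux-only sketch of pinning the calling thread to one core
// with the pthread_setaffinity_np GNU extension. The nuances mentioned above
// are exactly why this is usually better left to the OS scheduler.
#ifndef _GNU_SOURCE
#define _GNU_SOURCE
#endif
#include <pthread.h>
#include <sched.h>

// Returns 0 on success, or an error number from pthread_setaffinity_np.
int pinCurrentThreadToCore(int core) {
    cpu_set_t set;
    CPU_ZERO(&set);
    CPU_SET(core, &set);
    return pthread_setaffinity_np(pthread_self(), sizeof(set), &set);
}
```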

 

The same approach was proposed back during the introduction of dual- and quad-core CPUs for the consumer market, when "experts" (tech bloggers) proposed the ability to dedicate one core to running security software.  Real programmers emitted a hearty guffaw and then explained why that was not feasible.


On 6/28/2020 at 2:53 PM, Mr. John ColecoVision said:

Is there a workaround or fix in the newer core to allow Magical Tree to play? I wanted to buy a copy of Magical Tree, but since I only have a Phoenix now, well, I don't want to plunk down big money for a copy if it won't work. Thanks!

Yes, it would be nice to see some progress with the CV core to eliminate the issues with early Opcode games.  As far as I'm aware the issues currently remain unresolved.

 

https://github.com/CollectorVision/Phoenix-Colecovision/issues/18


18 minutes ago, Ikrananka said:

Yes, it would be nice to see some progress with the CV core to eliminate the issues with early Opcode games.  As far as I'm aware the issues currently remain unresolved.

Thanks for the reply. Such a fun game.

