
MAME is going to start using the GPU.


Keatah


It seems MAME is caving in and going back on something the developers once said they'd never do: use the GPU.

 

In my opinion, they have to, given that processor development and speed increases have been all but at a standstill for the last couple of years.

 

 

I think if the developers can use the GPU to take some load off the CPU while still retaining the original graphics quality, that will be great. MAME is a resource hog on the CPU.

 

I had to move to MESS, which is a much more stripped-down version, in order to free up CPU cycles for my needs. I run a BBS via MESS emulation, and under MAME the CPU usage was so high that the clock inside the emulated system drifted a full day off within a week. Moving to MESS solved that problem. Of course, I don't think the developers expected anyone to run MAME/MESS for such extended periods.

 

I think utilizing the GPU is a step in the right direction.



It seems MAME is caving in and going back on something the developers once said they'd never do: use the GPU.

 

In my opinion, they have to, given that processor development and speed increases have been all but at a standstill for the last couple of years.

 

 

The way you word it, it sounds almost as if they had a deep philosophical issue with using GPUs (for anything but rendering/scaling the final image)?

 

I guess that's because in the early days of GPU computing everything was proprietary, unstable, and limited to a few vendors. Today, OpenCL and OpenGL compute shaders are mature open standards supported by any GPU vendor worth its salt, even embedded ones. Integer arithmetic is now available too, which relieves the precision concerns that came with floating-point handling on some GPUs.
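
To make that concrete, here's a minimal, purely hypothetical sketch (nothing from MAME's source) of an OpenCL kernel that converts RGB565 framebuffer pixels to RGBA8888 using nothing but integer shifts and masks, exactly the kind of fixed-point work early GPU APIs couldn't express. Error checking is trimmed for brevity; build with something like g++ rgb565.cpp -lOpenCL.

#define CL_TARGET_OPENCL_VERSION 120
#include <CL/cl.h>
#include <cstdio>

// OpenCL C kernel: pure integer arithmetic, no floats anywhere.
static const char *kSrc =
"__kernel void rgb565_to_rgba(__global const ushort *src,\n"
"                             __global uint *dst) {\n"
"    size_t i = get_global_id(0);\n"
"    uint p = src[i];\n"
"    uint r = (p >> 11) & 0x1F, g = (p >> 5) & 0x3F, b = p & 0x1F;\n"
"    r = (r << 3) | (r >> 2);  /* replicate high bits: 5 -> 8 */\n"
"    g = (g << 2) | (g >> 4);  /* 6 -> 8 */\n"
"    b = (b << 3) | (b >> 2);\n"
"    dst[i] = 0xFF000000u | (b << 16) | (g << 8) | r;\n"
"}\n";

int main() {
    enum { N = 4 };
    cl_ushort in[N] = { 0xF800, 0x07E0, 0x001F, 0xFFFF }; // red, green, blue, white
    cl_uint out[N] = { 0 };

    cl_platform_id plat; cl_device_id dev; cl_int err;
    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);
    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, &err);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, &err);

    cl_program prog = clCreateProgramWithSource(ctx, 1, &kSrc, NULL, &err);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "rgb565_to_rgba", &err);

    cl_mem dsrc = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                                 sizeof(in), in, &err);
    cl_mem ddst = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, sizeof(out), NULL, &err);
    clSetKernelArg(k, 0, sizeof(dsrc), &dsrc);
    clSetKernelArg(k, 1, sizeof(ddst), &ddst);

    size_t global = N;
    clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, ddst, CL_TRUE, 0, sizeof(out), out, 0, NULL, NULL);

    for (int i = 0; i < N; i++)
        std::printf("%04X -> %08X\n", in[i], out[i]);  // e.g. F800 -> FF0000FF
    return 0;
}

Most of the bulk is one-time host setup; the kernel itself reads like ordinary C, which is why modern compute APIs lower the barrier so much compared with the old days of abusing pixel shaders.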

 

Depending on the algorithm, though, it is usually more work to write code for the GPU, and that still limits its use in open source projects. Having to implement things twice, once for the CPU and once for the parallelized GPU path, doesn't really help either.
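
As a purely hypothetical illustration of that duplication (none of this is MAME's actual code), here is a trivial video effect maintained as two paths behind a user toggle; any change to the math has to be made in both places:

// Hypothetical sketch, not MAME code: one trivial video effect kept as
// two implementations behind a user option. Every change to the math
// has to be made in both places.
#include <cstdint>
#include <cstdio>
#include <vector>

struct Settings { bool use_gpu = false; };   // imagined "use GPU" toggle

// CPU path: a plain loop, easy to write, debug, and keep accurate.
void darken_odd_lines_cpu(std::vector<std::uint32_t>& fb, int w, int h) {
    for (int y = 1; y < h; y += 2)
        for (int x = 0; x < w; ++x)
            fb[y * w + x] = (fb[y * w + x] >> 1) & 0x7F7F7F7F; // halve each channel
}

// GPU path: the same math would live in a compute shader plus the host
// code to dispatch it. Stubbed to the CPU path so this sketch still runs.
void darken_odd_lines_gpu(std::vector<std::uint32_t>& fb, int w, int h) {
    darken_odd_lines_cpu(fb, w, h);
}

void apply_scanlines(const Settings& s, std::vector<std::uint32_t>& fb,
                     int w, int h) {
    if (s.use_gpu)
        darken_odd_lines_gpu(fb, w, h);   // experimental, opt-in
    else
        darken_odd_lines_cpu(fb, w, h);   // tried-and-tested default
}

int main() {
    Settings s;                                    // defaults to the CPU path
    std::vector<std::uint32_t> fb(4 * 4, 0xFFFFFFFFu);
    apply_scanlines(s, fb, 4, 4);
    std::printf("pixel(0,1) = %08X\n", (unsigned)fb[4]);  // darkened line
}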


The way you word it, it sounds almost as if they had a deep philosophical issue with using GPUs (for anything but rendering/scaling the final image)?

 

That's correct. They did, and to some extent still do, but they are slowly coming around. I probably have archived chat conversations and saved webpages discussing it.


My concern here is that they might be trading one set of problems for another. GPUs produce quite variable results, and which GPUs are they intending to support? Are systems with integrated graphics going to be included, or are we talking dedicated GPUs only?

 

And is there really that much to gain from it? Isn't the MAME library already plentiful enough, with thousands of arcade machines successfully emulated with the graphics rendered on the CPU?

 

I think the best compromise would be to give the user the option of enabling the GPU in the emulator settings. That way, those who want to stick to tried-and-tested methods can do so, while those who want to experiment can as well.


MAME should have three emulator cores for each machine/game: one built for accuracy, another built for performance, and a third that strikes the best compromise between accuracy and performance on contemporary computers.

 

I know that's asking a bit much though.

 

That has been tried with another emulator: Byuu dabbled with it for a while in his SNES emulator. It may work well in the short term, but over the long haul it adds complexity and work (of course), including the extra trouble of maintaining three separate target builds.
Byuu has long since dropped the one built for performance (bsnes-performance), as well as the compromise between accuracy and performance (bsnes-compatibility), and kept the one built for accuracy (bsnes-accuracy) as the goal...just like MAME. ;)

The goal for every emulator should be accuracy. This point isn't open for debate. It is the user's responsibility to provide enough host computing power. Besides, if you want higher-performing emulation, just run an older version - and don't complain about the inaccuracies.


I agree that accuracy should ideally always be the priority. But since it can take a long time for host hardware to become powerful enough to accurately emulate another system at all, I can sympathize with the decision to forgo some accuracy, for example when doing so means the difference between having some emulation and having none. If better accuracy is achievable on hardware that is already available, though, that should be the path. Those who can't (or choose not to) afford hardware of sufficient performance will just have to wait until they can.

