
FPGA Based Videogame System


kevtris


Recommended Posts

 

It sounds more like an issue with your TV, especially since it's happening with another device as well. I don't know why DVI would cause it (since HDMI and DVI are basically the same thing, at least as far as the digital video portion goes). It sounds like a problem with how your TV recognizes and/or handles the signal.

 

HDMI is basically DVI-D with the audio channels and HDCP handshaking.

 

My earlier post about the capture card kinda spazzing out at 1080p reflects this kind of thing too. Nearly any time you see "green everywhere," there's some kind of handshake problem.

 

When you use an HDMI TV as a "PC Monitor," most video cards actually default to an overscan-adjusted mode; this typically requires going into the video card's control panel and turning the overscan adjustment off. When you use an actual "PC Monitor" as a TV over HDMI, it usually goes edge-to-edge right away, because that's what it's expecting.

 

This would be easier to explain with a brand-new screen in front of me. Past experience has basically been that a "computer" output will wind up not filling a 1920x1080 HDTV screen, even though it's outputting 1920x1080, unless a setting tells it not to do that. I had this problem with AMD cards. Since Windows moved to a resolution-agnostic model, the problem has mostly only persisted with multi-monitor setups of unlike resolutions/aspect ratios.

Link to comment
Share on other sites

I wasn't implying the SNT was in DVI mode, I was just passing along any tidbit I could find re: reproducing the effect! My apologies for the implication. If anything, I expected the ITC flag to be the metadata culprit, but it isn't, if the OSSC is working as described (the OSSC always has bugs, on every firmware, though it's getting better).

I noticed this effect on my monitor as well. In DVI mode, it gives you every pixel without overscan. If you start sending those HDMI packets, though, it starts overscanning whether you want it to or not, and there is no option to turn it off on this particular monitor, unfortunately. Even the monitor's mode display gets dumbed down: it goes from e.g. "1920x1080 60Hz" to just "1080p" when switching from DVI to HDMI.

  • Like 1
Link to comment
Share on other sites

I noticed this effect on my monitor as well. In DVI mode, it gives you every pixel without overscan. If you start sending those HDMI packets, though, it starts overscanning whether you want it to or not, and there is no option to turn it off on this particular monitor, unfortunately. Even the monitor's mode display gets dumbed down: it goes from e.g. "1920x1080 60Hz" to just "1080p" when switching from DVI to HDMI.

I'm guessing, and it's just a guess, that some monitors are designed on the logic of: DVI didn't have product description infoframes, so assume 'PC' for everything DVI, and nobody wants overscan on a PC because it screws up your desktop and UI. And some are probably half-assed and assume everything HDMI is video, even though the monitor has a place to look up the source product type. And of course overscan was once a 'feature' for video, since it reduced the need for calibration. But even then, what a device decides to do with 'video' or 'game' or whatever is regrettably going to vary. And yeah, my nine-year-old plasma was more configurable than any of the sets I see today.

 

And on top of that, the OSSC only gives you two values and doesn't let you choose what they are, or even tell you what they are. But I found a doc listing a whole byte's worth of source product types, and who knows how many there are now. I could have sworn I saw somewhere that there was a 'text' type, but who even knows.

 

hBQ4YfF.png

 

Whew.

 

Edit: and on top of that, from what I'm seeing, HDMI prior to 1.4 didn't have a way to specify the source type as far as I can tell, which makes the jungle even thicker. Anyhow, I'm probably wrong about some of this, but what a mess.

Edited by Beer Monkey
Link to comment
Share on other sites

So, what is it about chips like the SuperFX and SA1 that makes them seemingly so hard to emulate? Is it a much higher level of complexity or have they just not been documented as thoroughly as other chips?

 

I'm assuming that's why they haven't been done in any of the major flash carts as of now, so correct me if I'm wrong.

Link to comment
Share on other sites

So, what is it about chips like the SuperFX and SA1 that makes them seemingly so hard to emulate? Is it a much higher level of complexity or have they just not been documented as thoroughly as other chips?

 

I'm assuming that's why they haven't been done in any of the major flash carts as of now, so correct me if I'm wrong.

I don't think documentation is the issue, as I'm pretty sure SuperFX and SA-1 are well emulated in software emulators. I think the reason neither of those chips has been done on the SD2SNES comes down to man/womanpower on the code side. AFAIK ikari and just a handful of other devs work on the SD2SNES firmware in their spare time. According to the SD2SNES blog, SuperFX should be able to fit on the SD2SNES FPGA, but SA-1 may require a hardware revision with more resources; SA-1 is basically a second SNES CPU. It'd be amazing if Kevtris, ikari and the rest of the SD2SNES contributors could collaborate and get full enhancement-chip support on both a hypothetical Super NT jailbreak and the SD2SNES.

  • Like 1
Link to comment
Share on other sites

 

The package came tonight at like 8-something. And I can confirm, along with others, that this console, at least with the default firmware, has issues with 720p/1080i TVs that don't support full HD. I made a video, but it's past my bedtime and I need to spend some more time playing with it before I upload.

 

If your Super NT refuses to display a picture, try the following, courtesy of wherewilf of NintendoAge:

Sorry, SELECT+DOWN is the default. I changed mine to SELECT+B.

So, if you want to try this blindly, you can do so. Make sure the console is on, then change it to 720p by doing the following:

 

1. SELECT+DOWN to bring up the menu

2. DOWN x 3 to mark Settings

3. Press Y to go into Settings

4. Press Y to go into Video

5. Press Y to go into Resolution

6. DOWN x 1 to go to 720p

7. Press START to change to 720p

 

That might help.

http://nintendoage.com/forum/messageview.cfm?StartRow=1051&catid=31&threadid=177652

 

Other thoughts: it is nice that Kevtris decided to allow horizontal and vertical offsets, but it would be nicer if zero were dead center, with plus/minus values moving the screen up/down and left/right. Currently I have no way of knowing where the centerline is.

Link to comment
Share on other sites

This is not true. Many gaming LCD monitors have 1 to 3 ms of input lag for drawing the very top of each frame, 8 to 10 ms for drawing half the frame, and 16.67 ms for drawing the entire frame. A CRT has the same property: it takes 16.67 ms for a 60 Hz CRT to draw an entire frame, so many gaming LCD monitors have no additional input lag compared to a CRT.

 

CRTs have "lag" too, in the sense that it takes time for the CRT to draw the lines from top to bottom, so an LCD's real measure should be compared against that baseline.

 

 

What you are talking about isn't input lag. When an LCD has a rating of 16 ms of input lag, that's an additional frame of delay between when you press a button and when your TV shows the change.

 

What SegaSnatcher said. We aren't talking about how a particular technology draws a frame. We are talking about the delay between when the screen gets the data and when it actually starts to draw. Display manufacturers DO NOT publish input lag data. That 1 ms figure BenQ mentions is GtG pixel response, which is how fast a pixel can change state from grey to white and back to grey.

 

Actual input lag values: https://displaylag.com/display-database/

 

But again, it doesn't really matter once you're down to that low of a delay; real-world gaming isn't affected. With the manual lag test, I can't do any better on a CRT than on my 4K set with the OSSC. That's what the manual lag test is really showing: how a particular display affects your gaming. So for me, my flat panel gives just as good a gaming experience as I would have with a CRT (minus light gun support, of course).

 

My CRT:

post-45470-0-23816900-1518585008.jpg

 

4K Set:

 

post-45470-0-42336800-1518585311.jpg


Link to comment
Share on other sites

So which 4K tv is this? None of them in that link have lag below 10ms but your test here shows 5ms.

Edited by Toth
Link to comment
Share on other sites

So which 4K tv is this? None of them in that link have lag below 10ms but your test here shows 5ms.

 

The manual lag test is not the method they're using to test these displays. It's just meant to give a rough estimate of how well YOU do on a particular display; it's not the actual value for the display. The manual lag test is like a basic rhythm game: you're trying to time button presses to shapes aligning, and it takes the average value over 10 cycles. My point in showing the manual lag test is that, for my gaming, I'm not going to notice a difference between a CRT and this 4K set with the OSSC or Super NT hooked up.
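
For anyone unfamiliar with it, here's a toy sketch of the idea behind that averaging (not the actual 240p Test Suite code, and the press values are made up): you record how far off each button press is from the moment the shapes align, then average over the 10 cycles. Display lag and your own reaction timing both end up baked into that average.

    # Toy model of a manual lag test: average the per-press timing error over 10 cycles.
    # The offsets below are hypothetical example values, not real measurements.
    FRAME_MS = 1000 / 60  # ~16.67 ms per frame at 60 Hz

    press_offsets_ms = [22, 35, 18, 41, 30, 25, 38, 27, 33, 29]  # press time minus "shapes aligned" time

    average_ms = sum(press_offsets_ms) / len(press_offsets_ms)
    print(f"average: {average_ms:.1f} ms ({average_ms / FRAME_MS:.2f} frames)")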

 

The device they are actually using is something like the Leo Bodnar tool. http://www.leobodnar.com/shop/index.php?main_page=product_info&cPath=89&products_id=212

You place it over an area of the screen and it sends out a video signal. It has a photo eye to see the image on the screen, and it measures the lag between when it sent out that pulse and when it sees the result.

 

Actual input lag for this display is around 14ms. Full review: https://displaylag.com/gaming-review-tcl-p-series-55p607-4k-hdr-tv/

Edited by keepdreamin
  • Like 1
Link to comment
Share on other sites

You're both wrong. The display lag numbers published on the net are often an average of lag measured at the top, middle, and bottom of the screen. High-end LCD gaming monitors draw the upper part of the screen before the middle and lower parts, so the input lag is near zero at the top, near 8 ms in the middle, and near 16.7 ms at the bottom. Averaging these measurements makes the lag seem higher than it really is, i.e., the first row of pixels is drawn with near-zero delay. Here is an example of the measurements showing different lag numbers depending on how far down the screen the measurement is taken:

https://youtu.be/mHIoJWtGR_w
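
To put numbers on that, here's a back-of-the-envelope sketch, assuming a 1080-line panel scanning out at 60 Hz with no extra buffering (the best case being described):

    # Scanout delay at 60 Hz: rows are painted top to bottom over one frame period,
    # so a row's delay is proportional to how far down the screen it is.
    FRAME_MS = 1000 / 60   # ~16.67 ms
    ROWS = 1080

    def scanout_delay_ms(row):
        return (row / ROWS) * FRAME_MS

    top, middle, bottom = scanout_delay_ms(1), scanout_delay_ms(540), scanout_delay_ms(1080)
    print(top, middle, bottom)            # ~0.02, ~8.3, ~16.7 ms
    print((top + middle + bottom) / 3)    # ~8.3 ms -- what an "averaged" figure reports

A CRT gives the same ~8.3 ms mid-screen figure, which is why averaging over the whole screen can make a display with essentially zero processing delay look half a frame "slower" than it really is.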

Link to comment
Share on other sites

So which 4K tv is this? None of them in that link have lag below 10ms but your test here shows 5ms.

 

Part of that is reaction time.

 

This is the result of the manual lag test on my 4K monitor; the Super NT runs at 1080p here.

 

240p lag test @4k

vxd85.jpg

 

480i lag test @4k

10fqnno.jpg

 

 

On my 1080p TN monitor with the splitter in place

 

240p lag @1080p

t6320x.jpg

 

480i lag @1080p

2j2utr5.jpg

 

Note how only the 480i mode seems to suffer.

Link to comment
Share on other sites

Just wanted to give extra thanks to Kevtris for using the 256 * 8/7 formula for the SNES horizontal aspect correction labeling in the Super Nt. I can't tell you how many times I've gotten into arguments with people who think you simply push the active graphics out to a 4:3 ratio and are done with it. Typically when you tell them about that formula, they latch onto the 8/7 part and think you're talking about an 8:7 pixel AR, and it just goes downhill trying to explain the difference.
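
For anyone who hasn't seen the difference spelled out, here's the arithmetic, assuming the commonly cited 8:7 NTSC pixel aspect ratio and a 224-line active picture (illustrative only, not a claim about exactly how the Super Nt scales internally):

    # Horizontal aspect correction: each of the 256 SNES pixels is 8/7 as wide as it
    # is tall, so the corrected width in "square" pixels is 256 * 8/7 -- NOT whatever
    # width happens to make the whole frame 4:3.
    ACTIVE_W, ACTIVE_H = 256, 224

    corrected_w = ACTIVE_W * 8 / 7    # ~292.6
    naive_4x3_w = ACTIVE_H * 4 / 3    # ~298.7 if you just push the frame out to 4:3

    print(corrected_w, naive_4x3_w)

    # The usual confusion: 256:224 also happens to reduce to 8:7, but that's the ratio
    # of the raw pixel grid, not the pixel aspect ratio the 8/7 in the formula refers to.
    print(ACTIVE_W / ACTIVE_H)        # 1.1428... == 8/7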

 

But yeah, it's so awesome that Kevtris looked into that and knew to use that formula. Huge props!

  • Like 2
Link to comment
Share on other sites

You're both wrong. The display lag numbers published on the net are often an average of lag measured at the top, middle, and bottom of the screen. High-end LCD gaming monitors draw the upper part of the screen before the middle and lower parts, so the input lag is near zero at the top, near 8 ms in the middle, and near 16.7 ms at the bottom. Averaging these measurements makes the lag seem higher than it really is, i.e., the first row of pixels is drawn with near-zero delay. Here is an example of the measurements showing different lag numbers depending on how far down the screen the measurement is taken.

 

You're right. :razz: But that 1 ms number BenQ talks about is GtG pixel response, which is about (lack of) motion blur. But I guess saying 1 ms is still roughly accurate for when the screen starts to draw, if your results mid-screen are around 9 to 10 ms.

 

I knew the results were taken from mid-screen. I just read the details on that Bodnar page; it actually states the minimum value at the bottom of a screen should be 16 ms.

 

So when talking about my 4K set, they're showing 6.8 ms at the top. So the actual delay is less than half a frame, nice. Explains why I can't tell a difference between it and a CRT. :thumbsup:

Edited by keepdreamin
Link to comment
Share on other sites

I'm waiting for the JB before I even order.

If it never comes to be, then I wouldn't have wanted a single-purpose FPGA-based console. I praise kevtris for the phenomenal achievement, of course; it is worthy of all admiration.

The price of the SuperNT is about right and it does have a bigger FPGA than the Nt Mini.

 

Given that I have all the consoles I care to have, all the flash devices I care for (aside from the upcoming Jaguar cart), a Framemeister for when I really, really want it that way, and a small crappy CRT for the few light gun games or 3D games based on CRT tech, it's not that I can't already play the games.

 

Personally (and it is obviously just my opinion), a single-purpose FPGA console is a bit of a waste, given that it CAN be reconfigured and we know kevtris has all the cores (at least the ones on the Nt Mini).

That said, given how "done" I kind of am with the 8-bits, I believe a couple more 16-bit cores (MD and PCE [I know, not fully 16-bit, but bear with me]) would make an even bigger splash in the scene.

 

We'll see what comes to be.

If by the time I decide to purchase (if that happens) the Super Nt is sold out, Analogue decides not to make any more, and the scalpers want 5x for it, then I'll do without..... If all of this happens, I truly hope kevtris makes enough money to buy himself the island of Lanai in Hawaii and go bananas there, or should I have said go pineapple.... At any rate, I could see other companies being interested in hiring kevtris's talent for more retro FPGA goodies and paying his weight in gold to get him (eat a lot before that interview, my friend).

Link to comment
Share on other sites

 

OK, I'm having a strange image issue with my OLED65B7A. (Anti-OLED brigaders, please save it for elsewhere. Thanks.)

 

I've been using the set since October with no issues and a variety of sources; specifically, no overscan, which is key to getting great image quality, ESPECIALLY when dealing with upscaled low-res graphics, since overscan resamples the image on top of what your device is doing, causing artifacts...plus there's the whole losing-screen-real-estate thing. Anyhow, I've had no problems, including using various classic systems with the OSSC, and no issues with my PC, Xbox One, etc., so I thought I'd be good.

 

However, the Super NT is triggering my display to overscan the image, causing the requisite issues. If I squeeze it down in the Super NT (I need a height of around 693 instead of 720 to get things to fit with a 256x239 test grid), then we're resampling on top of resampling, and that's even scarier.
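
Rough arithmetic on how much the set is eating, going by that 693 figure (just my math, assuming symmetric cropping):

    # If a 720-line image has to be squeezed to ~693 lines before the TV stops
    # cropping it, the set is overscanning by roughly:
    requested, fits = 720, 693
    cropped = (requested - fits) / requested
    print(f"{cropped:.1%} of the height (~{requested - fits} lines)")   # ~3.8%, ~27 lines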

 

I put my PC in 1280x720 60Hz and there is zero overscan; perfect 1:1 pixel mapping and nothing scanning off the physical screen. I plugged the Super NT into the same physical input with no settings changes, and if I set the height on the Super NT (firmware 4.1) to 720 there's major overscan (regardless of width). I have to crank the height down to 693 to get test patterns from the 240p suite to fit on the screen, but the resampling wrecks the picture with scanlines, of course.

 

Someone suggested changing the ITC flag in the OSSC to see if it caused the same issue, since some sets interpret ITC when deciding how much processing is OK on a given image.

 

Changing the ITC flag in the OSSC did nothing; there was still no overscan. BUT, I was able to get the OSSC to do the same thing (trigger the set to overscan) if I set TX Mode to DVI instead of HDMI. This is easily reproducible, and the amount of overscan looks the same per the 240p test suite grid pattern.

 

Another person suggested that limited RGB could be a factor; I put my PC in limited RGB mode (and even tried component 4:2:2 and component 4:4:4), and still, no overscan.

 

If anybody has a similar issue LMK how it goes for you.

I have the same TV as you, except it's the European model (55B7V). My Super NT will be delivered today, so I haven't been able to test it yet. However, make sure the "Just Scan" option is enabled in the settings. Also try the different aspect ratios to see if any of them changes anything; "Original" sounds like the most obvious setting, but you should try them all just in case. If it still doesn't work, then rename the HDMI port that you use for your Super NT to "PC" (without quotes and in uppercase). This magic renaming will also change the behavior of the TV slightly.

I will test mine this evening and I'll let you know what happens.

 

-Eicar

Link to comment
Share on other sites

Anyhow, I'm probably wrong about some of this but what a mess.

 

Derp. And of course I was. Though I'm still not sure what the OSSC does, nor am I sure from the behavior whether it's doing it reliably or accurately, either. Maybe they are setting the content bit to one but ignoring the content byte; but it worked for the one person who opened the support thread, so it was considered solved. Or maybe that's just the safest thing to do. I do NOT think it addresses the issue with the LG set(s), though. There are so many ways for a display to decide when overscan (and latency) is bad, and then on top of that there's the whole other side of the protocol, where the display can tell the device what it can/will do re: overscan (Video Capability Data Block, page 112).

 

https://standards.cta.tech/kwspub/published_docs/CTA-861-G_FINAL_revised_2017.pdf

 

I mean, I guess in a perfect world every device that's HDMI 1.4 or greater would at least minimally process 'Game' content (no scaling, minimum processing to avoid lag), though I'm sure there are millions of devices that don't. 'Graphics' seems like kind of a catch-all re: scaling but not latency, and there's a Panasonic plasma from 2013 that will apply (shudder) artificial sharpening to 'Graphics'...but back then the TV actually had a menu to override all extraneous processing for each individual content type (or the common ones).

 

What a rabbit hole.

 

8HKZ8Id.png

s0OVQOu.png

Ywpj3EI.png

 

My latest best guess, which is probably wrong, is that PC monitors receiving DVI assume no scaling (pixel-perfect), but that recent TVs are stupid about anything that is DVI or pre-1.4 HDMI and might do gosh knows what, and that recent TVs reading content flagged as 'Game' on 1.4-or-greater HDMI also assume no scaling.
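
For reference, here's roughly where those bits live in the AVI InfoFrame as I read CTA-861 (a sketch from memory; double-check the byte/bit positions against the spec linked above before trusting it):

    # Sketch of the AVI InfoFrame fields being discussed (as I read CTA-861):
    #   Data Byte 1, bits 1:0 -> S1:S0 scan info (0 = no data, 1 = overscanned, 2 = underscanned)
    #   Data Byte 3, bit 7    -> ITC ("IT content" flag)
    #   Data Byte 5, bits 5:4 -> CN1:CN0 content type (0 Graphics, 1 Photo, 2 Cinema, 3 Game)
    CONTENT_TYPES = {"graphics": 0, "photo": 1, "cinema": 2, "game": 3}

    def avi_content_bits(content="game", underscanned=True):
        db1 = 0b10 if underscanned else 0b01     # S1:S0
        db3 = 1 << 7                             # ITC = 1
        db5 = CONTENT_TYPES[content] << 4        # CN1:CN0
        return db1, db3, db5

    print([hex(b) for b in avi_content_bits("game")])   # ['0x2', '0x80', '0x30']

Whether a given TV actually honors ITC/CN, or only its own input label, is of course another story.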

 

I now return you to your regularly scheduled program of input lag testing discussion.

Edited by Beer Monkey
Link to comment
Share on other sites

So, does this mean that the "source device" (Super NT) needs to set the CN1 and CN0 bits in order for the screen to detect it as a Game device? Is the Super NT doing this?

I'm getting my Super NT today, and I have the same LG TV as @Beer Monkey; however, I also have a Nintendo Switch connected to it and I haven't had any issues with overscan.

So I guess that if the Nintendo Switch works on my TV without overscan over HDMI, then the Super NT should also behave the same way. Both transfer audio over HDMI, so they are both running in HDMI mode and not DVI.

Can't wait to test this when I get home...

 

As a side note... A lot of people have issues with overscan when using the Raspberry Pi. The Raspberry Pi has a special disable_overscan=1 setting that you need to add to the boot config file. This setting magically fixed my issues with overscan on an older Samsung TV. I'm pretty sure this setting has nothing to do with whether the video mode is HDMI or DVI, so I guess there is some magic "overscan" bit somewhere in the HDMI data stream that a device can set to control this. The question is whether the Super NT is doing this correctly...
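
For reference, the Pi setting being described lives in /boot/config.txt; a minimal example (the overscan_* lines are the opposite knob, for adding a border when a TV really does crop the edges):

    # /boot/config.txt
    disable_overscan=1       # don't render a compensating black border

    # or, if the TV genuinely crops the edges, add a border instead:
    #overscan_left=16
    #overscan_right=16
    #overscan_top=16
    #overscan_bottom=16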

Link to comment
Share on other sites

I don't know if this has been reported yet, but Unholy Night Darkness Hunters hangs after the first two screens with the music still playing. I tried it via SD2SNES as well as the original cartridge. I'm still using the factory firmware however. I will update to the latest version tomorrow and see if this problem persists.

I've found this game to occasionally lock up on original hardware as well; I think it was rushed or something. Which is a shame, because I really wanted to get behind it.

Link to comment
Share on other sites

Hi, first post here. I've been enjoying playing the Super NT since Friday, when it arrived at my house in the UK.

 

Just wanted to mention a couple of issues I have had on the Super NT. I have an original cart of the Super Famicom version of Muscle Bomber and have noticed some graphical glitches, with lines running through the player while playing as Haggar; not sure if this is my cart or the Super NT.

 

Also, the Joe & Mac collection from Retro-Bit has had issues with the games not working since the 4.1 update; the original Joe & Mac has a garbled intro screen, menu, and in-game glitches.

 

I also noticed some very minor graphical glitches in Super Turrican: Director's Cut, but I'm not sure if they are present in the original.

 

All carts used were cleaned, and the firmware was updated to 4.1.

 

Thanks

Link to comment
Share on other sites

So, does this mean that the "source device" (Super NT) needs to set the CN1 and CN0 bits in order for the screen to detect it as a Game device? Is the Super NT doing this?

 

 

Only if the monitor/HDTV/capture device actually respects those bits. To go back a few pages, I use an SA7160-based capture device; it has a very useful panel.

 

This is what the TV box normally does at 1080i

5y7td0.png

 

This is what the SuperNT (4.1) does at 1080p

 

w18m5i.png

Link to comment
Share on other sites

I got mine in yesterday. Tested it with a bunch of stuff and everything ran perfectly. I didn't touch any of the graphics settings; I see no point in doing that. It looks perfect out of the box, IMO.

 

I did change the font and a couple of other things, and made the console boot straight to the cartridge.

 

Kevin, Analogue (what was the name of the guy who made the case again?), good job, people. Seriously. What a nice piece of kit. This does feel premium.

 

The ports really are great, and the cartridge slot is working absolutely perfectly for me. Now goodbye. I'm gonna go play and test stuff. :)

Link to comment
Share on other sites
