Kismet
Everything posted by Kismet
-
Cost. Up front, if you have thousands of dollars to throw at a game console, you could make it do anything. There is a concept called the "law of diminishing returns," the point past which no amount of money will ever justify the investment (be that cost or time.) The price point, as indicated by the poll at the beginning of this thread, is $199. It's within reason to simulate any chip that operates up to about 40 MHz in a cheap FPGA. Past that point the price of the FPGA jumps.

To give you an idea, take the Cyclone V used in the Super NT. The Cyclone V is a 28nm chip built on the same process as CPUs from 2013-2016. The Cyclone 10 is a 14nm chip, so it has 4x the amount of logic in the same area.

Cyclone V, E series: the A2 is $50 (25K LE), the A4 is $71 (49K LE), the A5 is $126 (77K LE), the A7 is $227 (149.5K LE), the A9 is $373 (301K LE).
Cyclone 10, LP series: the 10CL025 is $39 (24K LE), the 10CL040 is $56 (39K LE), the 10CL055 is $88 (55K LE), the 10CL080 is $119 (80K LE), the 10CL120 is $205 (119K LE).

Both of those series are generally labeled "low speed/low power" chips. For the purposes of just getting a price I picked the first chip that showed up on DigiKey.

Now if you go look up the soft cores for the SH-2 (you'd need two of them for the 32X or Saturn): according to jcore's own website you'd need at least a Spartan 6, the XC6SLX9, which has 9K LE and costs $25. It can run the SH-2 core at 50 MHz. So the means to do the CPUs already exists. That does nothing about the 32X VDP, audio, or bus/bandwidth/memory requirements. It may very well be possible to do the 32X, but it might require more block memory in the FPGA, which only the most expensive models provide, or you have to start adding SRAM/PSRAM to the FPGA console.
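To make the price jump concrete, here's a quick dollars-per-thousand-logic-elements comparison built only from the prices quoted above (first DigiKey hit, so treat the numbers as a rough sketch, not a sourcing guide):

```python
# $/1K-LE comparison, using the list prices and LE counts quoted above.
cyclone_v_e = {"A2": (50, 25), "A4": (71, 49), "A5": (126, 77),
               "A7": (227, 149.5), "A9": (373, 301)}
cyclone_10_lp = {"10CL025": (39, 24), "10CL040": (56, 39), "10CL055": (88, 55),
                 "10CL080": (119, 80), "10CL120": (205, 119)}

def dollars_per_kle(parts):
    """Map part name -> price divided by thousands of logic elements."""
    return {name: round(price / kle, 2) for name, (price, kle) in parts.items()}

print(dollars_per_kle(cyclone_v_e))
print(dollars_per_kle(cyclone_10_lp))
```

The per-LE cost stays fairly flat within a family; the jump is in the absolute price of the bigger parts you'd need for faster or more complex chips.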
To emulate a complete 32X, Sega CD, Mega Drive, and Master System/SC-3000/SG-1000/Game Gear you need:

2 x SH-2 cores
2 x 68K cores
1 x Z80 core
1 x 32X VDP
1 x Mega Drive VDP (includes the PSG and the Master System VDP features)
1 x Sega CD ASIC
1 x YM2612 (OPN2)

plus various other chips like the RAM, DMA controller and so forth. The MD PCB has 12 chips on it. The Sega CD board has 16. The 32X I think has 8, but most PCB images tend to only show one side of it. So you're looking at 36 chips, or at least the logic from them, working together.

As an FPGA console, doing all of that probably feels overly redundant when there are shortcuts that can be taken knowing how the 32X, Sega CD, and Mega Drive features fit together. To begin with, you'd stick all the VDP parts together to remove the need for genlocking an analog signal and multiple DAC-ADC phases. Likewise for the audio. There is also at least one DSP cart for the Mega Drive (Virtua Racing.)

I completely see why doing the 32X and the Sega CD might be an incredibly insane undertaking. We know the MD is doable because there is already one partially-working core for the MiST/MiSTer project, though that core is also incomplete. The most likely thing to happen would be to discard the 32X and Sega CD "attachment" features altogether and only ever run those from an SD card, as that eliminates the most expensive parts. There is a reason the Genesis 3 doesn't support either, and that's pretty much down to the number of parts that would be needed to support something most people won't have.
-
Better here than... say the MLiG stream, where it was asked every 2 minutes. As has been said many times in this thread, if people want to play ROMs, there are already options for that. kevtris could go "surprise, it will support (homebrew) MSU-1 ROMs from the SD card only," thus enabling an entirely digital path for the audio. Or maybe kevtris can make his own "expansion chip" cart to sell that runs other cores and only works on the Super NT, by connecting it directly to the scaler when detected. At any rate there are plenty of ways of doing things; maybe no JB firmware is forthcoming and some other way of "monetizing" other cores comes along. It's not like the SD2SNES is a closed system.
-
You do realize that the features in the NT Mini JB firmware are not in the official firmware, right? Nowhere has Analogue said anything about the NT Mini JB firmware other than that it won't void the warranty. Go watch kevtris's interviews and videos about the NT Mini: basically he had all these other cores long before Analogue came along, so it was a no-brainer to do this. But the Super NT agreement likely has some provision that kevtris can't/won't release any non-official firmware within some time frame. I have to imagine Analogue was not blindsided by the NT Mini JB firmware appearing a week after the NT Mini was released, but it was also a premium product that, up to that point, was only different from the RetroUSB AVS, which cost half the price. So if the JB made the NT Mini more attractive to one set of retro enthusiasts, that wasn't really doing any harm to Analogue as long as it wasn't being marketed as a feature in the first place. If anything, it probably resulted in fewer sales for the RetroUSB AVS. In no case has the firmware for any of these devices been designed to prevent flash carts. Had they done that (and they sure could have), then you might question the motives for it.
-
They aren't answering that question because it would affect things negatively if they did. There are two scenarios, and we're not even looking at the legal side. A) Someone drops the ball and announces the JB firmware, sales spike, and everyone who really wants one sits in preorder hell. Then the JB doesn't happen and people start wanting refunds. B) No JB is ever announced, and 2 years later it just shows up one day unannounced, like any other JB firmware for any other device, and sales tick up. Analogue already has sales figures for the NT Mini; they can see exactly at what point the JB spiked interest in it, and likely also when people started complaining that they couldn't buy one. If it never gets a JB, then people who bought it for the potential JB, or to load it up with ROMs and resell it on eBay, start wanting their money back. That is why you never market that feature, and the preorders were doing fine without any suggestion of a JB ever becoming available. If it's not promised, then people can't demand a refund for a feature that was never advertised in the first place. That is why nobody gets an answer to that question. Even kevtris cannot answer it, because people here at least know he's capable of doing it. Will it kill anyone to just stay patient? It's not like the Everdrive and SD2SNES are going to vanish.
-
5-5-5 is 15-bit color, commonly referred to as 32K colors. If higan is as accurate as byuu claims it to be, then yes, the 4:4:4 HDMI output from the Super NT should have no subsampling on the input side and should look identical. Keep in mind that many capture devices, because they do hardware encoding, actually capture natively at YUV 4:2:0 subsampling. The HD Retrovision component cables from a Super NES are as good as you're going to get without modifying the console. So a slight difference in brightness should be the only thing perceived if all other things are the same, which is what the gamma setting in the Super NT is for. For reference, this is what a GPM-02 with the HD Retrovision cable looks like on my SA7160 (Micomsoft SC-512) (set to NTSC Japan, apparently), to reference back to Karbuncle on the previous page (higan >>> SNES Classic >>> Junior console modded with Voultar board >>> 1CHIP-03 with 750 Ohm resistors >>> APU SNES with subcarrier interference removed):
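To show what 4:2:0 subsampling actually throws away, here's a small sketch (my own illustration, not anything from a capture device's docs) comparing raw frame sizes at 8 bits per sample:

```python
# Raw bytes per frame for full-chroma 4:4:4 versus the 4:2:0
# subsampling many hardware encoders capture at. In 4:2:0, each
# 2x2 block of pixels shares a single Cb and a single Cr sample.
def frame_bytes(width, height, subsampling="4:4:4"):
    luma = width * height                          # one Y sample per pixel
    if subsampling == "4:4:4":
        chroma = 2 * width * height                # full-res Cb and Cr planes
    elif subsampling == "4:2:0":
        chroma = 2 * (width // 2) * (height // 2)  # quarter-res Cb and Cr
    else:
        raise ValueError(subsampling)
    return luma + chroma

print(frame_bytes(1920, 1080, "4:4:4"))  # 6220800 bytes
print(frame_bytes(1920, 1080, "4:2:0"))  # 3110400 bytes - half the data
```

Half the frame data is gone before the encoder even starts, which is why sharp single-pixel color edges (exactly what retro consoles produce) smear on those capture paths.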
-
There is no palette for the SNES as a whole; it's just 5-5-5 BGR. It builds a 256-entry palette out of 15-bit values, and there is also color math involved, which is why there can be more than 256 colors on screen (things like clouds, fog, smoke, water.) So when people start arguing about one emulator or hardware version looking better than another, it's not the output from the PPU that is different, it's the signal calibration against the black level. NTSC-J (Japan) and PAL games have a different black level than NTSC-M (America). Likewise NTSC has different color values than PAL. The black level is just different enough between NTSC-J and NTSC-M that people get a bit upset when the Nintendo Wii/Wii U Virtual Console is significantly darker than they expect, yet somehow the Wii U tablet screen shows this as the expected color output. So I really do expect some people to fail to notice this, and instead be told to change the gamma on the Super NT depending on whether they're playing a US or non-US game, if the game is too dark.
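For anyone curious what "5-5-5 BGR" means in practice, here's a minimal decoder for one SNES CGRAM color word (the bit layout is the standard documented one; the 5-to-8-bit scaling method is my own choice, one of several reasonable ones):

```python
# Decode a SNES 15-bit BGR (5-5-5) color word into 8-bit RGB.
# CGRAM packs each color as 0BBBBBGGGGGRRRRR (bit 15 unused).
def bgr555_to_rgb888(word):
    r = word & 0x1F
    g = (word >> 5) & 0x1F
    b = (word >> 10) & 0x1F
    # Scale each 5-bit channel to 8 bits by replicating the high bits,
    # so 0 -> 0 and 31 -> 255 exactly.
    scale = lambda c: (c << 3) | (c >> 2)
    return scale(r), scale(g), scale(b)

print(bgr555_to_rgb888(0x7FFF))  # (255, 255, 255) - white
print(bgr555_to_rgb888(0x001F))  # (255, 0, 0) - pure red
```

Every emulator and FPGA core starts from these same 32,768 possible values; the visible differences come later, in gamma and black-level handling on the output side.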
-
Super Nintendo Classic Edition - SNES Mini thread
Kismet replied to Rev's topic in Classic Console Discussion
They have never been restocked in Canada as far as I've seen. It's not even in Best Buy's or EB Games' online catalog. -
When I get mine, I'll probably make an effort to compare the SNES and the Super NT, but I'll probably have to move the surround sound box to the PC to do it, as it's currently set up around the CRT.
-
https://www.rtings.com/tv/learn/permanent-image-retention-burn-in-lcd-oled Basically, if you're really fond of news channels, video games, or cable/PVR units with persistent UI widgets, those widgets will eventually destroy the screen after 840 hours of being lit. People who play MMO games certainly play long enough to do that, and likewise people doing speed runs. I'm not saying "don't buy an OLED, they suck"; I'm saying they are not the option I would go for for a video game screen. Even taking into account the input latency (which is worse than on IPS panels), they will have a much shorter usable lifespan. The OLED use case is basically the Netflix cord-cutter crowd, where a lot of OLED's problems aren't relevant because the UI is only visible for a few minutes before video is playing.
-
Since when do people play a game for just one hour?
-
It takes 5 weeks to destroy an OLED: https://www.rtings.com/tv/learn/permanent-image-retention-burn-in-lcd-oled
-
OLEDs have very poor life spans right now, so I would avoid them for anything but "movie" screens. You don't want to watch the news or play games with a persistent HUD on them, because OLEDs get screen burn-in really quickly. I think OLEDs may have some use as screens that are not in constant use, but they're certainly terrible as mobile and always-on "TV" screens.
-
You know... I stuck with my 2005 laptop until about 2015, when it would no longer boot. I stuck with my Samsung LCD monitor from 2008 until last year, when I bought a 4K monitor. The secondary monitor was originally a TN panel that went "bang" one day and was replaced with a BenQ around 2010, whereas the previous monitor was bought in 2006. The lifespan of a computer monitor that gets a lot of use is roughly 10 years. Same with a laptop. Most consumer electronics are good for at least 7 years, but each product should only be replaced when it no longer does what you need it to do. I'm tempted to replace my 2012 iPad because the software is pretty slow and Apple hasn't updated iOS for it in 2 years, but because I don't believe it has hit the point of obsolescence yet, I'm not replacing it right now. Likewise, the desktop I have now was upgraded incrementally based on the requirements of the software I was running, not because I felt I needed something newer. I only have USB 3.0 ports on my current desktop case because I had to get a bigger case to fit the video card; I'd had the previous case since 2006 or so.

A lot of the failure of the "3D screen" marketing can be laid directly at the feet of the manufacturers, who rolled these screens out to give the crappy theater experience an even crappier home equivalent. Nobody standardized on 3D glasses, the market was left to decide, and the market decided that 3D sucks. Likewise the market has decided, again, that VR sucks. People do not want to invest in the hardware, the HMD, or a reserved 12'x12' living space to play the games. No standard HMD, no standard input devices (clunky Wiimote-like controller systems.) The Power Glove was in fact the right input device for VR all along, but somehow we're trying to do it with cameras and no haptic feedback.
So us, the people who like retro games, whether because those are the games we like or because we have nostalgia-tinted glasses on, are the very last people who should be saying "upgrade your stuff," because we know that replacing the SNES hardware, with its original controllers and CRT, with an RPi plugged into an HDTV and connected to whatever rubbish USB/Bluetooth controller, is not a reasonable thing at all. It's like saying "Hey Ma, I'm going to replace your vintage car that gets low fuel mileage with a Prius that has all these fancy features," when all Ma wants is to get in the car and turn the key, and she was doing just fine with the vintage car.
-
It's an inevitability that one or both will happen at some point: 1) Some Chinese bootleg factory will try to use the FPGA code to make their own SNOAC/NOAC, and thus you will start seeing $90 Retrons made from $4 worth of parts. 2) Nintendo, or some third party (eg Capcom or Konami, which both produce arcade hardware), will license the FPGA code and produce their own chips to make arcade "throwback" boards that can be installed in original arcade cabinets with an HDMI screen, or make a much cheaper NES/SNES Classic-like device featuring just their own company's games. There is certainly a market for plug-and-play stuff, seeing as Nintendo can't keep the things in stock, and I imagine Square Enix, Capcom, Konami, and pretty much everyone else would love to produce their own little box that cuts Nintendo completely out. Imagine a production run of a box that plays every fighting game released for the SNES, Sega Genesis, and arcade, designed for low-latency input. There's certainly a market, FPGA or ASIC, for making retro hardware live again that isn't currently met by the software emulation of the Virtual Console. It would very much be in these companies' interest to standardize on one device (eg the Super NT), but that doesn't necessarily mean they will want to produce new cartridges. They may well be interested in producing SD cards with the games on them, provided there isn't a way to re-dump the games, or maybe they just won't care, knowing there are plenty of illegal dumps out there anyway.
-
Offhand, "unknown signal" would suggest the mode setup isn't compatible, or the TV doesn't set its EDID data correctly. Case in point: I have a BenQ computer monitor that supports 1080i. I can run the desktop at 1080p when it's connected to a PC, but if I select 1080p while it's plugged into the TV box I instead get a black screen with "Out of Range!" despite the fact that it does support 1080p. Also, the EDID in the TV can be overwritten/corrupted. If you can plug the TV into your computer, grab the EDID with http://www.entechtaiwan.com/util/moninfo.shtm; that might help with troubleshooting it.
-
If you hear sound out of it, it has to be working, though it seems kind of odd that you can update the firmware with no picture. Are both TVs the same model? Does the TV say what mode it's in, or does it just stay black? Also try another USB cable; as mentioned earlier in the thread, some cables are troll cables or are too cheaply made and cause voltage drops.
-
That's probably just prioritization. Remember that they ship mostly-empty boxes all the time because they fit the back of a truck better.
-
We need some new Sega hardware!
Kismet replied to my80chevette's topic in Classic Console Discussion
A DIY project is not a complete project. The DE10 is a stand-alone SoC board, so that gives the developer a bit more flexibility in what they want to simulate in the FPGA and what they emulate in software (eg storage devices), but the MiSTer project is incomplete at best. -
A human cannot see a difference where no difference occurs. If the screen is being updated at exactly 60 Hz, then detecting a change at non-multiples of 16.7 ms is impossible. On a 120 Hz or 240 Hz panel you should in fact see 8.3 ms and 4.2 ms frames in these tests, but instead we're seeing 16 ms and 32 ms, even on the OLEDs. There is a reason for this; go to the page where they explain how they test it. https://displaylag.com/the-lag-tester-a-new-standard/ What it's testing is a white-to-black response, so a screen with a slower white-to-black response gets a poorer score. Which the author states here: So the display latency measurement is supposed to be the bottom bar, which is always the worst number. Take note that the lag tester is not designed for 4K (leobodnar.com), so when it is used on a 4K screen, it's letting the monitor use its own scaler. Likewise it's not testing 120 Hz or 240 Hz; it's only testing how the monitor behaves with a 1080p60 signal. edit: tag soup broke everything
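The frame-period arithmetic behind those numbers is just the reciprocal of the refresh rate:

```python
# Frame period at common refresh rates. A tester that only reports
# multiples of the 60 Hz period (~16.7 ms) cannot resolve the finer
# granularity a 120 Hz or 240 Hz panel is actually capable of.
def frame_period_ms(hz):
    """Duration of one frame, in milliseconds, rounded to 2 places."""
    return round(1000 / hz, 2)

for hz in (60, 120, 240):
    print(hz, "Hz ->", frame_period_ms(hz), "ms per frame")
```

So a reading of "16 ms" from a 1080p60-only tester tells you nothing about how the panel behaves at its native 120 or 240 Hz.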
-
That's not what I said. I said that you'll notice a difference once the latency crosses 16ms, because that's when the audio will stop being in sync. Or are you going to suggest that DDR and Street Fighter players are hallucinating?
-
Depends what you're playing. The requirements for video and the requirements for 7th/8th-generation game consoles are a bit wider than those before them. There will be some inherent latency over HDMI that isn't present over DisplayPort, but we're talking about eDP in that context. But that latency database is not likely to be 100% accurate either, since you need the same conditions for every device, and even changing cables can change the number. LG got a lot of backlash for the latency in their 2016 models, which they somehow managed to patch in firmware, so that is another factor to consider. But out of the box people rarely change defaults, so you just need to find what you can deal with.
-
Note how all 240 Hz monitors have high lag, and all OLEDs and all of Samsung's QLEDs do too. The only monitors at 9 ms are IPS monitors, which are a premium. Likewise the only HDTVs at 12 ms are 4K IPS as well. I have an MG24UQ 4K 24" monitor; it's 10 ms. You're doing OK if it's 12 or under. When you cross 16 ms, it's very noticeable in a video game. For home theater, you have to start adjusting the surround speakers' delay when it hits 16 as well.
-
If it's the thing I'm thinking of, it's because the GBA runs through an ARM SoC. The same SoC lets it play MP3s and such.
-
That's not how USB cables work at all. http://www.usb.org/developers/docs/ All USB cables must support data transfer and power. On the device side, USB connections may be used for power or power+data. Devices with their own power supply are supposed to supply power to other devices plugged into them. "USB chargers" are simply USB ports with no data features; they can supply more power if the D+/D- pins indicate as much. The cables themselves are meaningless except for what speed they can transfer data at. Some devices may have a proprietary connector instead of another micro-USB or USB-C connector; for example, Apple's chargers simply have the USB 2.0 logo on them, but they are still data cables. Notice that "with power delivery" shows the USB icon with a battery in the background; that is on the host device side. The USB-IF does not license "charge only" cables. If you have "charge only" cables (eg ones that came with a USB wall wart), they will not look any different from any other USB cable.
-
Yes and no. It would be possible to create an "FPGA cart" and put any game console with a maximum 256x224 resolution in it. The GBA is 240x160, and both do 15-bit color. But because of the bus speed, it would not be possible to push a [email protected] image from the GBA framebuffer to the SNES. If you note, the MSU-1 only does [email protected] So while it would be possible to make something that could output video through the SNES PPUs, it would likely be less difficult to simply have the FPGA cart tell the FPGA Super NT to bypass everything and route the cart's output directly to the upscaler (thus the Super NT ends up being more like an OSSC plus controller inputs for the FPGA cart.) That would not work on a real SNES. Who knows, maybe a "dummy core" could be made where the Super NT just takes a digital NTSC/PAL input sent along the data bus as YPbPr/RGB and lets it use the same scaler/filter. Sticking in something like a "MiSTer" cart would then allow it to use MiSTer cores.
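To see why the bus speed is the bottleneck, here's a back-of-the-envelope bandwidth sketch. The frame sizes come from the resolutions mentioned above; the 60 fps and 30 fps rates and the single bytes-per-second figure are my own illustrative assumptions, since real cart-bus and MSU-1 streaming limits are narrower and more complicated than this:

```python
# Rough bandwidth needed to stream a raw 15-bit framebuffer,
# assuming 2 bytes per pixel. Illustrative numbers only; actual
# SNES cart-bus throughput is not a single flat bytes/sec figure.
def stream_bytes_per_sec(width, height, fps, bytes_per_pixel=2):
    return width * height * fps * bytes_per_pixel

print(stream_bytes_per_sec(240, 160, 60))  # GBA-sized frame at 60 fps
print(stream_bytes_per_sec(256, 224, 30))  # SNES-sized frame at 30 fps
```

Even these simplified figures are in the several-megabytes-per-second range, which is why routing the cart's video straight to the scaler is so much more attractive than forcing it through the SNES side of the console.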
