
Kaide

Members
  • Content Count

    129
  • Joined

  • Last visited

Community Reputation

47 Excellent

About Kaide

  • Rank
    Chopper Commander
  1. Ah, yeah. That’s foam padding, not a sponge. Foam padding like that shouldn’t be conductive, but I’ve never seen it used on top of components and contacts like this before, and I’d say it generally shouldn’t be used in this manner. The last thing you want is something getting trapped by the foam and creating a conductive path for a short. EDIT: I should have read more first. FirebrandX does point out that the foam in the repair is conductive, to ground the board to the chassis. That part would make sense (if not an ideal way to do it). So why was the foam originally sitting over an IC that likely carried Vcc and ground? Or was there no grounding at all before?
  2. For early systems like the NES, it’s simpler to load the whole ROM into the FPGA’s memory and access it directly. So yes, the SD card shouldn’t even be a factor once things get going, since the ROM file has already been fully loaded. Emulators do the same, and the MiSTer uses a larger FPGA development board, which should have even more memory on board. That said, NES games did have slowdown issues on real hardware, including some Nintendo titles like Kirby’s Adventure. Bubble Bobble is another notorious one; I was a fan of it, but find it hard to play today compared to the arcade ROM. Since I spend most of my time in the SNES catalog, I don’t really remember whether SMB3 was one of the titles with slowdown issues. It would definitely be worth comparing to real carts, though.
  3. It’s clear you didn’t really read what I was saying, or misunderstood the points I was trying to make. The core point is that there’s nothing specific about retro gaming that is any different from other uses of a modern TV, so there’s not a whole lot of additional texture someone on this forum can add that you can’t find elsewhere. About the only thing that might matter is compatibility with non-standard refresh rates when using an OSSC or something like that. Sadly, nobody really reports on that, so it’s a bit of a lottery, and it wholly depends on the controller used; it has nothing to do with OLED vs LCD. If you use an AV receiver or processor, that’s another link in the chain that can break OSSC/etc., making it even more annoying.

Note that I mentioned that an LG-based OLED will have better viewing angles, hands down has better contrast, and I even commented on the faster response times of the panel itself in my comments on motion clarity. In terms of motion clarity, I think it’s a trade-off (and an annoying one to make). And I even mentioned that for things running at 60fps, the OLED pulls ahead, IMO. But honestly, sample-and-hold has been a huge step back from CRTs that we still haven’t solved; we just apply bandaids to it.

As for image retention and burn-in, I intentionally used the phrase “image retention”. My set is 3-4 years old, and it gets retention. I haven’t had any burn-in. But since LCD-based tech like QLED can suffer DSE (as can OLED) and have a similar effect on the picture, it’s honestly a wash.

The problem with pure measurements in this case is that people tend to compare based on those measurements without context, or wind up ignoring the context of those measurements and making mountains out of ant hills. Color accuracy on average is generally good enough, with a few exceptions, and calibration will tend to drag dE low enough that it makes no visible difference. The Samsung Q80 and the LG CX both measured at a dE of under 2 out of the box. That’s ideal. Assuming you aren’t running around in a red-pushed mode or some other nonsense, that’s a great place to be. Color gamut/volume is another issue, but so far the two technologies have generally kept pace with each other closely enough that I wouldn’t go chasing it in a buying decision (and Rtings reports better color volume on the Samsung Q80 QLED set, BTW). So yes, different displays will measure differently, but keep in mind that samples of the same model will vary here, so there are margins of error to account for when comparing.

There used to be some places that measured motion resolution, but that’s gone away in favor of the easier-to-measure response time as sites try to cover more models of TV with fewer resources. Response time is an important factor, but it’s not the whole picture. Much like you need to model the human ear to get a better idea of how headphones and speakers work, you need to model the human optic system to get a better idea of how different displays actually present motion to a person, rather than to a camera. To be blunt, motion handling is the area where TV reviews are honestly terrible.

One of the reasons OLED has such a “wow” factor is contrast. Contrast is one of the things the human eye is best at picking up, and OLED is a clear leader there (but so were plasma and CRT, and look what happened to them). So long as the processor isn’t introducing black crush (a problem LG had a while back in early OLED TVs), shadow detail can’t be beat on an OLED. That said, retro games with small palettes aren’t really impacted by that, are they. :)
  4. Quality in terms of what? Color? Contrast? Viewing angles? Motion? Image retention? Really, there’s nothing between the different panel technologies that says to me someone should go OLED vs LCD specifically for retro gaming.

Color accuracy is close enough that both are equally good (assuming we are talking about Samsung QLED/MVA panels and LG OLED panels). Contrast and viewing angles go to OLED. The MVA panels Samsung is fond of using in TVs bloom when using zone dimming, and can still leak considerable light, leading to elevated blacks without zone dimming; this is mostly an issue in darker rooms. MVA panels also aren’t great at wide viewing angles, while OLED is more in line with IPS LCD panels. Image retention goes to QLED or any other LCD-based tech. It doesn’t bother me, since my use keeps it minor enough that I only notice it with single-color backgrounds (similar to dirty screen effect), but it is something I’ve had to deal with using OLED.

Motion depends a lot on the controller driving the panel, although the tech can affect what you can do. QLED/LCD and OLED are both sample-and-hold tech, which messes with how your brain interprets what it sees, causing visual artifacts. My old Sony 1080p LCD had a feature to strobe the LED backlight at 480Hz. This was great for film and other stuff in the 24-30fps range, since it got incredibly close to the look of CRT when it came to motion clarity. LCD/QLED needs time to transition between frames, which tends to “smear” the frames and blur them. Strobing the backlight helps “reset” the brain’s visual processing between frames, much like projectors and CRTs do. OLED has no backlight, so you can’t do this strobing trick the same way. Sony recently started offering a sort of “rolling blackout” on their OLED panels at 120Hz which helps, but because of the pandemic I haven’t seen one in person yet with the feature enabled. I’m watching this particular addition closely, though.

With both OLED and LCD/QLED, the effects of sample-and-hold can be lessened by higher frame rates, so 60fps gaming will look fine on both sets. I’d give the edge slightly to OLED, since you get slightly crisper motion thanks to the faster response times, but I honestly kind of prefer the “blur” of 24-30fps LCD to the “double image” I get from 24-30fps OLED. So newer consoles running games at 30fps, to me at least, don’t look as good in motion on OLED as they do on a good LCD-based panel with a strobing backlight. My ranking for TVs in motion clarity tends to be: LCD w/strobing backlight > OLED > LCD w/o strobing backlight.

When it comes to my next TV, it’s going to be about finding a “good enough” balance between contrast and motion clarity for me. That might mean going back to a full-array backlit LCD of some kind.
  5. I guess my question is: what specifically do you want to know? I’ve been using an OLED TV for retro consoles since before the Super NT came out.
  6. Mappers are transparent to the cartridge bus. Even with bank switching, the game just writes to a couple of specific address locations to tell the mapping chip to update which banks are accessible (rough sketch below). So you really just need to figure out the cartridge bus and you should be in good shape. I see what you are getting at with the idea, but it has some trade-offs that don’t necessarily make a lot of sense to me, unless you are thinking of the Pocket itself as the target architecture, versus, say, a game meant for a retro console running on the Pocket. That said, if we are talking about retro consoles on the Pocket, then homebrew as ROM files makes more sense to me, either done via a specialized flash cart with the “adapter” embedded into it, or through direct support of the SD card (like it apparently has for GB Studio). Physical carts for games are nice collectibles, but I’d personally prefer those be in the format of the original system so they’re compatible with original hardware.
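To make the “transparent to the bus” point concrete, here’s a minimal Python sketch of a bank-switched cartridge, loosely modeled on UxROM-style NES mappers. The class and method names are mine and the details are simplified, so treat it as an illustration rather than a spec:

```python
BANK_SIZE = 0x4000  # 16 KB PRG banks

class BankedCart:
    """Toy model of a bank-switched cart; not any specific mapper."""
    def __init__(self, prg_rom: bytes):
        self.prg = prg_rom
        self.bank = 0                              # switchable bank select
        self.last = len(prg_rom) // BANK_SIZE - 1  # fixed final bank

    def cpu_read(self, addr: int) -> int:
        if 0x8000 <= addr <= 0xBFFF:               # switchable window
            return self.prg[self.bank * BANK_SIZE + (addr - 0x8000)]
        if 0xC000 <= addr <= 0xFFFF:               # fixed window
            return self.prg[self.last * BANK_SIZE + (addr - 0xC000)]
        return 0xFF                                # simplified open bus

    def cpu_write(self, addr: int, value: int) -> None:
        # The game "writes to ROM"; the mapper latches the value as
        # the new bank select. Nothing here is dumpable data.
        if 0x8000 <= addr <= 0xFFFF:
            self.bank = value % (self.last + 1)
```

The point is that the CPU only ever sees reads and writes on the cartridge bus; the bank register is volatile state inside the mapper, which is why a plain ROM dump can’t capture it.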
  7. 1) Including the core in the cart itself creates more problems than it solves, IMO. The cart adapters do seem to get detected by the MegaSG to kick it into the appropriate modes, so it seems possible to me, but I’m not certain how it’s implemented. 2) Possible, but you’d need a bridge to accomplish it. One reason there are so many pins is that you’ve got two address buses and two data buses. So your adapter needs to effectively change how data is passed along the 32-pin connector, and then present the expected buses to the cartridge. There are a few ways to do it, but my naive concern would be getting the timings right.
  8. You are asking the wrong person to explain someone else’s design, TBH. That said, the behavior is there, on the line starting with: "CONSTANT MAPS : arr_jmap := (". Best I can figure from a closed issue on the GitHub repo, and from the VHD files (it’s been nearly two decades since I last used VHDL, mind you), is that it defaults to mapper 0 unless the ROM’s CRC matches one of the CRCs in this array. If it matches, it uses the mapper index specified (see the sketch below). This trick only works if the dump is a known one; if it isn’t known, it assumes mapper 0. Since the vast majority of Intellivision games use mapper 0, based on the spreadsheet the author references, this effectively works. Still not ideal from an engineering perspective, but “good enough” so long as you aren’t dumping your own ROMs.
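For anyone who doesn’t read VHDL, here’s roughly the same behavior as a Python sketch. The CRC values below are placeholders; the real table lives in the MAPS constant in the VHD file:

```python
import zlib

# Placeholder CRC32 -> mapper-index entries; the real values live in
# the "CONSTANT MAPS : arr_jmap := (" array in the core's VHDL.
KNOWN_MAPPERS = {
    0x12345678: 1,
    0x9ABCDEF0: 5,
}

def pick_mapper(rom: bytes) -> int:
    crc = zlib.crc32(rom) & 0xFFFFFFFF
    # Unknown dumps fall through to mapper 0, which the vast
    # majority of Intellivision games use anyway.
    return KNOWN_MAPPERS.get(crc, 0)
```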
  9. Which would be part of the reason for the intv2 format. MAME has a similar problem: ROM sets are matched to the version of the emulator, since the configuration information lives in MAME, not the ROM. So certain versions of MAME only work with specific dumps of the ROMs, named a certain way. MAME even goes so far as to include the CRCs of working ROMs so it can quickly catch dumps from older ROM sets and reject them instead of trying to load them. For Intellivision emulators that don’t use .cfg files, I would suspect a similar arrangement where the file name of the ROM gets looked up in a table for the appropriate configuration (sketched below), since the configuration is likely simpler than what MAME has to deal with.
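Something like this, speculatively, in Python. Everything here, the file names and the table included, is hypothetical; it’s just the shape of the arrangement I’d expect:

```python
from pathlib import Path

# Hypothetical fallback table: ROM file name -> memory-map config.
# A real emulator would key off names and/or CRCs of known dumps.
NAME_TO_CONFIG = {
    "somegame.bin": {"mapper": 0},  # placeholder entry
}

def resolve_config(rom_path: str) -> dict:
    p = Path(rom_path)
    sidecar = p.with_suffix(".cfg")
    if sidecar.exists():
        # A paired .cfg file wins; its format is emulator-specific.
        return {"cfg_text": sidecar.read_text()}
    # Otherwise fall back to a lookup of known dumps by name.
    return NAME_TO_CONFIG.get(p.name.lower(), {"mapper": 0})
```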
  10. A footer is worse than either of the other options, since you have to search through the file for the footer and then make sense of the raw data sitting in front of it. For a footer to be "efficient", you need to be able to stream the whole file into memory in one go, then read the footer, and still be in good shape. The NT Mini's ROM format allows the memory map to be read/configured as the file is being streamed into memory, which has advantages when you don't have a lot of memory to work with, since you don't have to keep a header in memory as a reference while reading the file. That makes the approach a little more memory-efficient on loading than even a header. Early GPS devices used tricks like this (such as writing the R-tree that sorts POIs into regions into the file format itself) to reduce memory use and speed up searching large POI sets from slow flash memory, by letting the GPS "skim" the file while keeping very little state in RAM.

I was talking about a specific format proposed for Intellivision (.ROM) that would have appended the metadata in a footer. I thought I clarified in my post that that's what I was talking about.

It depends a lot on how the ROM chips were set up on the bus. But I'd argue that "matching the original ROM chips" isn't even a good metric, as it gets too far into the weeds. A good archival format for cartridge data is one that is easy to work with and describes the cartridge accurately enough to reproduce it. Interleaved ROM chips (i.e. two ROM chips with 8-bit data buses providing the high and low bytes of a 16-bit data bus) make things more complicated than they need to be. For the sake of emulation, or even hardware reproduction these days, it's easier to pre-interleave the data (see the sketch below). If I wanted to create carts from scratch, it'd likely be easier to get a 16-bit EEPROM anyway. And if I did need to recreate interleaved EEPROMs, that's a simple task to do from a non-interleaved dump, so long as the format is well documented and consistent. In terms of endianness, as with the example above, the format just needs to be well documented and consistent. That's it.
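A minimal sketch of what pre-interleaving means here, assuming the big-endian (high byte first) convention:

```python
def interleave(hi_rom: bytes, lo_rom: bytes) -> bytes:
    """Merge two 8-bit chip dumps (high and low byte of each 16-bit
    word) into a single linear image, high byte first."""
    out = bytearray()
    for hi, lo in zip(hi_rom, lo_rom):
        out += bytes((hi, lo))
    return bytes(out)

def deinterleave(rom: bytes) -> tuple[bytes, bytes]:
    """Split a linear 16-bit image back into the two 8-bit chip
    images, e.g. to burn a pair of interleaved EEPROMs."""
    return rom[0::2], rom[1::2]
```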
  11. In the case of intv2, there are a couple of things that jump out at me:

1) Specifying the binary data as 16-bit little-endian words, even for 10-bit ROMs. I think we got a bit hung up on providing what was thought to be a concise example of where there is potentially no such thing as a "1:1 copy" when dumping ROMs, but it's still interesting in this case. 16-bit words do make emulating the cartridge bus a bit easier in some ways.

2) Embedding what would normally be in a config file for .int/.bin ROMs. This one is rather important. It looks like, outside of .rom files (which embed the metadata that would normally go into the .cfg file), the other formats require this configuration file to specify certain things about the ROM, including how memory is mapped, so specific chunks of the ROM can be assigned to specific addresses. A bit like the NES mapper example we discussed before. But this data is appended to the end of the file and has to be searched for. That isn't a great engineering design, IMO, but it potentially makes it backwards compatible with emulators that only understand .bin ROMs paired with a .cfg file. intv2, on the other hand, uses the format itself to tell you where chunks of ROM data go. So as you read the file, it's telling you where things go, which makes things much easier (a rough sketch of the idea follows). It has performance implications for both dumping and reading ROM files.

EDIT: Another thought that occurs to me: one way to describe this is that raw ROM dumps aren't terribly useful on their own. Much like a RAW file from a DSLR, a dump needs to be processed with additional information to make sense of the raw data, be it information about circuitry that cannot be dumped, memory locations, etc. Some ROM formats are closer to that raw data; MAME, and I guess Intellivision, fall into that category. Other ROM formats bundle in the metadata required. intv2 would fall more into this second category, but so would an NES ROM that contains information about which mapper was used and size data for the CHR and PRG ROMs.
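To illustrate the "the file tells you where things go as you read it" idea, here's a Python sketch of a reader for a made-up chunked layout. This is NOT the actual intv2 spec, just the general shape of a streaming, record-based format:

```python
import struct
from typing import BinaryIO

def load_chunked_rom(f: BinaryIO, memory: dict) -> None:
    # Made-up record layout: a 16-bit load address and a 16-bit word
    # count, followed by that many little-endian 16-bit words.
    while True:
        header = f.read(4)
        if len(header) < 4:
            break  # end of file
        addr, count = struct.unpack("<HH", header)
        words = struct.unpack(f"<{count}H", f.read(count * 2))
        for i, word in enumerate(words):
            # Data lands in place as it streams in; no header or
            # footer needs to be held in memory while loading.
            memory[addr + i] = word
```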
  12. They are ASICs. Chunks of logic. You can’t really dump them, as there’s no data there, just volatile state. When I say they sit between the ROMs and the system, I mean exactly that. The simplest mappers enable bank switching, which allows more data to be stored on the ROMs than the NES can actually address. The game needs to send signals to switch banks when it wants to access different parts of the ROM. I totally forgot about CHR RAM.
  13. Cartridges on these systems aren’t “inert”. For NES, the mapper chips are on the cart, and sit between the ROM chips and the system. Just dumping the ROM itself isn’t going to include the behavior of those chips.
  14. I'd agree it shouldn't come into play, but it does. For simpler systems with an 8-bit data bus, it certainly shouldn't be an issue, since you are reading data in small enough chunks that the dump is effectively a 1:1 copy of the EEPROM contents. The NES, Master System, and SNES, for example.

Things get more complicated as you start dealing with the 16-bit data buses present on the Megadrive or the N64. A backup tool will generally dump the cartridge a word at a time using the cartridge bus, where a word is 2 bytes in these two examples. So when I write it out to storage, should it be LSByte first, or MSByte first? If I just record these values into a buffer and flush the buffer as a series of bytes to storage, then the endianness of the system affects how the data is recorded. The 68k is big-endian, while the MIPS chip in the N64 is configurable (although I don't know which mode Nintendo used, or if it was switchable while running). So if I don't standardize on the byte order in the ROM file itself, then it can't be read properly on the other end without some sort of detection, looking for something that can be treated like a UTF BOM.

And there's an argument to be made for both LSB and MSB when it comes to consoles like the Megadrive. LSB was more common for EEPROM programmers being used on x86 (generally DOS) at the time; MSB is how the console itself sees it. So which is "real"? At least in the case of the Megadrive, the light reading I've done so far suggests that big-endian is the convention, because the original dumps were done on-system. Those dumpers could have used LSB by byte-swapping the words before writing them out, to more closely approximate the raw files written to the ROM chips, but they didn't.

It should be pointed out that smd is interleaved compared to md/bin, due to the dumper that produced it operating in Z80 mode (and likely some addressing issues in Z80 mode). So to load those, you have to know that the high and low bytes of each word are 8K apart from each other in the dump, sliced up into 16K blocks. And which ones are the high bytes, and which are the low bytes. Fun. (A sketch of both operations follows.)
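A Python sketch of both operations. The byte-swap is straightforward; for the smd layout I'm assuming the commonly documented convention (within each 16 KB block, the first 8 KB holds the odd/low bytes and the second 8 KB the even/high bytes), and I'm ignoring the 512-byte header that real .smd files carry, so treat the layout as an assumption:

```python
SMD_BLOCK = 16 * 1024
HALF = SMD_BLOCK // 2

def byteswap(rom: bytes) -> bytes:
    """Swap each 16-bit word between LSByte-first and MSByte-first.
    Assumes an even-length image."""
    out = bytearray(rom)
    out[0::2], out[1::2] = rom[1::2], rom[0::2]
    return bytes(out)

def smd_to_bin(smd: bytes) -> bytes:
    """De-interleave smd-style blocks into a linear big-endian image.
    Assumes a whole number of 16 KB blocks, with the low (odd) bytes
    in the first half of each block and the high (even) bytes in the
    second half."""
    out = bytearray(len(smd))
    for base in range(0, len(smd), SMD_BLOCK):
        out[base + 1:base + SMD_BLOCK:2] = smd[base:base + HALF]
        out[base:base + SMD_BLOCK:2] = smd[base + HALF:base + SMD_BLOCK]
    return bytes(out)
```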
  15. 1280 * 3 = 3840, and 720 * 3 = 2160. You can't integer scale 720p on a 1080p display, but a 4K display should be able to integer scale it just fine. At least on mine, the input lag is the same for 720p and 1080p, so I just run at 720p.