Everything posted by Kaide

  1. Which would be part of the reason for the intv2 format. MAME has a similar problem. ROM sets are matched to the version of the emulator, since the configuration information lives in MAME, not the ROM. So certain versions of MAME only work with specific dumps of the ROMs, named a certain way. MAME even goes so far as including the CRCs of working ROMs to catch dumps from older ROM sets quickly so it can reject them instead of trying to load them. For Intellivision emulators that don’t use .cfg files, I would suspect a similar arrangement where the file name of the ROM gets looked up in a table for the appropriate configuration, since the configuration is likely simpler than what MAME has to deal with.
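To make the lookup idea concrete, here's a minimal sketch of that kind of filename/CRC table, assuming Python; the table entries and function name are made up for illustration, and real emulators will differ in the details:

```python
import zlib

# Hypothetical table keyed by CRC32, standing in for the per-ROM
# configuration an emulator would otherwise read from a .cfg file.
KNOWN_ROMS = {
    0x1B5A9E4C: {"name": "Some Game (1982)", "map": [(0x5000, 0x6FFF)]},  # made-up entry
}

def identify_rom(path):
    """Look a dump up by CRC32 and reject it quickly if it isn't a known-good dump."""
    with open(path, "rb") as f:
        crc = zlib.crc32(f.read()) & 0xFFFFFFFF
    config = KNOWN_ROMS.get(crc)
    if config is None:
        raise ValueError(f"Unrecognized dump (CRC32 {crc:08X}); possibly from an older ROM set")
    return config
```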
  2. Footer is worse than either of the other options, since you have to search through the file for the footer, then make sense of the raw data sitting in front of it. For a footer to be "efficient" you need to be able to stream the whole file into memory in one go, then read the footer, and still be in good shape. The NT Mini's ROM format here allows the memory map to be read/configured as the file is being streamed into memory, which has advantages when you don't have a lot of memory to work with, since you don't have to keep a header in memory as a reference while reading the file. So this approach is a little more memory efficient on loading than even a header. Early GPS devices used tricks like this (such as writing the R-Tree that sorts POIs into regions into the file format itself) to reduce how much memory was used, and to improve speed when searching large POI sets from slow flash memory by letting the GPS "skim" the file and keep very little state in RAM.

I was talking about a specific format proposed for Intellivision (.ROM) that would have appended the metadata in a footer. I thought I clarified in my post that that's what I was talking about here.

It depends a lot on how the ROM chips were set up on the bus. But I'd argue that "matching the original ROM chips" is not even a good metric, as it gets too far into the weeds. A good archival format for cartridge data is one that is easy to work with and accurately describes the cartridge well enough to reproduce it. Interleaved ROM chips (i.e. two ROM chips with 8-bit data buses providing high and low bytes to a 16-bit data bus) make things more complicated than they need to be. For the sake of emulation, or even hardware reproduction these days, it's easier to pre-interleave the data, for example. If I wanted to create carts from scratch, it'd likely be easier to get a 16-bit EEPROM anyway. And if I did need to recreate interleaved EEPROMs, it's a simple task to do that from a non-interleaved dump so long as the format is well documented and consistent (see the sketch below).

In terms of endianness, like the above example, the format just needs to be well documented and consistent. That's it.
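As a sketch of that last point: going between a pre-interleaved 16-bit image and the two 8-bit chip images is trivial as long as the convention is documented. This assumes Python, and MSByte-first word order purely for illustration:

```python
def split_for_interleaved_chips(rom: bytes):
    """Split a combined 16-bit ROM image into the high-byte and low-byte
    EEPROM images, as they'd sit on a 16-bit cartridge bus."""
    # Assumes the dump stores each word MSByte-first; flip the slices if not.
    high = rom[0::2]  # even offsets -> high-byte chip
    low = rom[1::2]   # odd offsets  -> low-byte chip
    return high, low

def combine_interleaved_chips(high: bytes, low: bytes) -> bytes:
    """Inverse operation: interleave two chip dumps back into one flat image."""
    out = bytearray(len(high) + len(low))
    out[0::2] = high
    out[1::2] = low
    return bytes(out)
```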
  3. In the case of intv2, there are a couple of things that jump out at me:

1) Specifying the binary data as 16-bit little-endian words, even for 10-bit ROMs. I think we got a bit hung up on providing what was thought to be a concise example of where there is potentially no such thing as a "1:1 copy" when dumping ROMs, but it's still interesting in this case. 16-bit words do make the emulation of the cartridge bus a bit easier in some ways.

2) Embedding what would normally be in a config file for .int/.bin ROMs.

The second one is rather important, though. It looks like outside of .rom files, which embed metadata that would normally go into the .cfg file, the other formats require this configuration file to specify certain things about the ROM, including how memory is mapped, so certain chunks of the ROM can be assigned to specific addresses. A bit like the NES mapper example we provided before. But this data is appended to the end of the file and has to be searched for. This isn't a great engineering design, IMO, but it potentially makes it backwards compatible with emulators that only understand .bin ROMs paired with a .cfg file. intv2, on the other hand, uses the format itself to tell you where chunks of ROM data go. So as you read the file, it's telling you where things go, making things much easier. It has performance implications for both dumping and reading of ROM files (see the sketch below).

EDIT: Another thought that occurs to me: one way to describe this is that raw ROM dumps aren't terribly useful. Much like a RAW file from a DSLR, a dump needs to be processed with additional information to make sense of the raw data, be it information about circuitry that cannot be dumped, memory locations, etc. Some ROM formats are closer to that raw data; MAME, and I guess Intellivision, fall into that category. Other ROM formats bundle in the metadata required. intv2 would fall more into this second category, but so would an NES ROM that contains information about which mapper was used, plus size data for the CHR and PRG ROMs.
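To illustrate the streaming idea (and to be clear, the record layout below is hypothetical, not the actual intv2 spec): when each chunk carries its own load address, the loader can place data as it reads, with nothing to seek for at the end of the file:

```python
import struct

def load_streaming_rom(f, memory):
    """Illustrative single-pass loader for a chunked format.
    Hypothetical record layout: u16 load address, u16 word count,
    then that many 16-bit little-endian words."""
    while True:
        header = f.read(4)
        if len(header) < 4:
            break  # end of file
        addr, count = struct.unpack("<HH", header)
        words = struct.unpack(f"<{count}H", f.read(count * 2))
        for i, w in enumerate(words):
            memory[addr + i] = w  # even 10-bit ROM words are stored as 16 bits
```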
  4. They are ASICs. Chunks of logic. You can’t really dump them, as there’s no data there, just volatile state. When I say they sit between the ROMs and the system, I mean exactly that. The simplest mappers enable bank switching, which allows more data to be stored on the ROMs than the NES can actually address. The game needs to send signals to switch banks when it needs to access different parts of the ROM (see the sketch below). I totally forgot about CHR RAM.
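Here's a rough sketch of the simplest kind of bank switching, modeled loosely on the NES's UxROM boards (from memory, so treat the details as approximate): the game writes into ROM address space to pick which 16KB bank appears at $8000:

```python
class UxROMStyleMapper:
    """Simplified bank-switching logic: $8000-$BFFF is a switchable 16KB
    window into PRG ROM, $C000-$FFFF is fixed to the last bank."""
    BANK_SIZE = 0x4000  # 16KB

    def __init__(self, prg_rom: bytes):
        self.prg = prg_rom
        self.bank = 0
        self.last_bank = len(prg_rom) // self.BANK_SIZE - 1

    def cpu_write(self, addr: int, value: int):
        if addr >= 0x8000:            # the game "writes to ROM" to switch banks
            self.bank = value % (self.last_bank + 1)

    def cpu_read(self, addr: int) -> int:
        if 0x8000 <= addr <= 0xBFFF:  # switchable window
            return self.prg[self.bank * self.BANK_SIZE + (addr - 0x8000)]
        if addr >= 0xC000:            # fixed window into the last bank
            return self.prg[self.last_bank * self.BANK_SIZE + (addr - 0xC000)]
        raise ValueError("not a cartridge ROM address")
```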
  5. Cartridges on these systems aren’t “inert”. For NES, the mapper chips are on the cart, and sit between the ROM chips and the system. Just dumping the ROM itself isn’t going to include the behavior of those chips.
  6. I'd agree it shouldn't come into play, but it does.

For simpler systems with an 8-bit data bus, it certainly shouldn't be an issue, since you are reading data in small enough chunks that the dump is effectively a 1:1 copy of the EEPROM contents. NES, Master System, and SNES, for example. Things get more complicated as you start dealing with the 16-bit data buses present on the Megadrive or the N64. A backup tool will generally dump the cartridge a word at a time over the cartridge bus, where a word is 2 bytes in these two examples. So when I write it out to storage, should it be LSByte first, or MSByte first? If I just record these values into a buffer and flush the buffer as a series of bytes to storage, then the endianness of the system affects how the data is recorded. The 68k is big-endian, while the MIPS chip in the N64 is configurable (although I don't know which mode Nintendo used, or if it was switchable while running). So if I don't standardize on the byte order in the ROM file itself, it can't be read properly on the other end without some sort of detection looking for something that can be treated similarly to a UTF BOM.

And there's an argument to be made for both LSB and MSB when it comes to consoles like the Megadrive. LSB is more common for the EEPROM programmers being used on x86 (DOS, generally) at the time. MSB is how the console itself sees it. So which is "real"? At least in the case of the Megadrive, the light reading I've done so far seems to suggest that big-endian is the convention, because the original dumps were done on-system. Those dumpers could have used LSB by byte-swapping words before writing them out, to more closely approximate what the raw files written to the ROMs would look like, but they didn't.

It should be pointed out that smd is interleaved compared to md/bin, due to the dumper that produced it operating in Z80 mode (and likely some addressing issues in Z80 mode). So to load those, you have to know that the high and low bytes for each word are 8K apart from each other in the dump, sliced up into 16K blocks. And which ones are the high byte, and which are the low byte. Fun. (See the sketch below.)
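A sketch of what that means when loading smd dumps, in Python; I believe the first 8K of each 16K block holds the odd-address bytes, but swap the two slices if I have that backwards:

```python
SMD_BLOCK = 16 * 1024  # smd data is sliced into 16K blocks
HALF = SMD_BLOCK // 2  # high and low bytes sit 8K apart within each block

def smd_to_bin(smd_data: bytes) -> bytes:
    """De-interleave smd-format data (512-byte header already stripped,
    full 16K blocks assumed) into a flat big-endian image as the 68k sees it."""
    out = bytearray(len(smd_data))
    for start in range(0, len(smd_data), SMD_BLOCK):
        block = smd_data[start:start + SMD_BLOCK]
        # First 8K of the block -> odd offsets, second 8K -> even offsets
        out[start + 1:start + SMD_BLOCK:2] = block[:HALF]
        out[start:start + SMD_BLOCK:2] = block[HALF:]
    return bytes(out)
```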
  7. 1280 * 3 = 3840, and 720 * 3 = 2160. You can't do it with a 1080p display, but a 4K display should be able to integer scale 720p just fine. At least on mine, the input lag is the same for 720p and 1080p, so I just run at 720p.
  8. At least the last time I had to deal with customs, FedEx is the middle man if they are handling shipping. To import the package, FedEx deals with customs at the port of entry, and then reaches out to the receiver if customs imposes tariffs. But yes, 50% tariffs seem extreme. I wonder how much of this might be mistakes/confusion related to the changes in the UK import scheme due to leaving the EU? The changeover is due at the end of the year, so I wonder if the training for the customs agents is getting messed up somewhere.
  9. When I placed my pre-order, my understanding was that the “gunmetal finish” they advertised was going to be “gunmetal gray”. Was that not the understanding others had? Maybe “Noir” was a bad pick for naming?
  10. And looks like it’s gone already.
  11. And VGS was shut down when Sony bought the tech from Connectix after Sony lost in court. AFAICT, the PS1 emulation used for the PSP, Vita and PS3/PS4 is based on VGS. Go figure. The evidence is somewhat circumstantial, but it's based on the fact that the PSP version of the emulator in particular turned out to have very similar compatibility bugs, and even similar toggles (quirks modes) for improving compatibility on a game-by-game basis.

Generally, the issues of copyright in this space aren't well covered in the courts. At least in the US, format shifting isn't really a right, and hasn't been defined as "fair use". Even your ability to make personal copies of a copyrighted work is a legal gray area, mostly carved out because companies aren't eager to do the work to set the legal precedent, and there's no legal framework for or against it in the US. So the risk is that you get to be the guinea pig in a particular legal case, with no real insight into what the law says you can do here. And yes, you are right that the resources Nintendo, Sony or Microsoft can spend are formidable. (EDIT: And in the case of Bleem and VGS, they play original copies of games. VGS in particular would check to see if the media was a CD-ROM or CD-R, and only accept the CD-ROM. VGS didn't support ISOs either.)

That said, companies have been more willing to go after those doing distribution of copies: ROM sites in the right legal jurisdiction, for example. And it makes sense, since it's the best use of resources, the best bang for the buck, if your goal is to minimize piracy. This approach has worked well for the music industry, since it has let music evolve from CD to MP3 to Spotify. But that isn't to say every industry will play this way and sit back as people format shift content. Movies are still playing cat and mouse with their DRM schemes, along with eBooks and video games.

When it comes to devices that can play ROMs, how do you get the ROMs? If you don't dump them yourself, that gray area of format shifting doesn't even apply, and you are receiving a copy that wasn't permitted/licensed to be made. Nintendo has gone after flash cart makers in the past, with varying degrees of success, mostly focused on current systems. Honestly, it seems like flash carts for older systems really only get left alone because companies like Nintendo don't see the point in going after flash carts for the NES when they can go after the ROM sites they can reach, and leave it at that.

So it boils down to how close to the line you want to play. Only you don't know exactly where the line is, because there's no written copy of the rules, and you only find out when someone like Nintendo slaps you with a C&D. I don't blame Analogue for being cautious here.
  12. We can only hope. I missed the first batch of pre-orders, and was starting to forget that the Pocket was even a thing.
  13. Super NES was always more popular than the Genesis/MegaDrive. This is likely more that they keep selling through the Super NT batches as they get them, versus the Mega SG supply has already caught up to demand.
  14. Ugh, so I knew this would probably happen, but didn't bother to set an alarm, so I was busy messing with something else and getting started with work while they sold out. Oh well. I refuse to pay scalper pricing. I either get it at MSRP or I don't get it. 🤷‍♀️

I got an e-mail a while back about pre-orders starting today at 8am. I think that was the "notification". A notification this morning would have been worthless for a lot of people.
  15. If it does use HDMI alt mode, you’d just need a passive USB-C to HDMI cable. But yes, since we don’t know how it is spitting out the signal, we should assume the dock is going to be the best way to do it.
  16. Depends on what USB-C mode is used between the Pocket and the dock.

HDMI alt mode is one option. Pros: USB 2.0 support for a dock-side hub that only supports HID inputs is plenty, and if the Pocket is 1080p through the dock, the limit of HDMI 1.4b isn’t a big deal. Cons: it uses the PD pins, so charging will be limited to 5V/2.1A (12W), which may be fine.

DisplayPort alt mode is another. Pros: USB 3, PD and 1080p are all supportable at the same time. Cons: the dock needs to have a DisplayPort to HDMI adapter built into it.

Anything else is probably overkill, to be honest. But the fact that it’s DAC-compatible makes me suspect HDMI alt mode is more likely here? My understanding, though, is that the smaller FPGA that isn’t available to devs is also the one responsible for output to the screen or dock, and for receiving input, so I doubt that developers would be able to override how the USB-C port works.

The Switch uses something called MyDP, which is somewhat like DisplayPort alt mode but pre-dates USB-C, making it incompatible with the newer alt modes. It requires a special chip in the dock to pull apart the DisplayPort, audio and USB data signals that have been multiplexed over the USB cable. A little surprising, but perhaps support for it was already baked into the Tegra platform and they just used it as-is. PD should still be supported in this setup as well (since it doesn’t need the PD pins).
  17. It isn’t in the dock. It’s in the Pocket.
  18. This is the linchpin of your argument, and honestly, I don't believe it holds. The hardware is not the secret sauce. Like Apple, they want to sell you the whole widget, and also like Apple, the hardware itself is well-designed commodity hardware. The most proprietary thing in Analogue's hardware is the custom cart connector, if they use one. The firmware is the golden goose in this case. The work put into the cores is why folks here respect kevtris' work, and why the widget is valuable in the first place.

They could still enable contributions under a license that protects themselves against cloning, sure, but that would likely discourage MiSTer contributors, as it wouldn't truly be open in the same sense that MiSTer is. And cloners won't care what the license is. The Super NT / Mega SG in particular are built as complete systems; I suspect they aren't componentized well enough to let Analogue open just the emulation core while leaving everything else closed.

Note that what Apple open sourced is effectively the kernel and supporting bits, along with the BSD environment they really should be providing source for anyway, because it's customized code from the OSS community. The OS as a whole is not open, and the license forbids running macOS on non-Apple hardware. Apple themselves know their golden goose is the OS and platform they provide, not the nice, yet pricey, hardware.

Really, the Pocket's approach is probably what Analogue's idea of an open FPGA platform looks like. They could probably open the cores up to the community that way without giving away the whole game, but who knows if they will.
  19. So, Sonic & Knuckles itself has no memory. The FeRAM in Sonic 3 is large enough that there's separate space in it for both Sonic 3 saves and S3&K saves. I think having the S&K cart locked on modifies the memory offset used for the saves to make this work, rather than doing any sort of memory re-mapping tricks in the S&K cart. It might be possible to just dump the RAM of the Sonic 3 cart itself and use it for both ROMs. Again, I'm curious; I'll have to play around a bit with my copy this evening.
  20. Now I'm curious if my Retrode can read the save memory on Sonic 3. That said, the save memory on the Sonic 3 cart doesn't use a battery. The save memory chip is FeRAM. It is non-volatile (to a point), but the chip can wear out, requiring a replacement. I had to replace the chip in my copy a while back, and it's not fun getting salvaged or new old stock of the original chip. Some folks have created adapters for using newer FeRAM chips still manufactured though.
  21. To put it bluntly, because of things like vertical integration and Amazon's volume getting them deals with the carriers they work with, Analogue will never be able to offer shipping as cheap as Amazon.

For Amazon, their game is really fungible goods. You don't care how you get your copy of Sakura Wars, since the copies are all equivalent, for the most part. You care that you get a copy of Sakura Wars. So it makes a lot of sense for Amazon to play games with margins and integrate vertically to cut costs, and then convert shipping into a hidden cost baked into the MSRP when possible. It lets Amazon reap extra profits from customers that are cheaper to ship to.

Analogue isn't a retailer in the sense that Amazon is, doesn't deal with the volume, and doesn't own its own logistics chain. They could roll the logistics costs into the advertised price like Amazon does, but I guess for some reason they like having the advertised price be lower. It is an old-school way of doing things. The Super NT and Mega SG are both "under $200". Roll in the costs (at least for US customers), and they can't claim that anymore, but they would get fatter margins from folks that are cheaper to ship to in the US. I can't really say whether they should or not; both are trade-offs. Would their sales drop at "$219" instead of "$189"? No idea. But the fact that they tend to run lean on stock in general suggests that they probably aren't hurting too much for a boutique company doing it the way they do.

Another option is that they could use Amazon as the logistics company, but it looks like they'd have to bump the price up to around $230 anyway just to cover the FBA fees (ignoring the costs of returns for a moment). So that's not really any better than using their existing logistics company.
  22. It’s more a comparison: just because a rejigger of a circuit design is being done by the same folks that designed the thing originally, or who have access to the original documentation and/or circuit designs, doesn’t mean it’s going to be functionally identical. The key difference between an “official clone” and a “3rd party” clone is mostly how much reverse engineering you have to do, and thus the risk of introducing bugs to the circuit through that work.

Yeah, this is where I (as an engineer) tend to clash with folks in marketing. From an engineering perspective, kevtris is cloning hardware. But on the marketing side, I’d agree that it’s hard to market something like the Super NT as a “clone” because of bootleggers and emulation boxes like the Hyperkin stuff. Really, the difference is that Analogue is selling a boutique product rather than trying to produce it for the cheapest price possible, and has kevtris doing the engineering work, which helps a lot.
  23. I notice that you ignore the rest of my post here. But to put it bluntly, an FPGA is no more a simulation of an electronic circuit because the circuit links are re-routable than an EV is a simulation of a car because it uses batteries as the power plant.
  24. What makes a Z80 real? Is it because it has Z80 stamped on the package? Is it because it uses Zilog’s circuit design? Is it because it is an ASIC? Is it because it is a circuit that implements the Z80 architecture as defined by some dead-tree document that specifies how the Z80 works? Would custom Z80 designs like Nintendo’s variations used in the Game Boy line count as a real Z80 or not? Is it real if I can drop it into an existing circuit without the rest of the circuit knowing that it’s been swapped out?

For a more philosophical question: what’s the difference between someone who got the HDL for a Zilog Z80 and put it on an FPGA, versus someone who reverse engineered the Zilog Z80 into HDL and then had an ASIC made at a factory? What’s the difference between using that Zilog Z80 HDL to make an ASIC vs putting it on an FPGA? How far do we want to take this particular Ship of Theseus?

But it’s not like an FPGA is running a software program. The programming is just the configuration of signal routing, and is effectively fixed once programmed, until you program it again. It’s more akin to EEPROM (FPGA) vs ROM (ASIC). So once you’ve loaded in the configuration that recreates a circuit diagram, it’s going to have a recreation of that circuit until told to reconfigure.

I will point out that compatibility issues aren’t a great argument for calling something an emulator or not. AMD CPUs are not considered x86 emulators, despite running into occasional compatibility issues with software. Intel CPUs are not considered x64 emulators, despite the spec being made by AMD. Sony shoved PS1 hardware into the PS2 and yet couldn’t maintain perfect compatibility with PS1 games. Nintendo had compatibility issues with the 1-Chip SNES. Just tweaking circuits using existing designs can introduce compatibility issues, so a reverse engineered circuit is going to be at least as prone to issues.

As for the CRAM dots, I’m like other folks who’ve commented: I don’t really go looking for them, or care to have them enabled.
  25. Agree, virtual machines don’t really make sense in an FPGA context. Although there is some code running on the ARM SoC, IIRC. But I’m not really sure VMs make sense there either for these products, to be honest.

That said, I’m going to do something dumb and wade into the emulator vs clone debate. If we think a clone is an attempt to recreate the hardware, and that emulation is the act of creating something that imitates the original hardware on some other hardware, FPGAs are in a weird gray area. I can understand the debate, honestly. On one hand, what kevtris does is the same thing that someone cloning the hardware would do: reverse engineer the original by analyzing it and creating a new description using HDL or a circuit diagram. Someone writing an emulator is not seeking to create HDL, but rather software that can run on some foreign CPU. But the FPGA is effectively “other hardware” that runs the clone, so what makes the Super NT different from BSNES?

For my part, I think the distinction doesn’t matter as much as the quality of the final result, but the distinction should probably be about what the output of the process is. You could take kevtris’ cores and, with tweaks, produce ASICs from them and build clone hardware similar to the Super NT. You cannot do that with BSNES.