I don't think software emulation is any better until you get to hardware that is off-the-shelf, because off-the-shelf hardware is well documented. Basically, anything that has a BIOS and an operating system becomes a much easier target for dynamic recompilation efforts.
I'm not sure I agree completely. Both FPGA and SE accuracy are simply going to be as accurate as the information used to create their cores/routines. Custom or non-custom parts are going to translate equally well or badly. FPGA doesn't magically increase accuracy on its own, and Software Emulation doesn't suddenly become better when you have a BIOS, an OS, or data sheets for individual TTL chips.
One style may be easier to work with in some instances. I'll agree there.
When I said, "It isn't 'locked' to one chip, one design, one product," I was referring to FPGA cores. FPGA cores written for one FPGA chip need to be re-done for another chip, and yet again for a different PCB design. It may be a rewrite, or it may be a recompile with different options. FPGA doesn't migrate well. Software Emulation that works on a Pentium M will also work on a Core i9. And in the far future when there are Core i23 chips, there will be, as there are now, virtualization tools and environments to run older software, including emulators. We already see this trend. SE can operate on a huge range of host chips and umpteen trillion motherboard designs. With FPGA it's one mainboard, one layout, one product, one chip. That's what I was getting at.
Another advantage of software emulation is handling the complexity of the system being emulated. FPGA is up against hard limits because of the number of logic elements available. While FPGA chips do increase in size and cost-effectiveness, they are doing so at too slow a pace to encourage migration or to keep up with the rapid pace of adding new features. Developers apparently lose interest in their projects too easily, before they can be improved enough.
SE does not have that limitation, and MAME has demonstrated it. As new knowledge becomes available, a system/driver/core/module can be expanded without fear of running out of room. For all practical purposes, the host's memory is unlimited.
I've watched the Stella emulator grow from 0.8MB in 1999 to 2.6MB in 2018. All versions work on everything from my old-ass Pentium-M through my i7. Pretty much the same deal with Altirra: from 0.3MB in 2009 to 4.6MB in 2018. MAME has an even bigger size increase, 1.4MB in 1999 to 193MB in 2018. And none of these strain the host's memory.
Some folks say that FPGA has the advantage of parallelism. Maybe. But consider a modern-day i5 or i7. You've got 4 cores running at 3.5+ GHz and extraordinary context switching at your disposal. You'll be running emulators just fine, with power to spare.
Other folks say that FPGA feels like real hardware. It probably does. I say it's psychological, because you're looking at a separate box away from the PC. The "aura-of-stink" that hangs around PC-ness is a huge downer for some. That sort of miasma doesn't exist on FPGA boxes - because they want to believe.
And then there's the common misconception that FPGA is an exact hardware recreation. Many believe someone analyzed XYZ board and duplicated its every transistor in the FPGA. Nope, not quite. Functional blocks are simulated to the best available knowledge.