I did measurements with my real Geneve side-by-side with MAME on my PC, and I am getting matching values for instruction timing and memory access within less than one percent. Believe it or not.
(To be precise: It is the sum of timings that matches; it is not every single instruction per se. MAME just promises to execute an equal number of instructions between macroscopic timing events as the real iron.)
To be fair, I did a /weekend-long/ test of MESS, JS99er, Classic99, and two hardware consoles (F18A and 9918A), all running the same software in the same field of view (so I could take one photo and read all the cycle counts at the same time). I figured that duration was long enough to average out microscopic timing jitter in the emulators and the slight (<1s) variation in startup time. I was trying to validate vertical interrupt timing but tracked CPU cycles as well (this was while we were working on the Megademo, back in 2016).
Per the datasheets for the 9900 and 9918A, the CPU clock tolerance is +/- 5% (a large range!), while the VDP's is +/- 0.05% (that accuracy was necessary for NTSC video generation).
For the test, I marked my 9918A console as the benchmark, so its CPU and VDP baselines were considered 100%. For the record, the VDP output was 59.917 fps.
The F18A machine ran the CPU at 100.02% (well within limits) and the VDP at 59.526 fps. Matt recently discovered that this variation was the cause of some of the ColecoVision issues and fixed it!
MESS was by far the closest emulator (and this is why I quoted you), running the CPU at 99.99% (so closer to the first machine than the second machine was!) and the VDP at 59.924 fps (a 0.01% difference - within spec!).
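For anyone who wants to redo the arithmetic, here's a minimal sketch of the spec check I'm describing, using my measured numbers. The helper name and the script itself are just illustration, not part of the original test setup:

```python
BASELINE_FPS = 59.917   # my 9918A console, treated as the 100% benchmark
VDP_SPEC_PCT = 0.05     # +/- tolerance in percent, per the 9918A datasheet

def pct_deviation(measured, baseline):
    """Percent deviation of a measured frame rate from the baseline."""
    return (measured - baseline) / baseline * 100.0

# Measured VDP rates from the weekend test
for name, fps in [("F18A", 59.526), ("MESS", 59.924)]:
    dev = pct_deviation(fps, BASELINE_FPS)
    verdict = "within" if abs(dev) <= VDP_SPEC_PCT else "outside"
    print(f"{name}: {dev:+.3f}% ({verdict} VDP spec)")
```

Run it and you'll see the F18A's 59.526 fps comes out around -0.65%, well outside the 0.05% VDP tolerance, while MESS's 59.924 fps is about +0.01%, inside it.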
Classic99 and JS99er were the outliers back then (both ran the CPU and VDP fast, though within 5%), but I've done a lot of work on Classic99 since 2016, and JS99er has seen a number of revisions. MESS has also since become MAME. It might be fair to run another weekend test.
Edited by Tursi, Tue Jan 8, 2019 3:33 PM.