
Precisely what is wrong with VCS paddle emulation ..


Keatah


THE PROBLEM

Breakout breaks down, Kaboom! goes BOOM!, Video Olympics becomes the Special Olympics, Circus Atari is a real circus, and the WarLords battle themselves like drunken soldiers. What the fuck is wrong with Paddle emulation??

 

Let us examine the problem in depth. We must also remain cognizant of the style of controller we are using. We grew up and learned to play Paddle games (not that whacking-your-sister-in-the-ass game) with a real spinning knob, a real rotary knob. Making the jump to the straight-line motion dictated by a mouse is certainly going to be different. Yes. Though it's not as if we're playing Tempest with a keyboard or a flight simulator with a trackball. If done correctly the differences will be minimal and we can transition over nicely. Which raises the question of purity. How pure of an emulation do you want? How pure of an original experience do you want? How much in-between?

 

If we stay with pure emulation, that means using PC hardware and only PC hardware to recreate the classic gaming experience. This means keyboard and mouse, box and monitor. ROMs are to be loaded from your 12 terabyte hard disk, and you get upset when your i7 shows more than 2% usage while jamming Slot Racers or Math Grand Prix. Perhaps you might use a PC-style joystick. That's all you get. It's up to the programmer to re-create the Classic Gaming Experience using just that hardware. Nothing else. It's up to the programmer to build the best possible emulator. The bit of software that makes your system think it's something else. The only "original" component allowed here is the ROM code, or Game Program as I like to call it. And it must be loaded from some form of PC-style storage, like that 12TB HDD!

 

If we go with the 100% original experience, that will mean no emulation. This means effectively playing on a real VCS with a real glass non-flat CRT tuned not-quite-accurately to channel 3, and a real Atari-factory-made Paddle or Joystick controller. No Harmony cartridge either - only real cartridges with original masked ROM chips, or the latter-day variety that used fusible links or diodes, also known as PROMs. It also requires a shag-carpet floor, harvest-gold appliances in the kitchen, and wood paneling in the basement, with 80's music playing on cassette or vinyl LP. That, my friends, is purity!

 

And there's the hybrid in-between experience. This usually involves a PC somehow, either for running emulation or for stuffing a Harmony cart to be used in a real VCS connected to your computer monitor via a video mod. You might have built in a pause circuit too. Perhaps it could be Stella running on a PC connected to a CRT monitor. Or maybe running with a modern-day flatscreen and turning on Blargg effects. You might be using any number of controllers - flightsticks, gamepads - through USB and Bliss-Boxes, the old-skool SoundBlaster analog port, or perhaps a Stelladaptor with real VCS controllers. Shit, you might even have built an FPGA interface to a real TIA or RIOT and re-wired the 6 control switches to the F1-F12 keys somehow. Maybe you've even built your emulator into a real arcade console! Whatever the mix, it's sure to consist of the old and new.

 

Tonight we will be concerning ourselves with 100% pure emulation and making sure we get the best possible experience with PC-only hardware.

 

Having said that, know there are subtle, but very real, tracking errors in Mouse-Paddle emulation in Stella. This is disruptive to a positive experience when playing fast-action Paddle games. There are FOUR distinct issues surrounding Paddle controller games we need to explore in response to gamers' statements citing "The emulation of the Paddle isn't right!" On the surface this manifests itself as lag time, when in reality it is much more. In fact we will soon see that lag time is not significant at all!

 

 

BASIC TEST METHODOLOGY

Let's take it from the top and set some things straight. We will burn away the irrelevancies until the core failure modes become crystal clear. In visiting the phenomenon of lag it is important to know where it comes from. The lag in my particular monitor and rig is 8.6ms. This is not some figure I pulled out of my ass or came up with by reading a specification sheet that mentioned the monitor's refresh rate, but by actual measurement with an oscilloscope and timer/trigger. It corresponds nicely with the 125Hz standard USB polling rate of a default Windows XP installation with standard Microsoft drivers. How? What? Let us take it apart and see what we can see. Be aware I'm describing a worst case for anyone using a Windows XP based system.

 

Consider this measurement. It begins with the physical mouse movement, goes through the digital camera in the mouse's optical pickup, through the USB cable, to the USB chip, to the ICH southbridge, to the northbridge & memory controller, then CPU & memory, then graphics chip, through the DAC, out the VGA cable, to the monitor's ADC, internal conversion & scaling circuits, and then to the pixel array block, and it stops with the complete flipping of the LCD crystals. Your mileage may be different and your signal route may take slightly different steps. But the map is the same. Basically it's the whole path from Mouse to Pointer. This test takes into account all the interrupt delays, and USB polling delays, and software processing delays. You get the point - it measures the whole chain of events, beginning with the time you move the mouse till the time the cursor moves onscreen.

 

Let me comment that the monitor typically adds in another 1ms. My test monitor is spec'd for 8ms response time, but that is a complete change from black to white or vice versa. Partial changes may take longer to complete. Make your LCD monitor go from RGB 0,0,0 to RGB 255,255,255 and it's gonna snap almost instantly. That's the factory spec, here, of 8ms. Going from RGB 120,120,120 to RGB 150,150,150 is going to take a lot longer. This is because the voltage difference isn't yanking the wanker as hard. It's a gentle change and the pixels tend to drift over to the new value. Some monitors will yank the pixels real hard in one direction and then, upon arrival at the new color, stop and apply the correct voltage to maintain the new color.

 

The photodetector I am using to sense a pixel change is looking at any change, so it will consider the "job done" somewhat ahead of time. More like 1ms! It detects the pixel's intensity change before it's halfway through the complete flip. But that is good enough for our testing. If not excellent! It also simulates results as if we had a $4,000 state-of-the-art LCD display. Grand!

 

On slow-response monitors you can observe the entire 8ms flip time as an ever-so-slight dimming of the edges of a moving object. This is the nature of LCD displays. This is not really relevant here. Nor is the time the monitor spends processing and flipping pixels. These little delays are not cumulative, they're not significant, and they don't add up to anything affecting the operation of the Paddle controllers or how the onscreen game elements interact with your perception. If it helps to know, I tried the same suite of tests on an analog RGB monitor with no processing overhead and came up with similar results.

 

 

THE PERFORMANCE NUMBERS, WHAT THEY MEAN AND WHERE THEY COME FROM

Windows' specification has the mouse operating at a 125Hz rate, or an 8ms delay between mouse movements and pointer activity. Check.

Software watching the USBPORT.SYS polling activity sees a PEAK performance as fast as 130Hz & 7.65ms delay. Check.

The same software reports an Average time of 117Hz & 8.5ms. Check.

My test rig hardware reports an Average time of 104Hz & 9.6ms. And Check..?? Yes..

 

So I say all three sources - the mfg specification, the hardware test, and the software test - are in agreement regarding how long it takes to process the mouse movements. The 1.1ms difference between what is reported by my hardware test rig and the software also takes into account the monitor delay, including processing time onboard the monitor and pixel flipping time. These are highly repeatable tests.

 

Let us be generous here and toss in 2ms of extra time so as to account for other system activities and disk access and shit like that, maybe some stretching by the monitor scaler. So as a raw baseline I've got 10ms of inherent delay from the time the mouse is moved till the time something onscreen happens. That's what this specific system can offer to any program making use of the mouse as an input device. 10ms! That's what the host system can advertise. And we are erring big-time on the slow side. I believe 8ms is consistently being achieved. But 10ms sounds good and accounts for minor system activities and housekeeping distractions..
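If you want to sanity-check that figure yourself, the budget is nothing more than the numbers above added together. Here's the arithmetic as a throwaway C++ snippet (the values are my measurements from this rig, not anything pulled from Stella):

    #include <cstdio>

    int main() {
        // Figures from the measurements described above -- not Stella internals.
        const double pollPeriodMs = 1000.0 / 125.0; // default XP USB mouse polling: 8ms
        const double endToEndMs   = 8.6;            // measured mouse-to-pixel, monitor included
        const double slackMs      = 2.0;            // generous allowance for OS housekeeping
        std::printf("poll period   : %.1f ms\n", pollPeriodMs);
        std::printf("host baseline : %.1f ms mouse-to-pixel\n", endToEndMs + slackMs); // ~10ms
        // Bumping the polling rate to 1000Hz would shave roughly 7ms off the front of that chain.
        return 0;
    }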

 

Be aware that I didn't overclock the mouse, nor is it a special m0dDerZ BoISe "gaming" device festooned with 50 buttons and 6,000 dpi laser sensors. It's a shit-ass Microsoft business economy mouse. I believe that other systems, more powerful or even older than this test system, will have similar performance because this is all low level and standard across the board for all Windows systems. This 10ms delay is based on bus counters within the system. There is nothing special about my system. This timing is fixed in hardware. It is known as the USB mouse polling rate. This rate, however, can be changed by editing the USBPORT.SYS file. I can set my mouse polling rate to 1ms and 1,000Hz and get even better performance. In fact, with the higher polling rate, we can bring the system's pointer response time way, way below what the monitor can ever achieve. But for these tests it's not beneficial or required to do so.

By the way, I'm running the monitor at a 60Hz refresh rate. I could go higher, but this seems to be average. This 60Hz rate is what the image and pixels are refreshed at. It is possible to set the monitor pixel refresh rate and the image refresh rate to be different (or the same). This is similar to what you know as vsync. But with modern LCDs this can become ambiguous, as they may do processing and scaling, and in some designs the whole image may not even be updated - only sections, and at different rates. A pseudo refresh rate, if you will, where the image is presented to you at a certain rate specified by your system, but not at the rate at which the monitor paints the picture. Vsync within vsync, separated by a buffer!

It also bears mentioning that the best of the best gamers can generally feel a lag time of 50ms between the time they do something and the time they note it on the screen. So that effectively eliminates LCD monitor processing time and system processing time from your perception. It's irrelevant here. And please don't spout off shit saying you can tell the difference between 5ms and 35ms ping times when playing online games. You can't. You may see differences when this time becomes cumulative and the "game" has made a number of calls to the server. When several packets have been sent and received and the game has to wait before updating the screen, THEN, and only then, do you complain about a slow internet connection. Many packets going back and forth with a 25ms lag will add up. And they need to add up to about 50-60ms. Then you start whining to daddy about getting a deluxe internet package! Understand that the processor may need to complete many exchanges, and the already-fast 5ms ping times now become multiplied perhaps 10 or 20 times before your screen gets an update. It's this multiplying effect and cumulative delay of those many packets that irritates your fragfest.

 

Well, the advantage here in Paddle emulation is that the cycle is what it is. There are no cumulative delays that build up and then get output all at once. No waiting. For all practical purposes, the lag and delays described above are fixed, and very short. The host SYSTEM won't magically insert 200ms mega-delays unless it's underpowered and has to service other interrupts. Monitor lag and polling rates are not like a network game, where communication lag can build quite quickly. So the point I'm making is that the host hardware is providing a solid, steady timebase from which to work, and that is 10ms!

 

Not a series of ten 5ms lags building up to a 50ms delay, or another series of perhaps sixteen 35ms transmission lags topping out at 560ms! DEADLY! Even the lamest gamer can easily discern 50ms from 560ms. Yeh?

 

I conducted these tests with reasonable accuracy and in the spirit of what I described above. But we are going to make this easier and look at it in a way that eliminates a lot of tech-speak and is easily doable by you, right now. This can be done by everyone with no additional hardware or software. We will test with Kaboom!, and Stella, and your mouse.. Kaboom! is an excellent choice here because it is the best of the Paddle controller games for going down the rabbit-hole of time. It tends to distort time perception and drills you down pretty hard. Milliseconds can feel like minutes during parts of the game. It is in this state that you can best get a feel for the unresponsive crap emulation of the Paddle controllers.

 

Now that the explanations of the sources of system delays are out of the way, we can look into the four things that are wrong with Paddle emulation in Stella. Four things.. Let's take them one by one:

 

 

1- 1:1 PROPORTIONAL POSITION - The position of the game element(s) does not track linearly or reliably. I flicked the mouse back and forth within a clearly defined and specific area, and eventually the game pieces, the buckets, ended up on the left side of the playfield. Consistently. So I made a more accurate test instead of flicking around aimlessly on the desk. I dumped the mousepad and built a new one. I took a styrofoam block and covered it with a black cloth to ensure good tracking. Then at both the left and right ends I set up some goal posts. So this is looking like a mini football field now.

I put the mouse in the center and started the game. I first made sure the buckets were in the center of the screen, more or less, by lining up the left edge of the bucket with the "n" in "Activision" as a reference. I moved the mouse left 3cm, then right by 6cm, then left 3cm, bringing it back to the starting point of 0cm. This should return the buckets to the center position. It did, almost - maybe a pixel off. In the left direction. I repeated the test a little bit faster. And it did it again. This time another pixel off. Cumulatively we're two pixels off-center now. Maybe that's not a big deal. It's getting there!

 

Hmm.. What's going on?? So I did it even faster! How much *is* faster? Level 2 or 3 on Kaboom! is a good indicator. Ahh hah! It was then that I noted, again, that the buckets clearly aren't returning to the exact center, but end up a pixel or two further to the left instead. Moving up to levels 4 and 5 the effect is a little more pronounced. Understand I'm not playing the game for points. I'm just using the levels as references to illustrate the speed at which I am moving the mouse. The rate of error accumulation is dependent on speed, and the error is cumulative. And if you constrain the mouse to an area like your mousepad, then eventually you need to pick the mouse up and recalibrate its position by starting over at the left edge - because that's where the buckets eventually end up if you do a steady rhythmic left-right motion.

 

So if I were to play a game for real, eventually the mouse ends up knocking against the right "goal post" in my test fixture. This demonstrates less sensitivity on left-to-right movements.
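To put a number on how fast that creep builds, here's a toy simulation of the sweep test. It assumes - purely for illustration, I have not seen Stella's code - that each left-to-right reversal swallows one count that the opposite reversal doesn't:

    #include <cstdio>

    int main() {
        int bucketX = 80;          // start roughly mid-playfield (arbitrary counts)
        const int sweep = 30;      // counts per half-sweep of the mouse

        for (int pass = 1; pass <= 20; ++pass) {
            bucketX -= sweep;      // left half-sweep: tracked in full (assumed)
            bucketX += sweep - 1;  // right half-sweep: one count swallowed (assumed)
            std::printf("after pass %2d: bucket at %d\n", pass, bucketX);
        }
        // The mouse returns to its physical start every pass, but after 20 passes
        // the buckets have quietly crept 20 counts to the left.
        return 0;
    }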

 

 

2- ACCELERATION DELAYS - The mouse movements have a start-up and slow-down delay, almost as if the mouse is connected to the on-screen game objects via a rubber band. There is a 53ms delay (in Stella, on my test rig) from the time you move the mouse till the time something happens on-screen. This is the lag caused by the emulator alone. It's 63ms if I include the hardware delays, Windows processing time, and monitor time. It is seemingly non-linear. The sharper the movement from a standstill, the more noticeable the effect. And the buckets do not track exactly. Once you move the mouse, the buckets will accelerate up to tracking speed, and then move with a 1:1 velocity. And then when the mouse stops moving, the buckets will continue to carry on like an object with real mass & momentum that tends to keep moving once in motion. They slow down over a period of time and come to a stop. This acceleration and deceleration period is typically 15ms - 33ms.

 

The problem is, if this momentum is "active and happening" (within the 15 - 33ms) when you do an abrupt direction change, tracking is lost and your movements get thrown into a black hole, not registered. Not registered until the 15-33ms elapses and a new tracking lock comes online. And the game element stops and reverses. Look at it this way - say you're moving the buckets left in a steady-state motion and snap to the right; the initial snapping motion is lost while the buckets brake to a stop and then accelerate toward the right. Tracking is just not absolute. To make matters worse - this phenomenon is without a doubt biased, with left-to-right reversal movements being less accurate and sensitive than right-to-left. This means that moving left, stopping, then moving right is less accurately tracked. Moving right-stop-left is less likely to miss pixels. So the ultimate effect you notice is that the game element creeps to the left! I will explain this and PROVE it in the next section, as well as reveal another error in Paddle emulation.
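Here's one way that rubber-band feel could arise: running the raw mouse deltas through a low-pass filter so the applied movement ramps up and coasts down. I have no idea whether Stella actually does this - the snippet is a guess at the shape of the symptom, with a made-up smoothing factor, not at the real code:

    #include <cstdio>

    int main() {
        // Raw per-poll deltas: steady leftward motion, then an abrupt snap to the right.
        const int raw[] = { -4, -4, -4, -4, +6, +6, +6, 0, 0, 0, 0, 0 };
        const double alpha = 0.3;   // assumed smoothing factor -- the "rubber band"
        double applied = 0.0;

        for (int d : raw) {
            applied = alpha * d + (1.0 - alpha) * applied;
            std::printf("raw %+2d -> applied %+6.2f\n", d, applied);
        }
        // The applied value lags the raw input, keeps coasting after the deltas stop,
        // and the first couple of +6 reversal samples are mostly eaten while the
        // filter is still braking out of the -4 history.
        return 0;
    }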

 

 

3- SLOW MOVEMENTS - I connected the mouse to the minute hand of a clock with a bit of thread. And I put the clock near the left goal post of my test rig, so that as the minute hand moved, it pulled the thread, which dragged the mouse across the mini football field. With the clock at the left-side goal post, mouse in the center, I watched and waited. Eventually the mouse and on-screen bucket moved left, and that is to be expected - slow, yes, but it tracked. I started the minute hand at 4 and ran the test for 20 minutes, stopping it at 8. That gave a nice slow, almost linear movement, and the mouse covered a 3.8cm distance. And the buckets onscreen moved from center to the left side of the screen. Just about perfect.

 

I redid this test on the right side, with the mouse being pulled the same distance of 3.8cm, but to the right this time. And this showed a problem. The clock hand pulled the mouse alright. But the onscreen buckets did not move one iota, zilch, nada, nothing!! They stayed right in the center. Apparently Stella cannot track slow left-to-right movements. It seems the mouse velocity needs to be higher when moving to the right as opposed to the left.

 

Again this shows how the left movements are accurately tracked. You could argue that the right direction movements are being tracked correctly and that there is some hysteresis present so as to ignore some noise. In that case, you are then required to say the left movements are introducing extra counts someplace. It's all relative.. Which one is really correct?
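And here's a purely hypothetical mechanism that would produce exactly this left/right asymmetry: scaling the raw deltas down with a floor-style division. A slow +1 delta rounds to zero while a slow -1 delta rounds to -1, so slow rightward drags vanish, slow leftward drags register, and mixed jitter creeps left. I'm not claiming this is what Stella does - only that the symptom matches:

    #include <cmath>
    #include <cstdio>

    int main() {
        const double sensitivity = 4.0;   // assumed counts-per-paddle-step divisor

        for (int delta = -4; delta <= 4; ++delta) {
            int truncated = static_cast<int>(delta / sensitivity);             // rounds toward zero
            int floored   = static_cast<int>(std::floor(delta / sensitivity)); // rounds toward -infinity
            std::printf("delta %+d -> trunc %+d, floor %+d\n", delta, truncated, floored);
        }
        // With floor-style rounding every small negative delta still yields -1 while
        // every small positive delta yields 0: the slow clock-drag to the right is
        // thrown away and steady left-right jitter picks up a leftward bias.
        return 0;
    }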

 

 

4- GETTING STUCK IN HARD STOPS - This is best described as stuffing the buffer. This is what you do. Position the buckets in the middle. Move them a bit left and right. Works? Good. Now. Move them gently and carefully to the left edge of the screen border. When they stop moving, you stop moving the mouse. Good. Now, move them right. As soon as you move the mouse, the buckets begin moving with you. Great! Just as it should be. Absolutely fantastic! Eh?

 

Now, let's do the test again. Start from the center and test your mouse - is it working? Perfect. Now, move the buckets to the left edge, just like before. But keep moving! Try as hard as you might to scroll them off the screen and into oblivion. Maybe if you try hard enough they'll pop out the side of the monitor! Wouldn't that be cool, if our classic games could escape the confines of the on-screen playfield and spill out of the monitor into the real world? I once had a dream about that, you know. Well.. Keep trying to scroll the buckets left for a moment. Pick your mouse up and do it again if you run out of desktop space. Once or twice is good. The point is to try and bury the buckets and make them get stuck. Then stop. And here's the PROBLEM part! Now try moving them right again. You will notice, quite distinctly I might add, that they do not move immediately. It's as if a buffer has to be emptied or "lost ground" needs to be made up. It's as if they really did scroll off the playfield, way out of sight, and now you have to bring them back from the ether - the area outside the monitor. Once the invisible off-screen buckets come back onto the playfield, we can resume the game.

 

Incidentally, the effect is more noticeable on the left side. But it still occurs on the right side. I'm tired of measurements and tests, so I'll just estimate an imbalance of 60/40. The left side has a bigger buffer that can hold 60 off-screen "clicks" whereas the right side can only do 40. Who gives a fuck?

 

Now, note that on a real VCS the Paddle controllers have a hard stop. They can move from the 7 o'clock position to the 5 o'clock position. And that's it. The position of the buckets is directly proportional and exact to the position of the paddle dial. That's a no-brainer. The game's edges correspond exactly with the Paddle stops. As it should be. If you were to magically increase the sensitivity of the real Paddles by changing the resistor values, you'd see a similar effect. You'd need to "unwind" the "buildup" or recover the lost ground before bucket movement would begin.

 

I don't observe any of these 4 issues as being specific to any one game. It's across the board.

 

 

COMPARING AGAINST WINDOWS AND THE PLAIN POINTER

To be thorough, I repeated these tests with a plain desktop and the white-arrow Windows pointer. I made a Target icon, put it on the desktop, and centered it. A good reference point, wouldn't you say?

 

I moved the mouse seemingly 100 times in either direction, never hitting a goal post, to stay near center. There were no acceleration or tracking errors. I could move the mouse a hundred times, left to right, at any speed, and I could always return the mouse to center - both on the field and onscreen simultaneously. Never did I have to pick the mouse up and reposition anything. I upped it to 200 times and it was still spot on. I upped the speed as much as I could and was whipping it back and forth with great velocity. With some imagination all this activity is akin to a good beat-off, don't you agree? Ahem! And still the pointer stayed absolute and "on track"..

 

Next, I shot a BB at it with an impromptu slingshot and the pointer instantly jumped x-distance away. Still onscreen though. I carefully backtracked the mouse to center, and pointer was exactly at center. I tried this in both directions. Good results. No discrepancies. Windows tracked fine at moderate and super high speeds. AND at high accelerations too. Trivia: By calculations it would have been 147 G's for a fraction of a second based on stroboscopic timing, mass and speed of the mouse and projectile and times to cover the distance. Physics lessons are also beyond the scope of this tech brief.

 

All well and good, I did the clock drag test twice. And 2 hours later came to the conclusion that the pointer in Windows is spot-on. There were no lost movements. Windows is able to track the pointer seemingly absolutely and equally in both directions. Windows tracked fine at near-impossibly slow speeds.

 

And to wrap up my testing, I scrolled the mouse pointer all the way to the edge of the screen. It stopped at the very edge. And as soon as you move the mouse in the other direction the pointer comes right back. There is no "buffer" to empty, no imaginary off-screen pointer that needs to "come back" to the desktop before pointer motion resumes. This works correctly.

 

 

NOTE

I used the default mouse settings and the basic Microsoft Mouse driver. Changing the settings doesn't alter the behavior, it only reduces or increases the magnitude of the 4 issues. And the control-panel changes needed to produce noticeably different behavior within the emulator are so extreme that the desktop itself becomes "not-quite-right". Therefore adjusting things in your desktop control panel is not an option.

 

I also upped the USB polling rate to 1ms & 1,000Hz, thereby providing the smoothest and most frequent updates to the pointer position. There is no change in emulation behavior despite this. All this does is support the fact that the code within the emulator needs to be re-worked. It further strengthens the position that the system hardware is not at fault here.

 

 

SUMMARY

These four idiosyncrasies are key to making players complain that Paddle emulation is not-quite-right. For the involved and serious gamer these are show stoppers right here and now. These four factors best describe that ineffable quality that is lost in emulation.

 

I don't know the exact coding mechanics used in Stella, nor do I care to get involved with them. That is a good thing because I am not biased by what is happening under the hood. It would seem that a little fix'n up is needed to correct the deficiencies described above. Adding adjustments and sliders and tuning options may mask or reduce the problems to a point where they won't be detected, but that's not elegant and is likely to fail other tests or still feel not-quite-right!

 

I have no clue what causes #'s 1, 2, and 3, other than that it might be related to how quickly (and where) the emulator polls Windows. I somehow think that the "coupling" of the desktop mouse pointer to the emulator is not done correctly. Perhaps the emulator needs to get positional information from a different part of Windows? Maybe it needs to get info from the hardware directly. I have no clue; this is up to the programmer.

 

Would a custom mouse calibration routine (similar to joystick calibration routines already used in Windows) be of help here? You know, move your mouse to all four corners of the screen and press a button to continue. This way the emulator will have better awareness of how to interpret mouse movements. What about when the emulator can't get pointer information during a cycle? Does it interpolate a position? Does it do nothing?

 

The emulator also needs to know the limits of movement and perhaps perform a reset of a counter when player objects meet edges - thereby addressing problem #4. This I think can be fixed right away. Without a major re-write of the Paddle emulation routines. I'm not talking about looking at game logic or the 4K Game Program itself. No. I mean the emulator should be endowed with a working knowledge of the range of motion of the Paddle controllers and set hard limits - just like real VCS Paddle controllers, and how they can only go from 7 o'clock to 5 o'clock.
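The edge fix really is that simple: keep a virtual paddle counter and clamp it to the legal range on every update instead of letting it bank invisible off-screen travel. A minimal sketch, with my own made-up names and range, not Stella's:

    #include <algorithm>
    #include <cstdio>

    int main() {
        const int kPaddleMin = 0, kPaddleMax = 255;   // assumed range of the virtual pot
        int paddle = 128;

        // Grind hard into the left stop, then reverse.
        const int deltas[] = { -90, -90, -90, +10 };
        for (int d : deltas) {
            paddle = std::clamp(paddle + d, kPaddleMin, kPaddleMax);  // never bank off-screen travel
            std::printf("delta %+4d -> paddle %3d\n", d, paddle);
        }
        // Because the counter is pinned at 0 instead of drifting to -142, the very
        // first +10 after the reversal moves the on-screen buckets immediately.
        return 0;
    }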

 

It is these four factors that are conspiring to create the illusion of lag here. But if lag is the problem here, like so many think, then I invite someone to write a compensating routine and call it a day. That isn't going to fix the 4 issues described in this tech brief. Just remember that minimal processing is done on the image when it comes from the PC and goes through the monitor circuitry. Yes, an analog CRT will respond quicker but the difference here isn't as much as you'd think. And the little difference is definitely not the source of the problems of playability.

 

 

BUT WAIT THERE IS LAG!

If TV lag and processing *IS* a problem and you're being anal about this - I would invite someone to build a simple circuit with a photodetector, and a transistor, 3 or 4 parts at most. Maybe 10 bucks. Have the emulator flash a white screen a few times and measure the lag. Have the photodetector send a simple blip, a low-high transition, to the soundcard. The emulator would look at the time it took to send a white screen to the time it received a tick back over the sound card mic in port. And adjust itself from there.
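In emulator terms the self-calibration loop could look something like the sketch below. The two hooks are stand-ins for whatever the emulator would actually use to flip a frame and to watch the mic input - they are not real Stella calls, and the stub bodies just fake a 12ms display chain so the sketch runs:

    #include <chrono>
    #include <cstdio>
    #include <thread>

    static void presentWhiteFrame() { /* would push an all-white frame to the display */ }
    static bool micSawBlip() {
        // Stub: pretend the photodetector's blip arrives on the mic input 12ms later.
        std::this_thread::sleep_for(std::chrono::milliseconds(12));
        return true;
    }

    int main() {
        using clock = std::chrono::steady_clock;
        const int flashes = 5;
        double total = 0.0;

        for (int i = 0; i < flashes; ++i) {
            const auto t0 = clock::now();
            presentWhiteFrame();                          // start of the chain
            while (!micSawBlip()) { /* poll the mic-in buffer */ }
            total += std::chrono::duration<double, std::milli>(clock::now() - t0).count();
        }
        std::printf("average display lag: %.1f ms\n", total / flashes);
        return 0;
    }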

 

 

AND HOW ABOUT SOUND?

But before we wrap this up let us address sound briefly. Sound is not the issue here right now, it has nothing to do with the mouse movement and Paddle emulation. No one has really complained about it ever. For now, I believe the adjustments and sample sizes and output rates are good enough, unless someone is actually complaining about sound delays and shit like that.

 

And if you want to get anal about this too, you can build in a lag detector for audio. Connect your microphone input to a real microphone (duhh!) and have the emulator send out a tone, when the tone is detected, count the time and adjust playback lead/lag time accordingly. Thereby giving the illusion of perfectly timed sound, another no brainer. And no special equipment is required other than a $10 microphone. Shit, you can plug in your iPod headphones and stuff them in the speaker cone for the same effect.
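The bookkeeping after such a measurement is trivial. Assuming a tone round-trip figure and the white-flash figure from the previous section (both numbers here are made up for the example, and the mic's own capture path pads the audio figure a bit), the emulator just offsets its sample queue by the difference:

    #include <cstdio>

    int main() {
        const double audioRoundTripMs = 18.0;  // tone out -> heard on mic in (assumed value)
        const double videoLagMs       = 10.0;  // from the white-flash test (assumed value)

        // Positive: schedule sound this far ahead of the matching video frame.
        // Negative: hold the samples back instead.
        const double audioLeadMs = audioRoundTripMs - videoLagMs;
        std::printf("lead audio by %.1f ms\n", audioLeadMs);
        return 0;
    }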

 

Perhaps as time goes on and display peripherals take more time for internal processing it may become de rigueur to have calibration tools such as these. It would be nice to have a monitor report back its lag time to the host system. But I don't believe mfgs have thought of that as being a critical feature.

 

 

EPILOGUE

Much of the essence of VCS gaming is based around the CRT beam and how closely the TIA is coupled to it. Graphics and controlling the electron beam are first and foremost when working with the VCS. Anything else is secondary. Look, programs are structured around how much time they have to work with while the beam is "re-tracing" and blanking. Not only that, of even greater importance is how small and responsive the Game Program code is. There is no room for bloat and useless garbage, every byte counts. It's these characteristics that make programming on the VCS as challenging as playing games on it is fun.

 

If we are to correctly emulate this unit it is important to capture all the nuances and behaviors. Signal propagation times, controller response times, IPC counts, internal delays, instruction execute times. How does analog translate to digital? How does digital simulate analog? TTL logic flow.. You get the point. There's just a ton of things that need to be synchronized. And no detail must be overlooked. It should be said that emulator authors have an understanding equal to that of the original designers when it comes to the machine they're trying to emulate. Top it off with added translational skills. Skills of being able to transplant all the behaviors of one machine into an entirely alien architecture. They have a knowledge beyond us mere gamers. And it is in *this* context that I criticize and complain about downright suck-ass shit-hole Paddle emulation as it now stands, in hopes of making an already fantastically executed emulator project one step better.

 

I have purchased many bits of commercial work over the years. And a lot of it has changed and metamorphosed into something completely alien today. No longer is it an improvement on the original package. No longer is the original function present. Crap is so bloated and backward compatibility shot to hell. Ugh..

 

In contrast - this emulator is what, 17 years, 18 years old? Wow! And its base function is intact. Like many other long-lived SourceForge projects we now take it all for granted and just "expect" it to be there, semi-regularly updated and everything else. It has been (and continues to be) a cornerstone of the Classic Gaming hobby of today. It is but one of many tools that helps bind the community together. Numerous homebrews are developed with it. And many fun times are relived with it. A lot of consideration is made in keeping it backward compatible with old hardware while at the same time adding new features like the Blargg filter set and all the bank-switching schemes. That is mucho appreciated. Even insignificant snot-nose shitbox users like us get to have some say in testing and developing. All these things (and more) combine to make it a shining example of how a project should play out. Commercial developers would do well to study projects like this and model their business around them.

Edited by Keatah

1. You are aware that lots of modern LCDs lag for one or two whole frames? So 8ms is basically nothing if you have such a monitor.

http://en.wikipedia.org/wiki/Display_lag

 

2. Mouse drivers are usually not linear; they have built-in acceleration. So if you move your mouse faster, the cursor will move farther than if you move it more slowly over the same distance. And to calculate the speed, the drivers need some time. So the movement will lag there too.

 

Anyway, your tests outside Stella seem to be done very correctly, so most likely Stella is adding significantly to the problems you reported.


BTW, I should add that this is the way to report a bug (although I don't normally expect anyone to be so in-depth). Popping up in a forum and saying something is buggy, without ever having contacted me about it beforehand or following up on it afterwards, is likely to irritate me (*).

 

(*) This is directed at a 'bug report' mentioned in another thread.


1 - Proportional Position: fast left-right mouse movements aren't exactly left-right; you can twist the wrist a little, and in the end you'll need to pick up the mouse because the final position is too far left or right on your desk. And perhaps being right-handed or left-handed can explain the left or right bias. Perhaps the emulation could detect vertical movements to deal with this rotational problem.

 

Edit --> On second thought, if you tested that much on Windows and always had a centered mouse, that wrist twist may be a poor hypothesis...

Edited by Liduario

@Thomas:

Yeh, I suppose that is the case..about the mouse driver linearity.. There's acceleration on/off options, enhance pointer precision options, speed sliders, linearity and "slow-me-down" controls for precise pointing. I tried variations on them all. The changes usually mask the problem more or less, but never eliminate it. Usually it just aggravates the issue or makes your desktop pointer get all funny and stuff.

 

I read through the link ya'll posted. And that's cool. I always wondered what the "Game" mode did. I never bothered to research it. Well now, at 60Hz, 1 or 2 frames can be almost 30ms of delay. I suppose it depends on when the refresh cycles happen and whether everything is in sync or not. It would seem in LCD-based rigs that we've got to sync in-game action (from the CPU chip proper) with the videocard buffer(s), and then get those going nicely with the monitor or TV set. Lots of potential to get the signal to you on time in a few ms, or significantly late while stuff syncs itself.

 

I've used my little detector thingy and have measured some of these big 240Hz ultra-speed monitors. They fall behind by nearly 100ms. There's a lot of processing going on no doubt, to build and interpolate the frames and do other shit to buff up the image. That's fine for watching re-runs of Mary Hartman, Mary Hartman! But not supertwitch gaming. No sir..


Then you've got the early LCD monitors, the 19"ers that cost $3,500 when flatscreens were the new thing. You know the ones with the Big Bezels! These are slow ass bitches. Not only did their scalers suck, but the pixels were anemic.

 

Now, once in a great while, to help prove I'm undrunk and all ready, I'll play Typhoon 2001 at the high levels. This game is so fast it makes Kaboom! look like Kindercare toss-a-ball tryouts. I need to turn up the refresh to 85Hz and can definitely tell you one way or another if triple buffering is enabled on your card. Well.. I can only push this game up to higher levels there when playing on certain monitors. Some are just too laggy. Or they don't snap the pixels tight enough and smear like watercolors.

 

A lot of classic gaming seems to take place with you tracking a moving object and lining things up ahead of time so that two events or actions converge to blow up a ship. So it is a natural thing to anticipate your next movement. And I think we become accustomed to compensating for lag. It just sort of happens up to a point. If the lag is too long then it becomes an annoyance.

 

How and what you measure for determining your LCD lag

You need to look at two things, the input to the monitor and the output, a flipping of a pixel. The time difference is your lag. Simple. No complex equipment needed.

 

More specifically, this time is measured from the signal at the VGA connector on the monitor to the actual twisting of the pixels on the screen. (I'll do the digital ports soon enough.) Seeing that this is easy analog stuff I figured I'd start here. I would also expect VGA to give more of a worst case because of the added A/D conversion, scaling, and preparation needed to bring an analog signal into the digital domain. Yes?

 

How it all works

We are going to look at two signals, in and out, simple. We will discover how much time passes between the two - that's the lag.

 

First let's get the input signal going. We set up the trigger or "beginning start time". You need to trigger on one of the RGB lines. Or you can do it on the Hsync line. Then all you have to do is look for a ~1 volt blast for the white pixels. That's VGA white. Or look at the Hsync line and trigger when it goes back high. You can get this blast of white pixels by making a screensaver-like thing in PowerPoint that alternates between black and white. Timing is not important here as long as it changes. This alternating black-to-white signal stands out well on any scope with a 200MHz bandwidth or greater. This takes care of looking at the input. Hook it to ch1 of the scope.

 

I like the Hsync signal; it's repeating, and it denotes the beginning of a data stream. Simple, eh?

 

Next I wired together a little photodetector/photodiode. We hook it to some resistors and a transistor and battery. You can get the schematics from those 150-in-one RadioShack kits. When this shit sees light it gets excited and sends a few volts, when it sees black, it sends out 0 volts. Most of the time it sits around and beats off with itself. Hook this to ch2 on the scope.

 

The scope fires on, perhaps, the Hsync signal, if we're wired that way. The clock is ticking, that electron beam is a-flyin'! We're live and recording! Yay! The monitor is digitizing your computer's signal, scaling it, processing it, and mapping it to a 1600x1200 array of pixels - that's 5,760,000 elements (or sub-pixels) being updated 60 times a second. Don't forget each pixel is really 3 sub-pixels combined into one.

 

Now it's time for our photodiode to earn its keep. Put it near the left side, assuming that's where your monitor begins scanning. If you don't know, that's still OK; the measurements will be good enough. Note that I've seen monitors scan all over the fucking place. In incomprehensible patterns too! Some don't even scan at all and just update what's different from frame to frame, like a codec. Now, as soon as the pixel flips enough to trigger the photocell, which is stuffed into a piece of cardboard and pressed against the screen, the "timer" stops. Show's over. Measurement complete.

 

With these two signals going straight to the scope, ch1 and ch2, you set the timebase to what you can comfortably read and subtract the difference between peaks and transitions of each channel.. Simple as that! There is no magic here. And it's highly repeatable. We're making measurements by the thousands every moment! And any errors quickly average out. But for our purposes it's instantaneous.

 

So to recap

The monitor hauls ass @ 1600x1200, 1400x1050, 1024x768, and 800x600 resolutions. These are 1:1 or 1:0.75 scaling factors. It performs best at select resolutions, achieving between 1-3ms. The more the monitor has to scale, e.g. to 0.8, 0.67, 0.6, or perhaps 0.5625, the worse the lag gets.

 

One thing that I noticed on my favorite 1600x1200 monitor, for example, is that the scaling is very simplistic and only works well with select derivatives of the native 1600x1200 resolution. This means that I am stuck with 1600x1200, 1400x1050, 1024x768, 800x600 .. That is, if I want the instantaneous 1ms response I mentioned in my previous post. This is the common 0.75 scaling factor. I have tested this repeatedly, a billion times, again and again. 1600x1200, incidentally, gets the best performance at a breakneck speed of 1.04ms! You can't beat that! The other 3 resolutions come in at 1.2, 1.4, and 2.1 ms respectively. 1152x864 and other 0.75 derivatives are up to 4ms already despite them still being 0.75 factor. And it goes downhill from there.

 

If I ask the monitor to fit a 0.8, 0.67, 0.6, or perhaps 0.5625 ratio image, then all bets are off, with the lag climbing to 26ms and more. And it absolutely rains on the parade (in my head) with a dismal delay of, ((gasp!!)), 51ms!! This is the bastardized, all wrong, 0.46875 aspect 1280x600. Ughh..

 

I didn't worry about timebase correctors or external framebuffers or genlock-type garbage or any of that other shit. Too much work. And I like the simple approach of averaging the time over thousands of readings. This is an experiment anyone can do without spending a ton of money too. Afterall, we humans don't perceive specifications. We perceive things as we see them. I just extended my eyes and ears to extraordinary precision within the time domain. A fantastic little test rig this is. eh?

 

How to determine your entire system lag

Extending the above concept of lag measurement, you can watch a change propagate through your entire system, from mouse to screen.

 

Just connect the 'scope to your mouse button - that's the start time. And when the photodetector sees a change on-screen, that's your stop time. Measure whatever the fuck you like.

 

 

@Liduario:

Excellent point we have there, and thanks for bringing it up! With a clear head now, let me explain a bit further.

 

I should have mentioned that I had done a second variation somewhere in these tests. But my post was getting a bit big and out of hand and the lady was yelling all night! And my thoughts were running out of control! Full-speed-ahead! I'm sorry, I had ELO blasting, playing with the new Blargg effects in Stella, dying of laughter at the whizzing cars in Grand Prix, finishing a Photoshop edit, and guzzling Svedka and Monsters all frakking night. But I thought it out right!

 

I made sure there was no arcing motion in the mouse, as it would be when you are normally playing a game. I did it both ways, regular hand-held motions and then controlled laboratory conditions. The regular way translates into about 2cm of up and down motion intermixed with the intended horizontal movements for game action. It is of course the horizontal movements that interest us and contamination from up and down and twisting of the wrist isn't desirable.

 

So I eliminated both variables in one fell swoop! The twisting and the sliding. Here's how: I taped popsicle sticks to the sides of the mouse. And then used 2 wood rulers, like the ones you used in primary school. You know. I taped these rulers to the desk-cum-workbench in a parallel configuration about 5 inches apart, more or less, and I put the "modified mouse" (anyone wanna do a cartoon?) in between these two parallel rulers. This eliminated all vertical motion AND twisting of the mouse, thereby only sending left-right movements to the system.

 

When I executed the clock-hand test I tied one single bit of thread to the minute hand. About 10cm worth. I don't know. On the mouse I had spot-glued another piece of thread to the bottom part (closest to you) and the top part (farthest from you). I had drunken visions of the mouse being a plastic parachute which I could then hang my army guys from. So this formed a V-shape to which I tied the clock thread. This made sure the mouse didn't rotate or skew off at a tangent either.

 

Besides, the test football field was restrictive in and of itself and helped add consistency to movements to begin with. The popsicle stick rig made absolutely sure we were working in the horizontal domain only. I didn't see any appreciable change in behavior either by hand or by test-fixture-stabilized motions. If you tend to twist a mouse during left-right movements, it "untwists" as you go back to the starting point. This would account for non-linearity in the movements. But it would not account for position creep.

 

I was thinking of suspending the mouse above a belt sander hooked to two reversible switches; then we could see ultimately how fast a mouse can move before it loses track of where it's at. Trigger the scope on a white marking on the belt, and know the distance. Do the math. Then we know how fast the "surface is flying" - or the mouse is moving.


  • 1 month later...

I can see the problem with getting a mouse to work properly with paddle games, but I don't understand why hooking up paddles to a computer wouldn't be pure emulation. If the computer can do everything the VCS can, including having the ability to hook up paddles, then it is pure emulation because it is replacing the console. Using a mouse would be in addition to pure emulation because a VCS can't use a mouse. It would be adding a controller option in the same way that hacking Missile Command to be compatible with a trackball would.


Ohh god! I just thought of a new VCS title "Whack That Fat!" Or WTF

Here's how it works:

 

*There's a bunch of fat ass shoppers playing musical chairs with their wal-mart scooters.

 

*Your job is to keep them motivated and prevent them from slowing down by slapping them with a wooden 2x4.

 

*The challenge comes in that you need to keep the batteries in the scooters charged and everything in motion.

 

*You have a limited supply of power to distribute among the scooters during each level. The children's level gives you an unlimited diesel generator to work with. And WWWIIIIDDDEEE open spaces.

 

*As the game difficulty increases there are more scooters.

 

*Sometimes they can collide and cause an accident which takes you away from charging and slapping the others.

 

*Some levels may have food aisles, which is sure to cause a commotion and jam up.

 

*Bonus rounds would be a free for all bumper car bash in the parking lot. On difficulty "A" a random speeder would race through and maul over the fat-asses you so carefully "nourished" up from the early levels. Thus thinning the herd.

 

*There could be freestyle stunt courses with speed bumps.

 

*Your score counter is displayed as Kilos Kept Moving.

 

*Higher levels have bigger scooters (and fatter women). This makes it more difficult to maneuver and more prone to jam ups. Their battery charge doesn't last as long. But they make an easier target to whack.

 

*You can name the people milling about; name them after your now-a-fat-ass wife. Or Beer Belly Bob.

 

*The attract screen would be a display of rolling fat asses going back and forth between the department store and the buffet next door.

 

*Some levels would contain special exit points where they could escape the playfield and go driving around the town. Like you always wanted to do in Intellivision's Auto Race. Now you're Riding Dirty!

 

*Downloadable content for your Harmony Cart would consist of a shopping mall level, airport terminal, hospital emergency room, a candy factory, aquatic olympics, processed food packs, and others yet to be revealed.

 

*Race levels might earn you tougher tires, bigger batteries, independent suspensions with better steering, heavy duty ball bearings.

 

*Two player games might be like the Lunar Lander and Moon levels. One person controlling the scooters, the other the 2x4 and battery charger.

 

*For complete immersion there would be a large intercartridge module. A stack if you will. This would be a half-height, double-wide, lead filled cart. Complete with jelly padding that plugs into the VCS. At the top is a gaping mouth ready to accept the real Game Program cartridge. Pear shape is definitely in!

 

*Harmony owner? We got you smothered! Harmony would plug into this double-wide E*X*P*A*N*D*R in the same lack-of-fashion as the real Game Program cartridge would. An option would be to purchase a single-piece SuperCharger-sized cart too. This is definitely a fat game. So regular sized carts won't do here.

 

*This XXXX sized cartridge would have feeding ports where you would plug in the ColecoVision steering wheel and pedals. Or you could roll your own!

 

*"Activision patches" would be series of small (GASP!!) laminated menus listing the highest calorie foods ever invented.

 

Thank god I don't have the time to put it all together! Eh?


I'd rather see it as the PC hardware is emulating the VCS hardware.

The mouse is emulating the paddle.

The paddle is whack'n the fat lady.

I'm outta here!

 

But the mouse wouldn't be emulating a paddle. Nothing about the mouse would be functioning like a paddle. You would still be moving the mouse on a mouse pad. The only thing that would be going on is that Stella would be adding better mouse support for paddle games. Stella would be behaving differently, not the mouse. I was checking out the new TV effects on Stella the other day. While doing so, I played a short game of Seaquest with the mouse because I don't own an adaptor for a joystick. It didn't feel like the mouse was emulating a joystick. It felt like I had a controller option that I don't have on the VCS. That isn't emulation only, because it is doing more than just emulating. It is in addition to emulating because it is doing more than the VCS does. I don't see anything wrong with that, but a more accurate phrase would be "modern hardware only" instead of "emulation only". If you could say that using a mouse is 100% pure emulation, then I could say that using a paddle is 100% pure emulation of a mouse. It is kind of like what you said: "If we go with 100% original experience, that will mean no emulation." If we went with 100% emulation, that would mean no mouse. I think using a mouse would fit more into the hybrid in-between experience. It would fit there more than "running with a modern-day flat screen and turning on Blargg effects" because the flat screen would be emulating a CRT more than the mouse would be emulating a paddle.



 

Semantics..semantics..semantics... Tch tch..

 

If need be you could glue a crank or full-size paddle knobs on one of the optical encoder wheels in those old-school mouses. Now you have a super-long-lived pot-less optical paddle that won't wear out! I'm sure the scaling and sensitivity would need a little adjusting.

 

HAY! I wonder if my CPU is in resonance with my 600MHz mouse and is causing the creep I described in the initial post? On second thought, no. Because the trackpad exhibits the same problem. I'll still try it with the old-school mechanical mouse for shits and giggles though.

 

Regarding the mouse not being a paddle emulator. What WOULD be a paddle emulator? By your definition would a paddle emulator even be possible? You either have a paddle controller or you don't. You either upload software to the mouse to make it behave like a paddle or not. So in my mind's eye it *IS* a paddle emulator. Just as much as the F1 and F2 keys are GAME SELECT and RESET switch emulators. And the x86 is dumbed down to make it into a 6507. Same thing. They are achieving the same functionality via different methods, that's all. I believe we may have to broaden (hehe, I mean fatten) the definition of what an emulator really truly is.

 

If we go 100% emulation that means no paddle controller, no stelladapter, by your definition. Just as the cpu is being made stupid like the 6507, the mouse is being made stupid by axing an axis. The cpu code is no different than the mouse code. The cpu is PC hardware made to do something different. The mouse is PC hardware made to do something different. Same thing.

 

If you glued two paddles together at a certain right angle you could emulate a mouse with it. If you connected the paddles to belt sanders and embedded them in the floor you could do a moonwalk and control games that way.

 

All this talk of paddles and stuff has got me thinking of a *new* adult game called etch-her-sketch.

Here's how it would play.

ahh crap I gotta run. I'll jot down the rules later.

 

 

god. this thread is a clusterfuck.

 

Yes it is.

Edited by Keatah

I don't believe a paddle emulator would be possible because a paddle isn't software that can be emulated. It is hardware. It is a controller. If you got someone who has never seen a paddle to play Breakout with a mouse, they wouldn't pick up the mouse and say, "What is this?" and you wouldn't answer, "It's a paddle emulator."

 

Yes, you either have a paddle controller or you don't. That is a controller option. You don't upload software to the mouse to make it behave like a paddle. You upload software to the computer to allow the use of a mouse as a mouse on paddle games. It is just two controller options. If I used a D-pad on my VCS instead of a joystick, the D-pad wouldn't become a joystick emulator. In my hands, it would still be functioning as a D-pad, just like a mouse, in my hands, would still be functioning as a mouse.

 

Yes, they would have the same functionality via different methods in the program but that is exactly what different controller options are.

 

I don't think we need to broaden the definition of what an emulator is. Just call an emulator an emulator, a mouse a mouse, a paddle a paddle, PC hardware PC hardware,... I don't see any reason to have to try to define things to fit into categories like 100% pure emulation, 100% original experience, and hybrid in-between experience. I don't see the point of saying, "Tonight we will be concerning ourselves with 100% pure emulation and making sure we get the best possible experience with PC-only hardware."

 

By my definition, 100% emulation would include Stelladaptors, paddles, using other things besides PC hardware to run Stella, CRTs, LCDs,... As long as a machine takes the place of the original console and doesn't do more or less than the original console, then it is 100% emulation. Yours is the one that seems limited. It is a mouse, keyboard, box, monitor, hard disk, PC-style joystick, and "That's all you get." That doesn't sound like you are talking about 100% emulation but 100% PC hardware, because not including Stelladaptors and so forth is limiting the computer to do less than the original console (it doesn't permit paddles), and including a mouse is doing more than the original console (it permits the use of a mouse as a controller option). I just don't see the point of the whole PC-only stuff to share your concern with making the mouse a more compatible controller option. You could have just shortened THE PROBLEM section down to the sentence, "Stella needs better support for using a mouse on paddle games."

 

I agree that it's up to the programmer to build the best possible emulator. The best possible emulator isn't PC hardware only. It is them doing their best to program their emulator to give the best experience for everyone that uses an emulator.


  • 1 year later...

I really think there needs to be an option to have the mouse controller "stop" at the side of the screen so that playing Kaboom! isn't such a pain. Overall I agree that mouse controls could be better, and the TC did a lot of testing which seems like it could be useful to the Stella team.

 

Press Control-g to 'Grab' the mouse, to keep it in the game window. Then it will never exit the screen.

 

Also, improvements are definitely necessary, but I don't have a lot of time (or energy) to do it right now. I'm dealing with a lot of real-world issues (illness, work, etc). As well, I'm in the process of porting Stella to SDL2, and as mouse handling is specific to the underlying graphics library being used, I'm not even going to attempt to fix it until Stella is up and running in SDL2.

