
SIO timing requirements

SIO serial specification

3 replies to this topic

#1 phaeron

    River Patroller

  • 2,292 posts
  • Location:USA

Posted Sun Feb 12, 2012 7:08 PM

Does anyone know how rigidly the SIO timing specifications in the OS manual are followed in practice? I'm working on a high-speed SIO routine and am trying to determine what delays are required, since some of the delays are annoying to hit with variable DMA involved. The one I'm particularly interested in is the t2 delay from last command byte to command line deassert. This is specced as 650-950 microseconds, but it looks like the OS routine violates this by only waiting 400us:

	T-0.000403 | A=00 X=FE Y=00 S=F2 P=32 | EAA7: F0 F5			 BEQ $EA9E
  + IRQ interrupt
...
	  T-0.000002 | A=A0 X=FE Y=31 S=F0 P=B0 | EB11: A9 3C			 LDA #$3C
	  T+0.000000 | A=3C X=FE Y=31 S=F0 P=30 | EB13: 8D 03 D3		  STA PBCTL

The actual delay is even lower in a gr.4 (IR mode 8) screen, only 300us.
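For reference, the delays in question convert to 6502 machine cycles as follows. This is a quick sketch, not from the thread itself, and it assumes an NTSC machine clock of ~1.79 MHz:

```python
# Convert SIO delays to 6502 machine cycles. The NTSC system clock
# of 1.7897725 MHz is my assumption, not a figure from the thread.
NTSC_CLOCK_HZ = 1_789_772.5

def us_to_cycles(us):
    """Microseconds -> machine cycles at the NTSC clock rate."""
    return round(us * NTSC_CLOCK_HZ / 1e6)

for label, us in [("spec min (650us)", 650),
                  ("spec max (950us)", 950),
                  ("OS observed (400us)", 400),
                  ("gr.4 observed (300us)", 300)]:
    print(f"{label}: {us_to_cycles(us)} cycles")
```

With variable DMA stealing an unpredictable fraction of those cycles, it's easy to see why hitting a 650-950us window from a software delay loop is awkward.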

I was tempted to use timers 1+2 in 15KHz for timeouts as it's easier to set up than OS timer 1 and I can hit delays up to four seconds that way, but now I'm not even sure I need to be that precise with the delays.
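A quick sanity check on the four-second figure: with timers 1+2 linked as a 16-bit counter on the 15KHz clock, the longest delay works out as below. The /114 divisor for the 15KHz clock and the NTSC machine clock are my assumptions, not figures from the post:

```python
# Longest timeout from POKEY timers 1+2 linked as a 16-bit counter
# on the ~15.7 kHz clock. Assumes NTSC: 15 kHz clock = machine
# clock / 114 (one tick per scan line).
NTSC_CLOCK_MHZ = 1.7897725
TICK_US = 114 / NTSC_CLOCK_MHZ   # ~63.7 us per tick

def max_timeout_seconds(ticks=65536):
    """Longest delay a 16-bit AUDF1/AUDF2 divisor can produce."""
    return ticks * TICK_US / 1e6

print(f"{max_timeout_seconds():.2f} s")
```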

#2 Rybags

    Quadrunner

  • 15,263 posts
  • Location:Australia

Posted Sun Feb 12, 2012 7:57 PM

I suppose quickest from the computer side would be DMA off with a SIO call at some "sweet spot" in the TV frame.

Worst case (slowest) from a peripheral - you'd reckon that something like an old 810, or maybe even an older printer model with a sub-1 MHz CPU, might provide a good example there; then again, the 1050 is slower in some regards.

I remember toying with all this stuff 20+ years ago when I did a 1050 emu on the ST, but I kept everything fairly close to spec.

As always, the best test case is probably to utilize the public Beta testers.

#3 phaeron

    River Patroller

  • Topic Starter
  • 2,292 posts
  • Location:USA

Posted Sun Feb 12, 2012 9:04 PM

Well, thing is, I'm not even sure why there is a minimum specification for t2. As I understand it, the peripheral waits for the command line to go up so that it doesn't send the command ACK before the computer is ready to receive it. The command line stays up after that so there's no risk of the peripheral missing it even if it takes some time to process the command. All of the drive firmwares I have seen process the command, wait for command line deassert, and then send ACK, so they wouldn't have a problem with the command line changing immediately. The peripheral has a leisurely 16ms to send back the ACK.
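The drive-side sequence described above (receive the frame, process the command, wait for CMD deassert, then ACK) can be sketched as a toy model. This is purely illustrative; the function and value names are made up, not real firmware:

```python
# Toy model of the drive-firmware command handshake described above.
# All names are illustrative; this is not real 1050/810 firmware.
ACK = 0x41  # SIO acknowledge byte

def checksum(data):
    """SIO checksum: 8-bit sum with carry wrapped back in."""
    total = 0
    for b in data:
        total = ((total + b) & 0xFF) + ((total + b) >> 8)
    return total

def handle_command_frame(frame, wait_cmd_deassert):
    """frame: 5 bytes (device, command, aux1, aux2, checksum).
    wait_cmd_deassert: callable that blocks until CMD goes high.
    Returns the response byte sent, or None for a bad frame."""
    if checksum(frame[:4]) != frame[4]:
        return None            # bad checksum: stay silent, host retries
    # ...command is decoded here, while CMD may still be asserted...
    wait_cmd_deassert()        # most firmwares block here, no minimum needed
    return ACK                 # must go out within the ~16 ms window

# Example: status command ($53) to drive 1 (device $31).
frame = [0x31, 0x53, 0x00, 0x00, 0x84]
print(hex(handle_command_frame(frame, lambda: None)))
```

Since the firmware simply blocks until CMD goes high, it never observes whether the deassert came 300us or 950us after the last command byte, which matches the observation that the minimum goes unenforced.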

The code that I'm working with is unfortunately not currently suitable for public testing, as it's meant to be sent back from the emulator in response to a "get high speed SIO routine" command and is therefore tailored to a specific drive. In that usage it doesn't need to follow the specs either, since I'm also in control of what the drive expects, but I like to do things by the book, and I'd also like to update my documentation to reflect the actual reality of SIO communications, which it seems doesn't quite match Atari's original spec.

#4 ijor

    Stargunner

  • 1,929 posts

Posted Sat Mar 17, 2012 9:11 AM

The one I'm particularly interested in is the t2 delay from last command byte to command line deassert. This is specced as 650-950 microseconds, ...
Well, thing is, I'm not even sure why there is a minimum specification for t2.


I assume you mean t1, and not t2 (t2 is from cmnd deassertion to ACK).

I think I can understand the concept behind the minimum. Conceivably, some peripheral could check (even in hardware) that the CMD signal stays asserted throughout the command frame, as a way to confirm the validity of the frame. Or it could check that the actual signal edge happens after the command frame.

In practice, I don't remember seeing any peripheral or firmware that requires the minimum. As you say, they just wait for CMD to be deasserted, without caring whether it was deasserted earlier, in the middle of the frame. Furthermore, most firmwares don't even need a maximum; they never time out at that specific point.
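The hypothetical stricter peripheral described above, which rejects a frame if CMD drops mid-frame, might look roughly like this. No known firmware does this; the names are made up for illustration:

```python
# Hypothetical stricter receiver: accept the command frame only if
# CMD stayed asserted for every byte. Purely illustrative; no known
# peripheral or firmware actually enforces this.
def receive_validated_frame(read_byte, cmd_is_asserted, length=5):
    """Returns the frame bytes, or None if CMD dropped mid-frame."""
    frame = []
    for _ in range(length):
        b = read_byte()
        if not cmd_is_asserted():   # CMD deasserted too early: reject
            return None
        frame.append(b)
    return frame

data = iter([0x31, 0x53, 0x00, 0x00, 0x84])
print(receive_validated_frame(lambda: next(data), lambda: True))
```

Such a design would be one reason to spec a minimum t delay: the host must hold CMD low past the last command byte or the frame gets rejected.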





