
TIPI - TI-99/4A to Raspberry PI interface development



Here's a progress report: First*, I tried running the websocket server on my Raspberry Pi Zero W communicating with JS99er on my PC. While it worked, it was incredibly slow, which I attributed to the extra latency of running the transactions over a WiFi connection. Doing "CALL TIPI" would take several minutes to load. Next I tried running the TIPI and websocket server in QEMU on the same computer as JS99er, but the websocket performance was not improved. Finally I tried getting the TIPI service to run natively on my Ubuntu 20.04 machine with Python 3, and after the syntax fixes were in place, it worked, but still as slowly as QEMU. I added some of my own timing diagnostics to the js99er websocket interface, and it looked like a single TC RD RC transaction was taking about 20 ms on average. The Developer Tools allowed me to inspect the websocket messages, and the timestamps show that the server is replying immediately, with the majority of the RD RC replies on the same millisecond timestamp as the TC. But the timing diagnostic showed that the RC onmessage event was ~7 ms after the TC send, the getRC() call was ~8 ms after that, and the next TC send ~5 ms later. I tried disabling the cpu.SetSuspended(true) when rc or rd are null, with no effect.

 

(*) Actually I first tried compiling js99er-angular on my Pi Zero W; after getting the right versions of nodejs and npm, several hours later it locked up, probably due to running out of memory.
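For a feel of why 20 ms per byte-level transaction adds up to "several minutes", here's a back-of-the-envelope check. The payload size is an illustrative assumption, not a measured figure:

```python
# Rough arithmetic on the measured per-byte cost; one TC/RD/RC
# transaction moves one byte, so the cost scales with payload size.
MS_PER_BYTE = 20          # average round trip measured above
payload_bytes = 8 * 1024  # illustrative size for a CALL TIPI load

total_seconds = payload_bytes * MS_PER_BYTE / 1000
print(f"{total_seconds:.0f} s")  # 164 s, i.e. a few minutes
```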


6 hours ago, PeteE said:

Here's a progress report: First*, I tried running the websocket server on my Raspberry Pi Zero W communicating with JS99er on my PC. While it worked, it was incredibly slow, which I attributed to the extra latency of running the transactions over a WiFi connection. Doing "CALL TIPI" would take several minutes to load. Next I tried running the TIPI and websocket server in QEMU on the same computer as JS99er, but the websocket performance was not improved. Finally I tried getting the TIPI service to run natively on my Ubuntu 20.04 machine with Python 3, and after the syntax fixes were in place, it worked, but still as slowly as QEMU. I added some of my own timing diagnostics to the js99er websocket interface, and it looked like a single TC RD RC transaction was taking about 20 ms on average. The Developer Tools allowed me to inspect the websocket messages, and the timestamps show that the server is replying immediately, with the majority of the RD RC replies on the same millisecond timestamp as the TC. But the timing diagnostic showed that the RC onmessage event was ~7 ms after the TC send, the getRC() call was ~8 ms after that, and the next TC send ~5 ms later. I tried disabling the cpu.SetSuspended(true) when rc or rd are null, with no effect.

 

(*) Actually I first tried compiling js99er-angular on my Pi Zero W; after getting the right versions of nodejs and npm, several hours later it locked up, probably due to running out of memory.

Do you think it would be possible to implement the interface at the message level instead of the byte level?


1 hour ago, Asmusr said:

Do you think it would be possible to implement the interface at the message level instead of the byte level?

Yeah, that seems like a good idea.  I bet we could use the DSR unchanged by moving the TipiMessage processing into Javascript, and send whole messages over the websocket.


33 minutes ago, PeteE said:

Yeah, that seems like a good idea.  I bet we could use the DSR unchanged by moving the TipiMessage processing into Javascript, and send whole messages over the websocket.

 

You could have the emulator fake the RC acks that the assembly code expects, collect the outgoing bytes in the request message... send them as a whole in the websocket...   When sending a message, the 4A will stream the message by sequentially putting a byte in TD, then setting TC, then reading RC for an echo of TC, then repeating for each of the remaining bytes in the message. 

 

Reading a message is similar, the 4A sets TC, then waits for an echo on RC, and then reads the byte at RD. 

 

I think I use a unique value in what is called resetProtocol in the assembly... which could be used to signal transitions between messages.   This might help creative engineers: https://github.com/jedimatt42/tipi/wiki/TIPI-Protocol#register-level-protocol, here you see the different values the 4A will initialize TC with to sequence send requests, read requests, or the reset -- which is just a handshake. 

 

Or depending how much emulator-fu you want to exercise, you could implement a different routine to trigger message sends and message reads.
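The handshake described above can be modeled in a few lines of Python. Everything here is illustrative: the class and function names are invented, and the TC sequencing values are placeholders, not the DSR's actual values from the wiki:

```python
# Illustrative model of the byte-level handshake: all names are invented,
# and the TC values are placeholders, not the real protocol constants.

class FakeTipi:
    """Stub device side: latches TD writes and echoes TC into RC."""
    def __init__(self):
        self.received = []
        self.td = 0
        self.rc = 0

    def write_td(self, byte):
        self.td = byte

    def write_tc(self, value):
        # Device sees TC change, consumes TD, acknowledges by echoing TC.
        self.received.append(self.td)
        self.rc = value

def send_bytes(dev, payload, start_tc=0x02):
    """Model of the 4A send loop: byte into TD, set TC, spin on RC echo."""
    tc = start_tc
    for b in payload:
        dev.write_td(b)
        dev.write_tc(tc)
        assert dev.rc == tc      # the 4A waits here until RC echoes TC
        tc ^= 0x01               # alternate TC so each byte is a fresh edge

dev = FakeTipi()
send_bytes(dev, b"HELLO")
print(bytes(dev.received))       # b'HELLO'
```

The point of the model is to make the cost visible: every single byte needs a TD write, a TC write, and an RC poll, which is exactly what the message-level approach collapses into one websocket frame.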


Matt, I saw your python script message over in the TIDIR topic area.  Thanks.  I will need to understand it a bit more.


I wanted to post something on the subject here, as it was more appropriate for what I had been doing, just to let you know.

 

I used VB code in Excel to write a TXT file out, trying to write to the original design requirements for the AfterHours BBS program for displaying D/V 80 files. It pulls up fine in Notepad, just as I would have wanted it displayed. It doesn't pull up the same way in MyWord (TI-Writer equivalent). Not saying there is a bug. When I write the TXT file out, I terminate each line with a C/R. No L/F's are in the file.

 

TI-Writer looks at the first character of each record to determine the line length to display on a line. If there is a C/R in the line, it displays the C/R. I would have to double check, but I am pretty sure I can type multiple C/R's on the same line in TI-Writer. There are no L/F's in the line. If I embed the line length character, reading the TXT file as TIPI.FILENAME/TXT treats the line length character as a character (many times it is a non-displayable character for short lines). Then, it wraps the line to the next row. I haven't done much testing, but I think you must be line-breaking on the L/F character, or possibly feeding the routine with a longer record length? How it all works successfully for what I call "normal" PC .TXT files was a significant accomplishment on your part.
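The length-prefixed record layout being described can be modeled with a short sketch. This is illustrative only: it splits a flat stream of [length byte][payload] records, and it ignores the 0xFF end-of-sector marker that a real D/V 80 disk image would contain:

```python
def split_dv80_records(data: bytes):
    """Split a length-prefixed byte stream into records, where each
    record is a length byte followed by that many payload bytes.
    Real D/V 80 sectors also carry an 0xFF end marker, omitted here."""
    records, i = [], 0
    while i < len(data):
        n = data[i]
        records.append(data[i + 1:i + 1 + n].decode("ascii"))
        i += 1 + n
    return records

stream = bytes([5]) + b"HELLO" + bytes([2]) + b"HI"
print(split_dv80_records(stream))  # ['HELLO', 'HI']
```

This is why an embedded length byte shows up as a stray (often non-printable) character when the file is instead read as plain text: a text reader has no reason to treat the first byte of each line specially.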

 

TIDIR has the ability to convert a PCFILE to a TI file. I was using it to convert the .TXT file to a D/V 80 file, and I was telling it to convert the C/R and/or L/F to a C/R character. Then there's the "feature request" I asked about: converting the 144 files I created in batch mode instead of one by one.

 

There aren't many programs out there that require a C/R in a text file. The few cases I have seen have generally been with BBS programs for various display screens. As I said earlier, this is not a bug with your ".TXT" routine. It's actually quite a nice feature to read in a PC's .TXT file. I'm just not able to apply it in this particular situation as I had hoped.

 

What I am doing is a special case situation.  I spent some time last night reviewing the file descriptor record for a D/V 80 file to try and understand what must be in the TIFILES header.  Still got to do some more work, as I think my solution is to write that 128 byte TIFILES header out and create a real DIS/VAR 80 file so the TIPI can handle it as required.
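As a sketch of what that 128-byte header might look like, here is an illustrative Python builder for a DIS/VAR 80 TIFILES header. The field offsets and flag values are from memory of the common TIFILES conventions and should be verified against the actual spec before relying on them:

```python
import struct

def tifiles_header_dv80(total_sectors, eof_offset, num_records):
    """Sketch of a 128-byte TIFILES header for a DIS/VAR 80 file.
    Offsets and flags follow my recollection of the format; verify
    against the TIFILES spec before use."""
    hdr = bytearray(128)
    hdr[0] = 0x07
    hdr[1:8] = b"TIFILES"
    struct.pack_into(">H", hdr, 8, total_sectors)  # total sectors, big-endian
    hdr[10] = 0x80        # status flags: variable-length records (DIS/VAR)
    hdr[11] = 3           # records per 256-byte sector for 80-byte records
    hdr[12] = eof_offset  # offset of EOF within the last sector
    hdr[13] = 80          # logical record length
    struct.pack_into("<H", hdr, 14, num_records)   # level-3 record count, swapped
    return bytes(hdr)

hdr = tifiles_header_dv80(total_sectors=2, eof_offset=40, num_records=5)
print(len(hdr), hdr[:8])  # 128 b'\x07TIFILES'
```

Prepending a header like this to the raw record data is essentially what "create a real DIS/VAR 80 file so the TIPI can handle it" amounts to.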

 

Beery


Throwing an idea out there in the event it is something of interest for someone with the skills.

 

The PI can interface with a camera for under $20. You can specify the frame size, and the PI does the image processing.

 

For users with 9938 systems, I wonder about the ability to have the PI present the video data (an individual frame, as I doubt streaming is possible) with a direct write through the messaging system to VDP. The TI/Geneve computer with the 9938 itself is not having to do all the necessary math to generate and write the image; rather, it is done on the PI, which is a lot faster. Once the data is in VDP memory, the user could tap a keypress, etc. to save the image. The program on the TI side of things would need to write to the video registers to set them up for the proper display.

 

Not sure how rough something would be in bitmap mode on the TI.  Seems like I saw someone displaying some graphics from a camera on the TI at last year's Chicago fair.

 

Anyways, just an idea if it interests someone.

 

Beery

 

P.S.  There was a recent "open source" library for the PI camera at https://www.raspberrypi.org/blog/an-open-source-camera-stack-for-raspberry-pi-using-libcamera/

 


On 6/11/2020 at 8:38 AM, jedimatt42 said:

 

You could have the emulator fake the RC acks that the assembly code expects, collect the outgoing bytes in the request message... send them as a whole in the websocket...   When sending a message, the 4A will stream the message by sequentially putting a byte in TD, then setting TC, then reading RC for an echo of TC, then repeating for each of the remaining bytes in the message. 

 

Reading a message is similar, the 4A sets TC, then waits for an echo on RC, and then reads the byte at RD. 

 

I think I use a unique value in what is called resetProtocol in the assembly... which could be used to signal transitions between messages.   This might help creative engineers: https://github.com/jedimatt42/tipi/wiki/TIPI-Protocol#register-level-protocol, here you see the different values the 4A will initialize TC with to sequence send requests, read requests, or the reset -- which is just a handshake. 

 

Or depending how much emulator-fu you want to exercise, you could implement a different routine to trigger message sends and message reads.

It sounds like a plan. Would there be any need to keep the control signals or would it all be replaced by messages? @PeteE would you prefer binary data over the websocket or a base64 encoded string? Maybe you could specify a protocol that works for you?


1 hour ago, Asmusr said:

It sounds like a plan. Would there be any need to keep the control signals or would it all be replaced by messages? @PeteE would you prefer binary data over the websocket or a base64 encoded string? Maybe you could specify a protocol that works for you?

Yeah, the code I'm working on is using binary websocket frames for the whole message.  No need to encode the length of each message as part of the protocol, since each websocket frame already has its own length.  I should have more time to work on it later this evening.
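The point about lengths can be shown in a tiny sketch. It assumes (as the discussion implies) that the old message protocol carried each message as a 2-byte big-endian length followed by the payload, which a binary websocket frame makes redundant:

```python
import struct

def register_level_frame(payload: bytes) -> bytes:
    """Byte stream as the assumed register-level message protocol
    carries it: a 2-byte big-endian length, then the payload bytes."""
    return struct.pack(">H", len(payload)) + payload

def websocket_frame(payload: bytes) -> bytes:
    """Whole-message framing: the binary frame IS the message; the
    websocket layer already tracks its length for us."""
    return payload

msg = b"illustrative TipiMessage payload"
print(len(register_level_frame(msg)) - len(websocket_frame(msg)))  # 2
```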


15 hours ago, BeeryMiller said:

Matt, I saw your python script message over in the TIDIR topic area.  Thanks.  I will need to understand it a bit more.


I wanted to post something on the subject here, as it was more appropriate for what I had been doing, just to let you know.

 

I used VB code in Excel to write a TXT file out, trying to write to the original design requirements for the AfterHours BBS program for displaying D/V 80 files. It pulls up fine in Notepad, just as I would have wanted it displayed. It doesn't pull up the same way in MyWord (TI-Writer equivalent). Not saying there is a bug. When I write the TXT file out, I terminate each line with a C/R. No L/F's are in the file.

 

TI-Writer looks at the first character of each record to determine the line length to display on a line. If there is a C/R in the line, it displays the C/R. I would have to double check, but I am pretty sure I can type multiple C/R's on the same line in TI-Writer. There are no L/F's in the line. If I embed the line length character, reading the TXT file as TIPI.FILENAME/TXT treats the line length character as a character (many times it is a non-displayable character for short lines). Then, it wraps the line to the next row. I haven't done much testing, but I think you must be line-breaking on the L/F character, or possibly feeding the routine with a longer record length? How it all works successfully for what I call "normal" PC .TXT files was a significant accomplishment on your part.

 

TIDIR has the ability to convert a PCFILE to a TI file. I was using it to convert the .TXT file to a D/V 80 file, and I was telling it to convert the C/R and/or L/F to a C/R character. Then there's the "feature request" I asked about: converting the 144 files I created in batch mode instead of one by one.

 

There aren't many programs out there that require a C/R in a text file. The few cases I have seen have generally been with BBS programs for various display screens. As I said earlier, this is not a bug with your ".TXT" routine. It's actually quite a nice feature to read in a PC's .TXT file. I'm just not able to apply it in this particular situation as I had hoped.

 

What I am doing is a special case situation.  I spent some time last night reviewing the file descriptor record for a D/V 80 file to try and understand what must be in the TIFILES header.  Still got to do some more work, as I think my solution is to write that 128 byte TIFILES header out and create a real DIS/VAR 80 file so the TIPI can handle it as required.

 

Beery

 

The TIPI native text file processing purposefully strips any whitespace off the right-hand side of the line. Windows EOL is CR + LF, Linux is just LF, and there was a system (classic Mac OS) that used CR only... These were all originally about terminal display rules.

 

If you want to preserve the whitespace and all its control codes in a native file on TIPI, don't name it with any of these:  dv80suffixes = (".cmd", ".txt", ".a99", ".b99", ".bas", ".xb", ".tb")

 

Then, you can open the native file as DIS/FIX 128 and get all the bytes in the file. The problem with that on TIPI is that the fixed record length means the last record gets padded; I think it gets padded with NUL (0x00) characters.
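That padding behavior can be sketched as follows. Note the NUL padding is hedged ("I think") in the post above, so treat the pad byte as an assumption:

```python
def read_fixed_records(data: bytes, reclen: int = 128):
    """Sketch of reading a native file as DIS/FIX 128: slice the byte
    stream into fixed-size records; the final short record is assumed
    to be padded with NUL (0x00) bytes, per the post above."""
    records = [data[i:i + reclen] for i in range(0, len(data), reclen)]
    if records and len(records[-1]) < reclen:
        records[-1] = records[-1].ljust(reclen, b"\x00")
    return records

recs = read_fixed_records(b"A" * 200)
print(len(recs))  # 2 records; the tail of the second is NUL padding
```

The practical consequence: a consumer of the file has to know the real data length some other way, since the padding is indistinguishable from genuine trailing NUL bytes.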


I doubt @ElectricLab reads this thread regularly... You @Omega-TI should get used to using the tagging feature when addressing individuals... It would also help avoid the bad etiquette of using people's given names on the forum. It helps with privacy concerns as well as helping the general population connect the dots; people can choose to get notifications when they are tagged. And lurkers can go 'oh, the myti99.com thing is @ElectricLab', click, 'oh neat, he's built these other apps too...'


On 6/12/2020 at 8:33 AM, BeeryMiller said:

Throwing an idea out there in the event it is something of interest for someone with the skills.

 

The PI can interface with a camera for under $20. You can specify the frame size, and the PI does the image processing.

 

For users with 9938 systems, I wonder about the ability to have the PI present the video data (an individual frame, as I doubt streaming is possible) with a direct write through the messaging system to VDP. The TI/Geneve computer with the 9938 itself is not having to do all the necessary math to generate and write the image; rather, it is done on the PI, which is a lot faster. Once the data is in VDP memory, the user could tap a keypress, etc. to save the image. The program on the TI side of things would need to write to the video registers to set them up for the proper display.

 

Not sure how rough something would be in bitmap mode on the TI.  Seems like I saw someone displaying some graphics from a camera on the TI at last year's Chicago fair.

 

Anyways, just an idea if it interests someone.

 

Beery

 

P.S.  There was a recent "open source" library for the PI camera at https://www.raspberrypi.org/blog/an-open-source-camera-stack-for-raspberry-pi-using-libcamera/

 

 

Details here

 

This project uses an ancient Rpi model B with a Pi camera. It just streams the RGB data to the TI PIO port using a simulated parallel interface on the Rpi and all further processing is done by the TI to display the image on the bitmap screen. Takes about 45 seconds per full frame image. 


1 hour ago, Vorticon said:

 

Details here

 

This project uses an ancient Rpi model B with a Pi camera. It just streams the RGB data to the TI PIO port using a simulated parallel interface on the Rpi and all further processing is done by the TI to display the image on the bitmap screen. Takes about 45 seconds per full frame image. 

 

Very nice work.  Very close to what I was envisioning.

 

Do you have a TIPI? I'm not quite sure from your video whether the half-tone processing was done on the PI or on the TI. If it was on the TI, then I wonder if you could handle that processing on the PI side. With the TIPI and the messaging system, if you had all the data already processed, I wonder if you could use the send/recv message commands to send the data directly to VDP, bypassing a transfer to CPU and back to VDP. And since it is already processed on the PI side, you would only have 16K max to transfer, thus speeding things up quite a bit, and perhaps allowing better processing to get color?

 

Again, this all depends upon whether you have a TIPI or not.  I think there would be several users that would get a camera to add to their system.

 

Beery

 

 

 

 


1 hour ago, BeeryMiller said:

 

Very nice work.  Very close to what I was envisioning.

 

Do you have a TIPI? I'm not quite sure from your video whether the half-tone processing was done on the PI or on the TI. If it was on the TI, then I wonder if you could handle that processing on the PI side. With the TIPI and the messaging system, if you had all the data already processed, I wonder if you could use the send/recv message commands to send the data directly to VDP, bypassing a transfer to CPU and back to VDP. And since it is already processed on the PI side, you would only have 16K max to transfer, thus speeding things up quite a bit, and perhaps allowing better processing to get color?

 

Again, this all depends upon whether you have a TIPI or not.  I think there would be several users that would get a camera to add to their system.

 

Beery

 

 

 

 

All the processing is done on the TI side. I just got my TIPI-PEB but it's not yet set up. 

I could have easily done all the processing on the Rpi side and then just transferred the bitmap image to the TI for display (only 6K for the half-tone image), which would have sped things up significantly, but I wanted to see what the TI could do. It's a quirk of mine :) That said, I'm not sure whether you can send data directly to the VDP with TIPI, as I am not yet familiar with its capabilities.


9 hours ago, Vorticon said:

All the processing is done on the TI side. I just got my TIPI-PEB but it's not yet set up. 

I could have easily done all the processing on the Rpi side and then just transferred the bitmap image to the TI for display (only 6K for the half-tone image), which would have sped things up significantly, but I wanted to see what the TI could do. It's a quirk of mine :) That said, I'm not sure whether you can send data directly to the VDP with TIPI, as I am not yet familiar with its capabilities.

The TIPI has a socket layer driven by various commands. One of the commands is to get mouse input; others are to bind/unbind/accept/read/write to the socket. When receiving data over the socket interface, you have two choices: you can elect to receive it directly to a RAM buffer, or directly to a VDP-RAM buffer. The pointers are well documented in Matt's code for the DSR functionality.

 

So, if the 6K half-tone image is already processed, basically you would just call the routine to deliver the image straight to an address range in VDP. That assumes the new Python code for the option to trigger the camera is present. One would just need to have the VDP registers set for the proper mode before calling the command.

 

I'm guessing the image loading process to VDP should be fairly close in speed to the loading of the TIPICFG program off the PI.

 

Beery

 

 


4 hours ago, jedimatt42 said:

About 6500 bytes per second.

 

Very slow. 

Well, that's about 1 second per frame. Compared to 45 seconds, I can live with that! (Actually it takes about 3-4 seconds for the image capture alone, plus processing time on the Rpi. This process will likely be faster, however, with the Pi 3 than with my original Model B.)
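Checking the arithmetic behind that estimate, using jedimatt42's ~6500 bytes/s figure and the 6K half-tone image size mentioned earlier:

```python
# Frame-time estimate: 6 KB half-tone bitmap over a ~6500 byte/s link.
image_bytes = 6 * 1024
tipi_rate = 6500  # bytes per second, from jedimatt42's figure

print(f"{image_bytes / tipi_rate:.2f} s per frame")  # 0.95 s per frame
```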


6 hours ago, BeeryMiller said:

The TIPI has a socket layer driven by various commands. One of the commands is to get mouse input; others are to bind/unbind/accept/read/write to the socket. When receiving data over the socket interface, you have two choices: you can elect to receive it directly to a RAM buffer, or directly to a VDP-RAM buffer. The pointers are well documented in Matt's code for the DSR functionality.

 

So, if the 6K half-tone image is already processed, basically you would just call the routine to deliver the image straight to an address range in VDP. That assumes the new Python code for the option to trigger the camera is present. One would just need to have the VDP registers set for the proper mode before calling the command.

 

I'm guessing the image loading process to VDP should be fairly close in speed to the loading of the TIPICFG program off the PI.

 

Beery

 

 

Super interesting. I will need to experiment but it should be doable. 


I've checked in my changes to support whole tipi messages in websocket binary frames.  The tipi server fork is here, and the js99er fork is here.  The latency is much improved, and js99er seems to load "CALL TIPI" at the same speed as the hardware.  The websocket server implements the "RESET" text command and adds a new "SYNC" text command, which precedes a sent or received binary message (when TC=0xf1 and RC=0xf1).  @jedimatt42 I moved the TipiMessage protocol handling to the tipiports C module, since the API layer works at the message level now.  I haven't tested it on actual hardware though, so there's a chance that I broke something.
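A hypothetical receive-side handler for this framing might look like the following. The function and state names here are invented for illustration, not taken from the actual fork:

```python
# Hedged sketch of the framing described above: text frames carry control
# words ("RESET", "SYNC"), and each binary frame carries one whole
# TipiMessage. Structure and names are assumptions, not the real code.

def handle_frame(frame, state):
    """Dispatch one websocket frame. `frame` is str for text frames and
    bytes for binary frames; `state` accumulates completed messages."""
    if isinstance(frame, str):
        if frame == "RESET":
            state["messages"].clear()      # handshake: drop partial state
        elif frame == "SYNC":
            state["expect_binary"] = True  # next binary frame is a message
    elif state.get("expect_binary"):
        state["messages"].append(frame)
        state["expect_binary"] = False
    return state

state = {"messages": []}
for f in ["SYNC", b"\x01\x02", "SYNC", b"\x03"]:
    handle_frame(f, state)
print(state["messages"])  # [b'\x01\x02', b'\x03']
```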

 


@PeteE That's awesome... moving the messaging down into the C makes sense. 

 

I spent most of Friday evening and Saturday evening working on a python3 branch that works on a real PI with a real 4A.  There were a ton of little changes; pretty much every little feature was broken, but I think I got them all. I think my 'python3' branch has a workable 'upgrade' mechanism for the tipi-services...  The web-ui is still python2, and the delegating to xbas99 is still python2, but that should be easy to fix, and we don't need the web-ui on the desktop.

 

I'll try to test your setup on the real hardware... 

 

I'd like to finish getting the python3 out as an update to the real hardware... and some of your refactoring for the message interface on the python side. In the previous batch, the only thing I disagreed with was the #if python2 stuff... I want to avoid a support matrix.

 

The tipi messaging python changes look great. I'll try and fold them up into my master branch after some verification. Then I can work on some packaging. 

 

21 hours ago, PeteE said:

I've checked in my changes to support whole tipi messages in websocket binary frames.  The tipi server fork is here, and the js99er fork is here.  The latency is much improved, and js99er seems to load "CALL TIPI" at the same speed as the hardware.  The websocket server implements the "RESET" text command and adds a new "SYNC" text command, which precedes a sent or received binary message (when TC=0xf1 and RC=0xf1).  @jedimatt42 I moved the TipiMessage protocol handling to the tipiports C module, since the API layer works at the message level now.  I haven't tested it on actual hardware though, so there's a chance that I broke something.

 

Can you make a pull request for me to get your changes?


On 6/16/2020 at 7:53 AM, PeteE said:

Done.

You probably noticed that I already deployed your changes to https://js99er.net, but I still haven't been able to get the RPi side up and running myself. My QEMU is running with this configuration, and all the services on the RPi appear to be running except the TIPI boot service:

qemu-system-arm -kernel kernel-qemu-4.19.50-buster -dtb versatile-pb-buster.dtb -cpu arm1176 -m 256 -M versatilepb -no-reboot -serial stdio -append "root=/dev/sda2 panic=1 rootfstype=ext4 rw" -drive "file=2020-02-13-raspbian-buster.img,index=0,media=disk,format=raw" -device e1000,netdev=net0 -netdev user,id=net0,hostfwd=tcp::9901-:9901

In JS99er it seems to be making a websocket connection (at least it's different from not running QEMU), but it's closing after a short time, and nothing happens. Do you have any ideas how to troubleshoot this?

 

