About JDelekto

  1. I don't think that I have access to the source anymore. While I have a large bin of floppy disks out in my garage, I'm not sure if they're still viable, and many of them are unmarked or just numbered. I really wish we'd had source control on the TI back then! I'm pretty sure it wasn't on any of the Clipboard disks. However, I will note that the source code for all of the Clipboard articles was always on the "B" side of the disk.
  2. Jon Dyer and I really enjoyed working on the AMS version of TI-Nopoly. It was amazing to see a game that took up well over 32k (I think it was in the range of 72k to 78k), but it's been so long since I worked on that project. Jon had a talent for doing the graphics on our collaborative projects. I also tremendously enjoyed working with Art Green on writing some of the AMS assembly library; I learned so much from him during the AMS project. It was his updated AMS Macro Assembler and AMS Linker that did all the magic of allowing applications to be designed modularly and use the AMS paged memory.
We developed it using a modified Small-C compiler (I believe it was based on version 5.0 of Clint Pulley's compiler). While I don't remember the specifics, the change basically provided a new compiler directive (it started with a '#' character, somewhat like #ifdef) to emit the necessary pseudo-ops for Art's assembler. The developer had to be somewhat cognizant of the size of the functions they wrote, but the idea was to wrap several functions in these directives so that they would fit within some multiple of the 4k page size. I can't remember if the directive cared about the page size; it is quite possible the assembler handled that. The smallest unit was 4k, but you could theoretically bank in larger chunks that were a multiple of the 4k page size. For TI-Nopoly, we grouped dependencies together and tried to keep the number of C functions within a 4k page where possible.
The Linker was the secret sauce. It knew how to take all of those chunks and neatly arrange them into the correct pages. It would insert stub code into the assembly to load the correct page number and then branch to the code in that page. It was very intelligent about how it organized the pages.
I think it is probably possible to use the Small-C #asm block to manually add the necessary pseudo-ops to define the page boundaries for the Assembler; my only contribution was to wrap that up into its own neat little directive so as not to litter the C code with lots of #asm blocks. However, it is not necessary to use that modified compiler to take advantage of the pseudo-ops when the Small-C output is fed into Art's Assembler. I need to go back and read the documentation (and Art was extremely meticulous and thorough with his documentation) to see how the whole paging mechanism worked with regard to the pseudo-ops and linker settings. I used to be very familiar with manual paging in assembly code: you load the mapper with the correct page and then branch to the address where the new code is mapped.
Several years ago, I added an initial implementation of the AMS system to Tursi's Classic99. It supported more memory than the original hardware I had used for development. I actually thought about getting back into doing some development for the TI again. I wanted to finish writing a C compiler, but life got in the way at the time. Nowadays, I think the GCC module probably obviates that. However, I have toyed with the notion of a C# compiler; there are companies using tools to convert C# to C++ (like the Unity Game Engine's IL2CPP utility).
I was recently inspired by Adam Haase's "Realms of Antiquity" (which I purchased last year). For a few years, I had been following his "Tilting at Windmills" blog, where he documented his journey creating the various systems in the game. It's amazing what people are doing with the 99/4A nowadays that seems far more advanced than the software being developed on the machine in the late '70s and early '80s. Realms of Antiquity blew my mind compared to something like Tunnels of Doom (which I spent a LOT of time playing when I was in high school).
OK, I've rambled on long enough now. 😁
  3. Wow, I can't believe that after all this time, someone is still looking for something I worked on almost three decades ago. Jon Dyer and I worked on the project together, after being somewhat encouraged by the late Hal Kam, who was a member of the Atlanta 99/4A User Group. Jon and I remained friends after high school, and we collaborated on this project while I was still in college. If anything, it kept us off the streets at night. I should point out that I was much younger, arrogant, and a bit flippant in some of those articles, so hopefully you'll look past that and find some of the code that Jon and I worked on of use. It was a lot of fun putting it together; in some cases, we would spend a bit of time working on the smaller projects and then end up writing the articles, and assembling the disk that the group would copy and send out to subscribers, the night before the user group meeting.
I'm still a software developer today, but I love to reminisce about the time when computers were growing up and, while still somewhat complex to my younger self, were much simpler than the microprocessors of today. As a side note, for a modest fee, Don O'Neil at Western Horizon Technologies (who hosts the whtech FTP site) will provide a snapshot of the FTP site on a USB drive. I ordered one last year and intend to take some time to go through what's there. Enjoy!