
matthew180

Members
  • Posts: 3,212
  • Joined
  • Last visited
  • Days Won: 4

matthew180 last won the day on January 21, 2022

matthew180 had the most liked content!

Profile Information

  • Gender
    Male
  • Location
    Central Florida
  • Interests
    My family, FPGA, electronics, retro/vintage computers, programming, coin-op games, reading, outdoor activities.
  • Currently Playing
    Divinity2

Recent Profile Visitors

22,178 profile views

matthew180's Achievements

River Patroller (8/9)

3k Reputation

  1. Heh. In recent days, it's hit-or-miss. People are probably curious, but quickly realize it takes a real amount of time and effort to learn enough to make a program, and probably decide they would rather do something else. You can speculate about that all day long. However, back in the day you didn't just buy the E/A Module (cartridge); that was mostly useless on its own. You needed a 32K memory expansion, a disk controller, and a disk drive, which usually meant a PEB, but could also have been a collection of side-cars. That is a pretty significant monetary investment, so people were probably a lot more careful about the decision and maybe knew more about what they were getting into?

However, I knew nothing in 1983 (13yo) other than a vague idea of what assembly language was, and apparently my dad had more money than sense that day (or really believed in me?) when he bought the PEB package (TI was blowing them out by then, but it still cost about $400). I had many, many frustrating days and nights, sometimes ending in tears, trying to figure out assembly from the E/A manual and the Tombstone City source code[1]. But I am stubborn as hell, infinitely curious, I must know how things work, I pay attention to detail, and most importantly I knew all coin-op video games were written in assembly. BASIC is sooo slow on the 99/4A (as we all know), and that was also a very strong motivator. So, the choices were: 1. play marginal games, 2. program in slow BASIC / XB, or 3. learn assembly and own the machine with all the power and speed in the world! Muhaaa!!

I furiously typed in the E/A examples; some worked, some did not (as we now know, there are errors). The first assembly program I tried to write was clearing the screen, trying to start simple, you know, like CALL CLEAR. I wrote this: `CLR R1`. 
I did not know what this "register" thing was that the "CLR" instruction needed, but that was the closest thing to "CLS" that I could find, since that must be assembly for clear screen... (other BASIC dialects have CLS, so...) It assembled! It loaded! It RAN! It did *not* clear the screen.

It was not until I got the Lottrup book, which does start with simple programs like clearing the screen and animating the `@` symbol, that things really started to click for me. Having to write a space character to every location on the screen was probably the first algorithm I was ever directly exposed to, and it made so much sense. I don't really recommend the book these days (for reasons explained in the early pages of this thread), but it absolutely opened the door to assembly and low-level computer programming for me BITD.

I still remember being so giddy about moving the `@` symbol across the top of the screen, and that I had to SLOW IT DOWN with a delay loop just to see it! Oh assembly language, you are soooo cool! I had to "slow it down!", something you would *never* have to do in BASIC, where you spent your time screaming for it to go FASTER! I bragged about that program, to anyone who would listen, for weeks. I had many firsts on the 99/4A, and this was one of them. Giving up was never an option that entered my consciousness. I do question my choice sometimes though, these days, as I scream at modern computers for entirely different reasons...

[1] The Tombstone City source came with the E/A manual, and was a really cool move on TI's part, IMO. Giving away source code to one of their commercial titles?! It really contradicts TI's stinginess of wanting all developers to pay them for a license to publish software for the 99/4A.
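For flavor, a minimal sketch of that clear-the-screen algorithm in 9900 assembly might look like the following (my own sketch, assuming the screen image table at VRAM >0000, interrupts already disabled, and the usual 99/4A VDP port addresses):

```
* Clear the 24x32 screen by writing a space to every VRAM location.
VDPWD  EQU  >8C00          * VDP write-data port
VDPWA  EQU  >8C02          * VDP write-address port
CLS    LI   R0,>4000       * VRAM address >0000 with the write bit (>4000) set
       SWPB R0
       MOVB R0,@VDPWA      * send the low address byte first
       SWPB R0
       MOVB R0,@VDPWA      * then the high byte (with the write bit)
       LI   R1,>2000       * space character (>20) in the high byte of R1
       LI   R2,768         * 24 rows * 32 columns
LOOP   MOVB R1,@VDPWD      * write one byte; the VDP auto-increments its address
       DEC  R2
       JNE  LOOP
```

The same loop works for any fill value; only the byte in R1 changes.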
  2. Don't use assembly... ? Just kidding, you should absolutely learn assembly; it is worth the effort and can be very rewarding. However, assembly language is only a mnemonic representation of machine code, there to help the programmer not have to write programs in hex or binary, so there are not going to be many ways to keep you from treating the data in an unintended way. My recommendations would be:

1. Use xdt99's ability to use longer names for labels to its full extent. Picking good names for routines and variables is hard, but it is very important to help remind you what the data is and how you are supposed to use it.
2. Adopt simple prefixes or suffixes to name what the data is (e.g. 16-bit integers start with `s16_` or `u16_`, bytes with `s8_` or `u8_`, addresses with `adr_`, etc.). These days, with xdt99, I have no problem writing 9900 assembly that does not work with TI's assembler (it was good BITD, but we have better now).
3. Use equates to name "magic numbers" and memory addresses.
4. Use larger block comments before code to explain what it is doing, which gives context to the code that follows that you don't get by commenting each line of code (which I find less useful, since it does not tell you "why" or "what" is going on, only "how").

The CPU's view of memory is simply an address that holds a value. The CPU does not know if the data is a signed number, unsigned, an address, part of a larger value, or anything else. It is up to the programmer to keep all that straight, know what any particular data value is supposed to be, and select the proper assembly instructions to work with the values as intended. This flexibility comes at a cost though, and assembly programmers need to be very meticulous and detail-oriented. You have to build a mental model of your data as you design and write your code. 
With a small retro computer, a human can keep all this detail in their head at once, which is one of the main aspects of retro computing that separates it from modern computing. This is also why higher-level languages were among the first programs written for computers, with their abstractions and simplifications of what you can do with data, etc. Such abstractions are needed to help make writing much larger programs even possible, but also to make computers usable and approachable for people who do not need or want to know how the computer works, but rather use it as a tool for doing other things (which may not even be computer related, i.e. writing a book, plans for building something, etc.).
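As a hypothetical fragment, those naming conventions might look like this in xdt99 syntax (all labels, values, and the routine itself are invented for illustration):

```
* Equates for magic numbers and addresses.
VDPWD       EQU  >8C00      * VDP write-data port
MAX_LIVES   EQU  5          * named constant instead of a bare 5

* Prefixes describe what each piece of data is.
u8_lives    BYTE MAX_LIVES  * unsigned 8-bit value
            EVEN
s16_score   DATA 0          * signed 16-bit value
adr_player  DATA >0300      * an address, not a number

* Reset the player state at the start of a new game.
* The long labels make the intent clear at every use site.
reset_game  LI   R1,MAX_LIVES*256
            MOVB R1,@u8_lives   * byte ops use the high byte of the register
            CLR  @s16_score
            RT
```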
  3. I'm sure I cannot take credit for that font, but I may have tweaked it a little. With the Classic99 smoothing filter turned on, it is hard to know for sure which font it is. It was probably taken from a coin-op game though; I like the way video games did their fonts. Yeah, but when you think of it like a 40-page book, suddenly it seems rather short. Although content density is probably way higher in the thread than in a book. The best opportunity for helping is when a learner is engaged, asking questions, and following through. Opry99 did pretty well and got what he needed from assembly, and AirShack (RIP) finished a game in assembly. IIRC, it was a bucket-list item of his that he achieved. It was very rewarding to see their "ah ha!" moments.
  4. I hope that includes the first 40 pages of this thread. They pretty much cover everything in detail, no hand-waving or mumbling around topics. The second 40 pages are mostly a rehash of the first 40 pages. Sometimes it is hard to give an answer without slipping into some lower-level detail. There is a lot of nuance at the hardware layer that leaks over into the assembly layer. The first thing people try to do is put abstractions on top of these details in an attempt to make the system easier to use. But there are trade-offs, as there always are. The hardest part about assembly language is not assembly language; it is understanding the system you are trying to write code for.

If the console ISR is allowed to run at all, for any reason, then *every time* you want to do *anything* with the VDP you have to disable interrupts, set up the VDP address, read / write your data, and re-enable interrupts. And if the console ISR runs and you need the VDP status in your own program, you have to get it from the dedicated location in scratch-pad (I don't remember the exact address) where the ISR stores a copy of the VDP status.

@dhe The VDP can only receive one byte at a time. It takes two writes to the VDP to set up its internal address register, and reading the VDP status will reset that sequence. The 9900 (and most CPUs) will check for interrupts between instructions. This means that if CPU interrupts are enabled, your code can be interrupted between any two instructions in your program. If you are in the middle of the sequence to set up the VDP's address register, and the interrupt fires and does any communication with the VDP, it will wreck your sequence and the VDP address register will not be set. Also, if you have set the VDP address register and are now in the process of reading / writing VRAM when the interrupt fires, and the ISR is set to auto-play sound or process sprite motion, then it will change the VDP address register to what it needs and read / write VRAM. 
When your code resumes, you are now reading / writing the wrong VRAM location. This is covered in the 9918A datasheet, pg. 2-1, section 2.1.2, along with a big NOTE. The 9918A datasheet was not laid out very well, but the info is in there; it just takes some focused reading and taking lots of notes. What you don't get from the datasheet is the interplay between your code, the console ISR, and the VDP. This is why I really like to turn off the console ISR, and recommend that people learning do the same. The VDP is very straightforward, and reinventing VSBW, VMBW, etc. on your own is part of learning. All the console ROM and GROM abstraction routines are usually thought of as helpers; however, if you don't know the details of what they are doing, and all the details about the ISR and such, they can be foot-guns when trying to get started and just do some simple things like putting graphics on the screen. IMO, the simple way to co-exist with the ISR:

```
START    LIMI 0
         ...
         Init code
         ...
MAINLOOP ...
         All your code
         ...
         LIMI 2
         LIMI 0
         B    @MAINLOOP
```

Allow interrupts once in your main loop, in a very controlled place.
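A sketch of what a self-written VMBW-style routine with that interrupt guard might look like (register usage mirrors the E/A convention, but the labels and layout here are my own, not TI's):

```
* Multi-byte write to VRAM, safe even if the console ISR is normally enabled.
* The two MOVBs to VDPWA are the address-setup sequence that must not be
* interrupted, so interrupts stay off for the whole transfer.
VDPWD  EQU  >8C00          * VDP write-data port
VDPWA  EQU  >8C02          * VDP write-address port
* Entry: R0 = VRAM address, R1 = source in CPU RAM, R2 = byte count
myVMBW LIMI 0              * interrupts OFF before touching the VDP
       ORI  R0,>4000       * set the write bit in the VDP address
       SWPB R0
       MOVB R0,@VDPWA      * low address byte
       SWPB R0
       MOVB R0,@VDPWA      * high address byte + write bit
VMBWLP MOVB *R1+,@VDPWD    * copy bytes; the VDP address auto-increments
       DEC  R2
       JNE  VMBWLP
       LIMI 2              * interrupts back on only after the VDP work is done
       RT
```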
  5. It is never safe to write to the VDP with interrupts enabled. You need to have a `LIMI 0` sometime before you call `BLWP @VMBW`.
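A minimal sketch of that pattern with the E/A utility (the VRAM address, message, and byte count are made up for the example):

```
       REF  VMBW           * multi-byte VRAM write, from the E/A utilities
MSG    TEXT 'HELLO'
       EVEN
       LIMI 0              * no interrupts while VMBW talks to the VDP
       LI   R0,0           * VRAM destination address
       LI   R1,MSG         * source buffer in CPU RAM
       LI   R2,5           * byte count
       BLWP @VMBW
       LIMI 2              * now safe to let the console ISR run again
```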
  6. *nods* Sorry if I came across harsh. It always sounds better and kinder in person.
  7. Use emulation then? Or stick with your PEB. Or sell the PEB to pay for the TIPI... This is a hobby, so spending lots of money is pretty much in the definition. These devices are made by regular people here in the community who put a lot of their own time, money, and effort into creating them. And then even more time and effort to make them into something people can just buy and plug in. This is no small effort, and it is not cheap. People selling and supporting their devices deserve to recoup the time and life they gave up to make them available for others. I'm sure the designs for the TIPI and SAMS are out there for anyone who wants to make their own PCB and assemble the devices. Don't know, I bought them both.
  8. No, not really. I would like to, but the chips I'm using are a PITA to put on a breadboard, sometimes the parts are only available in SMD, and the frequencies are too high (100MHz access to an SRAM or SPI flash is not really going to work on a breadboard). Digital logic is pretty straightforward; the hardest parts are the analog bits, and of course noise (which is only worse on a breadboard). But it depends on what you are doing, so sometimes it makes sense (to prototype on a breadboard), sometimes not. For the F18A, I initially developed using an FPGA devboard that had the FPGA I was going to use, and I made a cable to plug into the 9918A socket on the host computer (99/4A in this case, see photos). Once it was mostly working, I went directly to a custom PCB, and it took three revisions to work out the electronic problems. You can use simulation these days too, to great effect, especially for digital stuff. You can also use HDLs (VHDL, Verilog, etc.) to write simulations, test them, see the timing diagrams, and use that to prove the circuits that you build with discrete logic. HDL is not just for programmable logic. The photos are of the F18A's early days of development.
  9. No need to justify anything, I'm just giving my thoughts as I look at the board. Apologies if it came across any differently. Paying attention to your design rules (based heavily on which PCB house you use), and getting them set up first, will make your life much easier in the long run. Trace / space is a big one; keep-out and distance between components, and via drill to annular ring size, are also important. Having a clear and detailed silk screen will make your future self very happy when you go to do assembly and troubleshooting.
  10. Why is U8 so close to the edge? You have a ton of unused space on the board; I would keep things well back from the edges. Consider spending some time working on the silk screen to make all the labels big, clear, and easily visible, and add any information you need to configure the card. Every component should have a designator, and make sure all pin-1 designators are clear. It is hard to tell, but the input power trace to the regulator looks like any other trace; you are paying for the layer, so use the copper. Also, the regulator is close to the edge, which I realize is typical, but I never understood why they were done that way. IIRC, the regulators would also short out to the metal case if assembly was done incorrectly. No need to perpetuate a problematic design. What trace/space are you using for signals, and what are the specs for the vias? Edit: These are just my thoughts as I look at the board; they are not intended as criticism. Just things you might want to consider.
  11. https://github.com/hneemann/Digital "Direct export of JEDEC files which you can flash to a GAL16v8 or a GAL22v10." GALs are getting harder to find support for, and the best options seem to be some form of PALASM, CUPL, or ABEL, plus various open-source tools. I was recently introduced to Renesas "GreenPak" devices, which come in small packages and can cost as little as $0.50 (a 20-pin device that would easily replace a GAL is about $1.32). There is current software support for Linux, macOS, and Windows, and the screenshots look very schematic-oriented and drag-and-drop. The specs of the devices are nice, and it appears they can go to 5V Vcc, and therefore can support 5V TTL directly. I have not used any yet, but they look nice. https://www.renesas.com/us/en/products/programmable-mixed-signal-asic-ip-products/greenpak-programmable-mixed-signal-products As for validating your PALASM, I hope someone who knows the language and has some spare cycles will chime in. It looks easy enough, and if your chip works as expected, then that is probably the best validation you can get.
  12. What manufacturer and specific part number did you end up with, and what programming tools are you using? When I was doing PAL/GAL programming, I used Lattice ispLEVER Classic, which allows writing the logic using an HDL (VHDL in my case). I never learned PALASM, and although it looks somewhat straightforward, using an HDL would be easier, faster, and less error prone, IMO.
  13. Oh, interesting. If there was a manufacturing reason for a particular orientation, then for sure that would take priority. I didn't know that.
  14. True, but computer graphics on a CRT go back as far as 1963 with Ivan Sutherland's Sketchpad, which could do things I still can't do today. Then in 1968 Douglas Engelbart gave the Mother of All Demos (you should really watch this on YouTube, along with videos about Sketchpad). In 1973 Xerox created the Alto with a 606x808 GUI, mouse, Ethernet, etc. The Alto was not commercially available, but it heavily influenced the 1980 PERQ-1 (which is considered the first commercial workstation). But before that, there were systems with frame buffers doing amazing things. "The Works", the never-finished animated short from the New York Institute of Technology, was started in 1979 with some very impressive 3D rendering capability. And the hardware used for movies like TRON was being built many years before the movie released in 1982. So, we had these great things, then IBM gave us CGA... I understand the memory cost aspect, and memory prices dropped quickly during the 80s, but by then we were stuck with the myth of compatibility. I don't think that was a consideration. IBM was not into computer graphics, and their mainframes were not involved in that kind of stuff. It was mostly research computers from MIT or other schools, or places like PARC. Also, the PC was made in a secret division of IBM in Boca Raton, away from corporate prying eyes and influences, and my understanding from reading various accounts is that they could do pretty much what they wanted when creating the PC. I think cost drove most of the decisions though.
  15. This thread prompted me to better nail down the events of my 99/4A history. After searching through a lot of magazines between 1981 and 1984 (thanks, archive.org!!), I discovered a few things that I never knew, totally forgot, or was just oblivious to:

1. I got my 99/4A in July 1983 (I turned 13 in July), right in the middle of the $149 with a $50 rebate phase (so a $99 computer). This was right on the heels of the $299 with a $100 rebate phase, and just before TI announced they were exiting (sometime in the fall), which caused the price to then drop to ~$50 in November 1983. It was interesting to watch the prices dropping every few months in the magazine ads selling systems.
2. I did not realize the price war between TI and Commodore was the 99/4A vs. the VIC20!?!? That was really really really stupid of TI!! Even more stupid than I thought they were. I always thought the price war was between the Atari 400, the C64, and the 99/4A. There was no way TI could compete with the VIC20, and Commodore still got to sell the C64 for a good profit. TI should have kept the 99/4A up in price with the C64 at the very least, and promoted software rather than trying to force licensing. Really stupid.
3. By October 1983 Sears (and everywhere else) was blowing out their TI inventory, and a PEB with an SSSD disk drive, disk controller, 32K, E/A, and LOGO 2 was available for $399. I remember my dad took me to get one, which I really did not understand how we could afford, because we were not that well off. Sears was out of stock, so my dad got a rain-check; I only remember that because I did not know what a rain-check was, I only knew we were leaving without a PEB... Apparently Sears called in mid-December to say our PEB was in stock, and my dad gave it to me for xmas. I had totally forgotten about it by then, and was not expecting that under the tree. Best xmas ever!
4. The 99/4A was released in June 1981; the IBM PC was released in August 1981. It must have been downhill for TI from then on. I was oblivious to computers at this time (11/12 years old), and it was not until 1983, when I got the 99/4A, that I seem to have become aware of, and really interested in, computers other than arcade coin-op games.
5. In Feb 1982, Compaq was started by three TI engineers who got fed up and left. I had no idea.
6. There were awesome computer graphics in the late 70s and early 80s (even before TRON), and the home computers, as well as the lame CGA/EGA in the PC, were just horrible and a huge step backwards. Yeah, I know, memory and cost, but memory prices dropped rapidly, yet we were then stuck with crappy hardware until the mid 90s. IMO, computers were way more interesting BITD!