
landondyer


Everything posted by landondyer

  1. That's likely my fault. The output of the Alcyon C compiler was really, really terrible. I wrote a peephole optimizer using the Unix tool sed(1), using something similar that the 4.2 bsd folks did on the Vax as inspiration (the Vax compiler was also terrible). The script just looked for really easy patterns in the Alcyon compiler's intermediate assembly to collapse into smaller code sequences. I think it reduced the original ROM size by about ten percent. The ROMs went final in May 1985; I think we spent 2-3 weeks towards the end, crunching out bytes to get the code to under 192K. We were well over 210K at one point. No worries, nearly everyone on the Atari software team knew how to make code smaller for ROMs, most of us were game programmers, after all.
  2. Boy howdy, you can say that again. One of the build steps for the ST ROMs was a peephole optimizer I wrote . . . in sed. One pass of a relatively stupid script (100-200 lines IIRC) trimmed the code by over 5%, closer to 10%. Credit where credit is due: I got the basic idea from the build scripts of 4.2bsd for the Vax 11/780 -- the compiler there wasn't all that bright, either. I think the script found redundant loads and other quite local things, like replacing ADD.W with ADDQ.W. This may have been where we did the "line F" compression hack as well (which let us turn a six-byte absolute mode function call into a 2-byte trap, at some cost in execution speed). There are a bunch of other tricks you can play if you've got intermediate assembly language available for a whole ROM, like collapsing common tails of functions, and identifying "nearly identical" functions for rewriting by humans.
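The kind of regex-driven peephole pass described above can be sketched in a few lines; here it is in Python rather than sed, with two illustrative rules (shrinking `add.w #1-8` to `addq.w`, and dropping a load that immediately re-reads what was just stored). The patterns are examples of the technique, not the actual script.

```python
import re

def peephole(lines):
    """One pass of pattern-based cleanup over 68000 intermediate assembly."""
    out = []
    for line in lines:
        # add.w #n,Dx with 1 <= n <= 8 has a shorter addq.w encoding.
        m = re.match(r'\s*add\.w\s+#([1-8]),(d[0-7])\s*$', line)
        if m:
            line = '\taddq.w\t#%s,%s' % (m.group(1), m.group(2))
        out.append(line)

    trimmed = []
    for line in out:
        # Drop a redundant load:  move.w d0,foo / move.w foo,d0
        m = re.match(r'\s*move\.w\s+(\S+),(d[0-7])\s*$', line)
        if m and trimmed:
            prev = re.match(r'\s*move\.w\s+(d[0-7]),(\S+)\s*$', trimmed[-1])
            if prev and prev.group(1) == m.group(2) \
                    and prev.group(2) == m.group(1):
                continue  # the register already holds this value
        trimmed.append(line)
    return trimmed
```

A real pass would run rules to a fixed point and be careful around labels and branch targets; this shows only the core substitution idea.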
  3. "wivax!dyer" is Steve Dyer, not me. Additionally, my start date at Atari was in late October 1982; the post in question is from April 1982. (My first 400/800 game was a Centipede clone, which Atari tried to suppress, and which also got me a job there. Irony, several times over).
  4. There was nothing "backwards compatible" with the 32K decisions made by Apple in the Mac runtime. Mac apps were encouraged to keep their heap blocks relocatable as much of the time as possible, to minimize heap fragmentation. 16-bit PC-relative operations were both nicely position-independent and 2 bytes smaller than instructions that used absolute addresses; smaller code was very, very good on the Mac. Handles were just pointers into non-relocatable master-pointer blocks (anywhere in memory) that the memory manager kept up to date as the relocatable blocks were shuffled around. Apps would often call MoreMasters() as part of their startup, so that those non-relocatable master-pointer blocks would be located near the bottom of the heap, again to reduce fragmentation. GemDOS doesn't even have a heap. It just gives the launched program "the rest of memory" to do with as it will. The GemDOS loader and various simple tools can generate relocation information that allows absolute addresses to be fixed up. It's not clever; it's just about the simplest thing you can do and still be able to load a file off of disk and run it. (When I worked on MPW, I ran a little project that effectively removed the 32K limits on code and data for the 68K Macs. By then the increase in code size didn't matter as much, since most systems were selling with 4MB or more of RAM).
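The load-time fixup scheme described above can be sketched as follows. This is a simplified model in Python, not the real GemDOS fixup encoding (which packs the offsets as a compact byte-delta stream after the image): the program is linked as if loaded at address 0, and the loader adds the actual load base to each absolute longword the fixup list points at.

```python
def relocate(image: bytearray, fixup_offsets, load_base: int) -> bytearray:
    """Add load_base to each absolute 32-bit address named in fixup_offsets.

    The 68000 is big-endian, so longwords are stored most significant
    byte first. Offsets and the fixup list here are illustrative.
    """
    for off in fixup_offsets:
        absolute = int.from_bytes(image[off:off + 4], 'big')
        image[off:off + 4] = (absolute + load_base).to_bytes(4, 'big')
    return image
```

That single add per fixup is the whole trick: no heap, no handles, no position independence required of the program.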
  5. The Mac was hardly backwards; it had some of the most sophisticated graphics and toolkit functionality on the market. And while the PC sucked, Microsoft was doing pushups and it would stop sucking in '91 or so. Apple and Microsoft had many, many more engineers than Atari did, and the writing was on the wall -- Atari was going to get buried by better software, with more breadth than the small team at Atari could ever hope to match. I'm not sure what Atari could have done to be a success. They could compete on price, to a point. But Atari management was never that great at defining and providing a software platform, and there were maybe 20 software developers at Atari, and that made it really difficult to compete in the long term.
  6. The ROMs are definitely still copyrighted. I don't know if the entity that owns the rights even knows they do (or perhaps there are multiple entities who think they have the rights, but only one does), but it's certainly owned by someone. You might check the ROMs for Atari's "copyright bitmap". It's like 8 or 10 bytes of random gunk, not ASCII or ATASCII or 6502 code, that Atari wanted developers to put into games so that ownership could be proved. Basically they'd ask an offender "Hey, what are these bytes for?" and they'd be unable to answer, because they had no functionality at all and were just there as a signature. (No, I don't remember the bit pattern...)
  7. Give my regards to Mike. The AMAC / editor combination was a real winner; that editor was one of the better ones that I used on the 8-bit line.
  8. CAMAC stands for "Cross AMAC" and the only version I know of ran on the DG MV/8000. The native (non-cross) version, AMAC, ran on the Atari home computers. That's about all I know -- the tools / systems folks in the Home Computer Division took care of it, it was pretty much a black box to me. I don't even know what it was written in. (It was a nice assembler. I have fond memories of it, and I stole some features from it for a series of assemblers I wrote a few years later).
  9. John Hoenig and I collaborated on this. I wondered out loud at him one day if it were possible to do a cheap MMU, say, base-and-bounds, like mainframes of yore. He was dubious about an adder. "If you can do this in about one gate delay, we can probably implement it," and that was the challenge he gave me. Carry propagation time ruled out doing address translation with an adder, or generating traps with a bounds check; and you needed more than one address range to handle the stack the way that Unix wanted. Too many gates. So I thought about it for a while, realized the "all 1s or all 0s in the upper bits" hack while out on a walk, then John and I spent maybe an hour working out some details (it was his idea to map the ROMs in -- I didn't think that was necessary, but it was cheap to do). I think that he kind of snuck it in to the memory controller design; I'm not sure that Shiraz knew it was in there. I really like the design, and it would have been fun to port Unix to the box, but there was never enough time to do that.
  10. This is the MMU that I described in my blog. We called it the GMMU, for "Garbage Dump MMU", which is where we thought it up. I mentioned to John Hoenig that it would be great to have an MMU on the ST. John said, "Real MMUs take too many gate delays" and challenged me to come up with a design that could be implemented in minimum gates, with just a gate delay or two in the data (well, address) path. A bunch of walks out to the Sunnyvale settling ponds and beyond, we had it. I think it's kind of neat, though Atari never shipped an OS that took advantage of it. I think the chips started coming back just about the time I'd decided to leave Atari and go to Apple. And Atari never had enough engineers to really follow through (sigh).
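For what it's worth, here is one reading of the "all 1s or all 0s in the upper bits" trick as a sketch; the bus width, the number of bits checked, and the two-window interpretation are assumptions, not the actual gate-level design. The point is that the validity test needs no adder and no carry chain -- just a wide AND and a wide NOR on the upper address bits, which is about one gate delay.

```python
ADDR_BITS = 24            # 68000 external address bus (assumed)
CHECK_BITS = 8            # upper bits the hardware examines (assumed)
UPPER_SHIFT = ADDR_BITS - CHECK_BITS
UPPER_MASK = (1 << CHECK_BITS) - 1

def access_ok(addr: int) -> bool:
    """True if the upper address bits are all 0s (low RAM window)
    or all 1s (top window, where ROM/hardware could be mapped in)."""
    upper = (addr >> UPPER_SHIFT) & UPPER_MASK
    return upper == 0 or upper == UPPER_MASK
```

Anything in between faults, giving a user process its own low region plus the shared top region without ever computing a sum or a comparison against a bounds register.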
  11. Interim computers (whatever worked; think Apple Lisas and some random 68K systems) until STs with hard drives were available. We also used Vaxes (running VMS and bsd Unix). The hardware folks had some CAD systems (Mentor, I think). No PCs during my time there. Much of the financial stuff was being done on the VAX/VMS system, but I don't know the packages they used.
  12. Hey, I was a little older than that. I think I reached legal drinking age in Maryland before I finished that game :-) Looking back on it, it's a mess. The colors suck pretty hard. There are bad animation artifacts on the centipede segments. The segments interact in awful ways that are pretty much guaranteed to wipe the player out no matter what they do (I claim that it's also possible to get unavoidably wedged by barrels in DK, since a lot of their movement is random and with no guarantee of survivability, though it doesn't happen too often and it always looks like the player's fault). I remember being really depressed when Atari sent me the rejection letter for the game. Like, crawl-under-the-bed-for-a-while depressed. I stuck the "public domain" notice on the title page, found a user group meeting in the DC area, gave a quick demo of the game and said "Here you go, have fun." I guess it all spread from there. Glad you liked it :-)
  13. That's pretty neat. My God, that stuff is 30 years old, and people are still hacking on it? Yike. Here's a little history about that mode. I have a habit of putting contractors out of work. The 68K stuff in MadMAC was nearly done when I got wind that someone in the group doing the 7800 (on the other side of Building 1196 -- it might have been Richard Frick's group) had hired a contractor to write a 6502 assembler, since there was nothing very good available for developers to do cross-assembly with. I'd written a series of assemblers as a hobby over the past couple of years, and now had nearly finished MadMAC, and my curiosity got the better of me. I wondered "What does an assembler written by a professional contractor look like? I'll bet there are some cool techniques, and hey, we own the source code, so..." I was system operator on the VAXes we were using, so I logged in as superuser, dug into the contractor's account and started reading what they'd produced. The contractor (a couple of guys, actually) had taken several weeks and so far only had some notes and a little bit of code. It was clear that they had never written an assembler before and had little idea what they were doing. Their assembler was going to take 3-4 months to finish, and it wasn't going to have conditionals, or macros, or listings. In short, it looked like my Hobby Assembler #2 (which I thought was unusable for production). I was sorely disappointed; honestly, it was depressing to see how prosaic their effort was, and I was kind of mad that we were paying some not-very-with-it guys money to do a miserable product. (That's assuming they *would* be paid, which was not exactly a sure thing in those days at Atari, by the way). So I told Rich that he might not need those guys. "I can do that work in a week or two." He didn't believe me, so I just did it, over the next week or so.
It was easy because most of the assembler infrastructure was already there in the MadMAC code, and being slightly angry (and having three or four other hobby assemblers under my belt) made it even easier. Then I told Rich, "Hey, we've got a 6502 assembler now, you don't need those contractors." It took a little convincing, but in the end we saved the company a chunk of money, and MadMAC's 6502 mode, clunky as it was, was almost certainly better than the tool we would have gotten from the contractors. I said it was a habit, to put contractors out of work, so here's another example.
  14. This was a really deliberate decision. In the arcade, the meta-game that the console manufacturer plays with the arcade owner is that of play time. Anything (well, within reason) that can be used to extend the play time of the unit can create artificial demand and push the arcade to buy more machines to meet it. Thus many games had cute cartoons and title screens and other time-wasters built in. Arcade owners would be happier if games lasted ten seconds before you had to plug another quarter in. Players wanted to play for hours for twenty-five cents. Some tension there. Also, players enjoy the cartoons. With a cartridge, the company already has all the quarters they're ever going to get. There's little point in sucking up player time, and you might as well let the user enjoy the game play at maximum velocity. And that intro cartoon gets really tiresome after a while. I still wanted to put the intro animation into DK, and had enough ROM to squeeze it in (it might have been a few hundred bytes, maybe more for the sounds). I also figured that an attract mode in a store would pull in more sales (I have no idea if this worked; there was no way to do A/B testing). I should add that Atari management was basically clueless about my decisions, and they would have been happy with anything that displayed a monkey that they could sell for fifty bucks. I still think that little touches are important.
  15. This is a good time to say this, after 8+ months of your improvements and much discussion: I'm red-green colorblind. [ducking]
  16. The girder level was removed because there was insufficient space. I don't think the game play suffers because of the change. Could have done scrolling (I proposed doing this) but it would have looked strange. Likewise, compressing the vertical space of all the objects on the screen would have looked pretty bad. Removing the level of girders was a reasonable compromise.
  17. On second viewing, Kong looks far less like a teddy-bear now. One of you time-travelers go back and hand those fixes over, k?
  18. The decision to start the game immediately was conscious. Artificially extended play time is good for Arcade machines, because that means an arcade owner may need to buy multiple machines to satisfy demand. In the case of a home game cartridge, you already *have* the user's quarter, so to speak, so making them wait through a long intro on every play is unnecessary (and irritating). The attract mode was there for traditional reasons, to drive sales in kiosks . . . and somehow it wasn't Kong without the music and the platform collapse. I should add that no one in Atari HCD marketing ever told me to do those things. I don't think they knew enough to even ask.
  19. That's pretty nice. I think it would be awesome for someone (not me!) to re-do the XOR graphics. With more memory available it should be possible to do a masked draw-and-repair rather than a repeated XOR; that would make the artifacts when things overlap go away. Those always bugged the hell out of me, but I didn't have time (or memory) to do anything about it. You'll have to make masks for the bitmaps; shouldn't be hard. It also might be possible to reposition a player or missile with a VBI to get white into Kong's face. And fix Mario's final death frame. Argh, that was embarrassing. :-)
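The masked draw-and-repair idea suggested above can be sketched on a flat 1-bit-per-pixel byte buffer (the function names and buffer layout are illustrative, not 400/800 specifics). Drawing saves the background, clears through the mask, then ORs in the image; repair just copies the saved background back, so overlapping objects never show XOR artifacts.

```python
def draw(screen: bytearray, pos: int, image: bytes, mask: bytes) -> bytes:
    """Masked blit: returns the saved background for later repair."""
    saved = bytes(screen[pos:pos + len(image)])   # remember what was there
    for i, (img, msk) in enumerate(zip(image, mask)):
        # Clear the masked pixels, then OR in the sprite data.
        screen[pos + i] = (screen[pos + i] & ~msk & 0xFF) | img
    return saved

def repair(screen: bytearray, pos: int, saved: bytes) -> None:
    """Restore the background saved by draw()."""
    screen[pos:pos + len(saved)] = saved
```

The cost over XOR is the saved-background buffer and the mask data per bitmap, which is exactly the memory the original cartridge didn't have.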
  20. On the 400/800, "Soft sprites" (heck, it's just bitmap animation) potentially give you more colors, and more objects on a line. For instance, in Super Pac-Man pretty much everything is done by animating character graphics cells (all the ghosts, the fruit, etc.), and Pac-Man uses two players and one missile (a 17-pixel wide circle looks way better than a 16-pixel wide one). I think that the eyes of some of the ghosts are done with missiles positioned over the ghost, an effect that nobody ever noticed, but kind of fun to do (and it probably got an extra color). Restricting your moving objects to the hardware sprites is pretty limiting, and I saw a lot of games that fell into the trap of "welp, we've animated all we can with those things, and we're done," the result being a title that looked kind of spare and unengaging. Those objects are just tools, and sometimes you're better off using them as "flair" rather than the primary vehicle for your animation.
  21. Why would you need multiply or divide to position a sprite? (If you're using floats for coordinates you're probably leaving a lot of performance on the table. Consider using fixed point). Actually doing the masking and rendering operations is similar on both CPUs, though the 68000 can do them faster. On both platforms you probably want to pre-shift images, unroll loops, etc. I found collision detection based on multiple bounding boxes to be cheap and good for tuning the player experience. Pixel-perfect collision detection isn't very friendly (for instance, every time I saw someone lose a life in Caverns of Mars when one of the ship's pixels hit the cavern walls, I died a little inside; what an awful way to treat a player).
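A small sketch of the two suggestions above -- 8.8 fixed-point positioning and multiple bounding boxes per object -- with the format and the box shapes as illustrative assumptions:

```python
FIX = 8  # 8.8 fixed point: high byte is the pixel, low byte is the fraction

def move(x_fixed: int, vel_fixed: int) -> int:
    """Sub-pixel motion is one add per frame; no floats, no multiply."""
    return x_fixed + vel_fixed

def pixel(x_fixed: int) -> int:
    """The screen/sprite-register position is just the integer part."""
    return x_fixed >> FIX

def boxes_hit(boxes_a, boxes_b) -> bool:
    """Each box is (x, y, w, h) in pixels; any overlapping pair collides.

    Shrinking the boxes relative to the bitmaps is the tuning knob
    that makes collisions feel fair to the player.
    """
    return any(ax < bx + bw and bx < ax + aw and
               ay < by + bh and by < ay + ah
               for (ax, ay, aw, ah) in boxes_a
               for (bx, by, bw, bh) in boxes_b)
```

On a 6502 or 68000 the same idea is an add for motion, a shift (or just taking the high byte) for position, and a handful of compares per box pair.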
  22. As far as I know, no games issued from Atari that were written in FORTH. Certainly no big titles. The language was fun to play with, but ultimately too slow (and a generally write-only class of unmaintainable) for production use. Donkey Kong Junior for the Atari 400/800 was started in FORTH, and it was a disaster on rollerskates -- really a spectacular mix of arrogance, self-delusion and lying -- until the programmer was fired and it was re-written in assembly (by Jeff Milhorn and Kevin Sacher, IIRC). FORTH was used in the bring-up of the Atari ST hardware, and it worked well there. It was an in-house adaptation of something (probably 68000 FigFORTH) on a cartridge, and used to exercise registers and do limited testing. We software types were quite happy that the hardware engineers were able to try their own stuff out and fix bugs in it before (a) committing to silicon and (b) passing the ball to us :-)
  23. I had an 810 before I had an Atari computer. I bought an 810 for use on a home-brew Z-80 system; I wrote a bit-banging SIO driver for CP/M and did a bunch of work for my college courses using a C compiler, and wrote papers using Mince and Scribble. The 810 was slow, but it fit my student budget. (At one point I had a question about how the SIO protocol worked, so I called up Atari and managed to weasel my way into talking to an engineer, who I believe was Rob Zdybel, and who I worked with pretty closely a few years later. Small world). Then I bought an Atari 400 so I could write games. A couple years later I got an Atari 800 and an Axlon 128K RAM-disk, and that setup (along with the Synapse Assembler) was just unbelievably goddamn screaming fast and was the bee's knees, though I never wrote anything serious for the Atari after that, given the state of the industry. In the 80s it always seemed that we got the great tools just in time for the world to move on to something else.
  24. Comments are a tool to help code understanding. The comments in DK were written (by me) when I first wrote the code. I like to think of it this way: If you can't explain what's going on to someone else, you don't understand it yourself. Also, reading uncommented code, especially assembly language, is horrible. (The original 68K Macintosh ROMs were highly commented 68000 assembly language. Most good bodies of assembly language are very well commented. If you write crappy code, most places will fire you eventually). There's no magic in writing code like this. You figure out the data structures and algorithm, translate that to instructions (maybe via pseudo code, that's usually how I did it), and debug what you wrote. No "moving barrels around" and having the computer figure something out; *you* have to write the code that moves the barrels, not the other way around. The dates are real. Those comments are really just snarky ways of saying "I was working pretty hard" (and in retrospect this project went together really fast). IDS stands for "Internal Design Spec", by the way. Assembly language isn't all that esoteric. There are a lot of fiddly details and it's really primitive, but the low-level details of CPUs are generally well documented. It's not esoteric, it's just that it hurts to write it and normally nobody does it if they can avoid it. :-)