Posts posted by Chilly Willy

  1. I mainly said "probably" because I haven't checked on the 6502 market in years, and had no idea if anyone was making something like an MCU based on the 6502 but running at stupid fast speeds. You can still find 68xx MCUs like that. Or 80xx MCUs. I've finally coughed up the dough for a MiSTer, and one thing I'd like to do is play around with my own FPGA cores for things.

     

    Thanks for the link on efabless - that looks pretty cool.

  2. On 9/28/2022 at 12:41 AM, JamesD said:

    Pretty sure it was a Rockwell microcontroller.

    Yeah, it was a Rockwell chip. Don't remember the number offhand, but it was definitely Rockwell.

     

    EDIT: The R65C00/21 and R65C29. You can find the datasheet as part of the larger book, 1985 Rockwell Data Book: https://archive.org/details/bitsavers_rockwelldaDataBook_80778847

    Specifically, page 3-3 (which is pg 530 in the PDF).

     

    On 9/28/2022 at 5:48 PM, Ricky Spanish said:

    More MHz is what's needed.

    You can probably get more MHz with an FPGA 6502 than with any of the 6502s (or equivalents) on sale. At the same time, you can make it multi-core and add some extras to it.

  3. It's not so much additional layers as the Genesis VDP has what it has, and you'd need something like the 32X to add more to the graphics. What it IS is having the horsepower to brute force changing the graphics on the fly so that the layers the Genesis has appear to be more than they are. Like, you take two backgrounds that use the same palette and merge them together with some logic using the ARM processor on the cart, then upload that as a single layer to the Genesis. Or maybe you're merging three layers to one... or whatever you need and have the power to do. The game probably spends almost every available cycle DMAing to VRAM, but the game is running on the ARM processor, so who cares? That's how some Atari 2600 carts work, and the results speak for themselves. 😎
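    To make that merge step concrete, here's a minimal sketch in C of combining two 4bpp tiles into one, treating palette index 0 as transparent. The function name and the tile-by-tile approach are just illustration, not how any particular cart actually does it; Genesis tiles are 32 bytes, two pixels per byte.

        #include <stdint.h>

        #define TILE_BYTES 32  /* 8x8 pixels at 4 bits per pixel */

        /* Hypothetical helper: overlay tile 'top' onto tile 'bottom'.
           Both layers must share one palette, and index 0 is transparent. */
        static void merge_tile(const uint8_t *bottom, const uint8_t *top,
                               uint8_t *out)
        {
            for (int i = 0; i < TILE_BYTES; i++) {
                uint8_t hi = top[i] & 0xF0;  /* left pixel of the byte */
                uint8_t lo = top[i] & 0x0F;  /* right pixel of the byte */
                /* keep the top pixel unless it's index 0 (transparent) */
                out[i] = (hi ? hi : (bottom[i] & 0xF0)) |
                         (lo ? lo : (bottom[i] & 0x0F));
            }
        }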

  4. Both the Saturn and the 32X used SDRAM. On the Saturn, HIMEM was 1MB of 32-bit wide SDRAM (two 256Kx16 chips); this was where the main game code and data were to reside. LOMEM was 1MB of 16-bit wide DRAM, and was for extra data storage. While you could put code in it, you wouldn't want to for speed reasons.

     

    The 32X used 256KB of 16-bit wide SDRAM (one 128Kx16 chip). You want most of your code in the rom, but critical code (like interrupt handlers) should be in SDRAM. The cache in the SH2 keeps the rom code from being too slow, so try not to thrash the cache too much. Evicting cached code forces it to be reloaded from rom, which is slow.
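    With gcc, one way to get critical code into SDRAM is a section attribute plus a matching entry in the linker script. The ".sdram" section name below is hypothetical - whatever your crt0/linker script actually uses is what goes there:

        /* Hypothetical section name; the linker script decides where
           ".sdram" actually lands and the crt0 copies it from rom. */
        __attribute__((section(".sdram"), noinline))
        void vint_handler(void)
        {
            /* time-critical code runs from SDRAM instead of rom */
        }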

  5. I'd guess that KRIKzz is looking at the 32X support in MiSTer, which uses an FPGA to emulate a bunch of different consoles/arcades. If the FPGA in the MED Pro is big enough, it should be able to handle this. However, I don't see how you can get around the need for cables like the 32X... unless... perhaps instead of new code for the existing MED Pro, he is instead making an FPGA-based Neptune. Now THAT would be hella-cool! 😎

  6. 7 hours ago, agradeneu said:

    Home computer or console development community, e.g. Atari ST was very proficient with Assembly.  I think any proficient game/demo development was common with 68K. It was nothing exotic. 

     

     

    I was working on the Amiga at the time, and I agree. Everything but my last project on the Amiga was 100% assembly. The last Amiga product I worked on was done mostly with SAS/C. Between the ST, the Amiga, and the Mac, there were a LOT of people used to programming the 680x0 in assembly. When the PPC accelerators started becoming more dominant, I did a few things on it with StormC.

  7. Nearly all the libraries for the Sega Saturn are assembly. That was the generation of consoles where games started being programmed in C, but a considerable number were still all assembly. It was also about this time that compilers were finally generating stable and (reasonably) fast code. The next generation of consoles moved almost entirely to C/C++, and assembly fell by the wayside. Not having a good JRISC compiler didn't help the Jaguar any, but it also wasn't the reason it failed. Too much weight is put on that belief given the time period the Jaguar occupied. The Jaguar 2 would have been the console that NEEDED a good compiler. People today say the Jaguar needs a good C compiler because it would make our job doing homebrew easier. That affects many of us, myself included; I work more on the 32X because it has great compiler support. I use the latest gcc, which can be almost as fast as the very best hand-written assembly.

  8. 1 hour ago, Kirk_Johnston said:

    So, to be clear in a way the average person can understand easily, the 1344–1808 number in the original post above, as used in the context of the segaretro article regarding Genesis' specs and unique background tiles, is basically talking about a roughly maximum amount of possible unique tiles you can have available in VRAM for any/all background layers (plus sprites), once you take all the tilemap data and stuff like that into account too?

     

    Yes. You start with 64KB, then start subtracting things. For example, a single name table in 64x64 mode takes 8KB of ram (64*64*2). If both layers are used in the same mode, that's 16KB from the 64KB total, so 48KB left. Let's say you use line mode for horizontal scrolling - that takes another 1KB of ram (256*2*2). Let's also say you use all 80 sprites - that's another 640 bytes of ram (80*8). Assuming nothing else, that leaves (64-16-1-.625)*1024/32 = 1484 tiles of ram left. If you don't use the line scrolling, you get more tiles. If you only use one layer, you get more tiles. If you use the Window layer for a HUD, you get fewer tiles. If you use more than 80 sprite entries (to make it easier to switch which 80 display at any time), you get fewer tiles. Maybe you make the scroll layers 64x32 to save ram - then you get more tiles. You, the programmer, control the ram directly on hardware like this, and you need to plan just what you need vs what you have available.
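    The same arithmetic, spelled out as code (the numbers match the worked example above):

        #include <stdio.h>

        int main(void)
        {
            unsigned vram    = 64 * 1024;       /* total VRAM in bytes */
            unsigned planes  = 2 * 64 * 64 * 2; /* A+B name tables, 64x64 mode */
            unsigned hscroll = 256 * 2 * 2;     /* line scroll, 2 bytes x 2 layers */
            unsigned sprites = 80 * 8;          /* sprite attribute table */

            printf("%u tiles free\n",
                   (vram - planes - hscroll - sprites) / 32); /* prints 1484 */
            return 0;
        }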

     

     

  9. Each layer (and sprites) can refer to any or all tiles. There are no tiles restricted to a specific layer or sprite... well unless you have the VDP hooked to 128KB of VRAM, but that is not a concern for the MegaDrive, only the TeraDrive. Tile 0 starts at 0x0000 in the vram, and tile 2047 starts at 0xFFE0 in the vram. The tile number is indeed just the address of the tile data in vram >> 5 (divided by 32, the number of bytes in a tile). Any vram not being used for something else (like a name table) can hold tile data and be used by anything that refers to tile numbers.
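    As trivial helpers, here is the conversion both ways - nothing beyond the shift described above:

        #include <stdint.h>

        static inline uint16_t tile_to_addr(uint16_t tile) { return tile << 5; }
        static inline uint16_t addr_to_tile(uint16_t addr) { return addr >> 5; }
        /* addr_to_tile(0xFFE0) == 2047, tile_to_addr(0) == 0x0000 */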

  10. VRAM space is used up by the layer A and B name tables (which can be different sizes like 64x32 or 64x64), the Window layer name table, the VSCROLL table (if used), and the sprite attribute table. All of those are mostly variable in size, or may not be used at all. This takes away from the total number of tiles you can have. Tiles can fill all the rest of vram, and are shared by all layers and the sprites. Games like Sonic keep most of the tiles in vram, then load a few on the fly from rom as needed by where you are in the map.

  11. The reason the stats drop from 40 tile to 32 tile mode is that the console physically lowers the clock the VDP uses to fetch data between the two modes. A slower clock means fewer fetch slots, which means lower stats. Actually, it's really the other way around... in 40 tile mode, they raise the clock to fetch the extra data needed to display more pixels per line, and as a result, the stats go up.
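    Putting rough numbers on that, assuming the commonly cited NTSC master clock (PAL differs slightly):

        #include <stdio.h>

        #define MCLK_NTSC 53693175.0  /* NTSC master clock, Hz */

        int main(void)
        {
            printf("H32 dot clock: %.2f MHz\n", MCLK_NTSC / 10.0 / 1e6); /* ~5.37 */
            printf("H40 dot clock: %.2f MHz\n", MCLK_NTSC /  8.0 / 1e6); /* ~6.71 */
            /* more clocks per line in H40 = more fetch slots = higher stats */
            return 0;
        }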

  12. I'd guess they're talking about the fact that the priority is set on a tile basis. The name tables for each plane use a word for each entry, and those entries hold the h/v flip, the palette number, the tile number, and the priority. So on a single line, each tile can be set in front of or behind the other layer/sprites, depending on the priorities of the tiles in the other plane and the sprite priorities.
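    A small sketch of packing one of those word entries, following the usual layout from the docs (priority in bit 15, palette in bits 14-13, v/h flip in 12/11, tile index in 10-0):

        #include <stdint.h>

        static inline uint16_t nt_entry(unsigned prio, unsigned pal,
                                        unsigned vf, unsigned hf, unsigned tile)
        {
            return (uint16_t)(((prio & 1) << 15) | ((pal & 3) << 13) |
                              ((vf & 1) << 12) | ((hf & 1) << 11) |
                              (tile & 0x7FF));
        }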

     

  13. 19 hours ago, JFD62780 said:

    ...I'm already seeing less detail in the map of E1M1...

    Could the less detailed map(s) be a part of why the 2.1 patch is a tad smaller than 2.0?

     

    No. There were no changes made to the level layout or details. If you can find any differences between 2.0 and 2.1 and screenshot them, that would be helpful.

  14. On 5/20/2022 at 10:55 AM, 42bs said:

    Yes, but not very well documented. Back in time I thought that code documentation is for lamers.

     

    From Real Programmers Don't Eat Quiche - "Real Programmers don't comment. If it was hard to write, it should be harder to read, and even harder to modify."


  15. I bought my link cable from Tototek many years ago. This was my first time using it in a real setting. It was a real experience. :D

     

    I'd worked on light gun support for SGDK, so I had experience using the external interrupt (both light guns and the link cable rely on the controller port's ability to interrupt the 68000 via the TH line). Now there's a thought - light gun support for Doom.
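    For anyone curious, a hedged sketch of what enabling that looks like - IE2 in VDP register $0B plus the TH interrupt bit in the port control register, with the 68000 taking it as a level 2 autovector. Treat the details as from memory and check them against the docs:

        #include <stdint.h>

        #define VDP_CTRL    (*(volatile uint16_t *)0xC00004)
        #define PORT1_CTRL  (*(volatile uint8_t *)0xA10009)

        void enable_th_int(void)
        {
            /* reg $0B: set IE2 (bit 3); note this write also selects
               full-screen scroll modes - OR in your scroll bits as needed */
            VDP_CTRL = 0x8B08;
            PORT1_CTRL |= 0x80;  /* bit 7: interrupt on TH transition */
            /* your handler goes at the level 2 autovector ($68) */
        }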

     

    You can always learn new things. I learned quite a bit myself while working on D32XR. I really need to update my tools and some demos to get the latest core 32X support code out. Right now, I point people to the D32XR repo for the latest 32X code. I also really need to update Wolf32X some time.

  16. The improved color was mainly because most people described the graphics as "muddy" and "dark". And they could be, especially on TV sets. So a few improvements were made to make that less of an issue. I will admit I like the new color scheme better. I was one of those who thought it dark and muddy.

     

    Barone and I did a lot of testing on multi-player via link cable. I did most of the hardware coding on that, with Barone and Vic keeping me working on bugs, like when my code would crash the game if the other system wasn't running... I hadn't even thought about the fact that most people were playing on a standalone system. I had two connected together, why wouldn't everyone? And ask Barone about the noise when the two systems' soundtracks got out of sync. In multi-player link mode, one of the consoles would often have its music slow down for what were then unknown reasons. Took me a while to find that one.

     

    Meanwhile, Vic just kept adding improvements to everything else in the background. Much of the v2.0 changelog made it into the feature list for the game. So many improvements just cried out for a major version bump. This wasn't 1.6, it was 2.0!

  17. On 5/2/2022 at 11:45 AM, agradeneu said:

    Atari really fucked up their consoles, they did not understand videogames at all I think. They were all about hardware and price.

    They didn't understand HOME video games. They were still king in the arcade, and remained so for some time to come. At one point, that was almost their only presence. To be fair, most companies have screwed the pooch a time or two (or three or four) in the home market. Look at Sega.

  18. Nintendo tends to do their thing with little thought to how the other companies are doing their own. Most people think of the N64 as late to the party for the PS1/Saturn generation. You can think of it that way. As a programmer, it just doesn't feel like the same generation to me. The main processor supported floating point, an MMU, and kernel/user mode, all things found on the next generation, and none of the previous. The GPU was capable of perspective correct rendering, another thing only found on the next generation rather than the previous. It had much more memory (4 or 8 MB). The clock rates were far higher than the previous generation. Nintendo designed their SDK to isolate the developer from the hardware, again like the next generation and completely opposed to the previous.

     

    If you think of it as the first of the next gen, all of these features make more sense. However, being first means everyone after you can see what you've done and try to do better, much as Nintendo did to Sega in the Genesis/SNES generation. The way the SDK for the N64 works, it seems Nintendo intended their next console to be an updated version of the N64. That clearly didn't happen, and Nintendo moved to a new architecture, one it WOULD stick with for at least a couple more generations.

     

    Most people think of the start of the next generation as being the Dreamcast in 1998. But if you look at the DC, it's hardly much more powerful or advanced than the N64. Clearly Sega was just seeing what Nintendo did, then doing it a bit better. Then Sony did it a little better with the PS2. MS was the wild card in this generation, as it wasn't really clear if or when they'd throw their hat into the ring. At one point, it was just Nintendo, Sega, and Sony one-upping each other in turn. But the hardware and SDK for the N64 look more like the sixth gen than the fifth, which is why I consider it first in that gen rather than the DC. It came out about half-way between the two generations, so you could easily include it in either from a release-time point of view.

    However, being so early meant the N64 ran out its lifespan in the middle of what most people call the sixth generation, so there wasn't much pressure on Nintendo to make as big an improvement in their next system as there had been for the N64. The GameCube seems more like the rest of the sixth generation systems, so they consider it part of that generation.

    I think of it more as Nintendo further divorcing themselves from what the rest of the game community was doing. They were taking themselves out of the race to make the "best" console. They started making improvements to the console that met the needs of the next generation of GAMES rather than trying to match/exceed the next generation of hardware. They realized that it was the games that were important, not the hardware. As long as the hardware was capable of what the software needed, they didn't need to compete in the hardware wars like everyone else. Every generation has taken Nintendo even further out of that race, leaving it to Sony and MS to keep driving hardware further. It is kinda funny that in the end, Sony and MS have "standardized" to basically a PC with near identical specs. They also seem to be getting the idea that it's the games that are important, not the hardware.

  19. 1 minute ago, Jag64 said:

    Ok. Cool. Gonna try this again:

    Can the Atari Jaguar produce a game on par with the N64? Please answer yes or no.

    Yes... and no. It depends on the game. For a pure 3D game like Mario64, the Jaguar isn't up to the task. At least, not compared to the N64. For a 2D game, the Jaguar could compete with any of the other consoles - the example of that being Rayman. The Jaguar version is easily as good as any other port of that game.

     

    Let's face it, the Jaguar's 3D is rudimentary at best, needing a lot of babying to get good results. The N64 has a full-on GPU that was on par with or ahead of what was available for the PC at the time. If you ran your Jaguar game code on the GPU in local ram, the N64 main processor was almost four times as fast on the same code. If you ran the code on the 68000, the N64 did loops around it running backwards on its hands. It's really not fair to compare the N64 with the Jaguar. I don't think of the N64 as the last of its generation of consoles; I think of it as the FIRST of the next generation that included the PS2 and Xbox.

  20. Who said the N64 didn't have bottlenecks? I certainly didn't. In fact, I specifically said that the only reason the N64 had any speed at all is that its ram was stupid fast. That's the primary bottleneck, same as the Jaguar - all the ram is shared. While graphics are being fetched for display, everyone else has to wait. While graphics are being drawn, everyone else has to wait. While the processor is fetching/storing code/data, everyone else has to wait. Unified ram is a big bottleneck on systems that use it. The main cpu in the N64 at least has decent caches to allow it to stay off the bus most of the time. A cache miss can be a big slow-down if you're not careful, as much of the bus time will be going to the RDP to draw the 3D and the display interface to output the video.

  21. Haven't seen Battlecorps on anyone's list yet. It's on mine... not at the top, of course, but still a favorite. If you like Battlezone-type games, Battlecorps is a good example of that style of game.

  22. Nintendo made the N64 easier to program for. You had one processor that ran "normal" code. It had a robust toolchain, with compilers and assemblers, so you could write the code any way you were comfortable with. The power of the coprocessor for graphics and sound was easy to use. As much as I loathe the "microcode" nonsense Nintendo yammered on about incessantly, it was easy to do everything you needed for 3D, and the audio library was up to playing sound effects while handling a MIDI-ish soundtrack.

     

    About the only real problem was the development system itself - if you thought buying an ST just to program for the Jaguar was bad, imagine being told you needed to buy an SGI instead!

  23. On 4/27/2022 at 9:18 AM, laoo said:

    I feel that the discussion diverges from the OPs intent. I think that the OP asks to point design decisions that turned out to be wrong. So bugs in RISCs and lack of good toolset aren't design decisions. Using 68000 instead of e.g. 68030 is such decision but I don't think it can be called an obvious design flaw. The system certainly would benefit from better processor (especially from caches, pipelining and 32-bit bus). We know that it was considered, and it ended being against the design philosophy where the CPU is only a supervisor to RISC processors and the increase of final cost of the box had been considered unacceptable. We don't know if the machine would be competitive with better processor.

     

    I think using the 68000 actually was a design flaw. Jerry was designed to adjust its external bus to the width of the main processor, so because the 68000 was 16-bit, Jerry had to be as well. Using the M68EC020 would have made a world of difference in a few ways: first, having a 32-bit data bus would have kept Jerry's bus at 32 bits as well; second, its 256 byte instruction cache would have helped tremendously in keeping it off the bus compared to the 68000; third, the instruction timing was much improved over the 68000, so it would have been much faster even at the same clock rate... but in this case, it could have been clocked at a higher rate, perhaps even the same as the JRISC processors. Yes, it was more expensive, but the price was dropping rapidly, as the EC020 was a favorite chip for a lot of folks at the time, be it in appliances or computers.

     

    Another possible contender for main processor would have been the SH1. It also had a 32-bit data bus. It also had a cache (more than the 68020). It was also a favorite in appliances of the time, and was very reasonable in price. When (if) Atari moved to the Jag2, it could have moved to the SH2/3/4, depending on when it would have released.

     
