Chilly Willy


  1. I checked the link from the cart and it says it's still the latest firmware.
  2. I bought my link cable from Tototek many years ago. This was my first time using it in a real setting, and it was quite the experience. I'd worked on light gun support for SGDK, so I had experience using the external interrupt (both light guns and the link cable rely on the controller's ability to interrupt the 68000 using the TH line of the controller port). Now there's a thought - light gun support for Doom. 🤣 You can always learn new things; I learned quite a bit myself while working on D32XR. I really need to update my tools and some demos to get the latest core 32X support code out. Right now, I point people to the D32XR repo for the latest 32X code. I also really need to update Wolf32X some time.
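The TH-interrupt setup mentioned above can be sketched in C. This is my own illustrative sketch, not code from SGDK or D32XR, and the register addresses and bit positions come from my reading of Genesis documentation - double-check them against an official manual before use. The idea: the controller port's CTRL register arms the TH-line interrupt, the VDP's IE2 bit (mode register 3) lets it through, and the 68000 then takes it as a level-2 external interrupt.

```c
#include <assert.h>
#include <stdint.h>

/* Sketch of the TH-line external interrupt setup that both light guns
 * and the link cable rely on. Addresses/bits are my assumptions from
 * Genesis docs, not from this post:
 *   - Port A CTRL register at 0xA10009; bit 7 enables the TH interrupt,
 *     the low bits set pin directions (0 = input).
 *   - VDP mode register 3 (reg 11), bit 3 (IE2) enables the external
 *     interrupt, which arrives as a 68000 level-2 autovector.
 */

#define PORT_A_CTRL   0xA10009u  /* controller port A control register */
#define CTRL_TH_INT   0x80u      /* interrupt on TH transition */

#define VDP_REG_MODE3 11u        /* VDP mode register 3 */
#define VDP_IE2       0x08u      /* external-interrupt enable bit */

/* CTRL value: every pin an input so the peripheral can drive TH,
 * with the TH interrupt enabled. */
static uint8_t th_int_ctrl_value(void)
{
    return CTRL_TH_INT;
}

/* VDP register-set command word: 0x8000 | (reg << 8) | value.
 * On hardware this word is written to the VDP control port. */
static uint16_t vdp_set_mode3(uint8_t value)
{
    return (uint16_t)(0x8000u | (VDP_REG_MODE3 << 8) | value);
}
```

On real hardware you would write `th_int_ctrl_value()` to 0xA10009, send `vdp_set_mode3(VDP_IE2)` to the VDP control port, and install a handler on the level-2 vector; here the functions just compute the register values so the bit layout is visible.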
  3. The improved color was mainly because most people described the graphics as "muddy" and "dark". And it could be, especially on TV sets. So a few improvements were made to make that less of an issue. I will admit I like the new color scheme better - I was one of those who thought it dark and muddy. Barone and I did a lot of testing on multi-player via link cable. I did most of the hardware coding on that, with Barone and Vic keeping me working on bugs, like when my code would crash the game when the other system wasn't running... I hadn't even thought about the fact that most people were playing on a standalone system. I had two connected together, why wouldn't everyone? 😄 And ask Barone about the noise when the two systems' soundtracks got out of sync. In multi-player link mode, one of the consoles would often have its music slow down for what were then unknown reasons. Took me a while to find that one. 😬 Meanwhile, Vic just kept adding improvements to everything else in the background. Much of the v2.0 changelog made it into the feature list for the game. So many improvements just cried out for a major version bump. This wasn't 1.6, it was 2.0!
  4. They didn't understand HOME video games. They were still king in the arcade, and remained so for some time to come. At one point, that was almost their only presence. To be fair, most companies have screwed the pooch a time or two (or three or four) in the home market. Look at Sega.
  5. Nintendo tends to do their thing with little thought to how the other companies are doing theirs. Most people think of the N64 as late to the party for the PS1/Saturn generation, and you can think of it that way. As a programmer, though, it just doesn't feel like the same generation to me. The main processor supported floating point, an MMU, and kernel/user mode - all things found on the next generation, and none of the previous. The GPU was capable of perspective-correct rendering, another thing only found on the next generation rather than the previous. It had much more memory (4 or 8 MB). The clock rates were far higher than the previous generation's. Nintendo designed their SDK to isolate the developer from the hardware, again like the next generation and completely opposed to the previous. If you think of the N64 as the first of the next gen, all of these features feel much more at home.

However, being first means everyone after you can see what you've done and try to do better, much as Nintendo did to Sega in the Genesis/SNES generation. The way the SDK for the N64 works, it seemed Nintendo had intended for their next console to be an updated version of the N64. That clearly didn't happen, and Nintendo moved to a new architecture, one it WOULD stick with for at least a couple more generations.

Most people think of the start of the next generation as being the Dreamcast in 1998. But if you look at the DC, it's hardly much more powerful or advanced than the N64. Clearly Sega was just seeing what Nintendo did, and then doing that a bit better. Then Sony did that a little better with the PS2. MS was the wild card in this generation, as it wasn't really clear if or when they'd throw their hat into the ring. At one point, it was just Nintendo, Sega, and Sony one-upping each other in turn. But the hardware and SDK for the N64 look more like the sixth gen than the fifth, which is why I consider it first in that gen rather than the DC.
It came out about half-way between the two generations, so you could easily include it in either from a release-date point of view. However, being so early meant the N64 ran out its lifespan in the middle of what most people call the sixth generation, so there wasn't much pressure on Nintendo to make as big an improvement to their next system as there had been for the N64. The GameCube seems more like the rest of the sixth-generation systems, so people consider it part of that generation. I think of it more as Nintendo further divorcing themselves from what the rest of the game community was doing. They were taking themselves out of the race to make the "best" console. They started making improvements to their consoles that met the needs of the next generation of GAMES rather than trying to match/exceed the next generation of hardware. They realized that it was the games that were important, not the hardware. As long as the hardware was capable of what the software needed, they didn't need to compete in the hardware wars like everyone else was. Every generation has taken Nintendo even further out of that race, leaving it to Sony and MS to keep driving the hardware forward. It is kinda funny that in the end, Sony and MS have "standardized" on basically a PC with near-identical specs. They also seem to be getting the idea that it's the games that are important, not the hardware.
  6. Yes... and no. It depends on the game. For a pure 3D game like Mario 64, the Jaguar isn't up to the task - at least, not compared to the N64. For a 2D game, the Jaguar could compete with any of the other consoles, the example of that being Rayman: the Jaguar version is easily as good as any other port of that game. Let's face it, the Jaguar's 3D is rudimentary at best, needing a lot of babying to get good results. The N64 has a full-on GPU that was on par with or ahead of what was available for the PC at the time. If you ran your Jaguar game code on the GPU in local ram, the N64 main processor was almost four times as fast on the same code. If you ran the code on the 68000, the N64 did loops around it running backwards on its hands. It's really not that fair to compare the N64 with the Jaguar. I don't think of the N64 as the last of its generation of consoles, I think of it as the FIRST of the next generation that included the PS2 and Xbox.
  7. Who said the N64 didn't have bottlenecks? I certainly didn't. In fact, I specifically said that the only reason the N64 had any speed at all is that its ram was stupid fast. That's the primary bottleneck, same as the Jaguar - all the ram is shared. While graphics are being fetched for display, everyone else has to wait. While graphics are being drawn, everyone else has to wait. While the processor is fetching/storing code/data, everyone else has to wait. Unified ram is a big bottleneck on systems that use it. The main CPU in the N64 at least has decent caches to allow it to stay off the bus most of the time. A cache miss can be a big slow-down if you're not careful, as much of the bus time will be going to the RDP to draw the 3D, and to the display interface to output the video.
  8. Haven't seen Battlecorps on anyone's list yet. It's on mine... not at the top, of course, but still a favorite. If you like Battlezone-type games, Battlecorps is a good example of that style of game.
  9. Nintendo made the N64 easier to program for. You had one processor that ran "normal" code, with a robust toolchain - compilers and assemblers - so you could write the code any way you were comfortable with. The power of the coprocessor for graphics and sound was easy to use. However much I loathe the "microcode" nonsense Nintendo yammered about incessantly, it was easy to do all the things you needed to for 3D, and the audio library was up to playing sound effects while handling a MIDI-ish soundtrack. About the only real problem was the development system itself - if you thought buying an ST just to program for the Jaguar was bad, imagine being told you needed to buy an SGI instead! 😵 🤣
  10. I think using the 68000 actually was a design flaw. Jerry was designed to adjust its external bus to the width of the main processor, so because the 68000 was 16-bit, Jerry had to be as well. Using the M68EC020 would have made a world of difference in a few ways: first, having a 32-bit data bus would have kept Jerry's bus at 32 bits as well; second, it had a 256-byte instruction cache that would have helped tremendously in keeping it off the bus compared to the 68000; third, the instruction timing was much improved over the 68000, so it would have been much faster even at the same clock rate... and in this case, it could have been clocked at a higher rate, perhaps even the same as the JRISC processors. Yes, it was more expensive, but the price was dropping rapidly as the EC020 was a favorite chip for a lot of folks at the time, be it in appliances or computers. Another possible contender for main processor would have been the SH1. It also had a 32-bit data bus and a cache (more than the 68020), was a favorite in appliances of the time, and was very reasonable in price. When (if) Atari moved to the Jag2, it could have moved to the SH2/3/4, depending on when it would have released.
  11. Slight correction here - the N64 is a 64-bit processor on a 16-bit bus. The bus between the main processor and the coprocessor chips was 16 bits wide, and passed packets of data. Even the cart is only 16 bits wide. The 16-bit packet nature of the bus is what contributed to the large latency on cache misses for the CPU. Nintendo also didn't want developers running 64-bit code, as that unnecessarily expanded the amount of data that had to be passed across the bus. The N64 actually runs faster with 32-bit code, and as the N64 was never going to have more than 4GB of anything to access (max ram of 8MB, max rom of 128MB), 64-bit pointers were a complete waste. Saving and restoring 64-bit registers would double the time spent entering and exiting functions. Note that the bus between the RDP and RDRAM was 18 bits - that's the main 16 bits plus the extra two error-detection bits that RDRAM had. It could use those extra two bits to increase the levels of alpha in 16-bit rendering. It's only because RDRAM was ridiculously fast (for the time) that the N64 had any kind of real performance. Imagine trying to draw across a 16-bit bus while the processor is trying to run code/data across the same 16-bit bus to the same 16-bit ram. So in one respect, I agree - the N64 was NOT 64-bit where it truly mattered.
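The "saving and restoring 64-bit registers would double the time" point is simple arithmetic: the same set of callee-saved registers costs twice the stack traffic at 8 bytes each as at 4. A quick sketch (the register count of nine, s0-s7 plus ra, is my example of a typical MIPS callee-saved set, not a figure from the post):

```c
#include <assert.h>
#include <stdint.h>

/* Bytes a function prologue writes (and the epilogue reads back)
 * to preserve its callee-saved registers. With 64-bit stores the
 * traffic doubles versus 32-bit stores for the same register set. */
static uint32_t save_area_bytes(uint32_t saved_regs, uint32_t reg_bytes)
{
    return saved_regs * reg_bytes;
}
```

So a function preserving nine registers moves 36 bytes of save/restore traffic across that shared 16-bit bus in 32-bit mode, and 72 bytes in 64-bit mode, for no benefit when nothing needs more than 32 bits.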
  12. The latest commits in the OpenLara repo were for the GBA and 32X. I follow it fairly closely, as I make my own builds for the 32X to better advise on issues on that platform. In making the 32X version, he seemed to refine the fixed-point branch and migrate some of that back to other fixed-point builds like the GBA. I imagine that making a new port teaches him new things that might help other existing ports. So even if he hasn't gotten back to the Jaguar port yet, he's learning lessons that will help it eventually.
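For readers unfamiliar with the fixed-point branch mentioned above: fixed-point math packs a fraction into an integer so FPU-less CPUs like the 32X's SH2s or the GBA's ARM7 can do fractional arithmetic at integer speed. This is a generic 16.16 sketch of the technique, not OpenLara's actual code:

```c
#include <assert.h>
#include <stdint.h>

/* 16.16 fixed point: upper 16 bits integer, lower 16 bits fraction.
 * A generic illustration of the technique, not OpenLara's types. */
typedef int32_t fix16;

#define FIX_ONE (1 << 16)

static fix16   fix_from_int(int32_t i) { return i * FIX_ONE; }
static int32_t fix_to_int(fix16 f)     { return f >> 16; }

/* Multiply through a 64-bit intermediate so the doubled fraction
 * bits don't overflow before the >> 16 rescale. */
static fix16 fix_mul(fix16 a, fix16 b)
{
    return (fix16)(((int64_t)a * b) >> 16);
}

/* Divide by pre-scaling the dividend up 16 bits. */
static fix16 fix_div(fix16 a, fix16 b)
{
    return (fix16)(((int64_t)a << 16) / b);
}
```

The win on those CPUs is that every operation above compiles to plain integer instructions; the only care needed is keeping enough headroom in the 16 integer bits.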
  13. I've got a TON of Genesis game music on SD card in VGM format that I play using the VGM player in the NeoMyth menu. Lots of good Sonic music, music from Ys, Earthworm Jim, etc. I really like the music from Ecco: The Tides of Time. VGM players on the PC have gotten better, but there's just something about playing it on real hardware. 😎
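A minimal sketch of what a VGM player first does with one of those files: check the header and pull out the chip clocks. The offsets here (magic at 0x00, BCD version at 0x08, YM2612 clock at 0x2C, valid in v1.10+) follow my reading of the VGM spec, so verify against the spec before relying on them; all fields are little-endian.

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Read a 32-bit little-endian value, as VGM header fields are stored. */
static uint32_t rd32le(const uint8_t *p)
{
    return (uint32_t)p[0] | ((uint32_t)p[1] << 8) |
           ((uint32_t)p[2] << 16) | ((uint32_t)p[3] << 24);
}

/* Returns nonzero if buf starts with a VGM header, filling the BCD
 * version (e.g. 0x150 for v1.50) and the YM2612 clock in Hz (0 if the
 * file doesn't use the chip). Offsets are my assumption from the VGM
 * spec, not from this post. */
static int vgm_parse_header(const uint8_t *buf, size_t len,
                            uint32_t *version, uint32_t *ym2612_clock)
{
    if (len < 0x40 || memcmp(buf, "Vgm ", 4) != 0)
        return 0;
    *version      = rd32le(buf + 0x08);
    *ym2612_clock = rd32le(buf + 0x2C);
    return 1;
}
```

A Genesis rip would report a YM2612 clock around 7.67 MHz (NTSC); a real player would go on to walk the command stream and feed those chip writes to the FM and PSG at the logged sample times.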
  14. Did you read the user manual? https://krikzz.com/pub/support/mega-everdrive/x3x5x7/Mega Everdrive-v2 user manual EN.pdf If you don't know how to unzip a file on the Mac, I'm sure there's plenty of videos on just that on youtube. It doesn't need to be specific to the MED, just how to unzip a file using a Mac.
  15. Of course it was all compiled to begin with... just like the Jaguar port or the 32X port. The fact that he'll be redoing sections in JRISC instead of 68000 just adds to the complexity, but doesn't make it impossible. If he feels it necessary to look at what certain code would look like compiled to JRISC, he can use the old gcc compiler set to output assembly. Just cut and paste the functions he needs into a test file, make certain it's C instead of C++, and run the compiler over it. Being an old version of gcc, the compiled code will be easier to read than modern gcc. Ever look at the assembly generated by the latest gcc for the 68000 or SH2? That be some crazy shit! 😄