VladR

Road Rash pre-alpha on Jaguar at 30 fps


...rather than technical issues that way....

Also, I'll openly admit that multithreading on the Jaguar is very complex to debug by printing numbers on screen, compared to doing multithreading on a PC with Visual Studio and C++, where the multithreading library does all the heavy lifting behind the scenes.

 

Even as late as last weekend, I made some substantial changes to my semaphore system and the synchronization between the 68000 and the GPU. I sure hope it was the last change, as that stuff trickles down through the whole engine, and debugging it is literally a whole-day thing. Utterly impossible on coffee alone - Red Bull is simply a requirement.

 

The technological satisfaction I get from this, though, is totally incomparable to the simplistic bullshit one does in a corporate environment :lol:


I blame Wipeout for beginning the trend of racers with less and less interaction with other racers.

 

Plus I think Episode 1 Racer is much better.


I blame Wipeout for beginning the trend of racers with less and less interaction with other racers.

 

I blame Checkered Flag. :-D

 

 

But seriously, I consider myself a racing game buff and I was totally blown away by Wipeout. An eternal classic; loved the Vita version.

Edited by Punisher5.0

I blame Checkered Flag. :-D

 

 

But seriously, I consider myself a racing game buff and I was totally blown away by Wipeout. An eternal classic; loved the Vita version.

The Vita version blew me away back in the day. Those were some insane graphics for something that fits in your pocket. Still looks fantastic today, too.


Thanks, looks like there are quite drastic differences in the richness of the physics implementation even on the PS1, where HW performance is clearly not a problem (as it is on the Jag). I'm definitely not going to do camera roll and 3-axis physics for this game. I could justify such development for the next game, though.

 

Yeah, I actually bought the remaster of that one recently on PS4 (or I think it was that one - it was a Star Wars racer, so highly likely this one). It's definitely an overkill game in terms of features.

 

My greatest disappointment with Saturn's library is that I played all its great games already on PC, and in higher resolution and overall visual quality, so for me - no point in revisiting them on Saturn.

 

I really found only about 2 games in that vid covering the whole Saturn library that were interesting and that I haven't already played on PC.

I knew once you saw proper PlayStation software, the tone of "the Jag can keep up" would disappear :) Still looking forward to what you come up with.


 

 

I blame Checkered Flag. :-D

 

 

But seriously, I consider myself a racing game buff and I was totally blown away by Wipeout. An eternal classic; loved the Vita version.

I can't say enough how amazing, in every way, the new PS4 VR version is.


 

I knew once you saw proper PlayStation software, the tone of "the Jag can keep up" would disappear :) Still looking forward to what you come up with.

 

Where do you come up with this stuff?


I blame Wipeout for beginning the trend of racers with less and less interaction with other racers.

 

Where do you come up with this stuff?

 

 

I can't say enough how amazing, in every way, the new PS4 VR version is.

 

It's effing glorious & worth the $199 hardware price alone. The jaw-dropping moment when you're first placed down onto the track rivalled anything from those first few weeks trying out the new hardware and every piece of software possible. Incredible. Only wish I had time to play it (or any game really) and that it was a lot cooler (putting a HMD on in this heat is the last thing I'd fancy doing right now).

Edited by sh3-rg

 

Where do you come up with this stuff?

 

 

 

It's effing glorious & worth the $199 hardware price alone. The jaw-dropping moment when you're first placed down onto the track rivalled anything from those first few weeks trying out the new hardware and every piece of software possible. Incredible. Only wish I had time to play it (or any game really) and that it was a lot cooler (putting a HMD on in this heat is the last thing I'd fancy doing right now).

Cool, will check it out.


 

Where do you come up with this stuff?

 

Ditto, his statement regarding Wipeout and racing games makes NO sense, but I don't mind. ;-)

Edited by agradeneu

2. On the 64-bit Atari, this is what you need to do:

- Compute (or look up) the size of the new code chunk

- Turn the GPU off

- Set up ~10 registers for the Blitter

- Initiate the blit

- Initiate an endless loop waiting for the Blitter's mighty 64-bit snail blitting to finally finish, thus killing another processor for a substantial period of time

- Set the PC for the GPU

- Turn the GPU on

- Of course, this presumes you are aware of the SMAC assembler bugs and issues, and of what happens if you foolishly attempt 32-bit aligning of the GPU code - that was one fun discovery : )

That GPU-bugs file got so big (it recently grew because of the DSP-specific bugs) that I actually have to scroll it. I think I'll have to buy a bigger TV because of the GPU bugs; mine's only 50"...
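The sequence above can be sketched as a rough mock. This is a hypothetical Python illustration (the `Jaguar` class, the addresses, and the method names are invented stand-ins; on real hardware this is 68000 assembly writing GPU and Blitter control registers):

```python
# Hypothetical mock of the overlay-load sequence listed above. The Jaguar
# class and its methods are stand-ins invented for this sketch; on real
# hardware these steps are register writes done from 68000 assembly.
class Jaguar:
    def __init__(self):
        self.log = []

    def gpu_stop(self):
        self.log.append("gpu_off")

    def blit(self, dst, src, size):
        # Set up the ~10 Blitter registers and start the blit ...
        self.log.append("blit")
        # ... then spin until the Blitter finishes.
        self.log.append("wait_blitter")

    def set_gpu_pc(self, pc):
        self.log.append("set_pc")

    def gpu_start(self, pc):
        self.log.append(f"gpu_on@{pc:#x}")

# Destination, source and size of one code chunk (made-up values).
OVERLAYS = {"rasterizer": (0xF03000, 0x80A000, 4096)}

def load_overlay(jag, name):
    dst, src, size = OVERLAYS[name]   # look up the size of the chunk
    jag.gpu_stop()                    # turn GPU off
    jag.blit(dst, src, size)          # program Blitter, blit, busy-wait
    jag.set_gpu_pc(dst)               # set the PC for the GPU
    jag.gpu_start(dst)                # turn GPU back on

jag = Jaguar()
load_overlay(jag, "rasterizer")
print(jag.log)
```

The point of the mock is just the ordering: the GPU is dead from `gpu_stop` until `gpu_start`, including the whole busy-wait on the Blitter.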

 

Ok, JagChris pointed me to this discussion, so here are my 2 cents:

In JagTris I run GPU code only and swap in new code. There's no need to stop the GPU - it does it itself (the code is in BJL...):

 

overlay::                             ; spin until any outstanding blit is done
    load (blitter+$38),r3             ; read Blitter status (B_CMD)
    shrq #1,r3                        ; shift the idle bit into carry
    jr cc,overlay                     ; carry clear -> Blitter still busy
    nop

    store r0,(blitter)                ; A1_BASE = destination
    store r1,(blitter+$24)            ; A2_BASE = source
    movei #BLIT_PITCH1|BLIT_PIXEL8|BLIT_WID320|BLIT_XADDPHR,r0
    xor r1,r1
    store r0,(blitter+4)              ; A1_FLAGS
    store r0,(blitter+$28)            ; A2_FLAGS
    store r1,(blitter+$c)             ; A1_PIXEL = 0
    store r1,(blitter+$18)            ; A1_FPIXEL = 0
    store r1,(blitter+$30)            ; A2_PIXEL = 0

    movei #BLIT_SRCEN|BLIT_LFU_REPLACE|BLIT_BUSHI*0,r1
    store r2,(blitter+$3c)            ; B_COUNT: lines<<16 | pixels per line
    store r1,(blitter+$38)            ; B_CMD: start the blit
    WAITBLITTER
    jump    (LR)
    nop

The first wait for the Blitter to finish can be omitted if it is certain that there is no outstanding Blitter operation. (I always wait _before_ using the Blitter, not after.)

The one at the end can be omitted if the overlay loading is done some time before the actual usage of the code.

 

overlay is called from a macro:

MACRO MyINITLSUB
    movei #LSUBrun_\0+$8000,r0    ; dest-adr
    movei #LSUBstart_\0,r1        ; source address
    movei #1<<16|(LSUBlen_\0),r2  ; B_COUNT: 1 line, LSUBlen_\0 pixels
    movei #overlay,r3
    BL (r3)                       ; branch-and-link to overlay
ENDM

Of course, if there are a lot of overlays, it makes sense to store the destination, start and length in a table and pass only the index.
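That table could look something like this; a hypothetical sketch (the addresses, labels and lengths are invented), with the caller passing only an index:

```python
# Hypothetical sketch of the suggested overlay table: destination, start
# and length per overlay, indexed by a small integer. All values invented.
OVERLAY_TABLE = [
    # (dest_addr, start_addr, length_bytes)
    (0xF03000, 0x802000, 0x600),   # 0: rasterizer
    (0xF03000, 0x802600, 0x400),   # 1: menu drawer
    (0xF03000, 0x802A00, 0x500),   # 2: physics step
]

def overlay_args(index):
    """Return the (dest, start, count) triple the loader needs, with the
    count packed as 1 line << 16 | length, like the macro above."""
    dest, start, length = OVERLAY_TABLE[index]
    return dest, start, (1 << 16) | length

print([hex(v) for v in overlay_args(1)])
# ['0xf03000', '0x802600', '0x10400']
```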

 

 

 


- Of course, this presumes you are aware of the SMAC assembler bugs and issues, and what happens if you foolishly attempt 32-bit aligning of the GPU code - that was one fun discovery : )

You may be the only one doing this kind of work still using SMAC


A couple of notes:

 

- The GPU can't do any meaningful work in the middle of the code blit

- I wouldn't risk relying on a quirk of the architecture whereby blitting to the GPU RAM halts GPU execution. That one STORE instruction is totally not worth the potential trouble, because it's not the GPU cycles spent turning it on/off - it's the wasted GPU cycles while the Blitter blits that add up

- I can have two 4 KB code blits, do a lot of texturing (check back the Road Rash section of this thread), and still maintain 60 fps

- If I lowered the resolution (768x200), or dropped the texturing precision, then sure - I could have more than 2 code blits at 60 fps

- If your game/rendering is simple, then sure, you could do even 8 or more 4 KB overlays - it all depends on exactly how much frame time you have consumed on the GPU

- But no matter how you look at it, it's still a waste of performance (unless performance is irrelevant).

 

But, for a simpler game, it is of course much, much easier (from a coding perspective) to just do a serial GPU code blit than to debug this:

- figure out a stable way to synchronize the GPU, the 68000 and the Blitter at both the start and the end of the frame

- figure out how to draw a menu bitmap on the Blitter, in the middle of the frame, while the GPU is busy processing polygon scanlines (but without actually writing an additional GPU condition per scanline). And no, just waiting for the Blitter on the 68000 won't avoid the glitch (for reasons that would take half this page to explain)

 

 

So, the single greatest advantage of the "whole game on GPU" style is that it's criminally primitive - it's all serial, nothing can go wrong, because it's all single-threaded. Nobody else is using the Blitter, and it's 100% guaranteed that when you get to the new overlay section (and wait for the Blitter), nobody else is spinning the Blitter up (or - and this is what's problematic - that even though the Blitter is available now, that will no longer be true in the next millisecond; e.g. it just happened to be available because the GPU was in the outer loop, getting ready for the next scanline).

 

 

I presume you're not using too much system bandwidth if you use BUSHI and don't get OP artifacts? When I use BUSHI, almost all scanlines of the OP's framebuffer are broken. But then again, this is the bandwidth I am using just for the framebuffers (to say nothing of the OP list bandwidth, the other HUD bitmaps, etc., and data processing):

- 768x200 (8-bit): 150 KB : OP

- 768x200 (16-bit): 300 KB: OP

- 768x200 (8-bit): 150 KB : Blitter clear

- 768x200 (8-bit): 150 KB : Blitter rasterize

--------------------------------------------------------

750 KB / frame, at 60 fps -> ~44 MB / second just for framebuffers
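The arithmetic checks out; as a quick sanity check (using 1 MB = 1024 KB):

```python
# Per-frame framebuffer traffic from the list above, in KB.
# (768x200 at 8 bpp is 153600 bytes = 150 KB; 16 bpp doubles that.)
traffic_kb = {
    "768x200 8-bit, OP": 150,
    "768x200 16-bit, OP": 300,
    "768x200 8-bit, Blitter clear": 150,
    "768x200 8-bit, Blitter rasterize": 150,
}

per_frame_kb = sum(traffic_kb.values())       # 750 KB per frame
per_second_mb = per_frame_kb * 60 / 1024      # at 60 fps

print(per_frame_kb, round(per_second_mb, 1))  # -> 750 43.9
```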

 

What resolution is your game in, and what kind of rasterizing are you doing?


You may be the only one doing this kind of work still using SMAC

 

 


 

 

Of course, I wouldn't want to be accused of using reboot-modified compiler :)


 

What resolution is your game in and what kind of rasterizing are you doing ?

 

It is JagTris. The main purpose, 25 years ago, was to show that it is possible to write a GPU-only game.

As for the bandwidth, I guess the OP is idling most of the time. Just 6 objects, each 128x200x8 large, partly overlapping each other. That's all.

The game logic fits into 3 KB.

 

BTW: You mention a bunch of GPU bugs you discovered. Could you share that list, so others won't get bitten by 'em?

So, the single greatest advantage of the "whole game on GPU"-style is that it's criminally primitive - it's all serial, nothing can go wrong, because it's all single-threaded,

 

 

To nitpick, what you are talking about is multi-CPU (not multi-core!) programming. Multi-threaded means that multiple threads are executed on the _same_ CPU.
Anyway, I understand what you are getting at. Yes, handling 3 CPUs (or rather 4, with the OP) is not a simple task. Which might be why most of the game sources I have seen so far use the GPU for short run-to-completion tasks and don't let it run free.


The OP doesn't directly interfere with the execution of your code the way the Blitter, the GPU or the 68000 do. The OP only processes a linked list of bitmaps, and that's it. So I wouldn't even consider it a separate parallel process (even though it technically runs in parallel), as it's simply set & forget.

 

The only time the OP interferes is when you run out of bandwidth (or use BUSHI when already consuming major bandwidth) - but you don't have to sync it at the start/end of the frame (or you would have glitched anyway; you can't have a non-60 fps OP execution without a glitch), and you can't have a situation where one or the other finishes earlier or later. For a very long time, I got away with a very simple 3-state sync check at the end of the frame, and that took care of the 68000's syncing too. Debugging screen output from the 68000 required doubling the count of possible combinations, as did the possibility of the 68000 ending later than the GPU. Interrupting the GPU in the middle of rasterizing (e.g. to draw menus) added an additional layer of complexity to the syncing.
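The thread doesn't spell out the actual 3-state scheme, but the general shape of such an end-of-frame handshake can be sketched like this - a hypothetical Python illustration with two threads standing in for the GPU and the 68000 (all names and states invented for the sketch):

```python
# Hypothetical illustration of an end-of-frame sync between two processors
# (threads stand in for the GPU and the 68000). The states are invented;
# this is not VladR's actual 3-state scheme, just the general idea: the
# last processor to finish a frame flips it over and releases the other.
import threading

RUNNING, DONE = 0, 1

class FrameSync:
    def __init__(self):
        self.cond = threading.Condition()
        self.state = {"gpu": RUNNING, "m68k": RUNNING}
        self.frames_completed = 0

    def finish(self, who):
        with self.cond:
            self.state[who] = DONE
            if all(s == DONE for s in self.state.values()):
                # Last one in completes the frame, resets, wakes the other.
                self.frames_completed += 1
                for k in self.state:
                    self.state[k] = RUNNING
                self.cond.notify_all()
            else:
                # Wait until the other processor also finishes the frame.
                self.cond.wait_for(lambda: self.state[who] == RUNNING)

sync = FrameSync()

def worker(name, frames):
    for _ in range(frames):
        pass  # ... per-frame work would go here ...
        sync.finish(name)

t1 = threading.Thread(target=worker, args=("gpu", 3))
t2 = threading.Thread(target=worker, args=("m68k", 3))
t1.start(); t2.start(); t1.join(); t2.join()
print(sync.frames_completed)  # -> 3
```

In C++ this is a few lines with a condition variable; the pain described above comes from hand-rolling the same handshake in two different assemblers with no library support.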

 

In C++, that sync is easy, but doing nested, asymmetric conditions in 2 different assemblers (68000 vs GPU) has the potential for an easily overlooked typo that compiles just fine; it takes 3 days till your brain caches out what you think you wrote, and then you come back and see the bloody typo :roll:

 

As for terminology, I know the difference in the definitions, but nobody really used it like that in that era. All the videos from the ~'94 era that I've seen (including Saturn ones a few years later) always talked about parallel programming or used the same term (multithreaded).

 

 

 

 

Also, I love how certain colorful cartoon characters here jump on the term and spend 7 pages "teaching" me what it means. It amuses me to no end, and for free (as if I hadn't done enough REAL commercial multithreaded programming in C++ in the past). Every. Single. Time :lol:

 

 

 

You say you're using 6 bitmaps - it's all just OP objects, right? I'm wondering where the Blitter comes into play in your set-up, but since I haven't seen it in motion or in a screenshot, I really have no idea how your screen behaves. Why would you then wait for the Blitter at all? You don't erase/fill the framebuffer, correct? Or is there some small bitmap that you perhaps redraw on the GPU each frame and thus have to clear via the Blitter?


Yep. In the "original" JagTris I used the Blitter only to clear the bitmap, draw stones and load overlays. Now I also use it to draw the digits.


I have finally handled the last remaining high-risk item: audio. Till last week, I wasn't sure (hence the high risk) whether I was going to use Atari's audio code (the fulsyn) or just roll my own from scratch. I ended up with a compromise - I use only Atari's DSP code now, but I had to completely ditch all their tune/patch/initialization code and wrote my own API on the 68000.

 

While I had previously statically linked Atari's sample and it played a tune alongside my 3D engine, there was a lot of work remaining, as Atari's fulsyn is not an API - it's just very raw, basic code, with all kinds of insane hardcoded stuff. Not being able to directly run all the old DOS tools (initially) didn't help either, but they run now, so all is good.

 

Getting everything to compile and link was a challenge due to the way Atari "structured" the library, but it works now.

 

 

 

I still need to write some code to handle sound effects on available channels, but that shouldn't be a big deal, compared to completing the whole puzzle (that the fulsyn absolutely is).

 

I now have 3 tunes in there, with 3 different sets of patches (something Atari's code can't handle at all), all table-driven at compile time, and, obviously, with the tune selectable at run-time.
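As a hypothetical illustration of what such compile-time tables might look like (the labels, patch names and layout below are invented, not from the actual engine):

```python
# Invented sketch of a table-driven tune/patch setup: each tune entry
# points at a patch set, and run-time selection is just an index lookup.
TUNES = [
    # (midi_data_label, patch_set_index)
    ("TUNE_MENU",   0),
    ("TUNE_RACE_1", 1),
    ("TUNE_RACE_2", 2),
]

PATCH_SETS = [
    ["PIANO", "BASS", "DRUMS"],   # patch set 0
    ["LEAD",  "BASS", "DRUMS"],   # patch set 1
    ["ORGAN", "SLAP", "DRUMS"],   # patch set 2
]

def select_tune(index):
    """Run-time tune selection: return the tune label and its patches."""
    tune, patch_idx = TUNES[index]
    return tune, PATCH_SETS[patch_idx]

print(select_tune(2))  # -> ('TUNE_RACE_2', ['ORGAN', 'SLAP', 'DRUMS'])
```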

 

I think I'm ready to start ordering cart equipment so that it's ready while I finish remaining gameplay items.

 

 

 

In retrospect, I certainly could have written a basic audio lib in the time it took me to troubleshoot everything around Atari's audio code and tools, so that was clearly a mistake. Oh well, relying on someone else's code always is...

 

 

Looking at my to-do list, the two biggest (the rest is a lot of small things) remaining coding items are:

- power-ups

- coloring of track inside 3dsmax (just one more export node to parse, really)

 

Hardly a moderate (let alone high) risk, by any stretch of the imagination...


So it's running with music now? Would like to see...

Well, technically it ran with music about 2 months ago (or whenever I made the post about that - I'd have to go back a few pages to see when exactly it was), but it was inflexible, as it was using the hardcoded music from the Atari sample code at that time, whereas now I can just directly import and play any MIDI file.

 

I can't just grab a video, as I broke my old capture device, but the new Roxio (which is supposed to record at 60 fps) arrived a few days ago. That's assuming the HDMI -> S-Video (or is it composite? I always mix those two up) conversion will actually work.

 

There are plenty of other features that warrant a new vid, for sure:

- Menus

- Timers

- Smooth input

- Basic physics (inertia/weight/acceleration)

- Full Z-sorting

- Camera zoom (based on speed)

- More generic AI (higher variability between enemy behaviors)

- Resolution choice

- Ultra-sharp 768x240 centered window (from within 1409x240)

- 16-bit background

- Intro shader effect (5-shader composite)

 

And probably a whole lot more I forgot. Though, I probably would want to implement the power-ups before a new vid. I'll think about it...


You and I both have the Val d'Isere sound engine. I need to separate it out and post it.

 

I don't know if you'd find it useful.

Edited by JagChris

