
Altirra 1.9 released


phaeron


 

Faicuai, if I understand correctly, the -57 setting corresponds to the output of your 800XL, while -51 matches your 800, right?

 

Your 800 is an interesting case - it is so far the only known unit which produces a "hue 1" that is not equal to the colorburst. I'd like to investigate this discrepancy further, so I'm asking - could you send a screenshot of your 800 displaying the palette? (Or have you posted it already?) There's a slight chance that I'll discover the reason for the discrepancy just by analysing the screenshot.

 

 

Correct, Krotki.

 

Please, keep in mind that I am trying to have Altirra reproduce, as closely as possible, what my 800XL and JayMiner-800 look like once captured analog-to-digital, and then SHOWN on my Sony Bravia KDL52W3000, NEXT to Altirra.

 

In other words, the final adjustment in Altirra reflects the "visualization path" the actual HW samples are going through. Therefore, with -51 I have also factored in what my Bravia is doing (with its baseline D75 calibration).

 

I will post ALL I've got from my JayMiner-800, once I return from my overseas trip.

 

F.


Attached are the grayscale and 256-color samples and the "Ready" captures from my JayMiner-800. I am also including my 800XL 256-color sample, so everyone SEES the difference between these two machines, not only in colors (the 800XL goes towards green at the bottom of the 256-color chart), but in how UNDERDRIVEN Luma is on the JayMiner-800.

 

Pay CLOSE attention to A) the FIRST color bar at the top, B) the LAST color bar at the bottom, and C) where the bunched-up BLUE patches occur vertically, and how far they propagate from left to right. These are KEY areas that must be tuned to perfection if you want HW-like color reproduction in Altirra/Windows.

 

The two machines require TWO different adjustment ranges in Altirra in order to emulate their color response as closely as possible (Luma aside).

 

Cheers,

 

F.

A800jm-Composite-Stock-READY.bmp

A800jm-Composite-Stock-GRAY.bmp

A800jm-Composite-Stock-256colors.bmp

A800XL-sVideo-FRAME-256Colors.bmp


Thanks, Faicuai. The screenshots definitely clear up my confusion.

 

To be honest, judging from your screenshots I can confirm that hue 1 on your 800 is exactly equal to the colorburst, just as in all other known Atari units. That's good. However, to reproduce that fact in Altirra, its "Hue start" setting _must_ be set to -57. Since you say that reproducing the 800's Bravia output requires "Hue start" to be set to -51, the conclusion is that the Bravia somehow distorts the output from your 800. Now it's interesting why your TV doesn't do the same thing with the output from your 800XL (since you say that "Hue start"=-57 correctly reproduces your 800XL). I suspect it's due to the different connection methods (composite vs S-Video) and that the 800 would display colours more correctly if it were connected through separated chroma/luma. But that's just a hypothesis.

 

(I'm also suspecting that the composite connection is the reason for the 800's output being dimmed, but I don't have any evidence to support this claim.)

 

As for your 800XL leaning to the green - judging from the screenshot, it seems the reason is an improper setting of the colour pot. The Field Service Manual says hues 1 and 15 (the top and bottom colour bars) should have the same colour; on your 800XL they don't (it is visible on both the 256-colour and the CPS SuperSalt screenshots of yours). Adjusting this would also reduce the amount of green.

 

The settings in Altirra that reproduce your 800 screenshot (when both the screenshot and the emulator are displayed on the same monitor, so no calibration issues are significant) are:

Hue start=-57 (-51 would add noticeable yellowish hue to the topmost colour bar)

Hue step=26.1 (the ideal as proposed by the Field Service Manual would be 25.7)

Brightness=1

Contrast=52

Saturation=19

 

For your 800XL the settings are:

Hue start=-57

Hue step=24.9 (note the difference wrt. the 800 setting - this is the reason for the 800XL showing more green at the bottom)

Brightness=1

Contrast=72

Saturation=19

 

Note that the settings are devised by mathematical analysis of the screenshots, without any "personal preference" issues involved.
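For readers following along, the relationship between these settings and the resulting colours can be sketched as a toy model. This is an illustration of angle-based palette generation, not Altirra's actual code; the YIQ-to-RGB matrix is the standard NTSC one, and the sign/axis conventions here are assumptions:

```python
import math

def hue_angle(hue, hue_start=-57.0, hue_step=25.7):
    """Chroma phase angle (degrees) of hue index 1-15 relative to colorburst."""
    return hue_start + (hue - 1) * hue_step

def yiq_to_rgb(y, i, q):
    """Standard NTSC YIQ -> RGB conversion."""
    return (y + 0.956 * i + 0.621 * q,
            y - 0.272 * i - 0.647 * q,
            y - 1.106 * i + 1.703 * q)

def hue_to_rgb(hue, y=0.5, saturation=0.19,
               hue_start=-57.0, hue_step=25.7):
    """Toy palette entry: fixed luminance, chroma placed at the hue's phase angle."""
    theta = math.radians(hue_angle(hue, hue_start, hue_step))
    return yiq_to_rgb(y, saturation * math.cos(theta),
                         saturation * math.sin(theta))
```

With these defaults, hue 15's angle is -57 + 14 × 25.7 = 302.8°, which modulo 360 is only 0.2° away from hue 1's -57° (i.e. 303°) - consistent with the observation that the ideal Hue step makes hues 1 and 15 nearly identical.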


First, thanks for your set of parameters. I appreciate the effort.

 

 

Now, since I am a bit short of time, here are the bullet-points:

 

1. The SuperSalt color-bar pattern requires a color-match between the LAST bottom color-bar above the gray line, and the color-bar below the gray line (also at the bottom). Nothing else. My 800XL is calibrated that way, and I have already posted the SuperSalt-CPS analog-to-digital capture showing that... yet the 256-color pattern shows up DIFFERENT from that of my JayMiner-800.

 

2. I recollect we definitely agreed that the HW color-pot/trimmer equates to HUE-STEP (not START). Well, it turns out that in order to fully calibrate the Ataris, we also need a HUE-START trimmer, but it is not there. In fact, I don't believe ANYTHING else can be adjusted.

 

3. If you search for more A800XL 256-color samples, you will notice MANY going to the green side on the bottom bar (many of them). Seems like mine is just the norm. This, again, suggests that there may be inherent/factory differences in Hue-Start between machines/models.

 

4. If I try my JayMiner-800 directly on S-Video, I get a TOTALLY over-driven Luma response (which somehow aligns with your idea of varying luma levels across different video output formats).

 

5. I tried your math-derived settings (which have minor brightness/contrast differences, as well as correct reproduction of the bunched-up blues in the middle-left side of the 256-color checker), and then re-ran my visual calibration, side-by-side between the Altirra screen and my captured .BMPs. Here's what I got (which shows the overall best balance of ALL factors, including color distribution, bright-to-dark ramps, top and bottom color patches, etc., as seen on my Bravia KDL52W3000):

 

=> Palette: 800XL(s-video, bright-adj) / 800(composite, stock)

=> Hue Start: (800XL: -57 ; A800: -52 )

=> Hue Step: (800XL: 25.1 ; A800: 25.4 )

=> Brightness: (800XL: -1% ; A800: 0% )

=> Contrast: (800XL: 83%-85% ; A800: 51% )

=> Saturation: (800XL: 19%-21% ; A800: 18%-20% )

=> Art. Phase: (800XL: 180 ; A800: 180 )

=> Art. Sat.: (800XL: 100% ; A800: 100% )

=> Art. Bright: (800XL: -15% ; A800: -15% )

 

 

In short: it seems that display/screen color response needs to be factored in as well, so you can see your HW's actual output as IT SHOULD look right through that same display/screen.

 

Cheers,

 

F.


1. The SuperSalt color-bar pattern requires a color-match between the LAST bottom color-bar above the gray line, and the color-bar below the gray line (also at the bottom). Nothing else.

When I said "hue 1 and hue 15 must be the same colour" I was referring to the same thing. The bars above and below the gray line in SuperSALT are hue 15 and 1, respectively. Hue 1 does not change while manipulating the colour pot. To make hue 1 and 15 identical in Altirra, one must set "Hue step" to 25.7 - any other setting causes the hues to be different.

 

My 800XL is calibrated that way, and I have already posted the SuperSalt-CPS analog-to-digital capture showing that...

My whole point is that your 800XL is not calibrated 100% perfectly. Picking the colours from your screenshot in an image editor shows that hue 15 (above the gray line) is slightly more green-tinted than hue 1 (below the gray line). It isn't noticeable with the naked eye because of the gray line in between, but removing it from the screenshot allows one to notice the difference. (I don't know what they were thinking when they put that gray line there, as it impedes the process of adjusting - it would be easier if hues 1 and 15 were simply placed side by side.)

 

To represent this output in Altirra, "Hue step" must be set to a value lower than the ideal 25.7 (in this case 24.9).

 

yet the 256-color pattern shows up DIFFERENT from that of my JayMiner-800.

Your 800 isn't calibrated 100% correctly either. On your screenshot hue 15 is slightly more yellow-tinted than hue 1 - in such a case the corresponding "Hue step" setting should be higher than 25.7 (in this case 26.1).

 

Both your machines are misadjusted a little bit (in opposite directions, so the difference is more pronounced) and that appears to be the reason for the 800XL showing more green tint.

 

2. I recollect we definitely agreed that the HW color-pot/trimmer equates to HUE-STEP (not START). Well, it turns out that in order to fully calibrate the Ataris, we also need a HUE-START trimmer, but it is not there. In fact, I don't believe ANYTHING else can be adjusted.

I wouldn't be so sure. I don't know what your Bravia displays, but when comparing screenshots of both of your machines, hue 1 is identical on both of them (disregarding brightness/contrast, obviously), and exactly equals the hue of the colorburst signal. For Altirra it means setting "Hue start" to -57 in both cases. Since on the screenshots hue 1 is identical on both computers but on your TV it isn't, the conclusion is that your Bravia treats the signal from the 800 and the 800XL somewhat differently (or alternatively, there's a failure in your screen-capturing device). There are several potential causes - one of them is the fact that Atari's TV signal doesn't exactly conform to the M/NTSC standard, and there are reports of modern TVs not tolerating it as much as the old ones did.

 

3. If you search for more A800XL 256-color samples, you will notice MANY going to the green side on the bottom bar (many of them).

So? I've probably seen them all. Not one of those machines was colour-adjusted 100% correctly (it can be verified by comparing hues 1 and 15 on the screenshots). In one case the poster later admitted to not knowing about the colour pot at all.

 

Seems like mine is just the norm. This, again, suggests that there may be inherent/factory differences in Hue-Start between machines/models.

Um, you meant Hue-Step, right? 'Cause in all the screenshots I've seen, hue 1 was identical (which means that the corresponding setting of Hue-start would also be identical).

 

But anyway, all of those screenshots, including yours, conform to the colour-generation algorithm in Altirra (meaning they can be reproduced flawlessly), so there's no reason to think that there are differences between the XLs and other models, apart from different adjustments of the colour pots in the particular machines.

 

5. I tried your math-derived settings (which have minor brightness/contrast differences,

I had adjusted the settings so that only the darkest and brightest luminances match between the screenshots and Altirra. Two problems: 1) Altirra currently doesn't emulate the nonlinearity in the real Atari's luminance output (in reality the luminances aren't spread evenly); 2) gamma can't be adjusted in Altirra. For these two reasons we can't set Altirra to exactly match all 16 luminances at once. Anyway, I'm focusing only on colours (hues) here, and they are not affected by the brightness/contrast settings.
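The luminance point can be illustrated with a sketch: an evenly spread 16-level ramp (what the post describes Altirra producing) versus a nonlinear one. The gamma value below is an arbitrary example for illustration, not a measured Atari characteristic:

```python
# Illustrative only: a linear 16-level luminance ramp vs a nonlinear
# (gamma-shaped) one. Matching only the endpoints, as described above,
# leaves the middle levels off.
def linear_ramp(n=16):
    return [i / (n - 1) for i in range(n)]

def gamma_ramp(n=16, gamma=2.2):
    # Made-up gamma value; real Atari luma nonlinearity is not this curve.
    return [(i / (n - 1)) ** (1 / gamma) for i in range(n)]

lin, gam = linear_ramp(), gamma_ramp()
# Both ramps agree at the darkest and brightest levels...
assert lin[0] == gam[0] == 0.0 and lin[-1] == gam[-1] == 1.0
# ...but diverge everywhere in between.
assert gam[8] > lin[8]
```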

 

 

as well as correct reproduction of the bunched-up blues in the middle-left side of the 256-color checker), and then re-ran my visual calibration, side-by-side between the Altirra screen and my captured .BMPs. Here's what I got (...)

When setting "Hue start" to -52 as you suggest, hue 1 in Altirra doesn't match hue 1 on your 800 screenshot (and I mean the pixels' RGB values differ; Altirra's hue 1 gets a more yellow tint - it's noticeable with the naked eye when parts of the Altirra screen and the screenshot are pieced together). The screenshot's hue 1 matches Altirra's when "Hue start" equals -57.

 

What is your calibration method that leads you to think that -52 is the correct setting?

 

In short: it seems that display/screen color response needs to be factored in as well, so you can see your HW's actual output as IT SHOULD look right through that same display/screen.

Sure, but when we're comparing two images (such as the emulator's output and a screenshot) on the same physical display side-to-side, any display-related issues affect both images equally, therefore they can be ruled out.


I believe this is at the core of the issue.

 

You can, of course, evaluate the machine's color-rendition from an absolute point of view (e.g. measure what comes with the screen captures in an image editor). However, that is NOT what I am really targeting.

 

You certainly try to start from "firm ground":

 

a. Atari HW color-adjusted as far as color-pot/trimmer allows to do it,

b. The best possible analog-to-digital capture, so you get a good sample to compare with Altirra's output.

c. Your actual viewing/evaluation screen is calibrated itself to at least some fixed criteria (e.g. black & white levels, white-point/temp.)

 

Once you get to that point, there is no other step required except aligning Altirra and your samples side-by-side and proceeding to adjust Altirra from a fixed starting point (e.g. -57 start / 25.0 step, Brightness 0, contrast as high as tonal separation remains visible, or until final brightness matches your desired level, whether that is the Atari signal's brightness or a higher level, if you so choose).

 

Some specific notes about my color calibration procedure:

 

1. Take into account your screen's own rendering bias. In my particular case, the Bravia has a DEEP response on the blues. And the 256-color chart region that holds these blues (which rapidly transition into violet/magenta) is not rendered as closely to real HW when using YOUR math-distilled/absolute settings. This area, for example, is crucial for Star Raiders, Zaxxon, Frogger II, Blue Max and River Raid. These blues are located in a three-row cluster in the middle-left vertical area of the screen.

 

2. Therefore, when compensating for how my LCD display responds in this region, I invariably end up performing small adjustments on Hue-Start and Hue-Step, so the blues show up bunched-up almost exactly as they do on the HW screen capture, while preserving the top and bottom color bars as close to the original HW as well (there is "collateral" change when fixing the blues this way, and the idea is to balance out the rendering of all key areas).

 

3. Then comes the actual ramp from left to right on ALL rows of the 256-color chart, which needs to match the HW sample in brightness and color saturation, as far as your eyes can tell on the screen.

 

4. And finally comes the brightness concentration in the last 3 columns on the right of the pattern, which have to show COMPLETE tonal/chroma separation (each color sector has to be distinguishable from the others).

 

Once these steps are performed, I then go through a list of multiple games, with very specific areas, and compare side-by-side with Altirra, in order to check actual similarities or any strange divergence. The idea is that Altirra's output LOOKS as close as possible to how MY captures look on the SAME screen, vis-a-vis (nothing else).

 

As for the Color Pot/Trimmer, we agreed that it controls HUE-STEP, and that HUE-START does not seem to be a variable you can dynamically adjust (at least not on my 800XL, Rev C). This is a possible explanation for BOTH the differences between my 800/800XL and the similarities I see among 800XLs (including mine). Please note that this trimmer DOES NOT allow for exact adjustments. You CANNOT exactly match the last two color bars in SuperSalt. They come close, but when rotating the trimmer, you will not find a perfect match. Thus you should find "absolute" differences between the last two color bars.

 

 

Cheers,

 

F.


You can, of course, evaluate the machine's color-rendition from an absolute point of view (e.g. measure what comes with the screen captures in an image editor). However, that is NOT what I am really targeting. (...)

Now I'm confused; apparently I'm missing a crucial detail. I hope you can clear a few things up.

 

Does your process involve matching Altirra to the screen-capture, or to the real machine's output on your TV? And, when doing the calibration in Altirra, is the emulator displayed on your monitor or on the TV?

 

1. Take into account your screen's own rendering bias. In my particular case, the Bravia has a DEEP response on the blues.

Does your screenshot reproduce this effect, or is the TV display more bluish than that?

 

And the 256-color chart region that holds these blues (which rapidly transition into violet/magenta) is not rendered as closely to real HW when using YOUR math-distilled/absolute settings.

You mean, when compared to the screenshot or to the Bravia's display?

 

2. Therefore, when compensating for how my LCD display responds in this region, I invariably end up performing small adjustments on Hue-Start and Hue-Step, so the blues show up bunched-up almost exactly as they do on the HW screen capture, while preserving the top and bottom color bars as close to the original HW as well (there is "collateral" change when fixing the blues this way, and the idea is to balance out the rendering of all key areas).

So, if I understand correctly, you intentionally misadjust Altirra a little bit, so it more accurately reproduces the blue hues from your TV, right?

 

If so, then I'm wondering if the same effect can be achieved by a) setting Altirra according to my "ideal" values, and b) adjusting the blues by manipulating the graphics card's colour settings (usually hidden somewhere in the Control Panel's Display settings - it's driver-specific, you know what I mean?).

 

Yesterday I made the screenshot-Altirra comparison on a random computer and noticed that my "ideal" settings were quite off. As it turned out, the particular ATI graphics driver allowed for independent colour adjustments for the desktop and for Direct3D applications, and that was the cause :)

 

As for the Color Pot/Trimmer, we agreed that it controls HUE-STEP, and that HUE-START does not seem to be a variable you can dynamically adjust (at least not on my 800XL, Rev C). This is a possible explanation for BOTH the differences between my 800/800XL and the similarities I see among 800XLs (including mine).

According to the current knowledge, adjusting hue 1 (Hue start, as you say) is not possible on any Atari model, and this hue is constant between all units examined so far. That was actually the thing I wanted to verify the most, and your input was quite valuable. Thanks again.

 

Please note that this trimmer DOES NOT allow for exact adjustments. You CANNOT exactly match the last two color bars in SuperSalt.

Certainly nothing to argue about. I once read here on AA about a 5200 console that displayed correct colours only a few minutes after being turned on - apparently the GTIA chip was gradually shifting its hue-step timing as it got hotter.


Can I just jump into this discussion a little..

 

Are there any parts of the colours (hue, saturation, etc.) that follow hard and fast factory rules, either from Atari or as known settings?

 

I'm ignoring the 2nd part of the experience - as in, what people remember and class as correct. What I'm trying to see is whether there are set criteria to form a basis to build from, so as to get a range that covers valid colour tones, hue and saturation, i.e. the 'correct' blue can only be in a set range.

 

At least if there's a basic 'scale', it's a good start.

 

I'm only asking as it seems no one really agrees on what is the correct colour for X machine from X country, and I understand this matters due to component tolerances on pots, incorrect settings on hardware, and then hitting display differences - but isn't there a 'this is what Atari set it to at the factory' (roughly) to make tracking this down easier, or is it just going to be a never-ending disagreement as to what is correct?

 

Maybe a sampling of video output from a group of machines whose owners may not have played with the pots on them.

 

Surely Atari had a video scale of sorts?


Avery,

could you please update Altirra to use paddles with this new adaptor?

http://home.comcast.net/~tjhafner/2600-daptor.htm

http://www.atariage.com/forums/topic/181321-announcing-new-2600-controller-usb-adaptor/

The new version of the Stella emulator supports it (joysticks and paddles).

 

It looks like the adapter simply presents a standard USB HID interface, so Altirra's existing game controller support should already work with it.


@Phaeron (and anyone else who knows the answer :) ):

Only info about profiler that I found is this: "The profiler is accessible via the Debug > profiler submenu and has three modes: instruction level sampling, function sampling, and BASIC line sampling. Double-click on a line, and it brings up annotated disassembly (insn/function mode only)"

 

What does it do ?

 

I have generated label and lst files from the MADS compiler, and now I can see labels and other names in the Altirra debugger.

 

What do columns in profiler mean ?

Cycles - sum of all cycles for instruction - but for what time ? Is it maybe for whole profiling time ?

Insns ? - how many instructions for that same time ?

CPI - cycles per instruction ?

 

How could I calculate how much time a certain routine takes per frame - or per second?

 

And how does the "Function sampling" work?


It looks like the adapter simply presents a standard USB HID interface, so Altirra's existing game controller support should already work with it.

You're right, obviously...

Paddles should be configured in the Input mappings menu.


Another paddles question!

I'm trying to map the 2nd paddle to the mouse's vertical axis for the game Tilter, as it uses both paddles simultaneously - I've edited the "Mouse -> Paddle A" setting and added 'Mouse Move Vert' on 'Axis 2', but I can't seem to get it to work.

 

Thanks


@Phaeron (and anyone else who knows the answer :) ):

Only info about profiler that I found is this: "The profiler is accessible via the Debug > profiler submenu and has three modes: instruction level sampling, function sampling, and BASIC line sampling. Double-click on a line, and it brings up annotated disassembly (insn/function mode only)"

 

What does it do ?

 

The profiler tells you which portions of code are taking the most CPU time.

 

I have generated label and lst files from Mads compiler, and now I can see labels and other names in Altirra debugger.

 

This will help decode the results from the profiler, but the profiling results are not affected by the presence of labels.

 

What do columns in profiler mean ?

Cycles - sum of all cycles for instruction - but for what time ? Is it maybe for whole profiling time ?

Insns ? - how many instructions for that same time ?

CPI - cycles per instruction ?

 

Cycles is the number of machine cycles over the profiling period. The longer the profiling period, the more cycles. The summary view also shows percentages. This includes halted cycles, so the cycle count will reflect code that is running slowly due to ANTIC DMA contention.

 

Insns is the number of times that the instruction has executed.

 

CPI is cycles per instruction. A high CPI compared to the ideal value either means high DMA contention or a lot of extra cycles due to page crossings.
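The arithmetic behind these columns is simple enough to state directly. This is a sketch of the relationships described above, not Altirra's internals; the example numbers are hypothetical:

```python
# CPI is simply cycles / instructions. The "ideal" CPI comes from the
# documented 6502 cycle cost of the opcodes involved; any excess is
# DMA halts or page-crossing penalties.
def cpi(cycles, insns):
    return cycles / insns

def overhead_cycles(cycles, insns, ideal_cpi):
    """Cycles beyond what the instruction mix would cost on an uncontended bus."""
    return cycles - insns * ideal_cpi

# Hypothetical example: an instruction profiled at 4000 cycles over 500
# executions, whose documented cost is 4 cycles, lost 2000 cycles to
# contention (CPI 8 instead of the ideal 4).
assert cpi(4000, 500) == 8.0
assert overhead_cycles(4000, 500, 4) == 2000
```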

 

I want to get unhalted cycles in at some point, but it's a bit tricky given the way that the profiler currently works.

 

How could I calculate how much time a certain routine takes per frame - or per second?

 

If your function is self-contained, then you can use function profiling and look at the entry point. Currently you cannot profile a call tree as that requires call graph profiling. The tricky part of call graph profiling is actually the UI; collecting call contexts is pretty easy, but presenting it is a load of work.
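To put numbers on the per-frame question: dividing a function's profiled cycles by the number of frames in the profiling period gives its per-frame cost, and the NTSC frame budget of 114 machine cycles per scanline times 262 scanlines gives the denominator for a percentage. The frame-timing constants are standard NTSC Atari figures, not from this post, and the example numbers are hypothetical:

```python
CYCLES_PER_SCANLINE = 114              # NTSC Atari machine cycles per scanline
SCANLINES_PER_FRAME = 262              # scanlines per NTSC frame
CYCLES_PER_FRAME = CYCLES_PER_SCANLINE * SCANLINES_PER_FRAME   # 29868

def cycles_per_frame(function_cycles, frames_profiled):
    """Average machine cycles a routine costs per frame."""
    return function_cycles / frames_profiled

def percent_of_frame(cycles):
    """Share of the frame budget (halted cycles included, as in the profiler)."""
    return 100.0 * cycles / CYCLES_PER_FRAME

# Hypothetical example: 597,360 cycles accumulated over 100 frames is
# 5973.6 cycles/frame, about 20% of the NTSC frame budget.
used = cycles_per_frame(597_360, 100)
assert used == 5973.6
assert round(percent_of_frame(used), 1) == 20.0
```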

 

And how does the "Function sampling" work?

 

Function sampling works by trapping interrupts and JSR/RTS instructions and using these to identify function entry and exit events. The profiler then collects all samples within a particular function scope. This only includes time spent within the function itself (exclusive time); if a nested function is called, the time spent in that function is not reflected in the parent's total.
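The mechanism can be modelled in a few lines. This is a deliberately simplified sketch of exclusive-time scoping; real Altirra also handles interrupts, JMP (abs) and RTI, which this toy omits:

```python
from collections import defaultdict

def profile(events):
    """Toy exclusive-time sampler.

    events: ('jsr', addr) | ('rts',) | ('cycles', n).
    JSR pushes a new scope, RTS pops it, and sampled cycles are charged
    only to the scope on top of the stack, so callees never inflate
    their callers' totals. Returns a dict: scope -> cycles.
    """
    totals = defaultdict(int)
    stack = ['<main>']
    for ev in events:
        if ev[0] == 'jsr':
            stack.append(ev[1])
        elif ev[0] == 'rts':
            if len(stack) > 1:
                stack.pop()
        else:  # ('cycles', n): charge the innermost active scope only
            totals[stack[-1]] += ev[1]
    return dict(totals)

# 30 cycles spent inside SUB are excluded from <main>'s total.
trace = [('cycles', 10), ('jsr', 'SUB'), ('cycles', 30), ('rts',), ('cycles', 5)]
assert profile(trace) == {'<main>': 15, 'SUB': 30}
```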


Another paddles question!

I'm trying to map the 2nd paddle to the mouse's vertical axis for the game Tilter, as it uses both paddles simultaneously - I've edited the "Mouse -> Paddle A" setting and added 'Mouse Move Vert' on 'Axis 2', but I can't seem to get it to work.

 

Paddles are single axis -- you need a second paddle. Add a Paddle B controller and map Mouse Move Vert to its Axis 0.


...

Function sampling works by trapping interrupts and JSR/RTS instruction and using this to identify function entry and exit events. The profiler then collects all samples within a particular function scope. This only includes time spent within the function itself (exclusive time); if a nested function is called, the time spent in that function is not reflected in the parent's total.

Thanks Phaeron! Now it makes a little more sense :)

 

So, if I modified the code to execute one of the functions only once (the NMI routine, for example) - would I see the total time it took, no matter how long I ran the sampling?

Or should I just rely on percentages as a measure of code efficiency...

 

Thanks again for making such a feature rich emulator!


Thanks for the explanation, Phaeron. I believe I was using function sampling correctly: I divided the overall cycle usage over the sampling period by the number of calls. However, I'm still slightly confused by the fact that the function sampler displayed stats for addresses immediately following the interrupt routine's entry point.


Thanks for the explanation, Phaeron. I believe I was using function sampling correctly: I divided the overall cycle usage over the sampling period by the number of calls. However, I'm still slightly confused by the fact that the function sampler displayed stats for addresses immediately following the interrupt routine's entry point.

 

Strange... I haven't seen an example of this yet. I had forgotten to mention that JMP (abs) instructions are also interpreted as starting a new function, though.

I'm also puzzled :(

 

Can you explain, using this example, what each of these numbers means (and whether it is working OK)?

 

post-14652-0-81607400-1307825757_thumb.png

 

$20E6 is the entry point for NMI_HANDLER1.

 

BIT NMIST = 4 cycles

BPL NMI_NOT_DLI = in most cases 2 cycles (there are more DLIs than VBIs)

JMP DLI_HANDLER1 = 3 cycles (is this also considered the end of a function?)

(I even tried changing JMP to BMI - same result)

 

Why are DLI_HANDLER1 and DLI_HANDLER2 not counted?

There is a jump to them, and an RTI at the end.

(Is RTI regarded the same as RTS, or not?)

RTI also ends a function. JMP abs and Bcc do not. Therefore, in your case, all of the instructions from all of the DLIs are being aggregated at the entry point. The cycle and instruction counts are for all instructions that executed within each function; CPI is just cycles divided by instructions.
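As an illustrative sketch (not Altirra's actual code), the aggregation described above can be modeled like this: every executed instruction is charged to the entry point of the current function, and CPI is just the quotient of the two running totals.

```python
# Sketch of how a sampling profiler might aggregate per-function stats.
# Function boundaries follow the rules described above: a new frame starts
# at a call target, and RTS/RTI ends one, while JMP abs and branches do
# not. This is a model for illustration, not Altirra's implementation.

def aggregate(samples):
    """samples: list of (function_entry_addr, cycles, instructions),
    one entry per executed instruction, attributed to the function
    that was current when it ran."""
    stats = {}
    for entry, cycles, insns in samples:
        c, i = stats.get(entry, (0, 0))
        stats[entry] = (c + cycles, i + insns)
    # CPI is simply total cycles divided by total instructions.
    return {entry: (c, i, c / i) for entry, (c, i) in stats.items()}

# All instructions executed inside the NMI handler, including DLI code
# reached via JMP (which does NOT start a new function), are charged to
# the handler's entry point, e.g. $20E6. Hypothetical sample data:
samples = [(0x20E6, 4, 1), (0x20E6, 2, 1), (0x20E6, 3, 1), (0x20E6, 9, 2)]
result = aggregate(samples)
print(result[0x20E6])  # → (18, 5, 3.6)
```

This is why the entry point shows far more cycles than the three visible instructions account for: everything downstream of the JMP is folded into it.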

 

Remember that the cycle counts include halted cycles, so even if you were using instruction-level sampling you still probably would have seen higher than 4 CPI, 2 CPI, and 3 CPI for the cases you gave above.

 

It wouldn't have affected your example above, but I did find a bug in the profiler that sometimes caused intermediate call frames to get lost. The most glaring result was that the idle loop of some games wasn't always shown correctly. This test version fixes it, and also splits apart DLI and VBI as separate threads:

 

http://www.virtualdub.org/beta/Altirra-2.00-test6.zip

http://www.virtualdub.org/beta/Altirra-2.00-test6-src.zip

 

As for how to read the profiler results, I wouldn't normally go to the detailed view for a DLI routine. DLI routines don't generally have loops or other code with variable timing, and so the profiler doesn't usually tell you anything you didn't already know. Usually when starting a profiling session, I start with the summary view, such as this one from The Last Starfighter's title screen:

 

post-16457-0-36780700-1307862656_thumb.png

 

Sorting by thread and looking at the Clocks% column, we can see that one third of the CPU is being spent in interrupts. That's inconsequential here, because the mainline code just runs an idle loop. Of that one third, about a third of that is in DLIs and the other two thirds is VBIs. For a program with a mainline code that does actual work, the interrupt time is important as it says how much the mainline code is being slowed down. Another thing that the summary view tells you is which functions are not actually getting much useful work done -- very high cycle count compared to instruction count either means high DMA contention or wasted time in STA WSYNCs.
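The thread breakdown described above amounts to summing clocks per thread and normalizing. A small sketch (with hypothetical numbers chosen to match the rough one-third split mentioned, not figures from the actual profile):

```python
# Illustrative sketch (not Altirra's code): computing the Clocks% figures
# you would read off the summary view, grouped by thread.

def clocks_percent(functions):
    """functions: list of (thread_name, clocks) pairs."""
    total = sum(clocks for _, clocks in functions)
    by_thread = {}
    for thread, clocks in functions:
        by_thread[thread] = by_thread.get(thread, 0) + clocks
    return {t: 100.0 * c / total for t, c in by_thread.items()}

# Hypothetical numbers: about one third of the CPU in interrupts, of
# which roughly a third is DLIs and two thirds VBIs.
pct = clocks_percent([("main", 200000), ("DLI", 33000), ("VBI", 67000)])
for thread, p in sorted(pct.items()):
    print(f"{thread}: {p:.1f}%")
```

For a program whose mainline does real work, the DLI+VBI share is the number to watch, since it is time stolen from the mainline code.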

 

Applications tend to show some more interesting results in function sampling. With an Atari BASIC program, for instance, you can see how much time is being spent in the floating point library versus the interpreter itself.

 

There are two other tools that I use for DLIs specifically. The first is the History window, just to check when instructions are executing and how many scanlines it takes, and especially if an STA WSYNC is executing too late. The other is the Recursive NMI Handler Execution mode of the Verifier, which will throw an assertion if it sees a DLI routine taking too long and colliding with another DLI.


RTI also ends a function. JMP abs and Bcc do not. Therefore, in your case, all of the instructions from all of the DLIs are being aggregated at the entry point. The cycle and instruction counts are for all instructions that executed within each function; CPI is just cycles divided by instructions.

 

Remember that the cycle counts include halted cycles, so even if you were using instruction-level sampling you still probably would have seen higher than 4 CPI, 2 CPI, and 3 CPI for the cases you gave above.

...

As for how to read the profiler results, I wouldn't normally go to the detailed view for a DLI routine

...

There are two other tools that I use for DLIs specifically. The first is the History window, just to check when instructions are executing and how many scanlines it takes, and especially if an STA WSYNC is executing too late

...

These explanations are very valuable. I was always looking at a detailed view...

And the History window is excellent: with a breakpoint in the right place, I can see just how much more time I have inside a DLI...

 

Universe makes sense once again :)

Thanks, things are much clearer once again, and most importantly my code is working just as I planned it ;)

 

P.S. You should put all this info in the Altirra help.


If so, then I'm wondering if the same effect can be achieved by a) setting Altirra according to my "ideal" values, and b) adjusting the blues by manipulating the graphics card's colour settings (usually hidden somewhere in the Control Panel's Display settings; it's driver-specific, you know what I mean?).

 

(...)

 

Yesterday I made the screenshot-vs-Altirra comparison on a random computer and noticed that my "ideal" settings were quite off. As it turned out, that particular ATI graphics driver allowed independent colour adjustments for the desktop and for Direct3D applications, and that was the cause :)

 

 

(...)

 

According to current knowledge, adjusting hue 1 (Hue start, as you say) is not possible on any Atari model, and this hue is constant across all units examined so far. That was actually the thing I wanted to verify the most, and your input was quite valuable. Thanks again.

 

 

 

 

My apologies for leaving you in the dark here, but I have been very, very busy lately (traveling, work, etc.). I do have some time to breathe now.

 

I have extracted above your most important questions, which, once answered, should provide the complete picture. In essence:

 

1. I first capture the real HW analog image with a MiniDV cam/VTR (which has reliable analog-to-digital capabilities), and then dump it back to the PC via IEEE 1394 in AVI format.

2. Then I open the AVI file with the TMPGEnc Pro encoder and extract the specific frame in .BMP format (no chroma/hue/luma adjustments whatsoever).

3. On my DUAL-screen home-theater laptop (Dell Studio 1440z, Nvidia), my Sony Bravia KDL52 is screen #1 and the Studio's screen is #2. The video mode is set to max performance / dual screen.

4. The Bravia IS CALIBRATED with a colorimeter (e.g. black level set around 0.10-0.15 cd/m2, white level set to 125 cd/m2, and white point set to the D75 illuminant, all measured with a computer plus a Monaco DTP-94 colorimeter).

5. In addition to the above, minor qualitative adjustments are performed to ensure that the screen shows an SMPTE NTSC 75% pattern correctly when viewed through a blue filter (naked eye). This is because I run a LOT of emulation on this machine, and color-space wise the Bravia can go way beyond NTSC (this is one of the reasons for such deep blues, even when properly calibrated). Not to mention HiDef/HDMI color spaces vs. NTSC, for instance.

6. At this point (knowing exactly what the screen is doing), I fire up Altirra NEXT to the .BMPs captured above, ALL displayed on the Bravia (Display 1). I DO NOT use the 1440z display, which remains active all the time and is used for system monitoring / load / rescue from lock-ups, especially when running the emulators full-screen on the Bravia.

7. Once they are next to each other, you DO NOT touch Nvidia's control panel or the Bravia's settings. You ONLY touch Altirra's Color panel and adjust:

 

a. Brightness & contrast, by generating a gray scale on the Atari (there is a 256-color GTIA color-checker that I used, with a small pause right after generating the gray scale). It is here where black levels and white levels are matched or CORRECTED with respect to the HW (e.g. in case they are under/over-driven by the HW itself).

 

b. Once the 256 colors show up, proceed to re-adjust CONTRAST so that ALL colors in the far-right column become distinguishable, while remaining as bright as your target.

c. Then adjust saturation to match the perceived levels of "density" that you see on the real-HW capture, next to Altirra.

d. Then adjust (in multiple small iterations) Hue Start and Hue Step to hue-match, as closely as possible, the top-first and bottom-last COLOR bars between Altirra and the real-HW captures, as well as the distribution of the clustered blues that appear in the left-center area of the 256-color checker.

e. At the end, EACH color patch should be rendered in Altirra as closely as possible to each color patch on real HW, in terms of hue, saturation and brightness.
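Steps c-e above amount to minimizing the per-patch difference in hue, saturation and brightness between the two images. One crude way to quantify that match numerically (a hypothetical helper, not part of the workflow described) is to compare corresponding patches in HSV space:

```python
import colorsys

def patch_error(rgb_hw, rgb_emu):
    """Compare one hardware-capture patch against the corresponding
    Altirra patch in HSV space. RGB values are 0-255 tuples.
    Returns (hue error, saturation error, value error), each 0..1."""
    h1, s1, v1 = colorsys.rgb_to_hsv(*(c / 255.0 for c in rgb_hw))
    h2, s2, v2 = colorsys.rgb_to_hsv(*(c / 255.0 for c in rgb_emu))
    dh = min(abs(h1 - h2), 1.0 - abs(h1 - h2))  # hue wraps around
    return dh, abs(s1 - s2), abs(v1 - v2)

# Identical patches report zero error in all three components.
print(patch_error((128, 64, 200), (128, 64, 200)))  # → (0.0, 0.0, 0.0)
```

Iterating the Hue Start / Hue Step / saturation controls while watching the worst-case patch error would be a numerical version of the eyeball matching described above.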

 

Hope this helps a bit. In my current setup, the balance of color fidelity and contrast is as good as it can get, judging by naked eye (Altirra vis-a-vis HW captures).
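As a side note, the calibration targets quoted in step 4 imply an on/off contrast ratio in roughly the 830:1 to 1250:1 range, which is easy to verify:

```python
# Quick check of the on/off contrast ratio implied by the calibration
# targets in step 4 (white 125 cd/m2, black 0.10-0.15 cd/m2).
white = 125.0
for black in (0.10, 0.15):
    print(f"black {black:.2f} cd/m2 -> contrast {white / black:.0f}:1")
```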

 

Regards,

 

F.

Edited by Faicuai

Thanks for the details, that's clearer now. So in fact you calibrate Altirra to match the BMP screen captures, when both are displayed on the same TV.

 

In that case, I don't really understand why adjusting Altirra according to the "ideal" mathematically-derived settings would not be the goal. Since they are "ideal", they are guaranteed to match the BMP almost perfectly. (I'm only talking about colours here; I've mentioned earlier that Altirra currently cannot reproduce the brightness/contrast of all 16 luminances perfectly.)

 

But there's one thing that might shed some light on the issue. Do you happen to use Windows Vista's/7's "Windows Photo Viewer" to display the screen captures? I've just discovered that this application can change the displayed image's parameters, such as contrast or saturation. The Photo Viewer does it according to Windows' "Color Profile" settings.

 

Now, Altirra ignores the system's Color Profiles and displays its screen without any recalibration. For this reason, when displaying the screen capture side-by-side for comparison, you should also use an image viewer that ignores the Color Profiles. Windows' Paint is good for this task (compare the same BMP between Photo Viewer and Paint - you should notice the difference).

 

If that's the issue, it would nicely explain why the "ideal" settings look off for you.

Edited by Kr0tki
