
Internal ANTIC and GTIA schematics



Well, theoretically you could disconnect Antic completely from GTIA and drive GTIA yourself, so long as you sent the right commands at the right time and had something fast enough to change what it was displaying.

 

Some observations though:

- With the "scanline 240 bug", the DMA setting will affect the "curtain", i.e. turn it off completely or alter its width. You can change the colour of PF1 at will and it will be reflected there too.

 

- Maybe Antic has a bug similar to the one in the earlier post, where some GTIAs can be fooled into shifting GR. 9 pixels. If you display hires on the last line, it "forgets" to do some of its housekeeping.

 

Another thing - you don't even need hires on scanline 240 to activate the bug. All you need to do is disable DMA further up the screen during a hires mode line.

 

There's no register to send all the commands to GTIA (in software); there's the "bus-load" method for P/M graphics, but not for AN2..AN0. In modes like Graphics 9, AN0..AN1 are mostly just the data DMA'd in, but with ANTIC off, what register do you use to set AN2..AN0?
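(For reference, the "bus-load" method here just means the CPU writing the player graphics registers directly instead of letting ANTIC DMA the data in. A minimal, untested BASIC sketch of the idea, using the standard GTIA register addresses:)

10 GRAPHICS 0
20 POKE 53248,120:REM HPOSP0 - PLAYER 0 HORIZONTAL POSITION
30 POKE 704,88:REM PCOLR0 SHADOW - PLAYER 0 COLOUR
40 POKE 53261,255:REM GRAFP0 - SHAPE BYTE WRITTEN BY THE CPU, NO P/M DMA

Because nothing rewrites GRAFP0 during the frame, the same byte repeats on every scan line and you get a solid vertical bar - which is the point: the CPU, not ANTIC, is supplying the data. There's no equivalent register for AN2..AN0.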

 

I guess you need to generate a scanline or part of it so that the delay of GTIA/ANTIC is less than a CPU cycle. I read that the CTIA was HALF color clock delayed from GTIA. So if you can "throw off" the timing of the GTIA, it should cause a shift like Graphics 10 already does.


What I meant was that we can't drive AN0-AN2 directly, but it would be possible to just install it in a fast computer and, provided the timing was available, it could be functionally the same as in the Atari... but all that's kinda beyond the scope of what we're on about here.

 

You just gave me another idea though... might test it out now... probably won't work but you never know.


About resetting the flip-flop at the "right time", it would have to involve some trick with Gr.10 switching, since IRQs/CPU cycles are always at 1.79 MHz max accuracy and that only corresponds to 2 color clocks.

 

It depends on where exactly the "right time" is. The CPU can write to GTIA every other color clock. If the "right time" is one of those where it can't, then you are out of luck. Otherwise, you are OK. Anyway, this is probably just academic, as it doesn't seem to be reliable (or non-deterministic, or non-consistent, or whatever you want to call it). It might be interesting as far as understanding GTIA internals better, but probably not much more.
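Just to put numbers on "every other color clock" (NTSC figures from memory, so double-check them):

10 REM NTSC: COLOUR CLOCK = 3579545 HZ, CPU CLOCK IS EXACTLY HALF OF THAT
20 PRINT "CPU CLOCK (HZ): ";3579545/2
30 PRINT "COLOUR CLOCKS PER SCAN LINE: ";228
40 PRINT "CPU CYCLES PER SCAN LINE: ";228/2
50 PRINT "CPU CYCLES PER FRAME: ";114*262

So one CPU cycle spans two colour clocks, and a write can only land on one of the two phases - hence "every other color clock".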

 

I guess you agree that, for what it was designed for, it's completely deterministic. I would say all chips are completely deterministic -- it's just that some things are too complicated to analyze (for most of us). It's more of a consistency factor that's significant as far as using the GTIA beyond its spec -- whether it can be made to work in general for all GTIA chips.

 

I'm afraid I'm not sure I understand your point. Are you making a semantic debate about the exact meaning of "non-deterministic"? Or is your point something like nothing is really completely random in the universe? Or what?

 

I read that the CTIA was HALF color clock delayed from GTIA

 

Interesting. Do you happen to remember where you read it?


Another thing - you don't even need hires on scanline 240 to activate the bug. All you need to do is disable DMA further up the screen during a hires mode line.

 

Could you share any code that shows that video sync is being altered, and not just weird Antic DMA behavior? It doesn't matter if the results are useful or not. E.g., you said something above about seeing the color burst on screen.


I crashed the machine yesterday and it coldstarted, so I lost the code that had the luma change which (seemingly) might have simulated part of the VBlank pulses.

It was a fairly short bit of code and a bit crude, so I intend to do another version that works a bit better anyway.

 

Getting the colour burst to show on the display is easy, although I think it's probably dependent on individual TVs and how they handle whatever it is that's going on.

 

Just change the DMA mode from narrow/std/wide and you should get different effects.

 

E.g. (first set up the distorted screen):

 

5 GRAPHICS 0

10 DL=PEEK(560)+PEEK(561)*256:POKE DL-3,112:POKE DL-2,112:POKE DL-1,112:REM 3 X $70 (8 BLANK LINES) BEFORE DL

20 POKE 560,PEEK(560)-3:REM MOVE SDLSTL BACK 3 BYTES

 

then just try POKEing values 21, 22 or 23 into SDMCTL (559).
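Something like this (untested, and the line numbers/delay are just my guess at a convenient spot) will cycle through the three values so you can watch the effect change:

30 FOR I=21 TO 23:POKE 559,I:REM SDMCTL - NARROW/STD/WIDE WIDTH BITS
40 FOR J=1 TO 500:NEXT J:NEXT I:GOTO 30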


I read that the CTIA was HALF color clock delayed from GTIA

 

Interesting. Do you happen to remember where you read it?

Probably in COMPUTE!'s First Book of Atari Graphics:

Compatibility Between CTIA And GTIA

 

(...) We have, however, come across one discrepancy between the CTIA and GTIA. The video signal generated by the GTIA is shifted one half color clock, so colors produced by artifacting, such as in POOL 1.5 or Jawbreakers, will be different.(...)

Because of the half color clock shift, it is now possible for players and playfields to overlap perfectly, whereas with the CTIA they didn't.


 

Thanks a lot Kr0tki!

 

...The video signal generated by the GTIA is shifted one half color clock, so colors produced by artifacting, such as in POOL 1.5 or Jawbreakers, will be different.(...)

 

Yeah, the artifact discrepancy is known, and makes sense.

 

Because of the half color clock shift, it is now possible for players and playfields to overlap perfectly, whereas with the CTIA they didn't.

 

That doesn't make any sense whatsoever to me! Either I don't understand what the author meant in this sentence, or it is (IMHO) just plain nonsense. Does anybody think otherwise, or have a better (or different) interpretation of this sentence?


My interpretation is that sprites in CTIA were shifted by half a colour clock, one way or another, in relation to playfield graphics. Needs verification on a real Atari. How many CTIAs have survived to this day?


My interpretation is that sprites in CTIA were shifted by half a colour clock, one way or another, in relation to playfield graphics.

 

Yeah, that's my interpretation as well. And that's exactly what doesn't make any sense (to me).

 

Besides the fact that I don't recall this being mentioned anywhere else (contrary to the artifact incompatibility, which is well known), there are some huge technical issues.

 

For starters, this would mean that CTIA would be able to change the chroma signal with half color clock granularity (not hirez color, but colors shifted with hirez precision), in the very same scan line!

 

Needs verification on a real Atari. How many CTIAs have survived to this day?

 

Good question. They are probably pretty rare, and among the few people that might have one, some might not even know it.


For starters, this would mean that CTIA would be able to change the chroma signal with half color clock granularity (not hirez color, but colors shifted with hirez precision), in the very same scan line!

Or someone miscalculated a propagation delay in the CTIA, and it was fixed in the GTIA.


Or someone miscalculated a propagation delay in the CTIA, and it was fixed in the GTIA.

 

Can you explain how a wrong propagation delay could create that effect?

Can you explain how CTIA could generate chroma on a half clock basis?

Can you explain how a wrong propagation delay before the priority process (otherwise it couldn't be different for players than for playfields) could create a half color clock shift at all?

 

If PMGs had a half-CC shift, then that would have implications for collision detection too. There'd be a whole heap of games that would be unplayable.

 

Not sure about that. It shouldn't necessarily be a problem, because collision works at color clock boundaries.
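For anyone who wants to poke at it, a rough (untested) BASIC sketch using the normal GTIA collision registers - the format is one bit per playfield colour, latched until you clear it with HITCLR:

10 GRAPHICS 3:COLOR 1:PLOT 10,0:DRAWTO 10,19:REM A PF0 LINE DOWN THE SCREEN
20 POKE 53248,88:POKE 704,88:POKE 53261,255:REM PLAYER 0 STRIPE VIA GRAFP0
30 POKE 53278,0:REM HITCLR - CLEAR OLD COLLISIONS
40 FOR J=1 TO 50:NEXT J:REM LET A FRAME OR TWO GO BY
50 PRINT PEEK(53252):REM P0PF - BIT 0=PF0, BIT 1=PF1, ETC.
60 GOTO 30

(Player position 88 is just chosen to land on pixel column 10 of the GR.3 playfield; as noted above, hits are registered at colour clock granularity.)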


Maybe it wasn't; maybe chroma was bleeding between PMG and PF.

 

Hi Kr0tki,

 

I'm not talking about the video distortion produced by fast chroma changes (such as half a color clock of red next to half a color clock of blue). I'm talking about something completely different: not about chroma in hirez, but about the supposed capability of CTIA to produce (disregarding how it would be displayed by the TV/monitor) chroma (or even luma, for that matter) with half color clock granularity.

 

Imagine you have a simple two-color-clock player with HUE/LUMA X near the left edge, and a simple two-color-clock playfield with HUE/LUMA Y near the right edge. In between, everything is black. No color bleeding at all between the playfield and the player; they are too far apart.

 

Yet the player would be in horizontal position (say) 52, and the playfield in horizontal position (say) 180.5! CTIA would need quite some logic to achieve this (again, disregarding how the TV/monitor would display it).

 

If you check Perry's picture of the GTIA schematics once again, you will see a fair amount of tricky logic dedicated just to producing the high-rez half color clock luma. And this is for a very simple signal, one that affects LUMA only, and without any palette lookup at all!

 

Anyway, the discussion won't make any sense until it's verified...

 

Maybe, but I think we all agree that finding a working CTIA won't be easy...


About resetting the flip-flop at the "right time", it would have to involve some trick with Gr.10 switching, since IRQs/CPU cycles are always at 1.79 MHz max accuracy and that only corresponds to 2 color clocks.

 

It depends on where exactly the "right time" is. The CPU can write to GTIA every other color clock. If the "right time" is one of those where it can't, then you are out of luck. Otherwise, you are OK. Anyway, this is probably just academic, as it doesn't seem to be reliable (or non-deterministic, or non-consistent, or whatever you want to call it). It might be interesting as far as understanding GTIA internals better, but probably not much more.

 

I guess you agree that, for what it was designed for, it's completely deterministic. I would say all chips are completely deterministic -- it's just that some things are too complicated to analyze (for most of us). It's more of a consistency factor that's significant as far as using the GTIA beyond its spec -- whether it can be made to work in general for all GTIA chips.

 

I'm afraid I'm not sure I understand your point. Are you making a semantic debate about the exact meaning of "non-deterministic"? Or is your point something like nothing is really completely random in the universe? Or what?

...

 

The only option for getting to the right time at 1.79 MHz is to somehow exploit the internal delays of the chip, which people have been trying already, mostly with failure or inconsistent results. If the timer allowed interrupting on color clock boundaries, you would have more choices for finding the right time (if it doesn't actually work right off the bat).

 

I'm not arguing semantics; you can have inconsistent results and be deterministic simultaneously. People have gotten the color clock shift consistently on a particular machine and yet inconsistently across different machines, but the chip is deterministic in both cases.

 

Yeah, I also claim nothing in the universe is random, but I was trying to discuss just the chips that most people here know something about.

 

>>I read that the CTIA was HALF color clock delayed from GTIA

 

>Interesting. Do you happen to remember where you read it?

 

Looks like people already answered this for you. All you need to do is drop a CTIA chip into the GTIA socket and see if the results verify what some people have claimed.


I'm not arguing semantics; you can have inconsistent results and be deterministic simultaneously. People have gotten the color clock shift consistently on a particular machine and yet inconsistently across different machines, but the chip is deterministic in both cases.

 

Oh, I see now what you mean. Sorry for the misunderstanding.

 

It is possible that Bryan's trick is deterministic on specific chips, hard to be sure. Anyway, I still believe that some GTIA timings are not fully deterministic. In the sense that the same chip might behave differently in different computers, and also in the very same system depending on such things as small temperature and voltage variations.

 

All you need to do is drop a CTIA chip into the GTIA socket and see if the results verify what some people have claimed.

 

Indeed, that's all that I need. If you would lend me a working CTIA, I promise I would do it in a second.


Something about CTIA that I remember:

 

The CTIA artifacting is not just half a colour clock different from GTIA. CTIA can produce an orange colour when artifacting black-and-white hi-res that GTIA can't produce.

 

When I implemented Blargg's NTSC emulation, I added a phase-shift parameter that can adjust the artifacting colours any way you like. Games designed for CTIA just don't look right unless this parameter is adjusted to produce the "CTIA orange" colour.

 

So CTIA games can't simply be modified for GTIA by shifting the display 1/2 colour clock over to match the original shift.

 

There is an explanation here:

http://www.xmission.com/~trevin/atari/video_notes.html

 

I wondered if the issue with CTIA players overlapping only concerned the high-res luma output. If that output was off somewhat it could mean that an even-odd pair of high-res pixels might not exactly overlap one standard-width player pixel. Of course this would also be a problem for colour changes on the fly, not being specific to players.


I'm not arguing semantics; you can have inconsistent results and be deterministic simultaneously. People have gotten the color clock shift consistently on a particular machine and yet inconsistently across different machines, but the chip is deterministic in both cases.

 

Oh, I see now what you mean. Sorry for the misunderstanding.

 

It is possible that Bryan's trick is deterministic on specific chips, hard to be sure. Anyway, I still believe that some GTIA timings are not fully deterministic. In the sense that the same chip might behave differently in different computers, and also in the very same system depending on such things as small temperature and voltage variations.

...

 

Well, even if you take a more complex example like a Pentium IV, where heating kicks in the power management and dynamic frequency adjustments: if you knew all the factors involved, including how heat affects transistors, the air temperature, the caching equation, memory speed, and all the other forces in the system, you could calculate when the power management will kick in and the memory speed, and then predict the cycles taken to execute some code.

 

>>All you need to do is drop a CTIA chip into the GTIA socket and see if the results verify what some people have claimed.

 

>Indeed, that's all that I need. If you would lend me a working CTIA, I promise I would do it in a second.

 

I have an old Atari 400, but its chip was upgraded before I got it from eBay. Currently it artifacts a blue line in the middle of GTIA mode 9 (Graphics 9) when drawing shaded bars.
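(In case someone wants to reproduce that, the kind of shaded-bar test I mean is roughly the following - untested as typed, but it should draw 16 luminance bars across a GR.9 screen:)

10 GRAPHICS 9:SETCOLOR 4,4,0:REM COLBK HUE SETS THE COLOUR OF ALL GR.9 PIXELS
20 FOR I=0 TO 15:COLOR I:REM LUMINANCE 0-15
30 FOR X=I*5 TO I*5+4:PLOT X,0:DRAWTO X,191:NEXT X
40 NEXT I
50 GOTO 50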


I wondered if the issue with CTIA players overlapping only concerned the high-res luma output. If that output was off somewhat it could mean that an even-odd pair of high-res pixels might not exactly overlap one standard-width player pixel. Of course this would also be a problem for colour changes on the fly, not being specific to players.

 

Yeah, that would make much more sense. It would affect not only PF color changes on the fly, but also the vertical alignment between, say, a scan line of mode $E and one of mode $F. But that might be harder to see depending on the TV.

 

Actually, I think that this has to be off. After all, isn't this the very same reason for the difference in artifacts? I mean, the half color clock shift that produces a change in artifacts is a shift with respect to what? It can't be a shift of the whole display (both Chroma and Luma) in relation to sync, blank, or Antic, because that wouldn't change artifacting. It must be a change in the relation between Chroma and Luma.


That type of artifact is probably from the "0111" to "1000" luma line transition which was talked about elsewhere.

 

Supposedly the luma circuitry takes longer to go from a 0 to 1 state than it does from a 1 to 0 state.

 

It's specific to the old Atari 400 machine that I have, since it does not show up on the Atari 800/XL/XE/XEGS. Were you including circuitry external to the GTIA when you said the "luma circuitry" is affected by propagation delay?


Someone else recently explained it in another thread.

 

I'm pretty sure it's also the case on all machines I've ever owned (400, 600XL and the 800XL, 130XE & XEGS I now have).

 

I'm no electronics expert but I'd put it down to the externals, and not inside GTIA.

Maybe a little time lag in the voltage ramping up?

 

With 0111 you have the biggest voltage off, and full output on the smaller ones.

1000 has the biggest voltage coming on and no output on the smaller ones.


Someone else recently explained it in another thread.

 

I'm pretty sure it's also the case on all machines I've ever owned (400, 600XL and the 800XL, 130XE & XEGS I now have).

 

I'm no electronics expert but I'd put it down to the externals, and not inside GTIA.

Maybe a little time lag in the voltage ramping up?

 

With 0111 you have the biggest voltage off, and full output on the smaller ones.

1000 has the biggest voltage coming on and no output on the smaller ones.

 

There's some mention of color delay line output and voltage level on Pin 17 of GTIA (Vdel) in the link given in this thread:

 

ftp://ftp.pigwa.net/stuff/collections/nir...20Info/GTIA.PDF

 

Starting from Sheet 21, it's talking about the 16 colors and their min/max time in nanoseconds (although the min. time is left blank).

 

Looks like both the GTIA and external circuitry have their delays...


That's different - the colour delay just relates to the phase shift for the colour carrier.

 

The luma lines run through resistors to break the 5 Volts into fractional signals that are combined later.

Obviously, the higher-voltage lines take longer to charge up, although it's on the order of nanoseconds - but it's still recognisable on the display when certain luma transitions take place.
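To put rough numbers on it (treating the four luma lines as a plain binary-weighted sum - the weights are just an assumption, the real resistor values will differ):

10 REM LUMA AS A 4-BIT WEIGHTED SUM (WEIGHTS ASSUMED): BIT3=8, BIT2=4, BIT1=2, BIT0=1
20 PRINT "LUMA 0111 = ";4+2+1
30 PRINT "LUMA 1000 = ";8
40 PRINT "IF BIT3 LAGS AND THE OTHERS DROP FIRST: ";0

So on a 0111 -> 1000 transition, a lagging MSB line can make the output momentarily look like a much lower level before it settles at 8, which fits the idea of a brief glitch appearing exactly at that transition.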

