
Bryan

Members
  • Content Count

    10,914
  • Joined

  • Last visited

  • Days Won

    11

Everything posted by Bryan

  1. Was SID really the target? It can do some neat things, but it isn't a hard chip to beat if you're starting from scratch. We had 8-channel Yamaha FM chips by then that Atari Games was using in coin-ops. Maybe AMY would have been better, but it sounds like it was either too ambitious or no one was really willing to spend the money to fix its problems.
  2. Here you go: http://atariage.com/forums/blog/695/entry-14205-install-uav-ac-in-the-400/ I'm working on a new revision of the AC board that I can have automatically assembled for me, so I'll be out of stock for a little bit.
  3. Well, it also could have to do with trying to make sure it can be crammed into a 16K cartridge and will run on a 16K system. Sometimes you don't get to write the engine you'd like to.
  4. Adding Maria is difficult because it needs to be the DMA controller and it cannot use DRAM. However, some of Maria's ideas could have been used in a successor to Antic.
  5. Tower Toppler uses artifact colors and it doesn't appear to always produce consistent colors across different 7800's. It appears to combine colored lines with dot patterns to produce a very messy combination. On some captures Level 1 looks green or gray (how does an artifact pattern make gray??!?), and in your picture it looks purple. Does anyone know what the "official" colors are supposed to be? I'm working on some ideas to make the next version more tweakable in this regard.
  6. Yeah, the LUM bits don't seem to reach the output with identical timing (meaning they probably are clocked prior to some additional output logic). That's the main reason for UAV's delayed latching (UAV detects that the LUM has changed, then latches the new value after a small delay).
  7. Yep. That's from something I posted.
  8. Here's what standard NTSC composite looks like. For NTSC, 1 IRE = 7.14 mV, so a "perfect" burst is +/- 143 mV (286 mV peak-to-peak). (Remember to always measure with a 75-ohm load in place.)
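The burst numbers above can be sanity-checked with a little arithmetic. This is just a sketch of the conversion, assuming the standard NTSC scale where 140 IRE spans 1 V peak-to-peak into the 75-ohm load:

```python
# Quick check of the NTSC burst arithmetic above.
# Assumes the standard NTSC scale: 140 IRE spans 1 V peak-to-peak
# into a 75-ohm load (sync tip at -40 IRE, peak white at +100 IRE).

IRE_TO_MV = 1000.0 / 140.0          # ~7.14 mV per IRE

BURST_IRE_PP = 40.0                 # colorburst is 40 IRE peak-to-peak
burst_pp_mv = BURST_IRE_PP * IRE_TO_MV
burst_amplitude_mv = burst_pp_mv / 2.0

print(f"1 IRE = {IRE_TO_MV:.2f} mV")               # 7.14 mV
print(f"burst = +/- {burst_amplitude_mv:.0f} mV")  # 143 mV
print(f"burst p-p = {burst_pp_mv:.0f} mV")         # 286 mV
```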
  9. That's okay, I've had several people send me scope caps of video issues they're having and it's almost always from a better scope than mine. Yeah, when I started trying to define what a "standard Atari signal" should be, I kinda gave up and just aimed for what would be closest to the industry standards.
  10. I'm working on a method to adjust artifacting on the fly so it can be changed on a per-program basis.
  11. I've started warming my fish with a hairdryer at work.

    1. Flojomojo

      hey, anyone know what you call a fish with no eyes?

    2. GoldLeader

      OK, Pull your pants up and step away from that dryer!
  12. Changing the burst level only works if you change it independently of the rest of the color. Atari used this trick in the 6-switch 2600 and then in the 1200XL. TIA/GTIA only puts out one level of color, but on those models the burst was made to be a lower level than at all other times so the colors would be more saturated. I don't know if any TVs adjust brightness, but some of them might set the black level after sync. That's something I need to investigate further.
  13. It's very true that the amplitude of the burst will determine how much artifacting you see on high-rez text. Early sets only calibrated color phase off the burst, but modern sets (starting in the late '70s probably) calibrate color strength as well. A weak burst means the TV will assume all color is weak and it will bump up the "COLOR" control leading to stronger artifacts. This is one of the tough things to set when designing something like the UAV. If the color is strong, then the composite picture gets clearer but faint jailbars can start to appear on some TVs that don't filter Chroma out of the picture very well.
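The burst-referenced gain described above can be modeled in a few lines. This is my simplified sketch of the idea, not code for any specific TV chipset: the set scales its "COLOR" gain so the measured burst matches the nominal reference, which means a weak burst boosts all decoded chroma, including luma artifacts that look like chroma.

```python
# Simplified model of burst-referenced color AGC (a sketch, not any
# real TV chipset): the set scales its "COLOR" gain so the measured
# burst matches the nominal 40 IRE reference. A weak burst therefore
# boosts ALL chroma -- real color AND luma artifacts alike.

NOMINAL_BURST_IRE = 40.0  # standard NTSC burst, peak-to-peak

def chroma_gain(measured_burst_ire: float) -> float:
    """Gain the TV applies to everything it decodes as color."""
    return NOMINAL_BURST_IRE / measured_burst_ire

# A burst encoded at half strength doubles the decoded saturation:
print(chroma_gain(20.0))   # 2.0 -> stronger colors, stronger artifacts
print(chroma_gain(40.0))   # 1.0 -> reference saturation
```

This same relationship is why lowering only the burst (as on the 6-switch 2600 and 1200XL) reads as extra saturation.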
  14. To make things 100% clear, here's a demonstration of how it occurs. I ran the following program:

      10 GRAPHICS 8+16:COLOR 1    (high rez)
      20 POKE 712,148             (blue border)
      30 POKE 710,4               (gray screen = no color)
      40 FOR X=1 TO 319 STEP 2    (skip every other X)
      50 PLOT X,0:DRAWTO X,191    (vertical lines)
      60 NEXT X
      70 GOTO 70                  (halt with screen intact)

      You can see the output in S-Video and in Composite. Instead of lines we now see a solid red screen. Now look at the scope images of a single line from the middle of the screen. First is the Luminance channel. Once we get to the playfield area, we see the alternating lighter and darker pixels that make up the lines. I also took a zoomed-in image to show the pattern more clearly. Next is the Chrominance channel. We see the off-screen colorburst, followed by the widely spaced color information for the blue borders. The playfield is gray, and thus has no color signal. Now look at the 2 Composite (mixed) channel pictures. The Luminance and Chrominance have been combined and now there's a waveform across the entire width of the screen. Because the Atari generates pixels with the same clock as it generates color, the frequency is the same and it is now impossible to separate what came from Luminance and what came from Chrominance. This signal looks just like a colored playfield signal. The C64 uses about a 12% faster pixel clock, so vertical line patterns won't trick the TV like this (also, the faster clock makes its playfield a little narrower).
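The "impossible to separate" point above can be illustrated numerically. This is my sketch, not real decoder code: an on/off pixel pattern at two pixels per subcarrier cycle is a square wave at 3.58 MHz, so a synchronous chroma demodulator recovers a nonzero "color" amplitude from a signal that contains only luma.

```python
# Sketch of why the mixed signal fools the TV: alternating hi-res
# pixels form a square wave at the color subcarrier frequency, so
# correlating against the subcarrier (what a synchronous demodulator
# does) yields a nonzero "chroma" amplitude from pure luma.
import math

SAMPLES_PER_CYCLE = 64    # samples per subcarrier cycle (arbitrary)
N_CYCLES = 200

def demodulate(signal):
    """Correlate a signal with the subcarrier (one demod axis)."""
    total = 0.0
    for i, s in enumerate(signal):
        phase = 2 * math.pi * i / SAMPLES_PER_CYCLE
        total += s * math.sin(phase)
    return 2 * total / len(signal)

# Luma-only pattern: alternating light/dark pixels, two hi-res pixels
# per subcarrier cycle (the Atari 320-mode vertical-line case).
luma_pixels = [1.0 if (i % SAMPLES_PER_CYCLE) < SAMPLES_PER_CYCLE // 2
               else 0.0
               for i in range(SAMPLES_PER_CYCLE * N_CYCLES)]

print(demodulate(luma_pixels))  # ~0.64 -- decoded as color, not zero
```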
  15. Because anything that decodes NTSC video from a Composite source will do it. The signal contains frequencies that must be decoded as color. They only failed to display color if they were using separate Chroma/Luma mode. Hopefully my post above helps.
  16. Neat HAM radio interface on ebay: https://www.ebay.com/itm/173175565736

    1. save2600

      That seller has a lot of neat stuff for sale!

    2. ValkerieSilk

      You had me @ Ham...

  17. Okay, I'll go back and read this after I post the lowdown on artifacting.

      1. It has nothing to do with the display output device (the CRT or the LCD panel). The artifact colors are in the signal before it reaches the tube. The alignment of the image to the phosphors is completely arbitrary (and will actually vary quite a bit as the TV warms up). The size of the phosphors relative to the resolution of the video source differs from set to set. Phosphor triads are NOT related to pixels. The shadow mask blocks the guns from hitting anything except their designated phosphors, but this happens after the continuous analog image has left the guns, giving us sort of a Venetian blind view of each of the 3 individual signals. If R, G and B were run to 3 discrete B&W tubes, you'd still see the artifact colors happening.**

      2. Artifacting is generated in the video decoding circuits. The NTSC color carrier is centered around 3.58MHz. The A8's pixel clock is based on this same frequency, so drawing on-off-on pixels in 320 mode actually puts a 3.58MHz waveform in the picture. The TV sees this and screams, "COLOR!!!1!omg". Actually, the TV only screams 2 colors: one if the pattern is 1010 and another if it's 0101. The colors produced depend on the phase relationship between the on-off waveform and the colorburst wave extrapolated out to the same point in time (which the TV does via a PLL). Artifacting is putting a color-generating waveform into the picture using luma signals instead of the automatically generated Chroma.

      3. S-Video (separated video) won't produce the colors because the monitor knows that color is only contained in the Chroma signal. Those 3.58MHz waves in the Luma don't fool it. With a Composite signal, there's no way for the monitor to know the difference. A 3.58MHz wave in the image = color, no matter what.

      4. Different A8's have different video buffer circuits, and these introduce differing amounts of skew: the delay relationship between Chroma and Luma. This skew changes the colors the Luma appears to be generating because it skews the timing relationship to the colorburst, which is our reference for color.

      5. I suspect that the 800's strange colors are the result of a sub-optimal clock circuit causing an odd duty cycle in high resolution. High resolution is created by showing 2 pixels per 3.58MHz clock cycle. One LUM value is shown when the clock is high, and another when it's low. If the clock isn't a true 50/50 waveform, then even and odd pixels will have different widths, and will create artifact patterns that aren't ideal. This probably explains the odd color shifting and why some monitors may interpret them differently. EDIT: I also noticed a reduced adjustment range with the UAV pot on the 800, which points to a clock issue. These are things on my laundry list for Rev E.

      ** Not as color, of course, but in the relative strengths of the images.
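The "two colors" claim in point 2 can be demonstrated with a toy quadrature demodulator. This is my illustration under simplified assumptions, not a model of any actual decoder: the 1010 and 0101 patterns are the same 3.58 MHz square wave shifted by half a cycle, so the decoded hue angles come out 180 degrees apart, i.e. two complementary artifact colors.

```python
# Sketch of the "two colors" claim: 1010 and 0101 pixel patterns are
# the same subcarrier-rate square wave shifted by half a cycle, so a
# quadrature demodulator decodes them as hues 180 degrees apart.
import math

SAMPLES_PER_CYCLE = 64    # samples per subcarrier cycle (arbitrary)
N_CYCLES = 100

def decoded_hue_degrees(signal):
    """Phase of the subcarrier-frequency content, as a hue angle."""
    i_sum = q_sum = 0.0
    for n, s in enumerate(signal):
        phase = 2 * math.pi * n / SAMPLES_PER_CYCLE
        i_sum += s * math.cos(phase)
        q_sum += s * math.sin(phase)
    return math.degrees(math.atan2(q_sum, i_sum)) % 360

def pixel_pattern(first_on: bool):
    """Two hi-res pixels per subcarrier cycle: on-off or off-on."""
    half = SAMPLES_PER_CYCLE // 2
    cycle = [1.0] * half + [0.0] * half
    if not first_on:
        cycle = cycle[half:] + cycle[:half]  # shift by half a cycle
    return cycle * N_CYCLES

hue_1010 = decoded_hue_degrees(pixel_pattern(True))
hue_0101 = decoded_hue_degrees(pixel_pattern(False))
print(round((hue_0101 - hue_1010) % 360))  # 180
```

The absolute hue each pattern produces depends on the phase reference to the burst (and on the Chroma/Luma skew from point 4), which is why the same pattern gives different colors on different machines.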
  18. Yeah, I was a Jag developer for a short while. It was an ambitious design slaughtered by the need to keep it all really cheap (and by insufficient hardware testing). Even though you could do some fast things with it, nobody in the game industry has the time to figure out how to squeeze every last ounce of performance out of a complicated architecture, especially if the docs are sparse. Well, when you saddle a 6502 with a screen that big, you've got two problems: 1. Most of your RAM will be consumed with graphics, and double buffering will consume all of it. 2. It's slow to update large bitmaps with a slow processor. This is one of the things that hurt the Apple IIgs. It had modes like an ST, but couldn't draw anywhere near as fast.
  19. Only a little bit on topic... I would love to have a crowdfunding campaign to design an 8-bit system from the ground up. Design the NMOS chips and everything using ONLY technology from around 1980 (if there's not enough money for chips, then workalike FPGA DIP's could be used in the board). It would be fun to deliver boxed computers from an alternate history, but based on the architecture ideas we'd most like to see in a single machine. Then we'd write a fictional backstory for the machine and its parent company.
  20. That question applies to most of the 5200's design choices.
  21. The 1650XLD was rumored to have an Intel chip for DOS compatibility. Sure, the Amiga chipset could be modified for 8-bit systems but what good is Amiga hardware in a system that can only see 64K without bank switching? A 320x200x16 color screen is 32K just by itself. Once your graphics get big, you need a big address space. Who knows what Atari would have done if the crash hadn't happened. They sure wasted a lot of money exploring all the possibilities.
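The framebuffer arithmetic above checks out. A quick sketch of the numbers (the loose "32K" is 32,000 bytes, about 31.25K):

```python
# Checking the framebuffer arithmetic above: a 320x200 screen at
# 16 colors (4 bits per pixel) against a 6502's 64K address space.

width, height = 320, 200
bits_per_pixel = 4                        # 16 colors = 2^4

frame_bytes = width * height * bits_per_pixel // 8
print(frame_bytes)                        # 32000 bytes -- ~31.25K
print(f"{frame_bytes / 65536:.0%}")       # ~49% of the whole 64K space
```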
  22. I'm pretty sure the Amiga chipset has 16-bit registers and you'd need some complex scheme to write to them. I also don't know if the chips can be accessed at 6502 speeds. In any case, I don't think that was ever anyone's intention.
  23. I'm pretty sure the Amiga chipset requires a 16-bit environment and would not be at all compatible with a 6502 system.
  24. I tend to disagree. When you've got the world's best-selling micro, why would you continue to act like a company trying to break into the market with an incompatible product? It's an incredible gamble. The Plus/4 was considered a bad idea by most of the press at the time as well.
  25. The TED line should never have been released, as it had drifted too far from its goal of being an ultra-budget machine and was a misguided attempt to move away from the thriving C64 ecosystem. Whereas the 128 probably shot too high, the Plus/4 shot too low. Personally, I think a revised C64 at the same price but with a few added features would have convinced many owners to upgrade and generated more sales than anything else.