Posts posted by ZylonBane
-
Is there a way to know...?
Yes, you just...!
-
If I haven't officially joined, I would also like to enter the fray.
Sorry, your signature block isn't stupidly huge and eye-searingly multicolored enough to be a true member of the 5200 Brotherhood.
-
-
Huh, it's a presentation folder.
-
-
I like it. It's a Jaguar game, so if you like that genre of games, you'll like Skyhammer.
"Jaguar" is a genre of games? What does that even mean?
-
-
So is it the Framemeister or your video capture setup that can't maintain a steady frame lock, messing up the interlace effect on Draconian's title screen?
-
So this converts MARIA's output to a 60 Hz HDMI signal?
-
My only question is as follows: If the Ataribox is released, will it get its own tab on Atari Age next to the Jaguar?
My response is as follows: Of course not. It's just an emulator.
-
Please stop speaking authoritatively about subjects you're guessing about.
Oh, irony.
-
-
The problem is not in the relative Hz of the signal, but the internal mechanics of the display itself. I've never once seen one of these HD displays pull this off, because they all take the 30 Hz flicker and interpolate and convert it to a progressive signal. I'd love to see it done properly.
Yes, you've never seen it done properly because most scalers weren't designed to deal with a nonstandard 240p picture that uses lots of flicker. That's why I keep saying that an HDMI scaler designed explicitly for the 2600's display characteristics would be the way to go. So stop confusing the capabilities of scalers with the capabilities of modern TVs.
2D graphics are better upscaled by just replicating pixels, like OSSC and Framemeister (Picture mode) actually do. At least most people prefer it that way, blocky, not blurry. Emulators actually have specific filters for 2D graphics, "eagle" and "HQ" stuff.
If you apply the same approach for 3D graphics, they don't look good. 3D graphics are better upscaled by interpolating pixels, like TVs actually do. That's why filters for 3D graphics are different.
What you've stated here is pure personal opinion, not fact. Emulators can keep things sharp, but they also have options to make the picture more "analog", with simulated scanlines, blur, chroma bleed, phosphor persistence, etc. People like both. Ditto for 3D graphics-- I hate soft-looking polygon edges. Give me sharp any day.
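The difference between the two upscaling styles being argued about here is easy to show. A minimal pure-Python sketch (the function names and pixel values are my own, purely for illustration): pixel replication keeps a hard edge hard, while interpolation invents in-between values that soften it.

```python
# Hypothetical sketch: nearest-neighbor replication vs. linear interpolation,
# the two upscaling styles discussed above. Values are 8-bit grayscale.

def replicate_2x(img):
    """Nearest-neighbor 2x upscale: every pixel becomes a 2x2 block."""
    out = []
    for row in img:
        doubled = [p for p in row for _ in range(2)]
        out.append(doubled)
        out.append(list(doubled))
    return out

def lerp_row_2x(row):
    """Linear interpolation along one row: midpoints average their neighbors."""
    out = []
    for a, b in zip(row, row[1:]):
        out.extend([a, (a + b) // 2])
    out.append(row[-1])
    out.append(row[-1])  # pad to exactly 2x length
    return out

sprite = [[0, 255],
          [255, 0]]

# Replication: the hard 0/255 edge stays hard (blocky).
assert replicate_2x(sprite)[0] == [0, 0, 255, 255]
# Interpolation: a new in-between value (127) softens the edge (blurry).
assert lerp_row_2x([0, 255]) == [0, 127, 255, 255]
```

Neither result is objectively "correct"; which one looks right is exactly the matter of taste being argued in this thread.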
-
The biggest differences are interlacing and 30Hz flicker. The 2600 used these analog display characteristics for purposeful effect in certain games, not just as a hardware limitation. This cannot be accurately duplicated using digital graphic displays. You're getting an approximation at best, and I've never seen an accurate representation of this, either through a dedicated scaler, or using a display's integrated scaler. They all botch it to varying degrees, in some cases to the point of rendering certain games unplayable.
Yeah it's almost as if the best HDMI scaler for the 2600 would be one created specifically for the 2600, instead of relying on an external generic box.
And the notion that such effects are impossible to replicate on modern TVs is ludicrous. HDMI supports 60 Hz video, so 30 Hz flicker can be represented perfectly well.
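A quick sketch of that point (the function names and intensity values are hypothetical): a sprite drawn only on alternate frames is just an on/off pattern per frame, and a 60 Hz stream carries every one of those frames. What destroys the effect is a scaler that blends adjacent frames, not digitization itself.

```python
# Hypothetical sketch: 30 Hz flicker is an on/off pattern across ~60 fields/s.
# A 60 Hz digital stream has a slot for every field, so the pattern survives
# a 1:1 frame mapping exactly.

def flicker_frames(n_frames):
    """Sprite visibility per frame when drawn only on even frames."""
    return [frame % 2 == 0 for frame in range(n_frames)]

def blended(frames):
    """What a naive scaler does: average adjacent frames, turning hard
    flicker into a steady half-intensity ghost."""
    vals = [255 if f else 0 for f in frames]
    return [(a + b) // 2 for a, b in zip(vals, vals[1:])]

console_frames = flicker_frames(6)        # what the console outputs
hdmi_frames = list(console_frames)        # 1:1 mapping onto 60 Hz frames

assert hdmi_frames == [True, False, True, False, True, False]  # flicker intact
assert blended(console_frames) == [127, 127, 127, 127, 127]    # flicker destroyed
```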
-
If you want to eliminate the analog hole altogether, a Hi-Def 2600 Mod could be designed based on the 2600 RGB mod principle.
For the love of god, read the 2600 RGB mod description page:
"The various luma outputs from the TIA, in addition to the colour data stored is then used to create a RGB version of the original video signal. Essentially, the board creates the RGB video signal by bypassing the colour generation logic in the TIA."
There is no analog hole in the 2600 RGB mod. It only uses raw digital data from the 2600.
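A sketch of that principle (the palette entries below are made up for illustration; a real mod uses a full hue-by-luma table): the hue and luma codes are digital values, so mapping them to RGB is a pure table lookup with no analog stage anywhere.

```python
# Hypothetical sketch of the digital-domain conversion described in the quote:
# TIA hue/luma code in, RGB triple out, via a lookup table. The RGB values
# here are invented for the example, not real mod palette data.

PALETTE = {
    (0x0, 0): (0, 0, 0),        # hue 0, lowest luma: black
    (0x0, 7): (255, 255, 255),  # hue 0, highest luma: white
    (0x4, 4): (200, 60, 60),    # a mid-luma reddish hue
}

def tia_to_rgb(hue, luma):
    """Pure table lookup: no analog signal is ever decoded."""
    return PALETTE[(hue, luma)]

assert tia_to_rgb(0x0, 0) == (0, 0, 0)
assert tia_to_rgb(0x4, 4) == (200, 60, 60)
```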
Again, not true. Most games up to the 4th gen use 2D graphics. They won't look good with the same techniques used for modern 3D graphics. Because content matters.
Content matters! Content matters! Content matters! Bu-bawk!
Clearly you have no idea what you're actually talking about. Otherwise you'd explain what you mean instead of parroting your little catchphrase over and over. Like, maybe you could explain to us the difference between a 2D pixel and a 3D pixel.
-
You use different settings for different consoles to get them converted in the first place. Once they're converted, THAT'S DONE.
How are you not grasping this?
-
Christ, I leave this thread alone for a day and it goes full Keahaha.
Modern gaming consoles may not need a dedicated upscaler since their graphics are completely different and closer to TV content anyway. Again, content matters.
No, the graphics of modern consoles are not completely different. That's another nonsense assertion. They're colored pixels, in a grid. The only difference is density. Once you normalize a 2600's video output to a standard 1080p 60 Hz signal, a 2600's picture is no different from a PS3's. That's the entire point.
-
But take an Atari 2600 and output crisp and sharp digital 480p through HDMI. Now take it to a 4K TV and the graphics will be blurry. You just can't use trivial upscaling methods for retro videogames.
First, I question your premise that blurry output is a bad thing. All commercial 2600 games were designed with the knowledge that they would be viewed via a fuzzy RF connection. They were never sharp. They're not supposed to be sharp.
Second, fine, then have this theoretical 2600 upscaler generate a 1080p signal. There, the 2600 is now in the same boat as every modern gaming console.
-
and thus eliminate the ability to connect the 2600 to an analog input on a CRT, thus rendering useless any game that intentionally employs 30 Hz flicker as a graphic element, or works with a light gun.
Point out where I said the analog connections should be removed.
-
Oh god, you again.
The 2600 RGB mod works by intercepting hue codes directly from the CPU. So no, it works entirely within the digital domain.
-
Then, when 4K or whatever becomes the standard, we will still have the TV incorrectly upscaling the 720p/1080p image. Then what? Either we will have to upgrade the mod, or we will need another upscaler to chain with the internal one.
This right here is where you make the leap into nonsense. Upscaling the signal from an old video game system is tricky, but upscaling a standard HDMI video stream is not. At that point it's just a progressive pixel grid, trivial to scale.
And once again I point out that if this is a problem (it isn't), then by your logic it's a problem with literally every HDMI device ever manufactured.
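A sketch of why the standard-stream case is the easy one (the numbers are just the standard mode dimensions): 1080p to 2160p is an exact 2x integer ratio, so clean pixel replication suffices, with no deinterlacing or flicker handling in sight.

```python
# Sketch: scaling a standard progressive HDMI stream is trivial arithmetic.
# 1080p -> 2160p (4K) is an exact 2x integer ratio: every source pixel maps
# to a clean 2x2 block, unlike the nonstandard 240p case discussed above.

SRC = (1920, 1080)   # 1080p frame
DST = (3840, 2160)   # 4K (2160p) frame

scale_x = DST[0] / SRC[0]
scale_y = DST[1] / SRC[1]

assert scale_x == scale_y == 2.0   # exact, uniform ratio
assert scale_x.is_integer()        # so pixel replication alone is clean
```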
-
You completely missed the point. It is because it should look blocky that it needs a dedicated upscaler which handles 240p content without attempting to deinterlace. Movies, modern videogames and everything else is upscaled by modern TVs just fine.
You're arguing against yourself. First you said a 2600 shouldn't have HDMI output because that would require a scaler. Now you're saying it would need a dedicated 240p scaler.
Obviously the best scaler for the 2600's unique video output would be one designed specifically for the 2600, that would generate a standard 720p (or 1080p, whatever) signal. And the best place to position a scaler in the data stream is as close to the raw video output as possible. An internal mod like this one, that directly taps the TIA output pins, would by definition deliver more accurate video data to any scaler than one that was working from an analog signal.
-
A good reason to NOT put an HDMI output on the Atari 2600: HDMI requires upscaling, and upscalers evolve.
Yeah, except this is an Atari 2600 picture we're talking about. It's non-interlaced and supposed to look blocky. It falls entirely outside the problem domain for scalers, which is generally to make interlaced, photographic imagery look good.
This is also a dumb argument because by that logic nothing should have HDMI output.
-
At that point you're building an RGB-to-HDMI scaler device and stuffing it into an Atari case, which is stupid.
You stamping your feet and declaring it stupid does not in fact make it stupid. Being able to plug a 2600 directly into an HDMI port would be very convenient, therefore not stupid at all.
And if you'd even bother reading up on this mod, you'd be aware that it's not directly converting the 2600's video signal, it's synthesizing its own video signal. So the circuit generating the output video could theoretically be modified to scale up its output to any arbitrary resolution.
-
It's not like a mod for HDMI is going to increase the native resolution of the console. Native-res TIA RGB is the best output a 1977-era console with 160x192 resolution could hope for.
Either way, you're still going to be upscaling the picture on an HD set, whether you use a dedicated RGB-to-HDMI converter or you let the TV upconvert the analog signal. You would have to build an entirely new motherboard for the Atari 2600 to have it output true HD resolutions. Everything else is an analog-to-digital compromise. Knowing this, native-res RGB is the best you can get without designing an entirely different console.
Okay first, you're responding to a post that's two years old.
Second, you blather on and on about resolution, but I said nothing about resolution. What I said was "For a product intended to make it easier to hook a 2600 to modern A/V equipment, the design seems about 10 years out of date." This statement is true. RGB, S-video, and composite inputs are rapidly vanishing from modern TVs. It's an HDMI world now.
-
What, no 7800 port?
Oh well, the NES should be a good home for it too.
-
I wonder if Chris Foss's stuff would look great because it's so colorful, or terrible because it has so many regions of solid color.
-
Skyhammer Thoughts?
in Atari Jaguar
Posted
If I knew, I wouldn't have asked. Taken literally, it's nonsense. The Jaguar is a system, not a genre.