Atari 2600 RGB mod


Yurkie


Here it goes. Draconian (2600RGB + Framemeister):

 

Looks great, thanks!

 

I loved it around 4:15 when you realized you could just hold down fire, and a bit later when you realized you could target the enemy from the back of your ship. :) Nice that you eliminated an entire formation at the end for the bonus points. :thumbsup:

 

 

From the title screen, it looks like YouTube is recording only every 2nd frame. Maybe enable phosphor.

 

That's from a 2600 with the RGB mod; it does look like it could be from Stella.


It's not like a mod for HDMI is going to increase the native resolution of the console. Native-res TIA RGB is the best output a 1977-era console with 160x192 resolution could hope for.

 

Either way, you're still going to be upscaling the picture on an HD set, whether you use a dedicated RGB-to-HDMI converter or you let the TV upconvert the analog signal. You would have to build an entirely new motherboard for the Atari 2600 to have it output true HD resolutions. Everything else is an analog-to-digital compromise. Knowing this, native-res RGB is the best you can get without designing an entirely different console.
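To put numbers on that (a rough sketch of my own, using the commonly cited 160x192 active TIA picture and ignoring overscan and pixel aspect ratio), something in the chain has to multiply the picture up by a big factor no matter which path you pick:

```python
# Illustrative arithmetic only: how much the 2600's picture must be blown up
# to fill common panel resolutions (160x192 is the usual TIA active picture).
ATARI_W, ATARI_H = 160, 192
targets = {"480p": (640, 480), "720p": (1280, 720), "1080p": (1920, 1080), "4K": (3840, 2160)}
for name, (w, h) in targets.items():
    print(f"{name}: {w / ATARI_W:.2f}x horizontal, {h / ATARI_H:.2f}x vertical")
```

Whether that multiplication happens in a converter box or inside the TV, it has to happen somewhere.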

 

Okay first, you're responding to a post that's two years old.

 

Second, you blather on and on about resolution, but I said nothing about resolution. What I said was "For a product intended to make it easier to hook a 2600 to modern A/V equipment, the design seems about 10 years out of date." This statement is true. RGB, S-video, and composite inputs are rapidly vanishing from modern TVs. It's an HDMI world now.


 

Okay first, you're responding to a post that's two years old.

 

Second, you blather on and on about resolution, but I said nothing about resolution. What I said was "For a product intended to make it easier to hook a 2600 to modern A/V equipment, the design seems about 10 years out of date." This statement is true. RGB, S-video, and composite inputs are rapidly vanishing from modern TVs. It's an HDMI world now.

The fact that your post is two years old doesn't make it any less wrong.

 

Yes, I'm aware that S-video and RGB have fallen into obscurity as input formats on current-generation TVs. Composite, not so much; it is now the industry-standard legacy input format. You would be hard-pressed to find a current-gen TV that doesn't have one. I'm sure there are some crappy low-end HDMI-only TVs out there, but if you're hooking an Atari up to one of them... yeah, why bother?

 

The point you apparently missed with my "blathering" about resolutions is that the Atari's native 160x192 resolution is NOT supported by HDMI and is not included in the HDMI spec. The minimum resolution supported by HDMI is 480p. You would need to include a scaler in an Atari HDMI video mod to scale the Atari's native res up to at least the minimum HDMI spec. At that point you're building an RGB-to-HDMI scaler device and stuffing it into an Atari case, which is stupid.
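For illustration only (this sketch is mine, not part of any existing mod), the minimal sort of scaling step such a device would need just to reach a 480p-class signal might look like this, using plain nearest-neighbour repetition plus letterboxing:

```python
import numpy as np

def scale_160x192_to_480p(frame):
    """Toy nearest-neighbour upscale of a 160x192 frame onto a 640x480 canvas.

    4x horizontally, 2x vertically, centred with black borders. A real scaler
    would also correct the TIA's wide pixel aspect ratio; this just shows that
    some scaling stage is unavoidable before the signal meets the HDMI spec.
    """
    assert frame.shape[:2] == (192, 160)
    scaled = np.repeat(np.repeat(frame, 2, axis=0), 4, axis=1)      # 384 x 640
    canvas = np.zeros((480, 640) + frame.shape[2:], dtype=frame.dtype)
    top = (480 - scaled.shape[0]) // 2                              # 48-line letterbox
    canvas[top:top + scaled.shape[0]] = scaled
    return canvas
```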

Edited by DrSidneyZweibel

At that point you're building an RGB-to-HDMI scaler device and stuffing it into an Atari case, which is stupid.

 

You stamping your feet and declaring it stupid does not in fact make it stupid. Being able to plug a 2600 directly into an HDMI port would be very convenient, therefore not stupid at all.

 

And if you'd even bothered reading up on this mod, you'd be aware that it's not directly converting the 2600's video signal; it's synthesizing its own video signal. So the circuit generating the output video could theoretically be modified to scale up its output to any arbitrary resolution.


 

You stamping your feet and declaring it stupid does not in fact make it stupid. Being able to plug a 2600 directly into an HDMI port would be very convenient, therefore not stupid at all.

 

And if you'd even bothered reading up on this mod, you'd be aware that it's not directly converting the 2600's video signal; it's synthesizing its own video signal. So the circuit generating the output video could theoretically be modified to scale up its output to any arbitrary resolution.

Why not just put a hinged LCD screen attached directly to the back of the 2600 and bypass the TV altogether? No muss, no fuss. No need for HDMI cables.

 

The point is, you're not gaining anything by putting a scaler into a 2600 over plugging a composite, S-video, or RGB cable into any HD TV and letting its built-in scaler do the work. The resultant picture will be indistinguishable. But with the composite/S-video/RGB mod, you also retain the ability to use a pre-HDMI CRT TV or an RGB-compatible monitor.

Edited by DrSidneyZweibel

Yes, here we go. No more RGB in TVs. And the fun begins! Once those LCD TVs get rid of analogue inputs, I am sure someone will come over here with a thread designing an HDMI daughterboard. I bet it is going to happen. And it is all wrong. The Atari 2600 was never, ever intended to be RGB. It is precisely RF + CRT. If you want something different, use different hardware. But why you need to butcher an existing 2600, risking shortening its potential life with improper design changes, is beyond me.

Edited by maiki

@DrSidneyZweibel: Don't worry about posting in a two-year-old thread. The guy likes to call out people who do, for whatever good it does the world.

 

---

 

Why not just put a hinged LCD screen attached directly to the back of the 2600 and bypass the TV altogether? No muss, no fuss. No need for HDMI cables.

 

Well.. yes..

 

Thing is, though, how far do you (we) want to mod vintage hardware? I personally prefer to leave the VCS console intact and completely stock, and game entirely in digital. But that's me. I bet that in the future unmodded consoles will sell for more than modded consoles.

 

Point being, mods are an interim solution. Patchwork. And, today, you're modding an old console to work with other old, outdated standards. If you're going to play through HDMI on that new 4K TV, why not synthesize the signal in HDMI from the get-go? This means playing via emulation: an emulator in a box, emulation on the PC, a Flashback-type SoC, or a console built from the ground up to output digitally, like something from Analogue.

 

It really is the way forward.

 

Most people I've observed who've gotten back into the VCS usually follow similar paths, but all end up where I just described. They start out by getting a console and cartridges from fleabay, try their existing TV, then do mods, try some sort of Framemeister framebuffer thing, try more mods, start looking for a real CRT, get frustrated (with setup, or even with finding one), then start looking at more digital options and converters. Then they finally ask about new hardware that outputs directly to the TV via HDMI.

 

I won't poo-poo the CRT and its original back-in-the-day look or anything. In fact, it is one of emulation's holy grails: to synthesize a simulated CRT-style look. Some emulators have CRT and NTSC effects built in, but they're all pretty primitive and need significant amounts of work.

 

They've got some of the blurring and smearing covered, maybe even scanlines, if you don't make them too intense. But scintillation and blooming? Not even attempted yet.
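For the curious, here is a toy sketch (my own, in Python/NumPy; it does not represent any particular emulator's filter) of what that primitive blur-plus-scanlines style of effect boils down to:

```python
import numpy as np

def crude_crt_effect(rgb, blur=0.25, scanline_strength=0.3):
    """Very rough CRT-ish filter: smear each pixel into its left neighbour,
    then darken alternate lines to fake scanlines. Scintillation and blooming,
    as noted above, are not attempted. Expects floats in [0, 1], shape (H, W, 3).
    """
    out = rgb.astype(np.float64)
    out[:, 1:] = (1.0 - blur) * out[:, 1:] + blur * out[:, :-1]   # horizontal smear
    out[1::2] *= (1.0 - scanline_strength)                        # scanline darkening
    return np.clip(out, 0.0, 1.0)
```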


Yes, here we go. No more RGB in TVs. And the fun begins! Once those LCD TVs get rid of analogue inputs, I am sure someone will come over here with a thread designing an HDMI daughterboard. I bet it is going to happen. And it is all wrong. The Atari 2600 was never, ever intended to be RGB. It is precisely RF + CRT. If you want something different, use different hardware. But why you need to butcher an existing 2600, risking shortening its potential life with improper design changes, is beyond me.

A case can be made that you're not altering the output of the Atari at all by converting it to composite or S-video, since those are native output formats the console creates before the signal gets to the RF converter. Remember, it needs to CONVERT the output to RF.

 

Also, the point is to remove RF signal noise, which is achieved very easily by bypassing the RF converter and amplifying the raw TIA video output.

 

But you're right - RGB is not what the Atari intended.


I consider the RF modulator an integral part of the VCS. All VCS units came that way out of the box. On the schematic it may look like a separate block; inside the unit it may even be a third-party part, a parasite on the PCB. But it is part of the essence of the VCS.


I consider the RF modulator an integral part of the VCS. All VCS units came that way out of the box. On the schematic it may look like a separate block; inside the unit it may even be a third-party part, a parasite on the PCB. But it is part of the essence of the VCS.

It was a necessary evil, given that the vast majority of commercially available TVs in 1977 only had RF inputs. In fact, most didn't even have 75-ohm coax inputs, only 300-ohm screw terminals for the antenna.

 

The bottom line is, nostalgia aside, RF input looks like dogshit, and it is 2017, when nearly every other input format does not look like dogshit. If you want to continue to look at dogshit nostalgically, have fun. But coming into a thread about a video mod to argue against using video mods is not a good look.

Edited by DrSidneyZweibel

It was a necessary evil, given that the vast majority of commercially available TVs in 1977 only had RF inputs. In fact, most didn't even have 75-ohm coax inputs, only 300-ohm screw terminals for the antenna.

 

The bottom line is, nostalgia aside, RF input looks like dogshit, and it is 2017, when nearly every other input format does not look like dogshit. If you want to continue to look at dogshit nostalgically, have fun. But coming into a thread about a video mod to argue against using video mods is not a good look.

 

It simply was the technology available back then. RF was the standard of choice, and the only standard in mass production. And the VCS was designed with that in mind. A simple hookup was all that was needed, no modding, the out-of-box experience was a good one.

 

And it is 2017, where nearly every other input format is either gone or going away from modern television sets. The proper way to do digital displays is to cut out the analog conversion entirely. No need to compute a game in digital, convert it to analog to send it 3 meters away, and then convert it back into digital for display on a matrix of pixels. Just send it digital from the get-go.

 

Video mods are an interim solution. Most people eventually discover that operating entirely digital, from joystick and game program and microprocessor (collectively known as the source) - direct to HDMI TV - is the best solution in 2017. Digital sourcing to a digital display is superior all around. With analog, there's still too much slop and variation in specification to get a nice, clean, consistent picture across many different and varied setups.

 

It's why the Retron 77 thread continues to grow with hope it will come to fruition. It's why the Analogue consoles are hot items. It's an all-digital solution. It's a supporting reason why those small "flashback" consoles are important: they're all digital. They're convenient. They work trouble-free. You don't need to struggle with a half-assed implementation. Just take the right, 2017, way.

 

Don't fret it, son. You'll catch up in good time. It's a trial and error thing for nearly everyone.


 

It simply was the technology available back then. RF was the standard of choice, and the only standard in mass production. And the VCS was designed with that in mind. A simple hookup was all that was needed, no modding, the out-of-box experience was a good one.

 

And it is 2017, where nearly every other input format is either gone or going away from modern television sets. The proper way to do digital displays is to cut out the analog conversion entirely. No need to compute a game in digital, convert it to analog to send it 3 meters away, and then convert it back into digital for display on a matrix of pixels. Just send it digital from the get-go.

 

Video mods are an interim solution. Most people eventually discover that operating entirely digital, from joystick and game program and microprocessor (collectively known as the source) - direct to HDMI TV - is the best solution in 2017. Digital sourcing to a digital display is superior all around. With analog, there's still too much slop and variation in specification to get a nice, clean, consistent picture across many different and varied setups.

 

It's why the Retron 77 thread continues to grow with hope it will come to fruition. It's why the Analogue consoles are hot items. It's an all-digital solution. It's a supporting reason why those small "flashback" consoles are important: they're all digital. They're convenient. They work trouble-free. You don't need to struggle with a half-assed implementation. Just take the right, 2017, way.

 

Don't fret it, son. You'll catch up in good time. It's a trial and error thing for nearly everyone.

I'm not fretting anything. Every TV I own has a composite input. Installing a composite mod in an Atari takes under an hour and the soldering skills of a chimp on cocaine. I have pristine, RF-interference-free, clear-as-a-bell Atari 7800 output on my 4-month-old 4K TV, and it looks glorious.

 

If I wanted to play emulators on my TV (that's what the Retron 77 is - an emulator, NOT a real Atari), I'd hook up a laptop to my TV and run every ROM ever made.


A good reason NOT to put an HDMI output on the Atari 2600: HDMI requires upscaling, and upscalers evolve. The Framemeister and OSSC are good, but not perfect. In a few years we will have better upscalers. We don't want a mod that has to be upgraded in a few years. Resolutions increase. TVs suck at upscaling, and they probably always will. If we output 480p, we are letting the TV do an awful job of upscaling it to 1080p. So, let's output 1080p? OK, but 4K is already available. You get the picture...

 

A good reason to put RGB output on the Atari 2600: most of your other consoles have RGB output (Mega Drive, Super Nintendo, PlayStation, etc.). It's better (and cheaper) to have a single upscaler for them all. In a few years, I can change only the upscaler and all my consoles will output 4K. No need to ever change the Atari 2600 mod again.


A good reason NOT to put an HDMI output on the Atari 2600: HDMI requires upscaling, and upscalers evolve. The Framemeister and OSSC are good, but not perfect. In a few years we will have better upscalers. We don't want a mod that has to be upgraded in a few years. Resolutions increase. TVs suck at upscaling, and they probably always will. If we output 480p, we are letting the TV do an awful job of upscaling it to 1080p. So, let's output 1080p? OK, but 4K is already available. You get the picture...

 

A good reason to put RGB output on the Atari 2600: most of your other consoles have RGB output (Mega Drive, Super Nintendo, PlayStation, etc.). It's better (and cheaper) to have a single upscaler for them all. In a few years, I can change only the upscaler and all my consoles will output 4K. No need to ever change the Atari 2600 mod again.

this^^^

A good reason NOT to put an HDMI output on the Atari 2600: HDMI requires upscaling, and upscalers evolve.

Yeah, except this is an Atari 2600 picture we're talking about. It's non-interlaced and supposed to look blocky. It falls entirely outside the problem domain for scalers, which is generally to make interlaced, photographic imagery look good.

 

This is also a dumb argument, because by that logic nothing should have HDMI output.


Yeah, except this is an Atari 2600 picture we're talking about. It's non-interlaced and supposed to look blocky. It falls entirely outside the problem domain for scalers, which is generally to make interlaced, photographic imagery look good.

 

This is also a dumb argument, because by that logic nothing should have HDMI output.

Except that interlaced games with 30 Hz flicker by design completely fail using an inferior scaler.

You completely missed the point. It is because it should look blocky that it needs a dedicated upscaler which handles 240p content without attempting to deinterlace. Movies, modern video games, and everything else are upscaled by modern TVs just fine.

 

You're arguing against yourself. First you said a 2600 shouldn't have HDMI output because that would require a scaler. Now you're saying it would need a dedicated 240p scaler.

 

Obviously the best scaler for the 2600's unique video output would be one designed specifically for the 2600, one that generates a standard 720p (or 1080p, whatever) signal. And the best place to position a scaler in the data stream is as close to the raw video output as possible. An internal mod like this one, which directly taps the TIA output pins, would by definition deliver more accurate video data to any scaler than one working from an analog signal.
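As a thought experiment only (the function, palette name, and 6x/3x factors below are mine, not anything this mod actually does), a 2600-specific scaler fed with digital TIA colour indices could be as simple as a palette lookup followed by integer scaling into a 720p frame:

```python
import numpy as np

def tia_frame_to_720p(colour_indices, ntsc_palette):
    """Hypothetical sketch: map a 192x160 array of raw TIA colour indices to RGB
    via a palette table (assumed to hold 128 NTSC entries as RGB triples), then
    integer-scale 6x horizontally and 3x vertically and centre the result in a
    1280x720 frame.
    """
    rgb = ntsc_palette[colour_indices]                              # (192, 160, 3)
    scaled = np.repeat(np.repeat(rgb, 3, axis=0), 6, axis=1)        # 576 x 960
    canvas = np.zeros((720, 1280, 3), dtype=rgb.dtype)
    canvas[72:72 + 576, 160:160 + 960] = scaled                     # letterbox/pillarbox
    return canvas
```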

