
The specialization of TVs


Recommended Posts

I currently have 2 TVs in my basement, and may need a third. Why?

 

Because no TV is complete by itself. First you've got a CRT TV, which is preferred for retro games because its ping is so low that light gun games play perfectly. That's the only way they work.

 

Then you buy a modern 3D monitor for 3D movies and for modern video games from the 360 era and beyond.

 

There's only one problem: on some games, I question what the best way to play is. I understand most games are designed to tolerate a little ping, but in some games you want every advantage you can get. My 3D monitor had a great-at-the-time 33 ms ping, but now it doesn't cut it, hence low-ping monitors.

 

So if I wanted an "all media monitor," I'd need to split it into 3: a sub-microsecond-ping TV for light gun games, a 3D TV for 3D movies, and a modern <4 ms monitor for modern games.

 

So how do you consolidate monitors?

 

Well, there are light gun adapters out there. The problem is they use Wiimote technology directly, and I've played Ghost Squad enough in "cursorless mode" to know that the line of the gun is never true. You're just using your arms like an analog joystick; muscle memory matters more than lining up the sights.

 

The only solution to that would be a "camera gun" which can sense, using visible light, where you're pointing. There is another light gun which does that. The problem is it's PC only; none work with original consoles. If that technology were applied to consoles as-is, I'd like to try it. Maybe if the low-ping TV is low enough for everything except light gun games, and the light gun games work with true line of sight, that consolidates 3 TVs down to 2.

 

Another issue is 3D. If there were a way to turn any TV into a 3D TV, then I could purchase a 1 ms ping monitor and turn it into a 1 ms 3D monitor.

 

Oh wait, there already is: the Sega Master System's SegaScope 3D. For those games I still need a CRT TV, unless my gaming monitor can do 240p at 30 Hz per eye in native resolution, instead of converting 240p to 480i at 30 Hz for one eye (the left and right fields combine, losing the second eye) and then to 480p at 30 Hz for one eye.
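To show what I mean by losing the second eye, here's a toy model of the field order (my own illustration of a naive "weave" conversion, not how any particular converter is actually built):

```python
# Rough illustration of why naive 240p -> 480i/480p conversion breaks
# SegaScope-style 3D. The console sends 60 pictures per second that alternate
# eyes (L, R, L, R, ...), i.e. 30 Hz per eye. A weave-style converter pairs up
# consecutive pictures into one frame, so each output frame mixes both eyes.

fields_240p = ["L1", "R1", "L2", "R2", "L3", "R3"]   # 60 Hz, alternating eyes

# CRT + shutter glasses: each picture is shown on its own, one eye at a time.
per_eye_crt = {"left": fields_240p[0::2], "right": fields_240p[1::2]}
print("CRT + shutter glasses:", per_eye_crt)

# Weave conversion to 480: consecutive pictures get combined into one frame.
woven_480 = [fields_240p[i] + "+" + fields_240p[i + 1]
             for i in range(0, len(fields_240p), 2)]
print("After 240p->480 weave:", woven_480)   # every frame blends both eyes
```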

 

Sorry about that tangent. The SegaScope is now an unprotected technology. Why can't there be a device which converts 3D Blu-rays, games, and TV shows into alternating frames and has an external processor drive the shutter glasses in sync?

 

Well, 3DNow.com tried it and failed. First, in 2012 the converter was $500, when most TVs were "stepping up to 3D" for $100-200. Second, I doubt they ever successfully made one, because I see none on eBay.

 

3DNow probably never shipped a model because I can see a problem they may have encountered: varying ping times on different TVs. When the 3D is built into the TV, the 3D processor only has to compensate for itself. When the box can be attached to ANY TV, it HAS to accommodate the ping of EVERY TV, from a 1 ms monitor to a heavily processed 480 Hz set (which causes massive ping times when not in game mode).

 

I've got a solution: use an NES Zapper-style, single-pixel, sub-microsecond light sensor as a "camera," and flash the screen when 3D content first appears to activate a "sync mode" where, for less than 0.25 seconds, the left-eye content is a 100% black frame and the right-eye content is a 100% white frame. When pointed at the screen, the sensor can tell the processor, with better-than-microsecond accuracy, when to alternate the shutter frames.
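If it helps, here's roughly what that calibration logic could look like in code. This is only a sketch of the idea: read_sensor() is a stand-in for whatever the light-sensor hardware would actually report, and in this demo it just fakes a 60 Hz black/white alternation.

```python
import time
from typing import List

def read_sensor() -> float:
    """Stand-in for a real single-pixel photodiode; returns brightness 0.0-1.0.
    Here it simulates a screen alternating black/white frames at 60 Hz."""
    period = 1.0 / 60.0
    return 1.0 if (time.monotonic() % (2 * period)) >= period else 0.0

def measure_flip_times(n_flips: int = 8) -> List[float]:
    """Record timestamps of dark->bright edges (start of the white/right-eye frame)."""
    flips = []
    last = read_sensor()
    while len(flips) < n_flips:
        cur = read_sensor()
        if last < 0.5 <= cur:                # rising edge: black frame -> white frame
            flips.append(time.monotonic())
        last = cur
    return flips

def eye_cycle(flips: List[float]) -> float:
    """Average time for one full left+right cycle, from the measured edges."""
    gaps = [b - a for a, b in zip(flips, flips[1:])]
    return sum(gaps) / len(gaps)

if __name__ == "__main__":
    flips = measure_flip_times()
    cycle = eye_cycle(flips)
    print(f"left+right cycle: {cycle * 1000:.2f} ms")
    print(f"flip the shutters every {cycle / 2 * 1000:.2f} ms,")
    print(f"phase-locked to the last bright edge at t = {flips[-1]:.6f}")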

 

What sounds easier: adding shutter 3D to any existing TV using SegaScope-like technology, or polarizing a non-polarized screen? And wouldn't 3D be more enticing if you could just buy a 3D processor and add it to the TV you already have, kind of like the surround sound market for movies and games today, as opposed to it being a package deal?

 

And I believe shutter 3D is superior to polarized 3D. You get full resolution, and can even do 4K 3D with no problem. Even though it darkens whites, it darkens blacks at a higher rate, giving a CRT-like image with inky blacks and higher contrast. And if you tilt your head more than a couple of degrees sideways with polarized 3D, the polarization partially fails, letting one eye (possibly both) see the other eye's image, causing double images, which confuse the picture and cause headaches. I have NEVER gotten a 3D headache with shutter 3D. The Master System and my PS3DTV give me no conflicting images unless the IR glasses are blocked. Now they have radio-synced glasses, which removes the IR problem.

 

So until they make a modern-TV light gun for retro games that actually feels like a light gun, instead of a joystick yank, or they make a 3D adapter which turns 2D TVs into 3D ones, I guess I'll be stuck with 3 TVs and use them all on occasion. My basement is crowded enough as it is.

Edited by tripletopper
Link to comment
Share on other sites

I realize it's not a popular opinion here, but I don't like CRT TVs and am glad to be rid of them. Put your time and effort into finding retrogaming solutions that work well on modern TVs. I realize that's not a simple task, but once you let go of needing to stay in the analog world, you gain some focus.

 

Second, I don't know how much you care about 3D, but in my mind, 3D TV is a failed concept and the tiny amount of 3D content released each year in no way justifies the additional hardware needed to play it.

 

IMO, get yourself a good quality 4K set with a solid low-latency mode. Call it good.

  • Like 4
Link to comment
Share on other sites

Yes. You gain focus and future freedom. Concentrate on adapting the games themselves to the new display technologies of today. And once you've done that, you can move forward much more easily in the future, like getting into even newer display tech and connectivity. For example, all my emulators output to VGA, DVI, DP, or HDMI equally well. I don't have to worry there. HDMI and DP are here, and any other new digital interface will come to the PC sooner or later, so I'm good to go.

 

And yes again, 3DTV is dead. At least no one in my circle of friends is hard up for it.

  • Like 1
Link to comment
Share on other sites

I've let light gun games go, and never had any interest in 3D. I can't find a coin-op arcade today, either. Times change. In the 1980s, I didn't play online games or mobile games.

 

If you want to hold on to the old things you like, what's wrong with having dedicated hardware for it? You can't go for a motorcycle ride in a truck and falafel doesn't taste like ice cream.

  • Like 1
Link to comment
Share on other sites

If you want to hold on to the old things you like, what's wrong with having dedicated hardware for it? You can't go for a motorcycle ride in a truck and falafel doesn't taste like ice cream.

There's nothing wrong with holding on to the old tech, if you have the time, space, money, patience, etc. for it. But if you don't, I think CRTs are one of those things you should give up sooner rather than later. YMMV, of course.

Link to comment
Share on other sites

Old tech. New tech. Old tech was once new. And new tech will become old tech soon enough.

 

The key point is to match all the stuff to the proper era. A 70's console wants a 70's box TV with analog CRT. A 2019 console wants an HDMI enabled flatscreen, preferably in 4K HDR or whatever it is they're doing these days.

 

My new computer wants a USB keyboard. My vintage 486 wants an AT-style connector on its keyboard. I'm not gonna fight those standards. I'm just gonna match them to what they were designed for. And that's what you should do with older analog consoles like the VCS or Intellivision. Anything else is asking for project time and converter circuits and adapters. Adapters and converters that may not work in certain rare situations, corner cases. And once a corner case rears its ugly head, the internet is ready and waiting to blow it out of proportion by saying the whole thing is no good.

 

Well... keyboards may be a poor example because there are adapters that work well. But enter the world of video, and now you have to deal with colorspace, lag, response times, image quality, formats, scaling, signal levels, voltages, frequencies, currents, and a whole lot more. A far cry from simple ASCII codes going in one direction. But you get the idea.

Link to comment
Share on other sites

Right now there is no magic solution to all of this, but there are some options.

 

The 33 ms ping, or lag, is about the same across most modern TVs. They all have some; I have seen some over 90 ms, and my personal 4K TV sits at 55 ms (in 1080p). So RIGHT NOW your only real solution is to find a MODERN PC monitor that was designed around gaming and is large enough to fit your needs, but even higher-end monitors come with a little lag (under 10 ms for the better ones). A really good, large, very-low-lag PC monitor is very expensive.
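To put those numbers in perspective, here's a rough back-of-the-envelope conversion of lag into "frames behind" (assuming a 60 Hz signal; the figures are illustrative, not measurements of any specific TV):

```python
# Rough conversion of display lag into "frames behind" at a given refresh rate.
def lag_in_frames(lag_ms: float, refresh_hz: float = 60.0) -> float:
    return lag_ms / (1000.0 / refresh_hz)

for lag in (10, 33, 55, 90):
    print(f"{lag} ms of lag is about {lag_in_frames(lag):.1f} frames behind at 60 Hz")
# 10 ms ~ 0.6 frames, 33 ms ~ 2.0, 55 ms ~ 3.3, 90 ms ~ 5.4
```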

 

LG announced they are working on the problem. On the NEW 2019 OLED TVs (very pricey right now), they claim to be offering a "low lag mode" so this gets dropped to a minimum. Real-world tests of the EXACT amount of lag on these models are unknown at this time.

 

If you're looking to play retro consoles, a line-doubler device is your best option for viewing them on a modern TV, something like the Framemeister or OSSC.

 

None of this solves light gun games... Third-party updated light gun devices designed for modern TVs are always hit or miss; I've never had any luck with them.

Link to comment
Share on other sites

In computing, a ping is a signal sent between computers requesting a response, and the time to receive the response is what's measured. A TV "responds" to the signal from the video game with a picture we receive with our eyes.

 

When I was a kid, my neighbour had two TVs in their living room, one on top of the other. This was before remote controls. I thought they were crazy, but there was probably a practical reason for it, e.g. sports. You could probably find a modern TV with acceptable ping/latency for video games that's also acceptable for television watching. Maybe compromises have to be made; otherwise there's nothing wrong with having two TVs. And as cooldave said, TV manufacturers are improving ping/latency.

Link to comment
Share on other sites

Is Picture in Picture still a common feature?

 

I guess it is. Our new monitors come with it. And its intended purpose is to provide a "hardware window" drawn by the monitor's circuitry with the ability to take content from the videocard driver on the same input as the main signal. Or it can take content from a 2nd input/source.

Link to comment
Share on other sites

I'm always surprised a TV hasn't been made yet that caters to the gaming/legacy input community. Certainly if one company made a 55" TV with a bunch of inputs (composite, S-Video, VGA, component, and HDMI) and marketed it, it would sell pretty well; enough to warrant production.

 

In the meantime, I keep a CRT because most things (VHS, LaserDisc, game systems) look best on one. And I DO like light gun games and do not want to let them go.

 

And on the other side of my gaming room I keep a flatscreen. Many good ones were made from 2006-2011 that have every input under the sun (as mentioned above).

 

As for a 4K TV... I don't have one upstairs in my living room and probably won't for a long time. 1080p is good enough for me! I do have Blu-rays, but still watch DVDs, VHS, and LaserDiscs as well.

 

If a good all-around retro gaming TV were ever produced, I'd be all over it. In the meantime there are plenty of good options dirt cheap on Craigslist.

 

The problem for me with the OSSC is that you can only hook up one console at a time, there is no S-Video input, and most consoles are easy to mod for S-Video. I don't want to mess with SCART, buying expensive cables for a European format, and having to individually plug in and fine-tune consoles each time I want to use it. It's not a great solution for a person with 20-30 consoles hooked up to multiple TVs.

Edited by travistouchdown
Link to comment
Share on other sites

I'm always surprised a TV hasn't been made yet that caters to the gaming/legacy input community. Certainly if one company made a 55" TV with a bunch of inputs (composite, S-Video, VGA, component, and HDMI) and marketed it, it would sell pretty well; enough to warrant production.

 

Don't be surprised. The retrogaming and legacy-input community is like 0.001% of all TV buyers worldwide. Or smaller.

 

HDMI, DisplayPort, and USB-C are the standards for the near future. They are the only standards capable of addressing the screen's pixels individually, dot by dot, and the only standards capable of delivering the sharpness, clarity, color gamut, framerate, and overall bandwidth.

 

Consumers want that kind of performance (and more!) with 1 easy cable. And building these standards into a television's ASIC controller costs pennies on the dollar.

 

Building legacy standards into a television is also pennies on the dollar excluding the more expensive analog connectors. That tidbit aside, these pennies add up real fast and may cost a TV maker hundreds of thousands of dollars over the lifetime of just one model. And they can't really advertise that connectivity because it's not what the buying public wants.

 

I didn't mention DVI because that appears to be a falling-out-of-favor computer standard from the 90's and early 2000's. The connector is costly to make and there are like 10 sub-versions of the parent standards. So fail.

 

---

 

Likely the best move for vintage hobbyists is to go the mod-and-converter route, or to generate the classic game material directly in digital space via software emulation, FPGA, or rewrites on modern digital-out hardware. Hardware that communicates with the display in its native language is best: the processor knows it's writing a red pixel exactly there on the screen. No analog buffoonery or voodoo to muddy the waters.

 

These past 4 years have seen the adoption of G-Sync and FreeSync with adaptive framerates in the PC world, technology that goes a long way toward cutting latency down to one frame or less. There is no time wasted waiting for the monitor to finish drawing. And to top it off, different parts of the image (as it progresses through the scan) can be sped up or slowed down to stay in lockstep with the GPU.

 

So if the GPU sees itself and the CPU getting ready to output another frame, it'll tell the monitor to haul ass and be ready for the next frame. Minimal, if any, time is spent dawdling with the framebuffer.
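Here's a toy calculation of that saving (simplified assumptions of my own, not how the actual G-Sync/FreeSync protocol is implemented): how long a finished frame waits before the panel starts showing it, with fixed-rate vsync versus adaptive sync.

```python
# Toy comparison: wait time before scanout for a frame that finishes at
# different points within a 60 Hz refresh cycle.
REFRESH_HZ = 60.0
REFRESH_PERIOD_MS = 1000.0 / REFRESH_HZ   # ~16.7 ms

def wait_fixed_vsync(frame_ready_ms: float) -> float:
    """Fixed-rate vsync: the frame waits for the next refresh boundary."""
    next_tick = (frame_ready_ms // REFRESH_PERIOD_MS + 1) * REFRESH_PERIOD_MS
    return next_tick - frame_ready_ms

def wait_adaptive_sync(frame_ready_ms: float) -> float:
    """Adaptive sync: the panel refreshes on demand, so (ideally) no waiting."""
    return 0.0

for ready in (1.0, 8.0, 16.0):   # frame finishes 1, 8, or 16 ms into the cycle
    print(f"ready at {ready:4.1f} ms: fixed vsync waits "
          f"{wait_fixed_vsync(ready):4.1f} ms, adaptive waits "
          f"{wait_adaptive_sync(ready):.1f} ms")
```

Worst case the finished frame sits there for almost a full refresh period, and on average about half of one, which is where the "one frame or less" saving comes from.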

 

Variable framerates, gotta love it! No tearing. No stuttering.

  • Like 2
Link to comment
Share on other sites

Yeah, all that. Except to add: Thunderbolt 3 > USB C. They use the same connector but TB3 is 4X faster and is the video & data standard to beat.

 

Other than that, I was going to say something similar -- if retroland was so profitable, someone would be tapping that market. But it isn't and our nerdy circles are much smaller than we might think.

  • Like 1
Link to comment
Share on other sites

What about replacement bulbs for old projectors? Still prohibitively expensive? Or just buy a new old projector when needed, like a disposable inkjet printer?

I haven't had to replace a lamp yet, but new lamps for these older projectors are around $30 on eBay shipped from China. Quality may be worse than OEM though.

 

It may be better to use these as "disposables," although they would get snapped up pretty quick if donated to a thrift even with a burned-out bulb.

Edited by boxpressed
  • Like 1
Link to comment
Share on other sites

It's amazing, as kids (at least my gang) we all oohed and ahhed over projection televisions and the upcoming flatscreens. We wanted absolutely nothing to do with the clumsy hot boxes of the day. We wanted to be in the future and there was no arguing.

 

I did, too, until I saw one in person. The only rear-projection TVs I saw as a kid had very dark images compared to CRTs, and that's when the dream died. The 19" CRT I got for my 14th birthday was all of a sudden perfect. :lol:

Link to comment
Share on other sites

Yeah, all that. Except to add: Thunderbolt 3 > USB C. They use the same connector but TB3 is 4X faster and is the video & data standard to beat.

 

One small correction to all this: it doesn't matter whether you have a Thunderbolt 3 port or a USB Type-C port; they both use DisplayPort signaling. The only time TB3's bandwidth comes into play is if you have a PC peripheral that can use it (such as an external GPU enclosure).

Link to comment
Share on other sites
