boxpressed

I Made the Switch from CRT to LCD


Arcade games as early as the late '70s had higher-than-standard 480i resolution monitors, and so did the computers that programmers used.

 

Nearly all of the old raster arcade games used standard resolution (15 kHz) CRT monitors. These use the same picture tubes and scanning frequencies as standard resolution CRT TVs; the only difference being that arcade monitors accepted an RGB signal, while most standard resolution TVs did not (not in the U.S. anyway). Most arcade games were roughly 240p @ 60 Hz.

 

There were a few old raster arcade games that used medium resolution (25 kHz) monitors (none from the '70s that I know of), but these were still low resolution by today's standards (e.g. 384p @ 60 Hz), and more importantly, they still had a relatively coarse dot pitch and shadow mask, so the graphics didn't look anything like something drawn in MS Paint and viewed on a modern PC monitor.
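The scan rates quoted above follow directly from line count and refresh rate; a quick sketch (the total-line figures below, which include blanking lines, are illustrative assumptions, not from this thread):

```python
# Horizontal scan frequency = total scanlines per frame (visible + blanking)
# multiplied by the frame/field rate. Line counts below are illustrative.
def h_freq_khz(total_lines: float, refresh_hz: float) -> float:
    """Horizontal scan rate in kHz for a raster signal."""
    return total_lines * refresh_hz / 1000.0

print(round(h_freq_khz(262.5, 59.94), 2))  # NTSC field: ~15.73 kHz
print(round(h_freq_khz(262.0, 60.0), 2))   # typical 240p arcade signal: ~15.72 kHz
print(round(h_freq_khz(416.0, 60.0), 2))   # medium-res ~384p signal: ~24.96 kHz
```

This is why "15 kHz" and "25 kHz" serve as shorthand for standard and medium resolution: the horizontal rate is pinned down by the line count and refresh rate together.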

 

so to try and claim VCS game programmers were limited to 320 × 240 monitors is a straight-up lie.

 

Once again you have put your reading difficulties on public display, as well as your tendency to call people liars due to your own ignorance. I said:

 

"The programmers couldn't have possibly intended users to see it the way we can now see it on high-resolution monitors, because users didn't have high-resolution monitors, nor did the hardware that ran the video game software output a signal that would natively fill a high-resolution screen. Programmers knew exactly what type of displays users would be seeing their game on."

 

Does the bolding help?

 

The type of full color raster monitors with relatively fine dot pitch and relatively fine shadow masks, that makes old graphics look like MS Paint, didn't come along until VGA in the late '80s. Even in the late '80s (and throughout the '90s, and even somewhat into the '00s), arcade games and console games were still by and large intended to be displayed on standard resolution (15 kHz) CRTs.

Edited by MaximRecoil

I agree that fine details and subtleties of the conversation are being missed due to poor reading and comprehension skills, thus leading to arguments about semantics. I see this in threads all over. It is a sign of a lack of critical thinking and of dumbing down. Bad nutrition, chem-trails, CRT radiation... all that. But yet the discussion continues. I stopped correcting grammar and spelling years ago, except when necessary, like in programming. I don't even bother correcting myself either. Fuck it..

 

The point is, if we argue the finer aspects of a conversation too strongly, we lose sight of the big picture and piss off the participants. And the fun slowly drains away.

 

Now...

 

CRT's = crap because they cannot reproduce exactly what they are told. They are lazy and smear the place up too.

LCD's = kick ass because, while not being exact either, come a hella-lot closer to doing what they're told, like a good woman. And they are consistent, time and time again. Doing exactly what you want, without adjustment or continual coaxing.

 

It is common knowledge that programmers plotted pixels on graph paper when designing characters prior to coding. It is to be understood that sometimes these shapes looked different when displayed on crap CRT's with RF interfaces and tuners. Generally we like to think a programmer would rely on the inherent NTSC artifacting and smearing of all of the display hardware; this includes the modulator, cable, switchbox, tuner, and deflection coils. Sometimes they did, sometimes they didn't. If they wanted smooth color gradations, like in the Activision sunsets, then probably yes; NTSC hardware was a plus. But if they were going for fine detail, then obviously not.
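The graph-paper workflow maps directly to bytes. A hypothetical sketch (the sprite shape here is made up for illustration, but the one-byte-per-scanline packing matches how VCS player graphics are stored):

```python
# A hypothetical 8x8 sprite "plotted on graph paper" as strings, then
# packed into the one-byte-per-scanline format used for VCS player
# graphics ('X' = lit pixel, MSB = leftmost column).
sprite = [
    "..XXXX..",
    ".X....X.",
    "X......X",
    "X.X..X.X",
    "X......X",
    "X.X..X.X",
    ".X....X.",
    "..XXXX..",
]

def rows_to_bytes(rows):
    """Convert each 8-cell row into one byte."""
    return [int(row.replace(".", "0").replace("X", "1"), 2) for row in rows]

for b in rows_to_bytes(sprite):
    print(f"${b:02X}")  # first row "..XXXX.." packs to $3C
```

The programmer would copy the resulting hex values straight into the game's data tables; whether the on-screen result matched the graph paper depended on the display, as described above.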

 

Now, all that aside. What sort of development systems and hardware were used for things like Intellivision, O^2, and VCS? And what about the displays for those systems? What were they like?


I'm also inclined to wonder if classic gamers are so used to the brash and brazen CRT that they can't see the light from an LCD display? Not that that's good or bad, it is what it is!


CRT's = crap because they cannot reproduce exactly what they are told.

 

Absurd. In the case of simple graphics, not producing "exactly what they are told" is a good thing, because "what they are told" looks like MS Paint, and what they produce looks more like a real painting on a canvas. Real painting on a canvas trumps MS Paint. So we have a case of "It's a good thing CRTs improve the look of these crappy Lego block graphics that we are limited to," rather than, "I wish this round-appearing sprite on this CRT looked more like it was made with Lego blocks."

 

They are lazy and smear the place up too.

 

"Lazy" has no meaning in this context, and they don't "smear" at all. Go ahead and point out the alleged "smearing" in this picture:

 

[image: crtlcdikariwarriorscr.png]

 

LCD's = kick ass because, while not being exact either, come a hella-lot closer to doing what they're told, like a good woman. And they are consistent, time and time again. Doing exactly what you want, without adjustment or continual coaxing.

 

Which is a bad thing for primitive graphics (see above). Also, the idea that you have to constantly adjust CRTs is pure nonsense. I've never made a single adjustment on the ordinary 32" RCA CRT TV that I bought new in '05, and it still looks exactly the same to me as it always has. The same goes for the 3 Happ Vision Pros in my arcade machines.

 

Also, high resolution CRTs can "do what they're told" just as well as LCDs (and look better doing it), so this isn't even an issue that is inherent to CRT or LCD. Unfortunately, that means that high-resolution CRTs also make old video games look crappy (just not quite as crappy as LCDs).

 

It is common knowledge that programmers plotted pixels on graph paper when designing characters prior to coding.

 

Which is irrelevant, because it has nothing to do with how the graphics were intended to be seen by the user. All of the classic raster consoles and nearly all of the classic raster arcade games were intended to be displayed on 15 kHz CRTs.

 

I'm also inclined to wonder if classic gamers are so used to the brash and brazen CRT that they can't see the light from an LCD display? Not that that's good or bad, it is what it is!

 

Absurd.

Edited by MaximRecoil

Bullshit! (fist hitting the table so fine china bounces in the air) Not reproducing exactly what they are told is a bad thing. If I want a pixel at 2720,1918 with a color value of 218,222,173, I better damned well see that. I do not want some ink splotch and adjacent pixels lighting up.

 

The smearing is happening with the pipe-wrench thing the blue guy is holding. The square white pixels are stretched from top to bottom into an oval shape, and stray radiation is lighting up adjacent pixels, but dimly. Furthermore, the blue ink stuff is bleeding out of the shadow in the CRT rendition. But in the pixel-plot "bit-map" version the blue stuff is in perfect line with the black shadow. I'm referring to the bottommost line in the sprite.

 

CRT's are indeed lazy; like a fat momma at the buffet, they'll focus on anything. They cannot precisely control any one pixel without affecting an adjacent pixel. It is physically impossible. Furthermore, you've got the angle of the beam to deal with in a CRT. The scatter angle changes as the beam hits phosphor in the corner as opposed to in the center.

 

Many 8 bit games seem to need the CRT display in order to look halfway decent, but with the right settings and effects LCD is much sharper and yet maintains the old school look. It takes some setup and adjustment, but so does an arcade monitor.

 

A game designer *will* most definitely have considered how the final sprite looks on the intended display device. And it shows as a lack of subtle color changes in the sprite data (not that early hardware could do it well in the first place); they relied on the CRT to do the anti-aliasing and color bleeding.

 

Because, if that isn't the case, then why does everybody get so upset and complain that emulators and LCD's reveal too much pixelation?

 

Yeah, I, too, like the canvas effect. It does indeed make for a nice, easy-on-the-eyes image for games with "blocky graphics"..


Bullshit! (fist hitting the table so fine china bounces in the air) Not reproducing exactly what they are told is a bad thing.

 

No, it is a good thing when it is predictable and improves the appearance of the simplistic graphics you are limited to.

 

The smearing is happening with the pipe-wrench thing the blue guy is holding.

 

There is absolutely no smearing there nor anywhere else in the image. There is a clean transition between the color of the gun and the colors that surround the gun. Do you even know what "smearing" means?

 

The square white pixels are stretched from top to bottom, into an oval shape,

 

They are not "stretched". A single pixel won't take on a perfect square shape on a standard resolution CRT, because of the relatively large dot pitch of the phosphor dots and the relatively large rounded rectangular openings in the shadow mask. This is a large part of the reason why 15 kHz CRTs look better for displaying simplistic graphics than high resolution monitors.
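A toy way to picture this (purely illustrative; not a physical model of a shadow mask): scale one logical pixel up to several mask openings and knock out the corners, and the "square" pixel reads as a rounded blob rather than a crisp block:

```python
# Toy illustration only: one logical pixel rendered through a coarse,
# rounded "mask". SCALE is an arbitrary assumption, not a real dot pitch.
SCALE = 6

def masked_pixel(on: bool):
    """Return a SCALE x SCALE grid; corners are knocked out to suggest
    the rounded openings of a shadow mask."""
    grid = []
    for y in range(SCALE):
        row = []
        for x in range(SCALE):
            corner = x in (0, SCALE - 1) and y in (0, SCALE - 1)
            row.append(1 if on and not corner else 0)
        grid.append(row)
    return grid

for row in masked_pixel(True):
    print("".join("#" if v else "." for v in row))
```

The point of the sketch is only that the rounding is a property of the mask geometry, not of the signal, which is the distinction being argued here.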

 

and stray radiation is lighting up adjacent pixels, but dimly.

 

The transition between colors is well-defined.

 

CRT's are indeed lazy; like a fat momma at the buffet, they'll focus on anything. They cannot precisely control any one pixel without affecting an adjacent pixel. It is physically impossible. Furthermore, you've got the angle of the beam to deal with in a CRT. The scatter angle changes as the beam hits phosphor in the corner as opposed to in the center.

 

That's absurd. I'm reading fine text at 1600 x 1200 on a CRT right now, with sharpness and clarity, so those "problems on paper" are irrelevant to the real world. Standard resolution CRTs are not necessarily any less precise with beam control than the high resolution CRT I'm using right now, they just paint with a bigger brush so to speak.

 

Many 8 bit games seem to need the CRT display in order to look halfway decent, but with the right settings and effects LCD is much sharper and yet maintains the old school look. It takes some setup and adjustment, but so does an arcade monitor.

 

LCDs are not "much sharper", nor do they even look remotely "old school" regardless of the settings (those settings/effects fool no one). New CRT arcade monitors look beautiful out of the box (I've bought 3 of them new in the past few years), and require no picture quality adjustments at all. The only adjustment that may be needed is a one-time vertical and horizontal size adjustment so the raster fills the screen properly (which only takes a few seconds). This is because different arcade game boards often output different size rasters, so there is no single raster size adjustment on a monitor that will be correct for all possible arcade game boards.

 

A game designer *will* most definitely have considered how the final sprite looks on the intended display device. And it shows as a lack of subtle color changes in the sprite data (not that early hardware could do it well in the first place); they relied on the CRT to do the anti-aliasing and color bleeding.

 

Because, if that isn't the case, then why does everybody get so upset and complain that emulators and LCD's reveal too much pixelation?

 

Yes, I have already said as much. I think it is odd that you try to mimic the look of a CRT on your LCD with settings and effects (which fool no one), rather than simply using the real thing in the first place.

Edited by MaximRecoil

I generally like the glow and persistence of the CRT. Especially with vector graphics or high contrast stuff. I dislike the geometrical distortion and constant adjustments and fiddling. And in light of the constant fiddling (no pun intended), I decided it extends to real hardware too. Too much maintenance and fussing with connectors and controllers and carts and just the bulk in general.

 

So I've come to settle on a happy medium and it leans toward the LCD & emulation as my rig of choice. These pics were taken before I got into really adjusting monitor masks and shaders and NTSC effects. It is also on my shit-old 204b. I'll need to make a new set to show the massive improvement on a 2012 model year display. But this looks

 

 

https://picasaweb.google.com/114688480094930960522/August202011

 

The reasons you choose LCD and emulation make sense to me. If you had a huge collection of original hardware but chose to use LCD, then it wouldn't make sense to me, because there is more fiddling and so forth with the original hardware than with a CRT.


The CRT effects in emulators are mainly for nostalgia purposes. They look nice if you want your brand-new LCD TV to look like old and dated technology, but not because they look "better".

 

When you take the raw data of a VCS game and run it through an emulator, you don't get rounded, softer images. You get EXACTLY what the programmers came up with. That is what they intended users to see, and any rounding or smoothing that happened as a consequence of running it through a CRT was completely unintentional.

 

It isn't just nostalgia. How retro video games look on CRTs is how they were intended to look, and in my opinion they look better. When the programmers of emulators try to mimic CRT effects, they are trying to display the image that the original programmers intended. The raw data of a VCS game in an emulator isn't what the programmers intended you to see. They were programming video games for customers who owned CRTs. To say that they intended them to look like they do on LCDs is like saying that Alfred Hitchcock's intent in using Hershey's chocolate syrup in the shower scene in Psycho wasn't to make it look like blood in black and white, but to make it look like women bleed chocolate for a future color version.


Programmers of the VCS, for example, would consider the smearing and blending and bleeding of CRT's and their associated circuitry. They would build their onscreen sprites and objects with it in mind. Programmers built their objects to be high contrast when going for detail, and low contrast when wanting smooth transitions. Well, that's typical of any graphics, but more so back then. The typical home NTSC/RF TV set being used with a VCS was pretty bad. I even remember 40-column text from my Apple II being barely passable.

 

It wasn't so much the actual tube causing problems, but the stability of the analog drive circuits (and tuner). And then there's the bandwidth availability. Shit consumer electronics didn't have amplifiers and drivers with as sharp a rise time in their pulses as a pro model would. Temperature, humidity, minor power supply variations, aging, outside interference, phosphor mask, aperture grille expansion and deformation, degaussing.. it all affects the CRT technology. And unfortunately a CRT is bound by these things and very dependent on the self-compensating closed-loop behavior of the analog circuitry. If the engineers did their job right, all these variations are kept to a minimum.

 

Also understand that CRT's are considered a bare-metal mechanical display device today. This is about as close as you can get to seeing the actual electricity spill out of the gaming console! You've got the TIA in the VCS yanking and pulling the beam in all sorts of directions and turning on and off phosphors like nobody's business! Zoom!

 

A CRT is an analog representation of an x,y memory matrix. And in translating that matrix there's always going to be subtle mistakes. Sometimes they're aesthetically pleasing, sometimes not.

 

I remember taking my VCS over to Uncle Dick's and wiring it up. I always had to ask permission to adjust the H.size and V.size and position, and sometimes the color. A pain in the ass because this TV had some of it in the back. And if Uncle Dick was being a dick he'd say no. Sometimes I'd get pissed and turn up the Picture adjustment before leaving. This had something to do with pulling the electrons toward the screen faster, and I did this one time with my face in the screen and some sort of static charge blew me across the room.

 

LCD's are a bundle of buffers and scalers and converters and remapping processors. It is basically a large clear integrated circuit, with a 1600x1200 display having 5,760,000 elements. But these big elements are big enough to see! I sometimes think of an LCD as a HUGE monster-sized clear chip that is a blown-up representation of a tiny bit of your system's RAM, or wherever they store graphics bits. All this is based on 0's and 1's. And this can be made rock solid; there are no analog voltages until the pixel element or twisty thing. And even now, this final analog step has been eliminated by using PWM to excite the crystals in the very newest of displays.
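The element count quoted above checks out as simple arithmetic, since each addressable pixel is three subpixels:

```python
# Each addressable LCD pixel is three subpixels (R, G, B), so a
# 1600x1200 panel has 1600 * 1200 * 3 drivable elements.
width, height, subpixels_per_pixel = 1600, 1200, 3
total_elements = width * height * subpixels_per_pixel
print(total_elements)  # 5760000
```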

 

An LCD will put a dot where you want it 100% of the time, and it will always make another dot 75mm away from the first dot, if that's what you ask it to do. That's another thing I like about LCD's, once they are adjusted you won't need to fiddle with them anymore. Maybe 5 years later you'll need to tweak up the backlight a point or two to compensate for aging, but you do the same thing to any electronic display.

Edited by Keatah


As much as I'm enjoying everybody embarrass themselves here, it's getting pretty painful by page 7...

 

ha ha. You're right. We're just looking for something to argue about, by now.


I remember taking my VCS over to Uncle Dick's and wiring it up. I always had to ask permission to adjust the H.size and V.size and position, and sometimes the color. A pain in the ass because this TV had some of it in the back. And if Uncle Dick was being a dick he'd say no. Sometimes I'd get pissed and turn up the Picture adjustment before leaving. This had something to do with pulling the electrons toward the screen faster, and I did this one time with my face in the screen and some sort of static charge blew me across the room.

 

I've never had to adjust "H.size and V.size and position" on a TV for anything; not for video game consoles, not for TV transmissions, not for VCRs / DVD players; nothing. In fact, I've never even seen a standard resolution CRT TV that even has those adjustments on the outside (those pots are on the chassis; you have to take the TV apart to get to them). Everything that was designed to display on a standard resolution TV in the U.S. market generated an NTSC signal (or NTSC-like signal), and as such, they overscanned beyond the viewable area of the tube, which meant the picture always filled the viewable screen 100%.

 

Your uncle either had a very weird TV or you're misremembering things. Also, your claim of being "blown across the room" from a static charge off the screen is dubious at best.

Edited by MaximRecoil


This was one of those Chromacolor Zenith sets that looked like a crescent shape. They had some adjustments up front, and others were accessible by going 'round back. A lot of TV sets (and monitors) have little holes where you could stick a screwdriver in and tweak stuff. These were clearly labeled too.

 

I had to adjust the vsize most often, so uncle's stock quote ticker would be visible, and yet still have the score counters in my Atari games be fully on-screen and not scrolling off the top.

 

Dubious or not, I clearly remember bashing backwards into the table and knocking stuff down. But it sure was fun turning that up, though. The set started hissing and I thought the tube was filling up with gas and ready to explode itself!

Edited by Keatah


One time the spark gap tripped and a snapping noise came out with a flash of blackness. And that was scary enough. We even played a game where one of us would sit in back. And the other had to run past the screen and slap a sheet of paper on it (static made it stick). Then we'd crank it more and do it again, all the while hoping we wouldn't get blasted. The game ended when no one would attempt another "posting" due to the whited out screen and screaming flyback coil. The game also ended when touching the screen would upset the high-stress superfine balance and trip the sparkgap. We all freaked!

Edited by Keatah


Today I understand that it really is a dangerous control to mess with, because misadjustment will accelerate the electrical particles right out of the screen, for real. I'm sure there's a spec for it someplace and a specific adjustment procedure, but I never bothered to research it much.


One time the spark gap tripped and a snapping noise came out with a flash of blackness. And that was scary enough. We even played a game where one of us would sit in back. And the other had to run past the screen and slap a sheet of paper on it (static made it stick). Then we'd crank it more and do it again, all the while hoping we wouldn't get blasted. The game ended when no one would attempt another "posting" due to the whited out screen and screaming flyback coil. The game also ended when touching the screen would upset the high-stress superfine balance and trip the sparkgap. We all freaked!

 

This thread has now officially jumped the shark folks....... :P


This was one of those Chromacolor Zenith sets that looked like a crescent shape. They had some adjustments up front, and others were accessible by going 'round back. A lot of TV sets (and monitors) have little holes where you could stick a screwdriver in and tweak stuff. These were clearly labeled too.

 

CRT monitors always have readily accessible V/H size and V/H position adjustment controls, because there is no standard signal that the manufacturer can count on the monitor always receiving (unless it's an old composite monitor, in which case it expects an NTSC signal like a TV). A standard resolution arcade monitor, for example, may get a 395 x 254p @ 53 Hz signal from a Midway Mortal Kombat board, or a 512 x 448i @ 60 Hz signal from a Nintendo Popeye board, or a 292 x 240p @ 60 Hz signal from a Williams Defender board, and so on. The same goes for PCs, to an even greater extent, which is why PC CRT monitors are usually multisync; the user could select any resolution between, say, 640 x 480p @ 60 Hz and 1920 x 1440p @ 75 Hz.
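The check a monitor effectively performs on those signals can be sketched like this (the total-line counts and the sync range are assumptions for illustration; the horizontal rate depends on total lines including blanking, not just the visible lines quoted above):

```python
# Sketch of a sync-range check. Figures are illustrative assumptions:
# a standard-res monitor syncs somewhere around 15-16.5 kHz horizontally.
def h_khz(total_lines: float, refresh_hz: float) -> float:
    return total_lines * refresh_hz / 1000.0

def syncs(total_lines, refresh_hz, lo_khz=15.0, hi_khz=16.5):
    """True if the signal's horizontal rate falls in the monitor's range."""
    return lo_khz <= h_khz(total_lines, refresh_hz) <= hi_khz

print(syncs(289, 53.2))   # ~254p @ 53 Hz class signal
print(syncs(262, 60.0))   # ~240p @ 60 Hz class signal
print(syncs(525, 60.0))   # 480p @ 60 Hz is ~31.5 kHz: out of range
```

A multisync PC monitor is the same idea with a much wider `lo_khz`..`hi_khz` window, which is why it can lock onto anything from 480p to 1440p.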

 

Standard resolution TVs however, expect only one type of signal: NTSC; and as such, the V/H size and V/H position can be set at the factory and never have to be touched by the end user. I have 6 standard resolution CRT TVs here and none of them have V/H size and V/H position adjustment controls that are accessible from the outside, nor have I ever needed those adjustments on a TV.

Edited by MaximRecoil


Yes, yes, all that. It doesn't change the fact that I've got a few monitors that don't have V/H size and position on the front. And these are really computer monitors. Some of these are Apple-specific, like for the II series. But 2 or 3 of them are generic ones from Magnavox or CTX (Chuntex), or from NEC, believe it or not.

 

If you've never tweaked the vsize and hsize, then your picture (TV) must never have been fully optimized. As far as arcade monitors go, I've never really worked on them at a component level a great deal, so I don't know all the specifics, but they're basically the same as a TV set without the tuner. And that's good enough for me.

 

I was looking on google for some images of rear adjustment and didn't find any straight away. So I'll post some of my sets that have said controls.


good grief, with the posts I've seen, you guys might as well be arguing about religion or politics. geez.

 

you like what you like. don't get butthurt that people don't agree with you. there is no right answer.


Yes, yes, all that. It doesn't change the fact that I've got a few monitors that don't have V/H size and position on the front. And these are really computer monitors. Some of these are Apple-specific, like for the II series. But 2 or 3 of them are generic ones from Magnavox or CTX (Chuntex), or from NEC, believe it or not.

 

I never said they all had them on the front, I said they were readily accessible, meaning you can access them without taking anything apart. However, some of the older PC monitors (CTX comes to mind) didn't have a pot at all for horizontal size; the only way to adjust it was with a plastic Allen wrench to turn the horizontal width coil on the chassis (you have to remove the monitor's housing to do this). This also applies to most older TVs and most older arcade monitors.

 

If you've never tweaked the vsize and hsize, then your picture (TV) must never have been fully optimized.

 

Standard resolution CRT TVs were fully optimized for an NTSC signal at the factory. Since they are only ever used with NTSC or NTSC-like signals at home, there is no need for the end user to ever have to adjust raster size and position, unless something goes terribly wrong with the TV. That's why most of them don't even have readily accessible raster size/position adjustment controls.

 

As far as arcade monitors go, I've never really worked on them at a component level a great deal, so I don't know all the specifics, but they're basically the same as a TV set without the tuner. And that's good enough for me.

 

The two differences between a 15 kHz CRT TV and a 15 kHz CRT arcade monitor are:

 

1. The arcade monitor has no TV tuner

2. The arcade monitor has RGB input

 

All of the adjustment controls on an arcade monitor are readily accessible because they have an open frame; i.e., they are not encased in a plastic or wood housing like TVs are. Also, they have better picture quality than TVs because of the RGB input (RGB is the purest analog video signal possible), and because they are usually displaying progressive scan video (i.e., not interlaced). There are a few exceptions, like Nintendo's Popeye, and some of the Bally Midway games like Tapper, which are interlaced. But, any CRT monitor or TV can display interlaced or progressive video, as long as it falls within its sync range (which is ~15 kHz for standard resolution arcade monitors and TVs), so those oddball interlaced arcade games used ordinary arcade monitors.

Edited by MaximRecoil


For the serious purpose of achieving excellent visual authenticity, connecting any device that was designed for a 15 kHz CRT screen to an LCD screen does not make any real sense to me. Instead of using a real Atari 800 and then distorting the picture through an LCD screen, you could easily just emulate the computer using your laptop and have exactly the same incorrect picture. In other words, if you invest in real hardware, you need to invest not only in the input but in the output as well.


Classic console hardware looks like crap on most LCD panels.

The way to jam VCS games and other classics onto an LCD is by keeping the Game Program entirely in the digital domain. That means some sort of emulation via software or FPGA simulation. While you may not get 100% CRT style, you will gain a number of benefits and have less frustration overall.

 

Benefits like consistency of color and geometry, one-time adjust, no burn-in, energy savings, and more, are part of the package. They may take some time to become evident. They creep up on you. However, once you experience those you'll find it hard to regress to older analog technology.

 

Sure, the final display result will be different. And it will be necessary to make use of NTSC/composite artifacting effects to avoid razor-sharp edges.
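The "avoid razor-sharp edges" step is essentially low-pass filtering. A minimal sketch (a real NTSC/composite filter models chroma and luma bandwidth separately; this is just a 3-tap horizontal blur, standing in for the idea):

```python
# Minimal stand-in for a composite-style softening effect: a 3-tap
# horizontal blur over one scanline of brightness values (0-255).
def soften(row, kernel=(0.25, 0.5, 0.25)):
    padded = [row[0]] + list(row) + [row[-1]]  # clamp at the edges
    return [kernel[0] * padded[i]
            + kernel[1] * padded[i + 1]
            + kernel[2] * padded[i + 2]
            for i in range(len(row))]

# A hard edge picks up a one-pixel ramp on each side:
print(soften([0, 0, 255, 255, 0, 0]))
```

Emulator shader effects do this (and much more) per scanline on the GPU; the principle is the same trade of edge sharpness for smoother transitions.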


Depends on the LCD. In my experience, the cheaper and crappier the LCD, the better it is for retro gaming.

 

Your 4K Samsung is going to suck; your 16-inch crap brand with a max display of 1024x768 that claims "720p", even though it physically doesn't have enough pixels to do 720, is going to rock for retro displays.

 

My 40-inch 1080p LG can't do 80-column Apple II text worth a snot; my 13-inch Craig is better than my monitor, and it does color.

 

go figure


In my little experience, the only thing you get better with a cheap LCD is that most of the time they are low resolution, so there are fewer pixels to process, so you reduce input lag.

But I would tend to disagree that cheaper is better.

My consoles (in RGB mostly, but even composite) look better on my Sony Bravia 720p TV than on the cheaper Samsung display I had.

Colors are sharper, pixels are better defined and less blurred.

