
Why CRY?


kool kitty89


But as for RGB in general, aside from it being nice to have for computers/consoles and such, I really don't see why it was even pushed like it was in the 80s (or even early 90s) since there was very little it could be used for unless you wanted to connect certain computers and (later) game consoles to it. (sort of a niche application)

VHS, SVHS, and Laserdisc all used composite or S-video (and SVHS and LD were very niche while VHS had no use for S-video -LD often didn't either).

So for home use, RGB was actually rather pointless to have for a TV in a home entertainment set-up. (like S-video was in the US until DVD got big -at which point YPbPr became common as well -though I'd argue they should have gone RGB instead since it was a more common analog standard)

 

It's because of SECAM. Unlike PAL or NTSC, SECAM mixing isn't possible (without major color artifacts) so VCRs, satellite receivers and standalone teletext decoders used RGB to overlay graphics on top of the existing SECAM image.

 

As for LD, there were never any SECAM LDs; all French LDs were in PAL format. It's possible the player would have a built-in PAL-to-RGB decoder, but most likely a person owning high-end gear such as an LD player would also own a multi-standard TV.

 

Finally, games. France was pretty much the only market for SECAM hardware; the other countries using SECAM were in the communist bloc. So rather than using an expensive SECAM encoder, video games and computers were sold with RGB cables instead, sometimes (the NES and 7800) going so far as to add an internal PAL-to-RGB converter in the system.


Being that I don't program, I've read so much Jag tech stuff that I understood mostly everything despite not knowing all of the details. All technical jargon aside... Whenever I see any kind of image that's only 16x16 in size, it always reminds me of how SNK saved their background images and sprites... Of course their image sizes could go up to 512 pixels. A 16x16 image is less than 1 KB (16-bit) in memory size. I've always pictured texture map images streaming from a cartridge at 16x16 or smaller on the Jag; from what I understand, streaming texture maps from a cartridge would cause slower frame rates considering the size of the image. Wouldn't it be feasible for texture mapping to stream in small chunks that are 16x16 in size, similar to how SNK would break their background images up into chunks and save them as sprites? Not that I'm suggesting they be saved as sprites, but rather a series of small 1 KB (or less) images streaming from a cartridge in order to make up a whole texture map. Or would it be too slow for the GPU/OP to alter and display while still keeping a high frame rate of 60 or 30 FPS?

Except for a few problems with that comparison:

Kskunk is talking about using a single 16x16 (or 32x32 or 16x32) texture stretched over a large polygon (or possibly tiled/repeated many times over that area). Using many different texture tile blocks would require updates to the buffer holding the texture (be it CLUT RAM or the GPU scratchpad). The Neo Geo doesn't do that.

 

Also, Neo Geo games usually don't build things with 16x16 sprites at all, since that would quickly exhaust the 384-sprite limit (it would take 266 16x16 sprites to fill a 304x224 area). So to make the most of the Neo Geo, large BG planes use very large sprites, but even then you'd be limited by the 16-pixel width and the 96 sprite/line limit. (so enough to overlap 5 solid BG layers, but no bandwidth left for sprites) Due to using larger sprites to build the BG vs 8x8 tiles, there's also less flexibility with the use of palettes: each sprite selects 1 of 256 15-color palettes indexed from 15-bit RGB, rather than each 8x8 cell selecting a palette as on arcade systems or consoles with 8x8 character map layers. But the Neo Geo also has a huge selection of subpalettes to work with . . . though some contemporary arcade boards come close to that too -I know several of Sega's had 128 subpalettes to the Neo's 256.
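The sprite math above is easy to sanity-check. A quick worked example in Python (the figures are the ones quoted in this post, not official SNK documentation):

# Sanity check of the Neo Geo sprite-count math above. All figures
# (384 total sprites, 96 per scanline, 16-pixel-wide hardware sprites,
# 304x224 visible area) are the ones quoted in the post.

SCREEN_W, SCREEN_H = 304, 224
SPRITE_W = 16

cols = SCREEN_W // SPRITE_W              # 19 sprites across
rows = SCREEN_H // SPRITE_W              # 14 sprites down
print(cols * rows, "16x16 sprites to tile the screen")    # 266 of 384

# Each sprite is 16 pixels wide, so one full-width layer costs 19
# sprites per line; the 96/line limit therefore allows:
print(96 // cols, "solid full-width layers per scanline")  # 5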

 

I know what KSkunk was referring to concerning texture mapping... The thing about the Jaguar OP is that it only renders so many colors per scanline. If you look at systems like the 2600 (missile and playfield) or the 7800, which basically read a list per scanline, that opens a door to scanline manipulation; thus I think you could possibly use that to your advantage, using a "chasing the beam" tactic to handle a limited amount of sprites/graphics/display list entries. Consider this example: Cinepak is a video compression scheme that handles frames by streaming images that are compressed as 4x4 blocks (I think up to 8x8) using vectors that have 255 preset entries. According to a note I ran into a while back in the Jaguar development kit docs, Cinepak can run at a considerable frame rate while being manipulated with zooming effects, very similar to how the Neo Geo handles its zooming effect (when I find those notes I'll post them here); can't remember if any skewing effects were used. On that note, my point is: if the Jag can so easily manipulate scanline-based engines like Doom and Phase Zero with texture mapping, I believe that the Jag could possibly have enough leverage to handle a smarter texture mapping system... The Jag is a 2D monster, thus that's where its 3D capabilities will thrive, through the use of a smarter 2.5D method... But that's all theoretical, based on factual findings from other examples; I feel it's a noteworthy mention for this topic concerning the CRY color scheme.

Edited by philipj

The Jag is a 2D monster, thus that's where its 3D capabilities will thrive, through the use of a smarter 2.5D method...

I agree with this. The Jaguar's design is way more focused on scaled sprites than polygons, and it's just plain better at it. The few Jaguar games that do generate 2.5D via scaled sprites run at high res and 60FPS: Think Super Burnout or Val d'Isere Skiing.

 

I was disappointed that Blue Lightning has such awful sprites and gameplay, but there are a number of later levels that demonstrate dense textured 2.5D cityscapes, forests, and canyons rendered entirely using scaled sprites. Scaled sprites were also a mainstay in arcade racers of the early 90s.

 

I guess Atari saw what the Saturn and Playstation could do, and tried to make the Jaguar do the same thing. I wish they had tried harder to make the Jaguar do what it was actually good at.

 

- KS


The Jag is a 2D monster, thus that's where its 3D capabilities will thrive, through the use of a smarter 2.5D method...

I agree with this. The Jaguar's design is way more focused on scaled sprites than polygons, and it's just plain better at it. The few Jaguar games that do generate 2.5D via scaled sprites run at high res and 60FPS: Think Super Burnout or Val d'Isere Skiing.

 

I was disappointed that Blue Lightning has such awful sprites and gameplay, but there are a number of later levels that demonstrate dense textured 2.5D cityscapes, forests, and canyons rendered entirely using scaled sprites. Scaled sprites were also a mainstay in arcade racers of the early 90s.

 

I guess Atari saw what the Saturn and Playstation could do, and tried to make the Jaguar do the same thing. I wish they had tried harder to make the Jaguar do what it was actually good at.

It wasn't just after the PSX either . . . 3D games on the Jag were pushing polygons from day 1 too. (various games in 1993 and 1994 -and many in 1995 that started development in '94, before the PSX had a big showing) Granted, the arcades were pushing polygonal 3D big time (and textured polys heavily from 1993 onward), but the home market showed a different picture. (especially PC games -Atari probably should have aimed for more licenses of PC games . . . and tried to attract PC publishers more in general)

 

Then you had pseudo-3D games with poor trade-offs. (like AvP running at full res -imagine if Doom hadn't opted for low detail . . . Doom at 2x the res and 1/2 the framerate, blech)

 

 

It's because of SECAM. Unlike PAL or NTSC, SECAM mixing isn't possible (without major color artifacts) so VCRs, satellite receivers and standalone teletext decoders used RGB to overlay graphics on top of the existing SECAM image.

I was talking about SCART for normal home consumer use . . . not professional video applications. (for NTSC, you had similar high-end RGB transmission set-ups in video production/editing -albeit some things were done in composite or s-video directly, not to mention various other professional applications using RGB monitors -computers, various display terminals, etc) There were (or are) also a variety of professional video connectors for composite/s-video not normally used for home/consumer devices. (like BNC connectors for video, or dual RCA plugs for S-video -as some computers did too, or some proprietary/semi-proprietary connectors -albeit most connector standards start out that way and become standards due to mass market popularity)

 

PVM monitors were (and sometimes still are) prominently used for such applications (video production, medical, etc).

 

So the issue is: why would an average home/consumer product need to have such features included? (with the vast majority of devices limited to RF or composite video transmission)

 

As for LD, there were never any SECAM LDs; all French LDs were in PAL format. It's possible the player would have a built-in PAL-to-RGB decoder, but most likely a person owning high-end gear such as an LD player would also own a multi-standard TV.

Why not have a PAL to SECAM video transcoder?

 

What did VHS use in SECAM?

 

SECAM video itself isn't directly tied to RGB, but modulated chroma/luma in the vein of NTSC and PAL (closer to PAL), which is transmitted through an RF signal and then decoded to the Y/C/audio signals (and then to the CRT display and speaker of the TV). Just the specific format used is different, so video would (obviously) need to be transmitted in that format. (or transcoded to it -from YUV/YPbPr, RGB, NTSC, or PAL composite or s-video)

 

Finally, games. France was pretty much the only market for SECAM hardware; the other countries using SECAM were in the communist bloc. So rather than using an expensive SECAM encoder, video games and computers were sold with RGB cables instead, sometimes (the NES and 7800) going so far as to add an internal PAL-to-RGB converter in the system.

Yes, after the fact that makes sense, obviously . . . France mandated RGB and thus companies producing devices with native RGB video could realistically support RGB only for the output.

 

The problem comes with non-RGB devices, like several home computers (Atari 8-bit, C64, VIC-20) and consoles (VCS, CV, 5200, 7800, NES, etc) that don't use RGB at all. Using an external RGB transcoder would seem pretty cost-ineffective . . . converting to SECAM composite video should be a bit cheaper (ie PAL composite or Y/C converted to SECAM composite or Y/C -probably the former since SCART didn't have fully standardized S-video support), or design a special revision of the video hardware to support SECAM video directly . . . or RGB if that was more practical. (or for the Colecovision -or any console/computer with a TMS9918-based VDP- you could use a model with YPbPr output and transcode that to RGB -relatively simple circuitry compared to the RGB to NTSC/PAL encoders in common use, or the YPbPr to composite encoders)

 

The NES did have an RGB version of the PPU, but I'd thought that was only used in the PlayChoice-10 arcade systems. (those were definitely in production by the time the NES launched in Europe though)


I know what KSkunk was referring to concerning texture mapping... The thing about the Jaguar OP is that it only renders so many colors per scanline.

That's only if you use 8bpp or lower graphics (using the CLUT to expand that to CRY or 15/16-bit RGB) . . . you could use 16bpp graphics directly to bypass the single CLUT. (at the cost of using more RAM -graphics could still be indexed/compressed further in ROM).

And that's not considering the possibilities of software look-up using the GPU to expand indexed textures using other tables. (limited and slower than OPL look-up . . . though avoiding the issue of having to keep large uncompressed textures in RAM -you could also use the GPU to do some realtime decompression or such as well, from ROM to RAM or RAM to RAM . . . and there's obviously far more resource to pull that off than on the Genesis -and far more RAM to buffer into- though the Genesis itself had a fair number of games doing realtime decompression -as well as more intensive compression for intermittent loads: between levels, before boss battles, etc)
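To make the software look-up idea concrete, here's a minimal sketch in plain Python, standing in for GPU scratchpad code; the function and names are illustrative, not any real Jaguar API:

# Minimal sketch of expanding an indexed texture through a software
# look-up table, as described above. All names are illustrative.

def expand_texture(indexed_pixels, table):
    """Map 8-bit indices to 16-bit color words via a 256-entry table."""
    return [table[i] for i in indexed_pixels]

# A toy 256-entry table with a couple of arbitrary 16-bit color words:
table = [0x0000] * 256
table[1], table[2] = 0x88FF, 0x4480
print(expand_texture([0, 1, 2, 1], table))   # [0, 35071, 17536, 35071]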

 

Then there's simple palette swapping between scanlines. (modifying CRAM in hblank) Though that requires more careful art design. (and blitter rendered BGs/objects to complement the OPL -though the blitter is fairly limited)

 

Even so, 255 colors per line alone is still pretty substantial compared to what the "16-bit" consoles were limited to, though with more trade-offs compared to other "32-bit" gen systems.

Plus, you have shading and translucency/blending effects on top of that. (so better than 256 colors on PC -where blending/shading is quite limited by the palette)

 

If you look at systems like the 2600 (missile and playfield) or the 7800, which basically read a list per scanline, that opens a door to scanline manipulation; thus I think you could possibly use that to your advantage, using a "chasing the beam" tactic to handle a limited amount of sprites/graphics/display list entries.

What? Why would you want to go back to super-intensive, pain-in-the-ass-to-program, tightly software-timed techniques like that? That was hard on the VCS, but with more and more complex systems (and far, far larger and more complex games) it becomes highly impractical.

 

Both the 7800 and A8 (among others) had already solved the problem of doing raster-synced effects without having to have synchronized CPU code: interrupts. (be it interrupts signaling CPU work or interrupts signaling the video chips to do something) In the A8 and 7800, these are referred to as display list interrupts (DLIs), and many other platforms had similar mechanisms for timing mid-screen scanline raster effects. (C64 VIC-II's raster interrupt, various interrupts on the Amiga video chipset, scanline interrupts on the PC Engine, hblank interrupts on the MD and SNES, etc)

One of the simplest and most common raster effects is swapping palette entries mid-frame from one of those interrupts.
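As a toy illustration of the technique (simulated in Python rather than on any real hardware; the handler and line number are made up):

# Toy model of a scanline-interrupt palette swap: a handler fires at a
# chosen line and rewrites a palette entry, so one palette slot yields
# two colors in a single frame. Purely illustrative.

PALETTE = {0: (40, 60, 200)}          # entry 0 starts as a sky color
SWAP_LINE = 100                       # fire the "interrupt" here

def hblank_handler(line):
    if line == SWAP_LINE:
        PALETTE[0] = (200, 120, 40)   # entry 0 becomes a ground color

for line in range(224):               # one frame of 224 visible lines
    hblank_handler(line)
    # ... render this scanline using PALETTE ...

print(PALETTE[0])                     # (200, 120, 40) after the swap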

 

Consider this example: Cinepak is a video compression scheme that handles frames by streaming images that are compressed as 4x4 blocks (I think up to 8x8) using vectors that have 255 preset entries.

If it's a direct derivative of conventional Cinepak (PC, Mac, 3DO, Saturn, etc) and not a proprietary format (like Sega's very different "Cinepak" format on the Sega CD), it uses frames divided into 4x4 macroblock cells (so a 320x240 frame would be a 64x60 grid), with intraframes (key frames) built using 2 codebooks of 256 entries each (one of 4x4 blocks, the other of 2x2 blocks, or "vectors"), and each 4x4 square of the frame defined as 1 or 4 vectors (1 or 4 bytes indexing 1 vector from the 4x4 codebook or 4 vectors from the 2x2 set). Then there's also "motion compensation" using interframe compression. (interframes being defined by a 64x60 grid with each block either left unchanged or flagged as changed, with a new vector block replacing the old one)

See:

http://multimedia.cx...ror/cinepak.txt
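A much-simplified sketch of the vector-quantization decode idea (Python, greyscale only, key frames only; see the linked spec for the real bitstream -in actual Cinepak the codebook vectors are 2x2-pixel blocks, with "V1" coding spending 1 byte per 4x4 cell and "V4" coding spending 4):

# Simplified sketch of Cinepak-style vector quantization, not a real
# decoder: each codebook entry is a 2x2 pixel block; a V1-coded 4x4
# cell uses one index (the vector is pixel-doubled), a V4-coded cell
# uses four indices, one per quadrant.

def decode_v1(index, codebook):
    v = codebook[index]                       # 2x2 vector: [[a,b],[c,d]]
    return [[v[y // 2][x // 2] for x in range(4)] for y in range(4)]

def decode_v4(indices, codebook):
    quads = [codebook[i] for i in indices]    # TL, TR, BL, BR
    out = [[0] * 4 for _ in range(4)]
    for n, (oy, ox) in enumerate([(0, 0), (0, 2), (2, 0), (2, 2)]):
        for y in range(2):
            for x in range(2):
                out[oy + y][ox + x] = quads[n][y][x]
    return out

codebook = [[[i, i + 1], [i + 2, i + 3]] for i in range(256)]  # toy entries
print(decode_v1(7, codebook))             # one vector stretched over 4x4
print(decode_v4([1, 2, 3, 4], codebook))  # four vectors, one per quadrant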

 

According to a note I ran into a while back in the Jaguar development kit docs, Cinepak can run at a considerable frame rate while being manipulated with zooming effects, very similar to how the Neo Geo handles its zooming effect (when I find those notes I'll post them here); can't remember if any skewing effects were used.

OK, that's not really unusual either . . . the OPL can normally scale ANY framebuffer (including full-screen scaling of any 3D game) in addition to other OPL objects. That's not really surprising at all.

 

Skewing/rotation should also be possible, but since the blitter would have to do that, it would be much more bandwidth-intensive and limited. (blitter texture mapping is slow) Plus it would require more buffers. (the blitter would need to take the initial frame and render it into a 2nd buffer)

 

Also, it's FAR better than what the Neo Geo can do. The Neo can only shrink sprites, not scale/zoom larger, so it's pretty limited. (very much like the limited scaling support on several of Sega's boards -like the System 16, System 18, and such -not like the scaler boards though)

The Saturn and PSX should be capable of scaling streaming video windows too (except not if the PSX's hardware video decoders are used, since those render to 24-bit RGB, which the GPU can't render in -if using cinepak or another software format -like Truvideo- it would be quite practical on the PSX). Other platforms could do scaling effects on video too, with various trade-offs. (Sega CD, PC, 32x, 3DO, all depending on the context)

 

 

On that note, my point is: if the Jag can so easily manipulate scanline-based engines like Doom and Phase Zero with texture mapping,

Huh? Those are COLUMN rendered engines, NOT scanline ones . . . which makes them generally harder to work with for conventional hardware acceleration (generally optimized for line filling/drawing -including the Jaguar and Sega CD's texture mapping . . . and the Saturn, 3DO, PSX, or pretty much every other texture mapping hardware out there -the Jag's line filling and g-shading also work on scanlines). Only the floors and ceilings of Doom use line rendering. (all the walls are columns)

 

For column renderers you have a couple options: do everything in software with the GPU (or a CPU on other systems -hence why such engines were popular on unaccelerated PCs), or use a 2-pass renderer that draws the columns sideways (as lines) and then rotates the frame 90 degrees. So it's a matter of determining which method would be faster. (on the Sega CD, it might be faster to use the ASIC to render that way than the 68k doing it alone, but the Jag had WAY more resources to draw on with the GPU -the 3DO is probably closer to the Sega CD, weaker CPU relative to hardware texture mapping . . . the Saturn is a bit in the middle with the dual CPUs and fairly fast blitter . . . the PSX had a huge amount of bandwidth with the GPU, so a 2-pass system makes more sense there -or just do the whole game in full 3D like PSX Doom did :P . . . though the latter wouldn't work for a voxel engine)
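The 2-pass trick is easy to picture in miniature. A pure-Python sketch (illustrative only, not Jaguar blitter code); for simplicity it uses a transpose, which is a 90-degree rotation plus a mirror:

# Pass 1: draw each screen COLUMN as a horizontal ROW of a sideways
# buffer, since line-oriented fill/blit hardware is fast at rows.
W, H = 8, 6                                    # tiny "screen"
sideways = [[(c * 10 + y) for y in range(H)]   # row c = final column c
            for c in range(W)]

# Pass 2: flip the whole frame so those rows become columns again.
frame = [[sideways[c][y] for c in range(W)] for y in range(H)]

for row in frame:
    print(row)                 # column c now varies down the screen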

 

Height maps are attractive due to the much less intensive calculations needed than polygons, plus the lack of perspective warping issues on textures (due to constant-Z rendering), and (for voxels) allowing very smooth/organic terrain compared to polygons of the time. They're especially attractive on platforms with a lot of general-purpose resources and little to no hardware polygon drawing support. (like unaccelerated PCs, Jaguar, 32x, etc)

It's these reasons that make pseudo-3D so well suited to the Jaguar . . . too bad it wasn't pushed more. (many polygonal games could have been done better in heightmap engines, or hybrid engines with height maps, sprites, and limited amounts of polygons -imagine Cybermorph with Comanche/Phase Zero style terrain ;) . . . or a port of Comanche -which was on PC in 1992)

There were a lot of PC games that catered to that school of thought too (due to the lack of hardware acceleration on PCs), though texture-heavy polygonal 3D did take over that market ~1996/97 as well. (with a few notable pseudo-3D games later on -Outcast is a huge late example)

 

Obviously scaled-sprite based games also fit the system well . . . railshooters, racing games, space/air combat games, etc. (imagine a Wing Commander I or II port to the Jag, among others)

Edited by kool kitty89

I could write a really long reply about SECAM, but in short, SECAM and SCART were invented because France wanted to protect their domestic television manufacturers. RGB was needed for clean graphics overlays (for example, frequency data from a satellite receiver). There were no consumer SECAM encoders (not counting the primitive color generator used in the first French 2600), so video games used RGB instead, even when systems didn't natively support RGB. VHS did use SECAM, but the signal was already SECAM-encoded on the tape and the VCR merely shifted it up to the correct frequency.

 

In case you want to know more about SECAM (or other video systems) I recommend this site


I could write a really long reply about SECAM, but in short, SECAM and SCART were invented because France wanted to protect their domestic television manufacturers.
This is somewhat true for SCART (while the goal was to create a standard for home A/V connections, making it mandatory was a way to block imports of cheaper Japanese-made gear, which did not feature SCART connectors at the time); it's not really the case for SECAM, as it was developed at the same time as PAL (NTSC predates both, but has hue shift problems that both PAL and SECAM solve, in different ways).

I could write a really long reply about SECAM, but in short, SECAM and SCART were invented because France wanted to protect their domestic television manufacturers.
This is somewhat true for SCART (while the goal was to create a standard for home A/V connections, making it mandatory was a way to block imports of cheaper Japanese-made gear, which did not feature SCART connectors at the time); it's not really the case for SECAM, as it was developed at the same time as PAL (NTSC predates both, but has hue shift problems that both PAL and SECAM solve, in different ways).

That makes more sense . . . it would be odd to design a TV transmission format around that, but a video connector standard would make more sense. Still, the lack of commercial SECAM encoders seems a bit odd . . . even with a fairly limited market, I'd have thought volumes would still have been high enough to favor SECAM-specific video encoding rather than transcoding PAL/NTSC composite/s-video devices to RGB (the few things using YPbPr would be much simpler to transcode to RGB though).

 

The fact that SECAM wasn't compatible with PAL makes sense too, since the 2 were developed (more or less) in parallel. (as for the hue shift issues, PAL/SECAM solved them, but at the expense of vertical chroma resolution; that also made some neat graphics exploits possible, with some very solid dithered pseudo-color artifacts in PAL video -including the pseudo 80x96 256-color "mode" on the Atari 8-bit)

 

But as far as SCART goes (as a general AV standard -composite, RGB, mono, stereo, etc), being tied to government/nationalistic/domestic industry interests makes sense . . . and as such, it also made sense to standardize it for professional and home applications. (a totally different story from open-market-defined standards/de-facto standards as in the US and Japan)

 

Though it is a bit ironic that SCART ended up being fairly popular in Japan, but never did much of anything in North America. (or RGB for that matter -pretty much limited to multimedia and computer monitors . . . and then the standardization of YPbPr for DVD rather than conforming to the more established RGB standard -and since analog TVs and monitors -analog and digital- still work on RGB internally, that signal still had to be decoded . . . so it was a matter of having that hardware in the TV vs the DVD player/etc -assuming said device even used YPbPr natively and didn't just transcode from RGB anyway ;))

Edited by kool kitty89

Still, the lack of commercial SECAM encoders seems a bit odd . . . even with a fairly limited market, I'd have thought volumes would still have been high enough to favor SECAM-specific video encoding rather than transcoding PAL/NTSC composite/s-video devices to RGB.
I can see three reasons for that:

- limited market (few countries used SECAM)

- complexity (SECAM is more difficult to encode than PAL)

- RGB through SCART was an easier and cheaper alternative (a lot of stuff uses RGB internally; for the few machines that didn't, PAL->RGB decoder chips were widely available since they're used in TV sets)


YPbPr has one nice advantage. Well, it has a few, but one nice advantage is that the basic controls like "tint", "brightness", "color", and such are simple things to do on that signal, whereas straight RGB offers brightness and contrast easily enough, but "tint" and "saturation" are not so straightforward.

 

The other thing about YCbCr is being able to run color at a different resolution from monochrome, or just running a straight monochrome signal. A baseline grey scale signal takes only one conductor. I do this regularly with Propeller video output. The same is true for S-video luma, but the YCbCr has pixel perfect color resolution, if desired.

 

The color difference signals can be degraded considerably, without impacting overall detail. And I'm struggling to remember which video compression scheme maps over to YCbCr, but there is one, and the synergy boils down to the difference signal making sense compression wise.

 

Of all the input standards, I like YCbCr the best for those properties. It can take a standard NTSC/PAL composite signal displayed as monochrome on the "Y" channel. All three can be driven for a 640x400-ish pixel-perfect display at NTSC scan rates, higher on PAL. And it goes up from there, with a full-on analog 1080p signal being accepted by a fairly large number of display devices.


I think the idea is that with a pure RGB signal you don't have to worry about tint and saturation. On NTSC phase errors will alter the tint, while on PAL it will reduce saturation (PAL TVs don't have tint controls unless they also support NTSC)

 

Pretty much all major video compression schemes use chroma subsampling. It's a simple way to reduce video bandwidth by 33-50%.
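The 33-50% figure is just sample-count arithmetic. A quick sketch (Python, illustrative):

# Per 4 luma samples, 4:2:2 keeps 2 Cb + 2 Cr chroma samples, and
# 4:2:0 keeps 1 + 1 (averaged over two lines); 4:4:4 would keep 4 + 4.

def saving(luma, cb, cr):
    full = luma * 3                  # three full-rate channels
    kept = luma + cb + cr
    return 1 - kept / full

print(f"4:2:2 saves {saving(4, 2, 2):.0%}")   # 33%
print(f"4:2:0 saves {saving(4, 1, 1):.0%}")   # 50%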


But, the source material isn't always corrected. In TV / Movie land, source correction varies significantly. Many people likely don't care, but for those that do, RGB is actually limiting.

 

Thanks for the data on video compression. So, it's most of them then.

 

Very interesting on PAL sets not including tint control... It sure is a more precise standard. Signal tolerances are a fraction of what NTSC will tolerate. What about things like DVD media? Or component aging, display color temp reference variations? Do PAL users just deal with those? Always curious about the other standards...


Or component aging, display color temp reference variations? Do PAL users just deal with those?

 

Back in the old days of simple analog filters, the benefit of PAL's phase shift didn't end at the antenna. In a properly designed set, it can also cancel out tint variations caused by components that are out of calibration, temperature sensitive, or just faltering with old age.

 

Once things turned digital, there were better ways to deal with component variations. But it was another benefit of PAL in the stone age.

 

- KS

Edited by kskunk

- RGB through SCART was an easier and cheaper alternative (a lot of stuff uses RGB internally; for the few machines that didn't, PAL->RGB decoder chips were widely available since they're used in TV sets)

As far as home video games and computers in the 70s and early 80s go, the vast majority of products didn't use RGB internally, but usually YCbCr or some custom Y/C colorspace that was output as conventional NTSC or PAL Y/C and/or composite. (and RF, obviously) Albeit there were also many monochrome devices, which were neither and didn't really benefit from RGB either. (luminance signals with no color carrier are decoded with the comb/notch filter of a TV -or composite monitor- disabled, thus resulting in component/RGB-like clarity -or S-video luma for that matter . . . granted, B/W through RF will still have added noise -varying by modulator and TV- but still lack the artifacts that a color signal would have)

 

Of the early consumer-level computers and game consoles on the mass market from '75-82, the only RGB-specific ones I can think of would be the BBC Micro, ZX Spectrum (internal RGBI, but RF output as standard), and maybe the Intellivision or Odyssey2.

The IBM PC used RGB, but certainly not standard RGB . . . it used a proprietary 4-bit RGBI format with "digital" monitor output (4 separate RGBI lines with 2 levels each) with special monitors needed to decode the color correctly (EGA did the same thing, but with 6 "digital" RGB lines -VGA used normal analog RGB), and of course, old CGA cards had composite out as standard as well.

 

The majority (and certainly many of the most popular) machines of the time used a Y/C colorspace internally: VCS, Apple II, A8/5200, VIC-20, various TMS9918-derived consoles/computers, CoCo, C64, NES, 7800, etc. Albeit both major successors to the TMS9918 VDP switched to RGB (6-bit for Sega, 9-bit for the Yamaha V9938) with approximated YCbCr values in compatibility mode.

 

 

 

YPbPr has one nice advantage. Well, it has a few, but one nice advantage is that the basic controls like "tint", "brightness", "color", and such are simple things to do on that signal, whereas straight RGB offers brightness and contrast easily enough, but "tint" and "saturation" are not so straightforward.

Just have R-G-B tuning instead . . . actually better than hue adjustment, as some things can't be corrected within the limits of the "hue" control of a TV. (actually, some old TVs have RGB pots accessible externally -though usually on the back service panel, not with the normal user controls) VGA monitors tended to have RGB pots (or digital RGB intensity control) as a standard feature.

 

The other thing about YCbCr is being able to run color at a different resolution from monochrome, or just running a straight monochrome signal. A baseline grey scale signal takes only one conductor. I do this regularly with Propeller video output. The same is true for S-video luma, but the YCbCr has pixel perfect color resolution, if desired.

All of that can be accomplished on the digital end anyway (you could use any colorspace you want, so long as the analog output is converted to RGB . . . and that's what most/all PC video cards -and modern game consoles, and many DVD/BD players- did/do). Actually, was/is it even common for VCD, DVD, SVCD (or BD, etc) players to use VDCs that natively work in digital YCbCr and natively output YPbPr? (vs using conventional 24/32-bit RGB and converting analog RGB to YCbCr with a simple video encoder -you need added hardware for composite and S-video support anyway . . . RGB to YPbPr conversion is a lot simpler than NTSC/PAL Y/C encoding)

 

The color difference signals can be degraded considerably, without impacting overall detail.

That's all done on the digital end regardless . . . "compression" on the analog end really doesn't matter unless you have REALLY long cables. (and even then you'd run into problems with YPbPr -and for the better part of a decade, DVI/HDMI have been better options anyway -be it digital YCbCr or RGB, you generally avoid the problems of transmission over long cables)

 

And I'm struggling to remember which video compression scheme maps over to YCbCr, but there is one, and the synergy boils down to the difference signal making sense compression wise

JPEG, H.261, MPEG, and all related formats use YUV/YCbCr internally, though the vast majority of devices they're displayed through use RGB natively. ;) (usually 24-bit, though 15/16-bit wasn't too uncommon either . . . or decoded to 24-bit and then posterized -or dithered- to conform to a 15/16-bit display)

Cinepak supported a YCbCr mode too (with 4:2:0 subsampling), as did Indeo (4:1:0 subsampling), and several other formats. (I think the Cinepak and JPEG derivatives used on the Jaguar supported CRY though)
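For reference, the standard BT.601-style mapping between RGB and the Y/Cb/Cr space these codecs work in (full-range, as used by JPEG; a sketch in Python):

# Y is a weighted luma sum; Cb and Cr are scaled B-Y and R-Y
# difference signals, offset so 128 means "no color".

def rgb_to_ycbcr(r, g, b):
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = (b - y) * 0.564 + 128
    cr = (r - y) * 0.713 + 128
    return y, cb, cr

print(rgb_to_ycbcr(255, 0, 0))   # pure red: low Y, Cr well above 128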

 

Plus, there's the issue that all color CRT monitors work on RGB internally anyway, so you MUST decode other formats to that at some point. (and doing that inside the TV can lead to more errors/calibration issues)

Many digital display devices work on RGB as well (Plasma and LCD -I think OLED as well- use RGB elements somewhat analogous to the RGB phosphor of analog monitors -LED jumbotron-type displays as well for that matter)

 

Of all the input standards, I like YCbCr the best for those properties. It can take a standard NTSC/PAL composite signal displayed as monochrome on the "Y" channel. All three can be driven for a 640x400-ish pixel-perfect display at NTSC scan rates, higher on PAL. And it goes up from there, with a full-on analog 1080p signal being accepted by a fairly large number of display devices.

Any analog video format (aside from composite/s-video chroma) has a technically indefinite maximum horizontal resolution . . . it could be infinitely high in principle, practically limited only by beam precision/response time, dot pitch, etc. (same for RGB, monochrome, YPbPr, etc) A 15.7 kHz monitor with a suitably fine pitch and high-precision beam could certainly handle 1280x240 (or 480i) . . . and some Amiga models DID support such resolutions. Vertical resolution is the issue though, and that's limited by the physical number of scanlines, which, of course, is defined by the H and V sync rates.

 

...Well, there's also the issue of overscan . . . normal 60 Hz 15.7 kHz NTSC rate video may be well capable of over 480i/240p (though certainly limited to less than 525/262 lines due to reserved vblank/sync/flyback time) but the average TV tends to show ~224/448 lines (sometimes more sometimes less), though some show the full 240/480 lines (or a bit more) and custom calibration can certainly expose full overscan.

Edited by kool kitty89

Re: RGB pots.

 

Yep. On those TVs, you would also find internal video drive pots to deal with levels and linearity. I used to do TV adjustments for date money. The Atari was a nice machine for this actually. My own Atari was shifted a bit red, so it needed its color pot tweaked slightly. Once that was done, I would connect it to TVs and start with grey and dot patterns to deal with screen convergence correction, then purity for the primary colors, then linearity, levels, color temp (I ran 'em warm, which ended up being trendy as that's the dominant mode today, where the original NTSC standard was fairly "cool" with a distinct tint to the display), etc...

 

Once those corrections were done, the standard tint or hue control was sufficient to perform correction on variances in source material and received signal.

 

I've seen the results when ordinary people get hold of RGB tuning. There is a solid reason for having tint, bright, contrast, or picture controls. :)

 

Re: Encoders.. Interesting question! I'll have to pull a few of mine apart to see! None have RGB, all have YCbCr, some with progressive output, some interlaced, and the HD ones capable of the higher scan rates.

 

Re: DVI / HDMI being better options.

 

Well, that's where I will personally differ. Output on YCbCr is easy, and not encrypted, nor fickle in many ways the digital signals are. If one is building their own devices, which I do, analog component is just cake easy compared to any digital one. Both YCbCr and RGB are easy to output analog, but the YCbCr has the added benefit of carrying the monochrome data by itself, which has software benefits where one might want to do tricks with the color detail. Doing that with RGB isn't so easy. Given that's not a requirement, both can be done with a few resistors and software PWM type level generation, no worries. The digital domain is of far higher complexity.

 

Edit: Finally, people "see" something close to YCbCr, or at least the intensity/hue/saturation color space, easily. We have separate detectors for intensity data, with color being managed for overall detail/benefit to us. Blue receptors are loosely scattered about, leaving us unable to see sharp detail in blue as consistently as we can in green and red. (assuming the person has normal color vision) I've always preferred composite, S-video and YCbCr for that reason too. Maps nicely to how we are wired, where RGB makes a lot of sense given how the hardware so often works.

 

Another edit: On my personal devices, the analog inputs are dead simple, and they work always. No worries. Higher end signal display quality on moderate and short cable runs isn't really significant, with the slight edge going to the analog. A nice pixel roll-off tends to blend a few things together that make sense, where every pixel can be seen as a pixel otherwise. I vary in this, but for movies, prefer to not see pixels, unless the director puts them out there as such. I never used DVI much, worked when I did, and when I had the RIGHT DVI adapter. HDMI works, until it doesn't, which isn't often, but it is damn frustrating to see device chain resets on basic things like just inserting media... Oh well. Maybe we will come full circle on that, making things a bit easier / faster. The encryption layer didn't do anybody any favors so far, utility wise.

Edited by potatohead

Once those corrections were done, the standard tint or hue control was sufficient to perform correction on variances in source material and received signal.

There's a problem with that though . . . some devices have incorrect intensity on just one of the R, G, or B channels, and that can't be corrected with YUV (or YIQ) hue adjustment. (some PC video cards have that problem when outputting composite, s-video, or YPbPr, and many analog SCART RGB to YCbCr boxes seem to have that problem too -albeit the latter case can be corrected by manually adjusting the RGB trim pots in the converter)

 

I've seen the results when ordinary people get hold of RGB tuning. There is a solid reason for having tint, bright, contrast, or picture controls.

Well, you'd probably limit RGB fine tuning to just a narrow fine-tuning range if supported on consumer TVs, probably along with additional brightness/contrast/BL/etc controls. (albeit PC monitors also tend to do that: brightness, contrast, and RGB controls -via pot knobs or menu)

 

Of course, the original reason for hue adjustment was to correct for NTSC transmission errors in chroma more than anything else. (these days it's mainly used for color correction across differing video sources/devices)

 

Re: Encoders.. Interesting question! I'll have to pull a few of mine apart to see! None have RGB, all have YCbCr, some with progressive output, some interlaced, and the HD ones capable of the higher scan rates.

Are you talking about the DVD player comment I made?

 

Re: DVI / HDMI being better options.

 

Well, that's where I will personally differ. Output on YCbCr is easy, and not encrypted, nor fickle in many ways the digital signals are. If one is building their own devices, which I do, analog component is just cake easy compared to any digital one. Both YCbCr and RGB are easy to output analog, but the YCbCr has the added benefit of carrying the monochrome data by itself, which has software benefits where one might want to do tricks with the color detail. Doing that with RGB isn't so easy. Given that's not a requirement, both can be done with a few resistors and software PWM type level generation, no worries. The digital domain is of far higher complexity.

I was talking purely about signal degradation through cables, not interfacing complexity or other issues. (or the stupid lack of proper resolution detection on many HDTVs -especially "720p" sets not identifying the true native resolution -often 1366x768 or 1360x768)

 

Also, as to optionally just using Y/luma for a composite/s-video/component connection, you could do the same for RGB by running a single video line and tying the RGB connector(s) to that common signal. (like the passthrough adapters used for the ST in monochrome mode on VGA monitors, etc)

 

And by building devices, what specifically are you talking about? (homebrew hardware projects?)

Edited by kool kitty89

Yeah, I was talking about the DVD comment. Just curious, and I've a couple that I can pull the cover on, no worries.

 

I really like a robust analog signal path. What is very interesting to me is the higher clocks in today's devices really bring out a lot in composite. Amazing really. I've output some totally over driven signals, like CGA 80 column text on composite and actually have the HDTV render most of that correctly. Damn cool. Way better than the all analog, maybe comb filter displays.

 

Many home theater people enjoy YCbCr too. It's right up there with the very best digital outputs, and with the right gear, it's very, very difficult to see the quality differences between that and a digital input.

 

My own experience is favorable, and I've got devices that output that all the way up to the 1080p, and it's great. For some reason, I've not always seen the same quality over RGB, and I don't know why. I think it has to do with small phase differences in the three channels. Maybe one of these days, I'll try and put the scope on it to see. In any case, having the luma on one channel basically keeps detail registration sharp, and color can smudge a bit with nobody the wiser, where the same impact on RGB creates fringing. And like I said, it's possible to clock color difference signals differently, and very easily.

 

Re: Yes, home brew. I've had a lot of fun with the Propeller and signals. The thing can output most any signal, though it's not quite accurate enough to do a solid PAL. There are some reasonable emulations, which I've run on VGA and TV at various resolutions and signal types. Cool. Several of us are kind of into the display part of things, building signals, sprite engines and such. We've also gone and duplicated some classic computer signals, like the old Apple with artifacting, Atari like, C64 like, etc...

 

Mostly I use S-video / VGA / NTSC composite for the home-brew projects. System RAM and overall clock speed are right in the sweet spot for a standard TV sweep rate, and the sweep for 640x200 interlaced is the lowest common denominator I've found newer monitors can scan. Interlaced, but... Nothing will display a 15 kHz sweep anymore that I can find, unless it's actually a TV.

 

This generation Prop chip requires a lot of resources to do a good YCbCr, but the next gen one will have it built right in. Looking forward to that. 1080p output... :) And that one will do RGB easily, with only color transforms needing to be changed. If somebody wants to, both can be output at the same time, just like is possible on the current chip, which can do a composite, VGA and S-video, etc... all at the same time, different scan rates, different display data, or the same, depending on what somebody wants to do, and how the RAM gets used.

 

The mention of the various signals just got me riffing on it a bit. I've been tinkering with display devices since I was a middle schooler. :) Most televisions can output a nice display when they are well tweaked. It's been a fun ride checking out all the stuff device manufacturers do. Also a big fan of artifacting. I can do lots of NTSC tricks, but don't know much about PAL. I have a display now, and a capture card, but I don't have a good signal source yet. Needs to be VERY accurate. That will have to wait until I can get a better clock reference, maybe doubling a real PAL xtal or something... Anyway, I've artifacted on anything I could get my hands on, and love how the analog signal can be exploited in this way. Sometimes the effects are really good. None of that works on Jaguar and modern devices though. They all output alternating phase, basically broadcast quality signals.

 

Suppose it should go back to the Jaguar CRY discussion now, which I found very interesting. It's just fun to talk about signals and such sometimes, that's all. Sorry to distract.

 

FWIW, the Jaguar composite output is not bad at all! I've used it with S-video too, looks great. "Rayman" looks fantastic. "AvP" so-so, though I personally liked the composite output. It looked kind of spooky back then, a little over driven, which just adds to the atmosphere of the game. S-video looked much better, but then there are the pixels.... Funny how that all worked out.

 

Not sure what else it will do output wise. One of these days, I'll have to make some cables and find out. Was there RGB output or YCbCr component?

 

(goes off to look as that might be a reason to pull the Jag out for a quick play)

 

Cheers, and it's always fun to talk a little hardware with you. Always interesting!

 

Edit: Yep, it's got nice signal outputs. Will have to setup and try the RGB.

Edited by potatohead

FWIW, the Jaguar composite output is not bad at all! I've used it with S-video too, looks great. "Rayman" looks fantastic. "AvP" so-so, though I personally liked the composite output. It looked kind of spooky back then, a little over driven, which just adds to the atmosphere of the game. S-video looked much better, but then there are the pixels.... Funny how that all worked out.

The composite output isn't bad from what I've seen, but it does have that characteristic chroma moire (rainbowing) issue on high-luminance-contrast dithering (and fine detail). It seems to happen at relatively low resolutions too (ie the common ~320 wide res used in most games of the time -stuff with ~6-7 MHz dot clocks).

It's really noticeable with Sony's CXA1145 and CXA1645 (and the Fujitsu MB3514) in NTSC, as seen on many Genesis models, the Neo Geo, and some Saturn and PlayStation models. (with those encoders, it seems to be OK at ~5.37 MHz -ie MD low res- but even at 6 MHz -Neo Geo- there's substantial rainbow banding moire, about as bad as MD stuff at 6.7 MHz -the common 320-wide mode, H40 . . . and it's not solid artifacting either, but oscillates with scrolling, as the dot clock isn't a direct multiple of the NTSC color clock -systems like the Atari 8-bit and Amiga, with 7.16 MHz modes, have artifacting too, but very solid/consistent -enough to practically rely on artifact colors)
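The stable-vs-oscillating distinction falls straight out of the clock ratios. A quick sketch (Python; the dot clocks are the ones quoted above):

# How far the NTSC color subcarrier phase advances per pixel at each
# dot clock. A ratio that repeats cleanly on the pixel grid gives a
# locked, usable artifact pattern; anything else slides around as the
# beat between the two clocks, ie the oscillating rainbow moire.

FSC = 3.579545e6                     # NTSC color subcarrier, Hz

for name, dot in [("A8/Amiga 7.16 MHz", 7.159090e6),
                  ("Genesis H40",       6.711647e6),
                  ("Neo Geo",           6.000000e6),
                  ("Genesis H32",       5.369318e6)]:
    print(f"{name}: {FSC / dot:.4f} subcarrier cycles per pixel")

# 7.16 MHz lands on exactly 0.5000 cycles/pixel (the dot clock is 2x
# the subcarrier), so the artifact phase repeats every 2 pixels on
# every line and frame: solid colors. H40's ~0.5333 never locks to
# the pixel grid that way, hence the crawling rainbows.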

 

S-video helps a lot for those encoders, though there's still faint rainbowing visible. (you obviously lose all the luminance degradation, and that's the big problem with composite video in general)

 

Also, unlike dot crawl (and general blur, to some extent), such chroma artifacts vary little from TV to TV . . . though it's certainly possible that Atari used more than one video encoder for the Jag. I've only seen/used one in person, so I'm going by what I noticed in that case -which was certainly decent, with no nasty/excessive dot crawl, but definitely very noticeable moire artifacts similar to what the Genesis often shows. In fact, the video quality in general wasn't that much different from the Genesis's composite on models with the CXA1145 -possibly closer to the sharper CXA1645 on some late models, or at least similar to those encoders when used on good TVs- which is quite good, and similar to the SNES at similar resolutions. (the MD usually runs at higher resolutions though, and there's definitely more blur/artifacting there in composite)

 

Not sure what else it will do output wise. One of these days, I'll have to make some cables and find out. Was there RGB output or YCbCr component?

 

(goes off to look as that might be a reason to pull the Jag out for a quick play)

 

Cheers, and it's always fun to talk a little hardware with you. Always interesting!

 

Edit: Yep, it's got nice signal outputs. Will have to setup and try the RGB.

Yes, the Jaguar (like most consoles and computers of the time) used RGB natively. (the system works internally on 24-bit RGB . . . CRY is converted to 24-bit RGB via look-up plus some additional logic handling intensity -256 colors via look-up, with logical linear intensity control for 256 levels of each color -with some redundancy due to the limitations of 24-bit RGB- and the object processor also supports 15/16-bit RGB direct color and 24-bit direct color -using 32 bits per pixel- as well as 1/2/4/8-bit indexed color using 256 CRAM entries)
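In code terms, the CRY-to-RGB step amounts to a table look-up plus a linear scale. A sketch in Python under the scheme described here (the toy table entries below are placeholders, not the real on-chip 256 x 24-bit ROM):

# A CRY pixel: the high byte picks one of 256 base colors, the low
# byte (Y) scales all three channels linearly toward black.

BASE_RGB = {0x00: (0, 255, 255),     # placeholder entries only; the
            0x80: (255, 255, 255),   # real table is a fixed ROM
            0xF0: (255, 64, 0)}      # inside the chip

def cry_to_rgb(pixel16):
    color = (pixel16 >> 8) & 0xFF    # base-color index
    y     =  pixel16       & 0xFF    # intensity, 256 linear steps
    r, g, b = BASE_RGB[color]
    return tuple(round(c * y / 255) for c in (r, g, b))

print(cry_to_rgb(0xF0FF))            # full intensity: the base color
print(cry_to_rgb(0xF080))            # half intensity: ~half as bright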

 

Actually, that's the issue that made me bring up the possibility of using non-RGB color DACs internally on the Jaguar: to allow a more flexible, customized colorspace rather than one tied to 24-bit RGB. CRY uses 256 indexed colors from 24-bit RGB and has an additional 8 bits of intensity, which fades all of them towards black, but can't fade/desaturate towards white. (unlike YCbCr)

 

However, another possibility might be doing things on the analog end differently in general, ie using a different colorspace than 24-bit RGB for the video DACs themselves (more like some older consoles and computers), and handling conversion to RGB (if needed -ie for the European market or for developer systems) externally (on the board or via an add-on adapter).

Using conventional YCbCr natively wouldn't address the need for a better CRY alternative either, since YCbCr (and YUV) don't desaturate towards white and black completely (ie at max luminance, not all colors are white, and at minimum, not all are black). So the goal would be something like YCbCr that desaturated totally towards white and black at max/min intensity (like RGB shading can do, and like CRY does towards black), perhaps handling chroma similarly to YCbCr. (ie R-G and B-G difference for the 2 color elements . . . or a more custom colorspace could be used, more like CRY did -though CbCr would fit better with existing standards, with just the luminance/intensity handling differing) Not only would it allow better color/lighting effects, but it would also provide many more pale/desaturated colors. (as it is, you only get the 256 CRY colors and darker shades of those -not paler/lighter ones)
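One way to picture the proposed intensity axis (my own illustration of the idea, in Python; not anything Flare actually specified): run it black -> base color -> white, so every hue desaturates completely at both extremes:

# intensity 0.0 = black, 0.5 = the full base color, 1.0 = white;
# plain YCbCr cannot do this for all hues, as noted above.

def shade(base_rgb, intensity):
    if intensity <= 0.5:
        t = intensity * 2                     # lerp black -> color
        return tuple(round(c * t) for c in base_rgb)
    t = (intensity - 0.5) * 2                 # lerp color -> white
    return tuple(round(c + (255 - c) * t) for c in base_rgb)

red = (255, 0, 0)
print(shade(red, 0.0), shade(red, 0.5), shade(red, 1.0))
# (0, 0, 0) (255, 0, 0) (255, 255, 255)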

 

You'd then have that custom YCbCr-like analog colorspace, with conversion (to RGB, or to PAL/NTSC Y/C) done as the final step. (RGB probably only for SECAM models and via adapters for NTSC/PAL models -with the native YCbCr-like signals included on an external port)

 

I argued this possibility earlier, but it was mentioned that you'd run into a lot of added cost there too on the analog conversion end, including possible calibration issues. (though I also pointed out that many older console/computer designs did do custom colorspaces like that -VCS, A8, VIC, C64, Apple II, NES, etc . . . and those did have analog calibration to deal with -albeit usually via a simple color pot)

 


OTOH, with the RGB conversion method Flare did opt for, they could have alternatively chosen the 256 color selections all from the middle of the RGB range (rather than near max R, G, or B -as many of the CRY colors are). You'd end up with relatively dark colors in general, though you could potentially then smoothly step up the RGB levels of each color to fade towards bright (eventually desaturating towards white rather than to full-intensity color) and limit dark shades to 128 levels towards black rather than 256 (not truly 256 anyway, since there's a fair bit of redundancy in colors/shades due to the limitations of the conversion to 24-bit RGB). However, that would limit the base color selection more in general, or (if using colors fairly close to the existing CRY colors) you'd get rather poor/choppy shading towards white, due to the colors already being fairly close to RGB white. (but still much smoother than 15-bit RGB shading)

Flare already spent a lot of time optimizing a fixed 256 color selection for flexibility and general utility (offering a good range/selection of colors, and organized in such a way to allow pretty decent logical blending via simple averaging of 2 4-bit chroma components -though a 256x256x8-bit table would allow more optimal blending, and some games opt to do that in software iirc)

And offering both normal CRY and a separate, custom mid-range color CRY as separate modes would mean 2 different CRY look-up tables (so another 256x24-bit ROM table eating up precious chip space).

 

Hmm, however, it might have been practical to use only the current normal CRY table, but allow both the 256 shades towards black mode, and a mode allowing 128 shades towards black and 127 shades towards white (with the mid-point being the "normal" intensity) at the expense of very coarse shading towards white (and a large chunk of redundant colors mapped as white or near-white) and somewhat less smooth shading towards black (but still nearly 128 true shades towards black).

To do that properly (with linear shades), you'd need to have intensity use steps of 2 rather than 1 (so R-G-B would get 2,2,2 added/subtracted rather than 1,1,1, effectively making it 7-7-7 RGB -which is what the 15-bit CRY/RGB combo mode already does).
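A sketch of that hypothetical mode (my own illustration of the proposal in Python, not a real Jaguar mode), reusing a CRY-style base color with Y=128 as "normal" intensity:

# 128 shades down toward black, then 127 steps of +2 per channel up
# toward white; the clamp at 255 is what makes the near-white shades
# coarse and partly redundant, as described above.

def cry_midpoint_shade(base_rgb, y):
    if y <= 128:
        return tuple(round(c * y / 128) for c in base_rgb)
    step = (y - 128) * 2
    return tuple(min(255, c + step) for c in base_rgb)

orange = (255, 64, 0)                    # a bright, CRY-like base color
print(cry_midpoint_shade(orange, 0))     # (0, 0, 0)
print(cry_midpoint_shade(orange, 128))   # the base color itself
print(cry_midpoint_shade(orange, 255))   # (255, 255, 254): R clamped long ago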

 

That wouldn't be nearly as nice as a true custom colorspace handled on the analog end, but it would fit well with the existing 24-bit RGB design of the Jaguar (and mesh with the existing RGB modes too) . . . it would offer limited pale/white-ish shades of colors and allow more saturation-based lighting/shading/blending effects (lens flares, explosions, reflective lighting, etc). Obviously that would be nowhere near as smooth as shades towards black (the primary aim of the colorspace -and obviously needed for common effects like shadows and non-reflective light-sourced shading -and some games only do that sort of shading/lighting, like Doom), and likewise nowhere near as smooth as 24-bit shading (CRY effectively allows 24-bit quality shading towards black), but it would allow things that normal CRY can't do well or at all (at least without additional software effects), and probably wouldn't be too bad compared to shading towards white in 15-bit RGB. (or actually better for some colors -the best you'd get in 15-bit RGB would be 32 shades total -towards white and black- for any given color, and the brighter the color, the fewer shades towards white; the darker, the fewer shades towards black . . . so a 15-bit RGB game using colors generally quite close to CRY's 256 colors would obviously have even poorer shading towards white, and still much inferior shading towards black)

Albeit you still wouldn't be able to shade towards all 6 primary colors (R/Y/B/G/C/M) like RGB can for colored lighting effects. (or even towards R/G/B like CbCr color allows)

Edited by kool kitty89

The composite output isn't bad from what I've seen, but it does have that characteristic chroma moire (rainbowing) issue on high-luminance-contrast dithering (and fine detail). It seems to happen at relatively low resolutions too (ie the common ~320 wide res used in most games of the time -stuff with ~6-7 MHz dot clocks).

 

Yeah, that kicks in at about 160-256 pixels. 320 is just a bit much for composite, unless a lot of care goes into the art direction.

 

Atari 8 bit computers do not output a NTSC phase shifted signal. That's the artifact consistency right there, and the primary reason why the machine is a 160 pixel color machine.

 

How that color phase is done, and the clock multiples off the color burst does impact patterns and rainbow effects a lot. That's one of the things I've found fun to explore.

 

Newer displays, in particular most HD capable ones, sample the composite signal to a much higher precision, significantly reducing those effects in a lot of cases. DSP type technology can do wonderful things to a composite signal. Of course, it can botch it too, sometimes "rounding off" or eliminating entire pixels... In general though, I've been pleased with HD capable displays and composite signals. I'll have to pull out the Jag and check it out on the HD units.

 

I argued this possibility earlier, but it was mentioned that you'd run into a lot of added cost there too on the analog conversion end, including possible calibration issues.

 

Gotta digest this one some. Dense, but interesting comment. I'll be back later :)


Missed this:

 

There's a problem with that though . . . some devices have incorrect intensity on just one of the R, G, or B channels, and that can't be corrected with YUV (or YIQ) hue adjustment.

 

Yep. Spot on. Most TVs I worked with had internal linearity controls. The art of it was matching the curves back up to align well with the "soft" output channel. Used the grey scale on the Atari for that regularly. Newer devices seem much more solid. Often those controls are still there in a service menu or the like, if a person knows to look for them. Just how it should be. My Sony WEGA has that, and calibrates nicely to display nearly all of the overscan without serious issues. :)

 

On older analog devices, that's usually among the first-pass calibrations, right after basic convergence and purity (if applicable). It's pretty amazing the difference seen when there's a "soft" channel -it will always look "off".

Link to comment
Share on other sites

The composite output isn't bad from what I've seen, but it does have that characteristic chroma moire (rainbowing) issue on high-luminance-contrast dithering (and fine detail). It seems to happen at relatively low resolutions too (ie the common ~320 wide res used in most games of the time -stuff with ~6-7 MHz dot clocks).

 

Yeah, that kicks in at about 160-256 pixels. 320 is just a bit much for composite, unless a lot of care goes into the art direction.

Well, it also depends on the video encoder and the specific dot clock for "320" . . . though in the context of normally calibrated NTSC sets (with ~224 lines visible and square pixels achieved at roughly a 6.25 MHz dot rate), 320 pixels fitting just to overscan is exactly what the Genesis uses (6.7 MHz), while the A8's and Amiga's "320" width leaves a significant border and actually shows ~344 pixels (which is exactly how NEC/Hudson defined the PCE/TG-16's 7.16 MHz dot mode). The C64's 320 mode is even higher, at 8 MHz like the ST (about 380 pixels visible on such TVs). That's disregarding very tightly calibrated sets with no overscan (full 240/480 lines visible) and old sets with even more overscan.

The 256/280-pixel modes of the CoCo and Apple II were both 7.16 MHz with large borders (likewise, the 128-pixel mode of the CoCo ran at the same rate as the 160-pixel mode of the A8, leaving a much larger border than the A8)
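All those visible-width figures fall out of one simple relation: pixels shown = dot clock x the slice of the scanline the set actually displays. A rough sketch, assuming ~48 us of each scanline is visible on a normally calibrated NTSC set (an assumption on my part; the true active line is ~52.6 us, and overscan hides the edges):

#include <stdio.h>

int main(void)
{
    /* Assumed visible portion of an NTSC scanline on a normally
       calibrated set, in microseconds. */
    const double visible_us = 48.0;

    /* The dot clocks discussed above, in MHz. */
    const double clocks[] = { 5.37, 6.25, 6.7, 7.16, 8.0 };

    for (int i = 0; i < 5; i++)
        printf("%.2f MHz dot clock -> ~%.0f pixels visible\n",
               clocks[i], clocks[i] * visible_us);
    return 0;
}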

 

But as to the actual composite video quality at a given dot clock, there's more to it than that too . . . on some older NTSC encoders, anything above a ~6 MHz dot clock (~288 pixels at typical overscan) tends to have significant moire problems . . . and some older systems have it at even lower rates (TMS9918-based systems show it significantly even at 5.37 MHz, as do the NES/SMS/low-res Genesis/TG16 at the same rate), and then there's the issue of dot crawl. Dot crawl (and luminance artifacts in general) is the really big problem with composite, but good encoders and good TVs (with good comb filters) tend to handle it adequately even at high resolutions (like the 640~720 pixels of newer systems) and also handle lower resolutions quite well (the N64's composite was particularly good in 320x240 stuff -not as good as S-video obviously, but with very limited dot crawl and chroma artifacts; blur was more noticeable than either of those, but that wasn't terrible either).

 

There's also the feature of some encoders to stagger/dither luminance artifacts (to create checkerboard dot crawl rather than the natural jailbar artifacts). That's generally advantageous for systems with well-managed luma artifacts (ie fainter in general), but IMO it actually looks worse on more primitive encoders (it makes for horrible swarming dot crawl on the NES vs the unfiltered jailbars of the CXA1145 on the Master System and model 1 Genesis -especially since we're talking scrolling screens, where jailbar artifacts don't swarm with animation like the checkerboarding does)

All modern encoders seem to use the checkerboarding mechanism, so there aren't any similar-quality encoders without that filtering to compare against. (Which makes me wonder if I'd still prefer the jailbars in such cases . . . especially since interlaced video makes the dot crawl animated/swarming even on static images.)
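For anyone trying to picture the two artifact patterns: NTSC runs 227.5 subcarrier cycles per line, so an encoder that keeps chroma phase continuous shifts the artifact phase half a cycle every scanline (checkerboard, which crawls), while one that effectively restarts the subcarrier each line stacks the artifacts into columns (jailbars, which sit still under scrolling). A toy illustration of just that geometry -not a model of any real encoder:

#include <stdio.h>
#include <math.h>

int main(void)
{
    const double PI = 3.141592653589793;

    /* Toy model: at a 2x-colorburst dot clock, each pixel advances
       the subcarrier by half a cycle (PI).  With phase continuity,
       each new line also starts half a cycle later; with a per-line
       phase reset, every line starts at the same phase. */
    for (int continuous = 0; continuous <= 1; continuous++) {
        printf(continuous ? "\nphase-continuous (checkerboard):\n"
                          : "phase-reset (jailbars):\n");
        for (int line = 0; line < 4; line++) {
            double line_phase = continuous ? line * PI : 0.0;
            for (int x = 0; x < 16; x++)
                putchar(cos(x * PI + line_phase) >= 0.0 ? '#' : '.');
            putchar('\n');
        }
    }
    return 0;
}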

 

Newer displays, in particular most HD capable ones, sample the composite signal to a much higher precision, significantly reducing those effects in a lot of cases. DSP type technology can do wonderful things to a composite signal. Of course, it can botch it too, sometimes "rounding off" or eliminating entire pixels... In general though, I've been pleased with HD capable displays and composite signals. I'll have to pull out the Jag and check it out on the HD units.

I haven't seen any that offset the issues seen in the encoders used on the Genesis (even the CXA1645, which honestly looks little different from the 1145 other than moderately less blur -luminance artifacts and chroma moire are nearly identical on all TVs I've tried).

 

OTOH, I've seen some HDTVs with really, really horrible composite video support . . . aside from the common deinterlacing errors from misinterpreting a 240p signal (which not all HD sets do, but all the ones with poor SD support do), and setting aside harmful "enhancement" filters (especially edge enhancement), there are some cases of really poor comb filters out there. (at least much worse than any decent newer SDTV -or many higher-quality older sets)

 

I argued this possibility earlier, but it was mentioned that you'd run into a lot of added cost there too on the analog conversion end, including possible calibration issues.

 

Gotta digest this one some. Dense, but interesting comment. I'll be back later :)

Here's the original response pointing out the problems with my suggestion/question about analog conversion:

In any case, the main idea would be using a custom colorspace through the video DACs and converting the analog output to standard formats (NTSC/PAL Y/C or composite, and RGB) either on-chip or with additional external video encoding circuitry.
That would have been complicated and expensive (analog stuff has calibration and precision problems, whereas digital stuff works in exactly the same way every time). All for supporting something that wasn't even part of the design goals. Remember Atari were penny-pinchers, too :)

 

 

Edit:

Also, here's the CRY color model depicted visually:

http://www.atariage....6-bit-cry-mode/

For the actual 256 colors used by the Jaguar:

http://www.atariage....ttach_id=153231

 

 

And CRY if it were depicted with 16 bits rather than 8 for chroma:

http://www.atariage....ttach_id=153232


Missed this:

 

There's a problem with that though . . . some devices have incorrect intensity on just one of the R, G, or B channels, and that can't be corrected with YUV (or YIQ) hue adjustment.

Yep. Spot on. Most TVs I worked with had internal linearity controls. The art of it was matching the curves back up to align well with the "soft" output channel. Used the grey scale on the Atari for that regularly. Newer devices seem much more solid. Often those controls are still there in a service menu or the like, if a person knows to look for them. Just how it should be. My Sony WEGA has that, and calibrates nicely to display nearly all of the overscan without serious issues. :)

 

On older analog devices, that's usually among the first-pass calibrations, right after basic convergence and purity (if applicable). It's pretty amazing the difference seen when there's a "soft" channel -it will always look "off".

Yes, though there's the problem of when the TV is calibrated properly but the device outputting to it is not . . . an average user could only correct that via the normal TV adjustments (most convenient if the TV supports multiple custom programmable slots rather than the usual 1 -or, obviously, manual pots on older sets) . . . but most sets would then require re-calibration when switching back to normal operation, short of using a factory preset option (though I usually don't like those).

 

For non-average (tech-savvy) users there may be the option to correct the color issue on the external device itself (which is certainly the case for the RGB=>YPbPr converter boxes, which have RGB pots inside them), but that's not always practical either, as there may be no adjustment on the external device at all.

 

 

Hmm, actually, you wouldn't necessarily need direct RGB adjustment to address that issue either: 2-channel U/V (or Cb/Cr) gain control would do it, rather than the limited hue control. (or true I/Q control, for that matter)
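To make that concrete, here's a quick sketch (my own illustration, using the standard BT.601-style weights) of why the usual hue knob can't fix a soft channel: hue adjustment is just a rotation in the U/V plane, so it can re-aim a color error but never remove it, and it can't touch Y at all, whereas independent U and V gains (plus brightness/contrast for Y) give enough degrees of freedom to get close:

#include <stdio.h>
#include <math.h>

/* Standard-definition RGB -> YUV (BT.601-style weights). */
static void rgb_to_yuv(double r, double g, double b,
                       double *y, double *u, double *v)
{
    *y = 0.299 * r + 0.587 * g + 0.114 * b;
    *u = 0.492 * (b - *y);
    *v = 0.877 * (r - *y);
}

int main(void)
{
    double y, u, v;

    /* A mid grey with the red channel running 20% weak -the
       "soft channel" case described above.  A correct grey
       would give U = V = 0; the weak channel shifts both and
       lowers Y. */
    rgb_to_yuv(0.8 * 0.5, 0.5, 0.5, &y, &u, &v);
    printf("weak-red grey:  Y=%.3f U=%.3f V=%.3f\n", y, u, v);

    /* Hue control only rotates (U,V): the error vector keeps
       its length, so no angle can zero it out. */
    double theta = 0.3; /* arbitrary hue tweak, radians */
    double u2 = u * cos(theta) - v * sin(theta);
    double v2 = u * sin(theta) + v * cos(theta);
    printf("after hue turn: Y=%.3f U=%.3f V=%.3f\n", y, u2, v2);
    return 0;
}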

Edited by kool kitty89
Link to comment
Share on other sites
