rbairos Posted September 3, 2021 (Author)

On 8/31/2021 at 9:02 PM, Andrew Davie said:
I know the eye sensitivity for colours is the thinking behind this, but I am not understanding why the above is necessary here. This is just going to skew the colours, no? Have you tried just a simple average?

If the scene were alternating between full green and full blue squares, you'd still want the average to be more green than blue, I'm thinking. Green has more of an impact visually.
rbairos Posted September 3, 2021 (Author)

On 9/1/2021 at 12:01 PM, Mr SQL said:
The video quality is fantastic! Would it be possible to create an AtariVCR cart with the encoder and a port for a USB cam, to film movies right on the Atari or view video in real time? That would be very interesting for games and utilities. It could enable video conferencing capabilities for PlusCart, for example, where text conferencing has already been implemented.

Hm, maybe. It would be a big project, using a beefier microcontroller than the one I have now (16 MIPS currently). Encoding is very slow right now, but it could always use a black-and-white dithered solution to keep things speedy, I suppose. For color, I guess snapshots would be doable. Still a big hardware undertaking.
rbairos Posted September 3, 2021 (Author, edited)

On 8/31/2021 at 10:49 PM, macdlsa said:
Ahhhh... I missed the pre-processing part. Can I see, of course if allowed, a pre-processed video clip just before being fed to the encoder? It's just a curiosity...

I'll actually be doing a live presentation of some of the projects I've worked on over the years for my work, in mid-October. It will probably focus more on MovieCart, showing all the pre-processing elements. DM me for details. Alternatively, I can send you info on how to launch it on your own if you're interested. Another forum member already has, successfully.
Shawn Posted September 3, 2021

5 minutes ago, rbairos said:
I'll actually be doing a live presentation of some of the projects I've worked on over the years for my work, in mid-October. Probably focus more on MovieCart, showing all the pre-processing elements. DM me for details.
rbairos Posted September 3, 2021 (Author, edited)

On 8/31/2021 at 8:04 AM, Andrew Davie said:
In the above, the purple platform is a poor match.

Yah, you were right. My 'enhancement' stage was being too aggressive. I think I'll tone it down quite a bit now that these potential improvements are here:

Top row: original, with 100% custom enhancement filter, then encoding. Bottom row: no enhancement filter, then encoding.
Andrew Davie Posted September 3, 2021

49 minutes ago, rbairos said:
Yah, you were right. My 'enhancement' stage was being too aggressive. I think I'll tone it down quite a bit now that these potential improvements are here.

This looks much better to me. The yellow/white are now well separated. It will be interesting to see it in action.
Andrew Davie Posted September 3, 2021

1 hour ago, rbairos said:
If the scene were alternating between full green and full blue squares, you'd still want the average to be more green than blue, I'm thinking. Green has more of an impact visually.

Well, I disagree here that it's your job, or correct, to do this. If it were alternating, I'd want the effect to be exactly the same to the eye as it would be without adjustment. That is, we are not looking to "fix" things for the eye; we want to present the eye with the same thing it would have seen anyway. If it's alternating, then let the eye do the "averaging". If the eye "loses" stuff, so be it... that's what would happen with the original, too.
Andrew Davie Posted September 3, 2021

2 minutes ago, Andrew Davie said:
Well, I disagree here that it's your job or correct to do this...

I'm not sure what I'm suggesting is going to be better. But I'd really like to see a comparison between the two!
Thomas Jentzsch Posted September 3, 2021 (edited)

3 hours ago, Andrew Davie said:
Well, I disagree here that it's your job or correct to do this. If it were alternating, I'd want the effect to be exactly the same to the eye as it would be without adjustment. That is, we are not looking to "fix" things for the eye.

But that's exactly what you should do. The signal bandwidth is massively reduced on a 2600 display, so you have to make the best possible use of it. E.g. you should do chroma subsampling to take "advantage of the human visual system's lower acuity for color differences than for luminance". And that's already done here: the colors have a much lower resolution. Also, the formulas used for converting RGB into grayscale (luma) prioritize green a lot, e.g. Y = 0.299·R + 0.587·G + 0.114·B (Rec. 601).
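As a concrete reference, here is a minimal sketch of the green-heavy luma conversion Thomas describes, using the Rec. 601 coefficients (the function name is illustrative, not taken from the encoder):

```cpp
#include <cassert>

// Rec. 601 luma: the eye is most sensitive to green, so green dominates.
// The encoder's 0.3/0.6/0.1 mono weights are a rounded version of these.
float rgbToLuma(float r, float g, float b)
{
    return 0.299f * r + 0.587f * g + 0.114f * b;
}
```

The coefficients sum to 1.0, so pure white maps to full luma and a pure green pixel reads as brighter than an equally intense red or blue one.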
Andrew Davie Posted September 3, 2021

1 minute ago, Thomas Jentzsch said:
But that's what you should do. The signal bandwidth is massively reduced on a 2600 display, so you have to make the best possible use of it.

Yes, I'm sure you're right. But let's have a look-see at some test examples anyway?
Thomas Jentzsch Posted September 3, 2021

Sure. This is new territory for the 2600, so we have to experiment.
Thomas Jentzsch Posted September 3, 2021 (edited)

After my brief exchange with Andrew above, and after watching the latest Dragon's Lair example again, I started to wonder if a different sprite alignment would work even better. One would have to test. The current horizontal color resolution is very low, because it is only updated every 8 horizontal pixels (which is about 13 vertical pixels). Maybe it would work better if the sprites are only shifted by 4 pixels back and forth. Like this:

Even frame:
|## ## ## ## ## |
| ## ## ## ## ## |

Odd frame:
| ## ## ## ## ##|
|# ## ## ## ## #|

If the approach works, this should double the horizontal color resolution (for the human eye) at the cost of the already relatively high vertical resolution. Overall this should further reduce the color blockiness of the image and make the thin vertical stripes less noticeable (I hope). The problem is that flicker will probably increase a bit. And you would have to feed six sprites in the 2nd odd-frame line. The latter could be avoided if the horizontal size is reduced by 4 pixels (removing the left single # in my example). I am sure there are more problems to overcome, and maybe in the end it doesn't work well. But I wanted to share my idea here.

BTW: HMOVE would become easier; you can use a constant value of $C0 for +/-4 pixel movement. That saves 11 cycles every 2 lines.
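A toy way to see why the 4-pixel shift should double the perceived horizontal color resolution: within one frame a color sample lands every 8 pixels, but alternating a 4-pixel phase between even and odd frames interleaves the sample positions, so the eye integrates one every 4 pixels. A hypothetical sketch (only the 8-pixel color-update period is taken from the post; the rest is illustrative):

```cpp
#include <set>

// Color update positions across one scanline for a given frame phase,
// assuming one color sample per 8-pixel sprite block.
std::set<int> colorSamplePositions(int phase, int width)
{
    std::set<int> s;
    for (int x = phase; x < width; x += 8)
        s.insert(x);
    return s;
}
```

Merging the even-frame (phase 0) and odd-frame (phase 4) sets gives a sample at every multiple of 4, twice the per-frame density.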
rbairos Posted September 3, 2021 (Author, edited)

10 hours ago, Andrew Davie said:
This looks much better to me. The yellow/white are now well separated.

So here's the same clip with no enhancements. You'll notice it's darker, as the checkerboard flicker removes 50% per frame. That might look better with some CRT glow, though. Moving forward, I might still add a little brightening and contrast, and a little darkening of the edges, but hold off on the individual R/G/B palette stretching, which brings out detail but looks unnatural.
rbairos Posted September 3, 2021 (Author, edited)

6 hours ago, Andrew Davie said:
Yes, I'm sure you're right. But let's have a look-see at some test examples anyway?

I should get you to download and compile the encoder from the GitHub. ColorizeTOP.cpp compiles into a DLL, which is the plugin I wrote for the free version of TouchDesigner. In particular, you could add more weighting methods. Currently I'm using the last one, 'MonoBack', which weights background pixels minimally on top of the monochrome luma value. Though that last multiply by the distance to the background color should really be a separate option, applicable to the first 3. Send me a message if you're interested; I'd be happy to walk you through the setup. Once set up, you could examine differences in realtime. Here's the relevant folder: https://github.com/lodefmode/moviecart/tree/main/colorize

    switch (weightMethod)
    {
        case Weight_Mono:
            weight = r*0.3f + g*0.6f + b*0.1f;
            break;

        case Weight_Luminance:
            weight = RGBtoLum(r, g, b);
            break;

        case Weight_Hue:
            weight = RGBtoHue(r, g, b);
            break;

        case Weight_MonoBack:
        default:
        {
            // same mono weights, but scaled by the distance from the
            // background color so background pixels count for little
            weight = r*0.3f + g*0.6f + b*0.1f;
            float dist = colorDist(npixel, backColor);
            weight *= dist;
            break;
        }
    }
rbairos Posted September 3, 2021 (Author)

3 hours ago, Thomas Jentzsch said:
After my brief exchange with Andrew above, and after watching the latest Dragon's Lair example again, I started to wonder if a different sprite alignment would work even better...

Yah, it's an interesting variation. Before I knew about the full 8-pixel backshift glitch, I considered similar options to get around it. Many resulted in 'swimming' chevron artifacts in my simulations. The most extreme example I set up was 1-pixel single columns moving horizontally (every 8 pixels). Flicker was extreme, but you got full color across! You're basically saying: instead of the checkerboard patterns on the left (and their inverse), use the 2 on the right (and their inverses), correct? I might even have created a test of that in the past.
Maybe it would look better now with the background color improvements. One thing I was thinking this week was: let *each line's* offset be randomly selected. Very curious what that would result in.
Thomas Jentzsch Posted September 3, 2021

Yes, that's the idea. Effectively it will look more like this:
sn8k Posted September 3, 2021

Holy shit, so it's even better now. Glad I waited.
Thomas Jentzsch Posted September 3, 2021

31 minutes ago, rbairos said:
weight = r*0.3f + g*0.6f + b*0.1f;

Those are about the same values as I posted above, just rounded.
Andrew Davie Posted September 3, 2021

43 minutes ago, rbairos said:
So here's the same clip with no enhancements.

TY. It's quite good, IMHO. Obviously darker/less contrasty. I had a bit of a "what-if?" idea: instead of putting a contrast/brightness filter step in, I'd simply modify the values of the '2600 palette so they were ~1/2 as bright. That way it would choose brighter colours, effectively brightening things at zero cost in extra processing time. But instead of exactly 1/2 (i.e., one frame with colour and the next blank), I'd take phosphor fade into account and maybe set it to something like 75%. Worth a try?
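Andrew's palette trick as a sketch: rather than filtering the input frames, darken the palette the encoder matches against, so the matcher picks brighter real colors for free. The 0.75 factor follows his phosphor-fade guess and is an assumed tuning constant, not a measured value; names are illustrative:

```cpp
#include <cassert>

// Assumed tuning constant from the discussion: somewhere between 0.5
// (pure 50% on/off flicker) and 1.0, nudged up to model phosphor fade.
constexpr float PHOSPHOR_FACTOR = 0.75f;

// Scale one 0-255 palette channel. Applied once to the whole palette
// at startup, so the per-pixel matching cost is unchanged.
unsigned char scalePaletteEntry(unsigned char c)
{
    return static_cast<unsigned char>(c * PHOSPHOR_FACTOR + 0.5f);
}
```

Because the reference palette is now darker, any source pixel matches a nominally brighter 2600 color, which is the "brightening at zero cost" effect.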
Andrew Davie Posted September 3, 2021

40 minutes ago, rbairos said:
I should get you to download and compile the encoder from the GitHub.

I'd be delighted to do that, thank you, but...

40 minutes ago, rbairos said:
ColorizeTOP.cpp compiles into a DLL, which is the plugin I wrote for the free version of TouchDesigner.

I don't think macOS handles DLLs; would I need to do anything different for this OS?
rbairos Posted September 3, 2021 (Author, edited)

51 minutes ago, Andrew Davie said:
I'd be delighted to do that, thank you, but... I don't think macOS handles DLLs; would I need to do anything different for this OS?

TouchDesigner works for Mac as well. Plus, the GitHub folder has the Mac setup files and an earlier compilation output (from before the background changes). I don't work on Mac myself, but some of my colleagues do, and they answer all my questions. A reminder to myself, though: this just simulates and outputs image/movie files; the bin files it optionally creates still aren't supported by the emulator (I have to work on that soon, change the kernel, etc.). I still need to convince myself I haven't done something silly in my simulation and this is all just a dream. Will get that out soon.
rbairos Posted September 3, 2021 (Author, edited)

56 minutes ago, Andrew Davie said:
TY. It's quite good, IMHO. Obviously darker/less contrasty.

Maybe, but that extra preprocessing is done on the GPU anyway, and has super low impact. The real slowdown is the dithering algorithm itself, which isn't parallelizable. I played around with a 16-meg lookup table to avoid 'find closest', but the overall improvement was still minimal. That being said, it runs at about 20%-30% of realtime without background selection, which is pretty interactive for testing. Selecting the background per line slows it down to about a frame every 5 seconds (2%, as shown in the output). I haven't spent much time looking for optimizations yet.
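For the curious, a hedged sketch of the 16-meg lookup-table idea mentioned above: precompute the nearest palette index for every 24-bit RGB triple once, so the dither inner loop's "find closest" search becomes a single array read. All names and the toy palette are illustrative, not the encoder's actual code:

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

struct Rgb { uint8_t r, g, b; };

// Brute-force nearest palette entry by squared RGB distance.
int nearestIndex(const std::vector<Rgb>& pal, int r, int g, int b)
{
    int best = 0;
    long long bestD = -1;
    for (std::size_t i = 0; i < pal.size(); ++i)
    {
        long long dr = r - pal[i].r, dg = g - pal[i].g, db = b - pal[i].b;
        long long d = dr * dr + dg * dg + db * db;
        if (bestD < 0 || d < bestD) { bestD = d; best = (int)i; }
    }
    return best;
}

// 256^3 entries at one byte each = 16 MB, matching the size in the post.
// One byte per entry is enough for the 2600's 128-color palette.
std::vector<uint8_t> buildLookup(const std::vector<Rgb>& pal)
{
    std::vector<uint8_t> lut(256u * 256u * 256u);
    for (int r = 0; r < 256; ++r)
        for (int g = 0; g < 256; ++g)
            for (int b = 0; b < 256; ++b)
                lut[(std::size_t(r) << 16) | (g << 8) | b] =
                    (uint8_t)nearestIndex(pal, r, g, b);
    return lut;
}
```

As rbairos notes, this helps less than expected when error-diffusion dithering itself is the serial bottleneck: the table removes the search cost but not the pixel-to-pixel dependency.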
Thomas Jentzsch Posted September 3, 2021

5 hours ago, rbairos said:
One thing I was thinking this week was: let *each line's* offset be randomly selected. Very curious what that would result in.

Well, you would have to encode in sync. The colors of the nearby pixels affect the color perception of the current one.
stephena Posted September 3, 2021

4 hours ago, rbairos said:
the bin files it optionally creates still aren't supported by the emulator (have to work on that soon, change the kernel etc).

When you commit the changes to Stella, you can now create a PR against master, since I've merged the lodefmode-moviecart branch.
macdlsa Posted September 4, 2021

On 9/3/2021 at 3:21 AM, rbairos said:
I'll actually be doing a live presentation of some of the projects I've worked on over the years for my work, in mid-October...

Oh, thank you, rbairos. But I prefer to stay tuned to your (great!) work, especially since I don't have the right skills on the subject.