
turboxray

Members
  • Content Count

    129
  • Joined

  • Last visited

Community Reputation

66 Excellent

About turboxray

  • Rank
    Chopper Commander

  1. Are you also one of those people that doesn't care that he took my early Megaman PC-Engine CD build and sold it as a "repro" without my permission? Or that he tried to pass off Sapphire repros as real/official 2nd-run prints?
  2. So you understand wanting to play on the original hardware, with a physical game, but fail to understand that some people are willing to pay over a hundred dollars for a game? Especially in the world of PCE, that's rife with contradiction haha. You've got basically three tiers: original, bootleg, and finally ROMs. Using your logic, I could just as easily ask why even pay for a bootleg when you can use ROMs/images. I can totally understand how someone would want a physical game to insert/remove, just as much as I can understand how someone would want the original. It's all about that holistic retro gaming experience, and for some.. imitation is not enough. It's kinda silly to judge or be amazed at how other gamers spend their money haha.
  3. To be honest, for most games, yes. It's 41% more vertical resolution (quick math below the list). And given that the GB had that pixel gap around the whole pixel (and not the sub-pixel), it makes the res look even higher in real life. Sort of like how scanlines (gaps) have a similar effect. Ninja Gaiden 3 looks like a horrible pixelated mess on the Lynx. You've got 16 colors on screen.. there's not much a 4k-color palette is going to do for you haha (let alone the actual screen even being capable of displaying half that). With 16 colors, you've got room for maybe 2 shades of a color. At least with the Game Gear you have 31 colors on screen.. and the same GB res. Too bad it used that same horrible sound chip as the original Mark I.
  4. Yeah, from what I've heard of the album, he doesn't really do anything with the PCE sound. It sounds a little tinny/tinkery in the non-sample parts (for this type of music, the PCE excels at filter-type effects too). And for the samples, he didn't even maximize their quality. Given his Genesis stuff, I expected a LOT more than this. I haven't heard/known about his SNES stuff though.
  5. I did the nes2pce games on the PCE. I made a special version of Megaman CD with CD audio. It also has an 'easy' mode setting. Let me know if you're interested and I'll upload it... somewhere haha.
  6. You took a pass on the CD version because you played the SMS version? Oh man, someone missed out hahaha
  7. I'm still fiddling with the settings, so I'm not gonna post any numbers yet. But I did some inspecting of the video path. The 240p pass-through is working on the RetroTINK, and the HDMI converter is creating a 240p component signal. I was able to look at it on my vectorscope and verify that it's 240p. I just don't know why the OSSC doesn't like it. It likes the RGB2COMP 240p component signal just fine. Any ideas? I also found two other things with the RetroTINK 2x and the HDMI converter. First, there's a 7.5 IRE pedestal on Y (which, honestly, I was expecting). I think the RetroTINK is putting it there, because when I hook the RetroTINK up to my TV directly from the HDMI output (yay! my TV takes a 240p signal over HDMI), I can see it in the border that's supposed to be black in my test ROMs - and it's not. The other, which is the real problem, is that the luma level is squashed! It should be close to 100 IRE, but it's like 70 IRE (rough gain math below the list). And the OSSC doesn't really have nice controls/settings to fix this. The "gain" stuff only gets me so far. In other words, unless I want to totally crush the upper white range, I'm stuck with a poorer dynamic contrast range. I compared this to RGB in on the OSSC, and it's not just my TV (the dynamic contrast range issue). If all I see is the TV, then it's not so bad. But when I run my CRT next to it and see the difference in dynamic contrast range... it makes me sad haha. I also know it's not so much the TV as it is the OSSC (not being able to compensate), because I have an incredible contrast range setup for my MiSTer with PCE emulation. I might end up building a log-amp circuit that sits between the Y input on the OSSC and the Y output from the HDMI-to-component adapter.
  8. I think it probably should be pointed out that while 10 sprites per line vs 8 sprites per line seems nice, it's actually better than that. Even though both are 8 pixels wide, from a screen real-estate perspective the GB's 10 sprites can cover 50% of the screen horizontally, while the NES's 8 can only cover 25% due to its higher res (coverage math below the list). So lower res on the GB, but more screen coverage for sprites in your pixel mileage. Another downside is that vblank is shorter on the GB than on the NES (almost half, I think). And you can't "force vblank" to get more VRAM bandwidth. So it has less room for updates during vblank, but then again it has a general DMA, so that might balance it out..
  9. I finally got my video chain set up. It took quite a bit of work to get it just right. Part of that was not being familiar with what some of the settings meant. The RetroTINK 2x Classic "passthrough" mode does not work in my video chain, but that's not a big deal because 2x mode worked fine. I found the picture too bright (as in, the "pedestal" was removed and bumped up). It's possible the HDMI-to-component converter is doing this (I found one on Amazon that looked exactly like the recommended one). But that's not a problem, because "G/Y" offset is actually brightness and "G/Y" gain is actually contrast. Likewise, R/R-Y and B/B-Y are color saturation. So now you have brightness and contrast control for saturation, which is REALLY nice. Just to note, it does not work like that if you have an RGB source; this is for component only. And the pre-gain ADC is just that. So with that in mind, I used the 240p test suite, and a few known games, and was able to calibrate the whole video chain to match my calibrated CRT. My scanline options are a little different from what you have, but look pretty close. I'm just amazed! I cannot express how impressed I am with this video path for composite video. I measured the video lag against my CRT, using the 240p test suite and my phone capturing at 240fps. I'm at about 1.4 frames of lag (converted to milliseconds below the list)! And I'm pretty sure the 1-frame part comes from the TV itself, so about 0.4 frames of lag for the chain. I'll retest with straight RGB to the OSSC to the TV and see what the lag is. This is also a great option for people that want to capture from real hardware too. I'm gonna pick up a $200 HDMI PCIe capture card now.
  10. The RAM thing is irrelevant, since RAM was added to NES carts (and can be added to GB carts) and was typically 10K total for the NES. So are you comparing the GB or the GBC? The 10 sprites a scanline is nice, but you're also limited to a total of 40 sprites instead of 64. And like the NES, you're limited to 8x8 or 8x16 sprites, but not both at the same time. The GB sound has like one nice advantage and quite a few disadvantages. I guess two advantages if you count stereo volume on the GB. Obviously the lower res is a disadvantage, but the slower refresh rate of the GB LCD is both a disadvantage and an advantage (for tricks that exploit the slow refresh rate). The 4MHz CPU is effectively 1MHz, and it's not a real Z80. So I'd say it's less powerful in this department. The GB does have some nice advantages such as DMA (the NES only has sprite table DMA), but then again the NES has all VRAM on cart - so you can do things with it that even the SNES/Genesis can't do. So nice on the GB, but the NES still has the edge. The GB has an actual interrupt-capable hsync 'raster' effects system, but later NES carts with an onboard interrupt timer give you that as well. Plus, the NES was able to do split-screen vertical scrolling with a mapper. You can do this on the GB but it kills CPU performance, so outside of demos it's not done AFAIK. The same goes for sprite multiplexing - the GB is capable of it, but it's a potential CPU performance killer. The NES is definitely more powerful, but the GB is a 'clean' design and it's easier to get more out of the hardware.. without additional hardware. I think a lot of the GB's drawbacks are fixed with the GBC. Faster clock, bigger palette, etc. The GBC also has an interesting mode where you can actually use 4 unique colors per tile instead of the typical 3 + 1 common for all tiles. And while the NES has 3+1 colors for every 16x16 pixels, the GBC has 3+1 or 4 for every 8x8 pixels. And while the NES has emphasis bits that can be changed mid-screen, the GBC has real palette update support per scanline (although most games just use this in "high color mode" stills.. but it can be used in-game, like how MSX games use it to get more colorful sprites). The GBC has twice the clock speed, so hsync raster effects are less of a CPU resource hog - making some of those nice tricks more applicable in-game. And while the NES has unlimited banked VRAM that can be changed mid-screen, the GBC does have more banks of VRAM than the original GB - which IS nice. Not quite as nice as the NES, but in application it brings it closer. I think against the GB, the NES holds pretty much all the advantages if you're willing to include on-cart hardware (which is the norm for the NES). But I think when it comes to the GBC, it's more a selection of trade-offs between the two.
  11. I picked up a vectorscope (a Videotek 675-TVM) that can look at component as well as composite video. Specifically, I wanted to look at the color part of the signals. That tells you pretty much all of the color characteristics of the colors in the signal. So anyway, I had purchased an RGB SCART cable for the SSDS3, and a RetroTINK RGB2COMP transcoder. I had a 20" CRT that took component input, so I was curious about how it looked. Plus, besides graphics development on the PCE, I wanted to further inspect games that have a more noticeable difference between the PCE's YUV palettes vs the RGB output mod that a lot of people use (or emulators). But I ran into this issue where the color was just waaaayyy too hot on my SDTV. I had to turn the color balance down to 30 (50 was "neutral"), and even then. It didn't really make sense why it was so 'hot'. After looking at it on the scope, I got the idea of just terminating Pb and Pr to 75 ohms (with a T connector and a terminator), but leaving Y alone (see the termination math below the list). Sure enough, it completely fixed the issue. I need to investigate further, but I think the issue is with the TV and not the RGB output or the RGB2COMP.. but wow! What a difference. So much so that colors look soo much closer to the original composite output of the PCE.. but with the high end of the colors having that extra saturation. It was like the best of both composite and RGB: removal of the over-saturated warmth of the palette in the lows and mids (flesh tones look correct, like on the original PCE composite), but colors in the high end are very "rich and vibrant". Like I said, it's unclear if this is just an edge case with my TV set, as I don't have another set of component options to compare it to (yet. I'll be getting my Genesis next week. And a component cable for my MiSTer), but it might be an option for others out there having similar issues (for RGB to component). I will be receiving a 20M2U PVM this week, so I'll be able to verify on that as well.
  12. Nice write-up! You've convinced me to order both
  13. An old audio demo I did back in 2011 for the PCE: single channel, which is nice in that you don't have to pair channels just to get a higher sample bit depth (a pairing sketch is below the list).
  14. Yeah, I know some of the history, but it looked like everything got resolved (and this was supposedly a new hardware revision - fixing whatever). Yeah, I definitely encountered that attitude. And I'm not one to take that kinda crap, so I definitely called them out on it on the Discord channel. But yeah, TO is trash, but I needed a fast way to do development for PCECD stuff, and to double as a nice quick-play device. Also, another side note on the SSDS3: don't enable "in-game trigger". Apparently it causes random issues with CD games. There's a warning message that it's not compatible with some games, but that's about it (and no explanation of what it means by that). It looks like they have a hook in the vblank routine where they spy on the controller code for a soft reset. Something along those lines (it's definitely a hook), but it was causing CD RAM to get corrupt values. Leaving it disabled cleared up all the issues I had with CD games. The issue isn't the approach, but apparently their buggy code that interfaces with said hook. smh. Does anyone remember the PCE Tototek card? It had Game Genie-type cheats that you could turn on/off. Outside of a few games, it worked really well. Totally a missed opportunity on the SSDS3 side not to implement that - and they could have done it even more easily by design.
  15. I picked up a RetroTINK RGB2COMP converter. I had to buy a SCART cable for the SSDS3 to use it, though. My CRT takes component, so I wanted that option - not necessarily for PCE, but I figured might as well haha. It works pretty well with the SSDS3, but wow, the saturation is pretty hot. I had to turn the "color" down to 34 on the TV set (normally set to 60), and even then some games were overblown or too hot for some colors. Part of it could be my TV set (I suspect it is), but I don't have another component cable set to try it on. I do plan on buying a PVM soon though, so I can see how it compares then. Oh, in regards to the SSDS3 - I need to mod it with a volume adjustment for the chip-generated audio. I was playing Sapphire and the chip-generated SFX are just too loud (and annoying at that level). In Gate of Thunder the CDDA music just isn't loud enough to enjoy. I could manually boost the CDDA tracks, but that's annoying haha (and it runs into the clipping trade-off sketched below the list). I asked, and apparently the way they designed the SSDS3, they never anticipated this issue? So it doesn't support it. They never added an attenuation control on the incoming chip sound for their mixer. Which is fine, but apparently you also can't boost CDDA and ADPCM over the 100% level either (which would also solve the problem). Dshadoff said they previously had clipping issues in their CDDA mixer, so I guess that's why there's no room. So yeah, I find this really odd. The SSDS3 is what.. 3 years old now? I bought their newest hardware revision (released this year). I understand that some people have no issues with this balancing of chip-generated stuff against CDDA, but some games really do have glaring issues with it. I mean, over the past two decades, people have been manually boosting CDDA tracks on CD-Rs for this very reason. And emulators have mixing options. The response I got was, "well, can you adjust it on the real hardware?". A rather condescending rhetorical question. They've also said they've had no complaints. I find that really hard to believe. It's a little annoying that I have to mod in a feature, on a $200 product, that should have been there from the beginning. Even more annoying that I was told to go use a Pi + emulator or buy an Analogue Duo if I wanted that feature haha.
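
The resolution math from post 3, as a quick sketch in C. The panel sizes are the commonly cited ones (GB 160x144, Lynx 160x102); they're an assumption here, not something stated in the post itself:

    #include <stdio.h>

    /* Vertical-resolution comparison from post 3: GB vs Lynx. */
    int main(void)
    {
        int gb_lines   = 144;   /* Game Boy:   160x144 (assumed spec) */
        int lynx_lines = 102;   /* Atari Lynx: 160x102 (assumed spec) */

        /* 144 / 102 = 1.41..., i.e. ~41% more vertical resolution */
        printf("GB has %.0f%% more lines than the Lynx\n",
               100.0 * gb_lines / lynx_lines - 100.0);
        return 0;
    }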
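
Rough math behind the squashed-luma complaint in post 7, using the levels stated there (~70 IRE peak on a 7.5 IRE pedestal). The point of the sketch is that gain alone also multiplies the pedestal, which is one way to read why the OSSC's gain control "only gets so far":

    #include <stdio.h>

    /* Luma-level math from post 7 (stated levels, not re-measured). */
    int main(void)
    {
        double pedestal = 7.5;    /* black level (IRE) */
        double peak_in  = 70.0;   /* measured peak white (IRE) */
        double peak_ref = 100.0;  /* target peak white (IRE) */

        /* Ideal fix: remove the pedestal first, then apply gain. */
        double gain_after_offset = (peak_ref - pedestal) / (peak_in - pedestal);

        /* Gain alone also scales the pedestal, lifting black and
           eating into contrast at the bottom of the range. */
        double gain_only = peak_ref / peak_in;

        printf("gain after offset: %.2fx\n", gain_after_offset);  /* ~1.48x */
        printf("gain alone: %.2fx, black rises to %.1f IRE\n",
               gain_only, pedestal * gain_only);                  /* ~1.43x, ~10.7 IRE */
        return 0;
    }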
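
The sprite-coverage figures from post 8, worked out. Both machines use 8-pixel-wide sprites; the screen widths (GB 160, NES 256) are the standard specs:

    #include <stdio.h>

    /* Per-scanline sprite coverage from post 8. */
    int main(void)
    {
        int gb_sprites  = 10, gb_width  = 160;   /* Game Boy */
        int nes_sprites =  8, nes_width = 256;   /* NES */

        printf("GB:  %d%% of a scanline\n", 100 * gb_sprites  * 8 / gb_width);   /* 50% */
        printf("NES: %d%% of a scanline\n", 100 * nes_sprites * 8 / nes_width);  /* 25% */
        return 0;
    }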
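
The lag figures from post 9 converted to milliseconds, assuming a 60 Hz signal (an assumption; the post only gives frame counts):

    #include <stdio.h>

    /* Frame-to-millisecond conversion for post 9's lag numbers. */
    int main(void)
    {
        double frame_ms  = 1000.0 / 60.0;   /* ~16.7 ms per frame at 60 Hz */
        double total_lag = 1.4;             /* measured total, in frames */
        double tv_lag    = 1.0;             /* portion attributed to the TV */

        printf("total: %.1f ms, chain only: %.1f ms\n",
               total_lag * frame_ms, (total_lag - tv_lag) * frame_ms);
        /* total ~23.3 ms, chain ~6.7 ms */
        return 0;
    }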
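
Why the 75-ohm terminators in post 11 roughly halve the Pb/Pr amplitude, in an idealized model. The assumption is a standard back-terminated 75-ohm source feeding a TV input that isn't internally terminated; the numbers below are illustrative, not measured:

    #include <stdio.h>

    /* Idealized termination math for post 11: input amplitude is set by
       the divider formed by the source impedance and the load. */
    int main(void)
    {
        double v_open = 1.4;      /* open-circuit source swing (V), 2x nominal */
        double r_src  = 75.0;     /* source back-termination (ohms) */
        double r_hiz  = 10000.0;  /* unterminated high-Z TV input (illustrative) */
        double r_term = 75.0;     /* with the 75-ohm terminator added */

        printf("unterminated: %.2f V\n", v_open * r_hiz  / (r_src + r_hiz));   /* ~1.39 V */
        printf("terminated:   %.2f V\n", v_open * r_term / (r_src + r_term));  /* 0.70 V */
        return 0;
    }

With no termination, the color-difference channels arrive at roughly twice their nominal 0.7 V swing, which the TV reads as doubled saturation.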
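
A hypothetical sketch of the channel-pairing trick that post 13 says single-channel playback avoids. This is not the demo's actual code, and the 5-bit split and 1/32 volume ratio are assumptions about the technique in general: split each sample into a coarse and a fine part, and play the fine part on a second channel attenuated to the weight of the low bits.

    #include <stdint.h>

    /* Hypothetical pairing sketch (NOT the demo's code): two low-depth
       channels approximate one higher-depth sample. */
    void split_sample(uint16_t s10,     /* 10-bit sample, 0..1023 */
                      uint8_t *coarse,  /* top 5 bits -> channel A, full volume */
                      uint8_t *fine)    /* low 5 bits -> channel B, at 1/32 volume */
    {
        *coarse = (s10 >> 5) & 0x1F;
        *fine   =  s10       & 0x1F;
        /* coarse*32 + fine reconstructs the sample acoustically, provided
           channel B is attenuated to 1/32 of channel A's level. */
    }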
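
And on post 15's mention of manually boosting CDDA tracks: a minimal sketch, assuming 16-bit signed PCM, of why any fixed boost trades headroom for peaks - the same clipping issue Dshadoff described in the mixer:

    #include <stdint.h>

    /* Minimal boost-with-saturation sketch (assumed 16-bit signed PCM). */
    int16_t boost_sample(int16_t s, double gain)
    {
        double v = s * gain;
        if (v >  32767.0) v =  32767.0;   /* clip: a louder mix costs the peaks */
        if (v < -32768.0) v = -32768.0;
        return (int16_t)v;
    }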