
PS5 capped at 32 Gbps?


glazball


I'm curious whether any PS5 owners can confirm exactly what A/V output you're getting. I'm reading that even though HDMI 2.1 can handle 48 Gbps, the PS5 is capped at 32 Gbps (and the Xbox Series X is capped at 40 Gbps). Can anyone confirm this on their setup? I'm also curious whether Variable Refresh Rate (VRR) is enabled on the PS5 yet.

 

I'm looking to buy a new TV (and a PS5) this year and hope to maximize A/V quality from the PS5. Ideally, for 4K you'd want the full 48 Gbps, which allows 4K @ 120 Hz with 4:4:4 chroma and 12-bit color. So the big question is: can the PS5 output at 48 Gbps, and if not, will Sony update the firmware later to allow it?

 

Here's an article about the cap, found via a quick Google search:

https://www.gamesradar.com/ps5-hdmi-21-capped-at-32gbps-compared-to-xbox-series-x-40gbps/
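For anyone wondering where numbers like 48 Gbps come from: they roughly track the raw pixel data rate at a given resolution, refresh rate, bit depth, and chroma format. Here's a back-of-the-envelope sketch (Python) of that arithmetic; it deliberately ignores HDMI blanking intervals and link encoding overhead, which push the real required link rate somewhat higher, so treat it as a lower bound rather than the spec's exact accounting:

```python
def video_bandwidth_gbps(width, height, refresh_hz, bits_per_component,
                         chroma="4:4:4"):
    """Raw (uncompressed) pixel data rate in Gbps, excluding blanking."""
    # Average color components carried per pixel for each chroma format:
    # 4:4:4 keeps all 3, 4:2:2 averages 2, 4:2:0 averages 1.5.
    components = {"4:4:4": 3, "4:2:2": 2, "4:2:0": 1.5}[chroma]
    bits_per_pixel = bits_per_component * components
    return width * height * refresh_hz * bits_per_pixel / 1e9

# 4K @ 120 Hz, 12-bit, 4:4:4 -- the "full fat" case discussed above
print(f"{video_bandwidth_gbps(3840, 2160, 120, 12):.1f} Gbps")  # 35.8

# 4K @ 120 Hz, 10-bit, 4:2:2 -- fits comfortably in a 32 Gbps cap
print(f"{video_bandwidth_gbps(3840, 2160, 120, 10, '4:2:2'):.1f} Gbps")  # 19.9
```

The takeaway: the 12-bit 4:4:4 case already needs ~36 Gbps before overhead, which is why it calls for the full 48 Gbps link, while dropping to 4:2:2 or 10-bit brings the signal well under the lower caps.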


I guess I shouldn't expect a $500 console to have all the bells and whistles, but it sure would be good to know what Sony has planned so that I can plan my purchase.  Some of the 2021 TVs I'm looking at are Sony's own A90J and A80J OLEDs, which are supposed to have VRR.  Does Sony plan to implement VRR on the PS5 with the launch of their newest TVs?  I've read that Sony promised VRR with an update to their 2020 TVs, but from what I can gather it hasn't happened yet (and may never happen).  I don't want to base any major decisions on promises or speculation, so I guess we'll all just have to wait and see.


I don't think bandwidth will matter much, since 4K@120Hz is not something I see happening on consoles (nor on PCs any time soon) outside of some simpler titles.

 

Lack of VRR is much more worrying. I do my AAA gaming on a PC these days and can't imagine not being able to use adaptive sync.


2 hours ago, youxia said:

I don't think bandwidth will matter much, since 4K@120Hz is not something I see happening on consoles (nor on PCs any time soon) outside of some simpler titles.

 

Lack of VRR is much more worrying. I do my AAA gaming on a PC these days and can't imagine not being able to use adaptive sync.

I don't know... I've been stuck at 60 Hz since forever and it doesn't bother me any. Then again, I don't play competitive online games either, and I only play the single-player campaigns in FPS-type games. As long as I'm getting my 60 fps, I'm good.

 

But then again, I come from the generation where getting anything over 20 fps in Quake II was considered excellent when it was first released.

 


My point was about VRR and the tearing/stutter that happens when the console's framerate doesn't match the TV's refresh rate. In the PS2-PS3 console era it wasn't a big problem because games were locked at 30 fps; few could do 60. But now consoles are fairly powerful and can easily go over 30 fps, even 60 fps (depending on the game), and TVs can also provide high refresh rates. Without VRR this causes ugly tearing (this has always been present on PC, even in the old Quake days, but we used V-sync to tame it then, and now monitors with adaptive sync).


4 hours ago, youxia said:

My point was about VRR and the tearing/stutter that happens when the console's framerate doesn't match the TV's refresh rate. In the PS2-PS3 console era it wasn't a big problem because games were locked at 30 fps; few could do 60. But now consoles are fairly powerful and can easily go over 30 fps, even 60 fps (depending on the game), and TVs can also provide high refresh rates. Without VRR this causes ugly tearing (this has always been present on PC, even in the old Quake days, but we used V-sync to tame it then, and now monitors with adaptive sync).

Yes, I know this. My understanding is that most of today's console games are coded to target 60 fps, or very close to it. At least that seems to be the case with most PS5 games. So if that's the only target they're going for, it wouldn't make much sense to get a display that goes beyond it, is all I meant. I know all about tearing, and I live with it constantly on my PC, as I have a first-generation 4K monitor that can only do 60 Hz. My video card can go way above that in framerate, of course, but if I enable V-sync the whole experience becomes laggy with movement etc. So I leave V-sync off and live with the tears. Again, I'm used to it, as I've played that way for as long as I can remember on PC.


On 2/5/2021 at 9:01 AM, youxia said:

I don't think bandwidth will matter much, since 4K@120Hz is not something I see happening on consoles (nor on PCs any time soon) outside of some simpler titles.

 

Lack of VRR is much more worrying. I do my AAA gaming on a PC these days and can't imagine not being able to use adaptive sync.

4K@120Hz is exactly what these consoles are already outputting. Of course the games aren't being rendered at native 4K at that refresh rate, but the consoles upscale to 4K, so 4K is what's being sent over the HDMI cable.

 

I think the linked article may be causing more confusion than it clears up. The 4:2:2 chroma maximum it mentions isn't going to be noticeable for most people.


@DJ Clae Fair enough, I can concede this point. I guess the topic of the myth of real 4K@120Hz can be left for another occasion.

 

That raises another question, though: if the content is upscaled, which already means a loss of quality, is this 10- vs 12-bit difference actually relevant (as in, visible)? Would it be relevant even for the few games actually able to output real 4K@120Hz?
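For a sense of scale on the 10- vs 12-bit question, the difference is purely in how finely each color channel is quantized; a quick sketch:

```python
# Distinct levels per color channel at common bit depths, and the
# quantization step as a fraction of full scale. Each extra bit
# doubles the level count, so 12-bit steps are 4x finer than 10-bit.
for bits in (8, 10, 12):
    levels = 2 ** bits
    print(f"{bits}-bit: {levels} levels, step = 1/{levels - 1} of full scale")
```

In practice that extra precision mostly shows up as reduced banding in smooth gradients (skies, fog), which is exactly the kind of detail upscaling and compression tend to mask anyway.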


Yeah, good point. I think the difference in resolution and color depth would go completely unnoticed by most, especially given the fidelity the consoles can actually render for most games.

 

My TV only handles 120 Hz at 1080p, so I've set my Series X to output 1080p just to try 120 Hz out. I know 120 Hz is getting all the hype this generation, but I'm not really able to tell the difference between 60 and 120, so I don't bother with it.


It most likely depends on the person, but I suppose the color difference would be the hardest to spot; that's at least what I struggle with. Dynamic resolution would probably bother me, though that also depends on how far from the TV one is sitting.

 

I can definitely tell - and feel - the difference between 60 and 120 fps. That's on a PC, though; perhaps on a console/TV it'd be a bit different. Consoles have always used lots of tricks to make inferior framerates bearable. I do my AAA gaming on a PC atm, but I wouldn't mind trying a modern console setup, even if only out of technical curiosity. But prices are too crazy now, and there aren't enough exclusives yet to justify a purchase.

