
Console graphics: are we leveling off?



It seems to me that, in the 8th gen, we are leveling out in terms of graphical power. When new systems come out, killer graphics are all the rage. I'd even go so far as to say that generations are defined (at least in part) by these visual leaps. My point is, the PS4 and One (and OK, Wii U, you too) seem like incremental steps forward in terms of graphical power. What's odd is that the last gen was so long; I thought by now the leap would be insane! But it's nowhere near as defined as, say, SNES to N64, or PS1 to PS2, or nearly any other transition. So what's going on? Has the tech simply leveled off? Comment and discuss, please.

Edited by toptenmaterial

I think that when pixel shaders became popular with the Xbox 360 and PS3, the look of games made a huge jump and became more realistic. Now the limiting factor is world size, character geometry/complexity, and realistic lip sync and movement.

 

The consoles this generation are weird in that they have very, very weak CPUs, which is starting to bottleneck games that don't take the time to spread the CPU load among the 6 available cores. The graphics cards are somewhat reasonable and are a jump from the previous generation, as games like Infamous and Ryse show. It's not bleeding-edge tech anymore, because a high-end graphics card this generation consumes as much power as these consoles do. While they're not as green as the Wii U (and thankfully so), there has to be a sensible limit on power draw and heat.
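As a rough illustration of what "spreading the load" means in practice (a minimal sketch of my own, with a made-up Entity type, not anything from an actual console SDK or engine), here's how a per-entity update loop could be split across six worker threads in C++11:

    // Minimal sketch: split one frame's entity updates across six worker
    // threads. The Entity type and its update() are hypothetical stand-ins.
    #include <algorithm>
    #include <cmath>
    #include <cstddef>
    #include <thread>
    #include <vector>

    struct Entity {
        float x = 0.0f, y = 0.0f;
        void update(float dt) {              // stand-in for real game logic
            x += std::sin(y) * dt;
            y += std::cos(x) * dt;
        }
    };

    void update_all(std::vector<Entity>& entities, float dt, unsigned workers = 6) {
        const std::size_t chunk = (entities.size() + workers - 1) / workers;
        std::vector<std::thread> pool;

        for (unsigned w = 0; w < workers; ++w) {
            const std::size_t begin = w * chunk;
            const std::size_t end   = std::min(entities.size(), begin + chunk);
            if (begin >= end) break;
            // Each thread gets a disjoint slice, so no locking is needed here.
            pool.emplace_back([&entities, begin, end, dt] {
                for (std::size_t i = begin; i < end; ++i) entities[i].update(dt);
            });
        }
        for (auto& t : pool) t.join();
    }

    int main() {
        std::vector<Entity> entities(10000);
        update_all(entities, 1.0f / 60.0f);  // one simulated 60 Hz frame
    }

Real engines use job systems rather than spawning fresh threads every frame, but the point is the same: unless the work is divided across cores like this, most of that weak CPU just sits idle.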

 

The jump is getting smaller, with game resolution, texture resolution, polygon count, cloth physics, fluid physics, hair physics, anti-aliasing and so on getting bumped up, as well as effects like depth of field, chromatic aberration or motion blur being used to emulate a filmic/video look.

 

What is surprising (or not, once you look at console CPUs) is that a card weaker than a PS4's can run games as well as or better than it. Stuff like a Radeon R7 260X (same as the 7790) and an Nvidia GTX 750 Ti achieving better framerates in games is the complete opposite of what happened last generation. To match or surpass an Xbox 360, you needed a higher-end card from the year after to have the game look better (8800 GT, 3870). Hell, even my crappy (but overclocked) GT 640 can run Alien Isolation at 1080p with 30+ fps when I turn shadow mapping down.

Link to comment
Share on other sites

For games that are supposed to have a realistic look, everything should look like they pointed a video camera at the real world instead of the fake-looking "photo-realistic" crap that we keep getting. I require a huge jump in graphics. I'd like my jaw to drop to the floor from awe overload. I want to say "I can't tell the difference between real video footage and what I'm seeing in this game. Everything looks real: grass, trees, buildings, people . . . everything! And the NPCs are acting and reacting like real people with individual personalities and voices."



 

And the media would condemn/blame these realistic-looking 'murder simulators' for every shooting spree... :D



 

And we'll get it some day. And not all that long after, such graphics capability will be built into every computer's motherboard. And then it'll be game over for the graphics chipmaking industry, because they'll just be selling a cheap commodity like any other.

 

I wonder how much of a hurry they're in to get to that day.


It seems to me that, in the 8th gen, we are leveling out in terms of graphical power. When new systems come out, killer graphics are all the rage. I'd even go so far as to say that generations are defined (at least in part) by these visual leaps. My point is, the PS4 and One (and OK, Wii U, you too) seem like incremental steps forward in terms of graphical power. What's odd is that the last gen was so long; I thought by now the leap would be insane! But it's nowhere near as defined as, say, SNES to N64, or PS1 to PS2, or nearly any other transition. So what's going on? Has the tech simply leveled off? Comment and discuss, please.

 

Well damn, how much PS4 programming have you done? What experience do you have with these platforms' power?


As history shows... they don't level off until much later in a console's life cycle. They only get better as programmers get better at making games for them. The Uncharted series on PS3, for example, looks a lot better with each sequel (a LOT). It's unbelievable what these guys can do with what seems to be so little, when in fact it's more than enough. It takes skilled programmers with a desire to achieve these results. Will we see the same improvements over time on current consoles? That's up to the developers, but I'm sure we will... so no, there's no leveling off right now.

 

One more thing: if you really want to see the difference between a current console game and one on PC, you would need a graphics card with 6 GB of memory (or the equivalent in system RAM) and a game designed to use it. Check out the HD texture pack for Shadow of Mordor on Ultra settings. You really need a killer card to use that pack, and only then will you really notice a major improvement.


For myself, the Dreamcast was one of the last consoles to really impress me in terms of just how much of a step up it was from my existing hardware (PS1): sharp, vibrant graphics, rich textures, etc. The PS2 'impressed' on a technical level with GTA3 and its living city, but it wasn't the powerhouse leap Sony had promised, and not that big a leap over the DC (I owned a DC, PS2, GC and Xbox that generation).

 

The 360 and PS3 brought me into 'HD' gaming, and going from playing games over SCART to HDMI, hell, you noticed...

 

 

Now it's just what I'd expect: more lighting effects, more detail, higher resolutions, but nothing wow. 3D gaming didn't attract me, and VR currently lacks the wow factor as well. I guess I need something like Star Trek's Holodeck or Red Dwarf's Better Than Life type of tech to make me sit up and take notice.


The last big graphical change was roughly the PS1 era, when most games switched from pixels to polygons. The only major shifts since have been pushing more polygons and reaching HD resolutions more consistently. Unless the core graphics technology changes, we'll continue to see incremental improvements. I don't know of any new graphics technology on the horizon, so the only thing that will change is delivery methods, e.g., augmented reality that will eventually lead to holographic technology, a la the mythical holodeck. Certainly graphical fidelity will continue to improve and eventually make the idea of the uncanny valley a thing of the past, but I suspect it will all be generated the same way it is now and has been since that last major shift in the mid-90s.


I'm still waiting for advances in game A.I. to reach the point I was expecting by now. I know marketing cannot sell a game based on A.I. routines, but seeing things like AvP on PS3/360/PC take a step back in Alien behaviour A.I. from the PC original I played on my Windows 95 PC years ago really showcased how messed up the industry has become over the past few years.


Slightly off topic, aren't the CPUs of today kind of maxed out? What happened to their evolution? Where are the 4, 5, 6 GHz CPUs? Have they hit a brick wall in terms of manufacturing or what? All I've seen over the last 10 years or so are slower processors with more cores, until even they eventually evolve back up to the 3+ GHz range. And no, I don't require a lecture on the merits of multiple cores vs. fewer cores at higher speeds. I own and use both daily. I will always prefer the latter for day-to-day use.

Today's processors are hot enough, though. I see and deal with a LOT of failures in both PCs and consoles due primarily to cooling issues. I guess it's just not practical or cost-effective to keep this stuff cool, which is why I think consoles have sort of stunted out graphically. Other obvious reasons and the realities of the business aside, of course: cross-platform lowest common denominators, art styles and development tools being the same, etc. Most modern games have looked the same to me for years, regardless of a given system's specs.


The PS4 and Xbox One are also low-end machines today when compared to a PC from a hardware standpoint.

 

When you compare the PS3 or 360 to the PCs of their time, this wasn't the case; they were much higher on the hardware "ladder" than the PS4/Xbox One are today.

 

I'm not saying the PS3/360 beat high-end PCs at the time, but they certainly weren't low-end machines either.

 

The consoles this generation are simply designed to be cheaper than the previous generation's consoles were at launch. I mean, the £420 PS3 launch-day price speaks for itself; the PS4 is a lot cheaper than that was.


Slightly off topic, aren't the CPUs of today kind of maxed out? What happened to their evolution? Where are the 4, 5, 6 GHz CPUs? Have they hit a brick wall in terms of manufacturing or what? All I've seen over the last 10 years or so are slower processors with more cores, until even they eventually evolve back up to the 3+ GHz range. And no, I don't require a lecture on the merits of multiple cores vs. fewer cores at higher speeds. I own and use both daily. I will always prefer the latter for day-to-day use.

 

Today's processors are hot enough, though. I see and deal with a LOT of failures in both PCs and consoles due primarily to cooling issues. I guess it's just not practical or cost-effective to keep this stuff cool, which is why I think consoles have sort of stunted out graphically. Other obvious reasons and the realities of the business aside, of course: cross-platform lowest common denominators, art styles and development tools being the same, etc. Most modern games have looked the same to me for years, regardless of a given system's specs.

Yes, we have reached a brick wall in terms of CPU speed. The theoretical max for silicon CPUs is 8 GHz. The 8 GHz barrier was broken with an AMD FX-8150 Bulldozer on liquid helium. I have that exact same processor from 2011 in my desktop, running 24/7 at 4.2 GHz rendering deep-zoom fractals with a big-assed aftermarket heat sink. My computer is in a sleek black EATX workstation chassis made of stainless steel that weighs about 50 pounds (my mobo is a Gigabyte ATX, but the extra room inside the case really helps improve airflow and keeps things from being cramped). Anyway, my processor is rated at 125 W. The computer pulls 100 W idle and 240-300 W at 100% CPU load, well exceeding the thermal design power, but my oversized heat sink keeps CPU temps cool. It drops from 55°C down to 35°C almost instantly when I unload the CPU from 100% to idle, so the heat sink does its job very well. There's probably slightly more room for additional overclocking on my system, but I don't want to risk instability. I don't have a high-end video card since I don't use the machine for gaming.

 

Intel is pushing past 3 GHz with shorter latency and AMD is pushing past 4 GHz (the latest AMD model hits a top turbo speed of 5 GHz, but that's not enough improvement over my current 4.2 GHz to upgrade), both getting about the same performance per price, but Intel has higher-end desktop processors which outperform AMD (I still miss the days when a 2 GHz Athlon XP could wipe the floor with a 3 GHz P4; now the reverse is true).

 

The fact of the matter is, higher speeds can be achieved using a smaller manufacturing process, which also allows more cache and more cores on a die. The problem is that at tiny sub-micron levels, the silicon wafers start to leak current, generating additional waste heat and prohibiting the tech from going smaller. This is why, over the years, dies got smaller but heatsinks got bigger. We hit that speed plateau a decade ago, and what little speed boost we've achieved since has been the result of more efficient pipelines and adding lots more cores.

 

I think carbon (diamond) is the future. A diamond CPU could theoretically operate at 80 GHz and withstand temps up to 1300°C before it breaks down. Silicon breaks down at 300°C (solder melts at 200°C and water boils at 100°C, for comparison). Running a diamond CPU at ambient temps would be the equivalent of running silicon under liquid nitrogen. Great for overclockers, but there is a problem. Silicon diodes have a voltage drop of 0.6-0.8 V. Diamond has a voltage drop of 5 V, and an LED version was fabricated that emits germicidal UV light. Maybe someday all lighting will be fluorescent lamps backlit by diamond LEDs instead of ionized mercury vapour. Anyway, with a Vdrop of 5 V, a diamond CPU would probably need nearly 8-10 V to operate at high speeds. Given the density of modern silicon, you would need 8x the voltage, which would result in sucking massively more wattage than silicon. Remember Ohm's law: 2x higher voltage = 2x higher current = 4x wattage (it's squared). Diodes do not behave like resistive loads, but you get where I'm going with this. So you might be able to hit 50 GHz on a diamond CPU, but performance per watt would be just as bad as, if not worse than, current silicon.
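To spell out that Ohm's-law arithmetic (treating the chip as an ideal resistive load, which, as noted above, real transistor logic is not):

    I = V / R
    P = V * I = V^2 / R

So doubling the voltage doubles the current and quadruples the power, and an 8x voltage increase into the same resistance works out to roughly 8^2 = 64x the power, which is the "massively more wattage" being described.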

Edited by stardust4ever

I'm still waiting for advances in game A.I. to reach the point I was expecting by now. I know marketing cannot sell a game based on A.I. routines, but seeing things like AvP on PS3/360/PC take a step back in Alien behaviour A.I. from the PC original I played on my Windows 95 PC years ago really showcased how messed up the industry has become over the past few years.

 

I actually read an article on Yahoo about a couple of upcoming games that are supposed to improve AI and give enemies more of a mind of their own. One of the two is an upcoming Aliens game where you're supposedly being stalked by the alien as you try to escape.


Real-time rendering systems use a lot of 'tricks' to display what they do.

 

For heightened realism, we really need to be able to push wayyy more polygons, instead of relying on bump-mapping and other image-based techniques.

 

Full-on real-time ray-tracing would be nice, but there just isn't enough CPU power available to the consumer for that sort of thing. It's funny that I'm even writing this, since ray-tracing is as old as the hills and it still takes its toll on even the most modern computers.
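For a sense of the scale involved, here's a toy sketch (my own back-of-envelope example, not how any shipping renderer works) of just the primary-ray workload: one ray-sphere intersection test per pixel of a 1080p frame, before any shading, shadows, bounces, or secondary rays:

    // Toy primary-ray workload: one ray-sphere intersection test per pixel
    // of a 1920x1080 frame. Scene, camera, and sphere are made up.
    #include <cstdio>

    struct Vec3 { double x, y, z; };

    static double dot(const Vec3& a, const Vec3& b) {
        return a.x * b.x + a.y * b.y + a.z * b.z;
    }

    // True if a ray from 'origin' along 'dir' hits the sphere at 'center'
    // with radius 'r' (standard quadratic discriminant test).
    static bool hit_sphere(const Vec3& origin, const Vec3& dir,
                           const Vec3& center, double r) {
        Vec3 oc = { origin.x - center.x, origin.y - center.y, origin.z - center.z };
        double a = dot(dir, dir);
        double b = 2.0 * dot(oc, dir);
        double c = dot(oc, oc) - r * r;
        return b * b - 4.0 * a * c >= 0.0;
    }

    int main() {
        const int width = 1920, height = 1080;
        const Vec3 eye = {0.0, 0.0, 0.0};
        const Vec3 sphere_center = {0.0, 0.0, -3.0};
        long hits = 0;

        for (int y = 0; y < height; ++y) {
            for (int x = 0; x < width; ++x) {
                // Map each pixel to a view direction through a pinhole camera.
                Vec3 dir = { (x - width / 2.0) / height,
                             (height / 2.0 - y) / height,
                             -1.0 };
                if (hit_sphere(eye, dir, sphere_center, 1.0)) ++hits;
            }
        }
        // ~2 million tests for ONE object and ONE ray per pixel; at 60 fps
        // that's ~124 million tests per second before shading, shadows,
        // reflections, or anti-aliasing enter the picture.
        std::printf("%ld of %d primary rays hit the sphere\n", hits, width * height);
    }

Real scenes have millions of triangles rather than one sphere, which is why even with acceleration structures the ray counts get brutal fast.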

 

Of course, the other half of it is to make sure the people who model the geometry do so in an efficient manner.

 

We need a CPU and VPU revolution and a move away from copper. There were plenty of articles on the limits of copper back in the '90s. Looks like we've pretty much hit the limit.


We need a CPU and VPU revolution and a move away from copper. There were plenty of articles on the limits of copper back in the '90s. Looks like we've pretty much hit the limit.

And what conductor do you suggest would be more suitable? Copper has the second-lowest resistance of any metallic element on the periodic table, after silver, which would only perform marginally better; the added cost would offset any real benefit.

 

Granted there are various mostly ceramic compounds that deliver higher conductivity or even superconduct at frigid temperatures, but last I checked they are extremely brittle so thermal stress might destroy them in short order, and you'll probably need to submerge your computer in liquid nitrogen, because many of these superconductive exotics are insulators at room temperature.

 

I firmly believe that silicon semiconductors are the real bottleneck. Going down the periodic table, germanium has a lower voltage potential but is far more leaky than silicon. Carbon (diamond and/or nanotubes) has the potential to exceed silicon, but currently there is no efficient way to fabricate chips from it, and even when such fabrication technology exists, it will require vastly different voltage levels, as I noted in a previous post. Diamond would have far less current leakage at sub-micron levels, so it may be possible someday to fabricate an efficient, cool-running CPU despite the voltage increase. And if you were an overclocking enthusiast, you could probably overvolt/overclock it to kingdom come, sucking in hundreds of watts without destroying the wafer, because it could potentially tolerate far hotter temps than silicon. Diamond is also the fastest known conductor of heat, which also works to its advantage.

Edited by stardust4ever

And what conductor do you suggest would be more suitable?

 

I keep (naively?) hoping that someone will eventually figure out a way to scale down a suitable infrastructure for the transmission of light.

 

 

Here's a neat article with a number of approaches to optimizing processing (and it includes technologies that you mentioned):

 

http://www.kavlifoundation.org/science-spotlights/next-life-silicon#.VEGeVmctCzk

