
General Streaming and VR Discussion



2 hours ago, Lord Mushroom said:

I had a very weak internet connection 22 years ago (relative to others), and I have a very weak internet connection now. But my current connection is about 200 times faster than it was 22 years ago. That is an enormous difference.

It's an enormous difference, but ultimately the connectivity on your end isn't the only factor at play.

2 hours ago, Lord Mushroom said:

To go from streaming 1080p to 4K, you need about 5 times the speed. That is a relatively small step in comparison.

Going from 1080p to 4K requires a CDN somewhere to push out a minimum of 400% more data per customer.  That affects everything from datacentre infrastructure (including power requirements, data storage, networking, and physical space) to backbone capacity to connectivity to service providers and, ultimately, individual homes.  The efforts and costs of engineering those changes and building them out are orders of magnitude in complexity and cost ahead of upgrading your home Internet access.
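To put rough numbers on that, here is a back-of-envelope sketch. The bitrates are illustrative assumptions (roughly H.264 1080p vs. HEVC 4K), not any particular CDN's figures:

```python
# Back-of-envelope: monthly CDN egress per viewer at typical streaming
# bitrates. The bitrates are illustrative assumptions, not measured values.
BITRATE_MBPS = {"1080p": 5, "4K": 25}  # rough H.264 vs HEVC figures

def monthly_egress_gb(bitrate_mbps, hours_per_day=2, days=30):
    """Gigabytes a CDN must push to one viewer per month."""
    seconds = hours_per_day * 3600 * days
    return bitrate_mbps * seconds / 8 / 1000  # megabits -> gigabytes

for res, mbps in BITRATE_MBPS.items():
    print(f"{res}: ~{monthly_egress_gb(mbps):.0f} GB per viewer per month")
# 1080p: ~135 GB; 4K: ~675 GB -- a 5x (i.e. 400% more) jump, per customer
```

Multiply that per-customer delta by millions of subscribers and the scale of the infrastructure problem becomes clear.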

 

Please don't make it sound as though this is a problem where all someone has to do is turn the knobs up to eleven and we're all good to go with eyeball-searing picture clarity.  I spent a chunk of my working life in tech dealing with the problems of streamed 4K video transport and it is nowhere as simple to solve as you are making it sound - and just wait until 8K starts gaining a foothold in the consumer marketplace.


3 hours ago, Lord Mushroom said:

They aren't setting the launch prices that high to squeeze money out of consumers. They are doing it because that is what they cost to produce. Sony and Microsoft are competing vigorously in the gaming sector. The Xbox 360 was cheaper than the Playstation 3, and did much better than it otherwise would have done as a result.

At no time did I suggest that console prices were set at the price points that they are in order to gouge the customer.  What I did say is that people buying these devices have become accustomed to the price points that they're being launched at, which have a higher per-unit margin than a $50 streaming device.

3 hours ago, Lord Mushroom said:

Xbox set the price of its streaming service(s) much lower than Playstation's, but Playstation responded.

What does that have to do with hardware R&D and manufacturing costs and the need to amortise those costs over the lifetime of the unit?

3 hours ago, Lord Mushroom said:

It is not like the App Store and Google Play, where they "agree" to keep royalties at 30%.

You're right.  It's not.  That's one of many reasons why I didn't mention either of those services, and quite honestly do not understand why you are.  They have nothing to do with the environments and devices we're talking about.

3 hours ago, Lord Mushroom said:

If Sony and/or Microsoft could make a $50 box, which can do the same or more than the competitor's $500 box, they would do it in a heartbeat.

I'll repeat myself: the margins on a $50 VDI box are too slim to make them a worthwhile pursuit as the market currently stands. Unless lifetime sales of those devices exceed current console lifetime sales by a factor of at least 3 (so, lowest-bar scenario, 350M $50 devices), they just aren't worth it.


The unsolvable problem remains lag, from the time you hit the stick, gotta travel to the server, be operated on, frame rendered and prepped for transport, then sent back to you, and displayed. And that's simplifying it.

 

So how you get around that and meet or exceed what your gaming computer/console can do internally?

 

Might be ok for adventures and RPG/worldbuilding. But you ain't gonna be hi-scoring any 8-bit shit like Tempest. Or any other cheap-death style 80's games.


16 minutes ago, Keatah said:

The unsolvable problem remains lag, from the time you hit the stick, gotta travel to the server, be operated on, frame rendered and prepped for transport, then sent back to you, and displayed. And that's simplifying it.

 

So how you get around that and meet or exceed what your gaming computer/console can do internally?

 

Might be ok for adventures and RPG/worldbuilding. But you ain't gonna be hi-scoring any 8-bit shit like Tempest. Or any other cheap-death style 80's games.

Just take the relatively small step of ignoring the laws of physics and everything will work out peachy-keen A-OK?


7 hours ago, Keatah said:

The unsolvable problem remains lag, from the time you hit the stick, gotta travel to the server, be operated on, frame rendered and prepped for transport, then sent back to you, and displayed. And that's simplifying it.

 

So how you get around that and meet or exceed what your gaming computer/console can do internally?

 

Might be ok for adventures and RPG/worldbuilding. But you ain't gonna be hi-scoring any 8-bit shit like Tempest. Or any other cheap-death style 80's games.

Yeah, with Antstream I can feel the lag in games.  Another thing I don't get is this: while I live alone and don't have to share my bandwidth with anyone, most people don't.  So how well does it work when you have one person streaming Netflix, another on a Zoom call for work/school, and you are trying to play something on Stadia?  Ha, it will end up being like when I used a modem enough that I bought a separate land line for it...  two internets is better than one, right?


On 5/29/2021 at 2:13 PM, Lord Mushroom said:

Expensive consoles are gradually fading in terms of gaming market share. Streaming has developed more slowly than expected, but is no longer an insignificant part of gaming (although still small), and it is now growing at a high percentage rate. Even the console makers themselves are embracing streaming.

Sony and Microsoft embraced streaming because they were spooked by Stadia.   Prior to this, Sony had a streaming platform in PS NOW for almost a decade that nobody cared about, until they started to allow you to download games locally; then it started to gain some traction.   Stadia made more of a ripple than a splash.  And everyone and their mother is trying to buy a physical PS5 now even though there aren't that many PS5-specific games.

 

Infrastructure is a problem.   Comcast is trying to put a data cap on my home internet.   Mobile has even more stringent data caps.  I'm actually considering buying/renting physical blu-rays again rather than rely on streaming, and I'd be a fool to sign up for games streaming knowing one day my household might hit the cap and I'll be locked out of my games for the rest of the month.
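For a sense of how quickly game streaming collides with a cap, a quick sketch. The 1.2 TB cap and the 20 Mbps 1080p60 stream bitrate are assumed, illustrative figures:

```python
# How long does game streaming take to hit a home data cap?
# The 1200 GB cap and 20 Mbps stream bitrate are illustrative assumptions.
CAP_GB = 1200
STREAM_MBPS = 20

gb_per_hour = STREAM_MBPS * 3600 / 8 / 1000   # = 9 GB per hour of play
hours_to_cap = CAP_GB / gb_per_hour           # ~133 hours in a month

print(f"{gb_per_hour:.0f} GB/hour; cap reached after ~{hours_to_cap:.0f} hours")
```

Roughly four hours a day of game streaming alone would exhaust such a cap, before anyone in the house watches a single show.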

 

That plus we are seeing internet hacks that take out a portion of the infrastructure and ransomware attacks with increased frequency.

 

And gamers keep demanding higher and higher framerates in games (60, 120, 144, 240...) to reduce control latency.  These gamers are not going to put up with the increased latency of streaming services.   And the latency is never going to completely go away: when signals are travelling around the country and bouncing off satellites, there's going to be a delay that can be measured in ms at the very least.   Casual gamers may be ok with this, but the more dedicated gamer never will be.


On 5/30/2021 at 5:00 PM, Shaggy the Atarian said:

This is similar to the spin that came out when VR reemerged on the scene about 10 years ago. Plenty of claims that the traditional TV/console concept was dead, it was all VR from here on out. Yet here we are in the next generation and Sony has barely mentioned the PSVR for the PS5, Microsoft has abandoned HoloLens for consumers, Nintendo has better things to do, Google dropped their VR support last year. While there are some new HMDs on the horizon, it's still very far and away from becoming mainstream or replacing consoles like I remember reading about.

Anyone who said VR was about to take over TV was crazy. 

 

The way I see it, current VR is the "2600 generation" of VR: it's novel, it's fun, it has to start somewhere, but it has a ways to go still.   I remember how resistant my parents and others of their generation were to video games and how my dad complained that the 2600 graphics looked like garbage.  Years later, they were playing games on Facebook, which makes me laugh when I remember how anti-game they were.   This is the same kind of resistance I see in people to VR.   I've heard lots of stories about how people's significant others think VR is dumb, until they are shown Beat Saber or something; then suddenly they have to fight over who gets to play it!

 

Sony has been talking about their next-gen VR lately, teasing new info now that the PS5 is out the door.   They sold millions of their first-gen unit.   The Oculus Quest I/II is often sold out.    Yes, a lot of the me-too "mixed reality" devices have faded.   The concept of "phone as VR unit" (Gear VR, Google) has faded, as it should!   And I don't think HoloLens was ever going to be a serious consumer product.  It was always going to be too expensive for that, and not as immersive as a VR unit.

On 5/30/2021 at 5:40 PM, Shaggy the Atarian said:

This circles us back to "this is the year of Linux!" It's said about Linux, it's said about VR, said about game streaming, will be said about the VCS I'm sure.

Linux has pretty much taken over the world: it runs on mainframes, servers, micro-devices, phones, set-top boxes -- you name it!   It just can't steal the desktop from Windows, and people say it's a failure because of that.   But really it's one of the most successful tech innovations ever.

 

On 5/30/2021 at 5:40 PM, Shaggy the Atarian said:

There was a business that started in 2016 just a few stores down from my arcade - they were offering VR to the masses for a relatively cheap starting fee. They setup some locations at different malls around Utah, they were riding the hype. Before the pandemic started though, they ended up closing all of their stores because the money wasn't there. The owner told me how much he made in Christmas 2019, which was the best month I'd ever seen at my arcade, and I made $10k more than he did.

Even pre-pandemic, the concept of the "VR arcade" was iffy.   I'm no germaphobe, but I'm not crazy about the idea of wearing something on my face that many others wore before me, and that probably isn't cleaned between uses.

 

 


16 hours ago, x=usr(1536) said:

It's an enormous difference, but ultimately the connectivity on your end isn't the only factor at play.

 

Going from 1080p to 4K requires a CDN somewhere to push out a minimum of 400% more data per customer.  That affects everything from datacentre infrastructure (including power requirements, data storage, networking, and physical space) to backbone capacity to connectivity to service providers and, ultimately, individual homes.

All of which have been evolving, and are evolving, along with consumer internet speeds.

 

16 hours ago, x=usr(1536) said:

I spent a chunk of my working life in tech dealing with the problems of streamed 4K video transport and it is nowhere as simple to solve as you are making it sound

I think a result of that is that you can't see the forest for the trees. Just because it is difficult to do now doesn't mean it will be for a long time. What would you have said 20 years ago if someone told you that 20 years later people would have computers in their TVs and a computer in their cellphone, and could watch whatever they want, when they want, from a huge menu as a live stream of 1080p video over a wireless connection? "In your dreams"?

 

16 hours ago, x=usr(1536) said:

and just wait until 8K starts gaining a foothold in the consumer marketplace.

Increases in resolution are becoming irrelevant. For each upgrade in resolution, the difference in improved experience is smaller. There is virtually no difference between 4K and 8K. I think 4K is the last upgrade people will care even a little bit about. Going from SD to HD was not a very big deal, and people care much less about 4K.

 

You already see in gaming that it is no longer the most visually stunning games which reign supreme. People want good gameplay most of all. Games like Fortnite and League of Legends are beating more visually impressive games. And the Switch is a success for the same reason. 


On 5/30/2021 at 5:40 PM, Shaggy the Atarian said:

They are two great games, but that's not enough to drive adoption to the point where it overcomes the issues in price, reliability, health (vertigo and/or skin issues - even the brand new Oculus Quest 2 is causing problems with some people), etc. A lot of people have gone bankrupt rolling the dice on VR so far, and while it's not going away, companies see that and decide they'll keep waiting or take the cautious approach. It's not a lock that VR will become like it was portrayed in Ready Player One (and we could say the same of game streaming - maybe it will, but not any time soon)

Video games had a bumpy ride as well trying to gain mainstream acceptance.   There is the well-known '83 crash.   Some sources refer to another crash in the 70s too, between the saturation of Pong consoles and the adoption of cartridge-based systems.


16 hours ago, x=usr(1536) said:

At no time did I suggest that console prices were set at the price points that they are in order to gouge the customer.

No, but you suggested they would do so if they could sell the next generation at a much lower price:

 

"Except that the market has become accustomed to consoles with a $500 at-launch price point.  There is no way in hell that either Sony or Microsoft would do anything to deliberately let that number slip in the public's mind now that the expectation has been set."

 

16 hours ago, x=usr(1536) said:

What I did say is that people buying these devices have become accustomed to the price points that they're being launched at, which have a higher per-unit margin than a $50 streaming device.

First of all, consumers would buy a cheaper, but equally good, alternative if they could. As many Playstation fans did when Xbox 360 was a little cheaper.

 

Second, I am not sure $500 consoles have a higher per-unit margin than a $50 streaming device, as consoles are often sold at a loss, especially at launch. This is not the case for Nintendo, but it is for Sony and Microsoft, who actually have streaming services.

 

16 hours ago, x=usr(1536) said:

What does that have to do with hardware R&D and manufacturing costs and the need to amortise those costs over the lifetime of the unit?

Nothing, it was a second example illustrating that Sony and Microsoft wouldn't keep (console) prices high if they could set them lower.

 

Also, the costs of hardware R&D would probably be reduced with a streaming device.

 

16 hours ago, x=usr(1536) said:

You're right.  It's not.  That's one of many reasons why I didn't mention either of those services, and quite honestly do not understand why you are.  They have nothing to do with the environments and devices we're talking about.

I was (over-)emphasising that Sony and Microsoft are competing on (console) prices.

 

16 hours ago, x=usr(1536) said:

I'll repeat myself: the margins on a $50 VDI box are too slim to make them a worthwhile pursuit as the market currently stands. 

Right now a $50 box is probably not able to properly stream games at 1080p, which is what it needs to be able to do to be successful at the moment. But we are getting really close to that. Chromecast with Google TV ($49.99) can almost do this. In fact, Google promised it would by the end of the first half of 2021. They may not meet that deadline, but it is gonna happen real soon.

 

Of course, Stadia is not a very attractive streaming service at the moment (low value for money), so it won't be a smash hit any time soon, at least not due to its gaming capabilities. But it shows that the technology to do it is nearly here.


48 minutes ago, Lord Mushroom said:

I think a result of that is that you can't see the forest for the trees. Just because it is difficult to do now doesn't mean it will be for a long time. What would you have said 20 years ago if someone told you that 20 years later people would have computers in their TVs and a computer in their cellphone, and could watch whatever they want, when they want, from a huge menu as a live stream of 1080p video over a wireless connection? "In your dreams"?

I remember reading articles in Compute! magazine in the 1980s that predicted that your computer, TV and other electronics would converge into a single device.    But it took decades longer than they predicted to become a reality.  They also made a faulty assumption, because they couldn't see beyond their paradigm: that because tech would converge into a single device, families would only need to own one such TV device that would do everything (maybe they'd have a second for the bedroom).    The reality they didn't predict is that the average family would have a pile of such devices.   Some in the form of large traditional TVs, some in the form of traditional computers, some in the form of a phone or portable games console.    All are capable of streaming TV shows, playing music, games, and computer apps.   It's easy to spot a trend and assume it just keeps going in the same direction.   But in reality, counter-trends emerge that shape the future in unexpected ways.

 

57 minutes ago, Lord Mushroom said:

Going from SD to HD was not a very big deal, and people care much less about 4k.

I think HD was a fairly big deal.  For the first time in decades we could have higher-resolution TV broadcasts where we'd see details we could never see before.   But 4K is a harder sell.   It all depends on the size of your TV, and how far you sit from it, whether you can even perceive the difference between 1080p and 4K.   8K? Most people are going to need a ridiculously-sized TV to notice.   My main issue with TV manufacturers rushing us to higher and higher resolutions is that they try to do it faster than CPU/GPU power, TV broadcast standards, and the growth of bandwidth can keep up.   So what are we supposed to watch on these new 8K TVs?  4K took years to get over its content problem.

 

1 hour ago, Lord Mushroom said:

Increases in resolutions are becoming irrelevant. For each upgrade in resolution the difference in improved experience is smaller. There is virtually no difference between 4k and 8k. I think 4k is the last upgrade people will care a little bit about. Going from SD to HD was not a very big deal, and people care much less about 4k.

 

You already see in gaming that it is no longer the most visually stunning games which reign supreme. People want good gameplay most of all. Games like Fortnite and League of Legends are beating more visually impressive games. And the Switch is a success for the same reason. 

In a world where every game can achieve photo-quality graphics, what is your game going to do to stand out?   I expect game developers to increasingly focus on unique stylized designs, but with lots of effects to show off ray-tracing capabilities.


12 hours ago, Keatah said:

The unsolvable problem remains lag, from the time you hit the stick, gotta travel to the server, be operated on, frame rendered and prepped for transport, then sent back to you, and displayed. And that's simplifying it.

 

So how you get around that and meet or exceed what your gaming computer/console can do internally?

You probably can't beat the lag of a computer/console which renders locally, even though the remote computer would be faster than the local one. But it doesn't need to beat it. It just has to be close enough that the vast majority of players don't notice a difference.

 

All the factors you have mentioned can be, and have been, improved upon. Even travel to and from the server can be improved by increasing the number of servers, which has been and is being done.

 

Most people who worry all day long about latency play with a wireless controller...


3 hours ago, zzip said:

Sony and Microsoft embraced streaming because they were spooked by Stadia.   Prior to this, Sony had a streaming platform in PS NOW for almost a decade that nobody cared about..  

The technology wasn't as good then as it is now, and as it will be in the future. Also, maybe the game library was less interesting then. And maybe you could only play on a Playstation.

 

4 hours ago, zzip said:

And everyone and their mother is trying to buy a physical PS5 now even though there aren't that many PS5-specifc games.

Because the games currently on offer via streaming are limited, the technology is not quite ready, and there is no obvious streaming box to buy.

 

But for the next generation of consoles I think the technology will be ready, so the console makers will offer a cheap streaming console and a streaming service with all the latest games they have the rights to.

 

4 hours ago, zzip said:

Comcast is trying to put a data cap on my home internet.

Low data caps would of course be a problem. But they haven't been a big problem for the average home internet user, who uses the internet for TV streaming, so there is no reason why they should be a big problem in the future.

 

4 hours ago, zzip said:

That plus we are seeing internet hacks that take out a portion of the infrastructure and ransomware attacks with increased frequency.

It is very unlikely that attacks on the internet connection are going to be a big problem. Also, even games played locally often require an internet connection, most commonly for online multiplayer.

 

And I would say that ransomware attacks are a bigger problem if you have an expensive box with many expensive games. 

 

4 hours ago, zzip said:

And gamers keep demanding higher and higher framerates in games-  60, 120, 144, 240..   to reduce control latency.

Most players don't even notice the difference between 60 and 120; it is just cool to pretend you do. Just like 8K sounds awesome, but is bullshit.

 

4 hours ago, zzip said:

when signals are travelling around the country and bouncing off satelites, 

Signals don´t necessarily have to travel that far.

 

Finally, while streaming has its downsides, it also has upsides:

1) Limited storage is no longer a problem.

2) You don´t have to spend a long time downloading/updating games.

3) Most streaming services let you play any game on offer any time you want. In other words, you don´t have to pick out a few games and play them to death. Or spend a fortune to play all the games you want to play.

4) The box you are using is just a fraction of the cost of a conventional box.

5) Remote computers can handle more powerful games.


2 hours ago, zzip said:

I remember reading articles in Compute! magazine in the 1980s that predicted that your computer, TV and other electronics would converge into a single device.

And power to them. I know a guy who has difficulty seeing us going from streaming 1080p to 4K.

 

2 hours ago, zzip said:

I think HD was a fairly big deal.

I can accept fairly big deal. But only just. :)


2 hours ago, Lord Mushroom said:

Most people who worry all day long about latency play with a wireless controller...

A Bluetooth stack has lower latency than USB.

 

26 minutes ago, Lord Mushroom said:

Most players don't even notice the difference between 60 and 120; it is just cool to pretend you do. Just like 8K sounds awesome, but is bullshit.

Not really. It's simple to tell the higher framerates by the size of the buzz when scrolling sideways. There is a noticeable difference between 200 and 240, and more between 60 and 120, or 60 and 144.
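The diminishing returns both posters are circling can be put in plain numbers: frame time shrinks non-linearly with refresh rate, so each step up saves fewer milliseconds than the last (pure arithmetic, no assumptions beyond the refresh rates named in the thread):

```python
# Frame time vs refresh rate: each step up saves fewer milliseconds
# than the last, which is why 60->120 is far easier to feel than 200->240.
for hz in (60, 120, 144, 200, 240):
    print(f"{hz:3d} Hz -> {1000 / hz:5.2f} ms per frame")

saving_60_120 = 1000 / 60 - 1000 / 120    # ~8.33 ms saved
saving_200_240 = 1000 / 200 - 1000 / 240  # ~0.83 ms saved
```

The 60-to-120 jump shaves about ten times as many milliseconds off each frame as the 200-to-240 jump does.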

 

26 minutes ago, Lord Mushroom said:

Signals don´t necessarily have to travel that far.

Maybe not distance. But all the repeater equipment in between: all those packets have to move through buffers and switches.


19 minutes ago, Lord Mushroom said:

The technology wasn't as good then as it is now, and as it will be in the future. Also, maybe the game library was less interesting then. And maybe you could only play on a Playstation.

Yeah, but this is the same argument that was used the first time streaming became a concept, over 10 years ago.   The technology hasn't improved all that much for most users since.  Then, about 2 years ago, Google announced Stadia and all the same arguments popped up again: "Streaming is for real this time!   The tech is now there."   But the reality is the Stadia tech hasn't played out like the hype, and I don't see this happening anytime soon.

 

The thing is, there is a hard speed limit on how fast data can travel: the speed of light.   And while that seems unimaginably fast to us, when your data has to travel hundreds of miles to get to and from the server farm, possibly bouncing off satellites hundreds of miles in the sky in the process, it can add some milliseconds of delay, not counting the latency added by routers along the way and network congestion.   Latency is always going to be much greater than for a box sitting next to you.   When you are streaming video, this latency doesn't matter.   But when you are streaming two-way, in a multiplayer game where a couple of milliseconds might make the difference in who scores the kill and who gets killed, it makes a real difference!
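That speed-of-light floor is easy to estimate. A sketch, using the usual figure of roughly 200,000 km/s for light in fibre (about 2/3 of c); the distances are hypothetical examples:

```python
# Lower bound on round-trip time from propagation delay alone.
# Light in fibre covers roughly 200 km per millisecond (~2/3 of c);
# the one-way distances below are hypothetical examples.
KM_PER_MS = 200

def min_rtt_ms(one_way_km):
    """Best-case RTT, ignoring routing, queuing and server processing."""
    return 2 * one_way_km / KM_PER_MS

for km in (100, 500, 2500):
    print(f"{km:5d} km each way -> at least {min_rtt_ms(km):5.1f} ms RTT")
# 100 km -> 1 ms, 500 km -> 5 ms, 2500 km -> 25 ms, before any other delay
```

And that is only propagation; encoding, routing, and rendering time stack on top of it, which is why moving data centres closer to players matters so much.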

 

Maybe you can help by putting more data centers closer to more people, but such infrastructure is expensive, and in the end it is probably more cost-effective to deliver games to beefy consoles/PCs that are already in users' homes and that users have already paid for.   Just because game streaming is possible doesn't mean it will be the best solution for every gamer.

 

48 minutes ago, Lord Mushroom said:

Low data caps would of course be a problem. But they haven't been a big problem for the average home internet user, who uses the internet for TV streaming, so there is no reason why they should be a big problem in the future.

I have never had a data cap on home internet from the time I first got broadband around 1999, until now.   Right after I cut the cord in favor of streaming services, upgraded to 4K, and everyone in the house is working from home/schooling from home because of COVID,  now suddenly I have to meter my family's internet usage (which has been growing near the proposed cap).  There's no way I will be signing up for anything that will increase our internet use at this point.

 

55 minutes ago, Lord Mushroom said:

It is very unlikely that attacks on the internet connection are going to be a big problem. Also, even games played locally often require an internet connection, most commonly for online multiplayer.

It's annoying now when it happens to multiplayer games,  but there are always single player games you can play instead until the problem is resolved.   When all your games require streaming you are just SOL...      And the attack doesn't have to be on the service directly,  we've seen plenty of attacks on internet infrastructure that result in slow speeds for average users as traffic is routed around the problem.

 

1 hour ago, Lord Mushroom said:

Most players don't even notice the difference between 60 and 120; it is just cool to pretend you do. Just like 8K sounds awesome, but is bullshit.

I would agree that for a lot of people it's bragging rights.  However, whether it's a real thing or all in your head, it's still a big deal for people to brag about what kinds of frame rates they get, brag about what kind of hardware is in their rig, argue about who has the better console.   These kinds of people are really, really into the hardware side of things, and will never be satisfied by streaming.

 

1 hour ago, Lord Mushroom said:

Most streaming services let you play any game on offer any time you want. In other words, you don´t have to pick out a few games and play them to death. Or spend a fortune to play all the games you want to play.

Here's something a lot of people haven't quite grasped, though: we value what we pay for more than what is given to us.    We're more likely to finish a game that we paid for than one offered for free.   With all these free games, we are likely to just see people "channel surf" games.   At the height of cable TV, it was not uncommon to hear complaints like "I have 120 channels, and I can't find a thing to watch!"

The other downside to "subscription" models is that they get paid whether you like the new games they deliver or not, so the incentive to produce hit games that will sell well is diminished, likely leading to more mediocre content.   "Well, if they don't produce good quality, users will jump ship to another service."   True, but they are likely to find ways to lock you in; maybe if you cancel your subscription, all your save data gets wiped.

Content companies love subscriptions exactly because they produce a predictable stream of revenue in the way that always having to find the next hit game doesn't.   But I think it's anti-consumer for this reason.  One of the best tools consumers have is the ability to vote with our wallets for what we like/don't like.


46 minutes ago, Keatah said:

A Bluetooth stack has lower latency than USB.

Damn, my limited knowledge about modern gaming strikes again. I thought most wireless gamepads still used USB dongles.

 

48 minutes ago, Keatah said:

Not really. It's simple to tell the higher framerates by the size of the buzz when scrolling sideways. There is a noticeable difference between 200 and 240, and more between 60 and 120, or 60 and 144.

If you are looking for it, I am sure you can find it. But if you are an average gamer, who doesn't know what to look for, it looks and feels the same.

 

Anyway, I meant in the context of latency, although my following 8K example made that unclear.

 

50 minutes ago, Keatah said:

Maybe not distance. But all the repeater equipment in between: all those packets have to move through buffers and switches.

And that process can be improved. My point is that too much lag is not unavoidable due to distance.


7 minutes ago, Lord Mushroom said:

Damn, my limited knowledge about modern gaming strikes again. I thought most wireless gamepads still used USB dongles.

Some do. Some don't.

 

7 minutes ago, Lord Mushroom said:

If you are looking for it, I am sure you can find it. But if you are an average gamer, who doesn't know what to look for, it looks and feels the same.

Simply put: when side-panning, the image, or an object in the image, has edges that buzz or become fuzzy, like a double image. The higher the framerate, the closer to a single image you get.


4 hours ago, Lord Mushroom said:

All of which have been evolving, and are evolving, along with consumer internet speeds.

Two problems with this:

  1. No single network-connected activity drives infrastructure improvements, including bandwidth.  It is the aggregate of all traffic that makes this happen.  Gaming, Netflix, television, audio, mobile applications, cloud-based applications, and everything else from email to web browsing to ping: all of them together drive how and when content transport and delivery is improved.  Yes, individual applications and / or types of traffic are taken into account in certain cases, but overall improvements rest on the sum total of requirements.
  2. All of this costs money.  Unless there is a return on investment for every layer of the delivery and transport stack, it's not going to happen.  Your home internet connection (or mine, or anyone else's, for that matter) is the lowest-level consideration in the stack because most content can be scaled to work over what would have been considered a fast pipe 15 years ago.

And yes, of course all of it is evolving.  But that's just the nature of technology.  In 2006, I specced out a symmetric 1Gbit Metro-E circuit carried over fibre for my employer.  The cost: $120K per month.  We now have an equivalent circuit (in terms of bandwidth, at least, but my peering is going to suck horribly by comparison) in our home for a staggering $49.99 per month, and I say staggering because at a cost of $0.04999 per megabit, it's just ludicrously cheap when compared to every other form of domestic (and some enterprise) connectivity I've ever had.
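The per-megabit arithmetic behind that comparison is worth spelling out. A minimal sketch, taking both circuits at the quoted 1 Gbit/s symmetric:

```python
# Price per megabit of provisioned bandwidth, given a monthly circuit cost.
# Figures are the ones quoted above: a 2006 Metro-E circuit vs a current
# home connection, both taken as 1 Gbit/s (1000 Mbit/s) symmetric.

def dollars_per_mbit(monthly_cost: float, bandwidth_mbps: float) -> float:
    """Monthly cost per megabit of provisioned bandwidth."""
    return monthly_cost / bandwidth_mbps

metro_e_2006 = dollars_per_mbit(120_000, 1000)  # $120.00 per megabit
home_now = dollars_per_mbit(49.99, 1000)        # ~$0.05 per megabit

print(f"2006 Metro-E: ${metro_e_2006:.2f}/Mbit, home today: ${home_now:.5f}/Mbit")
```

A roughly 2400-fold drop in price per megabit over fifteen years, which is the point being made.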

4 hours ago, Lord Mushroom said:

I think a result of that is that you can´t see the forest for all the trees.

Uh, no.  It means that I have the experience and background to understand details of this that you clearly do not.

4 hours ago, Lord Mushroom said:

Just because it is difficult to do it now, it doesn´t mean it will be so for a long time.

Nor did I ever say that it would be.

4 hours ago, Lord Mushroom said:

What would you have said 20 years ago if someone told you 20 years later people will have computers in their TVs or a computer in their cellphone, and can watch whatever they want when they want from a huge menu in a live transfer of 1080p video files on a wireless connection? "In your dreams"?

Actually, no, I would not have said, "in your dreams" simply because I rode the dot-bomb era out to its bitter end working directly with the types of technologies you're listing.  Something I would have been more likely to say would have been, "oh, yeah, it's gonna be really awesome at some point in the future when we get all the infrastructure and transport issues figured out, but the software is already here to do it," because that was a more realistic view of the way things were going at the time.

 

However, since you mention computers integrated with TVs (or vice-versa), computers in a cellphone (in the sense of what we now call a smartphone), or a streaming video service with user-selectable content, I'd refer you to the following:

All we've seen in the last 20 years is the natural progression of what we saw from the mid- to late-1990s.  Media was available for streaming in various formats by that time, and sites such as newgrounds.com existed to provide content selection for the end user.

4 hours ago, Lord Mushroom said:

Increases in resolutions are becoming irrelevant.

I'm not gonna search out another Picard facepalm image to paste in here, so read the following carefully please:

 

Increases in resolutions are not 'becoming irrelevant', as you put it.  Films have been shot in 8K for over a decade.  One of the end goals of 8K is to be able to go from camera to screen with zero loss.  8K isn't going anywhere because, frankly, pretty much everyone shoots cinema on Red cameras these days, and they've become the de facto standard for digital filming in 8K.  As a nice side effect, it'll also make a lot of the pieces of the how-do-we-provide-this-to-the-customer puzzle fall into place much more easily if and when across-the-board demand ramps up.

 

And yes, there are standards coming up behind 8K that are intended to surpass it in quality.  Again: that's just what technology does.

4 hours ago, Lord Mushroom said:

For each upgrade in resolution the difference in improved experience is smaller. There is virtually no difference between 4k and 8k. I think 4k is the last upgrade people will care a little bit about. Going from SD to HD was not a very big deal, and people care much less about 4k.

I disagree.  So do the SD-to-1080 sales numbers (which were admittedly forced higher due to the digital transition in the US being moved forward 3 years), followed by the 1080-to-4K numbers.  Demand isn't huge at this time for 8K, but hardware costs will drop over time and it'll change both price points and availability - which is pretty much what we saw with 1080-to-4K.  Having said that, 4K still isn't dominant over any other resolution, so its ultimate success remains to be seen.

4 hours ago, Lord Mushroom said:

You already see in gaming that it is no longer the most visually stunning games which reign supreme. People want good gameplay most of all. Games like Fortnite and League of Legends are beating more visually impressive games. And the Switch is a success for the same reason. 

Gaming does not necessarily drive the needs of other industries, and is an inaccurate yardstick to measure them by.


2 hours ago, Lord Mushroom said:

And power to them. I know a guy who has difficulties seeing us going from streaming 1080p to 4k.

 

I can accept fairly big deal. But only just. :)

Funny thing is, 4k streaming really isn't 4k.  You still need compression, and since it is over UDP, you will still lose packets here and there, and since it is constantly auto-adjusting based on latency... 4k is more a marketing term for the potential you could have vs what you are actually getting.
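To put rough numbers on the gap between the label and the delivery: 4K is exactly four times the pixels of 1080p, but what a stream actually sends is set by the codec and the adaptive-bitrate ladder. A sketch, where the bitrate figures are illustrative assumptions rather than any provider's published specs:

```python
# Raw pixel ratio between 4K (2160p) and 1080p, versus typical compressed
# delivery rates. The bitrates below are illustrative assumptions only and
# vary heavily by codec, content, and provider.

def pixel_count(width: int, height: int) -> int:
    return width * height

raw_ratio = pixel_count(3840, 2160) / pixel_count(1920, 1080)  # exactly 4.0

# Assumed ballpark streaming bitrates in Mbps:
TYPICAL_BITRATE_MBPS = {
    "1080p, H.264": 8,
    "2160p, HEVC": 25,
}

# An adaptive stream labelled "4K" can drop well below its top rung whenever
# measured loss or latency forces the player down the bitrate ladder.
delivered_ratio = TYPICAL_BITRATE_MBPS["2160p, HEVC"] / TYPICAL_BITRATE_MBPS["1080p, H.264"]
```

So even at the top rung, the compressed "4x the pixels" stream is carrying maybe 3x the data, and under congestion considerably less.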

Also, actually trying to play 4k blurays turned out to be impossible for me.

1) PS4 Pro doesn't support them for some reason...

2) my standalone player is too old.

3) I couldn't figure out why as my hardware / software should have been able to play 4k blurays on my computer, but they refused.  (Maybe now I have a 3080 RTX it will let me?)

so yeah, 4k movies are a lie!


32 minutes ago, zzip said:

Yeah but this is the same argument that was used the first time streaming became a concept over 10 years ago.   The technology hasn't improved all that much for most users since.

Because the streaming providers didn´t spend many resources on improving the technology. They just used the technology which existed. But now that Microsoft, Google and Amazon are involved, the technology is improving much faster.

 

35 minutes ago, zzip said:

Then about 2 years ago, Google announced Stadia and all the same arguments popped up again "Streaming is for real this time!   The tech is now there".   But the reality is the Stadia tech hasn't played out like the hype,

They promised more than they could deliver. But that doesn´t mean that they, or more likely someone else, won´t deliver.

 

44 minutes ago, zzip said:

Latency is always going to be much greater than for a box sitting next to you.

Maybe in terms of percentage, but not in terms of number of milliseconds.

 

47 minutes ago, zzip said:

But when you are streaming two-way, in a multiplayer game when a couple of milliseconds might make the difference in who scores the kill, and who gets killed--  it makes a real difference!

A couple of milliseconds is almost nothing. A couple of milliseconds only matters if you are a professional player.

 

While considerably more latency than that has a slight impact, it is still so low that it doesn´t ruin the player´s experience, especially as he/she doesn´t even notice it. 

 

Many games also have skill-based matchmaking, so even if the player´s performance was reduced, he/she would still be playing against people of equal skill.

 

57 minutes ago, zzip said:

Maybe you can help by putting more data centers closer to more people,  but the cost of such infrastructure is expensive,

There doesn´t have to be a data center on every street corner. Light travels 186 miles per millisecond. Distance is not a big problem.

 

1 hour ago, zzip said:

There's no way I will be signing up for anything that will increase our internet use at this point.

I didn´t say data caps wouldn´t be a big problem for you. :)

 

1 hour ago, zzip said:

It's annoying now when it happens to multiplayer games,  but there are always single player games you can play instead until the problem is resolved.   When all your games require streaming you are just SOL...  

Internet problems are of course a bigger issue for streaming. Still, even if you don´t have a powerful console/computer, you most likely have some games on some device. So you won´t be totally without games in those rare occasions.

 

1 hour ago, zzip said:

I would agree for a lot of people it's bragging rights..  however whether it's a real thing or all in your head, it's still a big deal for people to brag about what kinds of frame rates they get, brag about what kind of hardware is in their rig, argue about who has the better console.   These kinds of people are really really into the hardware side of things, and will never be satisfied by streaming.

People want to have the best. When streaming is best, they will want that.

 

1 hour ago, zzip said:

Here's something a lot of people haven't quite grasped though:   We value what we pay for more than what is given to us.    We're more likely to finish a game that we payed for than one offered for free..   With all these free games we are likely to just see people "channel surf" games.   At the height of cable TV, it was not uncommon to hear complaints like "I have 120 channels, and I can't find a thing to watch!".

I agree that people would play more games, and less per game. But that is a good thing. It means people get to play all the games they want to play, not just a selected few.

 

If you offered to take 90 of those 120 channels away from those people, they would say "no, thanks". And if you offered 90 extra channels to someone who had only 30, he/she would happily accept.

 

1 hour ago, zzip said:

they are likely to find ways to "lock you in", like maybe if you cancel your subscription, all your save data gets wiped. 

They may try that, but that kind of stuff also keeps people from choosing that particular service in the first place. One of the major reservations people have with Stadia is that if you leave Stadia, or Stadia shuts down, you lose the games you bought there (unlike with Geforce Now).

 

Even if you are locked in a poor service, you will eventually leave. But if you buy a bad game, you are stuck with it.


4 hours ago, Lord Mushroom said:

No, but you suggested they would do so if they could sell the next generation at a much lower price:

 

"Except that the market has become accustomed to consoles with a $500 at-launch price point.  There is no way in hell that either Sony or Microsoft would do anything to deliberately let that number slip in the public's mind now that the expectation has been set."

I suggested no such thing.  All that I pointed out was that if someone is used to paying $500 for the latest console, then the next generation machine can be priced at at least that number and consumer expectations of price will be met.  At no point did I suggest, insinuate, or imply otherwise.

4 hours ago, Lord Mushroom said:

First of all, consumers would buy a cheaper, but equally good, alternative if they could. As many Playstation fans did when Xbox 360 was a little cheaper.

Consumers may buy a cheaper but equally-good equivalent if one is available, but not always.

 

The PS3 / Xbox 360 scenario also omits two details: one, that a fair number of PS3s were sold in their first 18 months on the market because they were the cheapest Blu-Ray player you could buy; two, that many of the people who bought the Xbox 360 instead of the PS3 later bought a PS3 once it had reached a price point they were more comfortable with.  I'm not going to pretend that I have hard-and-fast figures for both, but they were significant.

4 hours ago, Lord Mushroom said:

Second, I am not sure $500 consoles have a higher per unit margin than a $50 streaming device, as consoles are often sold at a loss, especially at launch. This is not the case for Nintendo, but it is for Sony and Microsoft, who actually have streaming services.

Let's talk about the 'sold at a loss' phrase; it's tossed around a lot and really does not fully represent what happens with console pricing, especially over the lifetime of the console.

 

Firstly, and for the sake of argument: let's say that &lt;insert console manufacturer here&gt; sells a million consoles ("Console Q") on the market in the course of a year.  Console Q has a retail price of $499, but costs $799 to manufacture.  That represents a $300 million loss in the first year alone.  No company wants a big red $300 million deficit line item on their books for a new product.  They also don't want the Federal Trade Commission investigating them for potential dumping of products on the market at an unrealistically low price in order to gain unfair advantage in the market, which is something that could be a very real possibility in a scenario like this one.

 

Secondly: Console Q was a money pit even before the first one shipped.  However many billions of dollars were spent on R&D, marketing, manufacturing, support, backend infrastructure, contracts with studios and developers, and everything else that goes into its creation, they are not insignificant.  This is another good reason to avoid selling at a loss, because there's no reason to dig yourself even deeper into the hole.

 

What actually happens is this: serious development work on Console Q begins about one-third to halfway through its predecessor's life.  As part of that development work, a product lifecycle roadmap for it is established.  On that roadmap are certain key criteria that the console needs to meet in order to not be a financial disaster for its manufacturer.  One of those criteria is that somewhere in the 24- to 36-month range of it being on the market, it needs to have broken even on its pre-sale costs.  Months 36 to 72 (which we'll call EOL for Console Q) are typically where actual profit on the hardware would start to be realised.

 

And yes, there are other revenue streams related to Console Q that the manufacturer will have access to during those initial 24 to 36 months.  However, they're not going to pay all of the bills related to its development.  They'll absolutely help with them, sure, but they also have to be used for other expenditures during that timeframe.

 

There is greater detail to this, obviously, and I am very much giving the 50,000-foot view of things.  However, saying, "consoles are sold at a loss," is utterly inaccurate and paints a very poor picture of how these things actually work.

4 hours ago, Lord Mushroom said:

Right now a $50 box is probably not able to properly stream games at 1080p, which is what it needs to be able to do to be successful at the moment. But we are getting really close to that. Chromecast with Google TV ($49.99) can almost do this. In fact, Google promised it would by the end of the first half of 2021. They may not meet that deadline, but it is gonna happen real soon.

This comes back to my earlier point re: margins being lower on cheap hardware.  Note that all numbers below are notional but not outside the bounds of reality.

 

Back to Console Q for a moment: it sells for $500.  20% of that is net margin (not gross), so each unit is making $100 in profit for its manufacturer.  At that price point, 50 million are sold by month 24, which represents a total profit of $5,000,000,000 for the manufacturer.  At this time a lower-cost revised unit is introduced, retailing at $300.  The cost to manufacture Console Q2 also drops significantly (economies of scale, component count reduction, more cost-effective manufacturing location, etc.), so the manufacturer is able to realise a 35% net margin on sales of a further 75 million units.  At month 72, when Console QWERTY is being introduced as its replacement, Console Q2 has managed a profit of an additional $7,875,000,000, or a total of $12,875,000,000 in profit on sales of 125 million devices over a six-year product lifespan.

 

Even if the $50 streaming box has a 50% net margin and we assume that it a) never has a price drop over its lifetime and b) sells the same 125 million units as Console Q managed, that's still only $3,125,000,000 in profit, or about 25% of what Console Q achieved.  Unless it's managing to sell in the same numbers at a $200 price point (unlikely) or can sell four times as many units as Console Q did (also unlikely), there just isn't any money in it.
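The notional lifecycle numbers above can be sketched directly. All prices, margins, and unit counts are the hypothetical figures from the example, not real console economics:

```python
# Net profit model for the notional "Console Q" example above. Every figure
# here is the hypothetical one used in the example, not real industry data.

def net_profit(price_usd: float, net_margin: float, units: int) -> float:
    """Total profit at a given retail price, net margin, and unit volume."""
    return price_usd * net_margin * units

console_q = net_profit(500, 0.20, 50_000_000)    # $5.0B by month 24
console_q2 = net_profit(300, 0.35, 75_000_000)   # $7.875B more by month 72
console_total = console_q + console_q2           # $12.875B over the lifecycle

# The $50 streamer, even granted a generous 50% margin and identical volume:
streamer = net_profit(50, 0.50, 125_000_000)     # $3.125B

share = streamer / console_total                 # roughly a quarter
```

Which is the argument in miniature: at a $50 price point, even a fat percentage margin produces a thin absolute return.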

4 hours ago, Lord Mushroom said:

Of course, Stadia is not a very attractive streaming service at the moment (low value for money), so it won´t be a smash hit any time soon. At least not due to its gaming capabilities. But it shows that the technology to do it is soon here.

The technology to do this has been here for years.  It's gone by a few different names, but the most recent one is VDI, or Virtual Desktop Infrastructure.  Ignore the word 'desktop' in there; what we're really talking about is how the data is transported between the user and remote gaming instance.  Same idea, different application.


20 minutes ago, Lord Mushroom said:

There doesn´t have to be a data center on every street corner. Light travels 186 miles per millisecond. Distance is not a big problem.

But where's the nearest data center? Let's say it's 500 miles away. Well, the line from your house doesn't make a straight shot to the data center. It goes to your ISP, makes God knows how many hops before it gets there, and cable isn't laid without slack. A 500-mile distance could mean the data easily travels 1000-1500 miles of cable to get there; that's almost 10ms added just by light speed before even counting the latency added by the switches and network congestion. Let's say you are a rural customer and sign up for Starlink satellite internet. Their satellites are 680 miles up, which means your data has to travel 1360 miles just to get to your ISP, not counting the travel from your ISP to the data center.

 

It will very much depend on where you live how good the service is,  and there is no practical way to overcome the issues for everyone else.
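The propagation component both posts are reasoning about is easy to sketch. One wrinkle: light in glass fibre travels at roughly two-thirds of its vacuum speed, so the cable-path figures run a little higher than the free-space "186 miles per millisecond" suggests. The velocity factor below is an assumption, and real latency adds serialization, queueing, and per-hop switching delay on top:

```python
# One-way propagation delay over a given path length. This models only the
# speed-of-light component; switches, buffers, and congestion add more.

C_MILES_PER_MS = 186.3        # speed of light in vacuum, miles per millisecond
FIBER_VELOCITY_FACTOR = 0.67  # assumed for glass fibre (refractive index ~1.5)

def propagation_delay_ms(path_miles: float,
                         velocity_factor: float = FIBER_VELOCITY_FACTOR) -> float:
    """One-way propagation delay in milliseconds along the given path."""
    return path_miles / (C_MILES_PER_MS * velocity_factor)

# 500 miles as the crow flies, but 1500 miles of actual cable path:
one_way = propagation_delay_ms(1500)   # ~12 ms one way in fibre
round_trip = 2 * one_way               # ~24 ms before any per-hop delay
```

So even before a single switch or buffer is counted, the realistic cable path is already eating a meaningful chunk of a competitive game's latency budget in each direction.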

30 minutes ago, Lord Mushroom said:

I didn´t say data caps wouldn´t be a big problem for you.

I'm just one of many Comcast customers in the northeast, and many of us will be affected.  There was such an outcry that they postponed their plans until next year, hopefully when fewer people are home using the internet for work/school because of covid, but it's still coming and there isn't a better alternative around.

 

33 minutes ago, Lord Mushroom said:

If you offered to take 90 of those 120 channels away from those people, they would say "no, thanks". And if you offered 90 extra channels to someone who had only 30, he/she would happily accept.

Of course they will say that, because people always think they want "MORE".  That doesn't mean they'll use more when it's available.  More channels also lead to a reduction in quality.  Cable channels used to be dedicated to Music, or History, or Home and Garden, or Scifi.  Nowadays they run mostly reality programming, and when they find a semi-hit, whether it be Pawn Stars, Storage Wars, or Trading Spaces, they just run endless marathons of it until you are sick to death of the concept.  The same thing will happen to gaming on subscription services: they will all be competing for a limited pool of money, so they'll go for the gaming equivalent of the reality show, something cheap to produce and popular in the moment.  They'll all work from the same template until these services are the same wasteland that cable became.


13 hours ago, zzip said:

I would agree for a lot of people it's bragging rights..  however whether it's a real thing or all in your head, it's still a big deal for people to brag about what kinds of frame rates they get, brag about what kind of hardware is in their rig, argue about who has the better console.   These kinds of people are really really into the hardware side of things, and will never be satisfied by streaming.

Flight simming is a great example. The higher the framerate, the less shimmering and buzzing, or loss of sharpness at the horizon, especially during a roll. Ideally you would want to advance the image 1 pixel at a time, and a framerate of 60 fps means something onscreen can only move a distance of 60 pixels in one second before the image starts ghosting. That's rather pitiful. 240 fps is an improvement, and maybe even good considering this is consumer electronics we're talking about. Bonus points for FreeSync!

 

When it comes to FPS bragging rights, it's usually related to GPU/CPU power of a new build. It's never about image fidelity.
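The per-frame motion arithmetic behind the flight-sim point can be sketched quickly. The panning speed below is an assumed example, not from any particular sim:

```python
# How far an on-screen object jumps between consecutive frames, given its
# panning speed in pixels per second and the display framerate. For motion
# that doesn't visibly ghost, you want that jump close to one pixel.

def pixels_per_frame(speed_px_per_s: float, fps: float) -> float:
    return speed_px_per_s / fps

# Assumed example: a pan crossing a 1920-pixel-wide display in two seconds,
# i.e. 960 px/s.
assert pixels_per_frame(960, 60) == 16.0   # 16-pixel jumps per frame at 60 fps
assert pixels_per_frame(960, 240) == 4.0   # 4-pixel jumps at 240 fps
assert pixels_per_frame(60, 60) == 1.0     # 60 px/s is the 1-px/frame limit
```

Under the one-pixel-per-frame ideal, a 60 fps display only moves smoothly at 60 px/s; quadrupling the framerate quadruples the pan speed you can sustain before edges start to double.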


On 6/1/2021 at 11:16 AM, Lord Mushroom said:

You probably can´t beat the lag of a computer/console which plays internally, even though the remote computer would be faster than the local one. But it doesn´t need to beat it. It just has to be so close that the vast majority of players don´t notice a difference.

 

All the factors you have mentioned can be, and have been, improved upon. Even travel to and from the server can be improved by increasing the number of servers, which has been and is being done.

 

Most people who worry all day long about latency play with a wireless controller...

The 20% of people who make up 80% of the market will notice the lag.
