Return to Level1Techs.com

Theory - Shared GPU output in the cloud

#1

A scary thought occurred to me while I was trying to get a clearer picture of AMD's stance on hardware ray tracing after E3: all these cloud-based game streaming services have a huge potential advantage over traditional hardware in your home, and that advantage is quality. Which is kind of ironic, and somewhat mitigated by the quality lost to video encoding.

Consider for a moment the tile-based rendering employed by AMD's CrossFire. Add in cloud infrastructure, AI, and the potentially redundant portions of frames across game instances belonging to thousands, if not millions, of players. I would not doubt that these services already plan to somehow reduce GPU load by sharing the output of whatever is common between game instances. If AI lets them detect similar tiles in frames being rendered in real time, they could potentially offer things like ray tracing, anti-aliasing, shader effects, etc. at much better price/performance. In certain workloads they might even be able to prerender scenes (only things that are especially on-rails or turn-based) and cache them on CDNs to reduce network load as well. AIs could certainly be trained to look at the statistics of scene similarity on a game-by-game basis. And which games are brought to these platforms, and which are marketed heavily, might very well depend on how well they "scale" in this manner.
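To make the sharing idea concrete, here is a toy sketch of its trivial, exact-match form: key each tile by a descriptor of what it depicts, render on a cache miss, and serve the cached output on a hit. Everything here is hypothetical (the descriptor format, `TileCache`, `expensive_render` are my inventions); the genuinely hard part described above, fuzzy AI matching of merely *similar* tiles across instances, is exactly what this exact-hash sketch does not attempt.

```python
import hashlib

class TileCache:
    """Toy cross-instance cache: render a tile once, reuse its output
    for every player whose tile descriptor matches exactly."""

    def __init__(self):
        self._store = {}
        self.hits = 0
        self.misses = 0

    def get_or_render(self, descriptor: bytes, render):
        # Hash the descriptor (scene state, tile position, camera, etc.)
        # so identical tiles collapse to one rendering job.
        key = hashlib.sha256(descriptor).hexdigest()
        if key in self._store:
            self.hits += 1
        else:
            self.misses += 1
            self._store[key] = render(descriptor)
        return self._store[key]

def expensive_render(descriptor: bytes) -> bytes:
    # Stand-in for actually ray tracing the tile on a GPU.
    return b"pixels-for:" + descriptor

# Two players at the same moment of an on-rails cutscene produce the
# same descriptor, so the second request never touches the renderer.
cache = TileCache()
desc = b"scene42|tile(3,7)|cam(0,0,-5)|t=1200"
out1 = cache.get_or_render(desc, expensive_render)
out2 = cache.get_or_render(desc, expensive_render)
```

The design choice worth noting: the cache key is the *input* state (what the tile should show), not the rendered pixels, since hashing the output would require doing the rendering you are trying to skip.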

I would not be surprised if games running on these services have their graphics options neutered in an attempt to reduce costs. Time will tell; it's just my wild prediction. I understand there are numerous technical hurdles in doing something like this, but if the Googles of the world can save even 10% in compute costs, I think they will at least try to make it happen. The sacrifice is the user's options, and potentially a trend of developers focusing on making games that, again, "scale" well.

At the very least, all of this would apply to long cutscenes with little or no variation. Just food for thought. I'm sure there are holes in my prediction.


#2

I think that would be an awesome way for the technology to progress! It seems like a definite possibility, although I’m sure it would take years to get to a point where something like that would be stable.

It'd be interesting to see how the public takes the release of Google's Stadia and its online model. If people hate the latency/quality/experience, then I imagine the whole "gaming on the cloud" paradigm may be postponed for a few years until there's a more robust, low-latency infrastructure in place.

I kinda hope that cloud gaming does take off? It would be a cool technology to experience, and it would be cool to see our robot overlords' AI being used to streamline computer processing. On the other hand, I live rural and don't have a particularly fast OR reliable internet connection, so I won't be able to try cloud gaming for the time being. So hopefully it won't completely take over local gaming :crossed_fingers:


#3

I guess I didn't really make it clear in my OP that I've got mixed feelings about game streaming services myself. In a way, this is a bit of technical devil's advocate on how such a service might try to differentiate itself into your wallet. I'm already disappointed with myself for letting digital platforms, like Steam, become my norm. It's DRM, but it's DRM that makes things really convenient. When it launched, I thought myself a diehard advocate of physical media. Time has a funny way of chipping away at you. Part of me kinda wants these streaming services to fail, because if they succeed I might be lamenting that I fell into the trap of convenience in another ten years.

Certainly. I agree that our infrastructure, especially in rural areas, is not ready for this. But if they find this venture profitable, it's almost just a matter of time. One thing I'm even more skeptical about is MS's idea of streaming from the Xbox in your home to your mobile device. Hardly anyone has the upstream for that, and those who do would probably need some traffic shaping to keep enough bandwidth available for TCP ACKs and such. Well, except for the few with symmetrical connections in the 100-1000 Mbps range. And that's still ignoring the fact that ISPs would most likely throttle or block such traffic if it became common on residential connections.
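To put some back-of-envelope numbers on the upstream problem (the bits-per-pixel figure and the upload tier below are my own assumptions, not measurements):

```python
def required_mbps(width: int, height: int, fps: int,
                  bits_per_pixel: float = 0.1) -> float:
    """Rough H.264-style rule of thumb for a decent-quality game stream:
    bitrate ~ pixels per second * bits-per-pixel of compressed output."""
    return width * height * fps * bits_per_pixel / 1e6

# A 1080p60 stream at an assumed 0.1 compressed bits per pixel
# needs roughly 12.4 Mbps, which already exceeds an assumed
# typical residential cable upload tier of ~10 Mbps.
stream = required_mbps(1920, 1080, 60)
typical_upstream = 10.0  # Mbps (assumption)
print(f"need ~{stream:.1f} Mbps up; typical tier: {typical_upstream} Mbps")
```

The point of the arithmetic: even before counting game traffic, voice chat, or anyone else in the house, the video stream alone can saturate a common asymmetric connection's entire upload.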

However, I think applying AI to the task above would be relatively painless for them compared to the infrastructure issue. They are already using AI to do comparisons on satellite imagery for maps; isn't this similar? Of course, it might have some unintended consequences if it's not tuned right.
