A scary thought occurred to me while trying to get a clearer picture of AMD’s stance on hardware ray tracing after E3: cloud-based game streaming services may have a huge potential quality advantage over traditional hardware in your home. Which is kind of ironic, and somewhat mitigated by the quality lost to video encoding. Consider for a moment the tile-based rendering employed by AMD’s CrossFire. Add in cloud infrastructure, AI, and the redundant portions of frames across game instances from thousands if not millions of players. I would not doubt these services already plan to reduce GPU load by sharing rendered output that is common between instances. If AI lets them detect similar tiles in frames being rendered in real time, they could potentially offer things like ray tracing, anti-aliasing, shader effects, etc. at much better price/performance. For certain workloads (especially anything on rails or turn-based) they might even prerender scenes and cache them on CDNs to reduce network load as well. AIs could certainly be trained to analyze the statistics of scene similarity on a game-by-game basis. And which games are brought to these platforms, and which are marketed heavily, might very well come down to how well they “scale” in this manner.
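To make the idea concrete, here is a minimal sketch of what tile-level sharing could look like, assuming a hypothetical shared cache keyed by a digest of whatever fully determines a tile’s pixels (scene state, camera, tile position, quality settings). None of these names reflect any real service’s API; the “renderer” is just a stand-in for the expensive GPU work.

```python
import hashlib
from typing import Dict, Tuple

# Hypothetical shared tile cache. The key is a digest of everything that
# determines a tile's final pixels; if two players' frames contain a tile
# with identical inputs, the second request is served from cache instead
# of being re-rendered.
_tile_cache: Dict[str, bytes] = {}


def tile_key(scene_id: str, camera: Tuple[float, float, float],
             tile_coords: Tuple[int, int], quality: str) -> str:
    """Digest of the inputs that fully determine a tile's rendered output."""
    raw = f"{scene_id}|{camera}|{tile_coords}|{quality}".encode()
    return hashlib.sha256(raw).hexdigest()


def render_tile(scene_id: str, camera: Tuple[float, float, float],
                tile_coords: Tuple[int, int], quality: str) -> bytes:
    """Placeholder for the expensive GPU work (ray tracing, AA, shaders)."""
    return f"pixels({scene_id},{tile_coords},{quality})".encode()


def get_tile(scene_id: str, camera: Tuple[float, float, float],
             tile_coords: Tuple[int, int], quality: str) -> bytes:
    """Return a rendered tile, reusing shared work when inputs match exactly."""
    key = tile_key(scene_id, camera, tile_coords, quality)
    if key not in _tile_cache:
        _tile_cache[key] = render_tile(scene_id, camera, tile_coords, quality)
    return _tile_cache[key]


# Two players watching the same on-rails cutscene hit the same cache entries,
# so the second player's 4x4-tile frame costs (almost) no extra GPU time.
frame_a = [get_tile("cutscene_07", (0.0, 1.0, 5.0), (x, y), "high")
           for x in range(4) for y in range(4)]
frame_b = [get_tile("cutscene_07", (0.0, 1.0, 5.0), (x, y), "high")
           for x in range(4) for y in range(4)]
assert frame_a == frame_b and len(_tile_cache) == 16
```

Of course the hard part is the exact-match assumption: in real gameplay the camera and scene state rarely line up bit-for-bit between players, which is where the “AI detects similar tiles” speculation comes in.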
I would not be surprised if games running on these services have their graphics options neutered in an attempt to reduce costs. Time will tell; this is just my wild prediction. I understand there are numerous technical hurdles to doing something like this, but if the Googles of the world can save even 10% in compute costs, I think they will at least try to make it happen. The sacrifice is the user’s options, and potentially a trend of developers focusing on making games that, again, “scale” well.
At the very least, all of this would apply to long cutscenes with little or no variation. Just food for thought. I’m sure there are holes in my prediction.