Hey, this has been making my head itch for a bit and I figured "why not post about it?". I know there's /r/futurology, but they mostly talk about batteries, clean energy, biotech, transhumanism, and maybe quantum computing (which has no known application for games). So, here we go.
-Monitors won't get any better than 8K, 480Hz, FreeSync, 10-bit, OLED, HDR. "Good enough" will stay good enough, and they may not even get that good unless manufacturers want a "going to 11" gimmick. A monitor like that would look so good it would make me not miss CRT technology, and this is coming from a guy who loves CRTs. I can even see a marketing program eventually happening where people get a free monitor for trading in their FW-900, billed as the "FW-900 Killer". 480Hz would also make it perfect for legacy 24fps, 30fps, and 60fps passive content because there would be no frame interpolation (though PAL 25 and 50 would need minor interpolation, and it may not even be noticeable).
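The 480Hz claim comes down to simple divisibility: a refresh rate shows legacy content cleanly when it's an integer multiple of the source frame rate. A quick sketch:

```python
# Check which legacy frame rates divide evenly into a 480Hz refresh rate.
# An even divisor means every source frame is displayed a whole number of
# times, so no interpolation (and no judder) is needed.
for fps in (24, 25, 30, 50, 60):
    repeats, remainder = divmod(480, fps)
    if remainder == 0:
        print(f"{fps} fps: each frame shown {repeats}x -> clean")
    else:
        print(f"{fps} fps: doesn't divide evenly -> needs interpolation")
```

As expected, 24, 30, and 60 come out clean, while PAL's 25 and 50 leave a remainder, which is why those would still need the minor interpolation mentioned above.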
-Being stuck on silicon for years without die shrinks, pushing us toward reconfigurable computing with FPGAs. We're already kinda there with OpenCL, but it's not massively adopted and there's really nothing for us power users yet. I think that will change for desktop computing (yes, desktops will still exist; there will always be that 90W CPU market because pros need raw performance). We might even see FPGAs embedded in CPU dies themselves, or as a co-processor, maybe in something like an M.2 slot.
-CPUs & GPUs not made of silicon, with the best candidate being graphene, allowing chips reaching close to 1THz.
-The Mill CPU architecture, which claims to be ten times as fast as x86. It's so fast that if it ever got off the ground, your legacy x86 apps would run slightly faster in an emulator than on native x86. Couple that with graphene and you've got a CPU 2,500x as powerful with native ports, and that's not even counting how it would interface with OpenCL FPGAs, maximizing the potential of non-quantum computing. That would also be the perfect opportunity to kill off Windows. I don't care what the Linux enthusiasts say: to kill off Windows, you have to kill off x86 first, and to replace a platform, the replacement can't be merely almost as fast, like Linux is now for some things (way slower for most). It also can't be just as fast but with more features, like OS/2, where nobody made native applications because compatibility mode was just as fast and all the OS/2 users ran Windows apps in it (like Linux users use Wine for DX9 and older games). The replacement needs to run emulated stuff at the native speed of the previous platform while running its own native code way faster, like what Apple did with DOS emulation: when you emulated DOS on an early PPC Mac, the PPC had so much headroom that emulated DOS ran as fast as a 386 while native stuff ran as fast as a Pentium II. Anyway, a graphene Mill CPU with a couple of graphene FPGAs would be the machine of the gods, and yes, it would run Crysis, over 250x as fast without a native port. Crysis 1 is about the oldest game that would somewhat scale, because it has a 64-bit binary, and you could probably put in a realtime raytracing ENB. If you're modding Skyrim even now (non-Special Edition), it will crash as soon as you go over 8GB of RAM (4GB from Skyrim with the 4GB patch and 4GB from the ENB). The one problem with Mill is that they're short on funds, so most of their money goes into patenting their stuff to protect themselves from lawsuits.
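For what it's worth, the 250x and 2,500x figures fall out of back-of-the-envelope math. Assuming (my numbers, not Mill's) a ~4GHz silicon x86 chip today, a ~1THz graphene part, and the Mill's claimed ~10x per-clock advantage:

```python
# Back-of-the-envelope math behind the 250x / 2,500x figures.
# All three inputs are speculative assumptions, not measured numbers.
silicon_clock_ghz = 4        # a typical x86 chip today
graphene_clock_ghz = 1000    # ~1THz graphene target from the post
mill_per_clock_factor = 10   # Mill's claimed advantage over x86

clock_speedup = graphene_clock_ghz / silicon_clock_ghz   # 250x from clock alone
native_speedup = clock_speedup * mill_per_clock_factor   # 2,500x with a native port

print(f"x86 code emulated on a graphene Mill: ~{clock_speedup:.0f}x")
print(f"Native Mill port: ~{native_speedup:.0f}x")
```

So emulated x86 rides the 250x clock bump alone, and a native Mill port stacks the 10x architectural gain on top of it.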
-Intel's Thunderbolt patents expiring, allowing third-party CPU makers to use it (15 years from now). It would be kinda cool to see a 200GHz Raspberry Pi 11 with a Thunderbolt port :D But who knows, by then SoftBank may have acquired Intel.
-Cheap custom CPUs. Once we reach the limits of graphene, custom graphene lithography will be a commodity. Today (pulling numbers out of my ass), you can get a custom ASIC made, but you need to spend at least $10M, you'll get a 45nm or even a 90nm process (because everybody else gets hand-me-downs), and you'll get a million chips with no option to scale down and get 10k chips for $100,000. Once lithography plateaus, you'll have a plethora of lithography vendors, lowering the price of making an ASIC and letting you scale down quantities while staying on the same manufacturing process as the big boys. This would make open source CPUs and GPUs that don't suck financially feasible, or an add-in card or external accelerator for whatever you want: compression, encoding for some future codec, AI, realistic voice synthesis, cryptocurrency mining, encryption, and encryption cracking for "penetration testing". Imagine a Kickstarter to make homebrew chips actually being feasible. A project could be so niche that its pitch is "New Graphene PowerPC SBC that's Amiga OS Compatible", raise only $100,000, and still deliver. Kinda like how DragonBox is getting the Pyra made for about a couple million today. Once graphene lithography is commoditized, it may even be cheaper to just buy a lithography machine if you need $10M worth of orders (maybe even less).
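One thing worth noticing in those (admittedly made-up) numbers: the per-chip price is identical either way. What commodity lithography would change is the minimum order, not the unit cost:

```python
# Per-chip cost at the two (hypothetical) order sizes from the post.
# Same $10/chip in both cases; the barrier today is the minimum run,
# not the unit price.
big_run_cost, big_run_chips = 10_000_000, 1_000_000
small_run_cost, small_run_chips = 100_000, 10_000

print(f"Big run:   ${big_run_cost / big_run_chips:.2f} per chip")
print(f"Small run: ${small_run_cost / small_run_chips:.2f} per chip")
```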
-The plateau of transistor counts with graphene and the inexpensive production of high-end ASICs will also put an end to consoles as we know them (if mobile isn't already doing that). Today's console fans who collect physical media and are scared that the 8th gen will be the last with physical releases might get together and make one last open-spec device, like the 3DO concept of single-format gaming, but successful this time around, because there won't be a "next gen" after that.
But what would you guys do if your computer was 250x to 2,500x+ as powerful, with ways to go even faster through software optimized for multicore, FPGAs, and ASICs made out of graphene?