What is the future of the PC? (PC Futurology)

Hey, this has been making my head itch for a bit and I figure "why not post about it?". I know there's /r/futurology, but they mostly talk about batteries, clean energy, biotech, transhumanism and maybe quantum computing (which has no known application for games). So, here we go.

-Monitors won't get any better than 8K, 480Hz, FreeSync, 10-bit, OLED, HDR. "Good enough" will stay good enough, and they may not even get that far unless manufacturers want a "going to 11" gimmick. A panel like that would look so good it would make me not miss CRT technology, and this is coming from a guy who loves CRTs. I can even see a marketing program eventually happening where people get a free monitor for trading in their FW-900, and they'll call it the "FW-900 Killer". 480Hz would also be perfect for legacy 24fps, 30fps and 60fps passive content because there would be no frame interpolation (though PAL 25 and 50 would need minor interpolation, and it may not even be noticeable).
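
Here's a quick check of why 480Hz works out that way (just a sketch; the 480Hz number is the spec above, the rest is simple divisibility):

```python
# Which legacy frame rates divide evenly into a 480 Hz refresh?
# An even multiple means each source frame is held for a whole number of
# refreshes, so no interpolation or uneven judder is needed.
REFRESH_HZ = 480

for fps in (24, 25, 30, 50, 60):
    repeats = REFRESH_HZ / fps
    fits = REFRESH_HZ % fps == 0
    verdict = "even, no interpolation" if fits else "uneven, needs pulldown/interpolation"
    print(f"{fps:>2} fps -> {repeats:5.2f} refreshes per frame ({verdict})")
```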

-Being stuck on silicon for years without die shrinks will push us toward reconfigurable computing with FPGAs. We're already kind of there with OpenCL, but it's not massively adopted and there's really nothing for us power users yet. I think we might see that change for people who do desktop computing (yes, desktops will still exist; there will always be that 90W CPU market, pros need raw performance), and we might even see FPGAs embedded in CPU dies themselves, or as a co-processor that might fit in something like an M.2 slot.
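
For a sense of what that programming model already looks like from the software side, here's a minimal OpenCL vector-add sketch (assuming the pyopencl package and a working OpenCL runtime; this just runs on whatever GPU/CPU device is available, but the same kernel source is the kind of thing an FPGA OpenCL toolchain would compile down to gates):

```python
import numpy as np
import pyopencl as cl

# Two input arrays to add on whatever OpenCL device is available.
a = np.random.rand(50_000).astype(np.float32)
b = np.random.rand(50_000).astype(np.float32)

ctx = cl.create_some_context()          # picks an available GPU/CPU device
queue = cl.CommandQueue(ctx)

mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

# The kernel itself is plain OpenCL C; an FPGA flow would synthesize this.
program = cl.Program(ctx, """
__kernel void vadd(__global const float *a,
                   __global const float *b,
                   __global float *out) {
    int gid = get_global_id(0);
    out[gid] = a[gid] + b[gid];
}
""").build()

program.vadd(queue, a.shape, None, a_buf, b_buf, out_buf)

out = np.empty_like(a)
cl.enqueue_copy(queue, out, out_buf)
assert np.allclose(out, a + b)
```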

-CPUs and GPUs not made of silicon, with the best candidate being graphene, allowing chips to reach clock speeds close to 1THz.

-The Mill CPU architecture, which is claimed to be about ten times as fast as x86. It's supposedly so fast that, if it ever got off the ground, your legacy x86 apps would run slightly faster in an emulator than on native x86. Couple that with graphene and you've got a CPU roughly 2,500x as powerful with native ports, and that's not even counting how it would interface with OpenCL FPGAs, maximizing the potential of non-quantum computing. That would be the perfect opportunity to kill off Windows. I don't care what the Linux enthusiasts say: to kill off Windows, you have to kill off x86 first, and to replace a platform, the replacement can't be almost as fast, like Linux is now for some things (way slower for most). It also can't be just as fast but with more features, like OS/2, where nobody made native applications because it was only just as fast and all the OS/2 users ran in compatibility mode (like Linux users using Wine for DX9 and older games). It needs to run emulated stuff at the native speed of the previous platform while native software on the replacement is way faster, like what Apple did with DOS emulation: when you emulated DOS on an early PPC Mac, PPC had so much processing headroom it ran as fast as a 386, while native stuff was as fast as a Pentium II.

Anyway, a graphene Mill CPU with a couple of graphene FPGAs would be the machine of the gods, and yes, it would run Crysis, over 250x as fast without a native port. Crysis 1 would be the oldest game that would somewhat scale because it has a 64-bit binary, and you could probably put a realtime raytracing ENB on it. If you're modding Skyrim even now (non-Special Edition), it will crash as soon as you go over 8GB of RAM (4GB from Skyrim with the 4GB patch and 4GB from the ENB). The one problem with Mill is they're short on funds, so most of their money goes into patenting their stuff to protect themselves from lawsuits.
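
For what it's worth, here's the back-of-the-envelope behind those multipliers (my assumptions: a ~4GHz silicon baseline, the ~1THz graphene figure above, and Mill's claimed ~10x architectural win; none of this is a benchmark):

```python
# Rough back-of-the-envelope for the speedup figures above.
# Assumptions (not from any benchmark): ~4 GHz silicon baseline,
# ~1 THz graphene clock, and the claimed ~10x architectural win for Mill.
silicon_clock_ghz = 4
graphene_clock_ghz = 1000
mill_arch_factor = 10

clock_speedup = graphene_clock_ghz / silicon_clock_ghz   # ~250x from the clock alone
native_speedup = clock_speedup * mill_arch_factor        # ~2,500x for a native Mill port

print(f"Clock-only speedup (emulated/legacy code): ~{clock_speedup:.0f}x")
print(f"Native Mill port on graphene: ~{native_speedup:,.0f}x")
```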

-The patents on Intel's Thunderbolt expiring, allowing third-party CPU makers to use it (15 years from now). It would be kinda cool to see a 200GHz Raspberry Pi 11 with a Thunderbolt port :D But who knows, by then SoftBank may have acquired Intel.

-Cheap custom CPUs. Once we reach the limits of graphene, I see custom graphene lithography becoming a commodity. Today (pulling numbers out of my ass) you could get a custom ASIC made, but you'd need to spend at least $10M, you'd get a 45nm or even a 90nm process (because everybody other than the big boys gets hand-me-downs), and you'd get a million chips with no option to scale down to, say, 10k chips for $100,000. Once lithography plateaus, you'll have a plethora of lithography vendors, lowering the price of making an ASIC and letting you scale down quantities while still being on the same manufacturing process as the big boys. This would make open source CPUs and GPUs that don't suck financially feasible, or an add-in card or external accelerator for whatever you want: compression, encoding for some future codec, AI, realistic voice synthesis, cryptocurrency mining, encryption, or encryption cracking for "penetration testing". Imagine a Kickstarter for homebrew chips actually being feasible: a project so niche its pitch is "New Graphene PowerPC SBC that's AmigaOS Compatible" could raise only $100,000 and still deliver, kinda like how DragonBox is getting the Pyra made for about a couple million today. Once graphene lithography is commoditized, it may even be cheaper to just buy a lithography machine if you need $10M worth of orders (maybe even less). A rough per-chip cost sketch is below.
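
To make the scaling argument concrete, here's a toy amortization sketch using the made-up numbers above (the $5 per-unit manufacturing cost is my own placeholder):

```python
# Hypothetical numbers from the post above, not real foundry pricing.
# Up-front NRE (masks, tooling) dominates small runs; commodity lithography
# mostly shrinks the NRE, which is what makes small runs viable.
def cost_per_chip(nre_usd, unit_cost_usd, quantity):
    return (nre_usd + unit_cost_usd * quantity) / quantity

# Today-ish: ~$10M up front, so a 10k run is absurd per chip.
print(cost_per_chip(10_000_000, 5, 1_000_000))   # ~$15 per chip at 1M units
print(cost_per_chip(10_000_000, 5, 10_000))      # ~$1,005 per chip at 10k units

# Commoditized-lithography scenario: far lower NRE, small runs become sane.
print(cost_per_chip(100_000, 5, 10_000))         # ~$15 per chip at 10k units
```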

-The plateau of transistor counts with graphene and the inexpensive production of high-end ASICs will also put an end to consoles as we know them (if mobile isn't already doing that). The console fans of today who are collectors, buy consoles to collect games on physical media, and are scared that the 8th gen will be the last gen with physical media might get together and make one last open-spec device, like the 3DO concept of single-format gaming, but be successful this time around because there won't be a "next gen" after that.

But what would you guys do if your computer was 250x to 2,500x+ as powerful, with ways to go even faster through software optimized for multicore, FPGAs and ASICs made out of graphene?

You PC bro?

http://i.imgur.com/7rw1Qwb.gifv

3 Likes

I think the future is kind of bleak. We've seen from the success of the iPhone, the most popular computer of all time, that general purpose computing isn't something that appeals to the masses. Intellectual (imaginary) property laws, Balkanization, the encroachment of locked-down black boxes (formerly TPM, but now Intel Management Engine, UEFI/Secure Boot, etc.), and China's hegemony as producer of most of the world's electronics all contribute to cornering free, open general purpose computing. While there have been some advancements in freedom, like the Raspberry Pi, it's hard to tell how they will fare in the future. We live in a world where IP rights holders have enough governmental clout to start an international dragnet to track down people who dare violate laws that don't even apply in the country they live in. Where using a VPN could cost you half a million dollars and put you in jail. Where merely disagreeing with someone in a tweet has landed many people in jail, or gotten their platform leveraged against them for censorship.

1 Like

Like I said, I think there will always be 90W+ CPUs, and we could have an open source CPU now; it's just that nobody wants to use 180nm lithography, so nobody makes them, because you'd get far fewer transistors. The big boys always get the new hotness, leaving everyone else with hand-me-down manufacturing, but once 7nm graphene lithography has been the standard for years, everybody will get the same shit, and that could make open source CPUs and boards with free firmware viable.

Also, patents expire after 20 years. I'll grant you the near future doesn't look so hot, say the next 10 years, but 15+ years out looks promising.

So how far into the future are we talking here?

Immediate future (say 10 years?): not much will change as far as desktop PC use goes. It might decline a bit, but it will be replaced by gaming ultrabooks or something of the sort (see the Razer Blade Stealth).

Beyond 10 or 20 years it is impossible to say or even speculate much. Technology advances exponentially (there is math on this somewhere), which means it will be really, really hard to predict anything with any sort of accuracy. Nine years ago, I'm sure someone could have predicted the invention of the iPhone, but I suspect no one could have predicted its impact. So whatever comes out nine years from now may completely change the course of technology the way the iPhone did. For instance, HUDs and GUIs implanted in our eyes: quite possible considering the exponential growth of technology, or perhaps we just skip the eye and go directly to the brain for this function.

Again, simply put: we can't know. Unless of course you subscribe to the theory [disclaimer: sidetracked thought ahead] that life is on an existing timeline or plane and anything that will happen has already happened; we just happen to be at spot x on this timeline, and the only way to view the future is to leave the timeline and observe it from a distance/dimension. There's also a theory somewhere along these lines that sprouts from experiments where observation can change the outcome.

You're thinking of gadgetry; I'm talking about the stuff under the hood in the desktop or workstation of the future, for the market that still buys 90W+ CPUs. Just because we have smaller devices now doesn't mean there won't be a market for people who want more raw power. It's like saying "Who would want to shoot a 32-bit HDR photo on a DSLR when an iPhone can take pictures?". I'll tell you who: people who want more than the basics, that's who.

Not anymoore, Moore's Law is dead. Hell, it wasn't even a real scientific "law" when it was new. Scientific laws should be reserved for physics and maths; that's why there's no "law" of evolution, and people sometimes call things laws in computing because it makes their findings sound more profound. Anyway, transistors aren't getting much smaller. You'd be lucky to get silicon at 7nm, and after that nothing will shrink because there's a limit on how small you can make them; if you looked at a modern CPU under an electron microscope, you could literally count the individual atoms in each transistor. We've already got 3D transistors, so the next increase in performance comes from parallel workloads and reconfigurable and specialized computing, and after that there will be no leap for years until we can make CPUs out of graphene, or at least indium gallium arsenide, and more efficient CPU architectures (though I'm not sure Mill will ever take off).

I don't know what the specs or materials are gonna be, nor in which direction innovation pushes things, but I know what I'd want to see.
I'd want to get the tower away from my desk or immediate proximity to save space and cut noise, and I want multiple monitors without a desk full of monitors. Solution? Have the PC tucked away in the closet or wherever as a stream box, plus a VR headset that provides, apart from gaming, a virtual workspace with all the windows and whatnot floating around me.

Closets don't have enough airflow. If you have accidental damage protection, put your PC in the basement and use a really long Thunderbolt cable.

Eyeball-actuated interfaces: the targeting reticle will follow what your eye looks at and lock on.
Software that uses the massive horsepower we already have (i.e. going wide).
Specialized input devices yet to be developed. How old is the QWERTY keyboard anyhow?

Whoever wrote that is hyping up Mill. Since Mill lacks a working compiler, there are no conventional benchmarks; most of the speed claims are based on how fast it performs sets of low-level math functions and on what it should be able to do according to the models.

Like I said, I'm not sure Mill will ever get off the ground.

As someone who has spent quite a lot of time with graphene, I can promise you that any sort of graphene advances reaching the market will not happen in our lifetime. Guarantee it.
Software, software, software.

Cellular nano-physics!

Fuck Bio

If I can have dual 28-core PPC desktops in 40 years with Linux and AMDGPU, and 4 of whatever is equivalent to the Nano now, I will never need anything ever again.

Most of the future advancement is going to be about pushing GPUs to the limit. For VR to look real you need 8k color megapixels and about 550 black-and-white megapixels. We should also see a push to 230fps at some point as well; that's the limit of our eyes. And a good pair of headphones/speakers can already produce the full range of our hearing.
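
For a sense of the scale that implies, here's the raw throughput from those figures (the 550 megapixel and 230fps numbers are the claims above; 30 bits per pixel for uncompressed 10-bit RGB is my own assumption):

```python
# Rough throughput implied by the figures above.
megapixels = 550       # claimed black-and-white resolution of the eye
fps = 230              # claimed frame-rate limit of the eye
bits_per_pixel = 30    # assumed: 10-bit per channel RGB, uncompressed

pixels_per_second = megapixels * 1e6 * fps
raw_gbps = pixels_per_second * bits_per_pixel / 1e9

print(f"{pixels_per_second / 1e9:.1f} gigapixels/s, ~{raw_gbps:,.0f} Gbit/s uncompressed")
```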

All that's left will be converting brainwaves to input, enabling people to act at the speed of thought rather than the speed of fingers. All of that should happen in the next 50+ years. After that, I don't see any task that will need the raw power of a desktop.

Personally, once I have a system that can emulate a PS3 at 8K with 16x AA, I think a system with those specs will easily last me 10+ years. Games going forward will no doubt be on the cloud, meaning not worth buying, since that basically means a game won't exist once the company drops support for it. Not to mention a CPU with that much power should make short work of encoding video down to a portable size so I can watch movies while traveling.

Monitor? lol no, Just wire up the optic nerve, simple

But before that, autoluminescent jellyfish derived film over the eyeballs mmk

Who's going to want something invasive?

They want what we tell them to want

Porn.