So it seems like CPU processing isn't getting much better, but GPUs are. Does this mean we're heading into a CPU bottleneck era in the near future?

So yeah, what do you guys think?

CPUs have only become less and less important for games, and that's probably going to continue. These days a CPU is only really a bottleneck if you're trying to do 144 Hz gaming; any modern quad-core Intel CPU running at about 3 GHz should be fine for a while.
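Just to put a rough number on the 144 Hz point: the per-frame budget the CPU has to fit its work into (game logic, draw-call submission, etc.) shrinks fast as the refresh rate goes up. A quick back-of-envelope sketch in C:

```c
/* Frame-time budget per refresh rate: the CPU's share of each frame has to
   finish inside this window, or the CPU becomes the bottleneck. */
#include <stdio.h>

int main(void) {
    const double rates_hz[] = { 60.0, 120.0, 144.0 };
    for (int i = 0; i < 3; ++i) {
        double budget_ms = 1000.0 / rates_hz[i];
        printf("%3.0f Hz -> %.2f ms per frame\n", rates_hz[i], budget_ms);
    }
    return 0;   /* 60 Hz ~16.67 ms, 120 Hz ~8.33 ms, 144 Hz ~6.94 ms */
}
```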


Well, I guess DX12 will help too, right?

Hopefully Vulkan rather than DX12.


What's the advantage of Vulkan over DX12?

I know about both, but I only know the details of DX12.

It's an open standard, so it'll work better on Linux. Also, DX12 is Windows 10 exclusive for now, if I'm not mistaken.


Oh yeah, and I guess a lot of people don't like Windows 10 because of the "spying". I like it for the productivity and stuff, and you know, all in all, some of that "spying" helps me with search results and such; it's almost like Windows is "learning" how I like to use my computer, and so far I like that. I have Linux too, but only for 64-bit KSP lol

I'm sure someone could make a program that does all that from within the OS rather than going to the cloud, or have a personal server handle it all.

Worst-case scenario, someone gets access to your PC and starts doing illicit things without bothering to hide anything, while all that info gets sent back to Microsoft.

Probably, but I don't mind if my data is in the cloud. I do a lot of Windows refreshes because I like the feel of a fresh install, and since my data is tied to my Microsoft account it's easy: I just sign in and all my data is back.

But a flash drive is probably more secure lol. I keep everything of mine compressed and backed up locally as needed, no internet connection required.

Yeah, but think of the average user.

The average user could just do that without compression and wait a good amount longer for file transfers. Still, it's not exactly good for anyone to have unencrypted stuff floating around in the cloud.

It wouldn't be so bad if you could turn the stuff off within the OS itself. Last I heard, it was pretty well demonstrated that all the methods people thought were turning the various tracking and data gathering off, even ones that would spam the event logs with errors about unreachable servers (like editing the hosts file for the servers info was being sent to), were still not stopping huge amounts of it (they hardcoded the IP addresses...). Yeah, you can probably still block it at the router level (assuming they haven't figured out some way around that by piggybacking on traffic that can get out), but it really should never have to come to that.

But if you want such optimized features from tracking, you might as well just use Chrome OS or some other web/cloud-based OS (which is the direction Windows 10 is going; it's just at a hybrid state for now).

Back to the original topic, there was a Computerphile video recently about CPU vs GPU and some of the new work GPUs are taking on that used to be handled by CPUs: https://www.youtube.com/watch?v=_cyVDoyI6NE

It only appears that way on the surface. GPUs are getting performance boosts mostly from memory interface changes, but the shaders are improving at about the same rate as CPU dies, since both face the same process limitations.
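To put that in perspective, here's a rough sketch of how much headroom the memory interface alone buys: bandwidth is just bus width times per-pin data rate. The figures below are the commonly quoted ones for a 256-bit GDDR5 card versus a 4-stack HBM card like the Fury X; treat them as ballpark numbers.

```c
/* Back-of-envelope memory bandwidth: (bus width in bits / 8) * per-pin data rate. */
#include <stdio.h>

static double bandwidth_gbs(int bus_bits, double gbps_per_pin) {
    return bus_bits / 8.0 * gbps_per_pin;   /* GB/s */
}

int main(void) {
    printf("256-bit GDDR5 @ 7 Gbps/pin : %.0f GB/s\n", bandwidth_gbs(256, 7.0));   /* ~224 */
    printf("4096-bit HBM  @ 1 Gbps/pin : %.0f GB/s\n", bandwidth_gbs(4096, 1.0));  /* ~512 */
    return 0;
}
```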

Odds are that once we reach 8 nm, both will stop seeing big gains while we transition over to graphene or some other material.

Why don't they make "bigger" CPU cores instead of those tiny ones? Once they reach 8 nm, they could make the dies bigger for more transistors and higher clocks.

Heat. IBM did make a massive CPU recently, though.

Well, they can keep refining the process until they make decently cool huge cores; I mean, the transistor count seems to stay the same. Also, wouldn't a "bigger" CPU core spread the heat out more than a tiny one?

We still have GPGPU on the table, and as it develops, like in the consoles, the CPU will become less important. For now we'll see GPGPU make use of the iGPU built into Intel chips, and to the same extent AMD APUs, allowing for more calculations without hampering the dedicated GPU. This is basically what PhysX was, except PhysX was almost exclusively cosmetic.
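For a rough idea of what that offloading looks like, here's a minimal OpenCL sketch that pushes a toy per-particle update onto whatever GPU the runtime finds, which can be an Intel iGPU or the graphics half of an AMD APU if the drivers are installed. The kernel name, array size, and "physics" are made up for illustration, and error checking is mostly skipped to keep it short.

```c
/* Minimal GPGPU sketch: offload a toy per-particle integration step via OpenCL. */
#include <stdio.h>
#include <stdlib.h>
#include <CL/cl.h>

static const char *kSrc =
    "__kernel void integrate(__global float *pos, __global const float *vel,\n"
    "                        const float dt) {\n"
    "    size_t i = get_global_id(0);\n"
    "    pos[i] += vel[i] * dt;   /* toy per-particle update */\n"
    "}\n";

int main(void) {
    enum { N = 1 << 16 };
    float *pos = malloc(N * sizeof(float));
    float *vel = malloc(N * sizeof(float));
    for (int i = 0; i < N; ++i) { pos[i] = 0.0f; vel[i] = 1.0f; }

    cl_platform_id plat;
    cl_device_id dev;
    clGetPlatformIDs(1, &plat, NULL);
    /* CL_DEVICE_TYPE_GPU picks a GPU device; on an Intel/AMD system this can
       be the integrated GPU rather than the dedicated card. */
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);

    cl_int err;
    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, &err);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, &err);

    cl_mem dpos = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                 N * sizeof(float), pos, &err);
    cl_mem dvel = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                                 N * sizeof(float), vel, &err);

    cl_program prog = clCreateProgramWithSource(ctx, 1, &kSrc, NULL, &err);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "integrate", &err);

    float dt = 0.016f;   /* roughly one 60 fps frame */
    clSetKernelArg(k, 0, sizeof(cl_mem), &dpos);
    clSetKernelArg(k, 1, sizeof(cl_mem), &dvel);
    clSetKernelArg(k, 2, sizeof(float), &dt);

    size_t global = N;
    clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, dpos, CL_TRUE, 0, N * sizeof(float), pos, 0, NULL, NULL);

    printf("pos[0] after one step: %f\n", pos[0]);   /* expect 0.016 */

    clReleaseMemObject(dpos); clReleaseMemObject(dvel);
    clReleaseKernel(k); clReleaseProgram(prog);
    clReleaseCommandQueue(q); clReleaseContext(ctx);
    free(pos); free(vel);
    return 0;
}
```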

Fact is, consoles are proof that we can make better use of what we already have, and while the limited improvement on the CPU side is due to the lack of competition from AMD, it doesn't mean we'll see a slowdown in game performance. If you ignore CPU-bound games and processing outside of games, a lot can be done with a simple FX-6300. While it's not recommended to build a modern gaming system around a four-year-old CPU, there's plenty of evidence supporting the notion that we can do a lot better with what we have.


It would also need more power, but what do you mean by huge cores exactly? Just lots of transistors per core? Because that's kind of a waste; you probably hit bottlenecks at a certain point.
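On the power point: the usual first-order approximation for dynamic power is P ≈ C·V²·f (switched capacitance times voltage squared times clock). So piling more transistors into a core at the same voltage and clock scales the heat roughly linearly with the extra logic, and pushing the clock up usually needs more voltage too, which hurts quadratically. A tiny sketch with made-up ballpark numbers:

```c
/* First-order dynamic power model: P = C * V^2 * f.
   All values below are made-up ballpark numbers, just to show the scaling. */
#include <stdio.h>

static double dyn_power_w(double switched_cap_f, double volts, double hz) {
    return switched_cap_f * volts * volts * hz;
}

int main(void) {
    double base   = dyn_power_w(1e-9, 1.0, 3.0e9);   /* baseline core            */
    double bigger = dyn_power_w(2e-9, 1.0, 3.0e9);   /* 2x the switching logic   */
    double faster = dyn_power_w(2e-9, 1.2, 4.0e9);   /* 2x logic + higher clock and voltage */
    printf("baseline      : %.1f W\n", base);        /* 3.0 W  */
    printf("2x logic      : %.1f W\n", bigger);      /* 6.0 W  */
    printf("2x logic + OC : %.1f W\n", faster);      /* 11.5 W */
    return 0;
}
```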