Intel working towards 10nm and smaller CPU lithographies

Alright, we're all waiting for what comes after Skylake, right? Right? No? Just me? Aw.
Intel's 'train' of progress seems to be suspended right now, but that's to be expected. I imagine that at this point Cannonlake is in the fires, ready to be molded into its selling form. Great. But we all know 10nm is quite a feat of engineering and, frankly, of craftsmanship at this point. Intel has done most of the work so far. Moore's Law is telling Intel no, and Intel is drawing its sword to strike back. I'm no engineer, but I can certainly say that Intel is going to be milking Cannonlake and the 10nm processes that follow it for a long time while they develop the 7nm process. Even further, I've heard rumors of Intel aiming for 5 and 3 nanometer processes. WHAT. THE. FUCK.

So I'm creating this thread for community input. Is this possible? And even if it is, there's no way we'll be able to do it with our current silicon/FinFET kind of stuff... is there? What crazy ideas do any of you have, or what do you think/know Intel will likely have to employ to get anywhere near their goals?


http://wccftech.com/intel-expects-launch-10nm-2017/
http://semiengineering.com/10nm-versus-7nm/

Anything is possible if you put your mind to it. In all seriousness, I've heard they've been experimenting with mixing materials into the silicon that could allow for more precision with existing techniques.

We're at the point where I almost wish Intel would just push the clock speeds up instead of just making things smaller. The IPC at identical clock speeds is just not moving in any meaningful way anymore, so I wish they would work on making the existing lithographies better in hopes of cranking clocks. They aren't realistically going to give us any more than 6 cores on the mainstream platform for the next 5 years, so they need to make these cores faster. Going smaller only means they can fit more cores on the server parts, and that's no fun for us plebs.

Intel only cares about performance per watt, not absolute performance. Increasing clock speeds drives up power consumption.

They won't do it until they absolutely have to.

If they stuck with the same lithography they could refine it and reduce the power leakage inside the CPU itself, which would allow them to up clocks without increasing power. A hard feat to pull off, but it would be fun and get more CPU performance for us plebs :3

Don't get me wrong, they won't, but I think it could be neat.

It's called doping; we've been doing it for years, nothing new.

@thecaveman
The problem with boosting the clock speed is that we don't have a good way of switching transistors between on and off that fast. That's why people are moving towards smaller nodes to get more transistors rather than faster ones.

From a money-making perspective, aren't they better off just making 14nm the best it can be at this point?

Well, not necessarily. Intel released a new technology with mobile Broadwell first that allowed the cores to switch clock speeds in hardware instead of software, which allowed the CPUs to switch speed steps much faster than before. I can't find the name of whatever it was, but I will try to search for it.

Of course, that's mobile. They turn off some of the cores to save on battery, and that's switching between a low-power mode and a high-power mode. I'm talking about how fast a transistor can actually go from open to closed, not how fast they can change the speed at which the cores work.

In order for an electron to jump a gap you need to apply a voltage, so you need to raise and lower the voltage at that spot very fast to have electrons jump at GHz speeds. It's hard to get much higher speeds without increasing the voltage. The problem then becomes that more voltage means more heat, so if we can go smaller we can achieve the same frequency with lower voltages. Then the problem is that with smaller devices we can't pump more power into them, because they will break.

The whole problem is one or the other: either lots of power, high frequency, and low transistor count, or low power, lower frequency, and high transistor count.
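
To put some rough numbers behind that trade-off, here's a minimal back-of-the-envelope sketch using the textbook CMOS dynamic switching-power relation P ≈ alpha · C · V² · f. All of the activity-factor, capacitance, voltage, and frequency figures below are invented purely for illustration; they are not real Intel numbers.

```python
# Back-of-the-envelope look at the power/clock trade-off described above.
# Textbook CMOS dynamic (switching) power: P ≈ alpha * C * V^2 * f
# All numbers here are invented for illustration, not real Intel figures.

def dynamic_power(alpha, capacitance_f, voltage_v, freq_hz):
    """Switching power in watts for a given activity factor, switched
    capacitance (farads), supply voltage (volts), and clock (hertz)."""
    return alpha * capacitance_f * voltage_v ** 2 * freq_hz

alpha = 0.2        # fraction of transistors switching per cycle (made up)
cap   = 100e-9     # total switched capacitance, 100 nF (made up)

base   = dynamic_power(alpha, cap, 1.0, 4.0e9)   # 4 GHz at 1.0 V
pushed = dynamic_power(alpha, cap, 1.2, 5.0e9)   # 5 GHz, needing ~1.2 V

print(f"4 GHz @ 1.0 V: {base:6.1f} W")    # ~80 W
print(f"5 GHz @ 1.2 V: {pushed:6.1f} W")  # ~144 W
# 25% more clock at 20% more voltage costs roughly 80% more switching power,
# which is the wall the posts above are describing.
```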

... lol what? If that were so, why would they have chips that can draw over 140 watts at stock? Yeah, it's good to care about power efficiency, but their prosumer and overclocking lines draw a hell of a lot more power for marginal gains.

They can't.

The reason they backed out of high clock speeds was voltage leakage.

You need smaller nodes to hit higher clocks (electron latency), but smaller nodes meant more voltage leakage, and more voltage leakage meant that the more power they pumped in (for those higher clock speeds), the more the heat would rise, exponentially.

It's just not practical.
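
For a rough feel of why the leakage problem is exponential rather than linear, here's a small sketch of the standard subthreshold-leakage relation, where off-state current grows roughly as exp(-Vth / (n·VT)). The slope factor and threshold voltages below are generic textbook-style values, not measurements of any real process node.

```python
import math

# Subthreshold (off-state) leakage grows roughly as exp(-Vth / (n * VT)).
# Lower threshold voltages let transistors switch faster at a given supply
# voltage, but the leakage explodes. Values here are generic illustrations,
# not data for any real process.

VT = 0.026   # thermal voltage kT/q at ~300 K, in volts
n  = 1.5     # subthreshold slope factor (typically ~1.2-1.6)

def relative_leakage(v_th):
    """Off-state leakage relative to an arbitrary reference current."""
    return math.exp(-v_th / (n * VT))

reference = relative_leakage(0.45)
for v_th in (0.45, 0.35, 0.25):
    factor = relative_leakage(v_th) / reference
    print(f"Vth = {v_th:.2f} V -> off-state leakage x{factor:,.0f}")
# Dropping Vth by 0.2 V multiplies leakage by a couple orders of magnitude,
# which is why "just pump more power in" stopped scaling.
```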

So what happens when they reach 0nm?
-10nm?
Virtual CPUs?

They are already dealing with quantum effects, so arguably 0 nm covers quantum compute. The University of Maryland has claimed a programmable quantum computer capable of 3 algorithms using lasers... It's a long, long way from general computing, most classical problems don't parallelize well, and GHz isn't going further anytime soon.

After nano there is pico, but we'll probably see a CPU in the single-digit nanometers before pico is even a dream in the distance.

It'll be 0.9nm and so on; we can go down to 0.1nm.

In the meantime we'll likely see different materials, and lithography might actually go up a couple of times in the future.

You can already achieve 'quantum-like' processing power through multi-dimensional programming. Current programming (most of it) is one-dimensional. DX12 and Vulkan have features that strive towards achieving 1.5D or 2D resource allocation.

They'll have to hurry up with making the switch from classical formulas to quantum formulas to calculate the effects. Or maybe that's what AMD, Intel, or an unknown party are already up to, which would explain some of the lack of real innovation in current CPUs.
I can't see the change to quantum happening soon, though, like you mentioned. But I secretly hope that I'm proven dead wrong.

0nm is smaller than the 0.1nm that you mentioned. They could go pico like @thecaveman mentioned. Or even femto. But I have my doubts about that happening. Transistors need to go away. Or at least the way they are used now.