AMD to beat Intel to 10nm (Extreme Speculation)

Before I start, I just want to acknowledge that this is a highly unlikely scenario and that I'm going off into the weeds here, but I figured it could create some discussion. Intel's Coffee Lake will release in 2018 and will still be on the 14nm process. However, the rumor (good ole WCCF, so major grain of salt here) is that the newest Snapdragon 830 will be manufactured on a 10nm process and will go into next year's flagship phones, around the March/April time frame. So how does that relate to AMD? Well, the 830 is supposed to be manufactured on Samsung's 10nm process, and AMD will use Samsung's process to make some of its new chips; even if they don't, or split where their chips are made, GloFo has a licensing agreement to use Samsung's fab technology. I realize that low-power chips are generally the first made on a given process, but will it really take an extra year to get to 10nm? Maybe for once Intel will be the one struggling to ramp up a 10nm process while Samsung/GloFo takes the lead. Like I said, this is a HIGHLY UNLIKELY scenario, but it's some interesting food for thought.

Intel Map - http://www.pcper.com/news/Processors/Intel-Will-Release-14nm-Coffee-Lake-Succeed-Kaby-Lake-2018

AMD Using Samsung - http://www.forbes.com/sites/patrickmoorhead/2016/07/25/amd-diversifies-14nm-manufacturing-with-samsung/?yptr=yahoo#223b344e50fb

830 10nm Process - http://wccftech.com/qualcomm-snapdragon-820-kryo-processor-sources/

I doubt AMD will get 10nm chips within the next 3 years. 14nm is very early for everyone outside of Intel and IBM. Additionally, the Snapdragon you're talking about is a tiny chip compared to a desktop or laptop processor; x86 chips are incredibly difficult to make in comparison.


Basically what @Hadesfriend said. Also, even IF Intel is later to the party (which is very unlikely, but it would still be very funny to see a giant like Intel get beaten), their process might just be better than Samsung's. We saw it with the iPhone 6s, where the Samsung-made chip was 14nm and the TSMC chip was 16nm, yet the TSMC-made chip was still slightly more efficient.

Let me get this straight:
AMD uses Samsung's 10nm process to run away from Intel, who still needs to figure out 10nm first?
Sounds like a lot of magic and pixie dust to me.
AMD aims at the mid-range and low end, Intel shoots for the moon. So the question is WHY, not COULD. I don't see this happening. Without confirmed Zen benchmarks, this is a crystal-ball-grade guessing game.
With Zen benchmarks, this would make for an interesting discussion. At this point in time, I don't see any reason to guess and speculate.

Final notes:
Maybe AMD runs to Samsung and gets Zen fabbed there to test what improvements the 10nm process brings, just like Nvidia tried Maxwell on 14nm and called it Pascal.

The answer is quite simple: it's a big no. If it did happen, Intel would need to allow it. Samsung's 10nm fab is still immature and yields are bad, which would mean the CPUs would be filled with bugs like Pascal from NV, and it would be expensive.

There would only be a small gain from 14nm to 10nm; they should strive for 6-8nm instead.

Uhm. Nope.

To expand on that: Samsung and TSMC could at least start by catching up to the level Intel is at with their "14nm" and "16nm".
Yes, of course, TSMC's and GF/Samsung's 14/16nm are foundry nodes while Intel tailors its process to its own needs, but the point still stands.

https://www.semiwiki.com/forum/content/3884-who-will-lead-10nm.html
https://www.semiwiki.com/forum/content/5620-10nm-sram-projections-who-will-lead.html

So no, AMD won't beat Intel to 10nm in any way, shape, or form, as Intel's lead is quite solid right now.
You might as well take a good, hard look at Polaris 10 on GF/Samsung's 14nm LPP. Does the process serve it well? Does it overclock well?
Hopefully GF/Samsung's 10nm foundry process is better than 14nm LPP, but I don't have high hopes.

AMD is going to be "stuck" with the 14nm process for a while.

Samsung could be ramping up 10nm by the end of 2016, but that would still be a long way from actual products, not to mention the yields for huge chips (CPUs) versus simpler SoCs.

Obviously this tech isn't available to the public, nor has it been designed for public use at this point in time.

AMD also knows what it can and can't do. AMD's strength is in its strategic relationships: it may have lost the interim battle against Intel in raw production capability, but it has worked very hard to build positive business relationships with Samsung and other such companies, and this will benefit its ability to produce tech and stay in the race in the future.

Even if AMD beats Intel to 10nm, it doesn't mean it'll be more efficient. Nvidia has proven they can make a 16nm video card that is more efficient than AMD's 14nm video card. Better engineering makes for better efficiency...

AMD can do async compute, Nvidia cannot.
If AMD cut away the hardware needed for async compute, they would be on par with Nvidia.

I doubt that CERN has any of that in use. The detectors are managed by Windows XP machines, and processing at lvl0 is done on standard servers (because CERN's budget is rather tight, most of them are a tad older).
CERN is not an argument. I once spoke with Prof. Dr. Rolf Heuer about brand new computer tech at CERN. First of all, they cannot do "maintenance breaks" at lvl0, and secondly, lvl0 is where data from all the detectors is packed and sent off to the lvl1 and lvl2 data centers for processing. In other words: lvl0 is not allowed to fail or introduce errors into the data.

TL;DR I do not believe Intel has 5nm silicon.


Once async compute is used at large scale, you don't think Nvidia will start putting the hardware in their cards? They are a company that is about doing things now. Once it's widely adopted, I have no doubt they will have a version of async compute running in their graphics cards...