Yeah, well, good luck with that.
You really think Intel wouldn’t try shenanigans? Seriously… Intel will try any sort of shenanigans they can possibly think of.
Knowing Intel, they’ll throw money at anything.
The problem is that the PCIe standard does not just revolve around CPUs.
It’s a standard that touches basically everything, from USB peripherals running over mPCIe (yes, Mini PCI Express does carry a USB bus…),
(meaning modems, Wi-Fi and all that jazz), to your gaming GPU, as well as the GPU mining your bitz coinz.
So they can come up with all the future-proofed PCIe standards they want; basically, if it doesn’t serve the general good, it
is a no-go.
Which is also EXACTLY why it is set up this way, and not something as stupid as what you are suggesting.
Like every market Intel has tried to enter in the past 10 years, they will burn a huge amount of cash and fail miserably.
Even if they catch up to AMD and Nvidia, their history with GPUs is so shocking that few people will buy them unless they are superior to the alternatives by some margin.
“Koduri seems adamant about getting the gang back together, and I believe this is just the beginning.”
Koduri seems to have a passion for enabling the third world to do things like content creation and movies with first-world special effects. He and Jim Keller worked for years on the A4 through the A12. I think they both will really want to bring “affordable” solutions.
I am really curious to see whether their plans include dual graphics, something I like because of the promise of a cheap initial build cost and the ability to scale up later. AMD completely abandoned CrossFire in their latest APUs.
This subject has spawned a renewed discussion of Intel’s first attempt to go up against Nvidia and ATI: the i740.
It was a huge flop.
But it did give us AGP. Sorry the post is so long.
dGPU? d as in Desktop? or
Neither, as in dedicated.
Intel released a teaser video showing a GPU silhouette. We learned nothing new from it; it was just Intel reminding people that they will have GPUs too, so as not to let Nvidia steal the entire SIGGRAPH show.
Intel Begins Teasing Their Discrete Graphics Card
I can’t wait to hear about all of the security flaws in 15 years.
I tend to agree that is the bigger market.
I’m a gamer, but AMD is all over consoles, Intel is a 2020 unknown, and Nvidia is the king of PC games.
It’s an RX and RTX crazy world.
Oh yeah. Can’t wait. This will definitely be a massive success for Intel. Especially seeing how incredibly reasonable they are with their pricing.
i5, 4 cores, no multithreading. $100 or more over the AMD equivalent with multithreading. And times have changed: AMD no longer runs at 90 degrees Celsius drawing 500 watts, and the single-core performance gap is negligible.
Only reason I’m on Skylake now is because Ryzen wasn’t available at the time and I said “phug it…”. This system will be passed down to my father, or maybe my brother, when I’m done with it. Then I’m building something else.
Absolutely hilarious, though. Intel making dedicated GPUs.
Fuck right off.
Intel is going to charge more than Nvidia for those GPUs…
It’s this on 14nm (plus plus, plus plus, plus) at “5GHz”.
I’m actually keen to see how AMD cards fare at ray tracing. They’ll be a lot better than pre-Turing Nvidia cards, I suspect, due to their much higher TFLOPs in general.
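For anyone wondering where those TFLOPs numbers come from: the figure usually quoted is just theoretical peak FP32 throughput, shader count × boost clock × 2 FLOPs per cycle (one fused multiply-add). A quick sketch, using publicly known specs for two 2017-era cards as illustration:

```python
def peak_tflops(shader_count, boost_clock_mhz):
    """Theoretical peak FP32 TFLOPs: shaders x clock x 2 FLOPs/cycle (FMA)."""
    return shader_count * boost_clock_mhz * 1e6 * 2 / 1e12

# GTX 1080: 2560 shaders at ~1733 MHz boost
print(round(peak_tflops(2560, 1733), 2))  # ~8.87
# Vega 64: 4096 shaders at ~1546 MHz boost
print(round(peak_tflops(4096, 1546), 2))  # ~12.66
```

This is why Vega looks strong on paper; whether that raw compute translates into ray-tracing performance without dedicated hardware is exactly the open question.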
This is just Intel repeating what they already said, but adaptive sync (FreeSync) will be supported.
If this happens, Intel becomes my new GPU brand of choice.