Surprised tho at Peanut's charts, I expected the 1700 to do a lot better against the 8350. I guess there are still significant power savings involved in the upgrade. edit: Also obviously GPU bound. But still... wow.
I can vouch for that, has happened to me plenty of times, but the coffee is just too addictive, can't stahp consuming it. (ps. send help)
And to actually contribute something on topic: as far as I know, it will be possible to use the other vendor's graphics card for compute only, with the possible exception of Vulkan/DX12, if we are assuming simultaneous use.
The point of the charts is that it is not really surprising that an 8350 @ 3.4 GHz ($180) can keep up with a modern $120 video card (1050 Ti, 5,766 Passmark). Even a modern dual-core like the G4560 ($70) could also shift the bottleneck to the GPU. The only CPU on that list that creates a CPU bottleneck is the 4850e from 2008 ($45). The only video card that can shift the bottleneck back to the CPU, assuming an 8350, is perhaps a 1070 or higher ($500, 11,023 Passmark). It is unlikely a 1060 6GB can ($250, 8,717).
So when considering, say, GPU-bound workloads like games, it often does not make sense to upgrade one's CPU. You really need to spend 3x+ more money on the GPU than the CPU for the CPU to ever become the limiting factor. An 8350 at 4.0 GHz+ ($180) probably pairs well with a GTX 1070 (assuming $350). That said, there are some caveats related to OS version and specific games.
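The "spend ~3x more on the GPU" rule of thumb can be sanity-checked against the prices quoted in this thread. A minimal sketch (the `price_ratio` helper and the 3x threshold are just illustrations of the heuristic as stated here, not anything authoritative):

```python
# Quick check of the "~3x more on GPU than CPU" heuristic, using the
# prices quoted earlier in the thread. Helper name is made up for
# illustration.

def price_ratio(gpu_price, cpu_price):
    """GPU price divided by CPU price; values near or above ~3
    suggest the CPU may start to be the limiting factor
    (per the rule of thumb above)."""
    return gpu_price / cpu_price

pairings = {
    "8350 ($180) + 1050 Ti ($120)": price_ratio(120, 180),
    "8350 ($180) + 1060 6GB ($250)": price_ratio(250, 180),
    "8350 ($180) + 1070 ($500)": price_ratio(500, 180),
}

for name, ratio in pairings.items():
    print(f"{name}: ratio {ratio:.2f}")
```

Only the 1070 pairing gets anywhere near the ~3x mark (about 2.8), which lines up with it being the first card in the list that might shift the bottleneck back onto the CPU.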