AMD has been worth it on Linux for just about everything since 2012, though. Nvidia has had very poor support on Linux for security-conscious people forever, and since 2012 that goes too for those who forego security and basic privacy by installing the binary blobs. Intel hasn't been making good drivers in the last 5 years or so; they're acceptable on Linux by comparison, but the hardware is very limiting. So a bit of a misrepresentation there...
On Fc25, with kernel 4.9.3 and the plain (non-AMDGPU-PRO) AMDGPU stack, the LLVM-based driver now comes in by default, even on GCN 1.0 cards. All I've done so far is check out glxinfo, and WOW, just WOW... If you install the Vulkan libs and fs from the main repo, I don't think there will be much benefit any more in compiling AMDGPU-PRO on this, lolz, and with GCN 1.0 supported, there might be little benefit in using radeon either.
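If you want to confirm which driver/renderer is actually active, the usual quick check is `glxinfo | grep "OpenGL renderer"` in a terminal. Here's a tiny sketch of pulling that string out programmatically; the sample line is just an illustration of roughly what a GCN 1.0 card on the Mesa/AMDGPU stack might report, not output from my machine:

```python
import re

def renderer_line(glxinfo_text):
    """Extract the renderer string from glxinfo output, or None if absent.

    Normally you'd feed this the real output of `glxinfo`; the sample
    below is only an illustrative stand-in.
    """
    m = re.search(r"OpenGL renderer string: (.+)", glxinfo_text)
    return m.group(1) if m else None

# Illustrative sample line (TAHITI is the GCN 1.0 chip in the HD 7950)
sample = "OpenGL renderer string: Gallium 0.4 on AMD TAHITI"
print(renderer_line(sample))  # → Gallium 0.4 on AMD TAHITI
```

If the renderer string mentions llvmpipe instead of your card, you're on software rendering and the kernel driver didn't bind.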
Edit: just ran 2 minutes of a game on the new AMDGPU KMS driver on an Fc25 AMD+AMDGPU system with a GCN 1.0 card (Radeon HD 7950)... performance seems off the charts; I never had fps like this even on Catalyst...
I don't have Steam on that machine and haven't tested on my gaming rig yet, so I haven't tried any of the betas of Unity3D games that should actually see the biggest advantage. I've just tried it with RedEclipse: all max settings, deathmatch mode, random map and loadout, maximum embellishments of avatar just because. Minimum fps was at like 190 or something. Fedora 25 KDE, standard AMDGPU KMS driver, Vulkan installed (not that it would matter for RedEclipse). (Edit: that was on an old FX-6300 @ 4.3 GHz with 16 GB of 2133 MHz DDR3 and a Radeon HD 7950 with three 1440p monitors attached. That's a GCN 1.0 card, so this is its first round of AMDGPU support, on kernel 4.9. Average fps was 275.)
I haven't had time to try out any of those new optimized betas of Unity3D games yet, but I'm really curious to see what it does with Vulkan optimization. Are there any of those games already available on Linux?
Yeah, I'm a victim of Intel laptop chips too; I feel your pain. Maybe AMD will actually get OEMs to use their new APUs in laptops, but I don't get my hopes up too high; the Wintel alliance is still very strong in controlling the laptop market.
Oh no no no. Intel is all fine imo. It's just that Nvidia's 10 series mobile chip was the main reason I bought a portable workstation. That's why I cry every night. But I try to drown my sorrows in alcohol.
Me too, man. I remember when the early/lazy implementations of Optimus would make getting Linux up and running on your laptop a PITA, and Intel's iGPU nonsense often made it much more difficult.
Never got Beignet to run as it should though, and those Intel power management bugs are really annoying. I just don't like discrete GPUs in laptops because I dislike the form factor and the cooling requirements. Intel has never really gotten their hardware running as it should. Curious about the new AMD APUs though; personally I'm hoping for an APU that can do passive cooling, that would be really great. With the AMDGPU drivers and Vulkan on Linux, AMD solutions are really great for my kinds of workloads in the field. If only I had a working Beignet, I'd even be satisfied with an Intel chip, but I just don't want to waste all that battery life on operations that the GPU can do much more efficiently through OpenCL.
Yeah, I got a gaming laptop for college when I went, thinking it'd be a better option than a desktop + iGPU laptop with space at a premium... I got a second job in the first year to rectify that mistake, haha.
My feeling is that even if Beignet worked well (the LLVM etc. hoops you need to jump through are a hassle either way), it wouldn't give you much of a boost over the pure-CPU OpenCL libs. There's just not enough meat in their graphics silicon yet.
I'm still a firm believer in a desktop for work and compute, a laptop for docs and email. No solution has been compelling enough for me to go a different route yet.
Yeah, but that's the thing: you could never expect near-desktop performance from a laptop. Intel never delivered anything capable of more than basic office usage. AMD was kind of better with their APUs, but those were also limited to office use; sure, with the newest Kaveris, two years later than their desktop counterparts, HSA made maybe GIMP, LibreOffice Calc and maybe hashing faster, but most software devs never got hold of it. I blame the market. Then you have Nvidia with their blob, but they still offer better performance when it comes to laptops, and that's sad. I'm a pragmatic person tho; I never even considered a laptop as a proper workstation until like 6 months ago, but we're just at a point in time where:
1: portable desktop replacements are a thing (note the word 'portable', they're not laptops lol)
2: Nvidia is the only option if you want proper GPU performance
3: it sucks
We're pretty much in agreement here. Unless you're, like, a pilot who moonlights as a digital designer on his off trips, there's little reason to go the dGPU laptop route.
A laptop will never be able to give desktop performance though; everything is just more limited, even with so-called desktop-replacement laptops. With the amount of data any modern CPU + GP-GPU solution, including ARM SoCs, can produce, you can't connect enough stuff to a laptop, at a high enough speed, to get your money's worth in comparison to the ever-smaller desktop solutions.

I really like modern laptop form factors though, because they're power efficient. Problem is that whereas on ARM, for instance, decentralized compute units on the silicon are a given and just work, on x86 this still seems to be a big problem, and it spoils a lot of fun.

I really shouldn't complain though; I'm getting like 13-15 hours of battery autonomy on my 13.3" Travelmate in Fedora with tlp these days. But once you're over 10 hours of autonomy, even a mere 10% increase in efficiency represents more than an hour of extra runtime... the rules have changed; OpenCL has become a necessary technology. On a laptop with 3-4 hours of autonomy, like the Surface Book with a discrete GPU, a 10% increase in efficiency would represent at most 24 extra minutes of battery life, and those GPUs are not that sexy for most workloads, not desktop-GPU-sexy anyway, lol.
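The battery math above is simple enough to sketch (the hour figures are just the rough numbers quoted in this thread, not measurements; I'm also using the simple "10% efficiency ≈ 10% more runtime" approximation rather than the exact 1/0.9 figure):

```python
def extra_minutes(autonomy_hours, efficiency_gain):
    """Rough extra runtime from an efficiency improvement, in minutes.

    Uses the simple approximation that a 10% efficiency gain buys
    ~10% more runtime; input figures below are ballpark, not measured.
    """
    return autonomy_hours * 60 * efficiency_gain

# ~15 h Travelmate: a 10% gain is a full extra hour and a half
print(extra_minutes(15, 0.10))  # → 90.0

# ~4 h dGPU laptop: the same 10% gain is only 24 minutes
print(extra_minutes(4, 0.10))   # → 24.0
```

That's why efficiency tricks like OpenCL offload matter so much more on long-autonomy machines: the same percentage gain translates into far more absolute runtime.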