
Returning my Radeon VII: what do I buy?

So long story short.

My old build was an 8700K @ 5.0GHz with a 1080Ti, great for gaming. I sold it as a whole to fund a VFIO 2950X build.

My new build has a WX3100 (host), a Vega 56 (Mac guest), and, for the moment, a Radeon VII (Windows VM).
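For context, the passthrough cards in a build like this get bound to vfio-pci at boot so the host driver never touches them. A minimal sketch (the vendor:device IDs below are placeholders; list yours with `lspci -nn`):

```shell
# /etc/modprobe.d/vfio.conf
# Bind the guest GPUs (and their HDMI audio functions) to vfio-pci at boot.
# The IDs below are PLACEHOLDERS; find yours with: lspci -nn | grep -iE 'vga|audio'
options vfio-pci ids=1002:687f,1002:aaf8,1002:66af,1002:ab20

# Make sure vfio-pci claims the cards before the host amdgpu driver does:
softdep amdgpu pre: vfio-pci
```

Plus `amd_iommu=on iommu=pt` on the kernel command line for a Threadripper board like the 2950X's.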

My thought process with getting the Radeon VII is that I'm really only holding onto Windows gaming for BattlEye and a handful of other titles, and when the time (maybe) comes I can easily game with it on Linux.

Another selling point for me was how lateral an upgrade it was coming from a 1080Ti. On top of all this, I wanted to seriously try to get into content creation once my schedule freed up a bit.

I had the 1080Ti for what felt like forever, and I'm pretty disappointed that buying another pre-owned 1080Ti seems like the smart choice.

So, after prefacing with all this: my return grace period at Microcenter is up Monday, and with the inconveniences I face from the reset bug I'm inclined to turn this card in for something else. I game at 1440p@144Hz, and the game I play most is PUBG (sadly).

So achieving ~144fps in game is always my goal. Before the reviews came out for the 2080 Super I was pretty set on it, but it still seems somewhat disappointing. There's also how much VRAM I lose across these transitions: 11GB > 16GB > 8GB. Gaming-wise I know this doesn't mean much, but when you're inclined to spend this much on a GPU it would be nice to have.

So, L1, do you have any suggestions for me? I suppose if I really want to see a performance delta over my OG 1080Ti I can get a 2080Ti, but that will definitely require a cash return and a little bit of saving. If I do go that route, I don't think I'll be able to help feeling a little burned by Nvidia.

From the few performance reviews I've seen, the 2080 Super is still going to be slightly faster at stock clocks than the 1080Ti, plus it may still have some headroom on the memory speed for overclocking. It doesn't have the 11GB that the 1080Ti does, but it's faster. Unless you're not dead set on buying new and can find a 1080Ti at a rock-bottom price, I would go with the 2080 Super if you have to get something now. If you still have the 1080Ti to use, then wait until the custom 2080 Supers come out.

I'm kind of in the same boat. I've been holding on to my 1080Ti until there is a significant performance increase in GPUs in the $700-or-less price range. Right now there's nothing in that range that's going to give me much of a performance gain in regular rasterized gaming.


Upgrade hardware less often. 1080ti to Radeon VII to 2080 is kinda mostly sideways at best. You’re just pissing money (and time) away for small changes in performance IMHO.

The new shiny might be exciting for a bit, but it involves reconfiguration, new bugs, etc., and unless you're waiting for significant (e.g., 50% plus at LEAST) jumps, IMHO it just ain't worth it.

I’d just keep the Radeon VII or go back to 1080TI and wait longer.

If you have “PC hardware modification money” burning a hole in your pocket maybe look into other PC related hobby stuff like 3d printing, 3d scanning (content creation!), or whatever. I reckon you’re sitting pretty for GPU for a while as is.


For content creation, you will NOT get more than 16GB of VRAM at the price point you got your Radeon VII at. Stay with the Radeon VII. When the drivers go stale (on Windows), a 3080 might be a worthwhile upgrade. Or stay on Linux with the VII where the drivers never go stale.


Second that. Nothing significantly better is out there atm, and you don't mention any RTX-enabled games. So staying with the VII, or going 1080Ti again and saving money, certainly is the smart choice here.


Alternatively, flick the Radeon VII for Navi (does it still have the reset bug?), put the spare money in the bank (you'd get a bit back from that trade), and/or upgrade something else… or use the spare money to build a dedicated Navi-equipped PUBG box :smiley:

Either way if you’re currently on anything similar to 1080ti, 2080, Radeon VII, you’re just talking small percentages of performance difference between them depending on the specific game/application. I personally think the Radeon VII will age better out of those 3 cards in particular (due to the memory size, memory bandwidth and raw compute power all being superior to the other two cards), but that’s my opinion.

Then again, I've mostly given up on PC gaming these days myself and just used some of the PC-upgrade money for a Nintendo Switch, and the PC is just running Linux full time, where the non-console-type games mostly work anyway.

Totally agreed, it's a tough situation. Wish I'd snagged that 1080Ti when I sold the build. Damn, what a card. I'll see what prices are like on /r/hardwareswap.

I think you misunderstood my post. I sold a whole build, including a 1080Ti, for a VFIO build; I was making the best profit by selling the build as a whole. Which left me again with $700 to spend on a GPU. My problem is that spending that much on a GPU right now feels like a loss.

This would be the best-case scenario, but it means constantly patching my kernel and dealing with the unpredictable facets of the reset bug. I love the card, and would honestly get a 5600XT to hold me over until the 5900XT came out if we had more confirmation.
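For reference, the stopgap most people try between VM runs is removing the PCI device and rescanning the bus, which sometimes gets the card back without a host reboot. A rough sketch of a post-shutdown hook (the `0000:0b:00.0` address is a placeholder, and this is hit-or-miss on Vega/VII):

```shell
#!/bin/sh
# Hypothetical post-shutdown hook: drop the passed-through GPU off the PCI bus
# and rescan, hoping the card comes back in a clean state. Run as root.
# 0000:0b:00.0 is a PLACEHOLDER; find the real address with: lspci -D
GPU=0000:0b:00.0

echo 1 > "/sys/bus/pci/devices/$GPU/remove"   # detach the device
sleep 1
echo 1 > /sys/bus/pci/rescan                  # re-enumerate the bus
```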

Agreed, I’m going to look at some 1080Ti prices.

PREFACE: I am not a Linux user, but read a lot on here about the woes and successes. Please correct me if I am wrong.

Oh boy… Wait till you get Nvidia.

The reset bug is a pain and should not exist, but if you don't like messing about with kernels, Nvidia will not solve that. Every new kernel update means a fresh driver install, as just rolling the existing one over seems to make the Nvidia stuff freak out. So no reset bug, but kernel/driver churn that's more frequent, plus new problems that are also annoying.



With NV on the *nix side, you have to make sure the NV driver has been updated before you move to a new kernel. Fewer regressions than AMD, but it is a thing. You also have to patch Wayland if you use it. The AMD quirks tend to happen when they release a new architecture, and you can always just run the AMD proprietary driver until the quirks are fixed.
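One way to soften the "new kernel before the NV driver" trap is installing the driver through DKMS, so the module gets rebuilt automatically on each kernel update. A sketch (the package names are an assumption; they differ per distro):

```shell
# Debian/Ubuntu-style example; package names are PLACEHOLDERS and vary by distro.
# The DKMS variant rebuilds the Nvidia kernel module on every kernel update:
sudo apt install nvidia-driver nvidia-kernel-dkms

# Verify the module is registered and will be rebuilt for new kernels:
dkms status
```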

the game I play most is PUBG (sadly).

Wait, is PUBG uncool suddenly?

Whilst I agree, I think the OP is using the card in question for VFIO, so Linux kernel hackery for Nvidia is irrelevant, as the card is passed through to Windows.


@anthr76, I will have to say, changing from a Radeon VII to a 1080Ti, a 2080, or even a 2080 Super IMHO makes no real sense, like @domsch1988 said. It would make sense if you were going up to a 2080Ti. But I would say just wait until the replacement for the Radeon VII is released (probably announced Dec 2019 or Jan 2020); nothing in the $700 price range is worth swapping in and out of. Nvidia will also probably be releasing new GPUs mid to late next year.

All that said, since you are planning to do VFIO, I would still keep the Radeon VII: swap it in for the Vega 56 you're using for the Mac guest, sell the Vega 56, and get a used 1080 or 1080Ti for the Windows VM, especially since you are gaming at 1440p/144Hz. Nvidia is generally better in Windows, and Nvidia also doesn't suffer from the reset bug as AMD does.

Since you're using a Mac VM and own a WX3100, it is probably safe to assume you are doing some form of audio-visual media creation or CAD work. So the 16GB of HBM2 and 1TB/s of bandwidth on the Radeon VII is probably going to be far more useful to you in the long run. It is a cheap Titan V with a bit more. And with the average PC gamer defaulting to Nvidia GPUs, when you are ready to stop using the Windows VM, an Nvidia GPU will probably have a higher resale value than an AMD GPU.


Exactly this. I have zero intention of using this card on my host, rather just passing it through to a VM.


This is actually a good idea; I like the sound of it. It's crazy, too, what a steal you can get 1080Tis for. On top of all that, I can move the RVII and game on Linux when Windows isn't required.

The only difficult thing about this process is that it seems the WX3100 isn't supported on macOS, so I'll have to sell the WX3100 for a WX4100.

I'll pull the V56 tonight and post it up. I'll get a feel for how much they're going for.


Glad I could be of help.
I game on my Radeon VII at 4K/60Hz and it is amazing; I don't experience much if any lagging or hitching unless I am recording the gameplay. I also recently undervolted and overclocked it to 1900MHz at 1020mV, with 1100MHz on the memory, so I'm getting slightly better frame pacing.
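On the Linux side, the same undervolt/overclock can be applied through amdgpu's overdrive sysfs file. A sketch using the numbers above; this assumes booting with `amdgpu.ppfeaturemask=0xffffffff`, that the VII is card0, and the Vega 20 table syntax, so cat the file first to see what your kernel actually exposes:

```shell
# Assumes amdgpu.ppfeaturemask=0xffffffff on the kernel command line,
# and that the Radeon VII is card0 (check /sys/class/drm/). Run as root.
cd /sys/class/drm/card0/device

cat pp_od_clk_voltage                       # inspect the current table/syntax first

echo "s 1 1900"       > pp_od_clk_voltage   # max core clock: 1900MHz
echo "vc 2 1900 1020" > pp_od_clk_voltage   # top voltage-curve point: 1020mV
echo "m 1 1100"       > pp_od_clk_voltage   # memory clock: 1100MHz
echo "c"              > pp_od_clk_voltage   # commit the changes
```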

Also since you also get access to the Pro drivers and ROCm with the R VII, you could also do some of your pro work on the Linux host as well just for convenience with quick jobs that don’t involve your Mac workflow.

Yup, a cheap 1080Ti is better than an RTX 2080 or 2080 Super because of the VRAM. Ray tracing will also probably not be a thing until 2021-2023, once next-gen consoles are mature. Even then, it may not be done the RTX way, but with more general-purpose compute and async compute, like Crytek's implementation.

Was watching something on YouTube last night where apparently someone at Nvidia (presumably off the record) was quoted as saying they don't expect ray tracing to "take off" until 2027.

So yeah, either way, the hacks we can do with rasterisation can get “Close enough” at “much faster speed” right now.

Even if RTX was to double in performance right now, we have plenty of other requirements for GPU power that will make demands on frame rate (e.g., VR at decent res with increased FOV). I remember one VR proponent discussing what we need for “good” VR and it was something like the equivalent of 2x 16k resolution (at about 90 fps+) for a decent (i.e., natural feeling) FOV. We are simply WAY off that point at the moment, even if we aren’t trying to throw ray tracing into the mix.

TLDR: RTX is half baked and a waste of time. Buy the 1080TI :smiley:

or a 2080TI if you are simply made of money. But buy it for the brute rasterisation performance, not RTX.


I bought my RTX 2080ti because I knew I was getting a slightly cut down Titan Volta for half the price :smile:

My plan was to use two of them for AI workloads, though now I'm thinking of using two Radeon VIIs instead. It all depends on libraries; most of the research is done on Nvidia cards, sadly. From a compute standpoint, though, I wouldn't even go near a 1080Ti; a Vega 64, or a Vega Frontier Edition if you can find one, is a better GPU. If it's just about gaming, definitely the GTX 1080Ti.

But yeah, ray tracing is great, but most gamers will never see its value, especially when you have dynamic rasterized lighting with probes. You will only see RTX in all its glory if you are gaming on a large-screen TV, 55 inches or more, paired with HDR, and even then the effect is more HDR than ray tracing. On regular monitors you really don't notice it that much. It is good that the companies are thinking about it, but you are right that there are other things competing for that hardware.

Plus, from all the papers I have read and demos I have seen on ray tracing since last year, the problem is that the current algorithms are extremely inefficient. A better approach might be to use cloud computing in the development process to map all the light and sound bounces and ship that with the game as data, so the GPU is not doing as much brute-force calculation, just working out where an object is relative to other objects. That probably means games using much more storage space, but hey, storage is getting cheap. Still, we have light probes and other ways to cheat the effects.

It will also be interesting to see how Crytek have improved their voxel ray tracing, which is probably a better application for gaming than what is currently being done.

For sure there are better uses for the hardware than ray tracing. Personally, though, I've not really been huge into VR; the prices have been too insane and there are far too many cables. For me, VR will be interesting once the images can be transmitted to the glasses wirelessly at 90+fps and at least 4K resolution. 16K per eye seems excessive, but our eyes definitely resolve more than 1440p, so I guess 16K may make sense. And with the screens being so close to the eyes, lower resolutions may actually strain the eyes longer term, because they have to readjust to regular vision when the VR goggles come off, kind of like why people get motion sickness from watching 3D movies in theatres. I will say, though, VR seems less of a fad than 3D, and people seem to be really into it. So VR and increased FOV may definitely be a better use of the hardware.

That said, who knows. The new APIs, DX12 and Vulkan, when coded for properly, like in Sniper Elite 4, Ashes of the Singularity, and Strange Brigade, are huge leaps in efficiency and GPU headroom. They also let you scale your GPUs fully: using split-frame rendering (SFR), something AMD has promoted, with async/parallel compute across all available GPUs, you can not only scale performance but independently use all the VRAM on each GPU, so 2x 1080Ti = 2x 11GB of VRAM. So mGPU may not be too far off; people just need to get off Windows 7 already! :smiley: