R9700 AI GPUs

To be honest, I would prefer an H200 card, but it seems life has decided not to grant me such benefits.

The R9700 looks like a really nice card: 32 GB of VRAM allows local running of some larger models that would not fit on the 16 GB cards. I have a 7800 XT and would add the R9700 to the setup so I can run AI on that card only, while driving the screens from the 7800 XT.
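If you end up using llama.cpp for this, the Vulkan backend can be pinned to a single GPU so inference stays off the display card. A minimal sketch, assuming a Vulkan build of llama.cpp and that the R9700 enumerates as device 1 (the device index and model filename here are illustrative, so verify both on your system first):

```shell
# Assumed layout: device 0 = 7800 XT (screens), device 1 = R9700 (inference).
# Check the actual enumeration order first:
vulkaninfo --summary | grep -i deviceName

# Restrict llama.cpp's Vulkan backend to the second GPU only
# (GGML_VK_VISIBLE_DEVICES works much like CUDA_VISIBLE_DEVICES),
# and offload all layers to it with -ngl 99.
GGML_VK_VISIBLE_DEVICES=1 llama-server -m model.gguf -ngl 99
```

That way the desktop stays responsive on the 7800 XT even while a long generation is running on the R9700.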

The prime function would be to educate myself more on AI and what it can do. As an older systems engineer, tech is evolving faster than I can keep up. I still love tinkering and learning new things, and LLMs + Python create endless possibilities for fun :slight_smile:

I’m also considering them.

Currently running 8B models on my 6900 XT under Linux via Vulkan and llama.cpp (in addition to larger models on my MacBook Pro).

Enjoy!

Have you started trying to run anything yet? You can run smaller models on CPU at a reasonable speed!
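For a CPU-only test, llama.cpp needs no GPU backend at all. A sketch (the model filename is illustrative; any small quantized GGUF in the 3B-8B range runs at a usable speed on a modern CPU):

```shell
# CPU-only run with llama.cpp:
#   -ngl 0  keeps every layer on the CPU (no GPU offload)
#   -t 8    sets the thread count; match your physical core count
#   -p      supplies the prompt
llama-cli -m qwen2.5-3b-instruct-q4_k_m.gguf -ngl 0 -t 8 -p "Hello"
```

It's a cheap way to get a feel for local models before committing to new hardware.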

I’d personally want more VRAM (48 GB or more) plus FP4/FP6 support, but the R9700 is a pretty good GPU for the money.

32 GB does allow you to run some great models locally. 64 GB (or two GPUs) allows decent acceleration for MoE models, but still can’t fit gpt-oss:120B or MiniMax M2 entirely in VRAM.
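A rough back-of-the-envelope check for whether a model's weights fit in VRAM (a sketch only; real usage adds KV cache, activations, and framework overhead, so treat these as lower bounds):

```python
def weights_gb(params_billions: float, bits_per_weight: float) -> float:
    """Approximate size of quantized model weights in GB (decimal).

    params_billions: parameter count in billions (e.g. 120 for gpt-oss:120B)
    bits_per_weight: average bits per weight for the quantization
                     (roughly 4.5 for a typical Q4_K_M GGUF)
    """
    return params_billions * 1e9 * bits_per_weight / 8 / 1e9

# gpt-oss:120B at ~4.5 bits/weight: weights alone already exceed 64 GB
print(round(weights_gb(120, 4.5), 1))   # 67.5

# a 20B model at the same quantization fits comfortably in 16 GB
print(round(weights_gb(20, 4.5), 2))    # 11.25
```

This is why a 32 GB card comfortably covers dense models up to roughly the 30B class at 4-bit, while the 100B+ models still need CPU/RAM offload.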

Here’s a review if you haven’t watched it already - https://www.youtube.com/watch?v=efQPFhZmhAo

I’m waiting for Medusa Halo parts in 2026/27 so that I can finally run larger (200B+) models locally.

Smaller models, while great, are nowhere near frontier LLMs outside of benchmarks.

The 7800 XT should already allow you to run gpt-oss:20B and maybe Devstral/Mistral; you won’t notice major improvements there.

You might get some improvement from adding the R9700, but you may want to shortlist the LLMs you plan to run before you rush out and buy the GPU (though given the shortage of PC parts, you should probably still get the R9700 regardless).