ROCm compatibility and available memory on Ryzen 8000 Series APU

Has anyone verified:

  • whether Ryzen 8000 series APUs (desktop or laptop) work with the latest 6.0.x+ releases of ROCm?
  • if so, the (un)supported maximum memory available to ROCm for AI models?
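For anyone who wants to test this: a minimal sketch of one way to check what ROCm actually exposes, assuming a ROCm build of PyTorch is installed (ROCm builds reuse the `torch.cuda` API surface). The `gib` helper is just for formatting.

```python
import importlib.util

def gib(nbytes: int) -> float:
    """Convert a byte count to GiB, rounded to one decimal."""
    return round(nbytes / 2**30, 1)

# Only query the GPU if a ROCm/CUDA build of PyTorch is present.
if importlib.util.find_spec("torch"):
    import torch
    if torch.cuda.is_available():  # True on ROCm builds with a visible HIP device
        for i in range(torch.cuda.device_count()):
            props = torch.cuda.get_device_properties(i)
            print(f"{props.name}: {gib(props.total_memory)} GiB")
    else:
        print("PyTorch installed, but no ROCm/HIP device visible")
else:
    print("PyTorch not installed")
```

On APUs the number reported here is the key question, since it reflects how much of system RAM the driver actually carves out for the GPU.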

I’m currently running a 7900 XTX (24 GB VRAM) on Ubuntu 22.04 / ROCm 6.0.2… and it runs Stable Diffusion extremely well.
For LLMs, llama3 (8B) is working well too, but that’s only ~4 GB.
The 70B-parameter model is way too big (at ~39 GB) and errors out with an out-of-memory failure. And Meta is going to release a ~400-billion-parameter model in the coming months…
Even the Radeon Pro workstation cards are not enough (and far too expensive for hobbyist use), and the MI300 is unobtainium (and exorbitantly costly).

A Radeon APU plus 4x 48 GB DDR5 DIMMs would match the top-end M2 Mac Studio Ultra at 192 GB of (“unified”) RAM, so even though performance would be an order of magnitude slower than a 7900 XTX (12 RDNA3 CUs versus 96), it would allow use of much larger models.
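The sizing argument above can be sketched as rough arithmetic. This is a hypothetical helper, not a real sizing tool, and the ~10% overhead for KV cache and runtime buffers is my assumption:

```python
def model_weight_gib(params_billion: float, bits_per_weight: float,
                     overhead: float = 1.10) -> float:
    """Rough memory estimate for model weights: params * (bits/8) bytes,
    times an assumed ~10% overhead for KV cache and runtime buffers."""
    return params_billion * 1e9 * bits_per_weight / 8 * overhead / 2**30

# 70B at 4-bit lands in the mid-30s GiB -- past a 24 GB 7900 XTX
print(f"70B @ 4-bit:  {model_weight_gib(70, 4):.0f} GiB")
# A ~400B model at 4-bit is roughly 200 GiB -- in the neighborhood of
# the 192 GB that an APU with 4x 48 GB DDR5 could (in theory) expose
print(f"400B @ 4-bit: {model_weight_gib(400, 4):.0f} GiB")
```

So the 192 GB APU configuration is about the cheapest commodity setup where a 4-bit 400B-class model is even plausible to load.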

[Note: I know you can run llama3 on CPU only - and I have the 70B model running on the CPU without issues, just slowly - but this is mainly a thought experiment for future/larger models, e.g. models of the ilk of OpenAI’s Sora… ]