The VEGA 56 / 64 Cards Thread!

Hold on, I’ll get the aloe vera for Raziel.

I can give you a spoiler. Vega does amazingly well on the open source drivers. The proprietary driver leaves things to be desired. Nvidia’s proprietary driver leads the pack.

@Raziel @Leo_V

So far so good. Using more VRAM than the card has :wink:

HBCC set to about 11 gigs.


Budget is the quick answer. I decided to go with Threadripper because I was not overly impressed with Vega, and Volta is coming, along with Navi or Vega 2.0. So I decided to save a few bucks on GPUs, spend more on the CPU, see what comes next year, and upgrade then. The 64s will go into my wife’s PC next year, which will be overkill for her, but, you guessed it, it’s the beginning of a render farm.

Now, on launch day when I saw the prices, had I not already purchased waterblocks, I would have bought two 1080 Tis. But the more I have found out about Vega, the more I am glad I did not go with Nvidia this round. And if all goes well with what @Steinwerks is doing right now with HBCC, that will cement my choice.


@Raziel @Leo_V

All done, no hiccups whatsoever after getting it started.

Note to Vega users: set HBCC before launching Blender (or presumably any application that will be using VRAM); changing it with Blender open locked things up and I had to end the process and restart. Not a big deal, but worth remembering I think.

HBCC set to ~11 GB, RX Vega 56 flashed to the 64 Air BIOS, Win10, 32 GB @ 2933 MHz, driver 17.9.3, Wattman set to Balanced, GPU pegged at or above 1616 MHz for all 36-ish minutes. Now I’m off to the gym, hope this helps.

Edit: oh yeah, temp lived at 75°C ±1° with the fan at about 2000 RPM. I should’ve had HWiNFO open but forgot, and didn’t want to launch sensors during the render.

I’m not sure how HBCC assigns VRAM: Blender was “using” about 9.5 GB, while Radeon Settings (while open, of course) sat at ~120 MB. I believe it ties the memory assignment to the application actually using the VRAM, though, because Blender’s usage dropped to under 1 GB the moment the render ended.
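
For anyone repeating this, here’s roughly the Python I’d run inside Blender to make sure Cycles is actually on the OpenCL device and not silently falling back to the CPU. This is a sketch against the 2.79-era API (bpy.context.user_preferences; newer builds rename it), so adjust as needed:

```python
import bpy

# 2.79-era API; in 2.80+ this lives at bpy.context.preferences instead.
prefs = bpy.context.user_preferences.addons['cycles'].preferences
prefs.compute_device_type = 'OPENCL'

# Refresh the device list, then enable every device Cycles found.
prefs.get_devices()
for dev in prefs.devices:
    dev.use = True
    print(dev.name, dev.use)

# Point the scene at the GPU rather than the CPU.
bpy.context.scene.cycles.device = 'GPU'
```

You can also run it headless for repeatable timings with something like blender -b scene.blend -P enable_gpu.py -f 1 (enable_gpu.py being the script above).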


That’s awesome! Thanks a lot for taking the time to do it.

I would like to find a bigger test scene to make it more evident.

For now this is very good news for me, thanks a lot buddy.


If you find one let me know! When I get back home I’ll see if it fails without HBCC enabled.


@Leo_V @Steinwerks

Awesome!

This video shows Gooseberry will not load on an 8 GB card; it’s an Nvidia 1080. Time stamped to 6:07…


@Cavemanthe0ne

Also balancing for what I do, because I need CPU power as well; I use DaVinci Resolve.

Almost like AMD developed one compute/render chip then tried to make it a gaming card :wink:

@wendell perhaps this should be tested in a video. Adds a lot of value IMO.


@Steinwerks Yep, exactly.

@Leon_the_Loner No Aloe Vera needed…

I only buy games that support Linux and my library is filling out pretty well.


If the only thing you do is video editing then sure, even an expensive Vega might make sense. If you were doing other stuff though (i.e. anything where you need the Pro drivers, etc.), the FE would probably be worth the extra $300 per card.

I’ll do you one better. If you’re already gunning for the biggest and baddest, why not just shell out the extra $300?

Epeen fixation knows no limits.

That’s great (no sarcasm).
But when I look at my library, besides all the Half-Life based games, there’s no Linux support.

Back to Vega though: it’s a day late and a dollar short. I wouldn’t recommend big Vega; I would totally recommend the 56 over the 1080 for shill reasons, but if you want to go bigger, the 1080 Ti is still king.

No. I use Blender and Fusion 9 as well. Only the extra VRAM would help, and it’s not worth it to me, especially after what HBCC is doing. Also, Radeon Pro Render works the same. If price to performance could justify it, I would have bought the FE.

Did you not read my post? lol
I was talking about how, if you need the Pro drivers for the things they offer, the FE might be worth it.
Clearly you don’t, so…

The problem I’ve run into is that things are still a bit weird for me with Fedora and the open source stuff, but yeah, Civ 6 ranges from somewhat faster to considerably faster under Linux now vs. a 1080 Ti.
We were using the Asus Strix 1080… hands down, Vega on Linux is amazing, but the setup is far from newb friendly, and I’m having some weird problems that I suspect are probably library conflicts.

I’ve got both the 56 and the 64 and have dumped probably >35 hours into this, and the story to tell is still a muddled mess.


Oh, but I was talking about using HBCC for rendering tasks. It seems to be working as it would on something like the WX9100.

I haven’t seen anyone else cover this on YT yet.


I’m having an issue with this; this was reason #2 behind the assumption I was making:

Reason #1 was that I tested it myself, I can’t remember how many months ago: the Gooseberry benchmark on my 390X was hitting “out of VRAM”.

Now, this is crazy: I did the test again and I can render it on my 390X. GPU-Z shows full usage of my video card’s memory and over 1 GB of dynamic VRAM (!?).

Amazed, I tested going back to Blender 2.77 and even the Crimson driver from Nov. ’16 (the one previous to the ReLive release), and it still does the render. (??)

Something weird is going on here. Maybe it’s good news about the OpenCL support, but… I’m a bit confused now.

For the moment I will edit my post to make it clear that maybe that’s not an appropriate test for evaluating HBCC functionality in compute-based rendering.
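
If anyone wants to check whether the driver is simply advertising more than the physical 8 GB to OpenCL, which would explain the 390X suddenly fitting the scene, a quick sketch like this should show what the runtime reports. Using pyopencl here is my own assumption about the easiest way to query it, not something anyone in the thread has run:

```python
import pyopencl as cl

# Print what each GPU reports as addressable memory. If the driver
# advertises more than the physical VRAM, or a large single allocation
# limit, that could explain Gooseberry fitting on an 8 GB card.
for platform in cl.get_platforms():
    for dev in platform.get_devices(cl.device_type.GPU):
        total = dev.global_mem_size / 2**30        # CL_DEVICE_GLOBAL_MEM_SIZE
        max_alloc = dev.max_mem_alloc_size / 2**30 # CL_DEVICE_MAX_MEM_ALLOC_SIZE
        print("%s: %.1f GiB global, %.1f GiB max single alloc"
              % (dev.name, total, max_alloc))
```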

Well, that’s a little disconcerting? I haven’t tried it without HBCC on yet. Maybe we need a bigger render project to try?
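
If nobody turns up a ready-made scene, one option is to script Blender into generating geometry until it blows past 8 GB. A rough sketch; the COUNT and SUBDIV values are pure guesses, so crank them until Cycles’ reported memory actually exceeds the card:

```python
import bpy

# Fill the scene with densely subdivided cubes. COUNT and SUBDIV are
# guesses -- raise them until the render passes 8 GB of VRAM.
COUNT = 8    # 8 x 8 grid of cubes
SUBDIV = 7   # each subsurf level roughly quadruples the face count

for x in range(COUNT):
    for y in range(COUNT):
        bpy.ops.mesh.primitive_cube_add(location=(x * 3, y * 3, 0))
        mod = bpy.context.object.modifiers.new("dense", 'SUBSURF')
        mod.render_levels = SUBDIV  # applied only at render time
```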