Test Driving AMD's Epyc 32 Core Monster CPU | Level One Techs

Thanks for watching our videos! If you want more, check us out online at the following places:
+ Website: http://level1techs.com/
+ Forums: http://forum.level1techs.com/
+ Store: http://store.level1techs.com/
+ Patreon: https://www.patreon.com/level1
+ L1 Twitter: https://twitter.com/level1techs
+ L1 Facebook: https://www.facebook.com/level1techs
+ L1/PGP Streaming: https://www.twitch.tv/teampgp
+ Wendell Twitter: https://twitter.com/tekwendell
+ Ryan Twitter: https://twitter.com/pgpryan
+ Krista Twitter: https://twitter.com/kreestuh
+ Business Inquiries/Brand Integrations: [email protected]

This is a companion discussion topic for the original entry at https://level1techs.com/video/test-driving-amds-epyc-32-core-monster-cpu

As someone who has been forced to go “to the cloud” for large dataset computing, this was a sight for sore eyes… and sore wallets.

Thanks for continuing to dive deep into the nitty gritty with these setups Wendell!

I’m glad this is in the mix :slight_smile: I’ve been waiting for some EPYC content.


Speaking of PCIe port bifurcation, what about IRQs for the four split PCIe x4 links? Are they shared?

I’m out of PCIe slots on the ASRock X399 Taichi, so I’m considering splitting one of the PCIe x8 slots for two Blackmagic DeckLink cards (x16 → 4x NVMe, NVMe → PCIe x4). In my case both BM cards will be used in the host OS.

PCIe bill:
16x PCIE1 (HostOS: GTX1080Ti)
1x PCIE2 (RS-232 card)
8x PCIE3 (4x 2.0 BM Decklink Recorder 4K Pro)
16x PCIE4 (GuestOS: GTX960)
8x PCIE5 (4x 2.0 BM Decklink Intensity 4K Pro)
4x NVMe (Samsung EVO960 for GuestOS)
4x NVMe (MP500 other VMs)
4x NVMe (not used)

48 of 60 PCIe lanes in use (56 counting physical slot widths)

If I succeed with PCIe bifurcation, I can return the HD 5770 to the empty PCIe x8 slot (for GuestOS2_XP).

Who said sixty PCIe lanes is enough? ;-)
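On the IRQ question upthread: with MSI/MSI-X (the norm on current NVMe and capture hardware), each bifurcated x4 device gets its own interrupt vectors, so the classic shared-IRQ headaches shouldn’t apply. A quick sketch to check what each device actually got on Linux (the helper and the `"nvme"` hint are hypothetical; it just greps `/proc/interrupts`):

```python
# With MSI/MSI-X, each bifurcated x4 device gets its own interrupt vectors,
# so legacy shared-IRQ trouble shouldn't come up. Hypothetical helper to see
# which IRQ numbers a given device is actually using:
def irqs_for(interrupts_text, device_hint):
    """IRQ numbers whose /proc/interrupts line mentions device_hint."""
    irqs = []
    for line in interrupts_text.splitlines():
        label = line.split(":", 1)[0].strip()
        if device_hint in line and label.isdigit():
            irqs.append(int(label))
    return irqs

# Usage on Linux: irqs_for(open("/proc/interrupts").read(), "nvme")
```

If each NVMe device (or DeckLink) shows up with its own distinct IRQ numbers, they aren’t sharing.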

@Wendell, I just watched the YouTube video and heard you’re open to doing workload tests. I’m curious how well that would scale with FFmpeg HEVC encoding vs. my OC 1700X.

Could I cook up a test for you with some beefy Carry Trainer footage and a custom FFmpeg command line?
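For the sort of benchmark I have in mind, something like the following FFmpeg invocation (file name, preset, and CRF are placeholders to tune): `-f null -` throws the output away so the run measures pure encode throughput, and `-benchmark` prints timing stats at the end.

```python
# Sketch of an FFmpeg x265 benchmark invocation (hypothetical file name;
# tune preset/CRF to taste).
def hevc_benchmark_cmd(src, preset="medium", crf=23):
    return [
        "ffmpeg", "-i", src,
        "-c:v", "libx265", "-preset", preset, "-crf", str(crf),
        "-benchmark",       # print wall-clock and CPU time when done
        "-f", "null", "-",  # discard output: measures pure encode speed
    ]

print(" ".join(hevc_benchmark_cmd("carry_trainer_4k.mov")))
```

Running the same command on the Epyc and the 1700X would give a clean fps-per-core comparison.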

Really nice build! I’m curious to know more about the AMD CBS settings seeing as Gigabyte barely explains any of that in the manual. E.g. how does the determinism slider affect performance? Is there something else that could squeeze out the last drop?

Just to add to the discussion, I’ve built a very similar system to use as a fat node in a cluster for physics simulations. I used the same processor and chassis, but with the Gigabyte MZ31-AR0 EATX motherboard, 512 GB 2666 MHz RAM (16x32GB) and used a Noctua NH-U14S heat sink with a Noctua 3000RPM iPPC fan. The other nodes are almost identical except based on the Threadripper 2990WX using 128 GB RAM. There were some minor hiccups:

  • The memory runs at 2133 MHz when all 16 slots are fully populated. To my knowledge there’s not much I can do about this. Is it the same for the MZ01-CE0? It has some frequency settings I’ve not seen on mine.
  • The EATX motherboard doesn’t really fit in the R6 ATX chassis; there was a labeling mistake at our suppliers, so I had to get creative:
    • The heat sink collides with the hard drive mounts, so I had to remove the mounts, panel and supporting beam that holds all the drives completely.
    • The mobo is wider than the back plate (EATX > ATX), so only half of it is supported by the mounting screws; the rest hangs in the air. Luckily I found some cotton furniture pads to squeeze in between the mobo and back plate, so the area under the CPU and DIMMs gets proper support.

It worked nicely in the end and it’s up and running. A really good performer too, and it seems to suffer less than the Threadrippers from memory-demanding workloads at high thread counts, presumably due to the additional memory channels/bandwidth. I’m really eager to get back to work and explore the BIOS some more… or perhaps I should just use IPMI :smiley:

In any case, love to see Epyc content :slight_smile:

I only have a total of 8 slots, but 3200 worked just fine with all 8 populated.

I’m curious, can one buy the Gigabyte MZ01-CE0 online anywhere? Or is it necessary to call one of Gigabyte’s distributors and get a quote? If you’re allowed, I also wouldn’t mind knowing the general ballpark price I should expect to pay for the MZ01-CE0. Thanks for any help you can provide.

All that x4 Bifurcation means I could build the ultimate vMix system to compete with the Tricaster. Man, with that many lanes, you can fit as many 4K capture cards as you want for multi-camera production.

Anyone who isn’t familiar with vMix:


And Tricaster is a Newtek product that is basically a Windows Enterprise OS PC with lots of capture hardware:

Edit: Oh yeah, and vMix is so much better than the Epiphan Pearl if you have a monster machine like that.
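On the “as many 4K capture cards as you want” point, the back-of-envelope lane math is fun (the GPU/NVMe reservations below are my assumptions, and in practice a real board runs out of physical slots long before the lane budget does):

```python
# Back-of-envelope: how many x4 capture cards fit on a single-socket
# EPYC's 128 PCIe lanes after reserving a GPU and recording drives?
# (Assumed reservations; physical slot count is the real limit.)
TOTAL_LANES = 128       # 1P EPYC exposes 128 usable lanes
gpu = 16                # one x16 GPU for the vMix output/UI
nvme = 2 * 4            # two x4 NVMe drives for recording
capture_card = 4        # typical x4 4K capture card

spare = TOTAL_LANES - gpu - nvme
print(spare // capture_card, "x4 capture cards' worth of lanes left")  # 26
```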

Great video. I’m researching a build for 2019 to replace an FX-8350 cloud host (I need that management port). I have some questions:

  1. Have you run any Windows 7 VMs yet on the Epyc platform? How is the performance? Any issues? Does the power of the Epyc show through in the Windows 7 VM?

  2. What power supply did you choose for that motherboard? It requires two 8-pin 12 V connections, and I’ve not found a redundant server power supply that has two of them.

First post on the forum … Hello!

Followed Level1 on YT for a while. Seeing this build has motivated me into building an EPYC system.

(that and getting a stunning deal on a 7401P…)

So, finding SP3 boards in the UK has proven to be difficult; however, I seem to have found a supplier that can sell me a Gigabyte MZ01-CE1 (ATX, no 10 GbE).

I planned eight (new) 8 GB 2400 MHz DDR4 ECC sticks to make 64 GB, fully populated.

For similar money I could go for four second-hand Samsung 32 GB DDR4 2Rx4 PC4-2133P sticks.

It will become my replacement ESX 6.5 server where I run a number of Xubuntu virtual desktops.

  • Am I going to see much difference dropping from 2666/2400 to 2133?
  • How big a deal will it be to only populate 4 slots?

My gut tells me twice the memory will outweigh the speed, but I am really not sure about half populated vs fully populated…
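For a rough feel of the trade-off, the theoretical peak-bandwidth math looks like this (idealized numbers, and note that on a Naples part like the 7401P, I believe four DIMMs also leaves some dies with no local memory, which hurts NUMA-sensitive loads beyond what raw bandwidth suggests):

```python
# Theoretical peak memory bandwidth for the two options. DDR4 moves
# 8 bytes per transfer per channel; real-world throughput will be lower.
def peak_gbs(channels, mt_per_s):
    return channels * mt_per_s * 8 / 1000  # GB/s

eight_sticks = peak_gbs(8, 2400)  # 8x8GB: every channel populated
four_sticks = peak_gbs(4, 2133)  # 4x32GB: half the channels empty
print(f"8ch @ 2400: {eight_sticks:.1f} GB/s, 4ch @ 2133: {four_sticks:.1f} GB/s")
```

So the half-populated option gives up well over half the peak bandwidth, not just the clock difference — worth weighing against the extra capacity.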

Any advice gratefully received… :smiley:

Bump… Anyone got any advice on EPYC memory config?