Is Thunderbolt worth it for Linux?

First build in 12 years

I’m thinking of a build around the ASUS ProArt Z790-Creator board, which has 10GbE, 2.5GbE, two Thunderbolt ports, and what looks like 3-slot spacing between the first two PCIe slots, which can run x8/x8.

I’m wondering whether Thunderbolt is worth it and is likely to work as a networking bridge in Linux, or whether I’m better off finding another x8/x8 board with wider PCIe slot spacing that could potentially fit 2x RTX 4090s or a 3.5+ slot card next to a NIC.

If I’m reasonably confident the Thunderbolt ports will work for data and networking, I’ll go for this ASUS board.

But will it?
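For context, here is what I’d expect the setup to look like on a stock kernel. This is a sketch, not a tested recipe: the `thunderbolt-net` module has been in mainline kernels for years, and the bridge part is ordinary `ip` bridging; the interface names (`thunderbolt0`, `enp2s0`) are examples and will differ per system.

```shell
# Load the Thunderbolt networking driver (mainline since kernel 4.15)
sudo modprobe thunderbolt-net

# After connecting the cable, the peer shows up as a network interface,
# typically named thunderbolt0
ip link show thunderbolt0

# Layer-2 bridge between the Thunderbolt link and a wired NIC
# (enp2s0 is an example name; substitute your own)
sudo ip link add name br0 type bridge
sudo ip link set thunderbolt0 master br0
sudo ip link set enp2s0 master br0
sudo ip link set dev br0 up
```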

Build idea

Because I’m too new to post links:
https://pcpartpicker.com/list/cH3GBj

| Part | Component | Price |
| --- | --- | --- |
| CPU | Intel Core i7-13700K 3.4 GHz 16-Core Processor | $409.00 |
| CPU Cooler | Thermalright Peerless Assassin 66.17 CFM CPU Cooler | $45.90 |
| Motherboard | Asus ProArt Z790-CREATOR WIFI ATX LGA1700 Motherboard | $419.99 |
| Memory | G.Skill Ripjaws S5 64 GB (2 x 32 GB) DDR5-6000 CL30 Memory | $204.99 |
| Storage | TEAMGROUP MP33 2 TB M.2-2280 PCIe 3.0 x4 NVMe Solid State Drive | $69.98 |
| Storage | TEAMGROUP MP33 2 TB M.2-2280 PCIe 3.0 x4 NVMe Solid State Drive | $69.98 |
| Storage | Sabrent Rocket 4 Plus 2 TB M.2-2280 PCIe 4.0 x4 NVMe Solid State Drive | $129.99 |
| Video Card | PNY VERTO GeForce RTX 4090 24 GB Video Card | $1,599.99 |
| Case | Fractal Design Torrent ATX Mid Tower Case | $189.99 |
| Power Supply | SeaSonic VERTEX GX-1000 1000 W 80+ Gold Certified Fully Modular ATX Power Supply | $248.41 |
| Wired Network Adapter | Weird MikroTik PCIe 25GbE | $199.99 |

TIA

So does this build make sense, or does it seem weird? I’m not sure if I’m overdoing it on the CPU, and my other main question is what case to use.

I’m not opposed to AM5; it just seems like Intel has the upper hand and tends to be more networking-friendly.

Do I need to get a copy of Windows to set up and verify the hardware?


L1 Build a PC questions:

Budget, country, buying

$1,600 plus $1,600 for the GPU; in the USA; prefer to buy from Amazon. I have peripherals. No gaming monitor, but I do have regular monitors.

OS

Proxmox or NixOS

Use cases

  1. Proxmox hypervisor
  2. Linux software dev (node.js, maybe some compiling)
  3. Deep Learning experimentation and self-hosting open source LLMs (PyTorch and docker containers using LLMs)
  4. Router/layer 2 bridge over Thunderbolt and a PCIe card
  5. Maybe a game like Destiny 2

Overclocking, water cooling

Probably not overclocking, perhaps basic XMP settings; no water cooling.

QNAP makes good TB3-to-10GbE adapters. I have two for a direct PC-to-PC connection.

Oh, it’s good to hear that these work well! I was considering one for my Mac, but I don’t yet have much other 10G gear to plug into it. I think if I needed a connection more than 2 m away, this would be a great option.

Are you using fiber or DAC cables with SFP or the RJ45 version?

If you plan on Thunderbolt with Proxmox, forget it; just boot Win10 and use your PC. Proxmox will pass the device through once, and then a reboot is needed because the end device is no longer detected. Thunderbolt is not made for passthrough, and getting a converter to 10GbE is quite a waste of resources; you can put in a 25GbE card for proper speed.
*As for those 2 m cables: I’ve had plenty, and they died. Just keep this in mind, as even the Corning ones I had to replace a lot.

My bad, soooooo sorry for the delay, I’ve been away.

Been using the SFP version, but if I could do it all over again I’d probably get the RJ45 for ease of switch shopping.

Based on the video Wendell just did, I think the Linux drivers I’d want to use are probably not worth it compared to just getting the QNAP device and running a DAC or fiber. I have some time before I build anything.

I appreciate the feedback 🙂


A 4090 on a Linux build is kinda dumb: Nvidia has fewer features than on Windows, and the features it does have lag behind their Windows counterparts by quite a lot. It works, but it’s not great. Also, the 4090 is so powerful that in games it will be bottlenecked even by a 13900k or 7950X.

Unless you feel you need the RT, native CUDA, and/or DLSS, which are hit and miss on Linux as things stand, I’d recommend stepping down to a 7900 XTX: the same amount of VRAM, about 25% less performance in AI benchmarks. If you are only doing some basic learning in AI, though, it is good enough. Save the extra money for a 13900k or 7950X instead; it will be a much better experience overall. Yes, your AI training runs will take longer: 25% less performance means roughly a third more wall-clock time, so a 16-hour run becomes around 21 hours. Is that a problem for your use case, though?

Then again, GPUs can easily be swapped later; a 13900k for some future 15900k, not so much.

Which missing features? DLSS? That could still be worked around with a Windows VM if there is a need.

Seems OP is more interested in AI. AMD will be a pain in the ass to set up; official ROCm support is not there yet, AFAIK.

For OP a 4090 or a 3090 on a budget are the best options IMO.

Disagree slightly: if Nvidia is a must, then a 4080 + 13900k is a better combo than a 4090 + 13700k (but not better than a 4090 + 13900k).

For AI/DL, VRAM trumps all else IMO, certainly for LLMs. That’s why I’d consider the 3090 as the next step down.

For other purposes, sure, a 4080 is solid. Or even for other kinds of DL that are a bit less VRAM hungry.

Will you, as a hobbyist, need more than 16 GB of VRAM though? I do not know, I am not involved in the AI field at all. Is this a hard requirement to run AI models?
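As a rough back-of-the-envelope answer (my own sketch, not a benchmark): the VRAM needed just for the weights is parameter count times bytes per parameter, before any KV cache or runtime overhead. By that math, 16 GB handles smaller or quantized models, but a 13B model at fp16 already wants about 24 GB.

```python
def weight_vram_gb(params_billion: float, bits_per_param: float) -> float:
    """Rough VRAM needed for model weights alone (no KV cache or overhead)."""
    bytes_total = params_billion * 1e9 * bits_per_param / 8
    return bytes_total / 1024**3

# A 13B model at fp16 barely fits in 24 GB, while 4-bit
# quantization brings the same model under 8 GB.
print(round(weight_vram_gb(13, 16), 1))  # 24.2
print(round(weight_vram_gb(13, 4), 1))   # 6.1
print(round(weight_vram_gb(7, 16), 1))   # 13.0 -- a 7B fp16 model fits in 16 GB
```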

Best would be to just pay the extra $200 for a 13900k instead of trying to cheap out here. The 13900k + 4090 is an awesome combo on Windows; on Linux, and for everything but AI, the 13900k + 7900 XTX is an even better combo.

So, the four options that present themselves:

| CPU | GPU | VRAM | CPU Str | GPU Str | Cost |
| --- | --- | --- | --- | --- | --- |
| Intel Core 13900k | Nvidia RTX 4090 | 24 GB | Very Strong | Super Strong | $2150 |
| Intel Core 13900k | Nvidia RTX 3090 | 24 GB | Very Strong | Very Strong | $2000 |
| Intel Core 13700k | Nvidia RTX 4090 | 24 GB | Strong | Super Strong | $1950 |
| Intel Core 13900k | Nvidia RTX 4080 | 16 GB | Very Strong | Very Strong | $1650 |
| Intel Core 13900k | AMD Radeon RX 7900 XTX | 24 GB | Very Strong | Very Strong | $1550 |

AMD is definitely not a bad choice for AI, but it has not yet caught up to Nvidia, and the software still needs to mature a bit more before it is as smooth. As always, I can only lay out the options as I see them: your PC, your money, your decision.

Just to clarify: I don’t game and may not even use a monitor on this machine. I just wanted to hit the price/performance sweet spot of the 4090 for experimenting with AI.

I just got the sense that the 13900k uses a lot of power and spits out a bunch of heat, and I figured I wouldn’t really be running it at peak often, so I could step down there.

This topic was automatically closed 273 days after the last reply. New replies are no longer allowed.