X99 is still, and will be for some time, one of the best platforms for Tech Enthusiasts and Home Servers

Let me start by saying, I’m a pretty deep-rooted AMD (ATi) shill/fanboy. It’s what I could afford years ago, and it’s nice that they’re semi-decent now.

That said, and out of the way, the X99 platform is, and will be, the greatest platform for some time to come!

Why:
The four main reasons it’s going to be one of the best for some time to come:

  1. Cheap Xeon CPU Selection: LGA 2011-3 Xeons are flooding the market, and 14C, 18C, 22C, etc. parts are getting dirt cheap! As of typing this, I recently acquired an E5-2696 v3 (18C/36T, 3.8GHz boost clock) for about $50.

  2. Cheap DDR3/4 ECC RAM: 16GB and 32GB DIMMs are getting dirt cheap! The v3 Xeons support up to 2133MT/s and the v4s up to 2400MT/s, so get what your CPU is capable of. I picked up 128GB of 2133MT/s ECC Samsung memory (4x 32GB DIMMs) for $120 a year ago.

  3. Cheap X99 LGA 2011-3 Motherboards: There are a lot of second-hand ones, but there are also the cheap Chinese ones. Personally, I use a JGINYUE X99 Titanium D4, and it’s got full PCIe lanes (using a Xeon) to every slot: 3x slots at full x16 bandwidth, 1x slot at x4, and 1x slot at x1. It’s also got two M.2 slots: one PCIe Gen3, one SATA. Of the many Chinese X99 LGA 2011-3 boards that have been tested, this board’s VRM solution gets a lot of praise. It does have a small 30mm x 30mm x 7mm fan in the VRM heatsink, but even with it off, the board still works perfectly fine.

  4. Very Capable for old tech: The CPUs/RAM were designed for 24/7 use at full utilization in data centers. These never got OC’d or ran in hot environments. They may be old for a datacenter, but they’re still plenty capable of running/hosting VMs, file sharing, or streaming/capturing gameplay, not to mention Xeons offer full PCIe lanes to any fully wired slot.

The E5 LGA 2011-3 v3 Xeons are Capable:
It’s old, it’s not new, so what?! I use mine to do all of the following simultaneously, with ZERO issues!

  1. Host Game Servers: I’ve got 7 VMs running, hosting dedicated servers for 7D2D, RUST, Wreckfest, BOII, Minecraft, Unreal Tournament, and Ground Branch (I’ll probably spin up a few more). These only need to simulate the game world, not render the graphics, so the loads are light.

  2. Family File Sharing/Hosting: all of my family can view/download pictures and videos I upload, through a VPN connection I set up for them. It’s free to my family, and they don’t need to worry about a monthly charge. I have backups running to another device at a friend’s house (we have a DR setup at each other’s place). If there are any issues, they just text me. They don’t have to worry about a company going bankrupt, or locking them out behind a paywall of sorts.

  3. Streaming/Game Capture: while still hosting games and letting my family pull (or upload) pictures/videos, I can still stream or capture game footage.

Why people who stream think they need the latest and greatest is beyond me! A great Home Server can be built and run for a fraction of the cost of a new computer, and serve one’s purposes much better.

My Full Build:

CPU - Xeon E5-2696 v3 18C/36T----------------------------------------------------------------------- $50 (cheaper options)
CPU Cooler - Anything for LGA 20XX should work------------------------------------------------ $30 (I got a DeepCool tower cooler, but pretty much anything rated for a 145W TDP will work)
MoBo - JGINYUE X99 Titanium D4 (running a 90mV undervolt mod)--------------------- $87 (about the lowest for Chinese X99 boards)
RAM - 128GB 2133mt/s ECC Samsung 4x 32GB DIMM’s (w/ FreezerMod blocks)— $120 (cheaper options)
GPU - Nvidia Quadro P600------------------------------------------------------------------------------ $80 (cheaper options)
Capture - AVerMedia Live Gamer 4K---------------------------------------------------------------- $180 (cheaper options)
SSD - SATA III - Phillips 240GB - host OS----------------------------------------------------------- $30
SSD - PCIe NVMe AIC - SanDisk SX350-6400 - hosts VMs and Game Capture--------- $613 (cheaper options - also $13 for an expansion slot cooler for active cooling)
HDD - 4x 4TB SAS in RAID 5 - LSI MegaRAID 9264 4i - host file share-------------------- $99 ($25 for controller, $74 for 5x used 4TB SAS drives - cheaper options)
PSU - Thermaltake 600W-------------------------------------------------------------------------------- $36 (I’ll eventually upgrade to something better than a white label)
Case - Rosewill----------------------------------------------------------------------------------------------- $60 (about $40 for the case, and $18 for an aftermarket 4x 3.5" drive cage w/ trays)

Trimming Cost:
From what I paid, with my specific config, I’m at about $1385 all in. However, if you swap the PCIe NVMe AIC for a cheap 1TB-2TB PCIe NVMe SSD, it’d be down to about $872, and that wouldn’t be a huge sacrifice. If I had gone with a 60FPS-capable capture card, I could knock off another $130, putting me at $742; again, for someone streaming, that would work just fine! Most people don’t need a swath of storage, and could probably shrink the mass storage, or do without it altogether, trimming off another $99 and bringing a possible Home Server build to $643. I also got my original Xeon E5-2697 v3 for $28 shipped, which would trim another $22 off the cost, bringing it down a hair more to $621.
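Running the numbers on the above with a quick Python tally (the $100 figure for a budget 1TB-2TB NVMe drive is my placeholder assumption, not a quoted price; note the $28 CPU swap saves $22 off the $50 chip):

```python
# Tally of the parts list above, plus each trimmed-down variant.
parts = {
    "CPU (E5-2696 v3)": 50,
    "CPU cooler": 30,
    "Motherboard": 87,
    "RAM (128GB ECC)": 120,
    "GPU (Quadro P600)": 80,
    "Capture card": 180,
    "SATA SSD": 30,
    "NVMe AIC + cooler": 613,
    "HDDs + RAID card": 99,
    "PSU": 36,
    "Case + drive cage": 60,
}

full = sum(parts.values())
print(f"Full build: ${full}")                      # $1385

# Swap the $613 NVMe AIC for a budget NVMe drive (assumed ~$100).
trimmed = full - parts["NVMe AIC + cooler"] + 100
print(f"Budget NVMe instead of AIC: ${trimmed}")   # $872

trimmed -= 130   # 60FPS capture card instead of the 4K one
print(f"Cheaper capture card: ${trimmed}")         # $742

trimmed -= parts["HDDs + RAID card"]               # skip mass storage
print(f"No mass storage: ${trimmed}")              # $643

trimmed -= 50 - 28   # $28 E5-2697 v3 instead of the $50 E5-2696 v3
print(f"Cheaper Xeon: ${trimmed}")                 # $621
```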

Gripes:
It runs 24/7 in the corner of my office. The only gripe I have is the fans; I wish they could be a bit quieter. I didn’t include them in the cost, because you can sometimes find someone giving away 120mm fans or selling them dirt cheap, and sometimes the case and CPU cooler already include them.

Closing:
So if you have a friend saying they want a Home Server, and they’re looking to share files, run servers for games, and stream/record gameplay, this style of build would be perfect for them!

4 Likes

I have to agree. I had two X99 systems, and my Supermicro servers (Xeon E5-2600 v4 series CPUs) are based on the same chipset. It was one of Intel’s best CPU and chipset combinations.

1 Like

For raw value it seems like a good option.

For a longer life cycle / lower power usage, I’d probably do a 3800X/3600X AM4 system. If you need the lanes like I did, just deploy two boxes; idle power would probably be about the same, in all honesty.

The best system is always the one you can afford.

4 Likes

Oddly enough, I’m upgrading from a 2680v4 home server as we speak. The only issue I’ve had with it is that my friends (curse them, 100% their fault :eyes:) are fond of horrendously unoptimised games that prefer single-core performance over everything. The 3.4GHz absolute max of a Broadwell CPU core just isn’t enough for Satisfactory or Space Engineers, sadly, and a 4GHz 1680v4 just isn’t enough of an upgrade to be worth the money (plus I’d lose 6 cores, anyway).

I’ve got a Supermicro X13SAE-F en route, which I’ll pair with some ECC DDR5 and (probably) an i7 12700. That should give me about double the single-core performance when it’s running flat out, and a bit more multi-core despite the loss of 6 cores.

If it wasn’t for these stupid (fun) games I’d be quite happy with the 2680v4 for sure. I’m sure someone will purchase the board/CPU/RAM from me and be very happy with it :slight_smile:

1 Like

I’ve never played it, but anecdotally my old dual-CPU E5-2697A v4 system (16 cores per socket, 3.6GHz turbo) doesn’t have any trouble with current games at 2560x1440 144Hz.

Are you sure there isn’t a bottleneck somewhere else in the system? You might also check that your snoop setting is set to “home snoop” in the UEFI (it makes a huge difference).

For those interested in a full system, an HP Z640 barebones with the second processor card only runs about $300 on eBay. I own two that have run around the clock for years; they’re virtually silent, and very transportable!

1 Like

I tend to agree; for the price, X99 is very tempting right now. When E5-2680 v4 14-core CPUs go for $17 shipped, that’s a lot of CPU for not a lot of money. The biggest concern is power draw, but that might not matter as much for workstation or even decent budget gaming setups.

1 Like

Gaming is probably a pass, though, with cheap Ryzen around.

I’m having a very hard time upgrading my X99-DELUXE II and 6950X system at work even though I already have a replacement Ryzen platform ready to go. Quad-channel RAM, two onboard NICs, two SATA controllers, Thunderbolt 3, two U.2 ports and a vertical M.2 slot, several USB 3.1 Gen2 controllers, 802.11ac and BT4… I mean, come on. And I’ve got two GPUs, a SAS-3 HBA, and a 10GbE NIC in there all running with sufficient bandwidth.

More cores would be nice… better ST perf would be nice… PCIe Gen4 would be nice… but any significant upgrade is prohibitively expensive and/or comes with compromises. If and when I put the aforementioned Ryzen platform in there I’m going to lose a SATA controller, an expansion slot, and half my memory throughput. I might be waiting for Sapphire Rapids or Threadripper 7000 to hit the secondhand market…

1 Like

Oh, it’s not playing the games, it’s while hosting the dedicated servers for these games - I’ve tweaked the Ubuntu install that Satisfactory runs in, and the ESXi install, as much as I can, but it just can’t handle the amount of crap me and my friends build in that game :rofl: on a self-hosted instance it’s no problem, but when you have multiple people around in a single world it seems to really beat up a single core of the CPU.

I could wait for them to optimise the dedicated server software, but on the other hand power draw is also a thing, and I’ve decided I might as well get rid of the Xeon etc while they still have some value :slight_smile:

This would be true if either Haswell or Broadwell had an IMC that could tap into the maximum bandwidth of quad-channel DDR4. Spoiler: they don’t.

Broadwell can’t really extract more than about 80-85GB/s from DDR4, even with everything tuned to the absolute limit (around 3600MT/s). Zen 3, on the other hand, with a much newer memory controller and much better DDR4 scaling up past 4000MT/s, can get really close to peak theoretical bandwidth; tuned well, it can pull 65-70GB/s. So you’re giving up at most 20% in a best-to-best comparison.
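For context, the theoretical peaks those efficiency figures are measured against fall straight out of channels × transfer rate × 8 bytes per transfer; a quick sketch (theoretical numbers only, real-world throughput is what the measurements above show):

```python
def ddr_bandwidth_gbs(channels: int, mt_per_s: int) -> float:
    """Peak theoretical DDR bandwidth in GB/s: 8 bytes move per transfer, per channel."""
    return channels * mt_per_s * 8 / 1000

# Quad-channel Broadwell tuned to DDR4-3600:
print(ddr_bandwidth_gbs(4, 3600))  # 115.2 GB/s theoretical, yet only ~80-85 achievable
# Dual-channel Zen 3 at DDR4-4000:
print(ddr_bandwidth_gbs(2, 4000))  # 64.0 GB/s theoretical, and it gets close to that
```

So Broadwell leaves a large fraction of its quad-channel ceiling on the table, while Zen 3 runs near its dual-channel ceiling, which is why the best-to-best gap is only around 20%.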


4 Likes

Thanks for the thread, HopnDude!

Although I LOVE to scrounge for recycled parts and see what I can make with the smallest budget possible, we have to admit something: the largest thing holding back this kind of setup is power/heat/electricity.

Gamers Nexus did a video about X99 with several parts sourced from AliExpress.

At the 23:00 mark the power consumption is reviewed.

In the long run, power might cost you more than what you saved up front on the system build.

For reference, an AMD Ryzen 9 3900X (with a similar core count) will use about 100 fewer watts under load.

With that being said, I do agree that “x99 is still… one of the best platforms for tech enthusiast and home servers.”

Cheers! :beers:

2 Likes

Nobody is arguing that a Zen 3 chip wouldn’t absolutely run circles around X99 in every way; I just think that being able to build a half-decent system from salvaged circa-2015 enterprise CPUs, for about the price of just a new motherboard, is kind of neat.

I just scoured eBay for a bit for an example…
Kllsire X99 mATX Motherboard $89
E5-2680v4 CPU $16.98
16GB PC4 2400T $18
Be quiet! BK005 Shadow Rock 3 $53.65
Total $177.63
(Could probably find a much cheaper cooler as long as it can handle a 120W CPU.)

That would give you an mATX 14-core system with a 3.3GHz turbo and 16GB of ECC memory. Everything else, like the SSD, PSU, GPU, and chassis, can be splurged on and carried over to a new system when that person can afford it.
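For anyone double-checking, the parts list sums exactly:

```python
# eBay example build from the post above: board, CPU, RAM, cooler
prices = [89.00, 16.98, 18.00, 53.65]
total = sum(prices)
print(f"${total:.2f}")  # $177.63
```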

2 Likes

I very much agree with this. If you want lots of memory or PCIe lanes then the options are still quite limited. The best offers I know of at the moment are Epyc Rome motherboards and CPUs on eBay from the more reputable Chinese recyclers, and even then you have to compromise on single-threaded performance.

1 Like

I used my i7-4820K system for so long. I bought it on 12th Nov 2013.

I upgraded it to an i9-10900K on 14th April 2021.

The i9-10900K died a while back and was RMA’d through the retailer, which took a couple of months. It then took me a few more months to get around to ripping the i7 out and putting the i9 back in.

The i7-4820K system never missed a beat; it just started to get a little long in the tooth, and I wanted to turn it into a test lab server. When I first built it, I put 4x 4GB of RAM in it. I then watched the prices and got a good deal on 8x 8GB, so it’s now ideal for hosting VMs. I could, as suggested, swap a Xeon into the M/B and get a lot more cores. This system was great; the number of devices I could hang off it, thanks to all the PCIe lanes, was great.

The i9-10900K system is a pain. I think I’ve got it right this time around. I have 2x GPUs (for all the monitors I run), 2x NVMe SSDs, 2x SATA HDDs, and a Blu-ray drive.
I’ve had to read the M/B manual VERY carefully, for the first time ever, to get this setup right. Despite it having 3x NVMe slots, 6x SATA ports, and however many PCIe slots, you CAN’T fully populate it; everything shares bandwidth with everything else. The first time around I had it wrong, and I noticed poor graphics performance on monitors attached to the 2nd graphics card, which wasn’t an issue on the i7-4820K system. This time around, putting the i9-10900K system back together, I think I’ve got everything in the right slots, so the only thing that shares bandwidth with anything else is the Blu-ray drive, and given how little that gets used these days, it shouldn’t be an issue.

I was excited when I heard rumours that Intel was going to bring back HEDT, but I realise now that I’ve heard nothing on that front for a while, and sadly it may not be happening.

Side Note:
This would require too much buy-in from all the vendors, but I’d love to see PCIe Gen5 graphics cards be only x8 to free up 8 lanes for other purposes. I’m sure I saw a video somewhere that showed current cards running at PCIe Gen3 and then Gen4, with less than a 10% performance difference.
By reducing the number of lanes add-in cards use, thanks to the increased bandwidth of the newer PCIe generations, we could have the same number of lanes but more add-in cards.
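The lane math backs this up: per-lane bandwidth roughly doubles each PCIe generation, so a Gen5 x8 link matches a Gen4 x16 and doubles a Gen3 x16. A rough sketch, using the ~0.985 GB/s per Gen3 lane that 8 GT/s with 128b/130b encoding works out to (and ignoring protocol overhead beyond encoding):

```python
# Approximate usable PCIe bandwidth in GB/s: ~0.985 GB/s per Gen3 lane,
# doubling with each generation after Gen3.
def link_bandwidth(gen: int, lanes: int) -> float:
    return 0.985 * 2 ** (gen - 3) * lanes

print(f"Gen3 x16: {link_bandwidth(3, 16):.1f} GB/s")  # ~15.8
print(f"Gen4 x16: {link_bandwidth(4, 16):.1f} GB/s")  # ~31.5
print(f"Gen5 x8:  {link_bandwidth(5, 8):.1f} GB/s")   # ~31.5
```

A Gen5 x8 card would still get double the bandwidth of today’s Gen3 x16 cards, while leaving 8 CPU lanes free for an HBA, NIC, or NVMe.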

If and when I put the aforementioned Ryzen platform in there I’m going to lose a SATA controller, an expansion slot, and half my memory throughput.

Memory problems aside, I’m at this exact crossroads now. I’ve got my 5900X set aside with a B550 MB to replace my Xeon E5-2673 v3 as the homelab, but I keep running into slot/expansion problems.

The B550 MB has 1x x16, 4x x1, and 1x x4 slots, but the x4 shares bandwidth with the x1s, and when I use one of the x1s, the built-in MB ethernet stops responding. I’m waiting on an x1 ethernet card to run some slot-permutation tests. I was able to free the x16 for a future HBA card, but I keep coming back to the Xeon for stability.

With the Xeon on a Chinese MB, everything works well. The only downsides are single-thread performance and higher power usage, but since I couldn’t set up both machines with the same configuration, I don’t have enough data to compare.

So, Xeons for the win right now. Supermicro MBs are almost impossible to find and pricey as hell in Brazil, so no EPYC for me for a while. Also, this generation of LGA 2011 Xeons is very desktop-like in its cooling options, whereas EPYC and the newer Xeons would require something more specific.

1 Like

Yep, that’s definitely going to be an issue.

Agree. An X570 would be way better, but economically, for the price of a single X570 MB here, I can buy a kit with an MB, a Xeon, and 64GB of RAM, and have some change to spare for a 1TB NVMe or a SATA drive. Third-world problems.

2 Likes

This is a recurring topic on the forum… please give us consumer platforms that feature more slots with fewer lanes at higher speeds. Unfortunately it seems to be moving in the wrong direction since very few LGA1700 and AM5 boards offer x8/x8 CPU-attached slots and the ones that do are prohibitively expensive. It’s like they’re trying to push us to HEDT but they forgot that they stopped making HEDT platforms.

4 Likes

The biggest concern I have with X99 is using the Chinese motherboards and driver packages. Without these boards, the value proposition drops quickly.

I was using an old Dell R610 for everything, but when I was switching over to a new server for power/space reasons, I did consider X99. I ended up using an old Ryzen 2400GE to run TrueNAS SCALE for a NAS and two VMs. It barely uses any power and I really can’t complain, but I am considering X99 again as I build a new game server that will host two VMs. I’m just concerned about some of the driver packages setting off alerts from Sophos and Malwarebytes.

If you don’t need an enormous number of threads, I think the value proposition of X99 is quite low, but it certainly is a cheap platform (with the Chinese “gaming” type boards) that makes it easy to get 40+ threads.

I was just looking at an old X99 system for 600 bucks, which where I live doesn’t seem too expensive; that would be the price of a 7700X over here:

Sabertooth X99
i7 5920 w/ Corsair H110i
4x 8GB Vengeance
1070 G1
120GB SSD + 700
Corsair CX850
TT Urban S71

Now you’re making me really consider it. I’ll just need to find a Xeon down the road. And that S71 with the HDD/SSD external bay, it’s a dream.