Darknut - My slightly-more-ghetto approach to dual E5-2670s

Yo, forums! It's the Fox Loli again :DDDDD

Ever since the Xeon E5-2670 dropped to $60 apiece, I have wanted to have a crack at building my very own overkill, over-the-top beastmeister of a machine, something to blow the socks off my old 2600K computer. This is something that a lot of others have done in months past, and it took me a little while to muster the courage (and the funds, hehe) to pull this off.

The machine will consist of the following:
• An Asus Z9PE-D8 WS
• Two Intel Xeon E5-2670s (with the C2 stepping, SR0KX, so as to retain the ability to use VT-d)
• The EVGA Supernova 750G2 that was in my old machine
• Two Xigmatek Dark Knights (I will explain in a bit)
• 24GB of DDR3-1333 (I will be replacing it with 32/64GB ECC memory later)
• A Sapphire Radeon 7850 OC edition that I have lying around.
• Windows 10 Professional (I need to be able to use all the RAM I plan to put in this machine)
• Crucial BX200 256GB SSD (Hard drives will be added later.)
And last but probably most importantly:
• My heavily, heavily modded Fractal Design Define R4.
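On the stepping point above: once a chip is in hand, you can verify you actually got a C2 part by checking the CPUID stepping that Linux reports in /proc/cpuinfo. This is just a sketch - the mapping (stepping 6 = C1, stepping 7 = C2 for family 6, model 45 Sandy Bridge-EP) is my understanding and worth double-checking against Intel's spec update before relying on it:

```python
# Sanity-check the stepping of a Sandy Bridge-EP Xeon from /proc/cpuinfo.
# Assumption: C1 parts report stepping 6, C2 parts report stepping 7
# (family 6, model 45) - verify against Intel's spec update.

def is_c2_sandy_bridge_ep(cpuinfo_text: str) -> bool:
    """Return True if the cpuinfo text looks like a C2-stepping E5-26xx."""
    fields = {}
    for line in cpuinfo_text.splitlines():
        if ":" in line:
            key, _, value = line.partition(":")
            fields[key.strip()] = value.strip()
        elif not line.strip():
            break  # only inspect the first processor block
    return (fields.get("cpu family") == "6"
            and fields.get("model") == "45"
            and fields.get("stepping") == "7")

if __name__ == "__main__":
    with open("/proc/cpuinfo") as f:
        print("C2 Sandy Bridge-EP:", is_c2_sandy_bridge_ep(f.read()))
```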

What is this build going to be used for?
• Video editing
• 3D Rendering
• Video encoding with 3 or 4 instances of Handbrake running simultaneously
• Lots and lots and lots of virtual machines :D
• All-around workstation badassery.

Now, I promised ghetto-ness, and here are the points where it comes in; keep these things in mind if you ever plan to do something similar.

Firstly, the case I shall be putting this in is my good old Fractal Design Define R4. This case is very dear to my heart, but it's not the Define XL that was in the dual-Xeon workstation video Tek Syndicate put out 3 years ago (which they mistakenly called an R4). The EEB motherboard I'm putting in this case is so wide that, to get it to fit at all, I had to remove both drive cages and cut off about a quarter of the 5¼in bay and the plastic drive-cage rail with a hacksaw. That is far easier said than done - a normal hacksaw can't get to the angle necessary to carve out the bay. You'll have more luck with an angle grinder.

Second of all, in terms of mounting holes, SSI EEB is not supported by the Define R4 or by any mid-tower case I am aware of. The holes covered in yellow are native ATX holes. The red holes need standoff holes drilled into the motherboard tray. The green holes also need standoffs, but they fall beyond the width of the mobo tray on the R4, so the motherboard will have to be held up by a spacer of some description on the far right-hand side.

Third and last of all, the coolers. I picked up the two Xigmatek Dark Knights from eBay because they were cheap ($17 apiece) and because they'll handily beat a Hyper 212 Evo any day of the week.

Now, these are version 1 of the Dark Knight, which was released before Socket 2011 was introduced, so they don't have out-of-the-box support for Socket 2011 motherboards the way the Dark Knight II does.

Thankfully, because of the nature of their retention mechanism, and because Socket 1366 has the same mounting footprint as 2011, the retention screws can easily be swapped for ones that match the threaded mounts on a Socket 2011 board. What is needed for each cooler is 4 M4-threaded screws between 26 and 28mm in length. Note that these must be M4 screws - DO NOT USE 6-32 SCREWS FOR THIS, even though they appear to have similar threads - you will damage the threads and the mount will not be secure.
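If you want the numbers behind that warning, the two thread standards are close enough to start but far enough to cross-thread:

```python
# Why a 6-32 screw "almost" fits an M4 hole but wrecks the threads:
# the diameters and pitches are close enough to engage, far enough
# to bind and strip after a few turns.
IN_TO_MM = 25.4

m4 = {"major_mm": 4.00, "pitch_mm": 0.70}          # M4 x 0.7 coarse
unc_6_32 = {"major_mm": 0.138 * IN_TO_MM,          # #6 major dia ~3.51 mm
            "pitch_mm": IN_TO_MM / 32}             # 32 TPI -> ~0.794 mm

diff_major = m4["major_mm"] - unc_6_32["major_mm"]     # ~0.49 mm undersized
diff_pitch = unc_6_32["pitch_mm"] - m4["pitch_mm"]     # ~0.094 mm per turn

print(f"6-32 is {diff_major:.2f} mm under the M4 major diameter")
print(f"pitch mismatch accumulates {diff_pitch:.3f} mm every turn")
```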

I decided to call this Darknut because it's clad in white armour (the R4 is white) and it has Dark Knights in it. It makes me think of the Darknuts from Legend of Zelda: The Wind Waker.

Now, I haven't started quite yet because I need to pick up the screws, and I'm also planning on turning this into a build video... But I will update as progress is made, with pictures! :DDD

Until then, have fun! ;3


Awesome! I can't wait to see your progress. I am about to do a build around that same chip because of the price...
There is a 10-core chip, the v2 variant of the E5-2670:


It has the best 10-core price-to-performance...
I really want it because the board I am going to go for is only going to have one socket, when you see my build you will see why ;)
There are some space constraints...
Anyway, since the standard 2670s dropped to the $60-70 range, which is sooooo cheap, I think I will just do that.

Awesome, awesome, awesome info on the stepping of the chip, I never even THOUGHT about it!!
And I plan to do GPU pass-through so I NEED VT-d!!

Also very cool info about the 1366 to 2011 conversion, that doesn't seem to be something discussed much.
Sadly my board will have a narrow-ILM Socket 2011, so.... kinda screwed there.

Oh, and I'm sure you want the features of the ASUS board and may not be inclined to do any soldering or hacking... But if you want a VERY cheap dual-socket board...
This:

You can run that board off 12V, so get an old HP 1000W server power supply for like 25 bucks, solder up some cables, pipe it in, and wire up the start button - it's all documented or easy to figure out. And bam.

It has two x16 PCIe slots and two Dell mezzanine slots, which are x8 PCIe... Sadly the adapters from China that used to exist don't seem to be available anymore... If I were to get the board, someone put up a possible pinout for them, so I'd try to make my own adapters - but again, that takes someone who isn't scared of a soldering iron and the possibility of fucking up a $120 board.
But you can still get mezzanine cards for CHEAP:

Problem is the risers are expensive for some stupid reason :/

Anyway, I'd love to do this board, but still, I have size constraints so it's not an option for me...

There are only TWO boards on the market it seems that fit my size constraints... And other specs.
I've looked at Supermicro, Intel, Asus, Asrock, Gigabyte, Tyan...
Only these two boards will fit:
http://ark.intel.com/products/68406/Intel-Server-Board-S1600JP2#@productimages
http://www.supermicro.com/products/motherboard/xeon/c600/x9srw-f.cfm

There are some mini-ITX boards, yes, but I want two PCIe slots at MINIMUM... which is a huge problem with mini-ITX.

So yea, good luck!


:3? Dell board?

I already have all the stuff, so... :3

Ah, ok fixed.

Yea sorry, accidentally pressed enter before I was done haha!


Have you seen this? Most likely isn't a thing on 2011-0 boards, but it's interesting.

So, random idea then: since NVMe seems to work over PCIe quite easily, in an ITX rig you could throw in one of these cards/PCBs to get four x4 NVMe connections.

And you could use these to let them be external to the chassis.


I didn't want to derail your build log to discuss my build, but there are many reasons I didn't go with a mini-ITX board like that.
A) While for just one 10Gb port I wouldn't really need x8, I also have dual-port 10GbE cards and those REQUIRE x8
B) I have a TON of 4GB low-profile ECC RAM from some old IBM blades - over 400GB worth...
So the problem is I'd have to buy expensive DDR3 to even get 32GB in that board, whereas in the other board I have enough LP RAM to get to 32GB for FREE.
C) I would really prefer to be able to take advantage of the features of a server board, such as ECC RAM.

Why do I need so much RAM? Well, VMs... I am going to do GPU passthrough because I want my main desktop to be Linux, but I play video games and have an HTC Vive, thus I need Windows.
So I am going to have either:
A) A base Linux install with minimal resources to run two VMs, a Linux VM and a Windows VM.
B) A base Linux install for my main desktop and a Windows VM I can spin up for video games or other Windows based workloads.
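Before committing to either setup, it's worth checking how the board's IOMMU groups shake out - passthrough goes smoothest when the GPU lands in its own group. Here's a rough sketch of reading what Linux exposes in sysfs (it assumes VT-d is enabled and intel_iommu=on is already on the kernel command line):

```python
# List IOMMU groups from the sysfs layout Linux exposes when the IOMMU
# is active. Each group maps to the PCI devices that must be passed
# through together.
from pathlib import Path

def iommu_groups(base="/sys/kernel/iommu_groups"):
    """Map IOMMU group number -> sorted list of PCI device addresses."""
    groups = {}
    root = Path(base)
    if not root.is_dir():
        return groups  # IOMMU disabled, or not a Linux box
    for group in sorted(root.iterdir(), key=lambda p: int(p.name)):
        devices = sorted(d.name for d in (group / "devices").iterdir())
        groups[int(group.name)] = devices
    return groups

if __name__ == "__main__":
    for num, devs in iommu_groups().items():
        print(f"group {num}: {', '.join(devs)}")
```

If the GPU and its HDMI audio function show up alone in one group, you're in good shape; if half the board shares that group, expect pain.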


That is an AWESOME and GREAT idea!!
Really, I've even thought about this!
Problem is, I'd really prefer to keep the bandwidth to my graphics card at max because I have an HTC Vive and every little percent of performance matters!
Also, this would ONLY work with a board that supports "bifurcation", which means you can tell the BIOS: HEY, I am splitting this x16 into four x4s.

The final problem is what I stated above: I have some dual-port 10GbE NICs I can and will use, and they NEED x8 to run at full speed; otherwise I'll be limited to one 10GbE port. While I don't REALLY need two 10GbE ports, it doesn't hurt to have an extra x4 available for something else. Also, the boards I am looking at have yet ANOTHER x8 available, so I have future expansion covered.
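If you're wondering where the x8 requirement comes from, the lane math is straightforward. The assumption here is that the NICs are PCIe 2.0 parts (typical of that era of dual-port 10GbE cards), so each lane carries roughly 500 MB/s of usable bandwidth:

```python
# Back-of-the-envelope PCIe slot bandwidth vs. dual 10GbE.
# Usable MB/s per lane per generation (after encoding overhead):
# gen1 = 250, gen2 = 500, gen3 ~ 984.6 (128b/130b).
LANE_MBPS = {1: 250.0, 2: 500.0, 3: 984.6}

def slot_gbps(gen: int, lanes: int) -> float:
    """Usable slot bandwidth in Gb/s for a PCIe generation and lane count."""
    return LANE_MBPS[gen] * lanes * 8 / 1000

dual_10gbe_gbps = 2 * 10  # two saturated 10GbE ports

for lanes in (4, 8):
    bw = slot_gbps(2, lanes)
    ok = "enough" if bw >= dual_10gbe_gbps else "NOT enough"
    print(f"PCIe 2.0 x{lanes}: {bw:.0f} Gb/s -> {ok} for dual 10GbE")
```

So a gen2 x4 slot tops out at 16 Gb/s, short of two saturated ports, while x8 gives 32 Gb/s of headroom. (A gen3 x4 link would technically clear the bar too, but only if both the card and slot run at gen3.)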

I currently have an ASUS Maximus VI Impact. I love this board and would love to keep it, but I can't put a 10GbE NIC in it, and THUS I am screwed with it :(
I will probably end up selling it to try and recoup some costs for the new stuff. But hey, at least now I will have 8 cores, PLENTY for two systems running concurrently :)
