It set me thinking about my spare PC. This used to be my main PC before an upgrade and isn’t really appropriate for its current role. It has an Intel Core i7-3770K 4-core (8-thread) CPU and 32GB of DDR3-1866 memory. Since I use it mostly for builds, far more cores would suit it much better, so I decided to build another server system, but this time configured as a workstation.
I knew that the CPU, RAM, and motherboard of the existing system would still have significant resale value on eBay, so I anticipated this build coming out roughly cash neutral.
Xeons suitable for dual-CPU installation come in three flavours – “E” for general use, “L” for low power use, and “X” for performance – so it was pretty much a given that I would use X56nn CPUs. They would also have to be hexacore, which ruled out the dual-core 4.4GHz X5698.
The X8DAL/DTL series of motherboards, such as the X8DTL-3F that I used in the NAS server, have a maximum QPI speed of 6.4 GT/s, which means they can take any 5500-series or 5600-series Xeon CPU. So I knew that I didn’t have any concerns about some CPUs not working (an issue I had with the i7-3770K, but that’s a story that would take up an entire blog post on its own).
I went looking on eBay for an X8DAL initially, since I knew I would be using a discrete GPU, so on-board graphics weren’t really important. There were none to be found, so I widened the search to include the X8DTL as well. The seller from whom I bought the 1U server came up as still having some left, and I figured I could do a lot worse than simply buying another, so I put in an offer at the same price as last time, which was accepted.
But which CPU? Where was the sweet spot on price? Looking at eBay “completed listings”, it was clear that the X5680 commanded a huge premium over lesser CPUs. The 3.33GHz X5680 was coming out at around twice the price of the 3.06GHz X5675, so it looked like the latter made more financial sense. But in the end I decided that I really wanted the X5680, never mind the cost, so I made an offer of £120+VAT on a matched pair, which was accepted. It was only afterwards that I found out that there is also a 3.46GHz X5690, but I didn’t feel it was worth the hassle of trying to change things (either by returning the pair of X5680s or by reselling them). Perhaps down the line I might replace them, but these will be fine for now. Besides, the X5690 currently sells for around double the price of the X5680, and I’m not sure the price premium is justified.
For RAM, I reactivated the eBay search I had made last time, and found a non-VAT seller offering the Hynix modules at £25 each on a ‘Buy it Now, Offers Accepted’ basis with free postage. I cheekily put in an offer of £80 for four, which was accepted.
Finally, cooling. Given my positive experience with the Cooler Master Hyper 212 EVO, I ordered a pair of them, both B-grade from Amazon Warehouse.
Whilst waiting for it all to arrive, I stripped down the old system, and set about photographing it for an eBay sale.
I initially planned to sell it as a bundle, but I realised that the Z68-based motherboard did not fully realise the overclocking potential of the CPU, making the bundle less desirable, and that I would probably get more by selling the parts separately. I also wondered whether the Arctic Freezer Xtreme Rev2 cooler might be made to work with the NAS server, which would free up the Cooler Master Hyper 212 EVO currently in it, so I amended my order of two 212 EVO coolers to just one. I then put the Intel Core i7-3770K CPU, Gigabyte GA-Z68X-UD4-B3 motherboard, and Corsair 32GB (4x8GB) DDR3-1866 RAM kit up for sale on eBay as separate listings, taking advantage of a “£1 final value fee” offer that was running.
After a few days everything arrived and I was ready to start the build.
I powered up the server “as is” to verify it worked, which of course it did, and then stripped out the motherboard and took it to the bench to remove the E5645 and insert the two X5680s, and swap the memory. I then temporarily fitted the passive heatsinks (one that came with the last server and one that came with this one) to make for easier testing, rather than going straight to fitting the 212 EVOs.
On powering up, the board wouldn’t boot – it wouldn’t even POST. There was nothing apart from blinking motherboard LEDs to show power. So I methodically worked through a problem-solving process. After the obvious checks, like making sure everything was seated, I removed all the memory to see if I got a POST and a ‘no memory’ error. I did, so the problem was probably memory-related. Next, I tried different memory – two of the 4GB RDIMMs that came with the server – and that booted straight through POST and into the BIOS. Hmmmm. Next was to try “known good” 16GB memory to eliminate the possibility of the Hynix modules being duff, which meant borrowing the two Samsung units from the NAS server. That didn’t work either. A suspicion was forming in my mind, so I examined the motherboard carefully for its revision number and made an unfortunate discovery – it was a Rev 1.01 board rather than a Rev 2.01 like the previous one. This was a real problem as, according to the manual, a Rev 2.01 board is required for quad-rank RDIMMs (i.e. 16GB ones) when used with 5600-series CPUs.
I contacted the seller explaining my situation, and asked if they could help in any way; perhaps with a motherboard swap or something. They came back to me, and although very sympathetic and polite, pointed out that the server was exactly as described (inasmuch as they had never specified the motherboard revision), and that since I had disassembled it they couldn’t possibly accept it for a return. That was entirely fair, so I looked at other options. Obviously I needed a Rev 2.01 board and I had one in the NAS server, so I had to do a motherboard swap. But what was the best system I could end up with for a Rev 1.01 board?
The manual suggested that there were no such restrictions on 5500-series CPUs so one option was a pair of E5540 CPUs, which could be picked up for around £6 each. I happened to mention this to the seller during the email exchange, and they said that they had a pair of E5530 CPUs which they would send me free of charge to help me out, which was very nice of them and I accepted.
In the meantime, I swapped the two X5680 CPUs out of the Rev 1.01 board. The E5530s hadn’t arrived yet, the NAS box was currently configured with a single E5645, and another E5645 had come with the Rev 1.01 board, so I thought I would do a test install of a single E5645 and 32GB of RAM just to see what would happen. I was pleasantly surprised to find that it worked fine, as did 48GB. So that looked like an ideal way forward: given that the E5645 and E5530 both run at 2.4GHz, a single 6-core (12-thread) CPU with a TDP of 80W seemed preferable to two E5530s with a combined 8 cores (16 threads) and a combined TDP of 160W, especially allowing for the extra cost of an additional cooler.
I stripped down the NAS server and performed the motherboard swap, which went completely without a hitch.
The Arctic Freezer Xtreme Rev2 also fitted without a problem, and was in fact easier to fit than the 212 EVO. You can read more about it in this post.
With the NAS box all back up and working again, and the Rev 2.01 motherboard now available, it was time to build the workstation.
Minimal reconfiguration of the case standoffs was needed, and the motherboard fitted without problems. The PSU connected up easily too, which was hardly surprising as, just like the motherboard, it was a known quantity, being identical to the NAS server’s.
Then the two Cooler Master Hyper 212 EVO coolers went on, without a hitch.
The case I was using, a Cooler Master Enforcer, uses a huge 200mm 3-pin front fan, and currently has two Corsair 3-pin exhaust fans. I considered the expense of replacing these with PWM fans but instead decided that an external fan controller would be preferable, so ordered a NZXT Sentry 3 Touch Screen Fan Controller.
I also ordered another USB3 PCIe card because, as with the NAS server, the X8DTL-3F only has USB 2.0 on the motherboard. Staying with the concept of treading the well-trodden path, I ordered an identical one to the one in the NAS server.
I ran the system up to test it, with the case fans temporarily plugged into the motherboard, to verify operation, and all seemed well.
Next, the nVidia GTX 1050 went back in. This is a fairly modest graphics card that doesn’t require PCIe power connectors from the PSU, and since the PSU is modular I could simply omit these cables.
However, as I was fitting this card, I did notice that the heatsink on the Northbridge seemed extremely hot, which was a concern.
The next day the fan controller and USB card arrived, and they went in without fuss. I did consider putting an old Sound Blaster Live PCI card in there too, since I had one and the motherboard has no sound but does have PCI slots, but then I remembered that the nVidia graphics card supplies sound over HDMI / DisplayPort, so it was not needed.
The fan controller had a bit of a rat’s nest of cables which made cable management quite difficult, especially in the confined area near the power connectors and CPU2. I did consider temporarily removing the cooler from CPU2 in order to give the access needed to manage the cables better, but I decided to just do the best I could and live with it.
I ran the system up and everything ran fine, booting into Ubuntu without problems. After that, I left the system to idle to get an idea of temperatures, especially on the Northbridge, and sure enough after a while the system shut down with an over-temperature error.
I repositioned the thermocouple for the fan controller to between the vanes of the Northbridge heatsink and then restarted, and very quickly this climbed up to 75°C, with the controller ramping the fans up to maximum to try to cool it but without success. So I looked around to see what fans I had available, since it looked like the Northbridge would need to be actively cooled. There were 40mm high flow fans from the 1U server available but they are incredibly noisy, so I tried a Cooler Master 70mm CPU fan instead.
Amazingly, this fitted exactly in the gap between the SATA connectors and the graphics card. The edge of the fan casing just fouled the blades of the graphics card’s rear fan, but some foam pads cured that, with the added advantage of fixing everything in place. Upon rebooting, the temperature held rock steady at 32°C, which seemed like a far better state of affairs.
I then took the machine from the bench and put it under the desk where it would live, and connected it all up for use.
One thing I noticed during normal usage under Ubuntu 17.04, on both a mature install and a fresh install, was an occasional boot error message: “ERST: Failed to get Error Log Address Range”. This is mentioned in SuperMicro FAQ 15594, which says the cause is WHEA requesting ACPI 4.0 table support, which the X8 DP series does not provide (it supports only ACPI 3.0). Disabling WHEA in the BIOS solved this. Another thing to note is FAQ 10371, which suggests setting “BIOS Native Module”. I was not aware of this when I made my fresh install, so it was left on the “Intel AHCI ROM” setting; I will need to investigate at a later date.
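For anyone wanting to check whether their own system is hitting the same thing, the message lands in the kernel ring buffer at boot, so a grep of `dmesg` is enough. A small self-contained sketch (the timestamp in the sample line is illustrative, not copied from my machine):

```shell
# The ERST complaint appears in the kernel log at boot time.
# On a live system you would run:  dmesg | grep 'ERST:'
# Here we grep a captured sample line just to show what to look for.
sample='[    1.863966] ERST: Failed to get Error Log Address Range.'
matches=$(printf '%s\n' "$sample" | grep -c 'ERST: Failed to get Error Log Address Range')
echo "matches: $matches"
```

If the count is non-zero on a given boot, that boot hit the error; after disabling WHEA in the BIOS the same grep should come back empty.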
I did a test build of one of my bigger FOSS projects, and noticed a nice boost in compilation speed with all CPUs being utilised. Very impressive.
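The speed-up comes straight from running make with a job count matching the thread count – on this box `make -j"$(nproc)"` schedules up to 24 jobs across the two hexacore, hyper-threaded CPUs. A minimal sketch of the effect (assuming GNU make is installed), using a throwaway Makefile whose four independent targets just sleep for a second each:

```shell
# Demonstrate make's -j flag: four independent targets, each sleeping 1s.
# Built serially they take ~4s; with -j4 they run concurrently in ~1s.
workdir=$(mktemp -d)
printf 'all: a b c d\na b c d:\n\tsleep 1\n' > "$workdir/Makefile"

start=$(date +%s)
make -C "$workdir" -j1 >/dev/null          # serial build
serial=$(( $(date +%s) - start ))

start=$(date +%s)
make -C "$workdir" -j4 >/dev/null          # parallel build
parallel=$(( $(date +%s) - start ))

echo "serial: ${serial}s  parallel: ${parallel}s"
rm -rf "$workdir"
```

On a real project the win is of course bounded by how independent the compilation units are and by link steps, but a C/C++ codebase with many translation units scales close to linearly across cores.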
By now I had sold the i7-3770K CPU, the 32GB RAM kit, and the Z68 motherboard, which had already put me close to the goal of being cash neutral, and I still have parts from the 1U server to sell.
Here are the costings so far, and I will update them when other stuff sells.
| Item | Gross | VAT invoice? | Net |
|---|---|---|---|
| SuperMicro 1U server | -£175.00 | Y | -£145.83 |
| Hynix HMT42GR7BMR4C-G7 16GB (1x16GB) PC3-8500R DDR3-1066 4Rx4, qty: 4 | -£80.00 | N | -£80.00 |
| 2 x Xeon X5680 | -£144.00 | Y | -£120.00 |
| Cooler Master Hyper 212 EVO CPU Air Cooler | -£23.61 | Y | -£19.68 |
| NZXT Sentry 3 Touch Screen fan controller | -£34.99 | Y | -£29.16 |
| CSL USB 3.0 PCI Express (PCIe) controller | -£12.85 | Y | -£10.71 |
| Sold: Intel Core i7-3770K | £167.25 | – | £159.53 |
| Sold: Gigabyte Z68 motherboard | £72.99 | – | £63.48 |
| Sold: Corsair Vengeance 32GB kit | £167.25 | – | £152.67 |
| Sold: Supermicro parts (so far) | £97.19 | – | £73.76 |
So, that was the build of my twin Xeon workstation. It was a whole lot easier than the NAS server build, not least because I was on familiar ground, but also because I was using a case suited for it rather than hacking about an unsuitable one.