Corsair Obsidian 750D Monster Battlefield 4 System Build Log

 

For the impending release of Battlefield 4, we’ve already shown you how you can build a gaming-ready AMD system on a budget. If you want to go whole hog, though, if you want the whole enchilada, we’ve got you covered there too. There’s been some resistance around here to putting together a system this ridiculous, but a visit to our forums proves many of you are just as crazy as I am when it comes to high performance builds.

This also seemed like a great time to show off our new Obsidian 750D chassis. As many reviews have pointed out, the 750D really is the Obsidian 900D for the rest of us, a flexible case in a much more manageable form factor.

I’m keen to point out this is my first build for Corsair, so please be gentle. Jeff has been doing these for a while and his presentation is exemplary, taking full advantage of the cable management features of our cases to produce systems that are frankly quite gorgeous to behold. As for me, where cable management is concerned I’m pretty much all thumbs. Our cases do the lion’s share of the work, but if you look behind the motherboard tray of this build you’ll probably scream bloody murder. And naturally, since the pressure’s on to make this build one for the books, I ran into some…troubles.


So first, our component selection:

Intel Core i7-4930K
ASUS P9X79 PRO
64GB of 2400MHz DDR3
2x Gigabyte Radeon R9 280X WindForce 3GB
Corsair Neutron GTX 240GB SSD
Corsair AX1200i power supply
Corsair Hydro Series H110 CPU cooler
Corsair Obsidian 750D
2x Corsair AF120 Quiet Edition fans
Corsair Link Commander, two Cooling Nodes, and an RGB Lighting Node

This may seem like absolute overkill, and that’s probably because it is. There are good reasons, though (not the least of which is that it’s going to wind up being my video editing workstation). But if we’re talking strictly about Battlefield 4, the only truly excessive choice is going with 64GB of high speed DDR3 instead of 32GB.

The open beta of Battlefield 4 proved that the game will use just about any system resources you can throw at it, and it’s extremely well threaded. That makes the overclockable Ivy Bridge-E Core i7-4930K an excellent choice to build around. Supporting that with a high quality X79 board like the ASUS P9X79 PRO is the right call, though we would’ve gone with their X79 DELUXE if it hadn’t been sold out at the time of this build. And while it remains to be seen if a quad-channel memory bus can be saturated the same way Haswell’s dual-channel bus was, it’s not like having 2400MHz memory is going to hurt performance, right? Of course, none of this does us much good without high performance storage, and the 240GB Neutron GTX is more than up to the task.
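If you want a rough sense of just how much headroom that quad-channel bus gives us, here’s a quick back-of-the-envelope calculation. The result is a theoretical peak, of course; real-world throughput will land a good bit lower.

```python
# Back-of-the-envelope peak bandwidth for quad-channel DDR3 at 2400MT/s.
# Each DDR3 channel is 64 bits (8 bytes) wide; actual throughput will be
# well below this theoretical ceiling.

transfers_per_second = 2400 * 10**6  # "2400MHz" memory = 2400 MT/s
bytes_per_transfer = 8               # 64-bit channel width
channels = 4                         # X79's quad-channel controller

peak_gb_per_s = transfers_per_second * bytes_per_transfer * channels / 10**9
print(f"Theoretical peak bandwidth: {peak_gb_per_s:.1f} GB/s")  # ~76.8 GB/s
```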

On the graphics side, the pair of Radeon R9 280X cards arguably represents an incredible amount of performance for the money. I’m not keen on running cards with open air coolers in multi-GPU configurations, but Gigabyte’s WindForce coolers should be up to the task, and the P9X79 PRO offers a healthy amount of space between the PCI Express slots. The additional upshot of these cards is support for AMD’s Mantle, which DICE will be patching into Battlefield 4 before the year is out, and Gigabyte ships them from the factory with an overclock that puts them on par with or even faster than a stock Radeon HD 7970 GHz Edition. Last but not least, they come with 3GB of GDDR5 apiece, which will let them keep up with Battlefield 4 even at its most demanding settings.

Finally, we’re running all of our power and cooling needs through Corsair Link. The AX1200i power supply offers a tremendous amount of headroom for our build and shuts down its cooling fan when under 50% load. When we connect it to Corsair Link we can actually monitor its efficiency and power consumption across all of the rails. Since we’re only using a 3.5” Corsair Link Commander box, we can remove the drive cages from the bottom of the 750D and add two AF120 Quiet Edition fans. The Hydro Series H110 we’re using to cool the CPU is our highest performing cooler, and while it doesn’t feature direct Corsair Link support, we can still connect its PWM fans to a Corsair Link Cooling Node and manage them that way. Ultimately, when you’re dealing with this many fans (one 140mm exhaust, two 140mm radiator fans, two 140mm intakes, and two 120mm intakes), Corsair Link’s ability to handle fan control entirely in software can be a real lifesaver. Finally, the RGB Lighting Node is just plain neat.


To prep the Obsidian 750D, I removed the two drive cages and their mounts from the bottom of the case, then moved one cage to the top, beneath the 5.25” drive bays. There are rails to mount the cage to, and a single screw secures it in place. With the bottom of the case cleared out, I installed the AF120 fans as intakes. There’s actually a very healthy amount of space at the bottom of the 750D; when I installed the very deep AX1200i power supply I still found plenty of clearance between it and the innermost fan.


The mounting post in the center of the motherboard tray made installing the ASUS P9X79 PRO a breeze. It still surprises me just how massive the LGA2011 socket is, though; it’s fantastic for mounting aftermarket cooling, but the two-step installation process for these huge chips is almost arcane. Memory went in much later, but without incident.

Our Corsair Link Commander Unit and Neutron GTX SSD were next to install, and both went in easily. The 3.5” drive caddy snapped easily and securely into place around the Commander, while the Neutron GTX slid snugly into the 2.5” toolless caddy, which in turn snapped into place behind the motherboard tray.

Installing our Hydro Series H110 cooler wasn’t a huge hassle, either. The 750D includes rubber grommets in the fan mounts on top of the case, and I elected to mount the H110 as an exhaust instead of its usual role as an intake. You sacrifice a couple of degrees on CPU temperatures, but overall system airflow is vastly improved; we’re essentially looking at two 140mm and two 120mm fans in the bottom front serving as intakes, pushing air up towards the graphics cards and the CPU, and exhausting that air through the H110 and rear 140mm fan. While our CPU is rated for 130W, the graphics cards are rated for 250W apiece and they feature open air coolers, so we really want to move air through the 750D as rapidly and efficiently as possible.
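To put some loose numbers on that, here’s a quick estimate of the heat this build can dump into the case and how it stacks up against the AX1200i. The CPU and GPU figures are the rated TDPs mentioned above; the allowance for everything else is just an assumption for the sake of the math, not a measurement.

```python
# Rough load estimate using the rated TDPs above. The "everything else"
# figure (board, 64GB of DDR3, SSD, fans, Corsair Link gear) is an assumed
# ballpark, not a measured number.

cpu_w = 130            # Core i7-4930K rated TDP
gpu_w = 2 * 250        # two Radeon R9 280X cards at 250W apiece
other_w = 100          # assumed allowance for the rest of the system

estimated_load_w = cpu_w + gpu_w + other_w
psu_w = 1200
fanless_threshold_w = 0.5 * psu_w  # AX1200i keeps its fan off under ~50% load

print(f"Estimated load: {estimated_load_w}W of {psu_w}W "
      f"({estimated_load_w / psu_w:.0%} of capacity)")
print(f"AX1200i fan should stay off below roughly {fanless_threshold_w:.0f}W")
```

Even by that rough math, a full gaming load will likely creep past the AX1200i’s 50% mark, so its fan will probably spin up while playing, but at idle and on the desktop it should stay silent.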


Thankfully, the LGA2011 socket includes four posts specifically for mounting CPU coolers instead of forcing us to install a separate backplate, and the H110’s hoses are fairly flexible. My installation is off center, but I’m opting to put as little stress on the hoses as possible. Getting the H110 mounted to the top of the case and the waterblock to the CPU was actually very easy, as the 750D is extremely spacious. I did install the AUX 12V CPU lead for the power supply before installing the H110, though, just to make life easier on myself.

Next I installed the AX1200i and the graphics cards. The AX1200i is 200mm deep but, as I mentioned before, still has plenty of clearance between it and the AF120 fans I added to the bottom of the case. There’s some question as to the value of having a fully modular power supply, since you’re always going to need the beefy 24-pin motherboard line and the 8-pin AUX 12V CPU line, but being able to route those cables at both ends makes cable management and overall installation much, much easier. Prior to installing the Radeons, I noticed that the top one would overhang the SATA 6Gbps leads that come off of the chipset, so I went ahead and connected that SATA cable from the motherboard to the Neutron GTX beforehand.


Where things got a little goofy was with the CrossFire bridge. Both Gigabyte cards come with an AMD CrossFire bridge, but it’s actually too short. Modern high end motherboards designed for multi-GPU configurations will often space the two primary PCI Express slots three slots apart to allow for as much airflow as possible, but the bridge Gigabyte includes is only long enough to span two slots. Thankfully aftermarket bridges are easy to come by, but I can see someone trying to assemble their brand new system at 11 at night getting upset when they suddenly can’t get CrossFire working because the bridge is too short.

Finally, there’s installing the Corsair Link nodes and getting everything wired up. Two Cooling Nodes and one Lighting Node were required for this build; the Cooling Nodes only support five fans apiece, and we have seven fans to manage here. For the lighting, I decided to subdue it a little bit and run a single strip of LEDs along the bottom of the case. I affixed each of the nodes behind the motherboard tray, above the power supply. Zip ties were used liberally, but my ability to somehow produce hilariously mangled cabling nonetheless prevailed. It’s a gift.


Unfortunately, getting everything powered and working wound up being much more fraught. The culprit is Ivy Bridge-E itself; because Ivy-E uses the same X79 chipset Sandy-E did, those X79 boards typically just need a BIOS update and they’re good to go. That’s all well and good, but if you don’t have a Sandy-E CPU handy, you may find yourself unable to actually get your new system to POST. The ASUS P9X79 PRO had this problem, but ASUS at least built in a mechanism to deal with it. You can put an updated BIOS file on a FAT16 or FAT32-formatted flash drive, connect it to a specific USB port in the I/O cluster, and so long as there’s a power supply connected to the motherboard, you can just push a button and update the BIOS that way. Thankfully this feature worked, but the NewEgg reviews are littered with people who had some trouble making it happen, so buyer beware.
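If you do end up needing it, the flash drive prep itself is trivial. Below is a minimal sketch of the copy step in Python, assuming the drive is already FAT32-formatted and mounted; the file names and mount point are placeholders, and the board expects its BIOS file under a specific name, so check the P9X79 PRO manual (or ASUS’s BIOS renaming utility) for the exact one rather than trusting my guess.

```python
# Minimal sketch of prepping a flash drive for the board's USB BIOS update
# feature. Assumes the drive is already FAT32-formatted and mounted; the
# paths and the target filename below are placeholders -- consult the
# P9X79 PRO manual for the exact name the board expects.

import shutil
from pathlib import Path

downloaded_bios = Path("P9X79-PRO-BIOS.CAP")  # hypothetical downloaded BIOS file
flash_drive = Path("/media/usb")              # hypothetical mount point
target_name = "P9X79PRO.CAP"                  # assumed board-specific filename

shutil.copyfile(downloaded_bios, flash_drive / target_name)
print(f"Copied BIOS to {flash_drive / target_name}")
```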


Of course, once the system was up and running it was positively beastly. While I plan to tweak the heck out of it in the future, I did run some early benchmarks. At stock settings (and with the RAM at its stock SPD speeds rather than XMP), this bad boy scored 12413 points in 3DMark Fire Strike and 6732 points in 3DMark Fire Strike Extreme. BioShock Infinite was playable with everything cranked up, scoring a robust 113.1fps (Frame Pacing enabled, naturally) at 1080p, and Tomb Raider had all of its settings dialed up as well (4xSSAA, TressFX, the works) and still spit out 69.7fps at 1080p. And while I’ve been skeptical of open air coolers in multi-GPU configurations, the WindForce coolers on the Radeons allow them to run surprisingly cool (if a little noisy).

In an upcoming second part to this blog, I’ll talk about how far I was able to push this system and report on just how much faster it can run, so stay tuned!
