Thursday 11 November 2010

Multi-GPU - a different approach

Time and time again, the recommendation from people in the know is to avoid the multi-GPU approaches from ATi and nVidia (Crossfire and SLI respectively), because there are issues. Fairly substantial ones, at that.

For a game to be multi-GPU compatible, it needs the appropriate code to allow the cards to play nicely, the drivers need to be firing on all cylinders, graphics card profiles need to be assigned, the graphics cards need to be as identical as you can get them, there's a doubled risk of component failure, your motherboard needs to be compatible, your PSU needs to be bigger, your case needs to handle the heat of 2 graphics cards on top of CPUs, hard drives... the list is eventually exhaustible, but still long.

I love it. And here's why...

As it stands, the multi-GPU game is a lot simpler than it used to be. Back in the day, when ATi introduced Crossfire, you needed to buy very specific video cards. You couldn't just go out and buy yourself a nice shiny new X1800XT to put in your rig alongside your current X1800XT, no sir! You had to buy yourself an X1800XT Crossfire Ready Master card (which, incidentally, put a hell of a strain on your other Mastercard), then take out your first card, put in the Master card, then put your Slave card (massively devaluing your £400 purchase, thinking of it as a "slave") in the second slot, connect a fiddly cable across the back of both cards to allow the Master card to join up the output of both, THEN hook up all power connectors, displays, case sides, and turn the power on. Then find out you'd accidentally put the unmarked cable on the wrong way, and have to pull the machine back out from under the desk, re-jig, and finally get it all working. Whew...

Nowadays, there aren't any Master or Slave cards - just pop in card 2 (assuming your motherboard can use your multi-GPU setup of choice - more on that later), plug in the power cables, attach a ribbon cable between the 2, and go. MUCH simpler.

As a side-note, sometimes those little ribbon cables don't come with graphics cards, so you've got to head along to your retailer/etailer of choice and get one. A couple of quid, but still, a big inconvenience.

Motherboard selection can make a HUGE difference in terms of your multi-GPU experience. As recently as 2008, motherboards were capable of Crossfire OR SLI, but not both. If you wanted to go Green Team, you had to use an nForce motherboard, which was undesirable for a lot of enthusiasts due to reduced CPU performance in comparison to Intel/AMD chipsets. SLI is still a rarity on AMD-based platforms, even now, due to AMD owning ATi. If you were running an AMD chipset with a nice big shiny Geforce 8800GTX, you were screwed if you wanted to get another - it would involve ditching your current motherboard, and possibly processor, making the transition to Intel, followed by reinstalling Windows because it didn't like being in a "new" system. Thanks to the introduction of the P55/X58 Intel chipsets, which can run both setups, there's no need to panic over motherboard restrictions (as long as you buy Intel and the right board, anyway). With the unprecedented ability to switch between SLI and Crossfire at will comes another big step - the ability to switch between manufacturers mid-generation without worrying that future upgrades will be hampered by compatibility. Holy scalable graphics, Batman!

Performance is still an issue with multi-GPU setups. If a game doesn't natively support multiple GPUs, you'll find your performance stagnates (or worse - plummets), for double the cost and power draw. A benchmarker's favourite, X3: Terran Conflict, was one of the more notable culprits here. Starcraft II was a major pain for this too, with no implementation at launch, leading people to ask what Blizzard had actually been doing for the last 12 years (admittedly, in the grand scheme of things, this was a minor complaint compared to some of the other issues levelled at the game). This happens whether it's 2 cards reverting back to one, or one of the X2-style cards, which put a multi-GPU setup on a single physical card.

Now, I said back at the start that I loved multi-GPU setups. And I do. Here's the hypothetical to demonstrate:

Average gamer rig from 2008 - overclocked Core 2 Duo/Quad, P45 motherboard, GTX260-216, 4GB DDR2, 500GB hard drive, maybe some extras. Still not too shabby, is it? However, say there's the option for a £650 upgrade (Christmas, birthdays, having some cash on hand - whatever reason), designed for playing games and doing a bit of extra work on the side (for me, it'd be music recording). Buying a new graphics card that will deliver a solid performance increase means committing £200+ of the budget, limiting the rest of the build to just over £400 - you're going dual-core, whether you like it or not. If you can pick up a second-hand GTX260 for £60-70 instead, all of a sudden you're looking at nearly £600 to spend on CPU, motherboard, RAM, PSU etc - that's Bloomfield territory. Nice shiny Core i7-950 good enough for you?
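If you want the back-of-the-envelope maths spelled out, here's a quick Python sketch (the prices are the rough ballpark figures above, not real quotes):

```python
# Rough 2010 upgrade-budget comparison, using the ballpark figures from this post.
BUDGET = 650  # total upgrade fund, in pounds

# Route 1: buy a brand-new graphics card outright
new_gpu = 200                      # entry point for a worthwhile new card
platform_new = BUDGET - new_gpu    # what's left for CPU/board/RAM/PSU

# Route 2: add a second-hand GTX260 and run SLI
used_gpu = 65                      # midpoint of the £60-70 quoted above
platform_sli = BUDGET - used_gpu

print(f"New card route: £{platform_new} left for the platform")  # £450
print(f"Used SLI route: £{platform_sli} left for the platform")  # £585
```

That £135 swing is the whole argument: it's the difference between a budget dual-core platform and a Bloomfield one.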

Yes, you'll have to ditch a pair of GTX260s once you decide to get some beefier graphics - I'm under no illusion that this would be the perfect upgrade path - but with the sheer number of regular launches in today's graphics market, prices are falling fast. The extra time that a second old graphics card buys you may mean the level of performance you're after plummets in price. Selling the old cards online (probably for not much less than you paid for the second one, given the non-linear depreciation of electronics) will net you some of your money back, so overall you could be better off financially, as well as having the best PC you can get your hands on. That's the enthusiast way, right?

So, anyone who's reading this in 2012, and has a Sapphire 5870 Vapor-X 2GB they want rid of, give me a shout. I should be looking out around then.
