Review: Nvidia’s dual GPU GTX 690 AIB
Jon Peddie on May 10th, 2012
Fastest, bestest, not biggest
With the GTX 690, Nvidia was focused on creating not only the most powerful dual-GPU card in the world but also a card that was power efficient with great acoustics.
Building the ultimate enthusiast PC has always involved a sacrifice in figuring out the best way to get maximum performance. Crossfire and SLi were the solutions up until now, but a two-card setup is expensive, takes up quite a bit of real estate in the tower, and eats up power. Dual-GPU cards of the past could not compete on the performance front: their core clocks had to be turned down by as much as 20% to 22% to stay within heat and wattage restrictions, resulting in cards that simply underperformed their SLi counterparts.
Nvidia’s new GTX 690, introduced last week, is the first dual-GPU AIB that challenges the notion that one card can’t compete with two. In our tests, the new Nvidia board not only crushes the former heavyweight champion, the limited-edition ASUS Mars II goliath, but shows performance on par with GTX 680s in SLi. We are seeing scores comparable to two GTX 680s because the GTX 690’s base and boost clocks are only about 5% lower than a standalone GTX 680’s. Compare that to the previous-generation dual-GPU card, the GTX 590, which was outfitted with GPUs whose base clocks were more than 20% slower than the GTX 580’s (to manage heat).
So what is allowing the GTX 690 to run at such high core clocks? The answer is twofold. First, the cooling system on the GTX 690 is exceptional; Nvidia decided to “spare no expense” when it came to construction. The shell of the GTX 690 is cast aluminum with trivalent chromium plating, which not only acts as a heat sink but gives the card a striking appearance—space-age coolness. Molded magnesium makes up the side walls of the 690, which cover two nickel-plated fin stacks as well as ducted airflow channels and vapor chambers. In the center of the card sits a single axial fan. The fan’s speed can be adjusted using EVGA’s Precision software, but we chose to leave it at auto during our tests.
In our mountain lab, we saw temperatures hitting 62 Celsius on the GTX 690 at load compared to the 78 Celsius that we observed with the GTX 590, even though the 690 was running at higher clocks and delivering roughly 25% better performance on average. Power consumption is also lower: the GTX 690 draws about 45 watts less than the GTX 590.
Along with the revamped heat sink technology, the other reason the 690 can run at such high speeds is GPU Boost. The GTX 690 core clock is 910 MHz, but with GPU Boost the clocks can be cranked to over 1 GHz, depending on load and voltage. GPU Boost also regulates the power going into each GPU and optimizes each GPU independently in real time. One GPU could be running at 1 GHz while the other is at the base 910 MHz, or even at the 300-MHz idle clock when the application does not call for SLi. This allows the card to run most efficiently and keeps it cool, relatively speaking.
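The per-GPU behavior described above can be sketched in pseudocode-style Python. This is purely illustrative, not Nvidia’s actual firmware logic: the per-GPU power budget and the 13-MHz step size are assumptions for the sake of the example; only the 910/300 MHz clocks come from the card’s published specs.

```python
# Illustrative sketch (NOT Nvidia's actual GPU Boost algorithm): each GPU's
# clock is stepped toward a boost ceiling while its power draw stays under a
# per-GPU budget, and parked at idle when it has no work.
BASE_MHZ, BOOST_MHZ, IDLE_MHZ = 910, 1019, 300
POWER_BUDGET_W = 150   # assumed per-GPU budget, not an official figure
STEP_MHZ = 13          # assumed clock step for illustration

def next_clock(current_mhz: int, power_w: float, has_work: bool) -> int:
    """Return the next core clock for one GPU, managed independently."""
    if not has_work:
        return IDLE_MHZ                              # no SLi load: drop to idle
    if power_w < POWER_BUDGET_W:
        return min(current_mhz + STEP_MHZ, BOOST_MHZ)  # headroom: step up
    return max(current_mhz - STEP_MHZ, BASE_MHZ)       # over budget: back off

# Because the two GPUs are tuned independently, one can boost while the
# other sits at idle:
gpu0 = next_clock(910, 120.0, True)    # busy GPU with power headroom: boosts
gpu1 = next_clock(910, 10.0, False)    # unloaded GPU: drops to 300 MHz
```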
Nvidia also focused on the acoustics of the GTX 690. It is the first dual-GPU Nvidia card that is quieter than the comparable SLi configuration: the GTX 680 SLi setup runs at an estimated 51 dB, with the 690 running 5 dB quieter on average, which puts the GTX 690 just 5 dB louder than a single GTX 680.
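A 5-dB difference is larger than it sounds. Because the decibel scale is logarithmic, the gap between the SLi setup and the GTX 690 works out to roughly a 3x difference in radiated sound power. This is standard acoustics arithmetic, not part of our test methodology:

```python
# Convert a difference in sound level (dB) to a sound-power ratio.
# Standard relation: ratio = 10 ** (dB_difference / 10).
def power_ratio(db_difference: float) -> float:
    return 10 ** (db_difference / 10)

sli_db, gtx690_db = 51, 46          # GTX 680 SLi vs. GTX 690, per our estimates
ratio = power_ratio(sli_db - gtx690_db)   # ~3.16x more sound power from SLi
```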
The new line of Kepler GPUs also allows for surround gaming, including surround gaming in stereovision. With Fermi, two AIBs were needed for surround 3D. The GTX 690 comes with three dual-link DVI connectors and one Mini DisplayPort connector.
With the introduction of the GTX 690, Nvidia again stakes its claim to having the fastest AIB on the market. And for the first time you can get premium performance from a single card. The GTX 690 can be run in Quad SLi with another GTX 690, but Nvidia doesn’t support a GTX 690/GTX 680 configuration. You can add another AIB for PhysX, however. The GTX 690 is a hard launch, with boards available today. The price tag runs close to $1,000, but as we said before, Nvidia spared no expense and expects solid sales of this top-of-the-line graphics card. We’re waiting to hear back from Asus whether there will be a Mars III answer.
We ran the GTX 690 using three games and two benchmarks.
NOTE: We’ve (slightly) renamed the Pmark and now call it the P3mark. The P3mark is an overall score that measures a board’s performance, price, and power consumption. We added the “3” to eliminate confusion that the “P” stood for Peddie, or that it might be confused with PassMark, another benchmark. Also, we are pleased to report that Futuremark is using the P3mark on their website (http://www.futuremark.com/) for scores sent to them.
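To make the idea behind a performance/price/power composite concrete, here is a hypothetical score in the spirit of the P3mark. The actual P3mark formula is not published in this review, so the expression below (frames per second divided by price times power, scaled for readability) is purely an assumption for illustration, as are the sample numbers:

```python
# Hypothetical composite in the spirit of the P3mark (the real formula is
# not given here): reward frame rate, penalize price and power draw.
def p3mark(avg_fps: float, price_usd: float, power_watts: float) -> float:
    """Illustrative performance-per-dollar-per-watt score (scaled x1e6)."""
    return avg_fps / (price_usd * power_watts) * 1_000_000

# Sample (made-up) numbers showing how a cheaper, lower-power board can
# out-score a faster one on a composite metric:
gtx680 = p3mark(avg_fps=90, price_usd=500, power_watts=195)
gtx690 = p3mark(avg_fps=140, price_usd=1000, power_watts=300)
```

This illustrates why, in the charts below, the single GTX 680 can take the top P3mark score even though the GTX 690 wins on raw frame rate.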
We compared the Nvidia GTX 690 to four other boards and SLi configurations; the results are shown in the two charts to the left.
One of the promises of additional graphics horsepower is the ability to run a game or other application at ultra-high resolution with maximum graphics features such as lighting, shadows, smoke, and 8x anti-aliasing. However, in some games, for example, Crysis 2, you can’t select all of those options.
The first chart shows the GTX 680 getting the best overall P3mark scores, which is mostly due to the price difference. Likewise, the older GTX 580 does better on the P3mark score than the GTX 690.
On the P3mark, the GTX 690 beats the 680 SLi configuration and is close to the GTX 590.
In the case of raw performance, the GTX 690 is almost tied with the SLi 680s. You can see that more clearly in the next chart, which shows the various AIBs’ FPS results for all of the tests.
The GTX 690 does best on Batman: Arkham City and Dirt 3.
Pros and cons
One might argue that if you could overclock an AIB, why wouldn’t you? And the simple answer is you would, but not everyone would be able to achieve the same level of overclocking due to manufacturing tolerances in the GPUs. Obviously, if Nvidia could get 100% consistent performance out of the GTX 680/690, it would raise the specifications. For that reason, overclocking has no place in comparative testing but can be interesting for individual experimentation.
Just incidentally, and not part of this test, the GTX 690 got a higher performance score than an AMD HD 7970 and an HD 6970, but both beat it in the P3mark score (due to price and power).
The test platform
All tests were run using a 3.3-GHz Sandy Bridge Core i7-3960x processor, with 8 GB of 1.6-GHz DDR3 Corsair Viper Extreme RAM, in a system with a Windows 7 64-bit operating system and a 249.68-GB HDD.
The AIBs used in the tests are shown in the next table.
What do we think?
It’s a personal decision that is gated by your wallet. If I had two GTX 580s driving three screens in a surround 3D setup and wanted to reduce the heat and noise, and had the money, I’d immediately replace them with a GTX 690. In addition to being able to drive my surround 3D configuration, I could (with an active DVI splitter) drive even more screens. And … maybe most important, if I had two GTX 580s and could drop in one GTX 690 and then overclock the hell out of it, well, boys and their toys—it’s a lot safer than what I used to do with souped-up cars, and I never get grease under my fingernails on date night.
| | GTX 580 | GTX 590 | GTX 580 SLi | GTX 680 | GTX 680 SLi | GTX 690 |
|---|---|---|---|---|---|---|
| CUDA cores | 512 | 512 (2x) | 512 (2x) | 1536 | 3072 | 3072 |
| Memory interface (bits) | 384 | 768 | 768 | 256 | 512 | 512 |
| Die size (mm²) | 520 | 520 (2x) | 520 (2x) | 294 | 588 | 294 (2x) |