Monday, November 15, 2010

Today is Crazy Graphics Cards Day

When the GeForce GTX 580 was released a few days ago, reviewers were surprised to find that it features a so-called "limiter," which aggressively throttles the card when either Furmark or OCCT is launched. Some of them dug up old versions of the aforementioned programs and proceeded to measure the beast's power draw, usually recording something like 300 to 310W.



But now, GPU-Z developer W1zzard from TechPowerUp has just added a new feature to GPU-Z that permits disabling the limiter. He's even measured the GTX 580's power draw under Furmark, and as it turns out, the thing pulls 350W!

Now you might be wondering why he got such a high result. My theory is that the latest version of Furmark puts an even heavier load on GF110 than older ones. This may well be why only recent versions of Furmark and OCCT are detected by the limiter. When said old versions were released, GF110 wasn't around to "optimize" for, so it may not be fully utilized.

Then again, in actual games (or 3DMark) the GTX 580 tends to draw a bit less power than the 480. Why? One possible reason is that games aren't quite that demanding, and under such circumstances, the GTX 580's cooler is able to cope very well with the card's heat output.
And as we know, power increases with temperature. We even know (thanks to Dave Baumann) that around its standard operating temperature, Cypress (HD 5800) draws about one additional watt per additional degree Celsius.
If we assume the additional heat-related power draw to be proportional to TDP, then GF100/110 draws about 1.6W more per additional °C around its standard operating temperature. Since the GTX 580 typically operates 15 to 20°C lower than the 480, we can expect it to draw approximately 24 to 32W less, all other things being equal. Of course, all other things are not equal: the 580 has more enabled units, higher clocks, and is based on a newer revision.
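The estimate above is simple enough to sketch in a few lines of Python. Note that the 1.6W/°C figure is my TDP-scaled guess for GF100/110, not a measured value:

```python
# Back-of-the-envelope version of the temperature-related power estimate.
# watts_per_degree is the TDP-scaled guess for GF100/110 (derived from
# Cypress's ~1 W/°C); the temperature delta is the typical gap between
# a GTX 580 and a GTX 480 under load.
watts_per_degree = 1.6          # estimated extra draw per additional °C
temp_delta_range = (15, 20)     # °C cooler the GTX 580 typically runs

savings = [watts_per_degree * dt for dt in temp_delta_range]
print(f"Expected temperature-related savings: "
      f"{savings[0]:.0f} to {savings[1]:.0f} W")
# → roughly 24 to 32 W, matching the figures in the text
```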

UPDATE: Psycho from the Beyond3D forum has just reminded me of this test by Kyle Bennett at HardOCP. He measured total system power draw in Furmark while keeping an eye on temperatures. The PSU was roughly 87% efficient. Shortly after launching Furmark, Kyle says the GPU is at 75°C while the system draws 449W. At the end of the run, the GPU is at 95°C and the system draws 481W. So that's 32 additional watts at the wall, or probably around 28W on the DC side once PSU efficiency is taken into account. Spread over the 20°C rise, that works out to about 1.4W per additional °C. So it looks like I wasn't too far off.
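The same numbers, worked through explicitly (the wall readings and temperatures are Kyle's; the 87% efficiency is the rough figure quoted above):

```python
# HardOCP Furmark run: wall draw rose from 449 W to 481 W while the GPU
# warmed from 75°C to 95°C, behind a PSU that is roughly 87% efficient.
psu_efficiency = 0.87
wall_start, wall_end = 449, 481   # W measured at the wall
temp_start, temp_end = 75, 95     # °C GPU temperature

delta_wall = wall_end - wall_start          # 32 W extra at the wall
delta_dc = delta_wall * psu_efficiency      # ~28 W extra actually delivered
per_degree = delta_dc / (temp_end - temp_start)

print(f"{delta_wall} W at the wall ≈ {delta_dc:.0f} W DC, "
      f"i.e. about {per_degree:.1f} W per additional °C")
```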

The 580's lower temperatures are probably an important factor in keeping power down in games, but that doesn't really help in Furmark, because with such a heavy load, its cooler has trouble keeping things… well, cool.

Still, interesting as this may be (and if you're reading this, then you haven't fallen asleep yet), in the end it doesn't really matter: I don't know about you, but I usually play games, not Furmark.

But the GTX 580 isn't the only crazy graphics card under the spotlight today. PCWorld.fr has just reviewed a CrossFire of ASUS ARESes. I certainly wouldn't advise anyone to buy that, especially now, but it's still fun to read, in a crazy, over-the-top kind of way. If Michael Bay were to publish a graphics card review, that's probably how he'd do it. Except at the end, the testing rig would be destroyed in a huge explosion and the reviewer would narrowly escape on the back of a giant Transformer. But other than that, it would be the same. Anyway, here is the review [French].


On a related note, there are whispers about an upcoming GTX 570. I'd expect 480 SPs, clocks around 675/1350MHz, and maybe 3.6Gbps memory for roughly 240W: basically a cheaper, cooler GTX 480. I'm not sure that will be enough against Cayman, though.
