Legit Reviews has used FurMark in their latest Radeon HD 5850 CrossFire review. The interesting thing is the comparative graph that shows the performance of AMD’s new babies compared to NVIDIA’s GeForce cards. But Nathan (the guy behind Legit Reviews) had some problems enabling multi-GPU support on both ATI (CrossFire) and NVIDIA (SLI) setups, so all scores are single-GPU scores. That’s why the HD 5870 CrossFire performs the same as a single HD 5870, and why the GeForce GTX 295 sits a little bit under the GeForce GTX 285.
That said, the interesting results are:
- a Radeon HD 5850 is around twice as fast as a GeForce GTX 275
- a Radeon HD 5870 is around 20% faster than a Radeon HD 5850
The Radeon HD 58xx is a real killer in OpenGL. And with AMD working hard on its OpenGL driver, these new Radeons seem to be a good choice for OpenGL users and developers. I can’t wait to test an HD 5850/5870…
Now a little word about FurMark and multi-GPU support. First thing: FurMark does support NVIDIA SLI and ATI CrossFire. It’s not simple to get multi-GPU rendering to work with FurMark, but it works, at least under Windows XP (maybe it’s different under Vista / Seven: Nathan’s review was done with Windows 7 Ultimate 64-bit …). The following screenshot shows FurMark 1.6.5 running on a GeForce GTS 250 SLI setup, and the SLI visual indicator shows that both GPUs are used:
FurMark 1.6.5 displays only one temperature graph (for the first GPU). This has been improved in FurMark 1.7.0, which displays a temperature graph for each GPU.
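For the curious, here is a minimal sketch of how a tool can read one temperature per GPU on NVIDIA hardware with NVAPI. It’s just an illustration based on the public NVAPI SDK, not FurMark’s actual code:

```cpp
// Minimal sketch: enumerate physical GPUs and read their thermal sensors
// via NVIDIA NVAPI (requires the NVAPI SDK headers and library).
#include <cstdio>
#include "nvapi.h"

int main()
{
    if (NvAPI_Initialize() != NVAPI_OK)
        return 1;

    NvPhysicalGpuHandle gpus[NVAPI_MAX_PHYSICAL_GPUS];
    NvU32 gpuCount = 0;
    if (NvAPI_EnumPhysicalGPUs(gpus, &gpuCount) != NVAPI_OK)
        return 1;

    for (NvU32 i = 0; i < gpuCount; ++i)
    {
        NV_GPU_THERMAL_SETTINGS thermal = {};
        thermal.version = NV_GPU_THERMAL_SETTINGS_VER;

        // Query every thermal sensor exposed by this physical GPU.
        if (NvAPI_GPU_GetThermalSettings(gpus[i], NVAPI_THERMAL_TARGET_ALL, &thermal) == NVAPI_OK)
        {
            for (NvU32 s = 0; s < thermal.count; ++s)
                printf("GPU %u - sensor %u: %d C\n", i, s, (int)thermal.sensor[s].currentTemp);
        }
    }

    NvAPI_Unload();
    return 0;
}
```

On a SLI system this loop reports one set of sensors per physical GPU, which is exactly what a per-GPU temperature graph needs.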
I never had the chance to test a CrossFire system myself, but I know it works: HERE and HERE.
I will shortly publish an article about how to use SLI with FurMark under XP and Vista.
I have a note about load balancing on SLI setups: I just got my hands on a 9800 GX2 and, using Everest for monitoring while FurMark 1.7.0 was running, I noticed a very odd readout of the two GPUs’ current values:
GPU1 was only drawing about 15A, while GPU2 went through the roof, exceeding 50A at times … as a result, my system started acting up and consequently crashed. Usually, both GPUs draw ~40A under full load using e.g. OCCT’s GPU test.
As you can guess, the temperatures were also very uneven, i.e. ~80°C for GPU1 but ~100°C for GPU2.
Any idea what might be causing this behaviour, and whether there’s anything I can do about it?
Cheers,
Maggi
nVidia bad testing…
“Real killer in OpenGL”? Going from FurMark to that conclusion sounds like quite a shortcut to me. I bet a Direct3D implementation would show similar results. FurMark is first a ROP-intensive program, and then a bandwidth-limited one.
All you see here is that the Radeon HD 5870 has 32 ROPs instead of 16, so the ROPs are no longer the bottleneck. Bandwidth is probably less of an issue because, if I’m right, the HD 5870 uses generic framebuffer compression, which could lead to a really high compression rate on blurry content.
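To put rough numbers on the ROP argument, here is a quick back-of-the-envelope fill-rate sketch. It assumes the commonly quoted reference core clocks and ROP counts, and real FurMark throughput also depends on blending and memory bandwidth, so treat the results as rough upper bounds only:

```cpp
// Theoretical pixel fill rate = ROP count * core clock (GPixels/s).
// Reference specs assumed; a rough comparison, not a measurement.
#include <cstdio>

int main()
{
    struct Gpu { const char* name; int rops; double clockGHz; };
    const Gpu gpus[] = {
        { "Radeon HD 4870",  16, 0.750 },  // previous generation: 16 ROPs
        { "Radeon HD 5850",  32, 0.725 },
        { "Radeon HD 5870",  32, 0.850 },
        { "GeForce GTX 275", 28, 0.633 },
    };
    for (size_t i = 0; i < sizeof(gpus) / sizeof(gpus[0]); ++i)
        printf("%-16s : %4.1f GPixels/s\n", gpus[i].name, gpus[i].rops * gpus[i].clockGHz);
    return 0;
}
```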
Chances are that the AMD hack to slow down FurMark is no longer needed to “protect some cards”, because it might be difficult to reach a 100% workload on these components … except maybe with Eyefinity.
Well, in any case, I don’t see any OpenGL driver optimisations at work here …
it’s all a big lie…
I own an ENGTX275 and just bought a Club3D Radeon 5870, and my GTX 275 is faster in games…
So I’m very, very, very disappointed after all the talk about the 5870; it’s essentially a piece of crap…
It benchmarks significantly higher in 3DMark 2006, but in games it sucks badly. I tried all the drivers and a lot of different Direct3D settings, upping the core and the memory clocks, but nothing gets it going… crap…
NVIDIA seems to be way better…
So I don’t believe any of these benchmarks anymore…
By the way, I repair and configure PCs for a living, so I know the tricks to get my software and hardware running optimally…
I’m not talking nonsense here… ATI sucks…
FurMark is shader-heavy, so of course it won’t show how crappy ATI’s drivers are at complex texture mapping.