Not to mention that you can't just preorder a random GF6800U. You have to pick a certain brand. Different manufacturers use different cooling solutions and clockspeeds on their cards. For example, it's said that the Albatron GF6800U will be clocked at an amazing 600/1000MHz. :O
There are literally dozens of 6800ultra reviews on the web. Most of them look the same. Since XBitLabs isn't as well known as HardOCP, Anand or THG, they have to do something special in order to draw an audience to their review. They chose to do it with the stolen HL2 beta. Lame.
Btw, the...
No, he only has a 2 GHz CPU, which will be a bottleneck. A good graphics card needs to be accompanied by a good CPU; otherwise it won't be able to fully spread its wings, so to speak. I think his 9600np matches his 2 GHz CPU perfectly.
XBitLabs is abusing software that was illegally taken from Valve. They do this for the sole purpose of generating hits for their site. I have zero respect for websites that act like that. :flame:
Highly unlikely. Support for Shader Model 3.0 requires an entirely new chip design. Besides, a leaked internal ATI presentation already noted that the R420 won't have SM3.0 support.
Other than that, the rest of your post makes sense. A combined Valve + ATI demonstration at the next E3 could...
The drivers that Tomshardware used didn't have Shader Model 3.0 enabled yet, so Far Cry defaulted to PS 1.1/2.0 mixed mode for the NVidia card. The screenshots that ray_MAN posted give a hint of the true shader quality of the NV40 (although I wish the screenshots weren't scaled down in size)...
Wow.... :O
Very impressive.
Between all the dozens of benchmarks, these struck me the most: Call Of Duty @ TomsHardware.
The FX5950U and 9800XT look like low-end cards next to the GeForce6800U. CoD doesn't have any pixelshaders of course, so I'm very much looking forward to benchmarks with...
Wow, a lot of people are immediately trying their hardest to defend the Catalyst drivers! There's really no need to. Both the Catalyst and the ForceWare drivers currently have some minor bugs/open issues, but nothing worth bitching about.
I have used NVidia's drivers since 2000 and never...
No, that was normal mapping and some of the other techniques that were outlined in the GDC document.
HDR is related to light sources (like the bloom effect).
EDIT: Spiffae beat me to it.
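To expand a bit: HDR rendering keeps luminance values above 1.0 for bright light sources and only compresses them down to the displayable range at the very end. A toy sketch of that compression step, using a standard Reinhard-style tone map (my own illustration, not code from any actual engine):

```python
def reinhard(luminance):
    """Reinhard tone mapping: squash an HDR luminance value
    (which can be far above 1.0 for bright light sources)
    into the displayable [0, 1) range."""
    return luminance / (1.0 + luminance)

# A dim wall, a lamp and the sun stay distinguishable instead
# of all clipping to pure white:
for lum in (0.5, 4.0, 100.0):
    print(f"{lum:>6} -> {reinhard(lum):.3f}")
```

The bloom effect is then just a blur applied to the pixels whose luminance was above 1.0 before tone mapping.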
21
Far Cry is the best game I've played in a long time. The boat and buggy chases were particularly awesome. If HL2 and Doom3 weren't appearing later this year, this would have been my choice for game of the year.
Return To Castle Wolfenstein, because of the medic, engineer and lieutenant classes, the interesting objectives, the teamplay, the mature community and the funny voicechats.
I think the main reason for that is that the FX cards have fewer shader pipelines than the Radeon cards.
Pixel Shader 3.0 requires 32-bit precision, so NVidia won't throw 32-bit out of the window. ATI will stick with 24-bit, which is potentially faster. The disadvantage is that 24-bit is not enough for PS3.0.
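The precision gap is easy to picture: FP24 keeps a 16-bit mantissa versus FP32's 23 bits. A crude toy model that quantizes a double's mantissa (illustration only, not how real GPU hardware rounds):

```python
import math

def quantize(x, mantissa_bits):
    """Round x to the given number of mantissa bits - a rough
    stand-in for reduced shader precision (illustration only)."""
    if x == 0.0:
        return 0.0
    m, e = math.frexp(x)          # x = m * 2**e, with 0.5 <= |m| < 1
    scale = 2.0 ** mantissa_bits
    return math.ldexp(round(m * scale) / scale, e)

err24 = abs(1 / 3 - quantize(1 / 3, 16))   # ~FP24-style precision
err32 = abs(1 / 3 - quantize(1 / 3, 23))   # ~FP32-style precision
print(err24 > err32)  # the 24-bit result is noticeably coarser
```

Those rounding errors pile up when a shader chains many instructions, which is why PS3.0 mandates the higher precision.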
NVidia announces its new graphics chip on April 13th. So I guess we'll see a lot of benchmarks pop up in two days.
Of course, after that we'll still need to wait for the launch of the R420, before any real comparisons can be made.