Pr()ZaC
Guest
If you're pondering for a system update to play HL2 and other PS2.0 (DX9) games, you should check this out:
http://www.beyond3d.com/misc/traod_dx9perf/
Well, that's uninformed.
Originally posted by Asus:
I realize this benchmark is about DX9 compatibility, but there is a reason why sites do not benchmark using full AA/AF.
Nvidia has an extra level of AA (8x) that ATI doesn't have, while ATI has an extra level of AF (16x) that Nvidia doesn't have. You do not know what is causing the performance numbers if the cards are not running the same settings (resolution, AA/AF levels, color depth, etc.).
Remember, ATI uses 16-bit and 24-bit precision and converts to 32-bit when needed (it does not hold full 32-bit quality), while Nvidia uses 16-bit and 32-bit precision. The reference to 16-bit vs. 32-bit color depth in reflections isn't the same thing as shader precision.
Neither of these generations of cards is DX9 compliant (only compatible). MS changed the DX9 specification after both cards were already designed. Christmas isn't that far off if you want to wait for a DX9-compliant card (maybe even DX9.1?), that is, if you care about DX specifications.
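The precision point in the quoted post can be made concrete. As a minimal sketch (not from the thread): Python's `struct` module can round-trip a value through IEEE 754 half (16-bit) and single (32-bit) precision, showing how much accuracy each format keeps. Note that ATI's FP24 format has no stdlib representation, so only the 16-bit and 32-bit cases are shown here.

```python
import struct

def round_trip(value: float, fmt: str) -> float:
    """Pack a float into the given struct format and read it back,
    simulating storage at that precision."""
    return struct.unpack(fmt, struct.pack(fmt, value))[0]

x = 1.0 / 3.0
fp16 = round_trip(x, "e")  # half precision: 10 mantissa bits
fp32 = round_trip(x, "f")  # single precision: 23 mantissa bits

print(f"exact: {x!r}")
print(f"fp16 : {fp16!r}  error {abs(fp16 - x):.2e}")
print(f"fp32 : {fp32!r}  error {abs(fp32 - x):.2e}")
```

The FP16 error is on the order of 1e-4 while FP32 stays near 1e-8, which is why partial-precision shader paths can show visible banding in reflections and other long arithmetic chains.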
What part of "read the thread" do you not understand?
Originally posted by SidewinderX143:
"bugs in nvidia's drivers"
proof?
I have been a long-time NVIDIA card user. Currently I have ATI 9800 Pros in both my work and home machines.
The DX9 performance described by the Beyond3D article is consistent with what we've been seeing.
Well, I assume the R360 will be better equipped to handle DX9, but again, you won't see compliant cards for a while longer.
Originally posted by Asus:
I realize this benchmark is about DX9 compatibility, but there is a reason why sites do not benchmark using full AA/AF.
Nvidia has an extra level of AA (8x) that ATI doesn't have, while ATI has an extra level of AF (16x) that Nvidia doesn't have. You do not know what is causing the performance numbers if the cards are not running the same settings (resolution, AA/AF levels, color depth, etc.).
Remember, ATI uses 16-bit and 24-bit precision and converts to 32-bit when needed (it does not hold full 32-bit quality), while Nvidia uses 16-bit and 32-bit precision. The reference to 16-bit vs. 32-bit color depth in reflections isn't the same thing as shader precision.
Neither of these generations of cards is DX9 compliant (only compatible). MS changed the DX9 specification after both cards were already designed. Christmas isn't that far off if you want to wait for a DX9-compliant card (maybe even DX9.1?), that is, if you care about DX specifications.
R300 is DX9 compliant; no one has proven otherwise yet.
Originally posted by Pagy:
Well i assume the R360 will be better equipped to handle dx9, but again, you won't see compliant cards for a while longer.
I feel that ATI's 6xAA should be tested against Nvidia's 8xAA; the Radeons are faster and look better. If anything, sites should state that the settings are different at "max quality settings" and show the tests anyhow. It's not an apples-to-apples test per se, but it sure is a decent performance comparison.
Overall, it means that with 2x the speed in DX9 games, people using ATI can get a hell of a lot higher image quality, both by default (that 2x is already at higher quality) and because you can enable more and still keep up with Nvidia. High FSAA/AF is out of the question for Nvidia, and they are the ones that really need the taxing FSAA, if you want to use it, that is.
Originally posted by alco:
So, in summation, how does this apply to the games?
HL2, D3, STALKER, Far Cry, MP2?