Gorgon
Newbie
Joined: Aug 20, 2003 · Messages: 6,684 · Reaction score: 0
Here's a story I posted at Driver Heaven earlier today! As I'm pushed for time, I'll post it as is.
I recently posted an article from Beyond3D called ATI vs. Nvidia Tomb Raider: Angel Of Darkness DX9 Benchmarks! The importance of this benchmark is that it is the first to fully utilise PS 2.0 DX9 functionality within a gaming environment.
What most people seem to have missed is that we weren't looking at the game itself, which in our Great Leader's words is "Zardon: I agree totally the game is utter garbage", but at how well ATI's and Nvidia's high-end and midrange cards compared when the DX9 Pixel Shader 2.0 functionality is enabled! Here's a rip from the thread.
Looking at the benchmarks, Nvidia is getting a spanking! Also worth noting is that in all cases Cg-compiled shaders were enabled for the NVIDIA boards and disabled for the ATI boards. There is no difference in the output of the shaders compiled by Cg; however, this should represent the best case for the NVIDIA boards.
WaltC: I think if you read the article you'll see that the game runs much less "sh*tty" on the Radeons...
Come on, now, after all the huff & puff coming out of nVidia about using "real 3d games" you surely can't object to using a real DX9 game as a test...? It's not a benchmark--it's a real 3d game. Lots of "real 3d games" run far less optimally than benchmarks, and other 3d games. Doesn't mean they aren't "real 3d games," however. What's interesting to me about this game is the fact that it's probably the first real DX9 game to hit the market, and looking at nVidia's DX9-feature support scores in this game it's not hard to see why the company quit FutureMark last year... If anything, nVidia's DX9 feature performance is even worse in this "real 3d game" than it is in 3dMk03.... It's an eye-opener in that regard, IMO...
And I said at the time that WaltC is right. Just wait and see what happens when Half-Life 2 is released. I bet you'll wish you'd bought an ATI card.
Now over to my good friend Matt Burris at 3DGPU:
This isn't confirmed as 100% official, so keep that in mind. On the HalfLife2.net forums, a gamer emailed Gabe Newell of Valve Software and asked him a question regarding the GeForce FX cards' dismal PS 2.0 performance in Tomb Raider, brought up by the article on Beyond3D (see this post). Here's the question he asked Gabe:
Is an ATi 9800 Pro card really a lot better for HL2 than Nvidia's FX5900? Or is the difference not that big (quality & fps wise)?
Click here for the answer:
http://www.3dgpu.com/modules/news/article.php?storyid=315
Thanks. :cheers: