Nvidia wins in HL2?

1. HL2 leak (wait for the game)
2. Those tests are somewhat CPU limited (70 FPS at 1024x, 1280x and 1600x; see the quick sanity check below)
3. It's a tie... but the X800 does lead by a full 10 FPS on the 2nd benchmark at 1600x (not a lot)
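A quick way to sanity-check the CPU-limit claim, as a minimal Python sketch (the FPS numbers below are made-up stand-ins for the Xbit charts, not the actual results): if frame rates barely move as the resolution rises, the GPU isn't the bottleneck.

# Hypothetical numbers, not the actual Xbit results:
fps_by_resolution = {"1024x768": 70, "1280x1024": 70, "1600x1200": 69}

values = list(fps_by_resolution.values())
spread = (max(values) - min(values)) / max(values)

# A near-flat FPS curve across resolutions points at a CPU bottleneck.
print("CPU-limited" if spread < 0.05 else "GPU-limited")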

BTW, I was bored and took 5 mins to create the Excel file over Xbit Labs' benchmarks, in sig.
 
Gabe Newell has said that the X800 is 40% faster than Nvidia's 6800 in Half-Life 2, so don't believe those stats.
 
WhiteBoy said:
Gabe Newell has said that the X800 is 40% faster than Nvidia's 6800 in Half-Life 2, so don't believe those stats.

Now, I hate to sound negative... but Gabe may be a TAD bit biased towards ATI.
 
Well, if Crytek said the X800 was 50% better than the 6800U... it wouldn't be lies.
Look at the X800 vs the 6800U with AA/AF enabled in Far Cry (let alone renaming it to FartCry).

Gabe didn't give details on when that 40% occurs.
 
Asus said:
Well, if Crytek said the X800 was 50% better than the 6800U... it wouldn't be lies.
Look at the X800 vs the 6800U with AA/AF enabled in Far Cry (let alone renaming it to FartCry).
The 6800U might draw some benefit from the Shader Model 3.0 mode, once that's implemented.
 
WhiteBoy said:
Gabe Newell has said that the X800 is 40% faster than Nvidia's 6800 in Half-Life 2, so don't believe those stats.

Gabe also said the game would be out on Sept 30th, 2003...
 
...as the 12-pipe and 16-pipe flavours of the chip demonstrate equal speeds. This is probably another example of suspected CPU limitations or some issues with ATI's beta drivers.

Let's not forget that the cards are running in the wrong code paths as well. Valve went to hell and back to get the 5800 to run at acceptable speeds in HL2 (and failed), and it's most likely the 6800 is running in its code path.

Personally, I don't think Xbit should be benchmarking alpha code, and I also don't think they should be calling a DX8 bench DX9. Bloody retards.
 
Arno said:
The 6800U might draw some benefit from the Shader Model 3.0 mode, once that's implemented.
Well, being tied at 1024x, 1280x, and 1600x and then losing by 50% with AA/AF isn't a shader issue at all.

mrchimp said:
Let's not forget that the cards are running in the wrong code paths as well. Valve went to hell and back to get the 5800 to run at acceptable speeds in HL2 (and failed), and it's most likely the 6800 is running in its code path.
Yep!
Just like Far Cry (bad shader IQ and all).
 
Yeah, the leak is at least one year old; I wouldn't think of those performance numbers as accurate.
 
Asus said:
Well, being tied at 1024x, 1280x, and 1600x and then losing by 50% with AA/AF isn't a shader issue at all.
Improved shader performance can give higher fps in all resolutions and AA modes.
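To illustrate with a toy frame-time model in Python (the millisecond costs below are made up for illustration, not measured from either card): a frame's cost is roughly the sum of CPU, shader, and raster/bandwidth work, so cutting the shader term lifts FPS at every resolution and AA level.

def fps(cpu_ms, shader_ms, raster_ms):
    # Total frame time is the sum of its parts; FPS is the reciprocal.
    return 1000.0 / (cpu_ms + shader_ms + raster_ms)

# Raster/bandwidth cost grows with resolution and AA/AF; halving the
# shader cost still raises FPS in every row.
for label, raster_ms in (("1024x", 2.0), ("1280x", 6.0), ("1600x + AA/AF", 12.0)):
    slow = fps(cpu_ms=8.0, shader_ms=6.0, raster_ms=raster_ms)
    fast = fps(cpu_ms=8.0, shader_ms=3.0, raster_ms=raster_ms)
    print(f"{label:>14}: {slow:5.1f} -> {fast:5.1f} FPS")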
 
Benchmarking non-production cards on non-production software. Interesting.

I am positive that how HL2 does shaders and treats different cards will be different from the leaked build. It's just dumb to compare them.
 
Asus said:
Well, if Crytek said the X800 was 50% better than the 6800U... it wouldn't be lies.
Look at the X800 vs the 6800U with AA/AF enabled in Far Cry (let alone renaming it to FartCry).

Gabe didn't give details on when that 40% occurs.
Funnily enough, because Far Cry is an "Nvidia: The Way It's Meant to Be Played" game.
 
DiSTuRbEd said:
IT STILL IS!!!!! :naughty:

Yeah! Last time I looked it was 9th May _2004_ and I still don't have HL2.

Gabe's not going to say "the ATI scrapes by the Nvidia in HL2" because ATI + Valve = together... duh! When I see a 6800U getting 50 FPS in HL2 and an X800 getting 90 FPS, then Gabe will be right. He might look at HL2 every day and know a lot more than I do, but companies that have teamed up don't try to ruin each other's cash flow.

I wonder how many people will go out and buy an X800 because 'GABE' says it runs 40% faster... I wonder, I wonder!!! Anyone else thinking "businessmen at work"?
 
The X800 Pro and XT will definitely get better frame rates than the 6800U, I wouldn't doubt that, but probably not by much.
 
the "leaked" version is not current version code.
TWIMTBP is just nVidia paying the devs off to put a sticker or logo on their game case or loading screen. DOES NOT(from what I've seen) TRANSLATE TO ACTUAL EXTRA SPEED.
It is kinda.. well, I guess really stupid to benchmark a alpha version game.
 
I won't believe Gabe's BS until I see the frame rates either... I really don't care if my FPS is 50 or 1000000000; if it's over 25, I'll play it ^_^... and there's just something in Nvidia I just can't get out of ATI :afro:
 
Johan_Tayn said:
the "leaked" version is not current version code.
TWIMTBP is just nVidia paying the devs off to put a sticker or logo on their game case or loading screen. DOES NOT(from what I've seen) TRANSLATE TO ACTUAL EXTRA SPEED.
It is kinda.. well, I guess really stupid to benchmark a alpha version game.
I'd like to point out that's exactly the same as Get in the Game.
 
ComradeBadger said:
I'd like to point out that's exactly the same as Get in the Game.

Yeah, pretty much....

But nVidia uses it a lot more loosely than ATI does.....
 
[Matt] said:
Gabe also said the game would be out on Sept 30th, 2003...

Now, he never said what form it would be in, so TECHNICALLY it was released on/around Sept. 30, 2003 ;)
 
I find fanboys amusing and confusing at the same time. :)

I wish I could have such blind faith or fulfillment. Then again, I don't; I want to know what is actually best for me and let that be what satisfies me.
 
Alig said:
Gabe's not going to say "the ATI scrapes by the Nvidia in HL2" because ATI + Valve = together... duh! When I see a 6800U getting 50 FPS in HL2 and an X800 getting 90 FPS, then Gabe will be right. He might look at HL2 every day and know a lot more than I do, but companies that have teamed up don't try to ruin each other's cash flow.

I wonder how many people will go out and buy an X800 because 'GABE' says it runs 40% faster... I wonder, I wonder!!! Anyone else thinking "businessmen at work"?

NEVER!!!! ;)
 