Benchmark Result Debate - Lies??

Matrix · Newbie · Joined Sep 4, 2003 · Messages: 85
OK, all this Half-Life 2 buzz going around is confusing and upsetting many fans of NVIDIA and ATI. It's very ironic; just click this link and look at the left-hand side of the page.

http://www.nvidia.com/object/winxp-2k_45.23

Oh, and I just found this article on the same site that released the benchmarks:

NVIDIA Responds

Here's the official statement from NVIDIA regarding the recent Half-Life 2 performance issues:

Over the last 24 hours, there has been quite a bit of controversy over comments made by Gabe Newell of Valve at ATI's Shader Day.

During the entire development of Half-Life 2, NVIDIA has had close technical contact with Valve regarding the game. However, Valve has not made us aware of the issues Gabe discussed.

We're confused as to why Valve chose to use Release 45 (Rel. 45) - because up until two weeks prior to the Shader Day we had been working closely with Valve to ensure that Release 50 (Rel. 50) provides the best experience possible on NVIDIA hardware.

Regarding the Half-Life 2 performance numbers that were published on the web, we believe these performance numbers are invalid because they do not use our Rel. 50 drivers. Engineering efforts on our Rel. 45 drivers stopped months ago in anticipation of Rel. 50. NVIDIA's optimizations for Half-Life 2 and other new games are included in our Rel. 50 drivers - which reviewers currently have a beta version of today. Rel. 50 is the best driver we've ever built - it includes significant optimizations for the highly programmable GeForce FX architecture and includes feature and performance benefits for over 100 million NVIDIA GPU customers.

Pending detailed information from Valve, we are only aware of one bug with Rel. 50 and the version of Half-Life 2 that we currently have - this is the fog issue that Gabe referred to in his presentation. It is not a cheat or an over-optimization. Our current drop of Half-Life 2 is more than 2 weeks old. NVIDIA's Rel. 50 driver will be public before the game is available. Since we know that obtaining the best pixel shader performance from the GeForce FX GPUs currently requires some specialized work, our developer technology team works very closely with game developers. Part of this is understanding that in many cases promoting PS 1.4 (DirectX 8) to PS 2.0 (DirectX 9) provides no image quality benefit. Sometimes this involves converting 32-bit floating point precision shader operations into 16-bit floating point precision shaders in order to obtain the performance benefit of this mode with no image quality degradation. Our goal is to provide our consumers the best experience possible, and that means games must both look and run great.

The optimal code path for ATI and NVIDIA GPUs is different - so trying to test them with the same code path will always disadvantage one or the other. The default settings for each game have been chosen by both the developers and NVIDIA in order to produce the best results for our consumers.

In addition to the developer efforts, our driver team has developed a next-generation automatic shader optimizer that vastly improves GeForce FX pixel shader performance across the board. The fruits of these efforts will be seen in our Rel. 50 driver release. Many other improvements have also been included in Rel. 50, and these were all created either in response to, or in anticipation of, the first wave of shipping DirectX 9 titles, such as Half-Life 2.

We are committed to working with Gabe to fully understand his concerns and with Valve to ensure that 100+ million NVIDIA consumers get the best possible experience with Half Life 2 on NVIDIA hardware.

http://www.gamersdepot.com/index.asp

OK, so there's the link if you want to see.
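NVIDIA's statement leans on converting fp32 shader math to fp16 "with no image quality degradation." Here is a toy sketch of why that can plausibly be invisible on screen (this is my illustration, not NVIDIA's driver code; it assumes Python with NumPy): for color values in the 0..1 range, fp16 rounding error is far smaller than the step size of an 8-bit display channel.

```python
import numpy as np

# A toy illustration (not NVIDIA's actual driver logic): the same shader
# result stored at 32-bit vs 16-bit floating point precision.
color32 = np.float32(0.72513)   # a shader output at full precision
color16 = np.float16(color32)   # the same value rounded to fp16

err = abs(float(color32) - float(color16))
print(f"fp32 value: {float(color32):.6f}")
print(f"fp16 value: {float(color16):.6f}")
print(f"absolute error: {err:.6f}")

# An 8-bit display channel quantizes 0..1 into 256 steps of ~0.0039 each,
# while fp16 rounding error near this value is under 0.00025 - so both
# precisions map to the same on-screen color.
print("visible on an 8-bit display:", err > 1 / 255)  # → False
```

The flip side, which Gabe's complaint is about, is that fp16 error grows quickly once shader math chains many operations together (lighting, specular, etc.), so "no degradation" holds only for simple cases.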

But it's strange: how is it that ATI is so much better while NVIDIA fails so utterly? It makes no sense. Some insisted ATI paid off Gabe, but we can't know that. Remember, these are two TOP contenders that want the huge coinage and are using Half-Life 2 to get the business... so what do you all think? Everyone thinks GeForces will do badly just because Gabe said so, but I have a feeling it will all turn out OK. And if he is right, then I'm gonna be pissed off to have to fork out more money on upgrading.
 
It is odd for NVIDIA to seemingly drop the ball so badly... Maybe it's really less of a discrepancy than it seems.
 
[image: HL2.jpg]
 
Ahahah, I saw that pic earlier. But we won't know for a while; some also suggest it's a way out of the Sept 30 launch, but I doubt it.
 
The benchmark has been given to hardware sites, I'm sure ATI isn't going to pay off all of them :p
 
Even John Carmack has said that NVIDIA's PS 2.0 performance is bad.

The people who post this shit about ATI paying off Valve are conspiracy theory freaks, or are in denial. Valve has done a lot of work to get Nvidia hardware to run HL2 well.
 
Gabe stated (I read it on one of those tech sites with the benchmarks) that they were NOT pleased with NVIDIA's optimizations in the Rel. 50 drivers, so they wanted people NOT to use them. NVIDIA probably knows this and thus should not complain.
 
Yeah right, I read on one of the pages that they were taking things out of the game to improve performance. That's lame. Anyway, NVIDIA sucks; those 2-slot mega-fans are for suckers.

edit:".. Rel. 50 driver will be public before the game is available. Since we know that obtaining the best pixel shader performance from the GeForce FX GPUs currently requires some specialized work...."

specialized work.... nvidia team "hmmm what can we eliminate from the game that people won't notice."
 
NVIDIA should just admit they suck, cut their losses and go back to the drawing board for a new card. No shame in that.

Also, as far as I understand, NVIDIA already has to use some special shader paths to get AA/AF working properly. This already makes the card slower than ATI's, so where else do they expect to cut corners?

(If I am wrong about this, someone correct me.)
 
I was reading this in a Detonator driver readme:

DirectX 9 Support
When Microsoft releases DirectX 9 runtime, Release 40 will provide support
for DirectX 9, which includes the new vertex shaders, antialiasing modes,
and multi-display device support.

So is there still hope after all???
 
Originally posted by droper
Yeah right, I read on one of the pages that they were taking things out of the game to improve performance. That's lame. Anyway, NVIDIA sucks; those 2-slot mega-fans are for suckers.

got a link?

You should know by now that saying "I read" or "I heard" on these forums does not cut it. Don't start rumors unless you've got a valid link.

Originally posted by droper
edit:".. Rel. 50 driver will be public before the game is available. Since we know that obtaining the best pixel shader performance from the GeForce FX GPUs currently requires some specialized work...."

specialized work.... nvidia team "hmmm what can we eliminate from the game that people won't notice."


Like I said, do you have a link? And NVIDIA can't touch what Valve puts into or takes out of HL2, so I really don't know where you're getting your information.
 
LOL on that pic alehm ! :D

As for that reply by NVIDIA, it seems to me like a load of crap. At the moment I believe Gabe's interpretation of the facts.
 
Seems like a big excuse to me...

Same here. We'll know more tomorrow when the hardware sites start releasing their benchmarks. I think anandtech.com is releasing theirs at midnight tomorrow; not sure when the other sites will be releasing theirs...

It will be interesting to see just what the Detonator 50 drivers do performance-wise. If they do show a significant performance increase, then there will probably be a long debate as to whether they cheated or not. I don't think we should expect very drastic gains, as NVIDIA has obviously been optimizing their drivers for DirectX 9 for some time now; they wouldn't wait until this late to release DirectX 9-optimized drivers. Either way, I'm happy I got my Radeon 9800 non-Pro turned Pro and then some...
 
I don't know why this is a big surprise to people. There have been other recent games where NVIDIA has failed. Take the latest Tomb Raider: a bad game, but it uses a fair amount of PS 2.0 stuff, and NVIDIA's card has apparently been dismal with it... or so I hear.
 
"..The optimal code path for ATI and NVIDIA GPUs is different ..."
Damn, wasn't Gabe clear enough? NVIDIA's hardware is not built properly around the DirectX 9 code path; that's why it performs badly in games built around DirectX 9.

These people... "our 100 million customers"... ahahah, shut up and play dead.
 
Originally posted by droper
nvidia team "hmmm what can we eliminate from the game that people won't notice."

What's wrong with that? Would you rather it render all 50 faces at 10 FPS, or render only the 2 you can actually see at 40?

(The numbers are made up; I just wanted to show the basic gist.)

"..The optimal code path for ATI and NVIDIA GPUs is different ..."
Damn, wasn't Gabe clear enough? NVIDIA's hardware is not built properly around the DirectX 9 code path; that's why it performs badly in games built around DirectX 9.

Thus new drivers.
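The "render only the faces you can see" idea in the post above is essentially back-face culling. A toy sketch, purely illustrative (this is not Valve's or any driver's actual code): skip triangles whose surface normal points away from the camera, so no shader time is spent on faces the player cannot see.

```python
# Toy back-face culling: a face is visible when its normal opposes the
# viewing direction, i.e. the dot product is negative.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def visible_faces(faces, view_dir):
    """Keep only faces whose normal points toward the viewer."""
    return [f for f in faces if dot(f["normal"], view_dir) < 0]

# Camera looking down -z: only the face whose normal points back at the
# camera (+z) survives the cull.
faces = [
    {"name": "front", "normal": (0, 0, 1)},
    {"name": "back",  "normal": (0, 0, -1)},
    {"name": "left",  "normal": (-1, 0, 0)},
]
view_dir = (0, 0, -1)
print([f["name"] for f in visible_faces(faces, view_dir)])  # → ['front']
```

Culling like this is a legitimate, lossless optimization; the controversy in this thread is about optimizations that actually change what ends up on screen.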
 
This has been the most interesting event surrounding HL2 since E3.

At least Valve is trying their best to make HL2 work with NVIDIA cards. They want people who love to play games to enjoy playing HL2. I give Gabe kudos for that.
 
I just bought a Radeon 9700 Pro, and with the newest Cat 3.7 drivers all my games seem relatively dark, except when I look at a light source (bulbs, sky, glare), and then it's almost blinding. I never experienced this on my old GF 4600. Sure, the games look great and run smoothly, but it's hard to play FPS games the way I used to when the enemy is too hard to see. I tried playing with the gamma/contrast/brightness hotkeys to get a balance in the game, but then other games end up too bright and/or too dark. It's annoying as hell, and methinks of going back to NVIDIA, since I never had to fiddle with that crap to enjoy gaming on those cards.
 
I would want it the way Valve released it. Doing that kind of thing is basically lying to the customer so they won't buy the competitor's product, under the false pretence that it's the same thing or better. If you have an NVIDIA card then that kind of thing is your only hope. I actually hope NVIDIA fixes it, but drivers can only do so much.

edit: this is a comment on a previous post in this thread about NVIDIA actually changing the game to make it work better on their card.
 
Originally posted by droper
I would want it the way Valve released it. Doing that kind of thing is basically lying to the customer so they won't buy the competitor's product, under the false pretence that it's the same thing or better. If you have an NVIDIA card then that kind of thing is your only hope.

hl2 has competitors? what competitors? ;)
 
Whats wrong with that, would you want it to render 50 faces at 10 FPS, then render the only 2 you see at 40?

It still makes a big difference. These sorts of modifications should be up to the user to implement or not. The optimizations we are referring to are ones that degrade image quality, even if only by a small amount. The user should be the one deciding whether to trade quality for performance.

This is also for the sake of comparison: if NVIDIA ships with really bad image quality but insanely fast benchmark scores, and ATI ships with good quality and good benchmarks, that misleads consumers into thinking NVIDIA is much better than ATI, when in reality you can't draw a proper conclusion.
 
Lots of competition coming up... although HL2 may be the clear winner in our minds, games like Stalker, Doom 3, Call of Duty and many upcoming FPS are in others' minds. It's almost certain HL2 will be great, but I am looking forward to the others with almost as much anticipation, as they all look amazing. Although everything's too damn dark on my Radeon 9700 Pro. Maybe I should have stuck with NVIDIA :D
 
Originally posted by TheWall421
It still makes a big difference. These sorts of modifications should be up to the user to implement or not. The optimizations we are referring to are ones that degrade image quality, even if only by a small amount. The user should be the one deciding whether to trade quality for performance.


But you would not know the difference, only faster FPS.

It seems to me you are looking for a reason to bash NVIDIA by making it the user's decision. Who in their right mind would want slower FPS in exchange for nothing?

As for the rest of what you said, that's almost a different topic. I will still probably get a 9800 anyway; I am just trying to keep some hope alive for the NVIDIA owners.
 
Just turn up your gamma in Display Properties to raise the brightness; it should carry over to all your games...
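For what it's worth, a gamma slider is roughly a power-law curve: it brightens mid-tones while leaving pure black and pure white untouched, which is why it works better for dark games than raw brightness/contrast. A toy sketch (assuming the standard power-law gamma model; this is not ATI's actual driver code):

```python
# Toy gamma adjustment: map a 0..1 brightness value through a power curve.
# Gamma above 1.0 lifts mid-tones; the endpoints 0.0 and 1.0 never move.

def apply_gamma(value, gamma):
    """Apply a power-law gamma curve to a normalized brightness value."""
    return value ** (1.0 / gamma)

dark_midtone = 0.25
print(apply_gamma(dark_midtone, 1.0))  # unchanged: 0.25
print(apply_gamma(dark_midtone, 2.0))  # lifted to 0.5 - shadows open up
print(apply_gamma(0.0, 2.0))           # black stays black: 0.0
print(apply_gamma(1.0, 2.0))           # white stays white: 1.0
```

This also explains TAZ's complaint: because the curve squeezes the bright end together, lifting the shadows makes already-bright areas (bulbs, sky) blow out toward white.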
 
Originally posted by TAZ
I just bought a Radeon 9700 Pro, and with the newest Cat 3.7 drivers all my games seem relatively dark, except when I look at a light source (bulbs, sky, glare), and then it's almost blinding. I never experienced this on my old GF 4600. Sure, the games look great and run smoothly, but it's hard to play FPS games the way I used to when the enemy is too hard to see. I tried playing with the gamma/contrast/brightness hotkeys to get a balance in the game, but then other games end up too bright and/or too dark. It's annoying as hell, and methinks of going back to NVIDIA, since I never had to fiddle with that crap to enjoy gaming on those cards.

And the funny part is, technically you didn't need to upgrade for Half-Life 2, since the Ti 4600 in DirectX 8.1 at 1024x768 will run about 45 fps, which is pretty decent...

So your old video card will run even better than the FX 5600 will. Weird...
 
Originally posted by TAZ
Lots of competition out there coming up.. although HL2 may be a clear winner in our minds, games like Stalker, Doom3, Call of Duty and many upcoming FPS are in others minds. It's almost certain HL2 will be great, but I am also looking forward to the others in almost as much anticipation as they all look amazing. Although too damn dark on my Radeon 9700 pro. Maybe I should have stuck with Nvidia :D

You just named three very different games, and all three are also very different from HL2. They can all be great, and I wouldn't call them competition. Oh, and Call of Duty is using the Q3 engine; it's nothing super great (it's a great game judging from the demo, just not a great engine), although they have achieved a lot with it.

BTW, why not turn the contrast on your monitor up to 100% and then adjust brightness as needed?
 
Originally posted by Impute
But you would not know the difference, only faster FPS.

As for the rest of what you said, that's almost a different topic. I will still probably get a 9800 anyway; I am just trying to keep some hope alive for the NVIDIA owners.

Hope isn't going to improve their FPS. A lot of NVIDIA owners shelled out big bucks based on NVIDIA's claims of DirectX 9 compliance and performance. That's not cool at all, considering many games in the future are going to be based on the same technology. Personally, I am shocked at the performance of the 5900 Ultra vs the 9600; that just shouldn't happen.
 
But you would not know the difference, only faster FPS.

If no one can tell a difference then yeah, that's fine. But graphics chip makers can only tweak so much performance out of their cards without sacrificing quality, and NVIDIA has a lot of catching up to do.

Benchmarkers have been complaining for some time now that graphics cards are becoming too hard to compare, because some companies (namely NVIDIA) are degrading the quality of their images to gain more performance. Benchmarks are supposed to compare two identical scenarios, but when one card ends up looking far worse than another, it's not a fair comparison. Then again, graphics card makers don't really have any sort of standard benchmark for image quality, so it will be a long time before this issue is resolved.
 
Gabe just replied to my e-mail and told me the best card for Half-Life 2 at the moment is the ATI 9600 Pro. Guys, you know he really is on the forums; you don't want me to post a screenie, do you? *so lazy, lol* I'm already shopping around; the lowest I can find near me is $160 (Canadian).
 
Originally posted by droper
Hope isn't going to improve their FPS. A lot of NVIDIA owners shelled out big bucks based on NVIDIA's claims of DirectX 9 compliance and performance. That's not cool at all, considering many games in the future are going to be based on the same technology. Personally, I am shocked at the performance of the 5900 Ultra vs the 9600; that just shouldn't happen.

Yeah, I just recently bought a 5600 and I am now feeling quite disillusioned...
Well, I will wait at least until the standalone benchmark comes out (why so late?) and/or the full game before making any more rash decisions.
Maybe the Rel. 50 drivers will make a difference...
 
Originally posted by =)PoLo(=
And the funny part is, technically you didn't need to upgrade for Half-Life 2, since the Ti 4600 in DirectX 8.1 at 1024x768 will run about 45 fps, which is pretty decent...

So your old video card will run even better than the FX 5600 will. Weird...

My GeForce 4 Ti 4600 is in my Athlon 1800+ machine, which had a GeForce 3 Ti 200; this way me and my son can play HL2 together. My 2 GHz Athlon XP 2500 (200x10) should be more than okay to run HL2 with my 9700 Pro, but I might upgrade the CPU to a 3000+ before then.

And I adjusted everything... gamma, contrast, brightness. It's either too damn dark in places, or way too bright around the lights once I make the dark places visible. There's probably a setting that makes it all clear, but why the hell should I have to play with it in the first place? Shouldn't this be right at the defaults? Or could it be my monitor with XP, since when I install the monitor drivers it says they're not recommended by MicroSuck?
 
Originally posted by Matrix
Gabe just replied to my e-mail and told me the best card for Half-Life 2 at the moment is the ATI 9600 Pro. Guys, you know he really is on the forums; you don't want me to post a screenie, do you? *so lazy, lol* I'm already shopping around; the lowest I can find near me is $160 (Canadian).

The 9600 is the best performance for the price; it is NOT the best card to run HL2 regardless of price.

The best card for HL2 (regardless of price) is the 9800 Pro.
 
Ooh damn -_- I knew that would have been too good to be true. I should save $50 every 2 weeks until I can afford the 9800 Pro 256 DDR :).
 
Originally posted by Xtasy0
You just named three very different games, and all three are also very different from HL2. They can all be great, and I wouldn't call them competition. Oh, and Call of Duty is using the Q3 engine; it's nothing super great (it's a great game judging from the demo, just not a great engine), although they have achieved a lot with it.

BTW, why not turn the contrast on your monitor up to 100% and then adjust brightness as needed?

All games are competitors for GOTY :)

Even that stupid Sims crap which most people somehow enjoy and buy.

And my contrast is at 100%, brightness at 50%, and I've fiddled with them all: the monitor, the desktop, and the D3D settings, trying to get a good balance. But when I finally do and then play another game, I have to do it all again. And yes, I save the profile for each game once I finally get it set. It's just a pain I never had with my GF card.
 
As an unfortunate NVIDIA owner (FX 5600 non-Pro), I hope these new drivers actually do close the gap a little with ATI's cards. I am in no way a fanboy of either company; I just happened to be forced into getting the NVIDIA card this time, despite all the positive things I've heard about the ATI cards.

I don't know enough about graphics cards to say whether the newer drivers will really help, but I hope they can.

I'm waiting on the drivers and their benchmarks before I put in my $.02 about this whole thing.
 
Well, here is one thing for sure: they can't use FP16 everywhere, so I have no idea how they are going to speed it up that much. Maybe by lowering image quality? I dunno. Either they use FP32, or they get no WHQL certification from Microsoft, since DX9 requires FP24 or higher.
 