Valve and NVidia?

ShaithEatery

Is it just me or has Valve given an indefinite "Screw YOU!!!" to NVidia? I say this because I read on Tom's Hardware that Valve wouldn't let NVidia do benchmarks with the beta Rev. 50 drivers, almost as if they want NVidia to look worse than they should (http://www6.tomshardware.com/graphic/20030912/index.html). Second, I saw the "ATI On Top at the Rock" video from Fileplanet.com and Gabe seemed to express his great interest in ATI. I wish NVidia could get some props or something. As the proud (yes, I said PROUD) owner of a GeForce FX 5900, I hope Valve can actually help NVidia get out of the rut they're currently in by helping out with the Rev. 50 drivers. There are millions of FX owners out there and they want great Half-Life 2 performance without all the ATI owners screaming FRAMERATES, NUMBERS, and BENCHMARKS in their faces.
 
Well how about you stfu and let a real discussion start. I don't want a bunch of ATI fans saying "NVIDIA SUCKS!!!" every other post, and I'm sure the mods and admins don't either.
 
LOL! And a few years back all that ATI users wanted was to be able to play a game without some Nvidiot posturing that their card was better because it was faster. Time changes things! :p

Seriously, the reason Valve didn't want Nvidia testing with the new driver set is that they already knew Nvidia was planning some "optimizations" for the HL2 engine. And we've seen where those lead.
 
Originally posted by ShaithEatery
Well how about you stfu and let a real discussion start. I don't want a bunch of ATI fans saying "NVIDIA SUCKS!!!" every other post, and I'm sure the mods and admins don't either.

Did I say nVidia sucks ?? did I flame you ?? so stfu you down syndrome n00b.
 
I agree... but I see it both ways. Both card companies are full of shit, and both pimp their stuff like the stuff from three months ago won't run anything. It's pathetic, and so many people buy into it too... makes me sick.
 
NVIDIA performance sucks on DX9, Rev50 has cheats in it that degrade image quality, so if they used Rev50 drivers, the game would have looked worse.

According to the general consensus in these forums, anyway. This is old stuff.:E
 
And Hallucinogen, you did call it "Nvidiot"...
so yeah, you did flame... subtle as it was... :dozey:
 
Try reading the article I posted before replying. NVidia's new drivers will have "legal" optimizations and Valve didn't want that for some odd reason...
 
Originally posted by ShaithEatery
Try reading the article I posted before replying. NVidia's new drivers will have "legal" optimizations and Valve didn't want that for some odd reason...

Did I say nVidia sucks ?? did I flame you ?? so stfu you down syndrome n00b.
 
Originally posted by Hallucinogen
I have ATI, so I couldn't care less about nVidiot.

That doesn't exactly have the beginnings of a productive topic. If you don't care, don't post.
 
Originally posted by ShaithEatery
That doesn't exactly have the beginnings of a productive topic. If you don't care, don't post.

That was my opinion. You don't like it? Don't read it, simple as that.
 
I'm ending this dumb argument now before the topic becomes a war of words. Back to the topic...
 
One of the reasons people won't benchmark with Nvidia's 51 drivers is that they DRAMATICALLY reduce image quality just for better performance,
which is lame, and is an unfair way of benchmarking a card.
 
Source of information? What's your proof that they reduce image quality, beyond what's already happened with their other drivers?

And even though I'm an Nvidia fan, I don't just follow them blindly. I want facts.
 
Washuu... your sig says "AMD 2600+ OC@11x400 2205mhz"... 11x400 = 4.4GHz, which is impossible. I think you meant 11x200 :)
 
Nvidia cheated with their drivers to make hardware that they knew wasn't up to snuff perform like it was. There is nothing the drivers can do to make up for the failings of the hardware design unless they cheat.

There is no doubt that Nvidia's GeForce FX cards can perform well in older games but they can't stand on their own legs when it comes to newer games.

The "legal" optimizations that you speak of are only "legal" in Nvidia's eyes (and maybe in Futuremark's too, ever since Nvidia rejoined the beta program and payed them a lot of money). Valve obviously didn't think that they were legal and chose to omit them from testing and that is their right as the creators of the game.
 
1) det. 50 drivers = beta
2) Image quality is reduced to gain performance on top of Valve's special "mixed mode", which already reduces precision.

The choice of image quality should be up to the developer, so they can display their game how they wish. With the Det. 50 drivers, Valve doesn't get that choice the way it should: the drivers reduce quality below what the creator intends, and the developer is no longer in control of their own game the way any developer should be. Nvidia chose a split path, 16-bit and 32-bit precision, to be prepared, but in fact it was a bad move: 16-bit is too low to maintain the quality and 32-bit is too high to maintain the performance.
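To put rough numbers on that trade-off, here is a small illustrative sketch (my own, not from Valve or Nvidia) comparing the relative precision of the three pixel shader formats in question, using their standard mantissa widths.

```cpp
#include <cmath>
#include <cstdio>

// Rough comparison of the pixel shader float formats discussed above.
// Mantissa widths: FP16 (s10e5) = 10 bits, FP24 (s16e7) = 16 bits,
// FP32 (s23e8) = 23 bits. Relative precision is roughly 2^-mantissa_bits.
int main() {
    struct Format { const char* name; int mantissa_bits; };
    const Format formats[] = {
        {"FP16 (NV3x partial precision)", 10},
        {"FP24 (DX9 minimum, ATI R3xx)",  16},
        {"FP32 (NV3x full precision)",    23},
    };
    for (const Format& f : formats) {
        // Smallest relative step the format can distinguish.
        double eps = std::ldexp(1.0, -f.mantissa_bits);
        std::printf("%-32s ~%.2e relative precision\n", f.name, eps);
    }
    return 0;
}
```

The gap is large: FP16 resolves only about one part in a thousand, which is where the banding and blockiness in long shader calculations comes from, while FP32 carries far more precision than FP24 and costs the NV3x hardware speed.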
 
Nvidia screwed up. They didn't make next-gen cards, they just tried to make faster DX8 cards. When it comes to shaders and DX9, Nvidia has nothing.

They have repeatedly tried to make up for this with dodgy drivers that ARE NOT REPRESENTATIVE of how the cards play games.

Valve is right. Don't buy Nvidia this generation. You should have bought them last generation, and perhaps they will regain it next generation, but DX9 belongs to ATI.

End of story.
 
HUZZAH FOR THE NEW DETONATORS USING LEGAL OPTIMISATIONS OVER 51.75!!

Unlike the Detonator 51.75, the Detonators 52.10 and 52.14 also apply the faked trilinear filter to texture stage 0 (which usually is the base texture). That's why both new Detonators are more "optimized" than the 51.75!

Finally, let's draw the conclusions from everything we found out in this article. First of all, let's have a look at the discovered "optimizations" of the Detonator 45.23 and 52.10/14, but only as they relate to the GeForceFX series.

The Detonator 45.23 shows exemplary filter quality for both OpenGL and Direct3D. However, an application-specific "optimization" has been found for Unreal Tournament 2003, which can be deactivated by using the Application mode.

The Detonators 52.10 and 52.14, on the other hand, show a lot of "optimizations" for Direct3D filtering, but seemingly neither an application-specific "optimization" nor an "optimization" for OpenGL. You could say that in Direct3D, generally all texture stages are filtered by this faked trilinear filter, regardless of the filter setting forced by the control panel or by using the Application mode. In addition to that, there is another "optimization" when using the Control Panel mode (not the Application mode), where texture stages 1 through 7 are filtered with at best a 2x anisotropic filter.

http://www.3dcenter.org/artikel/detonator_52.14/index_e.php
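For anyone unsure what "texture stages" and filter modes mean here: under Direct3D 9, a game requests trilinear or anisotropic filtering per texture stage roughly as in the sketch below (an illustrative example, not code from any game or driver mentioned). The "optimizations" described in the article amount to the driver silently substituting a cheaper filter than the one the application asked for.

```cpp
#include <d3d9.h>

// Illustrative only: how a D3D9 application requests filtering per texture
// stage. The driver is supposed to honor these states; the driver-side
// "optimizations" above deliver cheaper filtering behind the app's back.
void RequestTrilinearAniso(IDirect3DDevice9* device, DWORD stage, DWORD maxAniso)
{
    // Anisotropic minification, linear magnification.
    device->SetSamplerState(stage, D3DSAMP_MINFILTER, D3DTEXF_ANISOTROPIC);
    device->SetSamplerState(stage, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);

    // Linear filtering between mipmap levels = true trilinear; a "faked"
    // trilinear filter only blends in a narrow band around each mip change.
    device->SetSamplerState(stage, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);

    // Degree of anisotropy the app wants (e.g. 8x).
    device->SetSamplerState(stage, D3DSAMP_MAXANISOTROPY, maxAniso);
}
```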
 
I don't give a shit about brand; I want a good card. If Nvidia is better I buy it, if ATI is better then I buy it. I'm not a fanboy, I want the best of the best. Right now ATI is best, so I'm with them; if they continue being best then so be it. Also, I'm on ATI's side right now because Nvidia cheats on all those benchmark tests and stuff, so I won't buy an Nvidia card for a while, at least until they start coming clean.
 
If you read the entire Tom's Hardware review you would have found out that Valve tried using the Rev 50 beta drivers and found lower image quality and even an entire level where the fog was cut out.
 
Thank you, Crunkles. I was afraid I was going to have to go back and start copying & pasting.

Do a little in-depth research, Shaith, and you'll see that Valve gave, and is giving, their best effort to make HL2 work well with Nvidia. The problem is that the Nvidia hardware just doesn't compare to ATI's right now.
 
Originally posted by ShaithEatery
Is it just me or has Valve given an indefinite "Screw YOU!!!" to NVidia? I say this because I read on Tom's Hardware that Valve wouldn't let NVidia do benchmarks with the beta Rev. 50 drivers, almost as if they want NVidia to look worse than they should (http://www6.tomshardware.com/graphic/20030912/index.html). Second, I saw the "ATI On Top at the Rock" video from Fileplanet.com and Gabe seemed to express his great interest in ATI. I wish NVidia could get some props or something. As the proud (yes, I said PROUD) owner of a GeForce FX 5900, I hope Valve can actually help NVidia get out of the rut they're currently in by helping out with the Rev. 50 drivers. There are millions of FX owners out there and they want great Half-Life 2 performance without all the ATI owners screaming FRAMERATES, NUMBERS, and BENCHMARKS in their faces.

Valve did help Nvidia: they coded a specific codepath for Nvidia cards so that they could perform better than with the default DX9 path.

Also, the reason Valve didn't want the beta Rev. 50 drivers used is that 1) they aren't public drivers, and 2) they had some optimizations which did a few naughty things, one of which is removing the fog in-game, making the game perform better at the cost of detail the developers want in the game.

Basically the drivers would show better performance, but when Nvidia releases the Det. 50 drivers publicly, the quality-lowering optimizations might be taken out, which would drop the performance back down again. That's the reasoning behind Valve not wanting the Rev. 50 drivers used for benchmarking.
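To illustrate what a vendor-specific codepath can look like in practice, here is a hypothetical sketch (not Valve's actual Source engine code; the function and path names are made up) of an engine choosing its rendering path from the adapter's shader caps and PCI vendor ID:

```cpp
#include <d3d9.h>

// Hypothetical sketch of per-vendor path selection; names are invented.
enum RenderPath { PATH_DX8, PATH_DX9_MIXED, PATH_DX9_FULL };

RenderPath ChooseRenderPath(IDirect3D9* d3d, UINT adapter)
{
    D3DCAPS9 caps = {};
    d3d->GetDeviceCaps(adapter, D3DDEVTYPE_HAL, &caps);

    // Anything below pixel shader 2.0 falls back to a DX8-style path.
    if (caps.PixelShaderVersion < D3DPS_VERSION(2, 0))
        return PATH_DX8;

    D3DADAPTER_IDENTIFIER9 id = {};
    d3d->GetAdapterIdentifier(adapter, 0, &id);

    // 0x10DE is NVIDIA's PCI vendor ID: NV3x parts would get the
    // reduced-precision "mixed mode" path discussed in this thread.
    if (id.VendorId == 0x10DE)
        return PATH_DX9_MIXED;

    return PATH_DX9_FULL;
}
```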
 
I am an nVidia man myself. I am proud of that. But I could own an ATI and still have good graphics. It's all in personal interests, my friends.
Oh, Hallucinogen. Has anyone told you that you can be a jerk sometimes? And has anyone called you a forum troll before?
Anywho, back to the subject.
nVidia is great and I like the cards they produce. ATI is great too. But as I mentioned, I am an nVidia man. But the thought has crossed my mind about getting an ATI. I like to keep my interests open, though.
 
I'm a little confused here, so:
Is this HL2 fight between Nvidia and ATI just about their newer cards,
so that an Nvidia DX8 (or 8.1, whatever) owner like me (GeForce 4 Ti4200) won't suffer dramatically playing HL2 with it?
 
I have both an ATI 9800 Pro and an Nvidia FX 5800 Ultra. I don't need to pick, but when it comes to my monster (gaming rig) I'm using my 9800 Pro, simply because it runs better at higher resolutions (1280x1024) with full AA and anisotropic filtering, without framerate loss.
You could argue till you're blue in the face, but I own both cards and can therefore afford to be a tad biased.

Anywhoo, back on topic: the Nvidia cards (especially the 5800, and even more disappointingly the 5900) perform HORRENDOUSLY (as in, half the framerate they should) in DirectX 9.
Valve has nothing against Nvidia; thinking they do is simply naive. Valve wants to make money, and their customers have both kinds of cards (and others). They even went ahead and spent extra development time just so that Nvidia cards wouldn't take such a performance hit under DX9.

Valve hasn't gone and "optimized" the game for ATI (unlike some "way it's meant to be played" games supporting Nvidia); quite the opposite, in fact: they had to write new code paths just so it could run "decently".
Now... fanboyism and ignorance aside.
Facts are facts, and in the driver development business, a lot of "beta" drivers get tossed down the crapper and never see the light of day, which is the reason Valve won't test on "beta" drivers.
I mean, honestly, drop the facade and think about fairness.

1. Should Nvidia's cards need their own code path for DX9, when they're supposedly "fully functional DX9 compliant" cards?

2. Should ANY "optimizations" be allowed that sacrifice even a pixel of quality, when ATI's do not?

3. Valve's released benchmarks were taken with the public drivers (I believe it was the 3.7s at the time)... and Nvidia should get to use their "beta"?

The whole scenario is ridiculous.
Even more so when I was first a 3dfx fan, then an Nvidia fan, but with all this nonsense I've lost all faith in the company.
I am shamelessly in love with my 9800 Pro.
 
Originally posted by Hallucinogen
Did I say nVidia sucks ?? did I flame you ?? so stfu you down syndrome n00b.

stop the spam please...

Dunno why they didn't accept the new Nvidia drivers... maybe they are just a little mad at them because of the older drivers? ;(
I have a GF4 Ti 4200 128 MB and it works great... dunno about the new FX thingies :)
 
Originally posted by ShaithEatery
Is it just me or has Valve given an indefinite "Screw YOU!!!" to NVidia? I say this because I read on Tom's Hardware that Valve wouldn't let NVidia do benchmarks with the beta Rev. 50 drivers, almost as if they want NVidia to look worse than they should (http://www6.tomshardware.com/graphic/20030912/index.html). Second, I saw the "ATI On Top at the Rock" video from Fileplanet.com and Gabe seemed to express his great interest in ATI. I wish NVidia could get some props or something. As the proud (yes, I said PROUD) owner of a GeForce FX 5900, I hope Valve can actually help NVidia get out of the rut they're currently in by helping out with the Rev. 50 drivers. There are millions of FX owners out there and they want great Half-Life 2 performance without all the ATI owners screaming FRAMERATES, NUMBERS, and BENCHMARKS in their faces.

The FX series isn't that great for DX9 titles. I'm not trying to flame Nvidia or anyone, but that's the truth. If you look at the benchmarks and at how Nvidia cards are built, you can understand why Gabe would choose ATI. Also, I can imagine why they did not allow those drivers.
 
Originally posted by Mr. Redundant


1. Should Nvidia's cards need their own code path for DX9, when they're supposedly "fully functional DX9 compliant" cards?

They shouldn't; who knows what they are hiding. DX9 requires FP24 and up, and Nvidia cannot use FP16 fully for DX9, so they must use FP32, and this is why it's very slow in DX9 games (I think).
 
For all those wondering about the GF4 Ti, that isn't an issue here. All "optimisations" done by Nvidia only apply to the FX series. The questionable, the less questionable, the straight-out deceit, everything. ONLY the FX series. They aren't in effect for the GF4 and lower series. You actually get much higher quality images with a GF4 than with an FX super-duper-uber-cool 5900 Ultra.
 