All Nvidia Users Read

link84

First (before I explain how this is relevant to the "video card" thread), guess which graphics card this was rendered on? It's not my pic (I'm just hosting it at the moment). Do pay attention to the FPS counter in the lower right.


http://xanderf.dyndns.org:8080/images/pc/compare/guess_who.jpg

Oh, it's pretty sad really.

Basically, some guys on Guru3d figured out what Valve did to cripple nVidia cards.

First off, you need 3dAnalyze. I'm assuming everyone knows that you can force HL2 to run in DX9 mode on FX cards, right? Only, you get artifacts in the water and other areas?

Well, that's pretty easy to fix. Just have the 3dAnalyze util report your card as an ATI Radeon instead of a GeForce FX.

*taddah* All the artifacts go away, and you get true DX9 reflections!

Okay, but there IS a performance hit doing that. How to get around that?

Well, the funny thing is that Valve coded Half-Life 2 to demand FP24 shader precision all the time, every time. And it's really not needed. Nope. In fact, FP16 seems to do the trick every time, as seen in the pic above. FP16 and FP24 are indistinguishable in Half-Life 2.

Again, using 3dAnalyze you can test this. It is capable of forcing a card to use only FP16 shaders no matter what is requested. You'll see no image quality difference doing that, just a HUGE performance boost. Why? Well, because while FP16 is all that Half-Life 2 *needs*, if they let the GeForce FX cards run at FP16, they might have been competitive! So instead, they forced FP24 (unneeded), and since the FX series has no FP24 mode, that pushes the GF-FX cards to render the DX9 mode in FP32 all the time. With the obvious associated performance hit.
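If you're curious how big the precision gap actually is, it comes down to mantissa width: FP16 carries 10 mantissa bits, ATI's FP24 carries 16, and FP32 carries 23. Here's a quick C sketch (my own, purely for illustration) that prints the step size just above 1.0 for each format:

Code:
#include <stdio.h>

/* Step size (ULP) just above 1.0 for a float format
   with the given number of mantissa bits: 2^-bits. */
static double ulp_at_one(int mantissa_bits)
{
    double ulp = 1.0;
    for (int i = 0; i < mantissa_bits; i++)
        ulp /= 2.0;
    return ulp;
}

int main(void)
{
    printf("FP16 (10 mantissa bits): %.9f\n", ulp_at_one(10)); /* ~0.00098    */
    printf("FP24 (16 mantissa bits): %.9f\n", ulp_at_one(16)); /* ~0.000015   */
    printf("FP32 (23 mantissa bits): %.9f\n", ulp_at_one(23)); /* ~0.00000012 */
    return 0;
}

Note that even the FP16 step (~0.001) is finer than one 8-bit color step (1/255 ≈ 0.004), which is presumably why simple color math looks identical at FP16.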

Try it yourself. The link to the article is here. Download 3dAnalyze, and follow these instructions:
Quote:
Originally Posted by Presi
Open it and follow the numbers:
1. select the HL2.exe file in the Half-Life 2 folder
2. select any file inside the half-life 2\bin folder
3. select Steam.exe
then check these options:
- Under the section Pixel and Vertex Shader: FORCE LOW PRECISION PIXEL SHADER
- Under the section Remove stuttering: PERFORMANCE MODE
- on the bottom left: FORCE HOOK.DLL

If you haven't changed the dxsupport.cfg file with the method described at the beginning of this thread, you can get the same result by typing the ATI Vendor and Device IDs into the DIRECTX DEVICE ID'S section; there are just the two ID fields there, though.
....
In the end 3D Analyze gives me an error, CREATEPROCESS FAILED, but I launch HL2 anyway. The water looked awesome, awesome detail, and I noticed a boost in performance too, around 20-30%, which let me play the WATER HAZARD level with these settings: 1024x768, everything max, water reflection set to ALL, 2xAA, 4x anisotropic, with framerates ranging from 40 to over 150.

Amazing, huh?


http://hardforum.com/showthread.php?t=838630
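By the way, if you go the Device ID route, these are the values I'd try. The vendor ID is ATI's standard PCI ID; the device ID below should be a Radeon 9800-series part (and I'm not certain whether 3dAnalyze wants decimal or hex, so try both):

Code:
Vendor ID: 0x1002 (4098 decimal)  - ATI
Device ID: 0x4E48 (20040 decimal) - Radeon 9800 series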
 
OK, thanks for the info...

I've got an Nvidia card, but I'm not very into computers at that level. Could you sum up basically what's important?
 
demon_fall said:
OK, thanks for the info...

I've got an Nvidia card, but I'm not very into computers at that level. Could you sum up basically what's important?
Just click on the link and read.
 
:O

If this is true (I can't try it myself because I've got an ATI card), then shame on Valve. I would like to be naive enough to think that Valve just sort of overlooked the issue, but that would just be... well... naive.

If I were an FX user, I would be pissed, yet glad that somebody discovered this. Valve will have no choice but to release a patch after this spreads a little bit...
 
I tested it on a GeForce FX 5200 and it almost doubled my framerate: from 5-6 fps @ DX9 everything high to 10-13 @ DX9 everything high.

...
 
jacen said:
I tested it on a GeForce FX 5200 and it almost doubled my framerate: from 5-6 fps @ DX9 everything high to 10-13 @ DX9 everything high.

...

That's, uhh, big news, congrats :) I have a 5900 and I get like 45 fps; I wonder what I'll get with this stuff. Also, does it work for CS:S?
 
Very interesting, I will do this on my GeForce - which does, however, perform pretty well with HL2 as it is.

It's a shame really that game developers have those video cards they prefer, like Valve and ATI or id and nVidia. Alright, I understand that they have deals together, like all that ATI coupon stuff, etc., but it would be much fairer to customers if the cards performed equally.
 
jacen said:
ahahha
if you hadn't bitched at me, I could tell you
smartass
I guess that shows me, since I already found out the answer to my question
dumbass
 
Solver said:
Very interesting, I will do this on my GeForce - which does, however, perform pretty well with HL2 as it is.

It's a shame really that game developers have those video cards they prefer, like Valve and ATI or id and nVidia. Alright, I understand that they have deals together, like all that ATI coupon stuff, etc., but it would be much fairer to customers if the cards performed equally.

My guess is that ATI gave Valve plenty of incentive to endorse their cards. I'm not saying they intentionally sabotaged the FX line of cards, but it doesn't look like they did a very good job of optimizing for them.
 
I don't know why, but I get very, very good framerates with an old PCI-only 5200 FX, yet almost everybody here says it's a bad card.
 
It might not be 'direct' sabotage.

nVidia has custom DirectX paths and whatnot, so it could simply be a mistake.

The game was optimised for ATI, and I'm sure nVidia and ATI are quite different .. so problems are sure to arise.
 
So can't Valve release a patch or something to force FP16, since it seems kinda stupid if it has to use FP24 all the time? That explains all the stupid lag for no reason.
 
wayne white said:
I don't know why, but I get very, very good framerates with an old PCI-only 5200 FX, yet almost everybody here says it's a bad card.
I too have a PCI 5200 FX, and it plays amazingly well on my PC, even with all settings on high. But I also have 768MB of RAM, with 5GB reserved for the page file.
 
Oh noes! A great conspiracy is uncovered once again by the great Mr. Tinfoil Hat.

Get a clue, people. How is it Valve's fault that the FX series of cards just plain sucks? If Valve were really trying to undermine nVidia, wouldn't it make sense to make the current GeForce series run worse than ATI's latest offerings too? But that's clearly not the case. So blame your own stupid ass for buying a card without checking the benchmarks.
 
settle down, BloodyMario.

these are the FX series .. HL2 and its benchmarks weren't exactly around back then ..
 
BloodyMario said:
Oh noes! A great conspiracy is uncovered once again by the great Mr. Tinfoil Hat.

Get a clue, people. How is it Valve's fault that the FX series of cards just plain sucks? If Valve were really trying to undermine nVidia, wouldn't it make sense to make the current GeForce series run worse than ATI's latest offerings too? But that's clearly not the case. So blame your own stupid ass for buying a card without checking the benchmarks.

I guess a thread like this is bound to bring out the fanboys.

I think it's a bit shocking that Valve didn't even bother to do any basic optimisation for nVidia cards, but rather just suggested that they use DirectX 8 mode instead.

I'm getting a bit sick of all the big-name games allying themselves with either ATi or nVidia... you end up being able to play half the new games in full quality, and having the other half crippled somehow.
 
I wonder, by the way, how legal is this? Supposing that HL2 does indeed (and it apparently does) have tricks like this that seem to undermine performance on many, if not all, nVidia cards, isn't that illegal?
 
Valve Sucks

Wow, how low. And just when I thought Valve was a respectable company, too...

link84 said:
everyone knows that you can force HL2 to run in DX9 mode on FX cards, right? Only, you get artifacts in the water and other areas?

Well, that's pretty easy to fix. Just have the 3dAnalyze util report your card as an ATI Radeon instead of a GeForce FX.

*taddah* All the artifacts go away, and you get true DX9 reflections!
That pretty much proves it. Valve has deliberately conned nVidia users. That just goes to show companies these days don't care about the customer anymore; they just want to get as much money as possible. I wonder how much ATI paid off Valve to implement this "special feature"?
 
I WANT TO CLEAR SOMETHING UP

The author states that 24-bit precision is not necessary.

This is not true. The DX9 standard dictates a minimum requirement of 24-bit precision. Therefore, for a correct DX9 implementation, you need 24-bit precision or better.

When nVidia made the FX series, they decided to sell users on the "check box" of 32-bit precision and leave out 24-bit. This has been a trend in the industry for a long time... introduce new features that are underpowered, then the next gen offers a part powerful enough to use the new features. nVidia got caught with their pants down when ATi released the R3xx series, because the R3xx series had all the DX9 specs and did them well.

So this has nothing to do with Valve crippling FX boards. It has to do with the fact that the FX series has no 24-bit precision and is SLOW in 32-bit. The FX series is not a great performer in the true DX9 protocol.

Anyhow, Valve went out of their way to make FX boards look good and perform well. And as for any artifacts, I would leave that up to veteran reviewers to look at. It is pretty easy to miss artifacts, especially when you are not comparing 2 cards side by side at full resolution.
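To put a rough number on why the spec asks for 24-bit (this is my own back-of-envelope math, nothing official): with m mantissa bits, texture coordinates in the 1.0-2.0 range advance in steps of 2^-m, so that is roughly how many texels you can address before dependent texture math starts skipping. A quick C sketch:

Code:
#include <stdio.h>

int main(void)
{
    /* Texels addressable before the texcoord step exceeds one texel:
       with m mantissa bits, coords in [1.0, 2.0) step by 2^-m. */
    printf("FP16 (10 mantissa bits): %d texels\n", 1 << 10); /* 1024    */
    printf("FP24 (16 mantissa bits): %d texels\n", 1 << 16); /* 65536   */
    printf("FP32 (23 mantissa bits): %d texels\n", 1 << 23); /* 8388608 */
    return 0;
}

So FP16 is already marginal for addressing a 1024-texel render target, which is exactly the kind of dependent lookup water reflection does; FP24 has headroom to spare. That may be why the artifacts people report show up in the water first.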

If you would like to know more about the FX problems click here.

Ps- I own a 6800GT right now, but I did own a 9700. ATi and nVidia both have made good and bad products. This generation they both made good products.
 
NachoMan said:
I guess a thread like this is bound to bring out the fanboys.

I think it's a bit shocking that Valve didn't even bother to do any basic optimisation for nVidia cards, but rather just suggested that they use DirectX 8 mode instead.

The only fanboy I see is the one saying, "it's a bit shocking that Valve didn't even bother to do any basic optimisation for nVidia cards". The fact is, they spent more time optimizing for nVidia cards than for ATi cards.

The fact is nVidia cards are slow at 32-bit precision and do not support 24-bit. That makes it hard to do a lot of DX9 effects. Without BENCHMARKS and side-by-side, full-size comparisons of the tweak versus standard DX9, we cannot be sure what performance or artifacting issues there are. A scaled-down screenshot may be OK, but how is it in motion at full size next to a true DX9 setup?
 
Is this issue really only for FX cards or for all Nvidia cards??

(I've got a GeForce 6600GT)
 
I've got a GeForce FX 5700 Ultra, and I'll be getting Half-Life 2 for Xmas.
Is that good enough to run HL2?
 
They wouldn't have sabotaged them - they were supporting a higher standard that will see more use in the future.

Some people are such damn losers. 8o|
 
How do you get Half-Life 2 to run in DX9 mode with FP24 all the time, please? Because I'm trying to do this now.
 
If I do this, is there a way to restore the game to defaults if this does not work?
 
Acert93 said:
I WANT TO CLEAR SOMETHING UP

The author states that 24-bit precision is not necessary.

This is not true. The DX9 standard dictates a minimum requirement of 24-bit precision. Therefore, for a correct DX9 implementation, you need 24-bit precision or better.

When nVidia made the FX series, they decided to sell users on the "check box" of 32-bit precision and leave out 24-bit. This has been a trend in the industry for a long time... introduce new features that are underpowered, then the next gen offers a part powerful enough to use the new features. nVidia got caught with their pants down when ATi released the R3xx series, because the R3xx series had all the DX9 specs and did them well.

So this has nothing to do with Valve crippling FX boards. It has to do with the fact that the FX series has no 24-bit precision and is SLOW in 32-bit. The FX series is not a great performer in the true DX9 protocol.

Anyhow, Valve went out of their way to make FX boards look good and perform well. And as for any artifacts, I would leave that up to veteran reviewers to look at. It is pretty easy to miss artifacts, especially when you are not comparing 2 cards side by side at full resolution.

If you would like to know more about the FX problems click here.

Ps- I own a 6800GT right now, but I did own a 9700. ATi and nVidia both have made good and bad products. This generation they both made good products.


Excellent points! :)
 
DrkBlueXG said:
I too have a PCI 5200 FX, and it plays amazingly well on my PC, even with all settings on high. But I also have 768MB of RAM, with 5GB reserved for the page file.

Do you really need 5GB reserved for the pagefile? I don't see why you would need that much unless you are running other apps that require it, and it definitely won't improve your performance to have such a large one.

Having an adequate pagefile is necessary. Yours is just bloated.
 
Excellent points indeed.

But if visual quality does not seem to be sacrificed by lowering the precision to FP16 in DX9 mode, why don't they just do it? I guess because if people then complained about visual quality, they would get the blame... :|
 
Well, as the owner of an FX 5900 Ultra 256MB DDR, I'm quite pleased.

I haven't got HL2 yet (Xmas :-/ )

I reckon I'd be able to run the game with all settings high: AMD Athlon XP 1900+, 1GB of RAM, and of course my gfx card sux. I'd accepted that I'd have to run in DX8.1, because 8.1 isn't really too bad, but if I can run in DX9 with all the bells and whistles on without a significant framerate drop (compared to DX8.1), then I guess I'll be playing it that way come Xmas.
 
Hitman89 said:
How do you get Half-Life 2 to run in DX9 mode with FP24 all the time, please? Because I'm trying to do this now.

Okay, I'm trying to do this now also, Hitman... I looked it up, and to get HL2 running in DX9 mode your HL2 shortcut target should look like this:

Code:
"C:\Program Files\Valve\Steam\Steam.exe" -dxlevel 90 -applaunch 220

I ran the game and there was no water.

Also, I cannot get 3DAnalyze working. It seems to tell me off and say "CreateProcess Failed", pfft.

If anyone has this working, please let us know :-)
 
NachoMan said:
I'm getting a bit sick of all the big-name games allying themselves with either ATi or nVidia... you end up being able to play half the new games in full quality, and having the other half crippled somehow.
The strangest thing I've seen:

Valve promotes ATI as the card for Half-Life 2.

Troika promotes nVidia as the card for Vampire: The Masquerade - Bloodlines.

Both games use the exact same graphics engine.

Please explain to me how this makes sense.
 
lol, Mountain Man, that's a fairly good point.

However, I haven't researched Vampire, so I don't know if it takes advantage of all the DX9 features (since it's Source it could, but it doesn't have to). Source scales all the way down to DX7, so they may not need DX9, who knows.
 
HL2 runs great on the 9600 (ATI).. conversely, the equivalent graphics card that nVidia produced at the time sucked arse when it came to shaders (DirectX 9).

Modern graphics cards may not make that much of a difference, but for the older ones, ATI's perform better. I think Valve is promoting ATI because not everyone has enough money to go out and buy the high-end cards, where the stats are more closely matched.
 
kenyo said:
I've got a GeForce FX 5700 Ultra, and I'll be getting Half-Life 2 for Xmas.
Is that good enough to run HL2?

I have the same card and it runs perfectly fine. To tell you the truth, I'm not gonna go to all the trouble of "optimizing" my card for some unnecessary water effects... who cares? Playing on my video card gave me the graphics I was looking for; I didn't even notice the water reflection. I had everything maxed out and was still getting 40-180 fps. Not bad for a "shitty" card, eh?
 
Acert93 said:
The only fanboy I see is the one saying, "it's a bit shocking that Valve didn't even bother to do any basic optimisation for nVidia cards". The fact is, they spent more time optimizing for nVidia cards than for ATi cards.

Sorry, how does that suggest that I'm a fanboy, exactly?

I guess I was jumping the gun a little in assuming that Valve hadn't optimised for nVidia, but what with all the reports coming in of GeForce cards running slower and with poorer image quality, AND the fact that Valve endorsed ATi, it's hard not to jump to conclusions.


Mountain Man said:
The strangest thing I've seen:

Valve promotes ATI as the card for Half-Life 2.

Troika promotes nVidia as the card for Vampire: The Masquerade - Bloodlines.

Both games use the exact same graphics engine.

Please explain to me how this makes sense.

That is strange indeed, unless Troika have removed all the ATi optimisations and replaced them with nVidia ones (a pointless exercise, and pretty damn unlikely ;) )
 
Valve did have a Mixed Mode, and they showed benchmarks of it last year, in case you don't remember. They dropped it and went for the straight DX8.1 mode. Performance was still very poor with Mixed Mode..

Don't jump to conclusions just yet. Wait for a review site to actually look further into how it renders. :rolleyes:

It's about keeping the bar up for image quality, and they went with the full DX9 spec. Sure, you can always lower quality and have it perform better, but Valve wants to choose where to hold the quality.
 