Optimal settings for nVidia 6800GT


mabuhr

Guest
I'm generally happy with the performance of the card on my machine (a hyperthreaded P4 2.8GHz with 1GB SDRAM), but was wondering what I can do to wring a bit more out of HL2 on my card.
 
I have a P4 at 2.85GHz (a 2.53GHz OCed) with 1GB DDR400 and a 6800GT. I run at:

1280x1024 4xAA 8xAF.

I would run at 1600x1200, but drops below 60fps annoy me. 1600x1200 looks great, but it often hovers around 50 to 60fps, while at 1280x1024 you stay more in the 60-70 range.
 
Set image settings to high performance, antialiasing to 4x, and anisotropic filtering to 16x.

...or not.
 
Acert93 said:
I have a P4 at 2.85GHz (a 2.53GHz OCed) with 1GB DDR400 and a 6800GT. I run at:

1280x1024 4xAA 8xAF.

I would run at 1600x1200, but drops below 60fps annoy me. 1600x1200 looks great, but it often hovers around 50 to 60fps, while at 1280x1024 you stay more in the 60-70 range.

Hmmm......you must have cybernetic implants, because I can't see more than 30 FPS.
 
The eye can only see 30fps, but it can detect changes between 30 and 60. Some of us are more bothered by jumps up and down between 30 and 60 than others. That is why most review sites consider 60fps ideal, not 30fps. 60fps is much smoother, at least for me :)
 
Yeah, if you play at 15 and it goes to 30 you're like "WOW!!", but when it goes to 60 and then back down to 30 you can kinda tell.
 
By the way....

I'm viewing HL2 on a 27" HD LCD display, which obviously affects the display resolution.
 
Acert93 said:
The eye can only see 30fps, but it can detect changes between 30 and 60. Some of us are more bothered by jumps up and down between 30 and 60 than others. That is why most review sites consider 60fps ideal, not 30fps. 60fps is much smoother, at least for me :)

So then, how are you detecting the "jumps"?
 
That "the eye can only notice up to 30fps" thing is a myth, it's based on people watching movie film frames which have motion blur - each frame contains all the action from a period of time.

game frames have no motion blur - each frame is an instantaneous snapshot. with game frames, people can tell the difference right up to about 80fps.

try it.
 
Zaphod_Saves said:
So then, how are you detecting the "jumps"?

See = distinguish frames; detect = notice difference in fluidity.

When a game goes slower than 30fps you can actually see the individual frames. Over 30fps you cannot see the individual frames but you can detect smoothness and transition. When it bobs up and down it bothers some people. I know a lot of people who would prefer a game locked in at 30fps rather than jumping from 30 to 60.

Personally I would rather have a solid lock than ups and downs. Oh well, some of us have better eyesight. I know people who prefer 80fps because it allows for some minor drops without any serious notice. Everyone is different... anyhow, you can tell the difference. Maybe not you, but a lot of people can, and thus the preferred standard is 60fps and the minimum 30fps. Reviewers on hardware forums do not even accept sub-30fps as playable.
 
fragShader said:
That "the eye can only notice up to 30fps" thing is a myth, it's based on people watching movie film frames which have motion blur - each frame contains all the action from a period of time.

game frames have no motion blur - each frame is an instantaneous snapshot. with game frames, people can tell the difference right up to about 80fps.

try it.

I was going to mention that. Movies are at 24fps, but they can get away with that because any movement not caught between frames is blurred. I cannot STAND action scenes in movies where the camera pans quickly--looks like GARBAGE.

Motion blur was a technology 3Dfx was working on... I think--IF USED WISELY--this could be a nice feature in games, but from the FarCry 1.3 engine demos it looks like garbage IMO. Same with HDR lighting. These are great features, but they need to be designed well.
 
OC the CPU a little bit extra :) Or get a CPU that requires a second mortgage :p For a 6800 GT you need a 3.2GHz P4 (800 FSB) or a 2.2GHz AMD64 minimum for max benefit.
 
My GeForce 6800 GT runs the game at 1600x1200 with everything set to high, reflections set to full, 6xAA, 16xAF, and vsync on. :borg:
 
How the hell do you run on such high settings at 60fps... and you're complaining about 30fps... wow...

I have a similar system... Athlon XP 3200+ and a GeForce FX 5700LE 256MB, and it runs at like 15-30 if I put everything on high, especially with 4x AA and 16x AF.
 
What's the console command for counting frames per second? I downloaded some kind of (unofficial???) HL2 benchmark off of Guru3D but it seems kinda sucky. I'd like to have a counter for my fps in the corner sometimes.
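A quick aside on that question, from memory of the Source console rather than anything in this thread: HL2 has a built-in counter and a timedemo mode, so a third-party benchmark shouldn't be needed. Roughly:

    // typed in the developer console (enable it under Options > Keyboard > Advanced)
    cl_showfps 1      // simple fps counter in the corner
    net_graph 3       // fps plus latency/choke overlay
    timedemo mydemo   // replays a recorded demo ("mydemo" is a placeholder name) and reports average fps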
 
I don't believe these people claiming to run at 1600x1200 with FSAA and AF. Nvidia's FSAA implementation has always been a dog, and nothing has changed with the 6800 series.

I do run at 1600x1200 on my 6800 Ultra here, but with no FSAA or AF. I make sure the BIOS AGP aperture is at 256MB (to match the card), FastWrites are off, and vsync is on. I also set the audio quality slider to Medium (everything else on highest settings), all of which helps the stuttering issue, which no longer occurs. In timedemo runs using the anand/hardocp demos, I get in the 65fps range with vsync on, and 72fps with it off.

rms
 
Same here. My comp is no insane geek-machine or some crazy "rig" as you want to call them, but it's enough. And unless you paid more than $5000 for your computer, I truly doubt you can get such high fps.


or maybe it's just my videocard
you tell me... is geforce fx 5700le 256 good?
 
I run HL2 single player at 1280x960 with 4xAA/8xAF and Vsync on. It's very playable although it can dip a bit.
I tried it at 1600x1200 and it dipped too many times.

I play HL2DM and CSS at 1024x768 with 2xAA/8xAF because there is a bit of a response difference which I noticed in CS:S. It has to do with the mouse settings, which I never changed. ;)
 
Asus, same here. I can go 1600x1200 with AA and AF, but the GT only averages in the 50s depending on the map. On some maps 16x12 runs GREAT; on others you are lucky to get low 50s. The Ultra does a bit better though (~15%); I may OC mine again to do 16x12, but I am happy with 1280x960. So for the guy with the Ultra, give it a try. Remember HL2 is often CPU limited, so if you are getting low FPS at times it may be CPU related to begin with. Always worth a try.

Btw, the problem with average FPS is that it does not show the dips. An average is only relevant if you also know the minimum fps.

I would rather have a card/settings with a 50fps average with a low of 40 than a 70fps with a low of 20.
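To put rough numbers on that (a sketch of my own, not from this thread; the frame times and the fps_stats helper below are invented purely for illustration, chosen to roughly match the 50-avg/40-min versus 70-avg/20-min comparison above):

    # Why average fps alone hides the dips: two invented runs of frame times in milliseconds.
    steady = [20, 19, 20, 20, 25, 20, 19, 19, 19, 19]  # about 50 fps average, never below 40
    spiky = [10, 11, 10, 10, 50, 10, 11, 10, 10, 11]   # about 70 fps average, but one 20 fps hitch

    def fps_stats(frame_times_ms):
        avg_fps = len(frame_times_ms) * 1000.0 / sum(frame_times_ms)
        min_fps = 1000.0 / max(frame_times_ms)  # the slowest frame sets the worst-case fps
        return avg_fps, min_fps

    for name, run in (("steady", steady), ("spiky", spiky)):
        avg_fps, min_fps = fps_stats(run)
        print(f"{name}: average {avg_fps:.0f} fps, minimum {min_fps:.0f} fps")

The "spiky" run wins on average but is the one you would actually feel stutter in, which is why a minimum quoted alongside the average tells you more.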
 
The NV40 core is a very powerful one. ATI and Nvidia's performances in HL2 are both very good.
 
I play with an Athlon 64 3400+, 512MB of RAM, and an Nvidia FX5500 256MB, and for some reason in CS: Source, if I run it without full AA I get around 60-80fps, but with 6x AA I get 80-100... It's rather... weird... Shouldn't turning up your graphics settings make it run... erm... slower?
 
Btw, everything else is now on high and I get another 5fps, apart from filtering, which is set to trilinear (sp?). It's kinda, erm... weird.
 
Your 6800 will not run 6x AA unless you have modded the drivers yourself. It's probably running better because no AA is actually being applied.
 
When you people say 60fps, do you mean a high of 60, or that it never drops below 60?
 
Parabolart said:
Same here. My comp is no insane geek-machine or some crazy "rig" as you want to call them, but it's enough. And unless you paid more than $5000 for your computer, I truly doubt you can get such high fps.


or maybe it's just my videocard
you tell me... is geforce fx 5700le 256 good?

Sorry, no! That card is about half as good as a 9600XT, which is half as good as a 9800 Pro, which is about half as good as a 6800GT.
 