Overclocked 8800gt

So I overclocked my 8800gt from its default core 600MHz / shader 1500MHz / memory 1800MHz to 675/1687/1950MHz. Basically 8800gt KO speeds. Fan speed is at 80%. It's currently at 47°C, at idle of course. I'm going to test Crysis soon.

So my question is: do you guys think it's going to be stable, or should I lower the clock speeds? And do I test whether it's stable just by running games, or do I use some program?
 
Usually you up it slowly and test as you go. But yeah, just run some 3D graphics (games), and if you see artifacts or it crashes then it's too high. You could use ATITool to OC and test on the spot (it works with Nvidia cards).

Artifacts would be white dots that look like snow, or harsh screen tearing (depending on whether the memory or the core is too high).
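
Roughly, the "raise it a bit, test, repeat" routine is just a loop. Here's a quick sketch of the idea in Python; set_core_clock() and artifacts_detected() are hypothetical stand-ins for whatever your tool (ATITool, RivaTuner) actually does, nothing here touches real hardware:

Code:
# Sketch of the usual "raise it a bit, soak test, repeat" routine.
CORE_STOCK = 600   # MHz, 8800gt stock core
CORE_CAP = 700     # stop here even if nothing ever goes wrong
STEP = 15          # MHz per bump; smaller steps find the limit more precisely

def set_core_clock(mhz):
    print(f"core -> {mhz} MHz")   # placeholder: apply it in your OC tool

def artifacts_detected(minutes=10):
    return False                  # placeholder: run the artifact scanner or a game

clock = CORE_STOCK
while clock + STEP <= CORE_CAP:
    clock += STEP
    set_core_clock(clock)
    if artifacts_detected():
        clock -= STEP             # drop back to the last clean clock
        set_core_clock(clock)
        break

print(f"Settled at {clock} MHz")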
 
I heard about ATITool, but I'm on Vista and it doesn't work on Vista, I tested it. I played Crysis for a while. The overclock didn't make much difference and the temperature is quite high, 66-68°C. No artifacts. Fan speed is now at 85%. I'm going to test some other game because Crysis is not really affected for some reason. I got a 5 fps increase on Paradise Lost (the snow level); I did expect more. Well, I *think* this overclock is stable, because I've heard about some (crazy :P) people taking the core clock up to 730-750MHz, so that means I might be around 50-75MHz away from danger, which is nice.

http://www.techpowerup.com/atitool said:
ATITool will only work on Windows 2000/XP/2003 (64 bit versions are supported).

Edit: Well I should have tried compatibility mode...
 
I got an ATITool version that works with Vista and I'm checking for artifacts now. No artifacts for 11 minutes so far.

My question is now about the temperature. It's really crazy! The max so far is 75°C; is that too much?
 
What's the airflow in your case like?
 
Nah. Some GPU cores' warning temps are around 115°C. I would think the 90s would be too hot for constant use, though (guessing). You often see 80s from stock or factory-OCed GPUs. In the Nvidia driver panel, is there a place where it lists the temperature it will shut down at?
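
If you want to keep an eye on it outside the driver panel, something like this works on setups where nvidia-smi can report the temperature (that wasn't really an option back in the XP/Vista days, so treat it as a sketch of the idea; RivaTuner's hardware monitoring graph is the practical way to watch it on an 8800gt). The 90°C warning level is my guess, not a vendor number:

Code:
import subprocess
import time

WARN_AT = 90  # degrees C; just a guess at "too hot for constant use"

while True:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=temperature.gpu", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    temp = int(out.stdout.strip())
    print(f"GPU core: {temp} C")
    if temp >= WARN_AT:
        print("Getting hot - back the clocks off or raise the fan speed")
    time.sleep(5)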
 
You know, I think I'm just going to open the case and leave it open, absolutely better than this :O. No artifacts for 23 mins. For how long should I test this?

Nah. Some GPU cores' warning temps are around 115°C. I would think the 90s would be too hot for constant use, though (guessing). You often see 80s from stock or factory-OCed GPUs. In the Nvidia driver panel, is there a place where it lists the temperature it will shut down at?

Really? Oh my god! Fan speed at 90% btw. 75°C is the max I got. It's between 73-75 now. 32 minutes, no artifacts. I'm going to look for the place you mentioned in the Nvidia control panel.
 
Did you find a good guide on OCing the 8800?
 
I just checked the 8800gt KO clock speeds and did it. Not hard, and no artifacts for 48 mins; 75°C really is the max temperature.

8800gt KO core/shader/memory = 675/1674/1950
My 8800gt core/shader/memory = 675/1687/1950
 
I thought about that too before doing it. It's closed :P
 
If you can get a top of the line cooling system that would help. But then again, you could just buy another card and put it in SLI. :P
 
Comparing to the KO? Is it an EVGA then?
They have the old coolers on their 8800gt cards, so that's pretty much an expected temperature.
 
Well, no artifacts, but the screen did start shaking a bit for no reason a while ago. It's stopped now. Should I lower it? 1h 50min with no artifacts according to ATITool!
 
Why overclock an 8800gt? *sigh* I don't understand the logic of an overclocker. Just buy a better card when they come out. BTW: having the case cover open is a bad idea too. You need more (or better placed) case fans.
 
I bought an 8800gt at stock speed, which means I have the option to overclock it to the speed of an 8800gt SC or an 8800gt KO. Both are more expensive, and the only thing that's different is the clock speed. So by overclocking I just saved money. 2 hours and 10 minutes without artifacts. Only the screen shook for like 5 seconds earlier. That scared me and I don't know the reason yet.
 
I plan to overclock my graphics card and cpu once they start to feel old. That way I won't spend money on upgrading as soon :)
 
Lowered it to 670/1675/975. Just to be safer because I'm easily scared by these things. Stopped scanning for artifacts at 2h 16min.
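
Side note: I think the 975 is just the tool showing the real (non-doubled) GDDR3 clock instead of the effective one, so the memory should still be at the same 1950MHz effective as before. If I've got that right:

Code:
reported_mhz = 975                # what the tool shows (assuming it's the real clock)
effective_mhz = reported_mhz * 2  # GDDR3 transfers data twice per clock
print(effective_mhz)              # 1950, same effective speed as before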
 
So your 8800gt is ninja? I would love that! D:
3DMark06 score jumped from 11800 to 12500; that was not much of a jump, was it? I mean, in games I don't see much of a difference... NFS ProStreet jumped from 40fps to 46fps. Should I just take the overclock off and leave it at its stock 600/1500/1800? Really, I was expecting a little more, you know? Overclocking didn't make that much of a difference, and for my standards it was quite heavy because I'm a coward. It won't hurt to lose like... 6 frames...
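
Putting rough numbers on it (just arithmetic on the figures above; the reading in the comments is my guess, not gospel):

Code:
core_gain = 675 / 600 - 1        # +12.5% core clock
mark_gain = 12500 / 11800 - 1    # +5.9% 3DMark06
fps_gain  = 46 / 40 - 1          # +15% ProStreet

print(f"core clock +{core_gain:.1%}, 3DMark06 +{mark_gain:.1%}, ProStreet +{fps_gain:.1%}")
# ProStreet scales roughly in line with the clock bump; 3DMark06 (and Crysis)
# scaling well below it usually points at the CPU or something else being the
# limit for part of the run, rather than the GPU clocks.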
 
Really? Well, I didn't see any increase at all in Crysis; it's not affected by the overclock for whatever reason. ProStreet had its magical 6fps increase. I have to test something else now.

Edit: Anyone who wants to go over 2000MHz on the 8800gt memory clock: DO NOT.
All is explained here.
Don't know if it's true, of course. Better safe than sorry I guess?
 
E6850 @ 3GHz. I heard it doesn't really use quad core. And why do I have the option to link the core and shader clocks? Should I keep it linked or what?
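
Edit: about the linked option, from what I've read it just keeps the stock shader:core ratio (1500/600 = 2.5x) when you move the core slider, which lines up with the numbers in this thread:

Code:
stock_core, stock_shader = 600, 1500
ratio = stock_shader / stock_core   # 2.5 on a stock 8800gt

for core in (600, 670, 675):
    print(core, core * ratio)       # 1500.0, 1675.0, 1687.5 - matches the clocks used here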
 
Edit: Anyone who wants to go over 2000MHz on the 8800gt memory clock: DO NOT.
All is explained here.
Don't know if it's true, of course. Better safe than sorry I guess?
Hmm, wonder if that applies to cards like Palit/Xpertvision/Gainward 8800GT which have coolers on the memory and 3-phase power?
 
As an aside, turn off the "soften smoke edges" setting in Crysis for a large performance boost (15-30 depending on the area) with almost no noticeable drop in visual quality.

I didn't know disabling it was so good! Thanks.
And yes, I did use RivaTuner. Look for a tutorial; finding the overclock settings in RivaTuner is not very obvious.

Hmm, wonder if that applies to cards like Palit/Xpertvision/Gainward 8800GT which have coolers on the memory and 3-phase power?

I don't even know if it's true yet. They say "the interface can't handle it", but I wanted something that explains why you can't do it. "The interface can't handle it" doesn't prove anything.
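
For what it's worth, the thing "the interface" actually carries is bandwidth: the 8800gt has a 256-bit memory bus, so the effective memory clock maps to bandwidth like this. It only shows what the numbers mean, it doesn't prove where the limit is:

Code:
BUS_WIDTH_BITS = 256  # 8800gt memory bus width

def bandwidth_gb_s(effective_mhz):
    # bytes per transfer x transfers per second
    return (BUS_WIDTH_BITS / 8) * effective_mhz * 1e6 / 1e9

for mhz in (1800, 1950, 2000):
    print(mhz, "MHz effective =", bandwidth_gb_s(mhz), "GB/s")
# 1800 = 57.6, 1950 = 62.4, 2000 = 64.0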
 