jesus christ video card scare

socK

Got a new 9600XT for BF2 (which it runs rather well, I might add; a happy 50+ fps on medium settings most of the time) and it's been great so far. I was bored, so I felt like pushing it a bit. I'm not really new to this, and the whole thing seemed simplified with ATI Tool's auto-detect. Overdrive was on, which put the core at 526, a 26 MHz boost. (Overdrive will lower the clock if the card gets too hot.) I wanted to jack up the memory a bit, so I hit auto-detect and let it do its thing. It got up a mere 25 MHz, putting it at 526/325. Quite a boost. :\ Any farther than that would get artifacts, so to be safe I backed it down 10, leaving it at an amazing 315. I had my Overdrive window up so I could see my temps, and seeing as I didn't exactly get far anywhere, they didn't really budge at all. I wasn't really doing this for performance, just sort of an experiment.

Now, I'm about to call it quits and exit the program, reverting all clocks to default, but while moving the Overdrive window out of the way (and with somebody calling me) I managed to click Overdrive off. Lowering the core back down to 500 doesn't exactly sound like a bad thing to me, but ATI Tool didn't like it for some reason. I got green artifacts all over the screen and ATI Tool's artifact detector went apeshit. Thinking something had gone terribly wrong with the overclock somewhere, I reverted to default and quickly checked temps. They were perfectly fine, but the fabled artifacts didn't feel like going away, so I rebooted and they left.

Being as brave as I am, I had the wonderful, boredom-filled idea of trying that again, almost. Everything was back at default now and I felt like looking at that snazzy fur cube that is the artifact test, so I started it. I let it run and watched it for a bit. It didn't really do much. Then, the second I closed ATI Tool, the terrible artifacts came back for no reason. I once again rebooted and they left.

I tried games and benchmarks, and everything seems to be in order now. I dunno wtf went on with it; everything seemed fine during the test, just artifacts for no real reason. Spooked the living shit out of me. :X
 
How would turning Overdrive on help anyway?
What does it actually do for performance in games?
 
It doesn't do much. It gave about a 1-3 fps boost. BUT I NEED THOSE FPS, THEY ARE VERY IMPORTANT TO ME. I figured it couldn't hurt, so I just turned it on when I was setting everything up.
 
I have the same card, and I keep Overdrive on just because it makes it look like I'm getting more. It's really nothing; 1-3 fps is accurate and pointless. You don't need any more fps, 50 is fine. The human eye can't detect anything past like 20 fps or something. You clear that easily, so you're technically wasting your time.
 
Hectic Glenn said:
I have the same card, and I keep Overdrive on just because it makes it look like I'm getting more. It's really nothing; 1-3 fps is accurate and pointless. You don't need any more fps, 50 is fine. The human eye can't detect anything past like 20 fps or something. You clear that easily, so you're technically wasting your time.
That's what all the websites say, but it's easy to tell when you're getting less than 40 FPS. Totally noticeable.
 
StardogChampion said:
That's what all the websites say, but it's easy to tell when you're getting less than 40 FPS. Totally noticeable.
This seems like an interesting topic to research; someone should conduct a double-blind study.
 
I don't know where the heck you guys pulled your numbers, but I'm about 99 percent positive the human eye can detect up to 60 fps, or somewhere very close. The difference is insanely noticeable the farther you drop below sixty. Anything much higher and the only thing you might notice is responsiveness (a feel rather than an appearance), since movements just don't have the same timing. An example (I play TFC): if you turn on developer mode in HL1 and set the max fps to 200, it feels insanely smoother than 100 fps. It doesn't look any different; I just think it has a feel to it.
 
You can notice it, I think. I can totally tell the difference between about 30 and 60 fps in CS:Source.
 
This subject has been discussed to death on these very forums. So, to keep it very simple: you do see a difference between 30 and 60 fps.

Watching a film in a movie theater at 24 fps is not the same as playing a video game on a CRT monitor. On a big movie screen your brain can create a kind of motion blur between frames, because the entire image is drawn at once.

Some people can even see past 200 fps.
 
Not again...

The human eye, according to military studies involving fighter pilots, can consistently detect flashes of light lasting less than 1/200th of a second... and perhaps even shorter. The 24-point-something figure is an old number determined back before video cameras were even invented. It refers to the number of frames per second a series of sequential images needs to be displayed at in order to fool the brain into seeing fluid motion. That doesn't mean the eye sees at ~24fps... and it doesn't mean you should settle for that framerate in PC games. The reason movies can get away with it is that film cameras function in a manner similar to the human eye: both use timed exposures involving chemical reactions.

The human eye does not see in instantaneous progressive scan frames produced by your video card... if you want to find out more about that you'll have to look it up on your own. Also, unlike a video card, every update (not a whole frame) sent to the brain includes light gathered between updates (in the form of motion blurring) because of the way the light is detected... making each update represent a period of time rather than an instant, whereas a PC game displays discrete frames representing the game world as it was at that exact moment. So, from frame to frame there will be gaps between where an object was in the previous frame and where it is in the current frame. The faster an object is moving relative to the screen, the higher the FPS needed to represent it as fluid motion. Even the difference between 60fps and 85fps is noticeable in a fast-paced game.
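To put rough numbers on that gap idea (this is just my own back-of-the-envelope sketch in Python, with a made-up screen width and object speed, not figures from any study): if an object crosses a 1280-pixel-wide screen in half a second, the jump between consecutive frames is simply its speed divided by the framerate.

# rough sketch: per-frame jump for an object crossing a 1280 px wide screen in 0.5 s
speed_px_per_s = 1280 / 0.5      # 2560 px/s across the screen
for fps in (24, 30, 60, 85, 200):
    gap = speed_px_per_s / fps   # pixels between where the object sits on consecutive frames
    print(f"{fps:>3} fps -> {gap:.0f} px jump per frame")
# prints roughly: 24 fps -> 107 px, 60 fps -> 43 px, 85 fps -> 30 px, 200 fps -> 13 px

At 24 fps the object teleports over a hundred pixels each frame with nothing drawn in between, which is why fast motion without camera-style blur needs a much higher framerate to look fluid.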

Another reason people prefer high framerates is that video games rely on user input that is usually gathered and processed between frames. So a higher framerate gives you smoother, more accurate control... which is important, especially in multiplayer games with a lot of action.
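For what it's worth, here is a minimal sketch of a typical capped frame loop (my own toy example in Python; poll_input/update_world/render are hypothetical placeholders, not any real engine's API). Because input is only read once per loop iteration, the worst-case delay before the game even sees a keypress is about one frame time, 1/fps.

import time

def run(fps_cap, frames=5):
    frame_time = 1.0 / fps_cap
    for _ in range(frames):
        start = time.perf_counter()
        # poll_input(); update_world(); render()   # hypothetical engine steps
        elapsed = time.perf_counter() - start
        time.sleep(max(0.0, frame_time - elapsed))   # wait out the rest of the frame
    print(f"{fps_cap} fps cap -> up to {1000 / fps_cap:.1f} ms before your input is even seen")

run(30)    # up to 33.3 ms of input delay per frame
run(100)   # up to 10.0 ms

That per-frame input window is one reason a higher fps cap can "feel" tighter even when the picture doesn't look much different.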
 