Originally posted by Matrix302
Just buy an R9800 and a 2.8 GHz or faster CPU, plus 1 GB of Corsair XMS4000 memory. You will have 100 fps or more. Trust me.
Originally posted by Think
Is 40 fps good? And how high an fps do you get in UT2003 with a 9700 Pro and a P4 at 2.0 GHz?
Originally posted by [Hunter]Ridic
1 GB of any DDR memory... it doesn't matter if it's Corsair XMS. Any good brand will do: Corsair, Crucial, Kingston.
Originally posted by Think
But you should see the difference... I can.
DUDE! Anyone not capable of seeing the difference between 60 and 80 must have really poor eyes. I got nauseous playing BF1942 at 60 fps (vertical sync); the screen flickered visibly. I play it at 85 fps instead, smooth as butter. It's the same in normal 2D: I cannot have it at 60 Hz, I need 75 or higher. But yeah, it does depend on the game... For example, I played Morrowind at 15-25 fps before I upgraded, just fine; it was too slow to flicker. Some people say 30 fps and up is all the same, and that's true for SOME games. But it's impossible to tell the difference between 60 and up, and in most games it's impossible even at 50.
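The gap dawdler describes is easier to see as frame times. As a quick illustration (my own arithmetic, not from any post in the thread), here is a small Python snippet converting the framerates mentioned above into milliseconds per frame:

```python
# Frame time in milliseconds for the framerates mentioned in the thread.
for fps in (25, 30, 50, 60, 75, 85):
    print(f"{fps:3d} fps -> {1000 / fps:5.1f} ms per frame")
```

60 fps works out to about 16.7 ms per frame versus 11.8 ms at 85 fps, so the argument is really about whether a roughly 5 ms difference per frame is perceptible.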
Originally posted by synth
Humans can't detect framerates past 60 fps, so whether it's 60 or 6000, you won't be able to tell the difference.
It will just change the number of updates per second, just like fps.
Originally posted by [Hunter]Ridic
I don't know much about monitor refresh rates, but mine was at 60. I just turned it to 85 Hz. What will this change? Right now the only thing I noticed was a lot more sensitivity in my mouse.
And that makes how much sense? At any rate, it's the Hz that's the important part. And I can see a difference between 60 screen flickers per second and 75. Of course fps is rarely the same as Hz, but when one follows the other you can see the difference.
Originally posted by Archangel
FPS and Hz are two completely different things. Hz states how often per second the monitor draws a new picture; FPS is how many frames the graphics card renders per second...
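Archangel's distinction is the key point in this thread. As a rough sketch (hypothetical code, not from any post), here is how a game's FPS counter typically works: it just counts rendered frames per wall-clock second, and the monitor's refresh rate appears nowhere in the calculation:

```python
import time

def render_frame():
    # Stand-in for the graphics card's per-frame work (~12 ms here).
    time.sleep(0.012)

# Count frames rendered per wall-clock second. The monitor's Hz
# appears nowhere in this number; only vertical sync would force
# the two to line up.
frames = 0
t0 = start = time.time()
while time.time() - start < 3.0:   # sample for about 3 seconds
    render_frame()
    frames += 1
    if time.time() - t0 >= 1.0:
        print(f"{frames} fps")
        frames, t0 = 0, time.time()
```

With the simulated 12 ms frame this prints around 80 fps regardless of whether the monitor is set to 60 Hz or 85 Hz, which is exactly the decoupling Archangel describes.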
Originally posted by dawdler
DUDE! I got nauseous playing BF1942 at 60 fps (vertical sync). The screen flickered visibly.
Originally posted by chris_3
When I played the Doom 3 demo I had 30 fps in one spot, and when I was near a monster I sometimes had like 2 fps... Hmm, if I upgraded my Athlon XP 2000+ (1667 MHz) to an XP 2700+ (2170 MHz), would it make a difference in my fps? And another question: why is AMD so great? Their GHz are way lower than Intel's, and Intel has double the FSB... I don't see why they're so great :\ To me it looks like AMD's processors are stone age compared to Intel's ;( Can anyone explain the AMD vs Intel thing?
No, the screen can flicker anyway. Vertical sync is there to keep it from tearing (getting out of sync with the fps).

I believe you have that wrong. If I'm not mistaken, the entire point of vertical sync is to limit your FPS in order to keep the screen from flickering.
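For what it's worth, the first poster has it right: with vsync on, the card waits for the monitor's refresh before swapping buffers, which prevents tearing and snaps the framerate to integer divisions of the refresh rate. A small simulation (my own sketch, assuming a 60 Hz monitor and plain double buffering) makes that quantization visible:

```python
import math

REFRESH_HZ = 60                  # monitor refresh rate
refresh_ms = 1000 / REFRESH_HZ   # ~16.7 ms between refreshes

# With vsync and double buffering, a finished frame is only shown at
# the next refresh, so fps snaps to 60, 30, 20, 15... as frames get
# slower to render.
for render_ms in (10, 17, 20, 34, 40):
    refreshes = math.ceil(render_ms / refresh_ms)
    print(f"render {render_ms:2d} ms -> {REFRESH_HZ / refreshes:4.1f} fps")
```

So vsync does cap fps at the refresh rate and can drop it in steps, but the flicker dawdler saw comes from the CRT refreshing at only 60 Hz, not from vsync itself.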
Originally posted by Mad_Eejit
I'll probably get flamed for this, but I think most of this stuff about fps is all psychological. I would recommend that you just don't look at what FPS you are getting. When I play BF1942 I find it perfectly playable, and then when I look at my FPS I only have about 25, and that's when I notice it jerking. Same thing with NOLF2, although I really dislike that Lithtech engine they use because it is slow.
Maybe if I experienced these games at faster speeds I might start noticing it, but at the moment I don't.
I'd like to see any human notice the difference between 40 and 72 FPS....
Basically, movies are made at 24 frames per second (initially; nowadays they have probably gone up to 30 or something). You might notice the difference between 30 and 40 frames (the skilled eye of a gamer), but anything above that is unnoticeable. (Besides, I loathe people who overclock their system just to get a few more frames, and then brag about how cool that is.)
Originally posted by crabcakes66
I'd like to see any human notice the difference between 40 and 72 FPS....