40fps.

Heh heh. Maybe I will buy that... no, maybe not. But I'm going to buy 512MB of memory so I have 1GB.
 
How can you find out what kind of RAM you've got? I have a customized Dell but I'm not sure what's in it...
 
Gamers are so spoiled nowadays. I remember when I used to play DOOM on my 386 at a staggering 8 fps. It was still a blast. lol.
 
Originally posted by Matrix302
Just buy an R9800 and a 2.8GHz or faster CPU + 1GB of Corsair XMS4000 memory. You will have 100 fps or more. Trust me.

1GB of any DDR memory... it doesn't matter if it's Corsair XMS. Any good brand will do: Corsair, Crucial, Kingston.
 
Originally posted by Think
Is 40fps good? And how high an fps do you get in UT2003 with a 9700 Pro and a P4 2.0GHz?

Anything below 60fps (on my computer anyway) is not really good....
 
Originally posted by [Hunter]Ridic
1GB of any DDR memory... it doesn't matter if it's Corsair XMS. Any good brand will do: Corsair, Crucial, Kingston.

Yes, that is right. I forgot to mention that. :cheese:
 
I like gamers who say "woohoo, I'm getting 200 fps!!!!" on an 85Hz monitor...

I find 40fps fine. If you want to know what yours is like, find a game like UT2003 where you can change the settings to get a meaningful change in framerate. I'm sure it's different for other people.
 
Some people say 30fps and up is all the same, and it's true for SOME games. But it's impossible to tell between 60 and up; in most games it's impossible even at 50.
 
DUDE! Anyone not capable of seeing the difference between 60 and 80 must have really poor eyes. I got nauseous playing BF1942 at 60 fps (vertical sync). The screen flickered visibly. I play it at 85 fps instead, smooth as butter. It's the same in normal 2D: I can't have it at 60Hz, I need 75 or higher. But yeah, it does depend on the game... For example, I played Morrowind at 15-25 fps just fine before I upgraded. It was too slow to flicker :p
 
Sure, he said 40fps, but is that in the midst of action or while walking down an enclosed hall?

Min fps? Or max fps?
 
I think it's max. He said everything is turned on, meaning maximum settings = maximum fps.
 
I don't know much about monitor refresh rates, but mine was at 60. I just turned it to 85Hz. What will this change? Right now the only thing I noticed was a lot more sensitivity in my mouse.
 
Originally posted by synth
Humans can't detect framerates past 60fps. So if it's 60 or 6000, you won't be able to tell the difference.

I can tell the difference between 85fps and 100fps when I play NS at 1024x768 at 100Hz or at 1280x960 at 85Hz... just use mouse look to look around and you can see the difference, as the number of frames drawn becomes much more apparent as you turn. I can't see the difference, however, if I just move in and out of the screen without using mouse look!

It all depends on the situation...
 
Originally posted by [Hunter]Ridic
I don't know much about monitor refresh rates, but mine was at 60. I just turned it to 85Hz. What will this change? Right now the only thing I noticed was a lot more sensitivity in my mouse.
It will just change the number of screen updates per second, just like fps :)

Btw, just testing on my crappy Riva 128: I'm running at 60Hz and see a VERY clear difference when changing it to 75Hz, but sadly the screen then shrinks, giving some sort of pseudo 1024x768 (crap-assed card :))
I can see it flicker at 60Hz. I can't see it flicker at 75Hz. So you can't see anything above 60? I trust my own eyes more than anyone on this forum :dozey:
 
FPS and Hz are two completely different things. Hz states how often per second the monitor draws a new picture; FPS is how many frames the graphics card renders per second...
 
Originally posted by Archangel
FPS and Hz are two completely different things. Hz states how often per second the monitor draws a new picture; FPS is how many frames the graphics card renders per second...
And that makes how much sense? At any rate, it's the Hz that's the important part. And I can see the difference between 60 screen flickers per second and 75. Of course the fps is rarely the same as the Hz, but when one follows the other, you can see the difference.
Playing the game at 25 fps at 85Hz? Relatively smooth. Playing the game at 25 fps and 25Hz? *urk* :D
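A quick bit of frame-time arithmetic makes the Hz-vs-fps distinction concrete. A minimal Python sketch (the numbers are illustrative, not measurements from any particular setup):

```python
# Frame-time arithmetic behind the Hz-vs-fps distinction above.

def interval_ms(rate_hz: float) -> float:
    """Time between updates at a given rate, in milliseconds."""
    return 1000.0 / rate_hz

refresh_hz = 85.0   # how often the monitor redraws the screen
render_fps = 25.0   # how often the graphics card finishes a new frame

print(f"monitor redraws every   {interval_ms(refresh_hz):.1f} ms")
print(f"new frame arrives every {interval_ms(render_fps):.1f} ms")

# 40 ms / 11.8 ms is about 3.4, so each rendered frame stays on screen for
# roughly 3-4 refreshes: no flicker (the tube is driven at 85Hz), but the
# motion itself still only updates 25 times a second.
print(f"refreshes per rendered frame: ~{refresh_hz / render_fps:.1f}")
```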
 
I can easily tell the difference between 60 and 75Hz... but in BF42 (DC) (the only game I play right now), that engine is so choppy I just can't tell the difference in fps... it could be packet loss, could be ping... could be the engine doing its normal stuttering bullshit.

My computer is by no means slow... every system I've played it on has had the same stuttering feeling when moving the mouse.
 
Originally posted by dawdler
DUDE! I got nauseous playing BF1942 at 60 fps (vertical sync). The screen flickered visibly.


I believe you have that wrong. If I'm not mistaken, the entire point of vertical sync is to limit your FPS in order to keep the screen from flickering.

As for the FPS: I think the console command "fps_max" limits the FPS in Half-Life. Just set that to whatever value you want to see what it looks like. I can tell the difference between 72 (my max with vsync on) and 20 fps (the lowest the cvar will limit the fps to), but I can't tell the difference with anything above 30. That might just mean the variable doesn't do anything, although it does change the fps indicated in both cl_drawfps and net_graph 3. So I guess it really is limiting it, but I can't tell the difference, so I don't really know. But as long as I get above 30 I'll be fine (which I should; I have a P4 2.6 with a 9700 Pro).
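For anyone who wants to try the same test, it can be done straight from the console. A sketch using the cvar names mentioned in this thread (exact names and limits may vary between engine builds):

```
// In the Half-Life console (cvars as named in this thread):
fps_max 30      // cap the engine at 30 fps
net_graph 3     // overlay that reports the frame rate actually achieved
fps_max 100     // raise the cap again and compare how it feels
```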
 
I'll probably get flamed for this, but I think most of this stuff about fps is all psychological. I would recommend that you just don't look at what FPS you are getting. When I play BF1942 I find it perfectly playable, and then when I look at my FPS I only have about 25, and that's when I notice it jerking. Same thing with NOLF2, although I really dislike that Lithtech engine they use because it is slow.

Maybe if I experienced these games at faster speeds I might start noticing it, but at the moment I don't.
 
When I played the Doom 3 demo I had 30fps in one spot, and when I was near a monster I sometimes had like 2fps... Hmm, if I upgraded my Athlon XP2000 (1667MHz) to an XP2700 (2170MHz), would it make a difference in my fps? And another question: why is AMD so great? Their GHz are way lower than Intel's, and Intel has double the FSB... I don't see why they're so great :\ To me it looks like AMD's processors are stone age compared to Intel's ;( Can anyone explain the AMD vs Intel thing?
 
AMDs are actually faster per clock. The "XP2000" rating means it's meant to be the equivalent of an Intel 2GHz even though it only runs at 1667MHz, and they are also much cheaper.
 
Originally posted by chris_3
When I played the Doom 3 demo I had 30fps in one spot, and when I was near a monster I sometimes had like 2fps... Hmm, if I upgraded my Athlon XP2000 (1667MHz) to an XP2700 (2170MHz), would it make a difference in my fps? And another question: why is AMD so great? Their GHz are way lower than Intel's, and Intel has double the FSB... I don't see why they're so great :\ To me it looks like AMD's processors are stone age compared to Intel's ;( Can anyone explain the AMD vs Intel thing?

Shut up, dumbass.
Use the internet; do some research.
 
Read some stuff in the hardware forum if you want to know the strong and weak points of both companies' CPU lines.
 
About AMD vs. Intel: try the hardware section :)

About the fps: I try to play HL at 100 fps, but sometimes it drops to 80 and I don't really notice (I only know it drops because of net_graph 3 :p). As soon as it goes under 60 fps I really 'see' it. I think 60 is enough for single player, though.
 
Turn the detail up to max and reduce the res until you get a nice frame rate.

Nobody on earth should reduce the detail of a game like this for the sake of a few more pixels per square inch.
 
I believe you have that wrong. If I'm not mistaken, the entire point of vertical sync is to limit your FPS in order to keep the screen from flickering.
No, the screen can flicker anyway. Vertical sync is there to keep the image from tearing (the swap going out of sync with the refresh). See the sketch at the end of this post.

It is nothing psychological. After a reformat a while ago, I installed everything, turned up FSAA/AF and all that other junk, and started playing... My eyes nearly popped out. The screen flickered in every game; I tried a lot of them, and they all had a lousy refresh rate. Which made me remember I'd forgotten about the WinXP refresh rate bug, DOH! Ten seconds and two clicks later my eyes nearly teared up at the newfound smoothness.
I simply do not understand how people can't notice it. But I also have a larger 21 inch monitor; maybe that makes me see it more clearly?
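To illustrate the tearing point above, here is a hypothetical render-loop sketch in Python; it is a simulation only, not any real engine's code. With vsync on, the buffer swap waits for the monitor's vertical blank, so a half-drawn frame never reaches the screen; the monitor keeps refreshing at its own fixed rate either way, which is why CRT flicker is unaffected.

```python
# Hypothetical render loop: vsync delays the buffer swap until the vertical
# blank (preventing tearing); it does not change the monitor's refresh rate,
# so flicker from a low refresh rate remains.
import time

REFRESH_HZ = 85.0
PERIOD = 1.0 / REFRESH_HZ  # one monitor refresh, ~11.8 ms

def present(frame: int, vsync: bool, next_vblank: float) -> float:
    """Swap buffers, waiting for the (simulated) vertical blank if vsync is on."""
    if vsync:
        time.sleep(max(0.0, next_vblank - time.monotonic()))
    print(f"frame {frame} swapped {'on vblank' if vsync else 'immediately'}")
    return next_vblank + PERIOD

next_vblank = time.monotonic() + PERIOD
for frame in range(5):
    # rendering work would happen here; the swap is the only vsync-aware step
    next_vblank = present(frame, vsync=True, next_vblank=next_vblank)
```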
 
I do notice refresh rate, but I thought this was about noticing the difference between 30 and 40 FPS. Meaning, you have the über 1337 monitor running at 5000Hz, and then we start talking about noticing the FPS... jeez...
 
Originally posted by Mad_Eejit
I'll probably get flamed for this, but I think most of this stuff about fps is all psychological. I would recommend that you just don't look at what FPS you are getting. When I play BF1942 I find it perfectly playable, and then when I look at my FPS I only have about 25, and that's when I notice it jerking. Same thing with NOLF2, although I really dislike that Lithtech engine they use because it is slow.

Maybe if I experienced these games at faster speeds I might start noticing it, but at the moment I don't.

I entirely agree. People want to have the max fps possible on their machines, which is normal, but when they start thinking about upgrading because of one game that runs at 30 fps, it gets ridiculous.
 
I'd like to see any human notice the difference between 40 and 72 FPS....


I can tell the difference easily, but everything after 60fps looks the same to me.


Basically, movies are made with 26 frames per second (initially; nowadays they have probably gone up to 30 or something). You might notice the difference between 30 and 40 frames (the skilled eye of a gamer), but anything above that is unnoticeable (besides, I loathe people who overclock their system just to get a few more frames, and then brag about how cool that is).

Uhhh, movies double their frames on display. So NTSC = 30fps = 60 fields per second.
PAL format = 25fps = 50 fields per second.
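The standard figures behind that correction, sketched in Python (the doubling comes from interlaced fields on TV and from multi-bladed shutters in film projection):

```python
# Display rates behind the film/TV numbers above.
rates = {
    "film projector (double-bladed shutter)": (24, 2),  # each frame flashed twice
    "NTSC television (interlaced)": (30, 2),            # 30 frames -> 60 fields
    "PAL television (interlaced)": (25, 2),             # 25 frames -> 50 fields
}
for name, (fps, factor) in rates.items():
    print(f"{name}: {fps} fps displayed as {fps * factor} updates per second")
```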
 
Originally posted by crabcakes66
I'd like to see any human notice the difference between 40 and 72 FPS....

I can tell the difference between 100 and 80! Thank you, Counter-Strike!

I swear, I can...
 
When I played CS (it was entertaining back then) I used to have net_graph, scores, and timeleft bound to Tab, and sometimes I thought the FPS was below my usual 100, say 80-85 or lower; I would press Tab and check, and yes, I could definitely see the difference between 100 and 85 FPS. Maybe this has something to do with the timing between the refresh rate of my monitor and the rendering? I experience the same thing with vsync @ 100Hz, though...

The thing is, I read and hear different things concerning this "how many fps can one register" question. Does anyone have any good documentation?

Some people say that we can only register 30 fps; whatever the number is, it can't be exactly the same for every individual...

I've been wondering about this for some time, so it would be nice if someone had any URLs or anything on this... :p

EDIT: If my English does not satisfy you, keep in mind that I've had some 5+ Jägermeisters and live in Norway. Anyway, I'm sure it shouldn't be any problem for you to understand; I like to believe my English is good :)

Anyway, I'm going out now :cheers: . I love you guys; I really enjoy browsing this forum for anything related to HL2, except for those (random bad word) morons who just have to (another well known bad word) it up for the "serious" ones.
 
If people had the ability to notice the difference between 40 and 80 fps, they'd notice their TV refreshing and blinking all the time.
 
It might have something to do with the game acting differently at one fps rate than at another. Anyway, I CAN notice the difference between... those mentioned above... and I know more people who do... I'm not really sure; that's why I would love to read something serious about it...
 
People who say that the human eye can't notice the difference between 30 and 60 are people who prefer great graphics to high fps.

When I had my old comp (PII 350, GeForce2 MX, etc.), I had to remove all decals (blood, bullet holes, etc.), use the old models, run at 640x480, and so on, only to get 20 more fps (20 to 40) in CS, but it really was noticeable.

Now that I have my new comp, I have a steady 100fps, and I can play with fps_max to see the difference. Believe me, once you're used to 100fps, a drop to 80 is obvious.

Final thoughts: yesterday I was reading a PC magazine that I bought in 1994. They were testing 3D cards (Diamond? What the hell is that?), and I remember them saying that Grand Prix 2 ran at an outstanding 25 fps. Old games ran at 30 fps max, but we didn't notice. I guess only the trained eye can tell...

(This thread will make me reach 30 posts... NOOO!)
 
Exactly, the trained eye; anything trained will "perform" better :p
 
Wow, it's interesting to look at the leap in computer technology in recent years. In fact it's astonishing!
 
You can definitely notice differences in frame rate up to 60 fps, although as you approach the 60 fps mark you really need a side-by-side comparison to recognize them. But, depending on the game, you can 'feel' changes in framerate even higher than that. This is most evident in the fact that in Half-Life you can reach a higher altitude while jumping if you are running at a higher frame rate.
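Half-Life's actual movement code isn't quoted anywhere in this thread, but a minimal, hypothetical per-frame integration sketch (all constants made up for illustration) shows how jump height can depend on frame rate when physics is stepped once per frame:

```python
# Hypothetical semi-implicit Euler jump, NOT Half-Life's real movement code:
# gravity is applied once per frame, so a shorter frame time (higher fps)
# loses less height to discretization and the jump peaks slightly higher.

GRAVITY = 800.0      # units/s^2 (illustrative)
JUMP_SPEED = 268.0   # initial upward velocity, units/s (illustrative)

def peak_height(fps: float) -> float:
    dt = 1.0 / fps
    z, vz, peak = 0.0, JUMP_SPEED, 0.0
    while z >= 0.0:
        vz -= GRAVITY * dt   # apply gravity for this frame
        z += vz * dt         # then move with the updated velocity
        peak = max(peak, z)
    return peak

for fps in (30, 60, 100):
    print(f"{fps:>3} fps -> jump peaks at {peak_height(fps):.1f} units")
# prints roughly 40.4, 42.7, and 43.6 units: the higher frame rate jumps higher
```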
 