At what FPS would you consider the game unplayable?

I believe when I turned on Unreal 2's frame rate indicator, the game felt all right at about 18-20 fps and very good at 25 fps.

FPS is mostly dynamic, so the engine and the system have to take special care to make sure you don't get a performance drop once five Combine soldiers come barging in. But be realistic here, guys; there will be scenes where even the best PCs suffer some FPS drop.

If it's 25-30 while walking around and 20-25 in massive combat, I'll be happy.
 
zex said:
FPS depends on the engine. Since the HL engine is pretty crappy, anything near 30 fps or below gets choppy. In a game like COD, 30 fps looks like 100 fps in HL.

Sir, I have no idea what you are talking about ;) The FPS is a consequence of a good or bad engine, not the other way around. There is no difference between 30 fps in HL2 and 30 fps in HL, and it will be choppy at 15 fps in HL2 just as at 15 fps in HL (unless some special motion-blur techniques are used, which I highly doubt).

zex said:
30 fps in HalfLife2 would probably be smooth as glass, but don't expect to get that on high settings or anything.
I would agree with that, but lower your expectations a little - a stable 24 fps is pretty much good enough (movies in the theater use that). I am not arguing that 24 is better than 30 :dozey: , 30 will be nice; also, movies are able to get away with 24 looking smooth because of the nice natural motion blur occurring on fast-moving objects.
 
After reading only a few of these posts, it's fairly obvious that people don't understand comfortable FPS in regard to computers.

30 FPS is great for movies - they have a motion blur effect that makes it seem as if it's actually a lot more than that. A straight 30 FPS with no blur (like a computer rendering a scene) is not going to look very good. Sure, you can generally follow what is going on, but FPS around 60 on a computer is going to look about the same as 30 fps does in a movie.

To those who think anything above 24 or 30 is not going to look any different: that is simply incorrect. The eye can detect differences at well over 200 fps under the right conditions - noticing the difference between 30 fps, 60 fps and 90 fps is not hard at all. You can always Google this information as I did, or you can just launch HL or one of its mods and set your computer settings (resolution) so that it runs at around 70-90 fps. Then set your FPS max to 60, play around a bit, then set it to 30. There is a world of difference, and unless your monitor is totally worthless, you can see the difference quite clearly.

In summary - I go for 60 fps or higher, because video cards render each individual frame instead of having motion blur, so 60 fps is what it takes to look (noticeably) nice and smooth.
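To put rough numbers on it, here is a tiny Python sketch (purely illustrative; the object speed and the framerates are made-up example values) of how big the visual jump between consecutive unblurred frames gets:

```python
# Illustration: how far a fast-moving object jumps between consecutive frames
# when each frame is a single instant in time (no motion blur).
# The object speed below is a made-up example number.

OBJECT_SPEED_PX_PER_S = 2000   # e.g. something crossing the screen during a fast mouse flick

for fps in (24, 30, 60, 90):
    frame_time_ms = 1000.0 / fps           # how long each frame stays on screen
    jump_px = OBJECT_SPEED_PX_PER_S / fps  # gap between where the object is drawn in frame N and N+1
    print(f"{fps:2d} fps: {frame_time_ms:5.1f} ms per frame, "
          f"the object jumps {jump_px:5.1f} px between frames")
```

With no blur to bridge those gaps, the larger jumps at 24-30 fps read as stutter, while at 60+ fps the steps are small enough to look smooth.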
 
Frig's sake, this was all explained about 30 posts back... :p

But thank you for the reiteration anyway :)
 
I always considered anything under 24 fps to be unplayable. For HL2 I'm expecting 40-80 fps. 60 is optimal I feel.
 
Crusader said:
Frig's sake, this was all explained about 30 posts back... :p

But thank you for the reiteration anyway :)

Teehee. Didn't take the time to read back more than 3 pages. :eek: :upstare:
 
nc17 said:
I get between 70-130 fps in the leak; even though it is missing a lot of its advanced features, it won't change much.

I am willing to bet you 60 dollars that your 5900 Ultra will not get you a steady 60 fps (1 dollar per fps... lol). I'm thinking you'd be lucky to get a steady 40 fps. But anyway, seeing as how this is the internet and anyone can BS anyone at any given time, this bet sounds silly.

Why do I believe you won't be getting 60 fps in HL2? At the beginning of this thread Shuzer quoted someone who's played CS:Source in Korea; go back and read that post. The video card that person claimed was in the PC was an X800, if I'm not mistaken.

Now, I have no reason to think this guy is telling the truth, but I don't have much of a reason to think he'd lie about something like this either. Anyway, I highly doubt your 5900 Ultra can perform as well as an X800, for obvious reasons.
 
McFly said:
I think spending 419 bucks on a video card is the greatest waste of money, ever.

No, the 400 dollar Coach and Gucci bags my gf buys are the greatest waste of money, ever!
 
gunstarhero said:
No, the 400 dollar Coach and Gucci bags my gf buys are the greatest waste of money, ever!

Go pawn her stuff for a few sticks of Corsair RAM ;)
 
Am I the only one who can notice the difference between 72 FPS and 100 FPS in CS? The HL2 engine may make it harder to distinguish FPS differences, but I know in HL1 I could tell the difference.

I don't want to judge the engine on CS:Source performance, as that has probably not been fully tweaked or had nearly as much time put into it as HL2 the actual game (textures or whatever).
 
The difference between 72 FPS and 100 FPS in CS is noticeable in your crosshair recoil.
 
That too, but it's noticeable just moving around as well.
 
This is a real bone of contention among gamers.

From my point of view - if I ain't running at circa 60 FPS, I can tell!! It's not that important when you are playing a game like, say, 'Thief 3' where you are in third-person mode - but FPS games are more reliant upon having a good frame rate! If you don't, you can't aim your weapon properly and the mouse lags!

Anyone who says less than 30 frames per second is OK in an FPS game is seriously deluding themselves! You CAN tell the difference.

I had a similar discussion in another game forum about the same issue - I was arguing the other way though and making really poor excuses for my (at the time) less than great hardware!!

I have a 9800 Pro at present and it runs all of the current games fine - I'll be out buying a new card (an X800 XT) if it doesn't run the new games OK, though!
 
obiwanquinobi said:
This is a real bone of contention among gamers.

From my point of view - if I ain't running at circa 60 FPS, I can tell!! It's not that important when you are playing a game like, say, 'Thief 3' where you are in third-person mode - but FPS games are more reliant upon having a good frame rate! If you don't, you can't aim your weapon properly and the mouse lags!

Anyone who says less than 30 frames per second is OK in an FPS game is seriously deluding themselves! You CAN tell the difference.

I had a similar discussion in another game forum about the same issue - I was arguing the other way though and making really poor excuses for my (at the time) less than great hardware!!

I have a 9800 Pro at present and it runs all of the current games fine - I'll be out buying a new card (an X800 XT) if it doesn't run the new games OK, though!

Exactly :D I could probably get through HL2 single-player all right with, say, 30 FPS, but coming up against people in multiplayer who have smoother movement would be a different story.
 
I owned at CS with about... hmmm 20-35 fps...

Now I have constant 100 fps and I am crap :D

Granted, I don't play as much as I used to, but I never even knew I was missing out on anything until I was getting 100 fps, and even then the only real difference I can see is a change in the amount of "tearing" when I look around.

Oh, and of course when there were tonnes of models on screen my PC just used to stop entirely :p
 
900 FPS IS UNACCEPTABLE!!!

My computer is ultra - leet.

lol...30 fps maybe...

I average about 60, sometimes 90..
 
Okay, there've been 12 pages and I'm not about to read them all, so I apologise if anyone's already brought this up, but movies play at about 30 fps (possibly less?) and I think video and TV etc. work at something similar. So why, then, does this make such a huge difference for computer games?
 
I want 30+ (30 at the hardest of times, hopefully 70 most times) in HL2, but I want more in CS:Source because I need actual performance to play well and have fun.
 
el Chi said:
Okay, there've been 12 pages and I'm not about to read them all, so I apologise if anyone's already brought this up, but movies play at about 30 fps (possibly less?) and I think video and TV etc. work at something similar. So why, then, does this make such a huge difference for computer games?

Are you making a point, or asking a question?

And I'm sure this has come up before, but I really don't want to read all the pages either.

People say "There's no difference between 30 FPS and 100 FPS because no can tell the difference!"

They're liars.
 
Baal said:
Are you making a point, or asking a question?

And I'm sure this has come up before, but I really don't want to read all the pages either.

People say "There's no difference between 30 FPS and 100 FPS because no can tell the difference!"

They're liars.


I think he's asking a question.

The reason we want higher FPS is so we don't die because of choppiness...
 
el Chi said:
Okay, there've been 12 pages and I'm not about to read them all, so I apologise if anyone's already brought this up, but movies play at about 30 fps (possibly less?) and I think video and TV etc. work at something similar. So why, then, does this make such a huge difference for computer games?

I'm not sure if I can answer your question, but what I can say is that there are a lot more elements involved with FPS on a computer compared to TV, movies, etc.

One of those elements is lag/ping. On a computer you can have 200 fps in a game, but that 200 fps is negated if your ping is 200+; or say you have a 20 ping, but then you're also only getting 10 fps.

With TV/movies you don't worry about lag/ping because it doesn't apply, so comparing the two to computer FPS is a bit unfair.

Plus, nowadays the netcode is written in such a way that it is 56K'er-friendly. What does that mean to the general populace of gamers who own broadband? It means there's another element that affects FPS. Don't believe me? Then don't, but you can try explaining bullet registration to me and others, and I will point to "56K'er-friendly" netcode as perhaps a reason why we have bullet registration problems.

Anyway, I'm sick of the comparisons, and I hope people realize there are more elements that affect FPS when talking about computers compared to TV/movies.
 
Dr. Freeman said:
I'm not sure if I can answer your question, but what I can say is that there are a lot more elements involved with FPS on a computer compared to TV, movies, etc.

One of those elements is lag/ping. On a computer you can have 200 fps in a game, but that 200 fps is negated if your ping is 200+; or say you have a 20 ping, but then you're also only getting 10 fps.

With TV/movies you don't worry about lag/ping because it doesn't apply, so comparing the two to computer FPS is a bit unfair.
Err, latency does not affect FPS. If your computer runs at a steady 200 FPS all the time and you start lagging in an internet game, your FPS will not change at all; it's just that elements within the game will start acting erratically because information is not being received quickly enough.
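To make that concrete, here is a minimal Python sketch (purely illustrative, not engine code; the render rate, snapshot rate and ping are made-up assumptions) of a render loop that is decoupled from the network, the way client/server games are structured:

```python
# Sketch: the client keeps drawing frames at its own pace while server
# snapshots arrive on their own schedule. High ping therefore makes the
# drawn world *older*, but it does not reduce the number of frames drawn.
# All numbers below are made up for the example.

RENDER_FPS = 200        # frames the machine can draw per second
SNAPSHOT_RATE = 20      # server updates per second
PING = 0.200            # seconds of network delay

frame_dt = 1.0 / RENDER_FPS
frames_drawn = 0
worst_staleness = 0.0

for step in range(RENDER_FPS):                     # simulate one second of play
    now = step * frame_dt
    # index of the newest server snapshot that has already reached the client
    newest = int(max(now - PING, 0.0) * SNAPSHOT_RATE)
    staleness = now - newest / SNAPSHOT_RATE       # how old the drawn game state is
    worst_staleness = max(worst_staleness, staleness)
    frames_drawn += 1                              # rendering happens regardless of the network

print(f"frames drawn in one second: {frames_drawn} (unchanged by ping)")
print(f"worst-case age of the drawn state: {worst_staleness * 1000:.0f} ms "
      f"(roughly ping plus one snapshot interval)")
```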
 
Minerel said:
Dude,
my 2 GHz, 256 MB, GeForce4 MX 420 machine could run UT2004 smoothly.
Then I got 256 MB more RAM, maxed all settings except LOD (one notch below highest) and it ran hella smoothly!

GOD DAMN! The 5200 must be super crap!

I have done exactly the same with exactly the same machine :cheers:
 
Abom said:
Err, latency does not affect FPS. If your computer runs at a steady 200 FPS all the time and you start lagging in an internet game, your FPS will not change at all; it's just that elements within the game will start acting erratically because information is not being received quickly enough.

Aight, so no direct relationship between these things, but they do indirectly affect each other in some way... at least I believe they do.

With a choppy net connection, I think choppy FPS isn't out of the question, and vice versa. And I haven't even talked about aiming, and how a choppy FPS or connection screws up your aim.
 
Well, the way monitors work means that rather than interlaced frames, where the "upper" and "lower" fields of the image update alternately, the update is progressive: from the top of the screen to the bottom, as the cathode ray tube fires electrons from left to right across the screen, exciting the banks of red/green/blue phosphors. If you are using a standard monitor, anyway.

NB: by upper and lower fields I do not mean the uppermost and lowest parts of the screen; it's just the term used for the two interlaced half-frames.

Anywho, the long and short of it is that the interlaced method does not create problems with tearing, but is more blurred than progressive updating.

This makes it look much smoother than a progressive monitor at lower frame rates.

With reference to the lecture I linked to earlier, the eye can successfully merge images at between 10 and 24 frames per second on average.

So you might think that all we need is 10-24 fps and we're set. However, a progressive monitor displaying 10 fps looks insanely flickery! This is because of the mismatch between the frame rate and the monitor's refresh rate, plus the fact that the monitor updates from top to bottom, so much of the time you are not looking at a whole frame but at bits of two separate frames.

Hmmm... this is sounding more confusing than I expected...

Anyway, to cut a long story short, the way PC monitors refresh means PCs need a higher FPS than the brain usually needs to make sense of moving images.
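If it helps, here is a small Python sketch of that mismatch (the 60 Hz refresh rate is just an assumed example): at low framerates each rendered frame has to be held across several refreshes, which is what reads as flicker/judder even though 10-24 fps would be enough for the eye to merge the motion itself.

```python
# Illustration of the frame-rate vs refresh-rate mismatch described above.
# Assumes a 60 Hz progressive display purely as an example.

REFRESH_HZ = 60

for fps in (10, 24, 30, 60):
    refreshes_per_frame = REFRESH_HZ / fps   # how many refreshes repeat the same frame
    hold_ms = 1000.0 / fps                   # how long each frame stays on screen
    print(f"{fps:2d} fps on a {REFRESH_HZ} Hz display: each frame is repeated for "
          f"~{refreshes_per_frame:.1f} refreshes ({hold_ms:.0f} ms on screen)")
```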
 
Point taken. Having more FPS also increases the chance of the monitor picking up the "latest" frame instead of the "next to latest" one, or even older.
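A quick back-of-the-envelope version of that point (the framerates are just example values): when the monitor grabs a frame, the newest completed frame is on average about half a frame-time old, so the higher the FPS, the fresher the picture you actually see.

```python
# Rough illustration: average and worst-case age of the newest completed
# frame at the moment the monitor picks one up. Example framerates only.

for fps in (30, 60, 100, 200):
    frame_time_ms = 1000.0 / fps
    avg_age_ms = frame_time_ms / 2.0   # on average the newest frame is half a frame old
    worst_age_ms = frame_time_ms       # worst case: the frame finished just after the last refresh
    print(f"{fps:3d} fps: newest frame is ~{avg_age_ms:4.1f} ms old on average, "
          f"up to ~{worst_age_ms:5.1f} ms in the worst case")
```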
 
OCybrManO said:
False

The human eye is able to detect a difference between framerates of over 200fps depending on the speed of the objects relative to the viewer's point of view. I'm too tired to write up a 1000 word essay on it. You'll have to look it up on your own.

Who are you, f@@king Superman? It is hard for MOST people to pick up on anything more than 25-30 fps. DVD framerates are about 30 fps... "You'll have to look it up on your own"
 
Wow, you guys must have good eyes. I play Joint Ops at like a constant 20-30 fps unless I'm nowhere near anybody. I don't think it should be like that, but it hardly ever bothers me. I played Star Wars Galaxies at about 17 fps and Morrowind at 20ish sometimes. The only game I have checked that gave me a great framerate was DoD, and that was 70 fps.

My specs: AMD Athlon XP 2500, 512 MB RAM, Radeon 9600 Pro 256 MB. Should my FPS be that low?
 
Yeah, I've heard that fighter pilots can detect differences in very high frame rates, like 200 fps.
 
Krynn72 said:
Wow, you guys must have good eyes. I play Joint Ops at like a constant 20-30 fps unless I'm nowhere near anybody. I don't think it should be like that, but it hardly ever bothers me. I played Star Wars Galaxies at about 17 fps and Morrowind at 20ish sometimes. The only game I have checked that gave me a great framerate was DoD, and that was 70 fps.

My specs: AMD Athlon XP 2500, 512 MB RAM, Radeon 9600 Pro 256 MB. Should my FPS be that low?

Dunno. My old comp (AMD 1.2 GHz, 256 MB, GeForce2 MX 200/400) ran Morrowind pretty flawlessly. I usually don't check framerates, as they're often overrated. I have, however, noticed a difference in my aim (it's not just a slight bit better, or so my kill score says) since I changed computers three weeks ago.

Edit: My point is that your computer SUCKS, but it shouldn't. I bet you've got some crap on your computer you need to clean out. Get Spybot Search & Destroy and use it, then run a defrag on your OS disk and the disk your game is on.

Edit 2: Joint Operations is, strangely enough, poorly written. It takes way more power out of your computer without much showing up on your actual screen. They should have started over with a new engine. Hmm... so don't mind your FPS in JO.
 
Is it better playing at 800x600 with all the details cranked up, or 1024x768 with the details on low/medium?

Man, Valve should release some damn benchmarks. The game's practically done; it would put a lot of people like me out of their misery wondering how HL2 will run.

I've got an AMD 2400+
768 MB RAM
256 MB 9600 XT
 
I have a P4 3.2, 1 GB DDR, MSI GeForce FX 5900 Ultra, 120 GB Seagate Barracuda HD.

I run all my games at 1024x768 res, 8x AF, max everything, forced PS and VS 2.0, and mipmap LOD set to -1.5 (sharper textures). I rarely get under 60 fps in any of my games, and my image quality is better than it was on my Sapphire 9800 Pro.
 
guise said:
Is it better playing at 800x600 with all the details cranked up, or 1024x768 with the details on low/medium?

Man, Valve should release some damn benchmarks. The game's practically done; it would put a lot of people like me out of their misery wondering how HL2 will run.

I've got an AMD 2400+
768 MB RAM
256 MB 9600 XT

... You already got the specs off of VALVe...

Min. Spec:
1.2 GHz
256 MB
DirectX 7.0 Compatible
OS 95/98/NT?/2000/XP

Recommended:
2.4 GHz
512 MB
DirectX 9.0 Compatible
OS 2000/XP/+??

And I'd probably play at 800x600 with high details rather than a bigger resolution... =/ Still, though, I won't have to make that choice, so I'm happy. :)



nc17 said:
I have a P4 3.2, 1 GB DDR, MSI GeForce FX 5900 Ultra, 120 GB Seagate Barracuda HD.

I run all my games at 1024x768 res, 8x AF, max everything, forced PS and VS 2.0, and mipmap LOD set to -1.5 (sharper textures). I rarely get under 60 fps in any of my games, and my image quality is better than it was on my Sapphire 9800 Pro.

Gargh, are you talking shit about my new Sapphire 9800 Pro? GARGH! I'm okay with my comp though :p
 
A-Train said:
Who are you, f@@king Superman? It is hard for MOST people to pick up on anything more than 25-30 fps. DVD framerates are about 30 fps... "You'll have to look it up on your own"
Alright, everyone STOP comparing video framerates with PC framerates! They work on vastly different principles.

A video camera works in a very similar way to the human eye. Each frame the camera records includes the light gathered over a certain amount of time. If an object moves during that time, it leaves a trail behind it (motion blur). Each individual frame shows a passage of time.

A PC game's frame is how the world looks at an exact point in time. There is no proper motion blurring in real-time games (at least, not yet). You can't be sure that something is moving by looking at one of the frames.

There are several ways to get similar effects in real life. If you took an ultra-high-speed camera that is 50,000 times faster than a normal camera (the faster the camera, the less blurring there is in each frame) and kept only every 50,000th frame, you would get a similar effect. You could also use a strobe light with really short pulses. Yet another way is to wave an object in front of your monitor (when it is showing something bright). If you do that you will see a series of images with gaps between them (the size of which depends on the speed of the object and your refresh rate) instead of the blurring effect seen in normal lighting.
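Here is a toy Python sketch of that camera-versus-game difference (the dot's speed, the framerate and the sub-sample count are made-up assumptions): the "camera" frame integrates many positions across the exposure and produces a streak, while the "game" frame keeps only one sharp position.

```python
# Toy model: a dot moves horizontally at constant speed. A camera frame
# integrates its position over the whole exposure (a blur streak); a game
# frame samples a single instant (a sharp dot). All numbers are examples.

def position(t, speed=2000.0):
    """Horizontal position in pixels of an object moving at `speed` px/s."""
    return speed * t

FPS = 30
EXPOSURE = 1.0 / FPS      # assume the shutter is open for the whole frame
SUBSAMPLES = 16           # how finely we integrate across the exposure
FRAME_START = 0.5         # look at the frame that starts at t = 0.5 s

# "Game" frame: one instantaneous sample -> a sharp dot at a single position.
game_dot = position(FRAME_START)

# "Camera" frame: positions sampled across the exposure -> a streak.
samples = [position(FRAME_START + EXPOSURE * i / SUBSAMPLES) for i in range(SUBSAMPLES)]
streak_start, streak_end = min(samples), max(samples)

print(f"game frame:   sharp dot at x = {game_dot:.0f} px")
print(f"camera frame: blur streak from x = {streak_start:.0f} px to {streak_end:.0f} px")
```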
 
I have a:
P4 2.53 GHz
513 MB RAM
64 MB GeForce4 Ti 4200
Sound Blaster Value... blah blah blah

What settings should I be able to run well on? **AND, I consider 30 fps my recommended fps; I can live with 20, but... well... :)
**AND, how much of an improvement would I *probably* get if I upgraded my gfx card to a 9800 Pro/XT?
 