At what fps would you consider the game to be unplayable?

nc17 said:
Anything under 40 fps annoys me in current games, so it will probably be the same in HL2.

But I haven't gotten under 60 fps in any of my games since I bought my 5900 Ultra.

Hehe, you should prepare for a letdown... :p
 
Ehh, I could play Far Cry at medium settings with my MX440. Don't know what fps I had, but definitely above playable. Maybe 30-something.
 
Shuzer said:

I should get around 40-50 with my 9800 Pro versus the X800's 50-60. I'm praying hard.

AJ Rimmer said:
Ehh, I could play Far Cry at medium settings with my MX440. Don't know what fps I had, but definitely above playable. Maybe 30-something.

You played Far Cry on an MX440?? :eek: :eek:


alan00000 said:
I'm a nice guy, so I just give my old stuff away to my friends for free, because I want them to enjoy games the way I play them. For example, I purchased an X800 Pro and gave my friend my old 9800 Pro for free, plus my 19" monitor too. And I gave a guy at work a 512 MB memory stick and an AMD 2800+ CPU, motherboard and all... for free. I like to make people happy. What do you guys think?

Could I be your friend? When you get the next-gen video card, I could possibly get the X800! WOOT! :cheers:
 
ferd said:
Yup, the human eye can't register things above 30-40 fps.

But when you move your mouse across the screen fast, it feels a little bumpy.
 
sup2069 said:
You played Far Cry on an MX440?? :eek: :eek:
Yeah... the Far Cry demo. Actually, I think I had minimum on some things, but mainly medium. It only lagged during firefights with lots of dudes.
 
That's amazing. I used to have an MX440 myself last November. Pretty sweet card.
 
I think unplayable would be 10 or under.

*Will never be one of those "WHAT? IT'S UNPLAYABLE IF I DON'T GET 130 FPS" kind of people*
 
Annoying @ <30
Unplayable @ <15

Going to get the X800 Pro so I can enjoy smooth, full-detail, high-fps goodness. Hopefully at 1024x768 at the very least, preferably at 1600x1200.
 
1024 is great in my opinion, but then, I've never played at a higher res. :/
 
I just checked my FPS in Far Cry with everything as high as it could go, and I got between 25-40 FPS. I think anything under 15 is irritating to the eye.

My specs are:

3.0 GHz HT
512 MB PC2100 (it's crap)
ATI Radeon 9800 128 MB

I am hoping Doom III runs well :p
 
The average framerate doesn't matter that much; if it's above 24 the human eye struggles to see the difference, and above 30 it simply can't.

Where the problem lies is that if you're running around at 28 fps, it's not going to stay there; it's going to drop dramatically every now and again, and that's when you notice it.

That's why it feels bumpy when you move your mouse at low frame rates. 30 frames EVERY second is not bumpy; that's pretty damn smooth. Think about it.

However, the frame rate might drop from 30 fps to 3 fps for just a tiny moment as you're playing. This is what makes playing at 20-odd fps feel unplayable.

And especially because we all know that the frame rate you get is inversely proportional to how exciting that part of the game is :(
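To put some rough numbers on that (a minimal Python sketch with a made-up frame-time log, not a measurement of any real game): the average fps can look fine while a single long frame is the thing you actually feel.

# hypothetical per-frame render times in milliseconds: 99 smooth frames plus one big hitch
frame_times_ms = [33] * 99 + [333]

avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
worst_frame_ms = max(frame_times_ms)

print(f"average: {avg_fps:.1f} fps")        # ~27.8 fps, sounds playable on paper
print(f"worst frame: {worst_frame_ms} ms")  # one 333 ms (3 fps) frame is the stutter you notice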
 
WhiteBoy said:
I just checked my FPS in Far Cry with everything as high as it could go, and I got between 25-40 FPS. I think anything under 15 is irritating to the eye.

My specs are:

3.0 GHz HT
512 MB PC2100 (it's crap)
ATI Radeon 9800 128 MB

I am hoping Doom III runs well :p
Yeah... it's not. You've got the Doom 3 minimum specs there, buddy.
 
gunstarhero said:
I thought anything under 30 fps and the eye would be able to pick up on it. Is that true? Because if it is, then I would set your sights on having your computer pull 30 fps in a firefight, not just walking around.

Your eye goes at 24-26 FPS; if it's faster, your mind makes up the rest of the images (so that it "moves"). Basically you only need 30-35 FPS for a smooth experience... so **** 100 fps.

Edit:
Note: You need a smooth experience to aim well... or you'll have to wait for frames to catch up to you.
 
Dead-Inside said:
Basically you only need 30-35 FPS for a smooth experience... so **** 100 fps.

You need higher than 35 to compensate for drops in fps; 30-35 should be the absolute minimum.
 
Crusader said:
if it's above 24 the human eye struggles to see the difference, and above 30 it simply can't.
False

The human eye is able to detect a difference between framerates of over 200 fps, depending on the speed of the objects relative to the viewer's point of view. I'm too tired to write up a 1000-word essay on it. You'll have to look it up on your own.
 
bliink said:
You need higher than 35 to compensate for drops in fps; 30-35 should be the absolute minimum.

Hmm... True, but I didn't count those in. 35 is a smooth experience. With bumps it might not be, but that's not what I was talking about.

@OCybrManO
Well, the eye picks up 25 images per second, but WHICH 25 images it picks up makes a big difference, for your aim specifically. I don't know my aim in CS, but when I got a new computer I certainly got a smoother experience (and better aim); that was my previous computer upgrade, though, not the latest.

Edit: It doesn't pick up all those images, but it does pick up different ones.


Bottom line: 60+ fps will get you through any game with the best possible performance. You can go higher, but 60 is smooth (the highest setting, if you will).
 
It's not just the FPS that a faster computer boosts, but the response time too.
 
TV works at 24 fps. So does cinema (although cinema uses motion blur, which is a bit of a cheat).
 
Ehm... no, it doesn't. Maybe in the States. But then again, your network sucks, so no surprise there.
 
OK then, 24 fps is the most common framerate for TV, if you like that better :p

"Yes, there have been studies. It's generally accepted that the average person doesn't see perceptible change above 30 FPS.

However, that's an average."
 
False

The human eye is able to detect a difference between framerates of over 200 fps, depending on the speed of the objects relative to the viewer's point of view. I'm too tired to write up a 1000-word essay on it. You'll have to look it up on your own.

We're talking about games here. I am aware that fighter jet pilots and the like have been known to detect and identify enemy jets in 1/123rd of a second; however, we're just talking about computer games.
 
Dead-Inside said:
Well, the eye picks up 25 images per second, but WHICH 25 images it picks up makes a big difference, for your aim specifically.
The biological nature of the human eye causes it to receive input in a way that can't really be measured by a "framerate" (I'm not going to go into the biology behind it in this post). Even if the eye sends updates to the brain at a given interval, it still includes all of the light that was received during that time in each update (not just the light that it receives at the exact moment of the update). This extra data is what causes the effect of motion blurring.

Your eye will pick up every frame your monitor can put out. The higher the framerate is, the more smoothly the series of images will appear to move (because the distance traveled by a specific object in the time between frames is smaller). If the object is traveling at a high speed, its movement will look less smooth than that of an object moving at a slow rate, even when both are being viewed at the exact same number of frames per second.

If the framerate is high enough, the frames will appear to blur together in a fluid movement rather than having visible gaps between where an object is in one frame and where it is in the next frame.
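That "distance traveled between frames" point is easy to put in numbers. A minimal Python sketch (the pixel speeds are just made-up examples): the per-frame jump is speed divided by framerate, which is why a fast-moving object looks choppier than a slow one at the same fps.

# per-frame jump in pixels = on-screen speed / framerate (example numbers only)
def gap_px(speed_px_per_s, fps):
    return speed_px_per_s / fps

for fps in (30, 60, 120):
    fast = gap_px(2000, fps)   # something whipping across the screen
    slow = gap_px(200, fps)    # something drifting slowly
    print(f"{fps} fps: {fast:.0f} px jump for the fast object, {slow:.0f} px for the slow one")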
 
I play BFV at an average of 20-30 FPS with all settings at Medium and shadows off.

Running at:
2.0 GHz
512 MB RAM
GeForce 3

And BFV still looks DAMN good when I play. I don't notice the low FPS at all; it looks fluid (I did console.showfps 1 to make sure of what it really was), and the graphics are great even at Medium. So hopefully, if I can't upgrade to a 9600/9800 soon, my GeForce 3 will at least run it at a mediocre level.
 
If you don't have a CPU over 3.0 GHz, an ATI 9800 card or above, and 1 GB of RAM, you will not run the game the way it's meant to be run, just like Gabe said.

Hmm, just a quick side note: how different is RDRAM from DDR RAM? I have RDRAM and it costs a lot more than DDR, so I assume RDRAM compares to a higher quantity of DDR. If you had 512 MB of RDRAM, does that mean it performs like 768 MB of DDR? Because 1 GB of RDRAM costs up in the $400-$500 range... Anyway, I may be totally wrong and just thought to ask, because I want HL2 to run kick-ass on my comp.
-------
ATI Radeon 9800 Pro 128 MB
P4 2.4 GHz
512 MB RDRAM

Eh?
 
Even if the eye sends updates to the brain at a given interval it still includes all of the light that was received during that time in each update (not just the light that it receives at the exact moment of the update). This extra data is what causes the effect of motion blurring.

Interestingly, there is no update message sent to the brain; the retina actually constitutes part of the brain itself. However, if I may use this term in a discussion of nature, the processing techniques then applied within the brain itself are what determine how we perceive the data.

The brain has no idea of the concept of frames in reference to its own visual stimuli; it simply merges the most recent data so that we can tell what's going on in a situation where the stimuli are changing.

In essence, it is more pertinent to discuss the minimum frame rate at which the brain can successfully and believably merge the incoming data from photons, both the luminosity from the rods and, of course, the data from the cones that is translated together to produce colour, rather than the maximum frame rate the eye can detect.

As mentioned earlier, a fighter jet pilot can detect the shadow of a plane traversing his view in 1/123rd of a second, so if we are thinking in terms of frames, it is natural to assume this is detection of 123 FPS input, which it could be argued is correct. However, simply because the human eyes have the ability to detect a change so small does not mean that they cannot comfortably process and merge lower frame rates and suffer no ill consequences.

Much of this ill-defined terminology and the utter lack of any frames-per-second system in the human eye or brain is what causes the confusion when discussing such matters. In fact, you could say it is wrong to even bring the subject up for discussion, since there is in actual fact no absolute answer to the question.

Indeed, no two studies agree on a) what property of the eye/brain system exactly they are testing for that can yield information of the frame-per-second data rate, or b) what the results mean once they have gotten them.

Which is why no results are readily available on the internet.

Furthermore, to make matters more complicated, separate components of the eye may react on different time scales, and it is already apparent through earlier tests that the range of colours and luminosity in a "frame" completely changes how fast it is detected and processed.

For instance, if you are sitting in a dark room and all the lights go on, the duration for which the lights must be in the on position for you to detect it, is less than the duration the switch must be in the off-position if you are in a light room trying to detect a period of darkness.

In short, there is no clear-cut answer, but the average person struggles to see much difference on a computer screen displaying above 30-40 fps.

However, this is another argument entirely from the matter of input rates. FPS in a game also controls the rate at which the screen can be updated with your actions, which creates a latent phase between mouse movements and visual stimuli (rough numbers below).

However, since I find it unlikely that this is more stringent than the erroneous concept of a frame rate for the eye, it's not worth discussing at length.
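For a rough sense of the input-latency side anyway (a minimal Python sketch, nothing game-specific assumed): the frame interval alone puts a floor on how stale the image you are reacting to can be, before adding anything for the mouse, the game logic, or the monitor.

# frame interval = 1000 ms / fps; your input can wait up to one full interval
# before it shows up in a rendered frame (ignoring all other sources of lag)
for fps in (15, 30, 60, 100):
    interval_ms = 1000 / fps
    print(f"{fps} fps -> up to ~{interval_ms:.0f} ms between an action and the next frame")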
 
And yet another debate on FPS issues... Stop comparing video games to TV; it's not the same bloody thing.
 
Maybe I have superhuman sight, but I absolutely can tell the difference between 60 FPS and anything lower, and anything higher is only useful for bragging rights.

As far as playability goes, I think anything over 25 FPS is just fine, as long as it's consistent. That's really the big difference for me: my eyes obviously prefer something closer to 60 FPS, but my brain will settle for just about anything provided it's reasonably constant. A good example of this would be Halo on the Xbox. It never went above 30 FPS, which IS noticeable, but it also very rarely ever dropped below that, keeping me in the game.

When Unreal II came out, I settled for about 27 FPS average. Fairly consistent, and besides the fact that the game was inherently boring, totally playable.

I think Source, at least with DirectX 9 effects disabled, is comparable to the current Unreal engine as far as CPU and GPU load goes, although I'm not sure how much of a load HL2's physics will demand. In fact, I think a CPU upgrade may even be a wiser choice in the long run than a graphics card. If you're running below a 2.0 GHz CPU and have around a GeForce 4 series card or an equivalent, upgrade the CPU. You'll get way more life out of the game in playability, and then get the video card near the end of its product cycle, when it's dirt cheap.

For my own part, I just went from a 2.4 to a 3.2E with HT and the difference was very favorable. With the same old 9700 Pro, Far Cry went from random stuttering to a very diggable and stable experience.

Then again, I also got a X800 Pro soon after, which is generally twice as fast as the old card, at least on DX9 games. This is a card for detail whores, which up until now really wasn't me. Things like AA and AF were options I might flip on for a bitchin' screenshot, but I have always been a fan of smooth over pretty, no matter what. This card gives me a third option: Smoother and Prettier. Still, if I had to give up either the CPU or the card, I'd give up the card. Bring this up again in like 2 weeks and I may have a different opinion though ( :bonce: <--- Doom zombie)
 
Having successfully played such games as Morrowind and the original HL at framerates < 10 FPS in the past, I'd have to say anything under 7 FPS is unplayable for me. "Playable" not being equivalent to "totally enjoyable"...
 
I don't play anything much apart from HL mods at the moment, so when I upgraded I went from 30-odd fps to 100 fps, and it maybe looks a little bit higher quality, too.

My rig is just wasting away 'til HL2 comes out; I might have to get Doom 3 to tide me over...


Hehe, can you imagine how many sales Doom 3 will get on that basis? id were so lucky Doom came out first ^^
 
20 fps or more would be playable, but I would tweak it until I got at least 30+.
 
Bah! You kids are spoiled... I remember the old days when the early 3D games would chug along at 2 fps if you were lucky. And you know what, they were like the most popular games too. Nobody really minded.

You know, if you turn down your x2462497 AA and use a smaller resolution, all the modern games play brilliantly. You can't even tell the difference at such high resolutions and high AA settings anyway. It's just showing off to yourself (since nobody else cares).

You should all go have a game of Driller on a Spectrum sometime; you'll appreciate anything above 10 fps then :p
 
Fenric's right... I reckon I used to see about 5 FPS on a mere demo of Frontier on my A500, and that thing had none of the AI running...
 