What target fps & res will satisfy your needs in HL2?

I guess I'll be playing at about 40-50 fps in 1280x1024, with 4xAA and 8xAF.

And guys, 60 FPS is insane; you can run a game smoothly at like 35 FPS.
 
...And guys, 60 FPS is insane; you can run a game smoothly at like 35 FPS.


That's what I was thinking. And isn't it true that the eye can only see a certain number of frames per second anyway? Or did I hear wrong?

Anyway, while these specification threads go over my head somewhat, apparently I'll be satisfied with 30 fps at 1024x768 with all the bells and whistles. Is there a significant difference when bumped up to 1280x1024, though?
 
30 is OK, but if you get a Strider and 5 Combines all at the same time, then your 30 fps will become 10; that's what I'm worried about.
 
Originally posted by ShortStuff
That's what I was thinking. And isn't it true that the eye can only see a certain number of frames per second anyway? Or did I hear wrong?

The answer is 15 frames per second; all movies run at 15 fps, OK good. :)
But that's for films, not games :)
 
Films run at 24 frames per second, but they show every frame three times for smoothness.
 
I'll be happy if the framerate keeps between 20-40 fps at 1024*768.
 
Anyway, what GPU should I buy for the following PC:
a Radeon 9600 Pro, which is very good in price, or should I
spend twice as much for a Radeon 9800 Pro or something?

I've got a P4 2.4 GHz with HT and 800 MHz FSB, fast 512 MB DDR
(but with an nVIDIA Quadro NVS currently built in... it's a Dell
Precision 360 and it looks just like that thing VALVe used for
the presentation, although I know they didn't have an nVIDIA in it :)
 
20-60 fps @ 800*600 with full detail, 2xAA and 16xAF would be good for me, since I will turn off some details or go down to 640*480 for the sake of a good, consistent framerate. But I doubt that I will have to go down that low.
 
Considering that television is 24 frames per second, I think anything above that is pretty superficial.
 
Yes you're right Java. When a game gets choppy it's like 10 fps. Not 30.
 
Originally posted by HaloEleven
I want to be running HL2 at at least 60 frames per second and at 1280x1024 with at least 2x AA with DX9

P.S. Check out my reply to the training thread, see what u think.

The FPS in the Valve benchies are at 1024x768 with no AA or AF, using a Pentium 4 2.8 if I remember correctly. So good luck getting higher fps at higher res with the same system :p
 
Originally posted by CrazyHarij
Yes you're right Java. When a game gets choppy it's like 10 fps. Not 30.

Exactly. As long as I can turn all of the features on and not dip below 24, I'm happy. ;)
 
Originally posted by CrazyHarij
Films run at 24 frames per second, but they show every frame three times for smoothness.

Oh right, that must have been what I was thinking about, thanks for clearing that up. Interesting info.
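
Just to put rough numbers on that (a quick Python sketch; the three-flash figure is assumed from the usual three-blade projector shutter, not something stated in this thread):

[code]
# Quick sanity check on the "show every frame three times" point.
# The three-flash figure is an assumption (typical three-blade shutter).

frames_per_second = 24   # film frame rate
flashes_per_frame = 3    # assumed three-blade projector shutter

flicker_rate_hz = frames_per_second * flashes_per_frame
print(f"unique frames: {frames_per_second}/sec, flicker rate: {flicker_rate_hz} Hz")
# -> unique frames: 24/sec, flicker rate: 72 Hz
[/code]

So the projector only has 24 unique images a second, but your eyes see a much faster flicker, which is part of why it feels smooth.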
 
Originally posted by JavaGuy
Considering that television is 24 frames per second, I think anything above that is pretty superficial.

Why won't you just ADMIT that you CAN notice the difference between 30 fps and 100?
omg
I even notice the difference between 70 and 100... and that's no joke.
Maybe I don't 'notice' it with my eyes, but I feel it, the motion and everything.
 
Yes, I do notice a difference between 30 and 100, but for ****'s sake, the game is not unplayable at 30 FPS.
 
Yeah, but the benchmarks people are using to gauge their systems report average framerates. So of course, if your average is 30, chances are you dipped to around 20 at some point... which sucks.

I bet most people get 60-100 fps in HL1/CS now. Go into your console and type fps_max 24. You'll notice a difference in how the game feels, even if you can't see the difference.

And you'll also notice a difference in how the game feels between fps_max 50 and fps_max 100.

Movies also utilize something called motion blur... games don't (yet).

So 24 fps in a game isn't quite equal to a movie or TV.
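
To put rough numbers on the averages point, here's a small Python sketch with made-up frame times (not real HL2 benchmark data):

[code]
# Made-up per-frame times (milliseconds) for a scene that averages
# roughly 30 fps but stutters when the action picks up.
frame_times_ms = [30, 31, 29, 30, 32, 50, 55, 30, 29, 31]

fps_per_frame = [1000.0 / t for t in frame_times_ms]

# Average fps = total frames rendered / total time taken.
avg_fps = len(frame_times_ms) * 1000.0 / sum(frame_times_ms)
min_fps = min(fps_per_frame)

print(f"average: {avg_fps:.1f} fps")   # ~28.8 fps -- looks fine on paper
print(f"minimum: {min_fps:.1f} fps")   # ~18.2 fps -- the dip you actually feel
[/code]

Same idea as the benchmark numbers: a decent-looking average says nothing about the worst moments.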
 
Originally posted by Xenome
Why won't you just ADMIT that you CAN notice the difference between 30 fps and 100?
omg

LMAO chill dude, we're talking about framerates.

Besides, not everyone CAN notice it, so that's why there's nothing to ADMIT.

OMGWTF!!11!

Besides, the only people I've seen notice it are the ones who pay more attention to their FPS counter than actually playing the game. When I don't have the counter on, I don't notice it until it gets choppy... and when it gets choppy, I turn the counter on to see where it's at: I usually notice it at 20FPS.
 
I can notice a difference between 30 and 100... that's a big jump.

But your eye really only starts noticing under 20-25 fps. You've got to recall your eye is just like a camcorder as well: your eye takes pictures. Anything over 60 will look no different from 80 or 100, but 30 is near that 20-25 cutoff mark, and you may see a difference.


I want to run HL2 at 60 fps, 1024x768, because 60 gives you a nice comfort zone for when things might get heavy.
 
Go into Half-Life... play a game and put fps_max 30 on.
Then play with fps_max 60.

If your system is capable of maxing out at 60 100% of the time, no matter what's on the screen, you'll notice a difference in gameplay between 30 and 60... framerate counter on or not.
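
A rough way to see why the caps feel different even when the picture "looks" fine: the time each frame sits on screen roughly doubles every time the cap is halved (quick Python sketch, nothing HL-specific):

[code]
# Milliseconds per frame at a few fps_max caps. The bigger the number,
# the longer each frame (and your input) is stuck on screen.
for cap in (100, 60, 50, 30, 24):
    frame_time_ms = 1000.0 / cap
    print(f"fps_max {cap:>3}: ~{frame_time_ms:.1f} ms per frame")

# fps_max 100: ~10.0 ms per frame
# fps_max  60: ~16.7 ms per frame
# fps_max  50: ~20.0 ms per frame
# fps_max  30: ~33.3 ms per frame
# fps_max  24: ~41.7 ms per frame
[/code]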
 
I hope to get 10 fps on my 486 w/ my Voodoo 2...

and 100 fps on my o/c'd Athlon 3200 with an o/c'd 9800 Pro, 1024x768, no AA, 4xAF,

and 60 fps on my Athlon 2500 o/c'd to 2.2 GHz, 9700 Pro at 1024x768, no AA, no AF,

and 30-40 fps on my XP 1700 (1.5 o/c'd to 2.0) w/ GeForce 4 Ti 4600,
800x600, no AA and no AF.

All but one of these should work... can you guess? :D
 
Originally posted by Maskirovka
If your system is capable of maxing out at 60 100% of the time, no matter what's on the screen, you'll notice a difference in gameplay between 30 and 60... framerate counter on or not.

It's so nice having people speak for me, relieving me of the need to draw my own conclusions.

I already stated that above 24 FPS, I can't tell the difference. That's a fact. Maybe my eyes are shit, or whatever, but the fact is, I can't tell the difference above 24 FPS.

I don't understand why people insist on jamming their experiences down everyone's throat as if they were law.
 
Anything below 20 fps will bug me. I'm hoping for a smooth 30 @ 1280x1024... some nice 4xAA if my system can afford it.

AMD 2700+ XP
Radeon A-I-W 9800 Pro 128MB
nForce 2 Asus A7N8X
1024 MB Corsair PC3200 (DDR333-CL2)
Western Digital 120GB SE 8MB cache
 
I'm shooting for above 30 fps at 1024x768. If I get it, then I'll turn on the special stuff: AF first, then, if it's still good, bump up the AA until it's choppy, then turn it down till it's a steady 30 fps. I'll be lucky though. If I can't get that, I'll drop to 800x600. I have a GeForce 4 Ti 4200 128 MB overclocked as much as I can without burning it up.
 
Originally posted by JavaGuy
Considering that television is 24 frames per second, I think anything above that is pretty superficial.

Two things make that frame-rate work for film: First is the motion blur. Second is that there's a tiny *black frame* (although not an actual frame) between each image frame. Because games don't have a natural motion blur, they can't get away with 24 fps in the same way.
 
You all say that an FPS under 40 is bad and choppy, but that isn't true.
A cinema movie is around 25 FPS and that's not choppy. So 30 FPS is perfect for a game. You start to see it when it drops down to 22 FPS or so; this depends on the action and movement and your own eyes. :bounce:


I want to play at 1024 with 4xAA and 16xAF at least.
 
I will be happy with a constant 30+ FPS in 1024x768 (I rarely bother with higher res., although I could) with everything turned up.

From the benchmarks, my system (P4 3 GHz, 400 MHz FSB, 1 GB RAM, 9800 Pro) should average about 50-60 FPS at those settings, so I should be happy.

If it falls below 30 I’m going to start feeling like an FX owner though.
:cheese:
 
Everything maxed out @ 1024x768 - hopefully 40-50+ fps (things are smooth at 30 fps anyway, so I'm not exactly gonna freak out if it's lower than 40).

Athlon XP 2600+
768mb DDR PC2100
Radeon 9700 pro 128mb oc'ed to 361/341
MSI K7N2-L nForce2
60gb Seagate HD @ 7200rpm
 
Originally posted by Robson
Two things make that frame-rate work for film: First is the motion blur. Second is that there's a tiny *black frame* (although not an actual frame) between each image frame. Because games don't have a natural motion blur, they can't get away with 24 fps in the same way.

Just out of curiosity, how does this "black frame" assist in the visual smoothness of TV's 24FPS? I see what you're saying about the motion blur, though.
 
The key here is that although 30 FPS looks pretty smooth, an average of 30 FPS often doesn't, because when the action hots up you can easily lose 20 FPS in many games, leaving things looking very jerky for a few seconds.

That's why it's best to average 40-50 FPS at least. But it does depend on how much performance drops in busy areas, which is more a level design issue.
 
I'll keep turning on features until I get 30fps minimum. That is more than good enough for me (I played through Unreal 2 on my old graphics card at around 22 fps).
 
35-40 FPS, 1024x768, 4xAA (maybe), 8xAF (maybe). Max details.

All this depends on when I get my new rig, though.
 
Originally posted by JavaGuy
Just out of curiosity, how does this "black frame" assist in the visual smoothness of TV's 24FPS? I see what you're saying about the motion blur, though.

IIRC, the "black frame" isn't an element of video. Video is 30 fps, and also features a natural motion blur.

The "black frame" was a pragmatic solution from back in the days when film technology was first being developed. They found that if they just showed the images straight, one to the next, this created some sort of visual/perceptual problem. (Sorry I can't be more specific than that!) Thus, they show a frame, then turn it off for an imperceptibly short period of time, then show the next frame.

More on the "persistence of vision":
http://www.wikipedia.org/wiki/Persistence_of_vision

:thumbs:
 
Standard television is 30 frames per second.

http://www.audiovideo101.com/dictionary/dictionary.asp?dictionaryid=368

And like people pointed out, motion blur is the reason why movies look good at 24 fps (even though some claim 24 fps is not enough... and why it sometimes looks crappy if you sit all the way in the front row).

I'm pretty sure interlacing has something to do with why TV looks OK at 30 fps... something about drawing the two (interlaced) images gives it an effective refresh rate of 60 Hz.

Your brain can't comprehend the images faster than 25-30 fps, so it blurs them together, which is why 60 fps looks better to most people than 30.

Also, you know your fps is gonna be worst in multiplayer... in big fights with lots of smoke and other things like that happening... so that's why people are fussing about higher framerates. They just mean average framerate, because the higher your average, the less the chance you'll drop below the minimum you consider playable.
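
Rough numbers behind the interlacing bit (standard NTSC figures as I remember them, so treat them as an assumption rather than gospel):

[code]
# Each interlaced TV frame is delivered as two fields (odd lines, then
# even lines), so the screen updates twice per full frame.
ntsc_frames_per_second = 30   # nominal; the exact NTSC rate is ~29.97
fields_per_frame = 2

field_rate_hz = ntsc_frames_per_second * fields_per_frame
print(f"~{field_rate_hz} fields per second")   # ~60 Hz effective refresh
[/code]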
 