40fps.

Originally posted by nsxownzme
Shut up, dumbass.
Use the Internet; research.

Stfu, I ain't no computer genius and I'm too lazy to look the shit up
 
Isn't he using the Internet? :p OK, I agree in one way, and being lazy is not a good thing... unless it's because you're playing HL2 ;)
 
Well, I did look, but found no article answering my specific questions...
 
I must say I can agree with both: asking on a forum qualifies as using the Internet; being lazy is another thing... I think chris_3's question was a good one for those who feel like answering it. This forum is full of people with just this knowledge; if no one will answer it, then no one will answer it. Good thing he didn't make a new thread about it, think about that one! :)
 
I entirely agree. People want to have the max FPS possible on their machines, which is normal, but when they start thinking about upgrading because of one game that runs at 30 FPS, it gets ridiculous.
If it weren't for those people, we wouldn't have flagship products, so more power to them.
 
I have a 3GHz with a 9700 Pro and I get 60 to 100 FPS in every game, with all the settings as high as they will go.
 
To the people who say you can't notice the difference between XX FPS and XX FPS: well, you don't know jack shit.

First off, everybody will get different "results" because no one has the same hearing, seeing, tasting, touching, or smelling abilities.

One woman can smell a cigarette burning 3 rooms away, while I wouldn't be able to smell it if it burned my crotch to a cinder.

Some people can taste the difference between an 1899 and a 1900 Champagne while others can't.

You get my drift...

You can exercise your senses and get better. They can become more or less acute and precise during your lifetime. Age, sex, and a number of other factors will make your different senses better or worse.

Concerning framerates:
I can personally tell you the difference between 60 and 90 fps. I can even tell the difference between 70 and 90. But from about 75 and up, I can't tell rates apart.
Now, some people can't see any difference between 30 and 60. Others can see a difference up to 120+. It all depends on the person. For example, I can clearly see a fluorescent light flicker (it's about 60Hz).
Try changing your monitor's refresh rate. Anything lower than 85 is a pain in the brain for me. Others are fine with 60Hz and can't notice anything above it.
Some jet pilots can spot a single frame out of 200.

What is MORE IMPORTANT than FPS, though, is whether you can run it in sync with your monitor (vertical synchronization, V-Sync).
Running 99 FPS with V-Sync off can still look rough, for the simple fact that your monitor and your game will not refresh at the same time. This can cause some "tearing".
Running 60 FPS with V-Sync on is almost always smoother and better than 99 (or 300+) FPS with it off.
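
You can usually force V-Sync in the driver control panel, but for the curious, here's a minimal sketch of forcing it from code on Windows/OpenGL, assuming the driver exposes the standard WGL_EXT_swap_control extension (most do):

#include <windows.h>

// Signature of the WGL_EXT_swap_control entry point.
typedef BOOL (WINAPI *PFNWGLSWAPINTERVALEXTPROC)(int interval);

// Call with a current OpenGL context; returns false if the extension is missing.
bool EnableVSync()
{
    PFNWGLSWAPINTERVALEXTPROC wglSwapIntervalEXT =
        (PFNWGLSWAPINTERVALEXTPROC)wglGetProcAddress("wglSwapIntervalEXT");
    if (!wglSwapIntervalEXT)
        return false;
    return wglSwapIntervalEXT(1) != FALSE; // 1 = wait one vertical retrace per buffer swap
}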

The acceptable FPS also depends on what game (or type of game) you are playing. While playing an RPG or an RTS at less than 30 FPS can be good enough, for a first-person shooter or a driving game you would want at least 30 (that's if your system is fast enough).

Anyways, this was just to clear up that perception differs from one person to the next.

*Counter-Strike Flashback of running de_dust at 9 FPS*
 
Originally posted by PSX
If it weren't for those people, we wouldn't have flagship products, so more power to them.

I understand what you mean, but I wasn't clear in what I said previously: I think 30 FPS is fine; it's not really worth 500 bucks or euros for a brand-new 3D card. That's what I meant by ridiculous. :)

And also, chris_3, you ought to calm down a bit.
 
What my system will be: R9500 Pro, Barton 2500+, 512 MB RAM. I should get ~40 FPS at 1024x768, right?
 
A video image on a television is quite different from a computer-generated image. When a video image shows an object in motion you get a blur effect which looks quite natural, and it is hard to notice each individual frame (NTSC 29.97 fps, PAL 25 fps).

The current CG images in first-person shooters are rendered perfectly sharp frame by frame. This is very noticeable at 30 fps, but even at 100 fps a person with a trained eye will sometimes notice the flicker between frames in high-contrast images.

DirectX 9 compatible graphics cards do support motion blur, I wonder if HL2 is going to use this feature?
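
For reference, the old-school software way to fake motion blur (before any DX9 features) was the OpenGL accumulation buffer: render several sub-frames at slightly different times and average them. A rough sketch; RenderSceneAt is a made-up stand-in for whatever draws your scene:

#include <GL/gl.h>

// Hypothetical: clears the color/depth buffers and draws the scene
// as it looks at the given time offset (in seconds).
void RenderSceneAt(float timeOffset);

// Average 'subframes' renders spread across one frame interval.
void DrawMotionBlurredFrame(int subframes, float frameTime)
{
    for (int i = 0; i < subframes; i++)
    {
        RenderSceneAt(i * frameTime / subframes);
        if (i == 0)
            glAccum(GL_LOAD, 1.0f / subframes);  // initialize the accumulator
        else
            glAccum(GL_ACCUM, 1.0f / subframes); // add a weighted copy of this render
    }
    glAccum(GL_RETURN, 1.0f); // write the averaged image back to the framebuffer
}

The obvious downside, and probably why the Voodoo5 videos looked bad, is that you pay for the scene N times per displayed frame.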
 
Originally posted by SpuD
What my system will be: R9500 Pro, Barton 2500+, 512 MB RAM. I should get ~40 FPS at 1024x768, right?

If you overclock the 9500 Pro to 9700 Pro speeds, then you could probably get more than 40.
 
DirectX 9 compatible graphics cards do support motion blur, I wonder if HL2 is going to use this feature?
What about the Voodoo5 in Quake 3 with motion blur? Remember those videos? Personally, I think it's shit; why use it? But I can understand it in a racing game going 220mph, that's just natural, like in MotoGP or Midnight Club 2.

If you overclock the 9500 Pro to 9700 Pro speeds, then you could probably get more than 40.
How do you know? You don't have the game yet and can't say how many FPS someone is going to get. What's important is minimum FPS; average is just best for benchmarks.

As for overclocking the card, it varies from game to game, but most likely it'll yield some good gains.
 
My opinion on the whole FPS debacle (we will omit Hz for the moment):

However many frames a human eye can actually see, our brains aren't in sync with the output source (e.g. monitor, screen), so we cannot see all of the frames no matter what. When you up the framerate, you increase the number of frames you miss, but also the number of frames you do catch.
It's all luck, really. It depends how hard you are concentrating and whether or not it skips your own refresh rate.
Some people think 30 fps is the max; others think that we have a kind of continuous stream of vision. I think that we do have a continuous stream, but our brain can't really process it all at once.

Anyway, back on topic: I can notice my card dropping to 30 fps, and it looks like rubbish to me. Anything above around 80 fps is just wasted on me.

Same with looking at a 60Hz monitor (e.g. at someone else's house) after being used to 85Hz, and I'm sure 85Hz would bug me after being used to 120Hz.

It's all about what you're used to.
 
Originally posted by Archangel
Basically, movies are made with 26 frames per second (initially; nowadays they have probably gone up to 30 or something). You might notice the difference between 30 and 40 frames (skilled eye of a gamer), but anything above that is unnoticeable. (Besides, I loathe people who OC their system just to get a few more frames, and then brag about how cool that is.)
Actually, motion pictures are filmed at 24 FPS only because that is the slowest speed at which the eye sees the illusion of movement. Any slower than that and the eye can distinguish individual frames. But at that slow a shutter speed there is considerable motion blur, particularly when recording fast action. This motion blur actually enhances the look of 24 FPS, as it smooths out the individual frames, but you don't catch a lot of fine detail. Modern motion pictures are actually screened at 48 frames per second: each frame is displayed twice to reduce the amount of apparent flicker visible to the eye.

NTSC television is broadcast at 30 FPS, which is actually 60 interlaced fields per second. It's the interlacing that causes fine detail to buzz and flicker. This is most recognizable as the moire effect, the rainbow patterns that show up on things like suits with thin stripes.

The average computer monitor refreshes at a rate of 75 hertz (cycles per second). However, computer monitors display images using progressive scanning, which draws the entire image in a single pass. This is why computer screens generally appear sharper than your average interlaced television set and why they're able to display fine detail without producing the moire effect.

So when you really think about it, even if your graphics card is pumping out 100 FPS, you're only actually seeing 75 of those frames, since you won't see images faster than your computer screen can refresh. In practical terms, anything over 75 FPS is wasted.
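
The arithmetic behind that is worth spelling out; a few lines of C++ using the numbers above:

#include <cstdio>

int main()
{
    const double refresh_hz = 75.0;  // monitor refresh rate from the post above
    const double render_fps = 100.0; // what the card is pumping out

    std::printf("Monitor scans out a new image every %.1f ms\n", 1000.0 / refresh_hz); // 13.3 ms
    std::printf("Card finishes a frame every %.1f ms\n", 1000.0 / render_fps);         // 10.0 ms
    // Any frame finished between two retraces gets replaced before the monitor
    // ever scans it out (or shows up as a partial frame, i.e. tearing, with V-Sync off).
    return 0;
}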
 
Originally posted by PSX

If you overclock the 9500 Pro to 9700 Pro speeds, then you could probably get more than 40.
How do you know? You don't have the game yet and can't say how many FPS someone is going to get. What's important is minimum FPS; average is just best for benchmarks.

As for overclocking the card, it varies from game to game, but most likely it'll yield some good gains.


Dumbass, go back and read the posts.

Look at the words I used:

Could

Probably
 
Personally I'd rather play games locked at 30 fps at higher resolutions with all the visuals set to max detail. We all watch movies at frame rates of ~24-30 fps (depending on what country you're in and what medium you're looking at) all the time, and it's never uncomfortable to watch. Gamers with money are getting way too fussy. Anyone who complains when their games run below 60 fps is being pedantic. I suppose you'll never go to the movies now that you know they're shown at 24 fps? I think not.

As for people being able to notice frame rate differences... most people probably can notice differences, but they are very slight. People who feel sick looking at 60Hz are simply spoiled by higher refresh rates. Until recently I hadn't even had a monitor that could display higher than 60Hz, and it was never a problem. If you're getting flickering, it's not the refresh rate that's to blame; try changing your V-Sync settings. I've recently compared games (running at very high fps) at monitor refresh rates of 60, 75 and 85 (to test my new monitor), and while there's a minor but detectable difference, it's not enough to make any setting uncomfortable to look at.
 
I think Valve said that they were targeting a stable FPS across all environments rather than just a max FPS in a slow scene. Seems like a reasonable approach: instead of smooth one minute, choppy the next, you get a nice stable experience, even if it isn't the absolute fastest FPS you could get.

I'm sure there will be some features that won't hurt (visually) too much to turn off but will give big performance wins, and clearly, if the game will play fine on even low-end cards, I'm not going to worry too much if I can't have infinite-detail brush draw distance right off the bat.
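Nobody outside Valve knows how they actually do it, but one plausible way to hold a stable frame rate is to shed optional detail whenever a frame blows its time budget. A hedged sketch; RenderFrame, the detail levels, and the thresholds are all made up for illustration, not anything from Source:

#include <chrono>

// Hypothetical engine call: draws one frame at the given optional-detail level.
void RenderFrame(int detailLevel);

int g_DetailLevel = 3; // 3 = all optional detail on, 0 = everything optional off

void RunFrame()
{
    using clock = std::chrono::steady_clock;
    const auto start = clock::now();
    RenderFrame(g_DetailLevel);
    const double ms =
        std::chrono::duration<double, std::milli>(clock::now() - start).count();

    if (ms > 33.3 && g_DetailLevel > 0)
        g_DetailLevel--; // slower than ~30 fps: drop some detail
    else if (ms < 20.0 && g_DetailLevel < 3)
        g_DetailLevel++; // plenty of headroom: bring detail back
}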

How do you set vsync on anything?
 
Dumbass, go back and read the posts.

Look at the words I used:

Could

Probably

Sorry, didn't mean to get that post mixed up with yours with the "and can't say how many FPS someone is going to get" part... apologies :p
 
To set V-Sync on/off: it really depends what card you have. If you have any of the nVidia cards, you should be able to find a "tweak utility" that has the option to force V-Sync on or off. Some cards (like the GeForce4 Ti 4600; not sure about others) have the option to set V-Sync in the video card section of the display properties. As for non-nVidia cards, I really don't know =/

Some games actually have V-Sync options in the graphics setup screen... hopefully HL2 will be one of them. I like games I can tweak :cheese:
 
Thx mate. How high is the 9800 non-Pro clocked? And can I clock my card to that?
 
I hear you can OC it 75MHz with no problems on stock cooling... not that much though.
 
OK, I have clocked my card to like 20MHz more, but I clocked both core and memory.
 
My current computer is a brand-name Sony and can't OC for crap. I clocked my Ti 4200 up 25MHz and there was like massive snow in all my games. I got this PC a while ago from someone. Obviously not my first choice, but it's OK.

The really funny thing, though, is that it has 0 fans. HA. It's a P4 1.7GHz, 512MB SDRAM, Ti 4200, and there are no fans, no fan slots either. Such a piece of crap in most respects.
 
Originally posted by synth
Humans can't detect framerates past 60fps. So if it's 60 or 6000, you won't be able to tell the difference.

Indeed, the human eye is only capable of 60 frames per second.
 
That is weird, I tell you.

When my computer crashes (a regular occurrence) the monitor switches back down to the default 60Hz. I notice this and change it back to 75. I notice it because it gives me a slight headache after a while and makes my eyes hurt. No lie.
Explain that for me if you can, please. :)
 
Originally posted by TsuNamI
Indeed, the human eye is only capable of 60 frames per second.
http://www.mikhailtech.com/articles/editorials/fps/

"THERE IS NO PROVEN LIMIT TO THE FRAME RATE THE HUMAN EYE CAN PERCEIVE"

"I bet if you play at 250fps (possible, but only with older games at low resolutions) for a couple weeks and then go down to 125, you will see a difference."

"whereas tests on Air Force pilots have shown their ability to not merely notice, but identify the type of aircraft when shown an image for only 1/220th of a second"

"Some claim 24fps, others 30, some 60, some 200, some even upwards of 2000 and above. Feel free to add any numbers in-between. The truth of the matter is, every one of these people is right and wrong in their own respect. Why? Because that's not how the brain works."

etc etc...
 
Actually, I asked Gabe how HL2 will run on my (certainly mid-end) computer:

Hi, I thought you may help me spend (or not spend) some money ;-). I have a P4 2.0 with 512 MB RAM and a Radeon 9600 Pro. Is there any point in upgrading for HL2? Will I have playable (30-40) framerates at 1024x768 max detail on that rig? Would be cool if you could help me out, I have some tough purchases ahead of me!

Actually that's a P4 1.8 o/c'ed to 2.0, but that's not making much difference. And here's Mr. Gabe Freeman himself:

Your system will be fine at 30-40 FPS at 1024x768.
 
Cool that Gabe mailed you that!

Anyways, personally I don't want to play at 30-40 fps... when I play games like CS I don't like it when the fps drops below 80 (I somehow feel it), so I would rather play at 800x600 than 1024x768 on full detail to get the fps to my preferred level.

But anything over 30 fps is playable... movies are 24fps, you know.
 
Originally posted by h3nkoS
But anything over 30 fps is playable... movies are 24fps, you know.
Once again: double it to 48 to get the roughly equivalent computer fps. Is it so hard to understand after the first explanation?
 
I was actually thinking of buying a Radeon 9800 in September, but with the new academic year's spending (dorm, books) that would be a VERY risky purchase.

I get 25-40 fps in Splinter Cell; if HL2 runs like Splinter Cell does, then it's party time, baby. :bounce:

However, I will upgrade at some point, because I'm certain that mods will push Source to the limit (oh, and there's Vampire II around the corner...).

Oh, and it was so cool of Gabe to answer me (/me giggles like a small girl). :cheese: :cheese: :cheese:
 
The difference between 40 fps and 70 fps, or between 70 fps and 100 fps, is VERY easy to notice...
And some people here don't understand that when you turn off vertical sync, you CAN have 200 fps on an 85Hz monitor...
 
Originally posted by Xenome
The difference between 40 fps and 70 fps, or between 70 fps and 100 fps, is VERY easy to notice...
And some people here don't understand that when you turn off vertical sync, you CAN have 200 fps on an 85Hz monitor...

Dude, you've got freaking crazy-eyed retard syndrome. Nobody on earth except Superman can tell the difference between 45 and 61,000 fps.

Edit: Don't tell me I'm wrong, because I'm not wrong, you're wrong.

Edit 2: It's not my personal opinion either, it's fact. Get over it, all of you.

"whereas tests on Air Force pilots have shown their ability to not merely notice, but identify the type of aircraft when shown an image for only 1/220th of a second"

Once again I have to tell you that test is not the same thing and should not be anywhere near video games. It's one picture, unlike games, which are tons of pictures flashing at you a certain number of times each second.

Of course you will be able to see one picture flashed on its own at any fps, but that's totally different from a game, so stop using that.
 
Originally posted by [Hunter]Ridic
Dude, you've got freaking crazy-eyed retard syndrome. Nobody on earth except Superman can tell the difference between 45 and 61,000 fps.
w00t! I'm Superman!!!!!!! :bounce:

If you are so bright and all-knowing, then tell me how I can see such a clear difference between 60Hz and 75Hz? Through my computing years I have tried it on a Riva 128 with a 21-inch, a Riva 128 with a 17-inch, an unknown Intel chip on a 15-inch, a GeForce DDR on a 17-inch, a GeForce DDR on a 21-inch, and a 9700 Pro on a 21-inch... I see the exact same thing every single time: at 60Hz, flickering; at 75Hz or more, a smooth screen. I see it in games too. The difference between 45 fps and double that, 90, is very visible.
BF1942 on my 9700 Pro does vary in intense situations, from the V-Sync-locked 85 fps down to half that, 42 fps. It's easy to see when it happens, without ever looking at the fps counter.
 
You can't; give it up, you know you can't. It's all in your mind.
 
Originally posted by Gud
I was actually thinking of buying a Radeon 9800 in September, but with the new academic year's spending (dorm, books) that would be a VERY risky purchase.

I feel you, buddy. I'd also like to purchase a 9800 Pro, but the price tag is as big as my rent check lol :)
 
Actually, I think Dawdler is right: he will notice the difference if 85 FPS drops to 42 FPS, but if it were constantly at 42 he would not notice the difference. That is because slowdown is mainly noticeable when it's jumping between a low and a high FPS.
 
Originally posted by Mad_Eejit
Actually, I think Dawdler is right: he will notice the difference if 85 FPS drops to 42 FPS, but if it were constantly at 42 he would not notice the difference. That is because slowdown is mainly noticeable when it's jumping between a low and a high FPS.

Exactly.
 