40fps.

I wasn't trying to tell you what to do at all; it's just that when I looked at the thread, it looked very much like it was going round in circles. I was just trying to help :(

sits and cries in corner,

then goes back on the web looking for a magazine scan for HL2 to add to my collection (accidentally deleted :()
 
Originally posted by marksmanHL2 :)
I wasn't trying to tell you what to do at all; it's just that when I looked at the thread, it looked very much like it was going round in circles. I was just trying to help :(
sits and cries in corner,
then goes back on the web looking for a magazine scan for HL2 to add to my collection (accidentally deleted :()
lol I guess your intentions were good, so don't worry about it :cheese: I guess the thread has gone around in circles, but people still have the right to talk about it if they want. And I'm sure we'd all prefer that as many people as possible put forward their opinions on the matter now, so that we don't get more threads popping up in the future with someone else's opinions, causing arguments and "we've done this before" posts :)
 
Heh, I love the sad smilies on this forum, they really make you feel sorry for the person lol.

Anyway! Thanks, I see what you mean. It gets on my nerves when people state that a thread should stop because they are no longer interested. I really should have read through my post before I sent it.
And yeah, the "this has been discussed" posts should be avoided at all costs. So post away!
 
So now we are gonna discuss how this got out of hand for another 10 pages? Cool :)

Some people (no hunters named) remind me of this quote:

Person A: "You're in denial"
Person B: "No I'm not!"
 
Originally posted by GreaseMonkey
The human eye can see 30 fps, I believe; anything above that makes no difference. But on really quick movements in-game, the fps might affect it.


Absolutely correct, hence why television programs are broadcast at 35fps: it's a steady framerate, and science shows humans can't tell the difference. However, dogs and cats can; they see the TV like we see it when it's recorded (a camcorder recording a TV that's switched on)
 
Originally posted by Andy018
Absolutely correct, hence why television programs are broadcast at 35fps: it's a steady framerate, and science shows humans can't tell the difference. However, dogs and cats can; they see the TV like we see it when it's recorded (a camcorder recording a TV that's switched on)
ROFLMAO!!!!!!! :D :D :D

I'm sorry, I just had to...
 
The more I read up on the subject, the more I agree with the people who say it does make a difference...

Maybe people should actually educate themselves before talking about a bunch of crap they just made up.
 
The only MP game I'm playing competitively now is SoF2... if I got 35-40 fps I would quit. I like to average 150 (capped); randomly-generated maps are different.

Of course this is at the expense of a lot of eye candy... but wtf, I'd rather have 16-bit textures and the like for the sake of nailing my enemy.

On the other hand, for single-player experiences 50 fps is great IMO, with video options maxed (or close to it)... lmao, the Doom alpha ran great at 30 fps UNTIL an enemy jumped into play, then I dropped to 9... that is not good.
 
How can we really have any intelligent discussions when people are allowed to act... well, as ridiculous as this, without any word or action from an administrator? To claim things, that's one thing, but this... It shouldn't be like this... :( It's a shame, really!
 
Here's something to try if you have GTA: Vice City or GTA 3

Play the game for a while WITHOUT the frame limiter on. The frame limiter caps it at 30 fps.
Now try it WITH the frame limiter on. You WILL notice a difference.
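
For anyone curious what the limiter is actually doing, here's a minimal sketch of a 30fps cap in Python (an illustration of the general technique, not GTA's actual code; `render_frame` stands in for whatever draws a frame):

```python
import time

TARGET_FPS = 30.0
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~33.3 ms allowed per frame

def run_capped(render_frame, seconds=5.0):
    """Render frames, sleeping away whatever is left of each frame's budget."""
    end = time.perf_counter() + seconds
    while time.perf_counter() < end:
        start = time.perf_counter()
        render_frame()  # draw one frame
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            # Fast hardware finishes early; the limiter burns the spare time,
            # so the game never runs faster than 30 fps.
            time.sleep(FRAME_BUDGET - elapsed)
```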
 
Originally posted by tuemmykids
Here's something to try if you have GTA: Vice City or GTA 3

Play the game for a while WITHOUT the frame limiter on. The frame limiter caps it at 30 fps.
Now try it WITH the frame limiter on. You WILL notice a difference.
Or try GTA 3 on a GeForce DDR. No need for a framerate limiter. Just play and listen to your own sighs of pleasure; then you know you got 60+ fps in just that instant. Of course, by then it's too late, so listen well 'cause you'll crawl again in no time :)
 
A quick note about Hz... dawdler, when you talk about your eyes hurting from looking at a 60Hz screen, that's because of problems with monitors. Since the monitor handles the actual physical projection of the image, it is vulnerable to tiny fluctuations. So when you're running at 60Hz, the image appears to flutter just a little. Your eyes try to account for this, and you get a headache. At 85Hz, it's fast enough to "beat" your eyes... no frame is in place long enough for you to detect those tiny fluctuations, so your eyes don't try to correct, and you don't get a headache. The problem is less pronounced on LCD screens, but it's still there.

Now, as to FPS, I think that with FSAA and AF and all that we are seeing less of a need for high FPS. In old games the distance would flicker even when you were just standing still (as it still does in BF1942, but that's because the level is so huge... I digress), and that could cause a reaction similar to the one described above with 60Hz monitors. But in games that do not have this flicker, you would have a lot of difficulty noticing a flicker (and hence a difference) between, say, 30 and 60 if you were standing still.

If you're moving, of course, it becomes easier to see the difference. If pixels jump too far around the screen, you notice. However, whether you notice at 30 or 60 or 90 depends on your mind. Not your eyes, but your mind. I say that because I was perfectly happy with my 2 fps playing Contra. ;)

So much for quick... lol. Anyway, my main point was that the flicker dawdler is talking about comes from monitor fluctuations and errors and isn't connected to the eye's ability to notice differences between 30, 60, or 90 fps.
 
The theory of "persistence of vision" dictates that it takes only 12 "flashes" per second (same as fps) to trick the brain into seeing motion as opposed to a series of images. "Fluid motion" happens at only 20 FPS. The max FPS your brain can translate is about 30 fps. Anything above that you cannot see, regardless of what you may think.

Movies run at 24 fps and digital video runs at 30. There are only a few instances where "flicker" may occur in movies/video. For instance, panning across a picket fence produces an almost unnoticeable "flicker" as your brain jumps back and forth between the fence and the background... In this case, filming at a higher framerate would produce a smoother effect, purely because of the higher sample rate, i.e. your brain has more options for where to interpret the movement, lessening the flicker.

I think the "picket fence" effect is actually constantly happening on a computer screen in a way (refresh rates, frames being "drawn" and other factors), resulting in "choppiness" at even greater framerates than your brain can actually perceive... hope that makes sense.

Anyway... I think 40 fps will be perfectly smooth if it stays consistent.
 
Originally posted by Parasite
The theory of "persistence of vision" dictates that it takes only 12 "flashes" per second (same as fps) to trick the brain into seeing motion as opposed to a series of images. "Fluid motion" happens at only 20 FPS. The max FPS your brain can translate is about 30 fps. Anything above that you cannot see, regardless of what you may think.

Movies run at 24 fps and digital video runs at 30. There are only a few instances where "flicker" may occur in movies/video. For instance, panning across a picket fence produces an almost unnoticeable "flicker" as your brain jumps back and forth between the fence and the background... In this case, filming at a higher framerate would produce a smoother effect, purely because of the higher sample rate, i.e. your brain has more options for where to interpret the movement, lessening the flicker.

I think the "picket fence" effect is actually constantly happening on a computer screen in a way (refresh rates, frames being "drawn" and other factors), resulting in "choppiness" at even greater framerates than your brain can actually perceive... hope that makes sense.

Anyway... I think 40 fps will be perfectly smooth if it stays consistent.
Real video and video games (ones that are rendered in real time) are different, so stop comparing them... the reason the brain will interpret 20fps of video as smooth motion is that video cameras (like human eyes), including digital video cameras, blur moving objects (the amount of blurring depends on how long the shutter stays open) and the brain is tricked into seeing the motion between the frames.

Video games do not have any blurring due to motion (except for a few failed attempts, though the lights in GTA3/VC and the light sabers in KotOR look nice) because each frame is rendered from a single instant in time. It is very easy to see a big difference between 40fps and 80fps when something is moving, because the gaps between the frames are much smaller: the distance an object (at the same velocity) travels between frames is half as large. Your eyes gather the light emitted by every frame, and the more frames displayed in a given amount of time, the more fluid the motion appears to the brain.
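
To put rough numbers on that gap (a made-up example: an object crossing a 1024-pixel-wide screen in exactly one second):

```python
speed_px_per_s = 1024.0  # hypothetical object speed across the screen

for fps in (30, 40, 60, 80, 120):
    gap_px = speed_px_per_s / fps  # how far the object jumps between frames
    print(f"{fps:>3} fps -> {gap_px:5.1f} px jump per frame")
```

At 80fps the jump is exactly half the size of the one at 40fps; that's the whole point.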

You have a device sitting right in front of you that can prove that increasing the frame rate makes motion seem more fluid... open up a white screen (like Notepad) on your computer and wave an object (like your finger) in front of it.
Do you see the gaps between where your finger was when each refresh occurred? Yes.
Change the refresh rate and repeat, moving the object at the same speed. A higher refresh rate will be closer to fluid motion than a lower one... even if it is past the level you say your eyes can't detect.
 
Yes, games, movies, and video are all comparable. We are talking about frames per second, and film and video are pretty much the only things you can compare it to. But in a way you're right: the way a computer screen displays a frame works somewhat differently from those other media. That's what I was trying to explain with my "picket fence" example. But at the same time, both film and games have a frame rate and a flicker rate (the refresh rate, for games).

Good point about blurring in games, though... that would help smooth out the motion. BTW, persistence of vision is not something I made up... check it out, or more specifically, look up "phi phenomenon". Motion does not become increasingly smoother beyond a certain point, and that point is about 30 FPS. I explicitly remember 30 FPS, but I will look around to see if that number is wrong. The reason POV works is that your mind actually "sees" an image for a split second longer than it is actually there; replacing that image with another causes the two images to "blend", resulting in perceived motion. This blending can only happen so fast, and that is why we cannot and do not perceive motion beyond a certain threshold.

Try watching a spinning wheel increase in speed; eventually the wheel will appear to slow down. Then try this. It's the same exact concept, except your monitor's refresh rate plays the role of POV. This happens because your mind is "dropping frames". You cannot see the motion of the wheel beyond a certain speed... the slow-down effect is a result of that. Otherwise we would be able to see objects move at any speed, and that's why we see "blurring" when objects move too fast.
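
The computer-screen version of that illusion is plain temporal aliasing, and you can model it in a few lines (a toy model; the 60Hz sample rate and the wheel speeds are assumptions for illustration):

```python
SAMPLE_HZ = 60.0  # the monitor's refresh rate, playing the role of POV here

for wheel_hz in (5, 25, 35, 55, 59, 61):
    step = 360.0 * wheel_hz / SAMPLE_HZ        # true rotation between refreshes
    apparent = (step + 180.0) % 360.0 - 180.0  # wrapped into (-180, 180]
    print(f"wheel at {wheel_hz:>2} rev/s looks like {apparent:+6.1f} deg per refresh")
```

A wheel at 59 rev/s appears to crawl backwards (-6 degrees per refresh), and one at 61 rev/s crawls forwards again, just like the film version of the effect.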
 
I have heard of that effect (POV)... but I have also heard of many scientists and scholars who disagree with it.

Here are two examples:
http://www.uca.edu/org/ccsmi/ccsmi/classicwork/Myth Revisited.htm
http://www.grand-illusions.com/percept.htm

I have also read that the wheel spokes thing (the real one, not the movie/computer one... as that is just the framerate of the video) has to do with the reflectivity and curvature of the spokes... but I'm not sure either way.

I haven't actually studied any of that myself... and there are many conflicting sources on these subjects. All I know for sure is that I can tell the difference when a game drops from 85 to 70fps (though 60fps is satisfactory for me)... and I can tell the difference between a stable 50/60fps and a stable 85fps. I haven't tested as far as 100fps yet... though I will do that later.
 
Everyone is wrong, because Brain Rev 2.0 (commonly known as the "20th Century Version") has a very flawed frame buffer and the driver could still use some improvement. Brain Rev 2.1 should fix all problems.
 
I once read a study that said that the human eye can only "produce" 60FPS. So having more than that is pointless.

:p
 
It's not really about the # of frames, it's about how steady that number is. If a game jumps from 20-30-60-15-20... it will not be smooth.

Anything running at a steady framerate of 24fps and up is good.
 
Now that you're talking about frames per second, have any of you ever watched a film on a new TV with "Natural Motion"? It looks ridiculous! It looks like some cheap full-length home video, with poor Australian actors... no offense!
 
Originally posted by Lifthz
It's not really about the # of frames, it's about how steady that number is. If a game jumps from 20-30-60-15-20... it will not be smooth.

Anything running at a steady framerate of 24fps and up is good.
Not in my experience... I have taken games that get a steady 100+fps and locked them at 24fps, 30fps, 40fps, 60fps, 72fps, 85fps, and 100fps.
There is a noticeable difference in how smooth the motion appears between each of those steps. I would prefer to have a frame rate that is between 40 and 80fps instead of locked at 40.
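
For reference, the frame times behind those locked rates (plain arithmetic):

```python
for fps in (24, 30, 40, 60, 72, 85, 100):
    print(f"{fps:>3} fps = {1000.0 / fps:5.1f} ms per frame")
```

Each step up buys fewer milliseconds than the last (24 to 30 saves 8.3 ms, 85 to 100 saves only 1.8 ms), which fits the feeling that the differences shrink with each step but never quite vanish.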

EDIT: There is a problem with introducing motion blur/smear into games: in the real world, the object you keep your focus on will not blur even if it is moving... the object appears still to you, and the rest of the image moves relative to your view. Games would have to determine which object your eye is tracking on the screen and blur the objects moving at other velocities... which would require an eye-tracking device and too many CPU cycles to work well in games.
 
Originally posted by OCybrManO


EDIT: There is a problem with introducing motion blur/smear into games: in the real world, the object you keep your focus on will not blur even if it is moving... the object appears still to you, and the rest of the image moves relative to your view. Games would have to determine which object your eye is tracking on the screen and blur the objects moving at other velocities... which would require an eye-tracking device and too many CPU cycles to work well in games.

I wasn't talking about the perceived motion blur from your eyes, but the motion blur present in video footage. A video camera has the shutter open for 1/24 of a second, so any object moving quickly will be blurred across the frame. This gives the appearance of very smooth motion at 24fps, whereas in games you need up to 70-80 fps for perfectly smooth motion, depending on the viewer. In a game, each frame is dead sharp no matter how fast the object is moving.
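
A back-of-the-envelope version of that difference (the object speed is made up; the 1/24 s shutter time is the figure from the post above):

```python
speed_px_per_s = 1024.0  # hypothetical object crossing a 1024 px frame in one second
exposure_s = 1.0 / 24.0  # how long the film shutter stays open per frame

smear_px = speed_px_per_s * exposure_s
print(f"film frame: ~{smear_px:.0f} px streak baked into the image")  # ~43 px
print("game frame: 0 px streak - a razor-sharp snapshot of a single instant")
```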
 
That is also what I was talking about when I mentioned that they should add motion blurring... but then I also said that it wouldn't be like real life, because if you track something with your eye that is moving on the screen it will still look blurred (though it shouldn't, because relative to your eye's view there is no motion).

It would be quite expensive (both in monetary cost and performance cost) to implement it properly.

This problem also keeps depth of field blurring from being realistically implemented in games... because how is the game supposed to know what your focal point is?
 
Depth of field blurring is pointless; I mean, everything you don't focus on is blurred out anyway when you look at the monitor.
 
In games, if you focus on something that is close to your eye you can still see objects in the distance with perfect clarity or vice versa... the only things that are "blurred out" are the objects that are not within 20-30 degrees of the point you are looking at.
This lets people see snipers without having to actively look for them... which sucks for the snipers.
 
Why is this thread 16 pages long?
I say we cut it down to 2 posts.
 
You won't be able to notice any extra smoothness if it's past your monitor's refresh rate (the optimum is 85Hz; 60Hz usually flickers and causes eye strain). In fact, if you go past your refresh rate it will actually get worse because of tearing (where the screen is in the middle of drawing the new frame and the frame on the front buffer changes, so halfway through it draws the next frame instead. This can get ugly).

Your eye doesn't notice past 24fps (movie fps). The main reason we notice slow FPS in games more is that games require extra input: mouselook and keyboard combine with the slow FPS to make you feel like the display is lagging and slow, because your mouse movements are lagging and slow. The difference between 40 and 80 fps is pretty big; you notice it's a little more fluid in terms of input. Past 80fps you can't notice much at all. 25fps is the limit of playability for games like HL2.
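
The tearing part is easy to sketch: once frames finish faster than the display scans out, buffer swaps start landing mid-refresh. A toy timeline (assumed numbers: an 85Hz display with a game rendering at 120fps and no vsync):

```python
REFRESH_HZ, RENDER_FPS = 85.0, 120.0
scanout = 1.0 / REFRESH_HZ     # time to draw one refresh, top to bottom
frame_done = 1.0 / RENDER_FPS  # when the first rendered frame is ready

for scan in range(4):
    start, end = scan * scanout, (scan + 1) * scanout
    tears = []
    while frame_done < end:
        if frame_done > start:
            # The front buffer changed mid-scanout: everything below this
            # point on screen comes from the newer frame -> a visible tear.
            tears.append(f"{(frame_done - start) / scanout:.0%} down the screen")
        frame_done += 1.0 / RENDER_FPS
    print(f"refresh {scan}: tear at {', '.join(tears) if tears else 'none'}")
```

Every refresh ends up stitched together from two or three different frames, each with its tear line at a different height, which is exactly the "ugly" described above.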
 