VirusType2
I was discussing FPS (frames per second) with someone on the forum and decided to dig up some more facts about the human eye and its perception of frames per second. I thought I would share them with you guys here in a new thread, partly so that the other thread can stay on topic.
Basically, TV and video games display images differently. DVDs and broadcast TV display half-images (interlaced fields), so the frame rate looks smoother. Computer-generated graphics don't display half-images; the whole image changes at once. This doesn't deceive the eye the way watching TV does, so a higher FPS is required to replicate the smoothness and trick the eye into seeing the motion as seamless. However, a lower frame rate that is locked in and steady is preferred over a higher frame rate that drops, because the eye notices the drop and the motion looks jerky.
Source:
http://bf1942.boomtown.net/en_uk/articles/art.view.php?id=8643
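To put rough numbers on the half-image vs. full-image distinction above, here is a small sketch (my own illustration, not from either article; the 60/50 field rates are the standard NTSC/PAL figures):

```python
# Rough illustration: how interlaced TV and progressive computer rendering
# divide up one second of motion. Not from the quoted articles.

NTSC_FIELD_RATE = 60   # half-images (fields) per second, NTSC broadcast
PAL_FIELD_RATE = 50    # half-images (fields) per second, PAL broadcast

def interlaced_full_frames(field_rate: float) -> float:
    """Two interleaved fields (odd + even scan lines) make one full frame."""
    return field_rate / 2

def progressive_frame_interval_ms(fps: float) -> float:
    """A progressive (non-interlaced) display redraws the whole image each frame."""
    return 1000.0 / fps

if __name__ == "__main__":
    print(f"NTSC: {NTSC_FIELD_RATE} fields/s ~ {interlaced_full_frames(NTSC_FIELD_RATE):.0f} full frames/s")
    print(f"PAL:  {PAL_FIELD_RATE} fields/s ~ {interlaced_full_frames(PAL_FIELD_RATE):.0f} full frames/s")
    for fps in (30, 60, 100):
        print(f"Progressive {fps} fps -> a complete new image every "
              f"{progressive_frame_interval_ms(fps):.1f} ms")
```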
EDIT: MORE INFO
http://amo.net/NT/05-24-01FPS.html
This is what I understand so far:
What is the maximum number of frames per second the eye can register?
Peter Koch Jensen: I guess you mean how fast the images have to change in order for the human brain to perceive them as continuous motion. The flicker fusion frequency (caused by positive afterimages in the field of view) increases with the luminance and eccentricity in the field of view. A TV displays 60 half-images per second (NTSC, in the USA) or 50 (PAL, in Europe). This interlaced video frequency was invented for the exact purpose of smoothing out movements so they are perceived as continuous.
Do people’s ability to perceive images vary? And if so, how big can these differences be?
There can be quite large differences, depending on presentation, purpose, experience, motivation and alertness.
What is Hertz in computer monitor terms?
Morten Striboldt: Hertz (Hz) is the update frequency that the screen uses. On a monitor, the image is drawn by an electron gun that fires electrons at a phosphor coating, which then lights up the screen. The image is drawn horizontally, line by line, unlike a projector that updates the entire image at once. A screen with an update frequency of 60 Hz will therefore draw the image 60 times a second. This is not optimal, however, since the human eye can see these updates. You have to be at 72 Hz or above to perceive the image as steady (where the updates are too fast for the eye to see).
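A quick back-of-the-envelope check on those Hz figures (my own arithmetic, not part of the interview):

```python
# Time between screen redraws at a few common refresh rates.
for hz in (60, 72, 85, 100):
    print(f"{hz} Hz -> the screen is redrawn every {1000.0 / hz:.1f} ms")
# 60 Hz redraws roughly every 16.7 ms; at 72 Hz and above (about 13.9 ms or
# less), the interview says the updates become too fast for the eye to notice.
```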
What is the definition of Frames per second in computer games?
FPS indicates the speed with which the graphics card can generate images and send them to the screen. 30 FPS therefore means that the graphics card can render 30 frames per second. A computer game running at 30 FPS can look quite OK, but it will seem imprecise as soon as the objects on screen start to move really fast. A regular DVD movie also runs at about 30 FPS and still seems quite natural, thanks to motion blur, which, in short, creates a transition between the real images.
In games you smooth out the motion by running it at 60 FPS, thereby making the motion more consistent and natural. The rule of thumb for FPS is, however, that a low and stable FPS is preferable over a high and irregular FPS. The latter will be registered by the eye and be perceived as jerky.
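To illustrate that rule of thumb, here is a hypothetical frame-time comparison (the numbers are invented for illustration, not measurements):

```python
# A steady 30 fps stream versus a fast-but-irregular stream with hitches.
steady_30 = [33.3] * 10            # every frame takes ~33 ms
irregular = [10.0] * 9 + [100.0]   # nine quick frames, then one 100 ms hitch

def average_fps(frame_times_ms):
    """Average frame rate implied by a list of per-frame times in milliseconds."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

for name, times in (("steady 30 fps", steady_30), ("irregular", irregular)):
    print(f"{name}: average {average_fps(times):.1f} fps, "
          f"worst single frame {max(times):.1f} ms")
# The irregular stream averages a higher fps (about 53 here), but its 100 ms
# hitch is exactly the kind of drop the eye registers as jerkiness.
```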
Those two things are interrelated. But how?
Since the human eye is limited as to how many images it can perceive per second, it ought to be unnecessary to play with more than 60 FPS. However, a higher FPS is preferable since the objects you see on screen will be more precise. If the graphics card generates 100 FPS for instance, an object that is situated at a given place at a given time will be more accurately depicted even though the screen only updates at 72 Hz.
If the screen only updated at 1 Hz, 100 FPS would still give a pretty good image of the moment (at the instant the screen updates, that is), as opposed to running at 1 FPS, where you would have a delay of up to one second.
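A small worked example of that precision argument (my own sketch; the 2000 px/s object speed is an assumption, not from the article):

```python
# The newest frame the screen can show is at most one render interval old, so a
# faster-rendering card keeps on-screen positions closer to the true position,
# even if the monitor itself refreshes more slowly.

SPEED_PX_PER_S = 2000.0  # assumed speed of a fast-moving on-screen object

def worst_case_lag_s(fps: float) -> float:
    """Maximum age of the most recent rendered frame (scan-out and input lag ignored)."""
    return 1.0 / fps

for fps in (1, 30, 60, 100):
    lag = worst_case_lag_s(fps)
    error_px = SPEED_PX_PER_S * lag
    print(f"{fps:>3} fps: frame up to {lag * 1000:.0f} ms old, "
          f"object drawn up to {error_px:.0f} px behind its true position")
# At 1 fps the lag can be a full second (2000 px of error here); at 100 fps it
# shrinks to 10 ms (20 px), even on a screen that only refreshes at 72 Hz.
```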
And from the amo.net article linked above:
Our brain is smart enough, however, to turn 24 frames into motion, so isn't it ignorant to say we can't distinguish 400, or even 4,000, frames as motion? Heh, the sky's the limit, oh wait, then space... oh wait. Give us more; we notice the difference from 30 to 60, and the difference from 60 to 120. It is possible that the closer we get to our limit, if there is one, the harder it is to get there, and there is a theory about this. Someone is across the room. Take one full step towards them. Now take a half step towards them, then half of a half step, on and on, halving each movement you take. Will you ever get there? That, my friend, is open to debate, but in the meantime, will you take one step towards me?
The human eye perceiving 220 frames per second has been proven, and game developers, video card manufacturers, and monitor manufacturers all admit they've only scratched the surface of frames per second. With a high-quality non-interlaced display (like a plasma or a large LCD flat panel) and a good video card capable of HDTV resolution, you can today see well above 120 FPS with a matching refresh rate. With refresh rates as high as 400 Hz on some non-interlaced displays, such a display is capable of 400 FPS on its own. Without the refresh rate in the way, and with hardware (frame buffer) capable of such fast rendering, it would be possible to display what cameras are capable of recording: 44,000 frames per second. Imagine for a moment if your display device were strictly governed by the input it was receiving; this is already the case, in a way, with computer video cards and displays and their adjustable resolutions, color depths, and refresh rates.
http://amo.net/NT/05-24-01FPS.html
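For scale, some quick arithmetic on those rates (my own calculation; the figures themselves come from the quoted article):

```python
# How little time each frame gets at the rates mentioned above.
for rate in (72, 120, 220, 400, 44_000):
    print(f"{rate:>6} frames/s -> one frame every {1_000_000 / rate:,.1f} microseconds")
# A display's refresh rate is the ceiling on how many distinct frames it can
# actually show per second, regardless of what the graphics card renders.
```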