40fps.

http://amo.net/NT/02-21-01FPS.html

Interesting little article I found on refresh rates, turns out Hz and fps aren't the same. Here's a quote:

"TV's in homes today use the standard 60Hz (Hertz) refresh rate. This equates to 60/2 which equals 30 Frames Per Second."

Guess that explains why we are able to notice 60Hz (or 30fps) and not 75Hz (37.5fps).

Wonder how much I'll get flamed for this one? :)
 
I'm not really talking about hertz, but if you have your monitor set at 85Hz and you have 45fps, you won't tell a difference if it goes up and stays steady at 100.
 
Originally posted by [Hunter]Ridic
I'm not really talking about hertz, but if you have your monitor set at 85Hz and you have 45fps, you won't tell a difference if it goes up and stays steady at 100.

I completely agree with you, I was just trying to give you some info that proves you right.
 
Interesting little article I found on refresh rates, turns out Hz and fps aren't the same.
Both are updates per second. Even if you have 42fps in half sync, the screen still updates at 85Hz; it just renders slower than it updates.
But the thing is, fps doesn't matter. Most say 60 is smooth, but that's because they are seeing the screen's refresh, which is often higher (75 or 85)! However, run at 60Hz and 60fps and it WILL flicker. Run at 60fps and 40Hz and you'll think the screen was a crappy LCD. Run at 60fps and 30Hz... Well, you get the idea.
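
To put rough numbers on that "half sync" behaviour, here is a small sketch (Python, with made-up values; real vsync behaviour also depends on how many buffers the driver uses, so treat this as a toy model, not how any actual driver works) of how vsync forces the framerate down to a whole divisor of the refresh rate whenever the card misses a refresh:

```python
import math

# Toy model: with vsync on, a frame that misses a refresh has to wait
# for the next one, so the effective framerate snaps to refresh/1,
# refresh/2, refresh/3, ... (e.g. 85, 42.5, 28.3 at 85Hz).
def vsynced_fps(refresh_hz, raw_fps):
    """Effective fps when the card renders raw_fps but may only present
    frames on refresh boundaries (simple double buffering assumed)."""
    refresh_period = 1.0 / refresh_hz
    render_time = 1.0 / raw_fps
    intervals = math.ceil(render_time / refresh_period)  # refreshes each frame occupies
    return refresh_hz / intervals

print(vsynced_fps(85, 100))  # card keeps up          -> 85.0
print(vsynced_fps(85, 60))   # misses every other one -> 42.5 ("half sync")
print(vsynced_fps(85, 40))   # every third refresh    -> 28.33...
```

Which is why a card that can only manage around 60fps on an 85Hz screen tends to sit at 42.5 with vsync on.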
 
Originally posted by Logic
Run 30 fps at 60hz, and it'll look just fine.
Of course it'll look fine. It would just be slow and make me nauseous. But otherwise fine.
 
THERE IS NO PROVEN LIMIT TO THE FRAME RATE THE HUMAN EYE CAN PERCEIVE.
He's not citing any scientific sources in that article; you could only state this with certainty if you'd read some literature about the human visual cortex, so just assume the rest of it is bullshit also.
 
Isn't the best way to find these things out just to wait for the game?

I'm getting sick of threads like this!
 
Originally posted by Majestic XII
Isn't the best way to find these things out just to wait for the game?

I'm getting sick of threads like this!

Firstly: Don't take this the wrong way, I'm not flaming you, but to be honest I'm getting sick of people saying "just wait for the game". Considering these forums are ENTIRELY about a game that hasn't been released, the discussion is obviously going to be speculative. What's wrong with speculation? If you don't like it, you're certainly entitled to exercise your right not to participate in the discussion.

Secondly: In this particular case, the discussion has moved away from speculating about the game itself, and more towards the technicalities of frame and refresh rates. How will waiting for the game help us find out any of that? It won't, any more than playing current games will. So your point doesn't really apply anyway. Again, don't take this as a flame, but I'm getting sick of people telling everyone how pointless they think speculative threads are, and it's even more annoying when the thread isn't really one anyway.
 
Originally posted by Logic
Firstly: Don't take this the wrong way, I'm not flaming you, but to be honest I'm getting sick of people saying "just wait for the game". Considering these forums are ENTIRELY about a game that hasn't been released, the discussion is obviously going to be speculative. What's wrong with speculation? If you don't like it, you're certainly entitled to exercise your right not to participate in the discussion.

Secondly: In this particular case, the discussion has moved away from speculating about the game itself, and more towards the technicalities of frame and refresh rates. How will waiting for the game help us find out any of that? It won't, any more than playing current games will. So your point doesn't really apply anyway. Again, don't take this as a flame, but I'm getting sick of people telling everyone how pointless they think speculative threads are, and it's even more annoying when the thread isn't really one anyway.

Maybe this thread is one of the better threads in the forum right now, but I'm kinda overwhelmed with pointless threads, like "doom3 vs hl2".

I will now run along to the editing forum where this problem isn't that big (yet)...

cya...
 
Very off-topic: usually TV's today have a 100 Hz refresh rate. :)
 
With my Barton 3000+ and my Radeon 9800 I get about 120 fps in UT2003 with everything set to high, running at 1600x1280.
 
Originally posted by DaLys
Very off-topic: usually TV's today have a 100 Hz refresh rate. :)

No they don't.

TV's run at 24 or 26 frames per second.

Movie theaters run at 21 frames per second.
 
Originally posted by Mr.Magnetichead
No they don't.

TV's run at 24 or 26 frames per second.

Movie theaters run at 21 frames per second.

He didn't say anything about fps, only Hz.
 
Originally posted by Majestic XII
He didn't say anything about fps, only Hz.

Hz is roughly the same thing as FPS.

Hence why a higher Hz on monitors gives you a smoother desktop.

Hz, just like FPS, is all to do with refresh rates.
 
Hz and fps are two completely separate things. Both of them deal with "refreshes per second", but they are independent of each other.

For example, if you're running a new game on an old system, and you're getting 15 frames per second, your monitor is still refreshing 60, 75, or 85 times per second, depending on the monitor's current setting.

I've found that flickering occurs if the frame rate exceeds the refresh rate.
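
To put that 15fps example in terms of refreshes, here's a quick back-of-the-envelope calculation (Python, purely illustrative): when the game renders slower than the monitor refreshes, each rendered frame is simply scanned out several times in a row.

```python
# Purely illustrative: when the game renders slower than the monitor
# refreshes, each rendered frame is scanned out multiple times.
GAME_FPS = 15

for monitor_hz in (60, 75, 85):
    shows = monitor_hz / GAME_FPS
    print(f"{monitor_hz}Hz monitor at {GAME_FPS}fps: "
          f"each frame is displayed ~{shows:.1f} times")
# 60Hz -> 4.0, 75Hz -> 5.0, 85Hz -> ~5.7
```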
 
Originally posted by Logic
Hz and fps are two completely separate things. Both of them deal with "refreshes per second"


Then they aren't completely different things now, are they. ;)

Yes, I know that, but my point was that they are still to do with refresh rates.
 
The thing was that he didn't say ANYTHING about FPS, only Hz, so don't say anything about fps...it's much better that way.

And yes, almost every new TV today has 100Hz (observe...no FPS).
 
Well, in the end, they determine how often you get to see a new picture on the screen, but usually the limiting factor is the graphics card, which renders images only at a certain speed - FPS. Whoever owns a video card that can render frames faster than the monitor can display them should re-think his priorities.
 
Originally posted by Mr.Magnetichead
Then they aren't completely different things now, are they. ;)
Notice I said separate, not different :cheese: - It was Hz that was being spoken about, and you replied about fps, and backed up what you said by saying they are basically the same thing. They're similar concepts, but they are independent of each other. You could even go as far as saying they are "different" :O
 
Originally posted by Logic
You could even go as far as saying they are "different" :O
You can even go so far as saying they are exactly the same!*

*Assuming the card is able to render at the exact same speed as the refresh, that is, vertical sync.

:p
 
Originally posted by Mr.Magnetichead
No they don't.

TV's run at 24 or 26 frames per second.

Movie theaters run at 21 frames per second.

My TV is using 100Hz and it's pretty old. But the <whatever it's called> in my country is PAL, and PAL is only 60 Hz if I remember correctly.
 
Ridic, quit claiming that because you can't tell a difference in anything above 40fps, nobody else can either. I can tell a difference between 60 and 100 fps clearly. It's very subtle, but noticeable. I play Morrowind all the time, and it's not uncommon in that game for your fps to be at 15 in one area and turn around and have it jump to 120 because of all the architecture in some areas - and the fact that you can see so far in the game. As long as you can get approximately 25 fps constantly, the game will be playable. That's usually what I shoot for when playing Morrowind with FPS optimizer in high arch. areas. I don't mean to flame you, but just because you think you can't tell a difference doesn't mean other people can't. I don't understand either why 60 fps became this magical number for frame rates. Maybe monitors in the past had a max refresh rate of 60 Hz; I don't know, but I'd be interested in knowing...
 
Originally posted by Archangel
Well, in the end, they determine how often you get to see a new picture on the screen, but usually the limiting factor is the graphics card, which renders images only at a certain speed - FPS. Whoever owns a video card that can render frames faster than the monitor can display them should re-think his priorities.
First of all, every video card can render frames faster than a monitor can display them if there is very little to render... even a TNT or Voodoo.

It depends on the games you are using the card for.

Even the fastest cards (from both ATI and nVidia) have trouble running games like UT2K3 with every setting on the highest possible and still keeping up with the refresh rate of the monitor... so right now there is not a problem of video cards being too fast for the monitor.

If you only play CS, get a $50 (or lower) video card until you find something to replace CS (CS2, maybe?).
 
Frankly, I can see a BIG difference between 40 and 100 FPS. Not if they are a STABLE 40 and a STABLE 100, but if I'm getting 40, turn a corner and get 100, then turn another and get 40, I can REALLY tell the difference.
 
Is there anyone other than me who feels it's pointless to even talk about this?

There are those that are certain you can see the difference between 60 and 100 fps, or 40 and 100, or whatever number. And especially, see the difference between 60Hz and 75Hz.

Then there are those that either believe 30 is smooth (going by TVs) or 48-60 is smooth (going by double TVs, a slightly more informed assumption). And that you can't see a crap difference over 60.

Neither side gives in :p
 
Originally posted by Khaos
http://amo.net/NT/02-21-01FPS.html

Interesting little article I found on refresh rates, turns out Hz and fps aren't the same. Here's a quote:

"TV's in homes today use the standard 60Hz (Hertz) refresh rate. This equates to 60/2 which equals 30 Frames Per Second."

Guess that explains why we are able to notice 60Hz (or 30fps) and not 75Hz (37.5fps).

Wonder how much I'll get flamed for this one? :)
That's because NTSC television is displayed at 60 fields per second. It takes two fields to make up a single frame (hence, 1/30 of a second to make the two field passes). When a frame is drawn on the picture tube, it starts at the top and draws all the odd scanlines from 1 to 525, then it goes back to the top and fills in the even lines from 2 to 524.

Computer displays are progressively scanned, meaning the entire frame is drawn in one pass. A refresh rate of 75 Hz (hertz means "cycles per second"; old-time engineers who hate the term "hertz" still call them cycles) means that a new image is displayed every 1/75th of a second. So your framerate is effectively limited by your screen's maximum refresh rate, as you won't see images any faster than your screen can draw them, regardless of how fast your video card is.
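
The numbers in that explanation, written out as simple arithmetic (Python; the 60 fields per second figure is the nominal one, real NTSC is actually 59.94, and the 75Hz monitor is just an example):

```python
# NTSC: 60 interlaced fields per second (nominally; really 59.94),
# two fields (odd scanlines, then even scanlines) per complete frame.
fields_per_second = 60
frames_per_second = fields_per_second / 2          # -> 30 fps
frame_time_ms = 1000.0 / frames_per_second         # ~33.3 ms per full frame

# Progressive scan: the whole frame is drawn in one pass per refresh,
# so the refresh rate is a hard ceiling on distinct images per second.
monitor_hz = 75
refresh_time_ms = 1000.0 / monitor_hz              # ~13.3 ms per refresh

print(frames_per_second, round(frame_time_ms, 1), round(refresh_time_ms, 1))
# 30.0 33.3 13.3
```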
 
Originally posted by Mr.Magnetichead
No they don't.

TV's run at 24 or 26 frames per second.

Movie theaters run at 21 frames per second.
Um, not sure where you're getting those numbers. NTSC television like we have here in North America is broadcast at a rate of 30 FPS. Our motion pictures run at 24 FPS. I think PAL television is 25 FPS and movies in England also run at 25 FPS (which explains why a film's runtime in England is always a few minutes shorter than it is here in the U.S. as they're running their movies one frame per second faster.)
 
Originally posted by crabcakes66
I'd like to see any human notice the difference between 40 and 72 FPS....

Almost everyone can tell the difference between 40 and 72 fps, in terms of games. The reason you can't see the difference with anything over 24fps on a TV is that with TV footage, each frame has slight motion blur, so the whole effect is more fluid. With 3D graphics each frame is dead sharp. I'd like to see an engine that tries to approximate this motion blur effect, to improve the fluidity of the graphics.
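
As a very rough sketch of the kind of motion blur approximation being asked for (not taken from any real engine; the FrameBlender class is made up for illustration, it assumes frames are plain lists of pixel intensities and simply averages the last few of them):

```python
from collections import deque

# Sketch only: fake motion blur by averaging the last few rendered
# frames. Real engines do this differently (accumulation buffers,
# per-pixel motion vectors); this just shows the basic "smear over time" idea.
class FrameBlender:
    def __init__(self, history=4):
        self.frames = deque(maxlen=history)

    def blend(self, frame):
        """frame: list of pixel intensities; returns the blurred frame."""
        self.frames.append(frame)
        n = len(self.frames)
        return [sum(f[i] for f in self.frames) / n for i in range(len(frame))]

blender = FrameBlender()
for t in range(5):
    sharp = [t * 10.0] * 3            # stand-in for a freshly rendered frame
    print(blender.blend(sharp))       # output lags and smears behind the input
```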
 
I played UT2003 at 1280x1200x32 with 4xAA and 4xAF, and my FPS never dropped below the 70s.
 
Originally posted by Mountain Man
Um, not sure where you're getting those numbers. NTSC television like we have here in North America is broadcast at a rate of 30 FPS. Our motion pictures run at 24 FPS. I think PAL television is 25 FPS and movies in England also run at 25 FPS (which explains why a film's runtime in England is always a few minutes shorter than it is here in the U.S. as they're running their movies one frame per second faster.)

At film school I was told cinemas ran at 21.
 
You can't, give it up, you know you can't. It's all in your mind.

That is true, it is all in the mind. Like it or not, higher and higher fps's are just that: numbers of frames per second. They can never be infinite, as the world around us is. Our eyes (as far as I know) do not take in a frame of our surroundings and wait for the brain to process it before sending the next one. It is instead a constant stream of crap that bombards the back of our eyeballs. This is why we perceive blur, and indoors looks all green when we come inside on a sunny day.

Of course different people can detect framerate differences at different levels. Even if their nervous system isn't physically fast enough to detect it directly, it just doesn't look right to them. A microsecond of frame stutter can affect a person subconsciously, reinstilling the feeling that the image is not fluid.

Also, our eyes are not perfectly designed to receive input from all angles and directions. If a 60Hz monitor looks fluid to you (as it does to me), turn your head 90 degrees and look at your monitor sideways. I can detect an obvious flicker when the refreshing is (from my perspective) horizontal.

What would improve how games look would be monitors with individual pixels that glow with their specified color when hit by the cathode ray gun. That way images projected on the monitor would linger for perhaps half a second, fading out. This effect can be simulated with software, true, but the key here is that it would be an analog effect. The fading would have infinite fps. This is similar to the way our eyes receive images, and I think that it would drastically lessen monitor-related nausea and headaches.
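
The software version of that fading effect might look something like this sketch (purely illustrative; the persist function and its decay parameter are made up, and a real phosphor fades continuously rather than once per frame, which is exactly the poster's point):

```python
# Illustrative only: approximate phosphor-style persistence in software
# by letting each pixel decay toward the new value instead of being
# replaced outright.
def persist(previous, current, decay=0.5):
    """decay=1.0 means no persistence; smaller values fade out more slowly."""
    return [p * (1.0 - decay) + c * decay for p, c in zip(previous, current)]

screen = [0.0] * 3
for brightness in (1.0, 1.0, 0.0, 0.0, 0.0):        # a flash, then darkness
    screen = persist(screen, [brightness] * 3)
    print([round(v, 3) for v in screen])             # the flash fades over frames
```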
 
Originally posted by Mr.Magnetichead
At film school I was told cinemas ran at 21.
Perhaps originally film was projected at 21 FPS, and I'm talking back in the days when a Thomas Edison projector was state of the art. As motion picture technology advanced, the framespeed was increased to 24 FPS as it produced the optimal results at the least cost (faster frame speeds use more film stock; lowering the framespeed directly lowers cost.)

Modern motion pictures are projected at 48 FPS with each frame being projected twice (24x2 FPS). This reduces the amount of apparent flicker visible to the eye.
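
Just to spell out the arithmetic in that post (Python):

```python
# Film advances at 24 frames per second, but a two-bladed shutter
# flashes each frame twice before the next frame is pulled into place.
film_fps = 24
flashes_per_frame = 2
print(film_fps * flashes_per_frame)   # 48 flashes per second seen by the eye
```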
 
The human eye can only calculate up to 32fps anyway, so it's not like anything higher will make a difference.
 
Now to sum all this refresh rate crap up... if you increase your refresh rate, does it increase your fps... and/or does it slow your computer down...??
 
Wow. Okay, here's hardware 101:

Your processor sends information to your graphics card, telling it the basics of the 3d scene to be rendered. Your graphics card does all the tricky bits of deciding what triangles to draw and which ones not to, and sends a finished frame off to your monitor a certain number of times per second. This is your fps. Your monitor updates itself completely separately from your graphics card, drawing on the screen the latest color information for each pixel.

What if your monitor starts to draw a frame but has only received half of the video card's update of the screen? The result is the 'tearing' effect, where one half of the screen is 'older' than the other.

How many fps a person can see varies from person to person.
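
To picture the tearing scenario described in that post, here is a toy model (Python; the 60Hz and 100fps values are made up, and no real driver is written like this) where the monitor scans top to bottom at a fixed rate while the card swaps buffers whenever a frame finishes:

```python
# Toy model of tearing (no vsync): the monitor scans the screen top to
# bottom at a fixed rate, while the card swaps buffers whenever a frame
# is done. A swap landing mid-scanout means the top of the displayed
# image comes from an older frame than the bottom.
REFRESH_HZ = 60
FPS = 100                                   # card outrunning the monitor

scanout = 1.0 / REFRESH_HZ
swaps = [i / FPS for i in range(1, 20)]     # times at which new frames are ready

for n in range(3):
    start, end = n * scanout, (n + 1) * scanout
    hits = [t for t in swaps if start < t < end]
    if hits:
        tear = (hits[0] - start) / scanout  # only the first tear is reported
        print(f"refresh {n}: tear ~{tear:.0%} of the way down the screen")
    else:
        print(f"refresh {n}: one clean frame")
```

With vsync on, the swap would simply be held until the scanout finishes, which is why enabling it removes tearing.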
 