For anyone with questions as to why AA wasn't turned on in the videos...

Originally posted by grambo
Personally I find under 60fps to be unplayable in an fps game, especially if you are playing online against other people in a game where precision is very important (Quake3/RA3, Counter-Strike).

This is a common misconception, but it has a very valid reason behind it.

The human eye can't really see the difference between 30FPS and 60FPS. (For reference, motion pictures are recorded at 24 FPS, but that doesn't translate well because film captures motion blur, which helps blend frames together; computer games typically do not have motion blur -- Jedi Outcast's lightsabers excepted.)

The thing is, if you are averaging 60FPS, there are very good odds you are never dropping below 30FPS, and are frequently up near your monitor's refresh rate. (No matter what a benchmark tells you you are getting in frames per second, you cannot physically display more than your monitor can refresh per second.)

If you are averaging 30 FPS, large portions of the game will be beautiful and probably run at 45FPS or higher. But other large portions of the game will slow down to a crawl with a slideshow of 15FPS or even less, and below 30 really starts to get noticeable.
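To put a number on how an average can hide the dips, here's a rough Python sketch with invented frame times (nothing measured from any real game):

# Invented frame times: mostly 12 ms frames with a short stretch of 70 ms stalls.
frame_times_ms = [12] * 170 + [70] * 10

total_s = sum(frame_times_ms) / 1000.0
avg_fps = len(frame_times_ms) / total_s          # what a benchmark reports
worst_fps = 1000.0 / max(frame_times_ms)         # what the slow stretch feels like

print(f"average: {avg_fps:.0f} FPS, worst frame: {worst_fps:.0f} FPS")
# Prints roughly "average: 66 FPS, worst frame: 14 FPS" -- a healthy-looking
# average that still contains slideshow moments.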
 
Frames per Second

Originally posted by JackiePrice
(No matter what a benchmark tells you you are getting in frames per second, you cannot physically display more than your monitor can refresh per second.)

Actually, yes it can, but you have to physically tell it to by disabling v-sync. Disabling v-sync can lead to ugly tearing and other bad effects, so it's best to leave it on.

The only reason you want frames per second above 60 is so that you can turn on FSAA and AF without it seeming like you're watching a slide show.

Also, the human eye can detect a difference between a refresh rate of 100Hz and 60Hz (60Hz can cause headaches), although the difference between, say, 72Hz and 100Hz is small but is still there.
 
Re: Frames per Second

Originally posted by Lt. Data
Actually, yes it can, but you have to physically tell it to by disabling v-sync. Disabling v-sync can lead to ugly tearing and other bad effects, so it's best to leave it on.

No, it doesn't. Disabling v-sync does not raise your ACTUAL framerate above your refresh rate. Why? Because your ACTUAL framerate is how many different pictures are drawn per second to your monitor. If your monitor is refreshing 85 times a second, it doesn't matter if your video card thinks it's getting 385FPS and is sending that much data to your monitor; your monitor is only drawing 85 frames per second.

The only reason you want frames per second above 60 is so that you can turn on FSAA and AF without it seeming like you're watching a slide show.

Errr...no. Turning on FSAA and AF just lowers your framerates. If you are getting 60 without FSAA, you will get less with FSAA. If you are getting 60 WITH FSAA, you'll get more without it. That's irrelevant to what we are discussing.

Also, the human eye can detect a difference between a refresh rate of 100Hz and 60Hz (60Hz can cause headaches), although the difference between, say, 72Hz and 100Hz is small but is still there.

Entirely different thing. Refresh rates are not the same as framerates (although your maximum framerate is, in actuality, your refresh rate, regardless of v-sync).

Refresh rates that are too low cause headaches because your screen starts to fade between redraws, causing a flicker. This is not your eye distinguishing more than 30 frames a second; this is the disorientation your nerves get when they try to adjust to the light filtering through your optic lenses that bloody fast. Low refresh rates can also trigger epileptic fits, like a strobe light. Regardless, the processes at work affecting you due to low refresh rates are entirely different from those of low frame rates. Don't get them confused.

BTW, try watching a game on a cheap LCD monitor. Despite what Windows tells you, LCD monitors do not have an actual refresh rate. They do have a maximum rate at which they can change pixel colors, but most of them are pretty limited. You can see streaking and fading because the redraw rate is so poor on many of them.
 
Originally posted by grambo
Is there really any chance that current top end cards (9800 Pro, FX 5900 Ultra) are going to be remotely capable of running AA/Ansio on Half-Life 2 or Doom 3 at a decent frame rate? Personally I am skeptical at best. Maybe at 640x480.


Yes.

I happen to have played with the <shhhhh...>d3 al<ahem>pha...<who said that?>

It plays pretty nicely with 4x FSAA at 10x7, maintaining a steady 30 FPS up to 60+ FPS, and looks a WHOLE lot better that way as well.

(Yes, I have a 9700 Pro with 9800 drivers on a 3.06 GHz crappy P4).
 
The human eye can't really see the difference between 30FPS and 60FPS

The human eye can actually see a difference up to ~70 frames per second. Lots of long-winded technical reasoning aside, if you don't believe me, just do a simple test. Load up an old game such as Quake 1, where most computers can produce consistently high frame rates. Lock the frame rate at 60fps, walk around a bit, then lock the frame rate to 30fps. You will see a pronounced difference in the degree of smoothness.
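If you want to see what "locking the frame rate" amounts to outside of any particular game, a frame cap is nothing more than a sleep in the main loop. A generic Python sketch (this is not Quake's code, just an illustration of the idea):

import time

def run_capped(target_fps, duration_s=2.0):
    # Spin a dummy "game loop" capped at target_fps and report the measured rate.
    frame_budget = 1.0 / target_fps
    frames = 0
    start = time.perf_counter()
    while time.perf_counter() - start < duration_s:
        frame_start = time.perf_counter()
        # ... rendering and game logic would happen here ...
        leftover = frame_budget - (time.perf_counter() - frame_start)
        if leftover > 0:
            time.sleep(leftover)   # wait out the rest of this frame's time slot
        frames += 1
    return frames / (time.perf_counter() - start)

print(f"{run_capped(60):.0f} FPS capped at 60, {run_capped(30):.0f} FPS capped at 30")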
 
Way higher

I recently read an article (not a new one) in which the author talks about the framerates and refresh rates a human eye can tell the difference between. The studies it cited showed that people could tell a difference even between 120 FPS and 200 FPS. Same goes for refresh rates. The author specifically mentioned the "70 FPS max" myth, which has been proven false.

In regards to AA and AF, once you use it, you don't want to ever disable it again. It makes a big difference no matter what resolution.

As to someone mentioning playing games without AA and AF and loving them: sure, I loved 256-color games too in their prime, but would I play a new game using 256 colors? No way.


MoochieK
 
Honestly anyone who thinks you need even more than 60fps in games is a moron.
 
Re: Re: Frames per Second

Originally posted by JackiePrice
No, it doesn't. Disabling v-sync does not raise your ACTUAL framerate above your refresh rate. Why? Because your ACTUAL framerate is how many different pictures are drawn per second to your monitor. If your monitor is refreshing 85 times a second, it doesn't matter if your video card thinks it's getting 385FPS and is sending that much data to your monitor; your monitor is only drawing 85 frames per second.

It raises the maximum frame rate from the refresh rate to the max the video card can put out



Errr...no. Turning on FSAA and AF just lowers your framerates. If you are getting 60 without FSAA, you will get less with FSAA. If you are getting 60 WITH FSAA, you'll get more without it. That's irrelevant to what we are discussing.

Not quite. Turning on FSAA and AF does lower your framerate, but if you're at, say, 100fps without it, it should still be playable with FSAA and AF at a moderate level.


BTW, try watching a game on a cheap LCD monitor. Despite what Windows tells you, LCD monitors do not have an actual refresh rate. They do have a maximum rate at which they can change pixel colors, but most of them are pretty limited. You can see streaking and fading because the redraw rate is so poor on many of them.

That's the slow pixel response time. Technically, they do have a refresh rate, but it is irrelevant to said streaking and fading.

Originally posted by Lifthz
Honestly anyone who thinks you need even more than 60fps in games is a moron.

Sigh. The only reason that you would (normally) desire framerates above 60fps, as I said above, is so that your videocard can keep the game playable and you can still turn on FSAA and/or AF.

Or to just brag about your videocard's power.
 
Honestly anyone who thinks you need even more than 60fps in games is a moron.

You certainly don't need it, but the fact is you CAN tell the difference above 60fps. But even that's not the point. The real point is not what fps you get normally, but rather what is the lowest it will drop to in the worst of conditions. I wouldn't be satisfied if I got 60fps in D3 most of the time, but it dropped to 20fps or something when lots of lights and monsters all started doing things all at once.
 
w00t! Another refresh rate discussion :)

It has been PROVEN by US MILITARY INVESTIGATIONS that the human eye can distinguish around 200 refreshes per second. And that there is a distinct difference between 60 and 200. (Stare long enough at 200 and 60 would appear slow.)

And to those thinking FSAA and AF immediately mean a drop in FPS, not necessarily... I forgot to turn them on for Morrowind when I played. I was at the beginning house, just saved, and noticed I got around 30-35 fps. So I quit, turned on 4xFSAA and 16xAF, and started Morrowind again. FPS? 30-35.
 
I wasn't pointing any fingers, I was only saying if anyone actually thinks that you need more than 60fps.

Either way, movies in the movie theatre run at 30fps. Wait, not even... they run at 24fps. They're just very steady and even instead of jumping from one framerate to another. Once games run at a steady framerate of 24+ frames, then I'm happy.

The main concern is the steadiness, not the amount of frames.

Originally posted by dawdler
w00t! Another refresh rate discussion :)

It has been PROVEN by US MILITARY INVESTIGATIONS that the human eye can distinguish around 200 refreshes per second. And that there is a distinct difference between 60 and 200. (Stare long enough at 200 and 60 would appear slow.)

And to those thinking FSAA and AF immediately mean a drop in FPS, not necessarily... I forgot to turn them on for Morrowind when I played. I was at the beginning house, just saved, and noticed I got around 30-35 fps. So I quit, turned on 4xFSAA and 16xAF, and started Morrowind again. FPS? 30-35.

You can prove that Jerry Seinfeld lives up my butthole even when I'm taking a dump. It doesn't matter.

Games do not need to run at any more than 60fps. You will not see choppiness unless it's not running steady.
 
Originally posted by dawdler

And to those thinking FSAA and AF immediately mean a drop in FPS, not necessarily... I forgot to turn them on for Morrowind when I played. I was at the beginning house, just saved, and noticed I got around 30-35 fps. So I quit, turned on 4xFSAA and 16xAF, and started Morrowind again. FPS? 30-35.

Well, simpler games, such as Morrowind, don't hose your videocard when you turn on FSAA and AF. Most games, though, do show a drop, especially when FSAA is enabled.

Ignore that. Lack of knowledge about Morrowind led me to type before I thought to see what the game looks like.
 
Movies are different; they blend each frame together to form motion, meaning the effective fps is something like twice that.

And yes, you can distinguish more than 60 fps. I would guess around 80 fps. Face it, a refresh rate of 60Hz HURTS MY EYES! I can see it shimmer. On the other hand, 120 like I use now, is totally smooth. I know 80 is totally smooth too. 60 isn't.
 
Originally posted by Lifthz
Either way, movies in the movie theatre run at 30fps. Wait, not even... they run at 24fps. They're just very steady and even instead of jumping from one framerate to another. Once games run at a steady framerate of 24+ frames, then I'm happy.

The main concern is the steadiness, not the amount of frames.

Movies do run at 24fps, yes, but the frames are blurred so that it seems smoother. (Try pausing a video tape). Computers, though, show an unblurred image unless told otherwise. Therefore, most games need to run somewhat faster than 24fps to get the same effect as film at 24fps.
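Here's a rough way to picture that blur: a film frame averages everything the shutter saw during the exposure, while a game frame is a single instant. A toy Python sketch with invented numbers (it assumes a 180-degree shutter, i.e. the shutter is open for half of each frame, which is a common film convention):

# An object moving at 2400 pixels/sec, captured at 24 fps.
speed = 2400.0               # pixels per second (invented)
frame_period = 1.0 / 24.0

# A game frame samples one instant, so the object appears as a sharp point.
# A film frame keeps the shutter open for half the frame time, so the object
# is smeared across everything the exposure saw.
shutter_open = frame_period / 2.0
smear_px = speed * shutter_open

print("game frame: sharp object at a single position")
print(f"film frame: object smeared across {smear_px:.0f} pixels")
# That smear is exactly the frame-to-frame blending that games at 24fps lack.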
 
We don't recommend turning multisample antialiasing on for cards that don't have centroid sampling... your mileage may vary.

This still raises the question: Q3 didn't have this problem, so what is he talking about? And will ATI cards be fixed in time?

Worse, doesn't VALVE already have cards that support centroid sampling just fine? Why couldn't they use those cards for the movie?
 
Originally posted by dawdler
On the other hand, 120 like I use now, is totally smooth. I know 80 is totally smooth too. 60 isn't.

Most of the time, anything above 72 or so is smooth.
 
Originally posted by Lt. Data
Well, simpler games, such as Morrowind, don't hose your videocard when you turn on FSAA and AF. Most games, though, do show a drop, especially when FSAA is enabled.

LOL :D
Simpler games?! Morrowind is one of the most graphically packed games there is!!!!! And it's OLD! It renders EVERYTHING, even things you don't see; it renders, lights and animates it all. It is a slowass engine, but one that can display 100,000 polygons at 20 fps on a Geforce DDR. Simple? Tell me the engine that can do the same.
 
Originally posted by dawdler
LOL :D
Simpler games?! Morrowind is one of the most graphically packed games there is!!!!! And it's OLD! It renders EVERYTHING, even things you don't see; it renders, lights and animates it all. It is a slowass engine, but one that can display 100,000 polygons at 20 fps on a Geforce DDR. Simple? Tell me the engine that can do the same.

OK, so I confused Morrowind with a 2D game. Sorry. I realized that after I posted the message but didn't edit it just yet.

Edit: earlier message is now fixed.
 
Originally posted by Lt. Data
OK, so I confused Morrowind with a 2D game. Sorry. I realized that after I posted the message but didn't edit it just yet.
Thought of Daggerfall? :)
 
For those of you arguing, all of the theories about how many fps the human eye can see could very easily be misconceptions.

There are SO many variables that could affect your perception of fps, that you can't pin down what the eye can really see.
 
Originally posted by dawdler
Thought of Daggerfall? :)

:LOL: Not quite that bad. More along the lines of games like Age of Empires, SimCity, etc.
 
Please could some mod close this thread? New guys just ask the same questions again and again, and it's all in here... nobody reads all 12 pages of this thread.

I doubt that there will be new information on this issue in the next few days, and anyway, when the situation changes it's worth a new thread.

DaFire
 
Originally posted by Sorris
Sorry qckbeam - but who are you? Could you be providing us with anti nVidia sentiment for ATI's gain? News like this can cost a company millions of dollars (and directly affect my next purchase, as I'm about to buy a new card), so excuse my bluntness.

Anywho, if this information is true then thank you for the heads-up, but until I see it confirmed I remain a sceptic.

I agree.
 
agreed

Yea, so I'm responsible for some of the tangents. Sorry. Most of these discussions should have been new topics.

"Shutting up now, sir."
 
dx9 hardware issue?

If this is a dx9 hardware issue, I just don't see the difference between ATI or Nvidia, and it shouldn't affect the Geforce 4 at all, right, because that's a dx8 card. Seems to me that it's Valve's problem with their software. I'm thinking dx9 hardware is dx9 hardware, and as it is now ATI won't run it either, so if they could possibly fix it, why couldn't Nvidia? I'm thinking it's Valve's problem and they don't want to fix it or take blame for it.
 
Re: dx9 hardware issue?

Originally posted by wingmaster
If this is a dx9 hardware issue, I just don't see the difference between ATI or Nvidia, and it shouldn't affect the Geforce 4 at all, right, because that's a dx8 card. Seems to me that it's Valve's problem with their software. I'm thinking dx9 hardware is dx9 hardware, and as it is now ATI won't run it either, so if they could possibly fix it, why couldn't Nvidia? I'm thinking it's Valve's problem and they don't want to fix it or take blame for it.

"dx9 hardware is dx9 hardware" is an over simplification, just because they're both dx9 hardware doesnt mean they do everything the same way. they both are supposed to have a certain set of operations that they can perform that make them DX9 compatible, but how they go about doing those things is different, if it was all the same then the cards would be the same speed and there would be no competitiveness in the videocard industry.
 
Re: Re: Re: Frames per Second

Originally posted by Lt. Data
It raises the maximum frame rate from the refresh rate to the max the video card can put out


No.

Let me try to explain:

Your actual framerate is how many different images your monitor draws per second.

Your "calculated" framerate is how many times your video card draws a scene a second. This is the number that can exceed your refresh rate if you disable v-sync.

NEVER can your monitor actually display more frames per second than it is refreshing. If your video card draws 160 frames in a second, but your refresh rate is 80 Hz, half of those frames are lost, never to be displayed. In addition, there are ugly "tearing" effects as partial frames are rendered without v-sync. While your actual framerate does not improve, your image quality goes down. Disabling v-sync is useful for benchmarking because we want to see the raw power of the videocard, but it is useless for gaming.
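A back-of-the-envelope sketch of that point, with idealized timing (a toy model, not how a real swap chain behaves):

# Card finishes a frame every 1/160 s; the monitor grabs whatever frame is
# newest every 1/80 s. V-sync is off, so the card never waits.
render_hz, refresh_hz = 160, 80

render_done = [i / render_hz for i in range(render_hz)]   # one second of frames
shown = set()
for r in range(refresh_hz):
    refresh_start = r / refresh_hz
    # The newest frame the card has finished by the time this refresh begins:
    newest = max(i for i, t in enumerate(render_done) if t <= refresh_start)
    shown.add(newest)

print(f"card rendered {len(render_done)} frames, monitor displayed {len(shown)}")
# -> card rendered 160 frames, monitor displayed 80; the other half never
#    reach the screen, exactly as described above.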



Not quite. Turning on FSAA and AF does lower your framerate, but if you're at, say, 100fps without it, it should still be playable with FSAA and AF at a moderate level.

I never said it didn't lower your framerate. But we aren't talking about the performance effect of FSAA or AF in this thread. However, if you have FSAA and AF enabled, and the game tells you you are getting 60FPS, that's what you are actually getting. Disabling FSAA or AF would raise that number. You don't need a higher stated framerate with FSAA and AF enabled to make it look good.



That's the slow pixel response time. Technically, they do have a refresh rate, but it is irrelevant to said streaking and fading.

I know, I said that.
 
The thread won't be closed because it's on bluesnews as a story.

I play at 1024x768x16, so all this talk about some glitch on the high end doesn't bother me.

I'd rather hear talk about the MP of HL2, since CS2 and TF2 are rumored to be separate add-ons that will come out later.

To make this talk go away, Gabe needs to spill the beans on the MP that will ship with HL2. Yeah, the SP is important, but most will beat the game in one weekend playing non-stop, and then it's on to MP.
 
People love bringing up the 24FPS film argument in debates like these, but it is always based on a misunderstanding of the principle.

24FPS is the slowest framerate at which the eye is fooled into thinking it is seeing consistent motion. Anything slower and you'll start to recognize individual frames. And for the record, modern motion pictures are actually displayed at 48FPS because each frame is displayed twice in order to reduce apparent flicker.

However, 24FPS is a poor framerate at which to capture fast action because there is too much blur recorded on each frame. Next time you watch a film, pay attention to how little the camera actually moves. Even pans are usually executed at a leisurely pace, simply because trying to move the camera too fast would create a noticeable strobing effect, and when the camera does move quickly, there is usually an object in the frame that provides a relatively stationary focal point. Look around the object and you'll see just how bad 24FPS is at capturing fast motion.

Given the above information, I think you can understand why playing a game at 24FPS is certainly not ideal. Faster is definitely better, but you also hit the point of diminishing returns rather quickly. Anything above 40FPS will appear reasonably fluid to your eye, and as someone pointed out earlier, consistent frame speed is better than a wildly varying one.
 
Re: Re: Re: Re: Frames per Second

Originally posted by JackiePrice
No.

Disabling v-sync is useful for benchmarking because we want to see the raw power of the videocard, but it is useless for gaming.



Although this is technically true, disabling v-sync CAN be useful in games. If your video card is capable of rendering more frames than your monitor can handle and v-sync is disabled, it will attempt to render them all, causing your monitor to display several partial frames (which causes tearing). Even though not all the frames displayed are complete (perhaps several only render to the 3/4 point, or the halfway point), there are more total frames (or changes to the scene) being at least partially shown by the monitor, which in some cases can appear smoother (albeit with tearing) than with v-sync enabled.
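For what it's worth, where the tear shows up is just simple arithmetic: the monitor scans top to bottom over one refresh period, and a buffer swap partway through splits the picture at whatever line the scan had reached. A rough Python sketch with invented numbers:

# The monitor draws top-to-bottom over one refresh period; a frame that
# arrives mid-scanout splits the picture where the scan happened to be.
refresh_hz = 85
visible_lines = 768            # e.g. a 1024x768 mode
refresh_period = 1.0 / refresh_hz

swap_offset = 0.004            # new frame arrives 4 ms into the scanout (invented)
tear_line = int(swap_offset / refresh_period * visible_lines)

print(f"tear appears around scanline {tear_line} of {visible_lines}")
# Above that line is the new frame; below it, the old one is still showing.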
 
Re: Re: Re: Re: Frames per Second

Originally posted by JackiePrice

I never said it didn't lower your framerate. But we aren't talking about the performance effect of FSAA or AF in this thread. However, if you have FSAA and AF enabled, and the game tells you you are getting 60FPS, that's what you are actually getting. Disabling FSAA or AF would raise that number. You don't need a higher stated framerate with FSAA and AF enabled to make it look good.

OK, I'll try this one more time. Lifthz said that there was no need for framerates above 60fps, and I'm saying that there kinda is: the ability to have a playable game with FSAA and AF on. Most of us already know there is a performance hit, but if your videocard can give you above 60fps in a game, you can therefore turn on some FSAA or AF. I'm just saying there is a partial need for framerates above 60fps. Besides, how else would you brag that your videocard is the most powerful? :D

I think we would all like to know what kind of MP is included with HL2. Multiplayer and moddability are what keep Half-Life (the first one) an immensely popular game STILL!
 
Originally posted by dawdler
w00t! Another refresh rate discussion :)

It has been PROVEN by US MILITARY INVESTIGATIONS that the human eye can distinguish around 200 refreshes per second. And that there is a distinct difference between 60 and 200. (Stare long enough at 200 and 60 would appear slow.)

And to those thinking FSAA and AF immediately mean a drop in FPS, not necessarily... I forgot to turn them on for Morrowind when I played. I was at the beginning house, just saved, and noticed I got around 30-35 fps. So I quit, turned on 4xFSAA and 16xAF, and started Morrowind again. FPS? 30-35.

The eye doesn't take in a whole picture at once but is simultaneously taking in small parts of information and updating them in what you can see, so you will be able to tell the difference if you concentrate. But if you don't take any notice, then it won't occur to you that the frame rate is under 30-50fps, because your eye only updates the entire picture every so many 100ths of a second.
 
Sorry qckbeam - but who are you? Could you be providing us with anti nVidia sentiment for ATI's gain?

I am just a member of this forum who e-mailed Valve about the lack of FSAA in the videos. I am not posting anti-Nvidia statements for the gain of ATI, I do not work for ATI, I am not in any way connected with ATI, and I do not prefer ATI over Nvidia. Whatever I have posted here came directly from a programmer named Gary McTaggart.
 
Also remember: this issue started not with qbm asking a question about ATI; he asked a question about the lack of AA in the videos.
 
I'm not all that knowledgeable in game development, but from what I have read, it still sounds like some of the problem is Source's implementation of DirectX 9 and the way it accesses texture images. So the question I would want to ask is: would it be possible, after the release of Half-Life 2, to give us the textures individually instead of packed into larger images, so that the multisampling problem will not occur, while patching the game to access the new files?

I've seen a few people say that this method of packing textures doesn't save as much processing time as one would think, so I wonder if it would be a better alternative for those that can't push their resolution to 1600x1200, i.e. if you could turn on anti-aliasing at a lower resolution and use the slower method of accessing textures, would it be less system intensive than using multisampling at one step up in resolution?
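To illustrate why packed textures and multisampling clash: at polygon edges, multisampling can end up with texture coordinates interpolated slightly outside the triangle, and when textures share one packed sheet, "slightly outside" lands in the neighbouring texture; centroid sampling keeps the sample inside the covered area, which is why it matters here. A toy Python sketch with invented numbers:

# Toy model: a 1024-texel-wide sheet packed with 256-texel-wide sub-textures.
sheet_w = 1024
tile_w = 256
tile = 1                                   # our surface uses the second tile
u_min = tile * tile_w / sheet_w            # 0.25
u_max = (tile + 1) * tile_w / sheet_w      # 0.50

# At a triangle edge, the interpolated coordinate can overshoot the edge by a
# fraction of a texel (the overshoot below is invented for illustration).
overshoot_texels = 0.75
sample_u = u_max + overshoot_texels / sheet_w

print(f"sample u = {sample_u:.4f}, inside our tile: {u_min <= sample_u < u_max}")
# -> False: the sample reads the neighbouring packed texture instead, which
#    shows up as seams along edges when multisampling without centroid sampling.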

If this is not possible then ignore me :p
 
Um, Nvidia these days... what's there to like, unless you like 2nd best?
 
about 60 fps and eye limits and stuff

The air force states that air force pilots can differentiate between 200 fps and 100 fps. Now that's the USAF, top-of-the-line equipment with crazy specs and billions of dollars of research and testing, telling us that.

Even though I do understand some ppl saying we can't see past 30 fps and stuff, everyone's eyes are different. I personally can see the difference between an 85 Hz and a 120 Hz refresh rate monitor when I see one. I always get dizzy when I go to the movies cuz their horrible frame rate and flickering get to me. (so I just touch my girlfriend :) )

Anyway, it's been proven that to the trained eye, 30 fps or 60 fps is not the barrier.

All I can assume is that since ppl have different reaction times, and considering some brain signal speed-ups and slow-downs, and considering the fact that your brain only remembers about 0.5 sec of what you see with your eyes, someone could roughly calculate how much we can see - maybe that's where 60 fps comes from. But since everyone is different, we can't have a precise number.
 
I think it's time for us to get rid of this thread, it's gone on long enough and it's not even about AA anymore. The official word has been given by Gabe himself so this post is now useless.
 