Doom 3 innovation news - interesting

Doom 3's framerate will be capped at 60 FPS.
Why?
That's roughly the average of what the human eye captures.

"Get out of my bathroom with that damn camera!"
-John Carmack

After John put on a towel, he said the following:

"The game tic simulation, including player movement, runs at 60hz, so if it rendered any faster, it would just be rendering identical frames. A fixed tic rate removes issues like Quake 3 had, where some jumps could only be made at certain framerates. In Doom, the same player inputs will produce the same motions, no matter what the framerate is."

So, in other words, they're saying something like: if you get more than 60 FPS, it will still look and feel like 60 FPS. I dunno, something like that.
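Carmack's fixed tic rate can be sketched in a few lines (illustrative Python only, nothing to do with id's actual code): because the simulation always advances in exact 1/60th-second steps, the same inputs produce the same motion no matter how fast the renderer runs.

```python
# Fixed game tic: physics advances in constant 1/60 s steps, so results
# are deterministic regardless of how fast frames are rendered.
TIC_RATE = 60
DT = 1.0 / TIC_RATE

def simulate_jump(num_tics, jump_speed=8.0, gravity=20.0):
    """Advance a vertical jump by num_tics fixed steps; returns height."""
    y, vy = 0.0, jump_speed
    for _ in range(num_tics):
        vy -= gravity * DT   # gravity applied per fixed tic, not per frame
        y += vy * DT
    return y

# Half a second of simulation is always exactly 30 tics, so the jump
# height comes out identical on a 30 fps machine and a 300 fps machine.
print(simulate_jump(30))
```

This is also why rendering above 60 fps would just redraw identical frames: between two tics, nothing in the simulation has moved.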

You can catch this news at IGNPC.com

-Just thought I'd let you know:)
 
that would be nice, if i could even achieve 60 fps in doom3 ;x
 
Not like there is any hardware capable of achieving an average 60 FPS in Doom3 anyway.
 
And there's me getting ready for 100fps of Doom III glory. Bollocks.
 
I'm no coder, but as far as I'm aware, this technology has already been incorporated in Halo.

All player animation etc. is only ever rendered at 30fps, regardless of what your engine frame rate actually is.
 
Originally posted by rec
I'm no coder, but as far as I'm aware, this technology has already been incorporated in Halo.

All player animation etc. is only ever rendered at 30fps, regardless of what your engine frame rate actually is.

Even though 30 fps looks horrible.
 
Originally posted by rec
I'm no coder, but as far as I'm aware, this technology has already been incorporated in Halo.

All player animation etc. is only ever rendered at 30fps, regardless of what your engine frame rate actually is.

the animations were originally made at 30fps, but halo as a game can run faster than 30fps, so the animations get sped up, but people want high framerates :)
 
Originally posted by nw909
Even though 30 fps looks horrible.

You take your 30fps and like it! :cheese:
 
Originally posted by Xtasy0
the animations were originally made at 30fps, but halo as a game can run faster than 30fps, so the animations get sped up, but people want high framerates :)

I understood that no matter what frame rate you were running, the model animations were still at 30fps.
 
Originally posted by rec
I understood that no matter what frame rate you were running, the model animations were still at 30fps.

hm, possibly.
 
the flicker you get from too low frame rates is different from seeing low-framerate animations. if you're displaying 60 fps you won't see jerkiness.

consider movies, which run at about 24 fps but are projected with each frame flashed twice, effectively 48 per second, and they look fine.

and i don't want to get into this whole argument, but anyone who claims 60 fps hurts their eyes and they need 100fps is full of shit.
 
You cannot compare the technology behind a TV to the technology behind a monitor.

- Low refresh rates alone cause a "flicker", and can strain sensitive eyes.

- Low frame rates alone will not strain the eyes, but most people can see a drastic difference between 60fps and 100fps. The motion isn't as smooth, it's as simple as that.
 
Originally posted by LoneDeranger
Not like there is any hardware capable of achieving an average 60 FPS in Doom3 anyway.
Of course there is: 86 FPS with a GeForceFX5900 Ultra

One good thing that will come out of this is that now no pro gamer will attempt to switch off all the detail in the game to achieve 200 FPS.
 
Isn't this used in nearly all major games? Old games that don't take the CPU clock into account run like they're on crack. An old 2D game would move so fast it's impossible to do anything (you have to slow down the CPU clock; an example is UFO, nearly impossible to run on a 2GHz box). I thought most games nowadays took this into account and limited updates to a certain maximum rate, independent of FPS.
 
There's a difference between how many times the game logic updates and how many times the screen is refreshed. Since most game models don't have enough animation frames for 200 FPS, what they do in most games is interpolate between the animation frames to make it smoother.
I can imagine that when a model is somewhere halfway between two animation frames, it's gonna be difficult to keep the per-poly hit detection accurate. I think that's why Carmack decided to cap the framerate: to make it more consistent with the animation frames and game logic.
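The interpolation Arno mentions can be sketched like this (hypothetical Python, not any particular engine's code): animation keys authored at 30 fps get blended so a faster renderer sees in-between poses instead of repeated ones.

```python
def sample_animation(keyframes, key_fps, t):
    """Linearly interpolate keyframed values (authored at key_fps)
    for an arbitrary render time t in seconds."""
    pos = t * key_fps                  # position measured in keyframes
    i = int(pos)
    if i >= len(keyframes) - 1:        # clamp past the last keyframe
        return keyframes[-1]
    frac = pos - i                     # how far between key i and key i+1
    return keyframes[i] + (keyframes[i + 1] - keyframes[i]) * frac

# Keys authored at 30 fps; a 90 fps renderer samples a third of the
# way between key 0 and key 1 on its first frame after t=0.
keys = [0.0, 3.0, 6.0, 9.0]
print(sample_animation(keys, 30, 1 / 90))
```

And a model caught halfway between two authored poses is exactly the case where per-poly hit detection gets awkward.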
 
60 is fine, excellent for me. 30 is great too, shut up and eat your peas.
 
Originally posted by Arno
Of course there is: 86 FPS with a GeForceFX5900 Ultra

One good thing that will come out of this, is that now no pro gamer will attempt to switch of all the detail in the game to achieve 200 FPS.

From that article: "Bottom line is this; whoever has the fastest video card with image quality to match when DOOM 3 hits retail shelves 'wins'."

:LOL: I guess that was before HL2 was announced!
 
As I understood it, this IS about FPS (frames per second).
Carmack wants to cap the FPS at 60 so it runs in sync with the game logic updates.
 
everything above 25 frames (for players maybe 30...) is ok. every film has 24 - 25 frames per second. the human eye can't tell the difference between 25, ok 30, and 100 frames. you simply can't.
 
When objects on the screen move at high speed (which happens a lot in a game), you can see a difference between 30 FPS and 60 FPS.
 
On the contrary, as I got it he isn't talking about FPS, he is talking about game ticks (animations, physics updates (I suppose), etc) being updated at a constant 60 times a second. It is still 60 times a second whether you've got 30 fps or 90 fps. Similar to the frequency of the monitor, it's also updating 75 (if you've got that) times a second no matter your fps.

The system with old games is of course different, but the theory is the same. In UFO, if you've got a 2GHz box, everything updates so fast you hardly see it update. It is not locked at any particular speed at all; it simply runs as fast as the computer can make it. The computers at the time weren't fast enough to ever reach this speed, so the game played fine even on the fastest thing available.
This could have been avoided if they had synced it with a timer (updating a certain number of times per second, i.e. Hz) instead of the fps (or nothing at all): even if you got 3000 fps, the game would still only update at a certain interval, for example 60 times a second.

Or at least, that's how I got it. I could be wrong too, hehe :)
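What dawdler describes is the classic fixed-timestep loop. A minimal Python sketch (names invented for illustration) shows the update count staying at 60 per simulated second whether the machine renders 30 or 120 frames in that second:

```python
UPDATE_HZ = 60
STEP = 1.0 / UPDATE_HZ   # length of one game tick in seconds

def run(frame_times, update):
    """Consume rendered-frame durations; call update() at a fixed rate."""
    acc = 0.0
    for dt in frame_times:    # dt = how long each rendered frame took
        acc += dt
        while acc >= STEP:    # burn off elapsed time in fixed ticks
            update()
            acc -= STEP

ticks = []
run([STEP / 2] * 120, lambda: ticks.append(1))  # fast box: 120 fps for 1 s
print(len(ticks))                               # 60 ticks

ticks = []
run([STEP * 2] * 30, lambda: ticks.append(1))   # slow box: 30 fps for 1 s
print(len(ticks))                               # still 60 ticks
```

An old game like UFO effectively skips the accumulator and calls update() once per frame, which is why it goes berserk on a 2GHz box.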
 
Originally posted by Echelon
everything above 25 frames (for players maybe 30...) is ok. every film has 24 - 25 frames per second. the human eye can't tell the difference between 25, ok 30, and 100 frames. you simply can't.

Actually you can. The reason for 100fps and silly numbers like that in games is that none of them have motion blur. An object in Half-Life that moves incredibly fast is still as sharply defined as it would be if it were static. Natural objects that move fast blur naturally, whether you're looking at them or filming them. Unless the camera in question is set to capture the image at a faster framerate, which will give a clearer image but means you have to play it back at the same faster framerate to keep things sharp without slowing everything down (you can't film something at 200fps and play it back at 25fps without getting a slow-motion effect... often used for large explosions and those done with models, to make them look bigger and more dramatic than they really are).

Games don't have motion blur like that, so a fast-moving object at a low frame rate ends up appearing to jump from position to position. If the game had motion blur then it would look fine and pretty realistic (since the human eye can't see fast-moving objects in the real world as clear, sharp objects). Hence the need for silly frame rates to keep things running smoothly. You could of course run it properly at 25fps, but everything would slow down in the game, though every frame would be sharp.

It isn't so noticeable with consoles as they use TVs, which use interlacing (excluding HD TVs playing proper HD pictures, which don't need it). Interlacing, or fields, interpolates two frames to create a third in between, and it's only still around because, even though it's awful to edit with, it's everywhere (though it hasn't actually been needed to display a picture on a TV correctly since the early days of TV). Monitors however don't have that, as each frame on a monitor is progressive: one frame, one picture, no interpolating frames. That's why what looks fine on a TV screen will look like it's tearing or just jerky on a monitor.

The fix is motion blur, but it's not very easy to do in realtime without it looking terrible. The problem with realtime motion blur is that it simply reuses previous frames, slightly faded out each time, to give the impression of motion blur. But this causes the whole image to blur and take on a strange effect, which looks awful. Proper blur would depend on the distance of an object, its direction, its speed, and the field of view of the camera/view. At that point you'd be able to mimic the effect of traveling in a car and looking out of the side window: little to nothing is blurred in the distance, but the road just next to the wheels is one big blur.

Now if they somehow got that to work in HL2, it would be amazing to see in action. It's just difficult to do right now. Also, the trade-off is that if they crack motion blur in first person shooters, then they'll have no problem with DOF (depth of field) either. Have these two running together and it wouldn't be surprising to have the odd one or two people throwing up from motion sickness or vertigo. But it would look really cool... eg: when you go to reload, your view will be on the gun and clip, not what's going on elsewhere, so you'd focus on the gun and the rest would become blurred. Much more realistic than current games. The developers should be trying to do this instead of farting about with bullet time effects ;)
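The cheap realtime approach described above (fading out previous frames) is easy to sketch; the made-up grayscale "pixels" below show how a moving bright spot leaves a trail, and why the whole image smears rather than just the moving object:

```python
def motion_trail(frames, fade=0.6):
    """Accumulation-style blur: each new frame is mixed with a faded
    copy of everything rendered before it. Frames are flat lists of
    grayscale pixel values in [0, 1]."""
    acc = list(frames[0])
    for frame in frames[1:]:
        acc = [fade * new + (1 - fade) * old
               for new, old in zip(frame, acc)]
    return acc

# A bright pixel sweeping right across a 5-pixel scanline:
frames = [[1, 0, 0, 0, 0],
          [0, 1, 0, 0, 0],
          [0, 0, 1, 0, 0]]
print(motion_trail(frames))   # current position brightest, ghosts behind it
```

Proper per-object blur would instead need each object's velocity, depth and the camera's field of view, which is the hard part.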
 
Originally posted by dawdler
On the contrary, as I got it he isnt talking about FPS, he is talking about game ticks
He's talking about both.

From pc.ign.com
At a recent NVIDIA Editors' Day, id Software CEO Todd Hollenshead announced that DOOM 3 will be capped to 60 frames per second in the rendering engine.
Note that Todd mentions rendering engine. Since all animations and physics update at 60Hz, it makes sense to cap the actual framerate at 60Hz too.
 
"The game tic simulation, including player movement, runs at 60hz, so if it rendered any faster, it would just be rendering identical frames. A fixed tic rate removes issues like Quake 3 had, where some jumps could only be made at certain framerates. In Doom, the same player inputs will produce the same motions, no matter what the framerate is."
I thought that was what the discussion was about. Even I can see that it makes sense to cap the fps at 60, but the quote was about it not mattering :)
 
Ugh, please, do some damn research on the technologies behind monitors/televisions and the way they output display. Do some research on refresh rates, frames per second and how they work on both platforms.

The human eye CAN see above 25 frames. We can see MUCH higher.

Fenric has the right idea.
 
WTF?! The difference between 30FPS and 60FPS can sooo be detected!

Consider that the average human reaction time is about 160ms. Running at 30FPS, the computer adds roughly 33ms of lag per frame, making you nearly 20% slower than you would react in real life (i.e. after pressing the button you may have to wait a further 33ms to see the result).

You would be surprised how precise we humans are. The whole reason ASIO (a music thing) was invented is that we can detect differences of <5ms between pressing a note and hearing it play; the purpose of ASIO is to provide a response time of <5ms.
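The arithmetic in that post roughly checks out, under the simplifying assumption that display lag is one full frame time (in practice it's somewhere between zero and one frame, plus whatever the monitor adds):

```python
REACTION_MS = 160   # average human reaction time cited above

def frame_lag_ms(fps):
    """Worst-case one-frame display lag in milliseconds."""
    return 1000 / fps

for fps in (30, 60, 100):
    lag = frame_lag_ms(fps)
    print(f"{fps} fps: {lag:.1f} ms lag = {lag / REACTION_MS:.0%} of reaction time")
```

At 30 fps the one-frame lag is about 33 ms, roughly 21% of a 160 ms reaction time; at 60 fps it drops to about 10%.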
 
besides, if you turn on Vsync your fps is limited to your monitor's refresh rate (often 60) anyhow. And playing without Vsync sux imo.
 
Originally posted by rec
Ugh, please, do some damn research on the technologies behind monitors/televisions and the way they output display. Do some research on refresh rates, frames per second and how they work on both platforms.

The human eye CAN see above 25 frames. We can see MUCH higher.

Fenric has the right idea.
To add to that, and to effectively end the discussion:
US Air Force testing shows the human eye can see, and not only that, it can also determine the type of an enemy aircraft shown for less than 1/200th of a second.

Wanna argue with the US armed forces?! Didn't think so ;)
 
Originally posted by dawdler
To add to that, and to effectively end the discussion:
US Air Force testing shows the human eye can see, and not only that, it can also determine the type of an enemy aircraft shown for less than 1/200th of a second.

Wanna argue with the US armed forces?! Didn't think so ;)

ooh ooh, i wanna argue with the US armed forces. Ok, the eye can't determine anything: it just sees stuff, that image is then transmitted to the brain via the optic nerves, and it's in fact the brain that works out what type of aircraft it is. The eye just records everything it sees.

Does anyone actually know the "assumed" limits of the human eye? I know some university spent a few thousand trying to come up with what they thought was the actual resolution the eye can see at. It was just a guess but interesting all the same. And there are of course limits to how dark something can be before the eye stops working properly, as well as a certain brightness that limits it, distance, etc. There's a commercial on TV at the moment about the eye; I think it says something like the average human can see a candle flame from so many miles away or something.

The eye is pretty interesting, considering it's not really much more than a lens; it's the good ol' optic nerves that are neat. Imagine a computer with things like that in it. Course it'll probably be years or more before anyone could build a monitor to display at the possible resolution of the human eye (and yes, before anyone says anything, I am quite aware the human eye does not have pixels lol :p I know that :))
 
Originally posted by Fenric1138
There's a commercial on TV at the moment about the eye, I think it says something like the average human can see a candle flame from so many miles or something.

40 miles, or was it kilometers, I forget.

:afro:
 
Originally posted by Fenric1138
(and yes before anyone says anything I am quite aware the human eye does not have pixels lol :p I know that :))
Pff, organic pixels, artificial pixels, same thing! (with the difference that we have night/day/color-focused 'pixels' instead of RGB) :)
 
Err, flame me if I'm wrong, but aren't the rod and cone cells of the retina somewhat equivalent to pixels?

dawdler: D'you use MoSlo?

Heh, I suck. Crappy computer, mine. I'm used to playing at less than 30 fps.

Seeya.:E
 
I bet you could achieve this frame rate at 320x240 resolution on next-gen ATi or nVIDIA cards (i.e. post R360 and NV40!) :)
 