Quick Question about the graphics

The 3.0 has advantages over the 3.2 GHz anyway, and you can just OC it later.
 
"e3 demo was at 30 fps because of vsync, but because of vsync it looekd like 50-60"

In an email, Gabe himself said the E3 demo ran at 60 FPS.
 
The E3 demo only looks like 30 FPS because of the recording compression used to keep the file size down. BaNDiT is right; it was at around 60 FPS.
 
Look at the sticky info thread and you will find an email about this.
 
"We target 60 FPS, and then adjust details accordingly. At E3 we were running 60 Hz on a 128 MB 9800 at 13something by 7something (the resolution of the plasma screen, which I don't remember exactly).

We are in the middle of a lot of performance analysis and tuning with ATI and NVIDIA. We will release benchmarking tools before we ship.

-----Original Message-----
From: The boogeyman
Sent: Wednesday, July 02, 2003 9:48 AM
To: '[email protected]'
Subject: HL2 performance numbers


Hi Gabe,

I feel guilty sending this to you since I suspect you're being inundated with emails every minute of every hour buuuuuuuuut.... ;-)

Any idea if you guys are planning on releasing some performance metrics/benchmark numbers to the net before Half-Life 2's release? I'm sure there are legions of folks out there wondering what kind of performance they can expect out of the game running on high-end ATI/nVidia hardware. 30 FPS? 60 FPS? 100 FPS? I realize the engine is highly scalable, but for the 'ultimate' HL2 experience (i.e. 1024x768 to 1280x960, 32-bit color, all the eye candy cranked, 60 FPS+), are the ATI 9800 Pros and nVidia 5900 Ultras going to be able to deliver? I know I for one want to pull the trigger on a 9800 Pro, but there's no way I'm jumping in if it's not going to run the game the way it deserves to be run. I know the hardware AND gaming sites would love to get this kind of data, a la the preliminary Doom 3 benchmarks that id/nVidia released a while ago.

Any chance?

96 days and counting!

Regards!
" -- JGF.
 
From what I saw, the video got choppy (lowered FPS) in certain parts. I noticed it most in the last scene, when you are looking around from up top on the elevator. It will all come down to how well they do the LOD system, not your card. The 9800 got stressed from what I saw, so they are probably tweaking as we speak.
 
Hey, stay on topic, we're talking about me and BaNDiT, not video 'n' stuff. God, you nooooooooooooooooooobie

:E
 
Originally posted by droper
From what I saw, the video got choppy (lowered FPS) in certain parts. I noticed it most in the last scene, when you are looking around from up top on the elevator. It will all come down to how well they do the LOD system, not your card. The 9800 got stressed from what I saw, so they are probably tweaking as we speak.

I saw that too, but I think it may have been QuickTime doing that rather than the game. QuickTime is known to get "choppy" in parts of a video. It happened on a few occasions throughout the movie - another instance being when Gordon "rushes" the antlions towards the sentries and then follows them.
 
Yeah, I actually experienced some of that, even in areas that wouldn't be particularly taxing on any system, so my guess would be it's a QuickTime thing.
 
Hmmm, how do you guys think it will run on my system?
P4 1.9 GHz
768 MB RDRAM
GeForce4 Ti 4600 128 MB
Sound Blaster Live! Value (yuck)

thanx
 
Originally posted by theotherguy
Hmmm, how do you guys think it will run on my system?
P4 1.9 GHz
768 MB RDRAM
GeForce4 Ti 4600 128 MB
Sound Blaster Live! Value (yuck)

thanx

Here
 
Which is better, the nVidia 256 MB GeForce 5900 Ultra or the ATI Radeon 9800 256 MB? Just curious.
 
Originally posted by MIWojo11
Which is better, the nVidia 256 MB GeForce 5900 Ultra or the ATI Radeon 9800 256 MB? Just curious.

The 5900 is better, but now I'm probably gonna get flamed by people who think nVidia cheats :(
 
I would go with the nVidia. Why, you ask? Well, because ATI driver support has always been lacking.
 
Yeah, I saw two different pages of benchmarks for both. On one page the 5900 Ultra killed the ATI Radeon 9800, and on the other it was the opposite, so I was confused.
 
Damn it, 600 MHz POS [barely runs GTA3 on lowest EVERYTHING] and no money for an upgrade. Why, HL2, WHY???!!!?!

Graphics will look awesome though.

Anybody feel like giving me a grand US?
 
The nVidia 5900 and Radeon 9800 256 MB each have advantages over the other, but in my opinion the Radeon is superior, as it delivers better image quality and anti-aliasing.
 
As to the choppiness, I was worried about that too. Then I played the movie file on a far superior computer, and the choppiness wasn't there. So what you're seeing could well be QuickTime, or even the videocam itself chugging. No real way to tell at this point.

Benchmarking this game is going to be VERY hard given that it scales the LOD and other engine features dynamically to take advantage of different hardware.

Many people are already confused: "only 60 FPS with a top-of-the-line system?" They don't realize that the engine targets 60 FPS on ALL systems, scaling things back or forward depending on the load to get the best look and feel possible on any system.
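
To make the "scaling back or forward" idea concrete, here's a minimal sketch of that kind of feedback loop (purely hypothetical C++, not Valve's actual code; the sleep is a stand-in for real rendering work):

#include <algorithm>
#include <chrono>
#include <cstdio>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;
    const double kTargetFrameMs = 1000.0 / 60.0;  // 60 FPS frame budget
    int detailLevel = 5;                          // 0 = lowest, 10 = highest

    for (int frame = 0; frame < 300; ++frame) {
        auto start = clock::now();

        // Stand-in for rendering: cost grows with the detail level.
        std::this_thread::sleep_for(std::chrono::milliseconds(4 + 2 * detailLevel));

        double frameMs = std::chrono::duration<double, std::milli>(
                             clock::now() - start).count();

        // Over budget -> drop detail; comfortably under -> add detail back.
        if (frameMs > kTargetFrameMs)
            detailLevel = std::max(0, detailLevel - 1);
        else if (frameMs < kTargetFrameMs * 0.8)
            detailLevel = std::min(10, detailLevel + 1);
    }
    std::printf("settled at detail level %d\n", detailLevel);
    return 0;
}

The point is that the FPS number stays pinned near the target while the detail knob is what actually moves, which is why raw FPS stops being a useful comparison.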
 
But he also said that you can turn off the "dynamic LOD" so that it doesn't aim for 60 FPS.
 
Originally posted by LoneDeranger
The 5900 is better, but now I'm probably gonna get flamed by people who think nVidia cheats :(

Well, if it's better, then why does it perform worse than a 9800 when nVidia's cheats are disabled?

As an aside, ATI's anti-aliasing looks better, and their overall IQ is better with or without AA and AF.
 
I have an ATI, and their driver support and misc. patches are pretty shitty. Sometimes I have to downgrade to an older driver just to play newer games! Now that's ridiculous.
 
Originally posted by Fiddle
I have an ATI, and their driver support and misc. patches are pretty shitty. Sometimes I have to downgrade to an older driver just to play newer games! Now that's ridiculous.

Well, they don't release "misc. patches"; they release a new driver set and a new control panel at the same time. Have you tried the 3.5 Cats? I don't know of any problems with any current games on them.
 
Damn, my PC could probably run this at best graphics just fine...

But I want to see it well too. I'm probably gonna be running this at 40-60 FPS, but on a nice 50" DLP HDTV. SWEET, especially using that DVI in/out.
 
Originally posted by Fiddle
I have an ATI, and their driver support and misc. patches are pretty shitty. Sometimes I have to downgrade to an older driver just to play newer games! Now that's ridiculous.

With what games do you have problems? I always upgrade my ATi drivers and have never had any problems. The latest games I played were Vice City and Elite Force II.
 
But he also said that you can turn off the "dynamic LOD" so that it doesn't aim for 60 FPS.

Sure, but that doesn't solve the problem: that's not necessarily how everyone is going to play. The basic problem is that before, you could compare different systems strictly by FPS: more FPS = better, in a very obvious and linear fashion (unless there were graphics cheats). Thus, you could use benchmarks to judge how well a system might run a game. But in this case, a better system might not run HL2 any faster at all: it might pile on more graphical effects and get the same FPS. How do you compare? A straight non-LOD run just won't give you the same range you get when LOD is in effect, as should be the normal case when playing the game.
 
I think it might just need a new paradigm in benchmarking. Instead of measuring FPS, maybe they will encourage measuring other factors: for example, what level of detail can be maintained at 60 FPS? Then again, people are stubborn about this kind of thing, and if it's not widely adopted, it wouldn't be a very good measurement across the board.
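
As a toy illustration of that kind of measurement (hypothetical C++; renderFrameMs is a made-up stand-in for real rendering, not any actual API), you could step the detail setting up and report the highest level whose average frame time still fits the 60 FPS budget:

#include <chrono>
#include <cstdio>
#include <thread>

// Stand-in for rendering one frame at a given detail level; returns ms taken.
double renderFrameMs(int detail) {
    auto start = std::chrono::steady_clock::now();
    std::this_thread::sleep_for(std::chrono::milliseconds(10 + 2 * detail));
    return std::chrono::duration<double, std::milli>(
               std::chrono::steady_clock::now() - start).count();
}

int main() {
    const double kBudgetMs = 1000.0 / 60.0;  // one frame at 60 FPS
    int best = -1;
    for (int detail = 0; detail <= 10; ++detail) {
        double totalMs = 0;
        for (int f = 0; f < 60; ++f) totalMs += renderFrameMs(detail);
        if (totalMs / 60.0 <= kBudgetMs) best = detail;  // still holds 60 FPS
        else break;                                      // budget blown
    }
    std::printf("highest detail sustained at 60 FPS: %d\n", best);
    return 0;
}

The reported number would be "detail level held at 60 FPS" rather than raw FPS, which sidesteps the comparison problem above.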

As for running old hardware with high options: don't bother. DX7 (or whatever) cards trying to emulate just about any DX8 or DX9 function would probably cut your framerate to 30% on its own. Buy new hardware or accept less-than-perfect graphics. We all will, if the Source engine is as extensible as Gabe suggests. They'll have Source engine advancements and texture resolution upgrades for the next generation of cards too, I'll bet. As long as their sales stay high enough to sustain that level of investment...

But I could be wrong.

-Phision
 
By the way, people, do you really check benchmarks for FPS? Heh, I never did that myself, so I don't see the point in it. I'm talking about 3DMark 2003; do you guys use something else?
 
The demo may have been run on a 9800 Pro / 2.0 GHz, but the demo was really a movie, wasn't it? What was the movie recorded on?
 
Oh, and I just validated the whole QuickTime choppiness thing.

I have both full movies, in Windows Media and QuickTime, and the WMV doesn't chop at all during the sentry gun/antlion scene or the elevator.
 
Peks: it was a demo file, so it's not really like recording a movie. The computer it was running on renders it all in real time.
 
A GeForce Ti 4200 128 MB DDR should run the game perfectly... except for no DX9 support.
 
Oh, I thought I remembered seeing somewhere that they said it was a movie file, pre-recorded and not a demo. Oh well.
 