Sums it all up nicely.

Far Cry - OpenGL with DirectX 9 shaders, hardware-dependent for shaders; full shaders on a DX8 card leave the water broken.
Half-Life 2 - Direct3D with DirectX 9, not hardware-dependent for shaders, just limited by the power of the graphics card; shaders on a DX8 card = full DX9 shading effects, water not broken. Up to 20 fps more lost due to the extra processing, but the water isn't broken, just a slight slowdown (see the sketch below) :thumbs: :farmer:
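Roughly what that hardware check looks like in practice: a plain Direct3D 9 caps query, the way an engine might pick a shader path from the card's pixel shader version. This is an illustrative C++ sketch only, not actual Far Cry or HL2 code.

[CODE]
// Hypothetical shader-path selection based on the card's pixel shader caps.
// Uses the stock Direct3D 9 API (d3d9.h, link against d3d9.lib).
#include <d3d9.h>
#include <cstdio>

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        std::puts("DX9 path: full PS 2.0 effects (fancy water)");
    else if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1))
        std::puts("DX8 path: PS 1.x fallback (simpler water)");
    else
        std::puts("fixed-function path: no pixel shaders");

    d3d->Release();
    return 0;
}
[/CODE]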
 
Play4Fun said:
but there are a lot of new games that use nVidia's OGL (like CoD, PK, UT2004, etc.), and the only game that has the ATI logo in it is HL2 :\

Rainbow Six 3: Raven Shield (which I believe runs on the Unreal engine) has a nice ATI logo
;)
 
What I mean is that Half-Life 2 can run on low-spec machines, unlike Far Cry - e.g. a DX8 card can run full DX9 stuff. :frog:
 
[SARCASM]ATI will win because they made....uh....the graphics for the GameCube! :rolling: [/SARCASM]
 
I hate this gfx card crap.

HL2 runs @ 40 fps with DX 9.0 ON in singleplayer with a GF4 Ti 4800
HL2 runs @ 80+ fps with DX 9.0 ON in singleplayer with a Radeon 9800

The human brain/eye can't register anything that goes faster than ±40 fps; all that "OMG I've only got 30 fps and I need 120" is BULLSHIT!
Every game that runs at a steady 30 fps is perfectly playable.

The only thing you could need 120+ fps for is certain jumps, like in the Q3 engine, but HL2 doesn't run on the Q3 engine now, does it?
 
It's a "my dad could beat your dad" thing...

Sure your dad could; my dad could put your dad in an institute for the rest of his life :D
 
ferd said:
I hate this gfx card crap.

HL2 runs @ 40 fps with DX 9.0 ON in singleplayer with a GF4 Ti 4800
HL2 runs @ 80+ fps with DX 9.0 ON in singleplayer with a Radeon 9800

The human brain/eye can't register anything that goes faster than ±40 fps; all that "OMG I've only got 30 fps and I need 120" is BULLSHIT!
Every game that runs at a steady 30 fps is perfectly playable.

The only thing you could need 120+ fps for is certain jumps, like in the Q3 engine, but HL2 doesn't run on the Q3 engine now, does it?

Whoa, I'd like to know where you bought a Ti4800 that is DX9-compliant!

And it's not the average framerate that matters; it's the peaks, and especially the valleys, that matter. I can guarantee you that when you run an average of 40 fps, you have drops to 15 in certain areas. A framerate of 80 may drop to 35-45, but that won't be noticeable.
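To put numbers on that (the frame times below are invented, just to illustrate how averages hide the valleys):

[CODE]
// Sketch: an "average" fps figure hides the valleys.
#include <algorithm>
#include <cstdio>
#include <vector>

int main() {
    std::vector<double> frame_ms(95, 22.0);    // mostly ~45 fps frames
    frame_ms.insert(frame_ms.end(), 5, 66.0);  // a few ~15 fps stutters

    double total_ms = 0.0;
    for (double t : frame_ms) total_ms += t;

    double avg_fps   = 1000.0 * frame_ms.size() / total_ms;
    double worst_fps = 1000.0 / *std::max_element(frame_ms.begin(), frame_ms.end());

    std::printf("average fps: %.1f\n", avg_fps);   // ~41: looks fine on paper
    std::printf("worst fps:   %.1f\n", worst_fps); // ~15: the hitch you actually feel
}
[/CODE]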
 
ferd said:
The human brain/eye can't register anything that goes faster than ±40 fps; all that "OMG I've only got 30 fps and I need 120" is BULLSHIT!

Also, you should really go check up on that and get your facts straight before making an ass of yourself.

The human eye is perfectly capable of registering hundreds of frames per second.
 
That's like saying that cartoons run at 24 fps and we can't notice their changes.

This 6800 vs X800 debate is killing me. What about PS 3.0? What about future DX editions? Is the X800 just a single bang while the 6800 is a long-term thing?
 