Half-Life 2 Performance Revealed

Originally posted by Razak
hahahha omg i just had a flashback to my freshman year at high school.

:cheers: I'm glad that someone remembered!
 
Originally posted by Mountain Man
We're going to buy the game and happily play it without worrying that someone else might be getting a few FPS more than us.

(By the way, I have a Ti4200, so this pissing match between the two heavy weights doesn't concern me in the least.)

Concerning the heavyweights: this isn't a few FPS, it's almost a 20 fps lead, roughly a 33% increase in performance, going from a 5900 that isn't even running the standard DX9 codepath to a 9800 Pro that is.
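For what it's worth, those two figures hang together: a 20 fps gap that is also about a one-third increase implies a baseline of roughly 60 fps. Here's a minimal back-of-the-envelope check using only the numbers claimed above; the implied frame rates are illustrative, not Valve's published benchmark results.

```python
# Back-of-the-envelope check using only the figures claimed in the post above.
# The implied frame rates are illustrative assumptions, not published benchmark data.
lead_fps = 20.0        # absolute gap claimed between the two cards
increase = 0.33        # relative increase claimed

implied_5900 = lead_fps / increase       # baseline that makes both claims consistent
implied_9800 = implied_5900 + lead_fps

print(f"implied 5900 result:     ~{implied_5900:.0f} fps")   # ~61 fps
print(f"implied 9800 Pro result: ~{implied_9800:.0f} fps")   # ~81 fps
```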
 
I have a Ti4800se and I'm not at all bothered about this. That chart represents nothing.
 
One thing I think we've all failed to notice is that most of us can't use the Source engine to its full potential right now. So it doesn't really matter what cards any of us have today; it will matter later, when the Source engine is being pushed to its peak. That's when the bragging rights come in. If you want to know where things stand right now: yes, ATI wins, and no, I don't own an ATI card. But as Gabe said, Source could be an engine that's still in use three years down the road.
 
HL2 Benchmarks


Dunno if this has already been posted in here because I didn't read all the pages. If it has, oh well, read it again. :cheese:
 
Silly nVidia fanboys. Even when everyone with half a brain, including Valve and reputable review sites, is telling them their cards are inferior, they still can't accept it.

Oh, and it's not Valve's problem that nVidia can't build a ****ing card that works with DX9 without optimizations and/or cheats. They should not be expected to deal with that crap. There is a DX9 standard released by Microsoft that is meant to be followed. Don't follow it, pay the price.

That is all.
 
Wow, I just read the FiringSquad article. Any fools shouting "but Valve optimized HL2 for ATI!" need to get a clue. According to the article, Valve spent five times as long creating a mixed-mode path (using both DX8 and DX9 features) to improve performance on the crappy NV3x pipeline as they spent optimizing the normal DX9 path.

Translation: Valve spent five times as many hours optimizing HL2 for nVidia as they did for ATI, because ATI works fine in the native DX9 path and nVidia doesn't.

They could've just treated NV3x cards as though they were DX8 hardware, but they went to all that trouble just to squeeze out a few more fps for the 5900, and 1 or 2 fps for the other FX cards. nVidia owners (like myself) should be GRATEFUL.
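To make the "mixed mode" idea concrete, here is a purely hypothetical sketch of capability-based path selection: full DX9 hardware runs everything in DX9, weak DX9 hardware keeps the expensive effects on a DX8-style path, and everything else falls back to DX8. The function, GPU family names, and effect names are all illustrative assumptions, not Source engine code.

```python
# Hypothetical illustration of a "mixed mode" rendering path. All names here
# are made up for illustration; this is not Source engine code.

def pick_shader_path(gpu_family: str, effect: str) -> str:
    full_dx9 = {"R300"}    # e.g. Radeon 9800 Pro: native DX9 path throughout
    weak_dx9 = {"NV3x"}    # e.g. GeForce FX: mixed mode

    if gpu_family in full_dx9:
        return "dx9"
    if gpu_family in weak_dx9:
        # Mixed mode: only effects assumed to stay fast enough use DX9 shaders;
        # the expensive ones keep their DX8 versions.
        cheap_dx9_effects = {"specular", "fresnel"}
        return "dx9" if effect in cheap_dx9_effects else "dx8"
    return "dx8"           # DX8-class hardware and older

for fx in ("water_refraction", "specular", "bumpmapped_world"):
    print(f"NV3x / {fx}: {pick_shader_path('NV3x', fx)}")
```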
 
Originally posted by dscowboy
Wow, I just read the FiringSquad article. Any fools shouting "but Valve optimized HL2 for ATI!" need to get a clue. According to the article, Valve spent five times as long creating a mixed-mode path (using both DX8 and DX9 features) to improve performance on the crappy NV3x pipeline as they spent optimizing the normal DX9 path.

Translation: Valve spent five times as many hours optimizing HL2 for nVidia as they did for ATI, because ATI works fine in the native DX9 path and nVidia doesn't.

They could've just treated NV3x cards as though they were DX8 hardware, but they went to all that trouble just to squeeze out a few more fps for the 5900, and 1 or 2 fps for the other FX cards. nVidia owners (like myself) should be GRATEFUL.

That must've been really frustrating for Valve and may actually explain why the release date was up in the air for a while ("No Comment").
 
Who gives a shit? Was HL1 any worse with just a P200 and no 3D card? Did it magically become better when you got a Voodoo2? No, it just looked better.
I have a GF4 Ti4400, and I don't really care, as long as I can run at 1024 or 800 with medium detail and good fps.
 
I guess what concerns me most in DX8 vs DX9 is the normal-mapping on the faces. The smoothness of the faces is a huge factor in how real they look, and a big part of the 'emotional attachment to characters' Valve keeps talking about. We don't know how they will look in DX8.
 
I find this comment kind of amusing:
"... and we were all baffled at just how sorry the performance is for this game with NVIDIA silicon."

I mean, come on, baffled? It's like the freakin' release date.
3DMark says it's slow in DX9, everyone with Nvidia cards says it's gonna be fast.
Benchmarks for DX9 are slow, everyone with Nvidia cards says it's gonna be fast.
Games with DX9 are slow, everyone with Nvidia says it's gonna be fast because this doesn't represent anything.
More games with DX9 are slow, Nvidia says it's gonna be fast with their new drivers.

Baffled? Expected! :p

And the 50.x drivers can't be THAT good, 'cause they said they made them available to reviewers on the 8th. We haven't seen a "NVIDIA IS 1337 KING AGAIN!" yet... which is odd... Someone should have immediately said they improved performance a lot in DX9 applications. Maybe I just missed it, need to check again :dozey:
 
Man, I can't get over how rich everyone in here is. I have a GeForce2 MX 100/200 and an AMD Duron 1000. I don't mind if I can only play the game in low detail, as long as it's smooth. All you people going "omg my fx5600pro wont be able to run halflife2 for shit!!!": what are you trying to say, that you won't be able to run it with absolutely everything maxed out, totally full graphics? Or am I unable to run the game at all on my system? It's all very depressing.
 
Originally posted by Fuzzy
Man, I can't get over how rich everyone in here is. I have a GeForce2 MX 100/200 and an AMD Duron 1000. I don't mind if I can only play the game in low detail, as long as it's smooth. All you people going "omg my fx5600pro wont be able to run halflife2 for shit!!!": what are you trying to say, that you won't be able to run it with absolutely everything maxed out, totally full graphics? Or am I unable to run the game at all on my system? It's all very depressing.
You can get a cheap G4Ti; they aren't that expensive, and obviously much faster than the FX :D
 
Yeah, I am looking at upgrading just the video card for the moment. I have 750 MB of SDR RAM; hopefully that will do.

A mate of mine snagged a Radeon 9600 Pro for AUS$240. Damn nice price.
 
Originally posted by Fuzzy
Yeah, I am looking at upgrading just the video card for the moment. I have 750 MB of SDR RAM; hopefully that will do.

A mate of mine snagged a Radeon 9600 Pro for AUS$240. Damn nice price.
Yep, the 9600 Pro is definitely the best buy right now. But the G4Ti is still cheaper, I believe, so it all depends on how much one wants to spend.
 