Half-Life 2 Performance Revealed

Originally posted by Northwood83
D8, all you keep doing is slamming Nvidia owners. Knock it off.


I agree, people think that because the FX line has been shown to perform worse in DX9/HL2/blah blah blah that they are nothing more than coasters now.

I owned a geforce 4 ti 4600 and it worked wonderfully until my crappy mobo corrupted it.

By the way, I own a 9800 pro 128 :)
 
Originally posted by Slash
more ram is ALWAYS better

No, it's not. The amount of RAM you have only comes into play if you run out of it. Having more RAM doesn't somehow make reading from or writing to RAM any faster.

I seriously doubt any game at this point can even USE more than 512 MB. Unless you're running 3D Studio and Photoshop in the background while playing games, having more than 512 MB is just preparing for the future.
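
(If you want to check whether you're actually bumping into that limit, Windows will tell you directly. Here's a quick C++ sketch using the standard Win32 call, purely an illustration and not taken from any game:)

#include <windows.h>
#include <stdio.h>

int main() {
    MEMORYSTATUSEX status;
    status.dwLength = sizeof(status);
    GlobalMemoryStatusEx(&status);

    printf("Total physical RAM:     %llu MB\n", status.ullTotalPhys / (1024 * 1024));
    printf("Available physical RAM: %llu MB\n", status.ullAvailPhys / (1024 * 1024));
    printf("Memory load:            %lu%%\n", status.dwMemoryLoad);

    // As long as available RAM stays well above zero while a game runs,
    // the extra memory is just sitting idle. Only when it approaches zero
    // does Windows start paging to disk, and that's when frame rates tank.
    return 0;
}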
 
9600 vs. 5900 Ultra??? That's BS... I don't see how this can be... they're just trying to get everyone to buy a friggin' ATI. I've got a 5600... I don't believe what they say about having to use DX8... not until I get my ****in' copy of HL2... because I bet the benchmark is gonna be fixed so that a 9200 Pro beats a 5600 in performance...
 
Originally posted by SuperFat
Damn, I don't understand why they just can't get better support for NVIDIA cards. I mean, I own one, and I understand ATI makes a better card and I just got unlucky in the one I chose, but still, I'm sure at least 50% of gamers own NVIDIA cards. Couldn't this seriously lose some sales for HL2? I mean, if people realize this and in fact don't buy HL2 based on this review of NVIDIA cards, what's going to happen to all the people who own NVIDIA cards when HL2 is released?

:rolleyes: People aren't going to dismiss HL2 just because their NVIDIA card gets a couple fewer FPS than ATI's.



NOTE: As an ATI owner, I couldn't be happier.:E
 
- even with the special NV3x codepath, ATI is the clear performance leader under Half-Life 2 with the Radeon 9800 Pro hitting around 60 fps at 10x7. The 5900 ultra is noticeably slower with the special codepath and is horrendously slower under the default dx9 codepath;

Can anyone say shoddy programming? Real programmers don't have problems like these.

- ATI didn't need these special optimizations to perform well and Valve insists that they have not optimized the game specifically for any vendor.

And this, my friends, is what we call bullshit. You don't specifically optimize a game for a certain graphics card and then claim not to be supporting it. Consider that Valve was showing off HL2 at ATi's booth. Also consider that Gabe continuously praises ATi's cards for their "performance" over other cards. "Not optimized" for any special card, my ass. It's all about business, it seems, and Valve apparently got paid a LARGE sum to make their game work well with only ATi cards.
 
hl21.gif


:eek:

EDIT: Wow, hot linking doesn't work... crap.
 
Originally posted by Mountain Man
It makes sense that Valve would release the benchmark software a little earlier to reputable sites before releasing it to the public. This gives the sites in question time to run the software through its paces and get an article up in time for the public release. That's actually excellent marketing.

It's a very common practice too... Almost all games are released to reviewers very soon after the game goes gold. That's why almost all reviews hit the sites the day the game comes out, even for a game that takes 40+ hours to play through.

There are just very strict NDAs to prevent information from coming out beforehand. A reviewer will respect them, because otherwise they won't get early copies in the future and they'll be out of business!
 
Originally posted by Joneleth
Can anyone say shoddy programming? Real programmers don't have problems like these.



And this, my friends, is what we call bullshit. You don't specifically optimize a game for a certain graphics card and then claim not to be supporting it. Consider that Valve was showing off HL2 at ATi's booth. Also consider that Gabe continuously praises ATi's cards for their "performance" over other cards. "Not optimized" for any special card, my ass. It's all about business, it seems, and Valve apparently got paid a LARGE sum to make their game work well with only ATi cards.

What about the Tomb Raider and Halo benchmarks? Those are DX9 benchmarks, and the Radeon 9800 wins those over the 5900 as well. Why not just accept that the 9800 performs better with shaders? It's been known for a long time; look at any ShaderMark benchmarks.
 
Originally posted by Joneleth
Can anyone say shoddy programming? Real programmers don't have problems like these.



And this, my friends, is what we call bullshit. You don't specifically optimize a game for a certain graphics card and then claim not to be supporting it. Consider that Valve was showing off HL2 at ATi's booth. Also consider that Gabe continuously praises ATi's cards for their "performance" over other cards. "Not optimized" for any special card, my ass. It's all about business, it seems, and Valve apparently got paid a LARGE sum to make their game work well with only ATi cards.
Somebody's cranky because they bought an Nvidia card...
 
Originally posted by Joneleth
Can anyone say shoddy programming? Real programmers don't have problems like these.

Oh yeah, great thinking. Because it CLEARLY couldn't just be the fact that the whole FX series completely sucks when running DX9. Obviously Valve and Eidos and id and everyone else who codes anything in DX9 are in a big fat conspiracy against nVidia.

Or maybe you just bought the wrong card and now you're in denial.

EDIT - Disclaimer: I own an nVidia card too.
 
Yeah man, nobody is optimizing crap for any card. ATI just has better shaders and stuff than the FX 5900. That's simple enough to understand right there. I'm sure sometime down the road you might see a game here and there where NVIDIA takes the cake, but plain and simple, ATI just wins here. I'm sure NVIDIA will win for Doom 3.
 
Originally posted by dscowboy
Oh yeah, great thinking. Because it CLEARLY couldn't just be the fact that the whole FX series completely sucks when running DX9. Obviously Valve and Eidos and id and everyone else who codes anything in DX9 are in a big fat conspiracy against nVidia.

Or maybe you just bought the wrong card and now you're in denial.


Don't forget Gearbox.
 
Originally posted by d8cam

Those are pics from the media seeing the benchmark.

LOL! At first I thought the glasses and table displayed on the monitor were part of the benchmark. "Wow, that spectral lighting is GREAT!" hahahaah
 
Jone is right... and I emailed Gabe asking him if a 5600 with 512 MB RAM and a 2800+ XP would run DX9 at 30-40 fps, and he said yes, but some character bump mapping and some long-range dynamic lighting will not work too well. This will be configured for you.
 
Good god, I screwed up big time. Dammit. Wish the dev team had revealed this earlier, right after the FX 5900 was released. Would have saved many people headaches. I am now screwed.
 
Originally posted by chris_3
Jone is right... and I emailed Gabe asking him if a 5600 with 512 MB RAM and a 2800+ XP would run DX9 at 30-40 fps, and he said yes, but some character bump mapping and some long-range dynamic lighting will not work too well. This will be configured for you.

Uh, yeah dude. It's called "DX8". That's exactly what AnandTech said.

EDIT: People, realize that character bump mapping only works when running DX9. So if you're running in DX8, the characters' faces will look like flat polys, not smooth like they are in the videos.
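
(For the curious, that kind of fallback usually boils down to a pixel shader version check when the game starts up. A minimal C++ sketch with stock Direct3D 9 calls, purely to illustrate the idea, not Valve's actual code:)

#include <d3d9.h>
#include <stdio.h>
#pragma comment(lib, "d3d9.lib")

int main() {
    // Create the D3D9 interface and ask what the primary adapter supports.
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    // DX9-class effects (character bump maps, HDR, etc.) need pixel shader 2.0.
    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        printf("ps 2.0 supported -> DX9 path (bump-mapped characters, HDR)\n");
    else
        printf("no ps 2.0 -> DX8 path (flat character faces)\n");

    d3d->Release();
    return 0;
}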
 
Originally posted by Razak
Good god, I screwed up big time. Dammit. Wish the dev team had revealed this earlier, right after the FX 5900 was released. Would have saved many people headaches. I am now screwed.

Nah, just wait and configure your game (as mentioned above) to get a decent frame rate.

Or just wait for nVidia to pop out those infamous "optimizations".
 
Hmm, can someone give me a chart that shows the exact differences between DX8 and DX9? If it's just long-range dynamic lighting and character bump mapping, then that doesn't really matter :\ I thought it was more than that.
 
Originally posted by chris_3
Hmm, can someone give me a chart that shows the exact differences between DX8 and DX9? If it's just long-range dynamic lighting and character bump mapping, then that doesn't really matter :\ I thought it was more than that.


hl24.gif
 
Originally posted by chris_3
Hmm, can someone give me a chart that shows the exact differences between DX8 and DX9? If it's just long-range dynamic lighting and character bump mapping, then that doesn't really matter :\ I thought it was more than that.

The list of the different features that HL2 uses is in one of the first couple of questions in that interview with Gabe.

GD: It's obvious that the Source Engine takes advantage of certain DX9 features, can you explain what those features are?

Gabe: High-dynamic range rendering (HDR), bump-mapped characters, soft shadows, improved full-scene anti-aliasing are the most interesting. In the future our enhancements to Source will all be DX9 specific (we won't be creating DX8 equivalents).

Here: http://www.gamersdepot.com/interviews/gabe/001.htm
 
Umm, I play UT2003 with DX9, 4x anti-aliasing, and everything on high on a 5600, and I get a steady 42 fps... that chart is weird...
 
Originally posted by chris_3
Umm, I play UT2003 with DX9, 4x anti-aliasing, and everything on high on a 5600, and I get a steady 42 fps... that chart is weird...

UT2003 isn't a DX9 game, it's DX8 (or 8.1), fo0.

And that chart was Half-Life 2's performance, I believe (it doesn't say...).
 
Wow, I feel like the guy in "Indiana Jones and the Last Crusade". I chose poorly.

At the risk of "n00b-ing" myself: does anyone know if the BIOS on the FX 5900 is flashable? I only ask, cuz when I boot, I get a nice screen showing my BIOS version...

OK... I'm grasping here... I need some solace in the fact that I now have an expensive card that won't run HL2 as nicely as it should. I'm going to cry now. And drink. And after September 30th, if things don't change, I will take my 5900 out back and shoot it.

"Tell me about the Rabbits, Lenny..."
 
I wonder how the 9600 Pro will run HL2 compared to the 9700/9800 Pros. Hope it isn't too much of a gap.
 
Everyone head over to FiringSquad; they seem to have the most in-depth one so far. Things that cracked me up:

Optimization Investment
• 5X as much time optimizing NV3X path as we’ve spent optimizing generic DX9 path
• Our customers have a lot of NVIDIA hardware
• We were surprised by the discrepancy
• ATI hardware didn’t need it

Great Optimization
• Treat NV3X as DX8 hardware
• Customers can always set DX9 themselves
• Would have saved us a lot of time
• Most developers won’t have the budget to create their own “mixed mode” equivalent
• Must use DX8 with the 5200/5600 to get playable framerates
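
(To put the "Treat NV3X as DX8 hardware" bullet in concrete terms: the path selection amounts to a vendor and capability check at startup. The sketch below is my own illustration against the Direct3D 9 API; the vendor ID check is an assumption about how a "mixed mode" override might be keyed, not code from the Source engine.)

#include <d3d9.h>
#include <stdio.h>
#pragma comment(lib, "d3d9.lib")

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DADAPTER_IDENTIFIER9 id;
    D3DCAPS9 caps;
    d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id);
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    const bool hasPs20  = caps.PixelShaderVersion >= D3DPS_VERSION(2, 0);
    const bool isNvidia = (id.VendorId == 0x10DE);    // NVIDIA's PCI vendor ID

    if (!hasPs20)
        printf("DX8 path\n");                          // no ps 2.0 at all
    else if (isNvidia)
        printf("mixed mode / DX8 path for NV3X\n");    // ps 2.0 on paper, slow in practice
    else
        printf("generic DX9 path\n");                  // e.g. Radeon 9600 and up

    d3d->Release();
    return 0;
}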
 
It'll run it just fine. Not everything cranked up to the sweet Lord's fullest, but it'll do it justice.
 
All those 5900 boys that were so violently sure their card was better... are crying and hiding right now.

/me strokes his ATi Sapphire Radeon 9800 pro
 
High-dynamic range rendering (HDR), bump-mapped characters, soft shadows, improved full-scene anti-aliasing are the most interesting

Hmmm, now, if this DX8 thing from AnandTech is true and not just a way to sell more ATIs for HL2: you can toggle anti-aliasing on and off, right? And does anyone have any photo comparisons of high-dynamic-range rendering vs. non-HDR, bump-mapped characters vs. non-bump-mapped characters, and "soft shadows" vs. non-"soft shadows"? Honestly, does anyone here really like "really" detailed shadows? I find that if you make your shadows less "detailed" the game runs smoother on my old GeForce 3, lol.

"GD: It's obvious that the Source Engine takes advantage of certain DX9 features, can you explain what those features are?

Gabe: High-dynamic range rendering (HDR), bump-mapped characters, soft shadows, improved full-scene anti-aliasing are the most interesting. In the future our enhancements to Source will all be DX9 specific (we won't be creating DX8 equivalents)."
 
Originally posted by BlackSun

"Tell me about the Rabbits, Lenny..."

Hahaha, omg, I just had a flashback to my freshman year of high school.
 
Originally posted by Razak
Good god, I screwed up big time. Dammit. Wish the dev team had revealed this earlier, right after the FX 5900 was released. Would have saved many people headaches. I am now screwed.

eBay it quick! Before everyone else figures it out.

EDIT: FiringSquad has a good review, but they messed up their interpretation of FPS per dollar. They got it backwards. LOL.
 
Man... I feel sorry for all the FX owners here...
Bummer, guys, but don't say we didn't try to tell you: NVIDIA just doesn't support DX9 very well...
 
Hmmmm doesn't the firingsquad article confirm that the benchmark won't be available till the 30th of September? ;(
 
Originally posted by SFA
Hmmmm doesn't the firingsquad article confirm that the benchmark won't be available till the 30th of September? ;(

false information...
 
Originally posted by alco
what's going to happen to all the people who own nvidia cards when hl2 is released?
We're going to buy the game and happily play it without worrying that someone else might be getting a few FPS more than us.

(By the way, I have a Ti4200, so this pissing match between the two heavyweights doesn't concern me in the least.)
 