NVidia users need not worry, just yet

DOOManiac

Disclaimer: There seems to be a massive amount of fanboy-ism on both sides of the camp here on these forums. Please, let's try to avoid that in this thread. I'm not out to praise or condemn one or the other. This thread is simply to provide current and potential future NVidia or ATI customers with more information so they can make a better-informed decision about the video card they purchase next.

As you all know, Valve (who is partnered with ATI) came out today with some questionable benchmarks that show ATI products completely creaming NVidia products. Shortly afterwards NVidia issued a statement, from which the quote below is an excerpt:
The optimal code path for ATI and NVIDIA GPUs is different - so trying to test them with the same code path will always disadvantage one or the other. The default settings for each game have been chosen by both the developers and NVIDIA in order to produce the best results for our consumers.

In addition to the developer efforts, our driver team has developed a next-generation automatic shader optimizer that vastly improves GeForce FX pixel shader performance across the board. The fruits of these efforts will be seen in our Rel.50 driver release. Many other improvements have also been included in Rel.50, and these were all created either in response to, or in anticipation of the first wave of shipping DirectX 9 titles, such as Half Life 2.

We are committed to working with Gabe to fully understand his concerns and with Valve to ensure that 100+ million NVIDIA consumers get the best possible experience with Half Life 2 on NVIDIA hardware.
If you would like to read the whole press release, click here.


Here's what I think:

1) NVidia can't afford for Half-Life 2 to run like crap. They'll get it running great. They'll have to in order to stay in business.
2) After ATI screwed up with id, they were probably very desperate to find another software partner...
3) Valve would be silly to have over 50% of their customer base (last time I checked, NVidia still had the majority of the market) running their game like crap...
4) As much as you may want to believe to the contrary, Half-Life 2 is not the only game coming out. If you plan on playing an array of games, I suggest you do more research (including waiting until said games have shipped and you can get actual benchmarks rather than estimates) and get the best all-around card, rather than one that's tailored to a specific game. (To those who like to take things the negative way, this doesn't just mean "ATI for Half-Life 2" but also "NVidia for DOOM 3".)

5) The vast majority of people (read: ones not posting on or reading these forums) aren't going to be upgrading their system to play Half-Life 2 if it's within reason (say 1.8GHz with a GeForce 3). So don't let the benchmarks scare you completely if you have an older system: benchmarks by nature are meant to stress hardware to its limits, not to approximate performance during actual gameplay.

As for myself, I'm going to keep the current video card I have (it's about a year old) for a bit, until Half-Life 2 and the few other games I plan on playing have all shipped, and then I'm going to check with independent third-party sites (HardOCP, Anandtech, etc.) to find out which card I should choose.

Bonus Note: At QuakeCon there were representatives from HardOCP, NVidia, and ATI who casually mentioned that some big, big video card price cuts are coming around November 15th. I would recommend waiting until that date before making any video card purchase unless you've got extra money to blow, or are on an unbearably crappy card already.

Again, before you post I'd like to remind you that this is supposed to be a flameless, fanboy-less thread. If you can't post without saying "X sucks, Y rocks" then please do not post.
 
Gabe quote:

As far as having deals, we have had a lot of deals. We have had deals with ATI and NVIDIA. We have had deals with Intel and AMD. We have had deals with Sega and Sony. We are always looking for ways to work with other companies on development or marketing to create better things for customers of our games.

They have had a lot of deals.
 
This: "The optimal code path for ATI and NVIDIA GPUs is different" is very questionable. Dx9 and Pixel 2.0 are standards. They are publically available. You should be able to do them in a standard, very straightforward way. That's why Valve simply programmed the game to Dx9 specs, not to any special code path. The fat that NVIDA needs a special code path to be "optimal" is a bad thing, not a factor of having two different cards.

And there's a problem with your #4. That problem is that the issues being discussed have nothing to do with what game or even what API you use. They have to do with the power to implement certain FEATURES. Some of them, like the specular bump-mapping on characters, are very FP-math intensive. That demand is the same whether you are using OpenGL, DX9, or any game at all. And if NVIDIA really doesn't put enough focus on FP bandwidth in their cards, using a different game isn't going to help. It's a problem with a particular shader effect demanding very particular hardware. And it's either there or it isn't.
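
To make that concrete, here is a toy CPU-side sketch (my own example, not code from Valve or anyone else) of roughly the math a specular bump-map effect runs for a single pixel. A GPU repeats something like this for every pixel of every frame, which is why raw floating-point throughput matters no matter which API or game is asking for it.

```cpp
// Per-pixel specular bump-map math, roughly: two normalizations, two dot
// products and a pow(), identical whether the game uses D3D or OpenGL.
#include <algorithm>
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

static Vec3 normalize(Vec3 v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return { v.x / len, v.y / len, v.z / len };
}

static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// N comes from the normal map, L and V from the light and view directions.
static float shadePixel(Vec3 N, Vec3 L, Vec3 V, float shininess) {
    L = normalize(L);
    Vec3 H = normalize({ L.x + V.x, L.y + V.y, L.z + V.z });   // half-vector
    float diffuse  = std::max(0.0f, dot(N, L));
    float specular = std::pow(std::max(0.0f, dot(N, H)), shininess);
    return diffuse + specular;
}

int main() {
    // At 1024x768 that's roughly 786,000 of these per frame, before overdraw.
    std::printf("%f\n", shadePixel({0, 0, 1}, {0.3f, 0.4f, 0.866f}, {0, 0, 1}, 32.0f));
    return 0;
}
```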

That said, we can't say for sure whether it's there. One major problem is that NVIDIA has been very, very secretive with their designs. So we can't know if the FX series really falls short in ways that can't be fixed by driver updates. We'll just have to wait and see if the 50-series drivers make significant improvements.
 
Good info for me. I'm in school right now and can't afford a card.. but want one :p

I'm on an Athlon XP 1.6, GF3 Ti 500, 512MB DDR, 120GB HD at 7200 RPM.. hope that can cut it for now :)
 
To clarify on point 2, I did not mean to imply that Valve would be unfairly biased towards ATI simply because they were partnered with them; I meant merely to state that ATI would be especially anxious to get HL2 performing best on ATI products.

And Apos: While they both comply with software instruction standards, the way the cards work on the inside is very, very different. I agree with you that NVidia has been a bit shady as of late, but to be honest, so has ATI. I don't really think either of them are angels.. Anandtech said it better than I can:
It's almost ironic that the one industry we deal with that is directly related to entertainment, has been the least exciting for the longest time. The graphics world has been littered with controversies surrounding very fickle things as of late; the majority of articles you'll see relating to graphics these days don't have anything to do with how fast the latest $500 card will run, instead we're left to argue about the definition of the word "cheating", we pick at pixels with hopes of differentiating two of the fiercest competitors the GPU world has ever seen, and we debate over 3DMark.

What's interesting is that all of the things we have occupied ourselves with in recent times, have been present throughout history. Graphics companies have always had questionable optimizations in their drivers, they have almost always differed in how they render a scene and yes, 3DMark has been around for quite some time now (only recently has it become "cool" to take issue with it).
 
I'm not giving in to the hype. I'll get the newest NVidia drivers, and with my GeForce 4 Ti 4400 I will enjoy HL2 one way or another.
 
Right now it's either this 5600 or the 9600 Pro. I'll wait on those drivers, and if it's not much of an update, I'm switching.
 
Apos is right. According to Gabe's slide that describes some of the detestable things NV did in their 'special HL2 drivers', they're trying to cover up for some major deficiencies in their product. They know that HL2 benchmarks will influence a lot of purchasing decisions, so they tried to manipulate the benchmark numbers and hope that maybe people won't notice how poorly it actually runs in-game, or in other DX9 games.

I'm an nvidia owner myself, and I'm pretty appalled at the list of hacks nvidia tried to give to valve for the benchmarks. Changing the detail level when a screenshot is taken!?!? Manual occlusion culling for the benchmark camera-path?!?!

Of course, we don't know how much the Detonator 50 drivers will improve DX9 performance. Heck, everything might turn out just fine for FX cards (unlikely). But IMO, nVidia having the gall to try to fool us with BS benchmark-specific drivers AGAIN, so soon after the 3DMark03 fiasco, is pretty damning. Their credibility at this point is on par with the Iraqi Information Minister as far as I'm concerned.
 
I think Anandtech summed it up the best at the very end of their article:

When we first heard Gabe Newell's words, what came to mind is that this is the type of excitement that the 3D graphics industry hasn't seen in years. The days where we were waiting to break 40 fps in Quake I were gone and we were left arguing over whose anisotropic filtering was correct. With Half-Life 2, we are seeing the "Dawn of DX9" as one speaker put it; and this is just the beginning.

The performance paradigm changes here; instead of being bound by memory bandwidth and being able to produce triple digit frame rates, we are entering a world of games where memory bandwidth isn't the bottleneck - where we are bound by raw GPU power. This is exactly the type of shift we saw in the CPU world a while ago, where memory bandwidth stopped being the defining performance characteristic and the architecture/computational power of the microprocessors had a much larger impact.

One of the benefits of moving away from memory bandwidth limited scenarios is that enhancements that traditionally ate up memory bandwidth, will soon be able to be offered at virtually no performance penalty. If your GPU is waiting on its ALUs to complete pixel shading operations then the additional memory bandwidth used by something like anisotropic filtering will not negatively impact performance. Things are beginning to change and they are beginning to do so in a very big way.

In terms of the performance of the cards you've seen here today, the standings shouldn't change by the time Half-Life 2 ships - although NVIDIA will undoubtedly have newer drivers to improve performance. Over the coming weeks we'll be digging even further into the NVIDIA performance mystery to see if our theories are correct; if they are, we may have to wait until NV4x before these issues get sorted out.

For now, Half-Life 2 seems to be best paired with ATI hardware and as you've seen through our benchmarks, whether you have a Radeon 9600 Pro or a Radeon 9800 Pro you'll be running just fine. Things are finally heating up and it's a good feeling to have back...
 
Things are finally heating up and it's a good feeling to have back...

Fluckin' right, I'm starting to remember that sick anticipation feeling.. I love it :)
 
I think performance with NVIDIA cards won't be so bad in the end. What bothers me is that if it's true that they are altering Valve's pride and joy by making it display differently for the sake of benchmarks, in the end it's the user who suffers. It shouldn't be up to NVIDIA or ATI to determine what eye candy to leave out. If the card can't do something, that's one thing, but if it can and NVIDIA (or ATI, I'm being objective here) chooses not to display a feature to inflate benchmarks, that is wrong. Dunno if I make sense... it's late.
 
I am not scared.. I have a Radeon 9700 Pro in my main computer and a GeForce Ti 4600 in my secondary computer.. I play games on both, and the only real difference I notice is that the Radeon can handle AF and AA better when maxed.. without AA and AF they are almost the same graphics-wise. I play games of today on my 2.0GHz Radeon-powered PC at 1024x768 with 4xAA and 8xAF, and on my 1.5GHz GF Ti 4600 at 1024x768 with 4xAA and no AF, and both run at a locked 60 fps (except Morrowind), which is all anyone really needs to play at :D
 
I really hope nvidia doesn't try to pull a fast one on its FX users with some cheap HL2-only optimizations in its next Detonators. With the recent scandal over 3DMark, it seems like the graphics industry is turning into a late-night talk show cast.

This was just a huge blunder by nvidia, and a chance for ATI to gain some market share. With ATI's recent partnerships with Microsoft for the Xbox 2 and with Valve for HL2, things are looking up. With the next blockbuster game (DOOM 3) being a year or more off, ATI's got quite a good lead on nvidia. There's nothing like healthy competition.
 
Originally posted by rage_fan
there's nothing like healthy competition.

Indeed. Perhaps we can get to relive some of those glory-days 3Dfx vs. NVidia matches, Voodoo against TNT. :D
 
I agree with DOOManiac on his last point. Man, the 3dfx vs. Nvidia days were cool :) I also remember 3dfx having the same raw-power-over-quality ethos that Nvidia now seems to suffer from.
 
Benchmarks really say nothing to me. I mean, if I'm gonna invest in a new top-end card, I want the one which lasts the longest.

I'm planning to buy a new computer, 'cause I know my GeForce 2 MX 32MB can't handle anything at this point.
But what worries me is three things:
1. Should I buy the 9800 Pro? Will this last longer on other games?
2. Or the GeForce FX 5900? Maybe this will have more power when the new drivers come out.
3. Or maybe I should wait for the new cards that are due in November.

I mean, anyone got any suggestions here? Right now, I'm more concerned about the whole GeForce vs. Radeon thing: which is actually best, for all games?
 
NVidia users shouldn't worry yet, but they should on September 30th. The numbers are there, and I feel sorry for FX 5900 owners. I really do : )
 
DOOM 3 will run on an ATI card as well as on an NVIDIA one, both with great image quality; maybe ATI will have 10 fps less or 3% less eye-candy image quality than the latter.

I am going for ATI.
 
AFAIK (as John Carmack himself said), it's nVidia hardware that has to work with a special path, with lower precision and quality, in order to gain speed.
And that's valid for both OpenGL (Doom 3) and D3D.

I would be VERY pissed off to have a GeForce FX 5900 right now :p
 
I wonder if Valve will let the ATI cards use the optimized (read: ugly) DX9 shaders, since they give such a performance increase on the 5900.
Then people with ATI cards could sacrifice some shader quality and run with AA and/or AF at a solid frame rate.
 
Originally posted by Pr()ZaC
AFAIK (as John Carmack himself said), it's nVidia hardware that has to work with a special path, with lower precision and quality, in order to gain speed.
And that's valid for both OpenGL (Doom 3) and D3D.

I would be VERY pissed off to have a GeForce FX 5900 right now :p

That's true. Gamers Depot sent Carmack an email about PS performance, and he gave a reply saying NVIDIA cards were being run at a lower precision (bit depth) for some shaders, which kind of nullifies the DOOM 3 benchmarks where NVIDIA was winning.
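
If "lower precision" sounds abstract, here is a little toy example (my own, nothing to do with Carmack's mail or any actual driver) of what it means: FP16 "half" floats keep a 10-bit mantissa versus the 23 bits of full FP32, so long chains of shader math drift sooner. The snippet just truncates the mantissa to mimic FP16 rounding; real hardware also narrows the exponent range.

```cpp
// Compare accumulating a small value at full float precision vs. with the
// mantissa chopped to 10 bits after every operation (FP16-style).
#include <cstdio>
#include <cstring>
#include <cstdint>

float truncateToHalfMantissa(float f) {
    uint32_t bits;
    std::memcpy(&bits, &f, sizeof bits);
    bits &= 0xFFFFE000u;            // drop the low 13 mantissa bits (23 -> 10)
    std::memcpy(&f, &bits, sizeof bits);
    return f;
}

int main() {
    float full = 0.0f, half = 0.0f;
    for (int i = 0; i < 1000; ++i) {            // e.g. accumulating light terms
        full += 0.001f;
        half = truncateToHalfMantissa(half + 0.001f);
    }
    std::printf("fp32-style: %f   fp16-style: %f\n", full, half);
    return 0;
}
```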
 
One thing I still find a bit suspicious is that Valve, who is closely partnered with ATi, blasts ATi's biggest competitor at a press event sponsored by ATi less than three weeks before the hottest game of the year is set to be released. The time and manner in which Valve chose to "expose" nVidia is a little fishy, to say the least.

At the same time, the gaming public appears to hold the misconception that all nVidia cards will perform poorly when it's only the FX series cards and then only if they're used in DX9 mode (which is the crux of the issue, seeing as they're billed as DX9 cards). Unfortunately, Valve doesn't seem in a big hurry to correct this misconception by telling owners of previous nVidia products that the game should perform as expected on their hardware.

In other words, if you don't own an nVidia FX card, you have nothing to worry about, and even if you do have an FX card, you'll be able to get playable performance but at the loss of DirectX 9 features, at least until nVidia "fixes" the problem.
 
Another Gabe quote:

Right now we have a deal to combine HL-2 with ATI hardware. If you were looking at the benchmark data we were, then it wouldn't be hard to figure out what the best hardware was to put together with our software for customers to play on. The deal doesn't determine the benchmark results, the benchmark results determined who was best for a deal.

They partnered with ATi, apparently, due to the results they got using their hardware.
 
Originally posted by Mountain Man
At the same time, the gaming public appears to hold the misconception that all nVidia cards will perform poorly when it's only the FX series cards and then only if they're used in DX9 mode (which is the crux of the issue, seeing as they're billed as DX9 cards).
Yes, the FX cards are the only ones having problems... but if you look at the DX8 test you'll see the Ti 4600 beating the FX 5200 and 5600.
It barely lost to the 5900.

Something is seriously screwed up in the FX cards... and not just in DX9 performance.
 