ATI or NVIDIA?

  • ATI — Votes: 121 (72.0%)
  • NVIDIA — Votes: 47 (28.0%)
  • Total voters: 168
Originally posted by teobg
Not if it's a hardware problem, like the NVIDIA FSAA bug ...

Are you talking about the HALF-LIFE 2 bug? Because that occurs on both ATI and NVIDIA cards... they have "hope" of fixing it for ATI, at least.
 

I think this is a great article with a few significant flaws in its benchmarking.

Firstly, the Doom 3 numbers. Anand acknowledged that he could not get the 9800P 256MB to run the tech demo properly, yet he includes the numbers anyway. This strikes me as not only incorrect but irresponsible. People will see 9800P 256MB numbers and note that its extra memory makes no difference over its 128MB sibling, yet only if they read the article carefully would they know that the driver Anand used limits the 9800P 256MB to only 128MB, essentially crippling the card.

Also, note that the only difference between Medium Quality and High Quality modes in Doom 3 is anisotropic filtering (AF), which is enabled in HQ mode. Forcing AF in the video card's drivers, rather than via the application, will result in higher performance and potentially lower image quality! This was shown to be the case in a TechReport article on 3DM03 ("3DMurk"), in forum discussions at B3D, and in an editorial at THG. Hopefully this will be explored fully once a Doom 3 demo is released to the public and we have more open benchmarking of this anticipated game.

Secondly, Anand's initial Quake 3 5900U numbers seemed way off compared to other sites that tested the same card in similar systems at the same settings. At 1600x1200 with 4xAA/8xAF, Anand was scoring over 200fps, well above any other review. And yet, after weeks of protest in the forum thread on this article, all that happened was that the benchmark results for 12x10 and 16x12 were removed. The text, which notes:

"The GeForceFX 5900 Ultra does extremely well in Quake III Arena, to the point where it is CPU/platform bound at 1600x1200 with 4X AA/8X Anisotropic filtering enabled."

was left unchanged, even though it was based on what many assumed were erroneous benchmark data. I can only conclude that the data were indeed erroneous, as they have been removed from the article. Sadly, the accompanying text has not been edited to reflect that.

Thirdly, the article initially tested Splinter Cell with AA, though the game does not perform correctly with it. The problem is that NVIDIA's drivers automatically disable AA if it's selected, yielding non-AA scores for what an unsuspecting reviewer believes is an AA mode. ATi's drivers allow AA, warts and all, and thus produce appropriately diminished benchmark numbers, along with the corresponding AA errors. The first step at correcting this mistake was to remove all Splinter Cell graphs and place a blurb in the driver section of the review blaming ATi for not disabling AA. Apparently a second step has been taken, expunging Splinter Cell from the article text altogether. Strangely, Splinter Cell is still listed in the article's drop-down menu as p. 25; clicking it will bring you to one last Quake 3 graph with the incorrect analysis noted above.

Finally, a note on the conclusion:

"What stood out the most about NVIDIA was how some of their best people could look us in the eye and say "we made a mistake" (in reference to NV30)."

What stands out most to me is that NVIDIA still can't look people in the eye and say they made a mistake by cheating in 3DMark03. Recent articles have shown NVIDIA to be making questionable optimizations (that may be considered cheats in the context of a benchmark) in many games and benchmarks, yet I see only a handful of sites attempt to investigate these issues. ExtremeTech and B3D noted the 3DMark03 "optimizations." Digit-Life has noted CodeCreatures and UT2K3 benchmark "optimizations," and Beyond3D and AMDMB have presented pictorial evidence of what appears to be the reason for the benchmark gains. NVIDIA appears to currently foster a culture of cutting corners without the customer's (and, hopefully, reviewer's) knowledge, and they appear reticent to admit it at all.

I realize this post comes off as harsh against both Anand and NVIDIA. In the initial comment thread on this article, I was gentler in my (IMO, constructive) criticism. As the thread wore on for weeks without a single change to the multiple errors perceived in the original article, I gradually became more curt in my requests for corrections. Anand gets possibly the greatest benefit of the doubt of any online hardware reviewer I know, as I've read his site for years and enjoyed the mature and thoughtful personality he imbued it with. I'm sorry to say his response--rather, his lack of response, as it was only Evan and Kristopher, not Anand, who replied to the original article thread--was wholly unsatisfactory, and the much belated editing of the article into what you read today was unsatisfactory as well. I would have much preferred Anand(tech) leave the original article intact and append a cautionary note or corrected benchmarks and commentary, rather than simply cutting out some of the questionable figures and text.

Consider this post a summation of the criticism posted in the original article thread. I thought it would be useful for putting this article in context, and I hope it is taken as constructive, not destructive, criticism. The 5900 is no doubt a superior card to its predecessor. I also believe this article, in its current form, presents an incomplete picture of both the 5900U and its direct competition, ATi's 9800P 256MB. Hopefully the long chain of revelations and commentary sparked by and after this article will result not in hard feelings, but in more educated, thorough, and informative reviews.

I look forward to Anandtech's next review, which I believe has been too long in coming. :)
 
I'm going with ATI because really, what difference does it make if the 5900 Ultra gets 20 more fps in UT2003 or whatever if both cards are supplying framerates of over 60? Your eyes won't even be able to tell the difference.

Right now ATI has the best image quality with AA, so I hope to upgrade to the 9800 pro.
 
Originally posted by LoneDeranger
That is called bribing, not marketing. Whether ATI cards work better with HL2 FSAA accidentally or by design, you can't deny that it's a fact.

It's a fact even though they haven't fixed the FSAA problem for ATI cards yet either? Man.... Lol.

And even if they do fix the FSAA problem for ATI cards, that doesn't mean it will have better performance (framerate).
 
I still haven't seen substantial proof that ATI cards are more efficient than NVIDIA cards.

I will continue purchasing NVIDIA and recommending it to others, as it has never let me (or anyone else I know) down.
 
I wrote this "short" post in the "this whole "cheating" thing topic", but I thought I'd post it here too. It's a short comparison between the 9800pro and 5900U:

First of all, let me start by saying that I am NOT a fanboy of either company; I'm a hardware enthusiast.

Let's start with the most "important" question:
Which card is the better one?

Now, what is the definition of "the best card"? Is it the card that gets a few more fps in a game than another card, or is it the cheapest high-end card on the market? Is it the overall performance, perhaps? Well, in my opinion, it's the overall performance.

There is no "proof" that either one is better (9800pro or GFX5900U). There are many reviews out there, and I'm sure you've read a lot of them (or perhaps not?). Many reviews have come to many different conclusions, but they've also come to different results in benchmarks, games and the like.

These two reviews/comparisons show that the 9800pro is better than the GFX5900U:

Review 1

Review 2

These two, though, show that the FX5900 is better than the 9800pro:

Review 3

Review 4

When deciding which graphics card you want to buy, you'll obviously read a number of reviews to get a picture of which is the better one. Today, it's much harder to get the "right" picture because both ATI and Nvidia have been known to cheat in benchmarks by optimizing their drivers (some said, or still say, that Nvidia's "crimes" were worse, though that comes down to a matter of opinion). While doing a little review of both of them myself (9800pro vs. FX5900U), I came upon something interesting. I had come to the point where I would run 3DMark 2003. First, I ran the benchmark on both cards, and I could see that the FX5900U had a small, though very noticeable, lead over the Radeon card. What I did then was to change the name of the .exe file from 3dmark03.exe to whatever.exe. To my slight surprise, the 9800pro now came out on top of the 5900U. The 9800pro also dropped a percent, though that wasn't as much as the Nvidia card lost in points. By this I'm not saying that the 9800pro is better than the FX5900U, or vice versa; it's just one example of the cheating we've been hearing so much about lately.
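For anyone who wants to repeat that renaming test, here's a minimal sketch of the procedure. The install path, the neutral filename, and the idea of scripting it at all are my own assumptions; the scores still have to be read off the 3DMark result screen by hand.

```python
# Sketch of the exe-renaming test described above. Paths are assumptions;
# adjust them to wherever your benchmark is installed.
import shutil
import subprocess
from pathlib import Path

BENCH_DIR = Path(r"C:\Program Files\3DMark03")   # assumed install path
ORIGINAL = BENCH_DIR / "3DMark03.exe"
RENAMED = BENCH_DIR / "whatever.exe"             # neutral name the driver shouldn't recognize

def run_and_wait(exe: Path) -> None:
    # Launch the benchmark and block until it exits; note the score by hand.
    print(f"Running {exe.name} ... write down the reported score when it finishes.")
    subprocess.run([str(exe)], check=True)

if __name__ == "__main__":
    # Pass 1: the stock executable name, which name-detecting drivers can key on.
    run_and_wait(ORIGINAL)

    # Pass 2: the same binary under a neutral name, defeating name-based detection.
    shutil.copy2(ORIGINAL, RENAMED)
    try:
        run_and_wait(RENAMED)
    finally:
        RENAMED.unlink(missing_ok=True)   # clean up the copy

    print("If the two scores differ noticeably, the driver is likely "
          "applying application-specific optimizations.")
```

If the score changes just because the filename changed, the driver is doing something it only does when it thinks a benchmark is running.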

One thing every reviewer seems to agree on is that the 9800pro has better image quality than the FX5900U. Do you know how they came to this conclusion? They took a screenshot of a game (most commonly UT2003), studied it for several minutes, then came to the conclusion that the Radeon had slightly better image quality. Well, let me tell you something: you will *not* notice any difference when playing games.

One thing that I can tell you, though, is that the Radeon card has a slight performance lead when it comes to FSAA (Full-Screen Anti-Aliasing) and AF (Anisotropic Filtering), though the difference is hardly "good enough" to make you buy a 9800pro instead of a GFX5900U.

I noticed that the 9800pro was the better overclocker. It could go from 380/340 (680 effective) to 470/370 (740 effective) with the standard cooling. The FX card came up to 480/880 from 450/850. Both ran 100% stable after the overclocking, and both ran with the standard cooling that came with the card. Though I must remind you all that this has more to do with which particular 9800pro/5900U card you buy, not the chipset itself. The cards I had were a Sapphire Atlantis Radeon 9800pro and a Gainward GeForce FX 5900 Ultra. The Sapphire card is known to be very overclocker-friendly in comparison to, for example, the Hercules version of the card. Sometimes you will notice very large differences in how much you can overclock the cards. I'm sure that many people have 5900U cards that will overclock quite a lot more than my Gainward.
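In case anyone wonders where the "effective" figures come from: DDR memory transfers data on both clock edges, so the effective rate is simply twice the real memory clock. A trivial sketch, using the numbers quoted above:

```python
def effective_ddr_clock(real_mhz: int) -> int:
    # DDR transfers data on both the rising and falling clock edge,
    # so the "effective" clock is twice the real clock.
    return 2 * real_mhz

print(effective_ddr_clock(340))  # 680 - stock 9800pro memory clock
print(effective_ddr_clock(370))  # 740 - the overclocked figure above
```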

Now, back to the benchmarks. I wanted to get a good view of how well the cards would perform, so I ran quite a lot of games and benchmarks to test them. I'll list some of them:

*UT2K3*
*Unreal2
*GTA: VC
*Half-Life: CS -
*Splinter Cell
*NWN*
*Mafia
*BF1942
*Serious Sam: the second encounter*
*Quake3*
*Jedi Knight 2: Jedi Outcast*
*Anarchy Online
*Max Payne
*Aquamark
*3d mark 2001SE -
*3d mark 2003

Well, I actually ended up listing all of them. As you can probably see, this was *very* time consuming. As you may've noticed, I've put a little star "*" after a few of them. The 5900U was the winner in the games/benchmarks that have a "*" after them.
Something to note is that the 9800pro performed horribly with FSAA turned on in NWN. I've heard that NWN is Nvidia-optimized, though this is something I cannot confirm. It should be noted, though, that no card ever had any real "über power performance lead". I've put a "-" after 3DMark 2001SE. It was so f***ing close that I just couldn't put ATI as the winner (there was a 12-point difference). I got 20087 with the 9800pro (processor and graphics card overclocked) and 20075 with the FX5900U. There was a 54-point difference when they weren't overclocked (in favor of the 5900U), though that still wasn't enough to call one of 'em the winner.

Moving on to driver stability. I've heard a lot from both "sides" that the ATI/Nvidia drivers are more stable/unstable. Now, it is true that the FX5800U could fry up with some older versions of the Detonator drivers, because the fan would turn off when running 3D screensavers. That problem, however, really didn't cause as much damage as some sites made it look. Oh, and if you're wondering, that problem doesn't exist anymore. The latest Detonator drivers (currently 44.03) and the latest Catalyst drivers (currently 3.6) are both rock solid. I haven't had any problems whatsoever with either of them.

Conclusion: Neither card is the real winner here. They're very close in performance. If you're a fanboy, go with the card from the company you root for. :) If you're just interested in getting the best card, get the cheapest one. When I did the review, the 5900U was more expensive, though as far as I know, the price difference is now close to nothing. The 9800pro is a little better than the 5900U when you turn on FSAA and AF, but the 5900U, on the other hand, has a slight lead in games that demand more raw speed from the card (Q3, for example).
I'm sitting on a 9800pro myself, but that's because I could get it extra cheap; I found an offer on a site that had gotten too many cards and was selling them for 350 dollars (keep in mind that this was last month). You couldn't go wrong with either one.

By the way, I didn't post the results of the games/benchmarks in picture format. For one, my server hosting them is down, but it would also take a lot more time to put them all up (not that this post didn't take a long time to write, but still). Though, as I hope you can all see, this is a very fair review with no "fanboyism".
Hope you enjoyed reading the text as much as I enjoyed writing it. ;)

*edit* I forgot to put a "-" after CS, meaning that it was a tie, but I've done that now. :)
 
That's some great info there hatred, thanks.

I'm still probably going to go with the 9800 pro, but it is definitely a hard decision.
 
Found this and thought, since HL2's release date could be any time now, I should revive this thread for confused little n00bs (like the starter of this thread :p)
/me puts hands out over thread
REVIVE!
/me says chant to revive thread
 