NV38 / R360 Half-Life 2 Benches

Read the comments on Anand's Weblog:

http://www.anandtech.com/weblog/comments.cfm?bid=10

These kids know what they're talking about, at least. I especially like this kid's post:

This is some dangerous marketing stuff, meaning some uninformed people who might buy an ATI card could now be confused or change lanes to Cheatzilla. In other words, it stops potential buyers who 'think' that info coming from such a "reliable source" (pun intended) should be taken seriously, only for them to find out later that the game might look like this:
http://www.iol.ie/~baz8080/crap.jpg
 
Yep, shame that, assuming they're true, the Nvidia card is running in 'mixed mode', featuring patented Reduced Image Quality(tm), less graphical niceness, especially shaders(tm), and probably a price difference of about £50 minimum(tm).
Wow mummy, I want an Nvidia card for Christmas!
 
It doesn't matter. It's been proven through numerous benchmarks with other software that the 5950 Ultra performs almost identically to the 5900 Ultra (less than 1% difference in some cases).

These numbers will be what you'll see on the 5900/5900 Ultra/5950 Ultra.
 
Oh word? Then I hope they did those benchmarks using the Det 51.75s! I heard those are real winners in the IQ department!
 
All thanks to the Cheatonator 50s plus the HL2 mixed path for nVidia users.
Got ATi? No need to worry about image quality reduction :)
 
omfg... this is weird... how the hell did they do that? There is NO way these benchmarks can be true, and I think it's a shame that the "guru of 3D" posted this crap in his news. The site just lost my respect :p
 
It's still in mixed mode, which is DX8.1/9, while ATI is running full DX9 mode...
 
Those are probably using the new 5x drivers... and it wouldn't surprise me if the source is Nvidia themselves or something. Just presenting those results that way shows SO many errors: no mention of it being mixed mode (in the text, of course), no mention of resolution (also in the text, but not in the results), no mention of drivers, not even a mention of the system, and no image comparison. It's a typical propaganda display.
 
Rejoice? Because the scores are practically equal, yet the nVidia card has reduced IQ, lacks full DX9 technology, and is the newer card on the market? Hmm.
 
I already said I'm a changed man (I started hating Nvidia), so I'm not saying anything good in this thread...
 
I wouldn't bash Nvidia yet; let's wait for real benchmark results and IQ screenshots to compare. You guys are drinking in all this marketing crap from either Nvidia or ATI. Well, I'm no better than any of you, as I got suckered into getting a Radeon 9800 Pro (which so far isn't that much better than my Ti4200 in games like RVS or OFP). Anyway, let's judge later.
 
Originally posted by ale2999
I wouldn't bash Nvidia yet; let's wait for real benchmark results and IQ screenshots to compare. You guys are drinking in all this marketing crap from either Nvidia or ATI. Well, I'm no better than any of you, as I got suckered into getting a Radeon 9800 Pro (which so far isn't that much better than my Ti4200 in games like RVS or OFP). Anyway, let's judge later.
I know HL2 is going to run FINE on my rig, but not as fine as I wanted, so there!
 
Originally posted by ale2999
I wouldn't bash Nvidia yet; let's wait for real benchmark results and IQ screenshots to compare. You guys are drinking in all this marketing crap from either Nvidia or ATI.

Um, ATI will be displaying true DX9 shaders while these Det 50s will be displaying 'mixed mode' effects, and you want to wait for IQ comparisons?

I am going to stick with true DX9 performance, thanks.
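
For context on the 'mixed mode' vs. full DX9 argument above: mixed mode is a capability- and speed-driven fallback render path, not a separate game. Below is a minimal sketch of the idea in Direct3D 9 terms; the PickShaderPath function and preferPartialPrecision flag are made-up names for illustration, not Valve's actual code.

    #include <windows.h>
    #include <d3d9.h>

    // Hypothetical labels for the render paths being argued about in this thread.
    enum ShaderPath { PATH_DX81, PATH_DX9_FULL, PATH_DX9_MIXED };

    ShaderPath PickShaderPath(IDirect3D9* d3d, bool preferPartialPrecision)
    {
        D3DCAPS9 caps;
        if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
            return PATH_DX81; // be conservative if the caps can't be read

        // Hardware below ps_2_0 only gets the DX8.1 path.
        if (caps.PixelShaderVersion < D3DPS_VERSION(2, 0))
            return PATH_DX81;

        // ps_2_0 hardware: either run every shader at full precision, or fall
        // back to a "mixed" path that swaps in ps_1_x shaders and partial-
        // precision hints where full-precision DX9 shaders are too slow.
        return preferPartialPrecision ? PATH_DX9_MIXED : PATH_DX9_FULL;
    }

The complaint in this thread is that the FX cards take the mixed branch, so the numbers being compared do not come from the same workload as the ATI numbers.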
 
From the Anand thread:
I would say calling DX9 a standard is a bit of a stretch. It's not as if it was defined by some independent standards body, like, say, the OpenGL ARB. Let's call it what it is: a Microsoft specification.

And it seems rather obvious that Microsoft has been somewhat concerned of late with Nvidia's dominance in the graphics industry. Nvidia had the audacity to refuse to lower the already slim margin on the XGPU, despite repeated aggressive demands from Microsoft. Aside from that, Nvidia has consistently pushed a multi-OS strategy and OpenGL as an alternative API. This obstinacy clearly is intolerable.

Could it be a coincidence that MS gave their seal of approval to Nvidia's struggling competitor and at the same time left Nvidia high and dry with their ambitious 32-bit architecture?

I would say the surprise is not that Nvidia is behind. It's how well they manage to keep up. And it remains to be seen how much of a tradeoff the mixed precision path will be.
Just helps to keep things in perspective.
 
A note on another site (don't remember the link, I'll find it later): it was doing a 4xFSAA/8xAF test, and the ATI looked better. Not by a lot, but you could see the NV card was not really doing 4xFSAA, more like 2xFSAA, which is cheating.
 
Originally posted by Frizz
Yes, it's mixed mode:

http://www.anandtech.com/video/showdoc.html?i=1890&p=4

I was also reading someone else's, and in DX9-only games ATI still won by a long way.
Yep, on the Halo bench there is a 0.2 fps difference between the NV38 and NV35, and 1.6 fps for Aquamark3, showing that either:

A) The unconfirmed, incomplete and flawed Half-Life 2 results are borked.
B) The thoroughly reviewed and compared Halo/Aquamark3 results are borked.

Guess which one has a 99.9% chance of being the correct one? :p
 
Originally posted by dawdler
Yep, on the Halo bench there is a 0.2 fps difference between the NV38 and NV35, and 1.6 fps for Aquamark3, showing that either:

A) The unconfirmed, incomplete and flawed Half-Life 2 results are borked.
B) The thoroughly reviewed and compared Halo/Aquamark3 results are borked.

Guess which one has a 99.9% chance of being the correct one? :p

You forgot C:

C) They put every cheat in the book into getting the Half-Life 2 benchmark to show good scores on their cards.
 
Originally posted by shapeshifter
You forgot C:

C) They put every cheat in the book into getting the Half-Life 2 benchmark to show good scores on their cards.

No, he didn't need your "C)". Your "C)" is the same as A): the unconfirmed, incomplete and flawed Half-Life 2 results are borked.
 
It is funny how ultimately everything leads back to the big king sitting on top of the hill playing with his minions: none other than our beloved Microsoft.
 
I would like to see how the other cards will run this game as well. So they came out with the NV38, whoop de doo. It's not like the majority will be using the 5950 anyway. Does Nvidia expect their customers to buy their new card, which is more expensive than the Radeon 9800 that performs virtually the same, just to play on an Nvidia card? What about the 5900 and 5600/5200 users? Is Nvidia just going to forget about them? How about some more benchmarks from cards other than their flagship?
 
Originally posted by fudnick
What about the 5900 and 5600/5200 users? Is Nvidia just going to forget about them? How about some more benchmarks from cards other than their flagship?

They forgot all about those few 5800 users :D. Hell, they even removed all mention of it from their site.
 
Haha, I'm so happy now that I actually own an FX5900U. By the way, who plays a game because of graphics? And the difference between mixed mode DX8.1 and DX9 is never noticeable while you're playing the game; I can almost guarantee it. They're also comparing the GFX 5900 with the 9800XT, which I think is a bit better.
 
Originally posted by KaoS87
Haha, I'm so happy now that I actually own an FX5900U. By the way, who plays a game because of graphics? And the difference between mixed mode DX8.1 and DX9 is never noticeable while you're playing the game; I can almost guarantee it. They're also comparing the GFX 5900 with the 9800XT, which I think is a bit better.
Who plays a game because of graphics? LOL, glad not everyone thinks like you, or we would still be in the era of Blake Stone graphics.
 
Originally posted by KaoS87
Haha, I'm so happy now that I actually own an FX5900U. By the way, who plays a game because of graphics? And the difference between mixed mode DX8.1 and DX9 is never noticeable while you're playing the game; I can almost guarantee it. They're also comparing the GFX 5900 with the 9800XT, which I think is a bit better.

Sorry to burst your bubble, but the Half-Life 2 benchmark might only have run this well on the Nvidia card because of scripts that enable clip planes and other hacks, similar to the 3DMark03 hacks that gave Nvidia cards higher numbers.

I wouldn't be surprised at ALL if the game actually performs a lot worse.
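
For anyone unsure what the 'clip planes' accusation means: a user clip plane in Direct3D 9 simply discards geometry on one side of a plane before rasterization, which cuts work along a fixed benchmark camera path. Below is a minimal sketch of the mechanism only; the function name and plane values are hypothetical, not taken from any driver.

    #include <windows.h>
    #include <d3d9.h>

    // Enable one static user clip plane on an existing device. Anything on the
    // negative side of the plane equation (a*x + b*y + c*z + d < 0) is clipped,
    // so the GPU renders less along a predictable camera path.
    void EnableStaticClipPlane(IDirect3DDevice9* device)
    {
        // Hypothetical plane: normal along +Z, offset 50 units from the origin.
        const float plane[4] = { 0.0f, 0.0f, 1.0f, 50.0f };

        device->SetClipPlane(0, plane);                       // plane index 0
        device->SetRenderState(D3DRS_CLIPPLANEENABLE, D3DCLIPPLANE0);
    }

This is legitimate API usage on its own; the 3DMark03 controversy was about a driver inserting planes like this behind the application's back, which falls apart as soon as the camera leaves the scripted path.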
 
Graphics aren't everything in a game; I'd happily take something that looked a little worse but played better. I can't say the same thing when it comes to graphics cards, though; by their very name, it's the graphics and only the graphics that count.
 