Benchmark Result Debate - Lies??

Nvidia went to .13 micron chips TOO EARLY, and now they're paying the price. ATi just made a better business decision sticking with .15 micron.

Of course the real market is the mid range cards, not the top of the line. However, ATi seems to be winning there too.

The only thing Nvidia is doing well is mobo chipsets; they'd better concentrate on that before they go the way of 3dfx...
 
Originally posted by Ralphus
Nvidia went to .13 micron chips and now they're paying the price. ATi just made a better business decision sticking with .15 micron.

Of course the real market is the mid range cards, not the top of the line. However, ATi seems to be winning there too.

The only thing Nvidia is doing well is mobo chipsets; they'd better concentrate on that before they go the way of 3dfx...

ATi used .13 micron for the 9600 pro.
 
It was a PR statement from Nvidia: they said that their new video card would use FP32. So Microsoft made FP24 the requirement for DX9. So it's really Nvidia's fault. Sure, it might look good on paper, but when the time came it didn't work so well. DX9 is Nvidia's nightmare.
 
Originally posted by Xtasy0
ATi used .13 micron for the 9600 pro.

Nvidia went with it too early and missed a couple of product cycles (got behind).
 
Originally posted by Xtasy0
that's in the game (well, the menu at least, with the console); notice the choices on the bottom left... and the console window...

Any other pics from that press event?
 
Would be nice if ATi used .13 on the 9800 XT; then it could really get some MHz over the 9800 Pro (which uses .15).
 
Originally posted by Ralphus
Nvidia went with it too early and missed a couple of product cycles (got behind).

Yes, I know. The person I quoted said ATi stuck with .15 micron though, which isn't true :) I was just clarifying.
 
Originally posted by TAZ
I just bought a Radeon 9700 Pro, and with the newest Cat 3.7 drivers all my games seem relatively dark except when I look at the light (bulbs, sky, glare), and then it's almost blinding. Never experienced this before on my old GF 4600. Sure, the games look great and run smooth, but it's hard to play FPS games the way I used to when the enemy is too hard to see. I tried playing with the gamma/contrast/brightness hotkeys to try and get a balance in the game, but then other games are too bright and/or dark. It's annoying as hell, and methinks of going back to Nvidia, since I never had to fiddle with that crap to enjoy gaming with those cards...

You've been having lighting problems with your 9700 and the new Cat 3.7 drivers?? I've got a 9700 Pro as well, but everything works just fine for me. Maybe it's your monitor? Some things just don't look the same on some monitors.... Or you could try re-installing the new drivers....
 
In response to nVIDIA:

http://www.techreport.com/etc/2003q3/valve/Image3.jpg

Above is a list of some of the changes that NV made to their drivers so that their HL2 benchmarks would appear better. If it's true, this is just as slimy and shameful as the 3DMark fiasco. I'll break it down to the best of my ability:

Camera path-specific occlusion culling

Means NV looked at the camera-path used in the benchmark, and hand-coded the drivers not to render certain polys because they knew those polys wouldn't appear in the static path. This does nothing for real, dynamic performance. It's BS.
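
Just to show what that kind of hack would even look like, here's a totally made-up C++ sketch of the idea (this is NOT NV's actual driver code; the object IDs and names are invented for illustration):

// Hypothetical illustration of camera-path-specific culling: the set of
// objects to skip is precomputed offline for the benchmark's fixed camera
// path, so no real visibility test happens at runtime.
#include <cstdio>
#include <unordered_set>

// IDs of objects known (from offline analysis of the demo's camera path)
// never to be visible during the timedemo. Entirely made up.
static const std::unordered_set<int> kNeverVisibleOnDemoPath = {12, 47, 301, 902};

bool shouldDraw(int objectId, bool runningKnownBenchmark) {
    if (runningKnownBenchmark && kNeverVisibleOnDemoPath.count(objectId))
        return false;               // skipped without any real occlusion query
    return true;                    // everything else is drawn normally
}

int main() {
    for (int id : {12, 13, 301, 302})
        std::printf("object %d drawn in benchmark: %s\n",
                    id, shouldDraw(id, true) ? "yes" : "no");
}

Point being: move the camera off the canned path (i.e. actually play the game) and none of this "optimization" applies.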

Visual quality tradeoffs

This one isn't as bad; it just means the DX9 path on ATI looks better than the DX9 NV hack.

Screen-grab specific image rendering

Here's one that someone should get fired for. The NV drivers detect when the user is taking a screenshot and TURN ALL THE DETAIL BACK ON. This way people can't take two screenshots and compare the visual differences between HL2 on ATI and HL2 on NV. BS!!!
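
Again, purely as an illustration of the concept and not anything from a real driver (the names are made up), the trick is as simple as this:

// Hypothetical sketch of "screen-grab specific rendering": reduced detail is
// used for normal frames, but full detail is restored for any frame that will
// be captured, so screenshots hide the quality loss.
#include <cstdio>

enum class Detail { Reduced, Full };

Detail detailForFrame(bool screenshotRequestedThisFrame) {
    // Normal frames get the cut-down path; captured frames get everything back.
    return screenshotRequestedThisFrame ? Detail::Full : Detail::Reduced;
}

int main() {
    std::printf("normal frame:     %s\n",
                detailForFrame(false) == Detail::Full ? "full" : "reduced");
    std::printf("screenshot frame: %s\n",
                detailForFrame(true)  == Detail::Full ? "full" : "reduced");
}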

Lowered rendering precision

Again, a hack that degrades visual quality. Carmack talked about this in depth.
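
If you want to see why dropping precision matters, here's a quick made-up C++ demo. It only approximates FP16 by keeping ~11 significant bits (real half floats also have a much smaller exponent range), but it shows how error piles up over long chains of shader math:

// Rough illustration (not driver code) of FP32 vs FP16-style precision loss.
#include <cmath>
#include <cstdio>

// Approximate FP16 rounding by keeping 11 significand bits.
float roundToFp16Precision(float x) {
    int e;
    float m = std::frexp(x, &e);            // x = m * 2^e, 0.5 <= |m| < 1
    m = std::round(m * 2048.0f) / 2048.0f;  // 2^11 quantization steps
    return std::ldexp(m, e);
}

int main() {
    // Accumulate a small per-pixel increment many times, as a shader loop might.
    float fp32 = 0.0f, fp16 = 0.0f;
    for (int i = 0; i < 10000; ++i) {
        fp32 += 0.0001f;
        fp16 = roundToFp16Precision(fp16 + 0.0001f);
    }
    std::printf("fp32 result:     %.6f\n", fp32);
    std::printf("fp16-ish result: %.6f\n", fp16);
}

The low-precision accumulator stalls long before it should reach ~1.0, which is exactly the kind of banding/artifacting people see when drivers quietly demote FP32 shaders.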

Algorithmic detection and replacement

I'm lost on this one; anyone know? If I were to guess, I'd say that the NV drivers check for certain shader code in HL2 and replace it with their own.
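
If my guess is right, the idea would look something like this made-up sketch (the shader strings and names are invented; a real driver would presumably fingerprint compiled bytecode rather than text):

// Hypothetical sketch of "detection and replacement": the driver fingerprints
// incoming shader code and, when it recognises a specific game shader,
// silently substitutes its own hand-tuned (and possibly lower-quality) version.
#include <cstdio>
#include <functional>
#include <string>
#include <unordered_map>

std::string compileShader(const std::string& source) {
    // Table mapping fingerprints of known game shaders to replacements.
    static const std::unordered_map<size_t, std::string> replacements = {
        {std::hash<std::string>{}("ps_2_0 water_shader"), "ps_1_4 cheaper_water_shader"},
    };
    auto it = replacements.find(std::hash<std::string>{}(source));
    if (it != replacements.end())
        return it->second;          // app asked for X, driver compiles Y instead
    return source;                  // unknown shaders go through unchanged
}

int main() {
    std::printf("%s\n", compileShader("ps_2_0 water_shader").c_str());
    std::printf("%s\n", compileShader("ps_2_0 some_other_shader").c_str());
}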

Scene specific handling of Z-writes

Pathetic. Any scene-specific hacks NV added for the benchmark only serve to inflate numbers artificially. Unless, of course, their driver team is prepared to write scene-specific hacks for every DX9 game that ever comes out. :upstare:
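
Conceptually it would be something this dumb (made-up sketch, invented scene numbers, not real driver code):

// Hypothetical sketch of "scene specific handling of Z-writes": depth writes
// are disabled only during benchmark scenes where offline analysis showed it
// doesn't visibly break anything.
#include <cstdio>
#include <unordered_set>

// Benchmark scenes where the driver has decided it can skip depth writes.
static const std::unordered_set<int> kScenesWithZWritesDisabled = {3, 7};

bool zWriteEnabled(int sceneId, bool runningKnownBenchmark) {
    if (runningKnownBenchmark && kScenesWithZWritesDisabled.count(sceneId))
        return false;   // saves bandwidth, but only valid for this exact demo
    return true;        // any other content keeps correct depth behaviour
}

int main() {
    for (int scene : {1, 3, 7, 9})
        std::printf("scene %d: z-writes %s\n", scene,
                    zWriteEnabled(scene, true) ? "on" : "off");
}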

Benchmark specific drivers that never ship

This is BEYOND slimy. Valve is saying that NV made these kinds of changes to a Detonator branch that will never be released; NV just gave this version to Valve so that the initial HL2 benchmarks would look better. What a surprise for their customers when we try to play HL2 at those settings and get terrible framerates! Oh well!

So get this, kids: while NV is saying they're 'confused' about why Valve didn't use the beta 50 drivers for the benchmarks, Valve says that the hacks in the beta drivers NV tried to give them were never intended to ship! I'm a little more inclined to believe the company that wants its game to run as well as possible on all video cards (Valve) than a company that blatantly tried to cheat 3DMark03 (NV).

App specific and version specific optimizations...

Meaning that instead of having hardware and drivers that follow the DX9 standard, NV is putting little exceptions into their driver code for HL2 to try to make it look better in benchmarks. Again, if NV is going to write these for every update of HL2, and every other DX9 game that comes out, great. If not, it's BS.
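
In other words, something like this made-up sketch: the driver sniffs which executable and build loaded it, and flips a special path for that one title (names and version numbers are invented, not from any real driver):

// Hypothetical sketch of an app- and version-specific code path.
#include <cstdio>
#include <string>

bool useHalfLife2SpecialPath(const std::string& exeName, int appVersion) {
    // Only this exact title and build gets the special treatment; a patch
    // or a different DX9 game falls back to the normal (slower) path.
    return exeName == "hl2.exe" && appVersion == 2003;
}

int main() {
    std::printf("hl2.exe v2003: %s\n", useHalfLife2SpecialPath("hl2.exe", 2003) ? "special path" : "normal path");
    std::printf("hl2.exe v2004: %s\n", useHalfLife2SpecialPath("hl2.exe", 2004) ? "special path" : "normal path");
    std::printf("farcry.exe:    %s\n", useHalfLife2SpecialPath("farcry.exe", 2003) ? "special path" : "normal path");
}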
 
Originally posted by TAZ
My GeForce 4 Ti 4600 is in my Athlon 1800+, which had a GeForce 3 Ti 200; this way me and my son can play HL2 together. My 2 GHz Athlon 2500 (200x10) should be more than okay to run HL2 with my 9700 Pro, but I might upgrade the CPU to a 3000 XP before then.

And I adjusted everything... gamma, contrast, brightness... it's rather too damn dark in places, or way too bright where the light is when I make the dark places more visible. There probably is a setting to make it all clear, but why the hell should I have to play with it in the first place? Should this not be set at defaults? Or could it be my monitor with XP, since when I install the monitor drivers it says they're not recommended by MicroSuck?


Ohhh!!1 You installed drivers for your monitor.... YOU IDIOT!!! If it was working fine before, why did you bother with drivers? Most monitors can be used as PnP; no need for drivers. If I were you, I'd roll back those drivers or completely remove them.

g'luck with making your monitor brighter...
 
Personal Opinion:

Nvidia had a number of problems. The Ati 9700 was top dog. Nvidia's chip manufacturer was having problems moving to 0.13 micron, and the 5800 chip was running way too hot and was way too late. They panicked and released the 'heater with a jet-pack' onto the world....we all know what happened.

Nvidia, knowing the Ati 9800 was just around the corner, released an updated version, the 5900, which ran cooler and a bit quicker. Unfortunately, the problems with DirectX9 were still there. There was no time to fix them; they needed to get a card out there fast.

No one would know until the first DX9 games came out that Nvidia hardware was not up to rendering DX9 shaders. Lucky for Nvidia, their cards were just as good as ATi's at DX8.1 rendering.

Then the UT2003 and HL2 benchmarks arrived. Nvidia tries to cover up its badly designed hardware by 'upgrading' drivers to omit things viewers normally don't see. All because Nvidia made its hardware wrong after $5 billion of investment in the FX series.

IMHO, Nvidia is where 3DFX was with the Voodoo 3. After making a bad product and then an even worse one, they go down the shit hole. Luckily for Nvidia, they also have their nForce2 to keep them floating.

IMHO, Ati should seize this chance, just as they seem to be doing with their really competitive prices. Nvidia have handed them the crown; they had better run with it.

Nvidia, if you're listening... reduce your prices, redesign your cards, or you will go the way of 3DFX.
 
And thus another graphics card company will be born to take its place.
 
Originally posted by TAZ
All games are competitors for GOTY :)

Even that stupid Sims crap which most ppl somehow enjoy and buy.

That raises an interesting question: Was Doom3 delayed until 2004 so it could be game of the year then? Since they knew HL2 was gonna steal the show just like E3, they said, "Screw it, they can have GOTY for 2003. 2004, here we come baby!"
 
Here is what Nvidia's response says to me:




Regarding the Half Life 2 performance numbers that were published on the web, we believe these performance numbers are invalid because they do not use our Rel. 50 drivers. Engineering efforts on our Rel. 45 drivers stopped months ago in anticipation of Rel. 50. NVIDIA's optimizations for Half Life 2 and other new games are included in our Rel. 50 drivers (We're gonna hack the sh*t out of these drivers to get what looks like a performance increase) - which reviewers currently have a beta version of today. (HAHAHAHA, those reviewers are the mothers of our coders. Who the hell is going to find out?) Rel. 50 is the best driver we've ever built - it includes significant optimizations for the highly-programmable GeForce FX architecture and includes feature and performance benefits for over 100 million NVIDIA GPU customers. (These are the WORST drivers we've ever made. Optimizations? What optimizations? If anything, these drivers just make things worse. Highly programmable? This just means that game developers have to work longer, harder hours coding their games specifically for our GeForce FXs. It also means that we get more time to negotiate with game developers to get them to be a part of our "Way it's meant to be played" campaign. Oh yeah, boy did we fu*k over those 100+ million customers of ours....)

Pending detailed information from Valve, we are only aware of one bug with Rel. 50 and the version of Half Life 2 that we currently have - this is the fog issue that Gabe referred to in his presentation. (We can't figure out a damn hack to make this fecking fog work!!!!) It is not a cheat or an over optimization. (Yes, it's a cheat that over-optimizes) Our current drop of Half Life 2 is more than 2 weeks old. (Gabe won't give us the latest drop, so now we don't know how the game ends!) NVIDIA's Rel. 50 driver will be public before the game is available. (As soon as we get done writing those h4x0rz) Since we know that obtaining the best pixel shader performance from the GeForce FX GPUs currently requires some specialized work, our developer technology team works very closely with game developers. Part of this is understanding that in many cases promoting PS 1.4 (DirectX 8) to PS 2.0 (DirectX 9) provides no image quality benefit. (Yeah, we can't figure it out, so we are just going to give up on PS 2.0) Sometimes this involves converting 32-bit floating point precision shader operations into 16-bit floating point precision shaders in order to obtain the performance benefit of this mode with no image quality degradation. (No, really. We are actually telling the truth now. We can't figure out this PS 2.0 sh*t. Feck those ATi guys for figuring it out before us) Our goal is to provide our consumers the best experience possible, and that means games must both look and run great. (I pole dance)

The optimal code path for ATI and NVIDIA GPUs is different - so trying to test them with the same code path will always disadvantage one or the other. (It mainly disadvantages us, those ATi bastards!) The default settings for each game have been chosen by both the developers and NVIDIA in order to produce the best results for our consumers. (This is a lie. ATi got to choose their settings before us... boo hoo)

In addition to the developer efforts, our driver team (Our coders' parents actually do all the work. Hahaha, bitches!) has developed a next-generation automatic shader optimizer that vastly improves GeForce FX pixel shader performance across the board. (We actually figured out how to make this h4x work) The fruits of these efforts will be seen in our Rel. 50 driver release. (Yes, we are fruity fruitcakes) Many other improvements have also been included in Rel. 50, and these were all created either in response to, or in anticipation of, the first wave of shipping DirectX 9 titles, such as Half Life 2.

We are committed to working with Gabe (Screw that fat bastard. He totally screwed us over!) to fully understand his concerns and with Valve to ensure that 100+ million NVIDIA consumers get the best possible experience with Half Life 2 on NVIDIA hardware. (Losers)
 
Originally posted by Anthraxxx
Screw that fat bastard. He totally screwed us over!

Well, if that is from NVIDIA, there goes any chance of Valve working with them in any capacity again.
 
Originally posted by spitcodfry
Well, if that is from NVIDIA, there goes any chance of Valve working with them in any capacity again.

You do know it wasn't, right?
 
Originally posted by spitcodfry
Well, if that is from NVIDIA, there goes any chance of Valve working with them in any capacity again.

Yeah, dude. That's just what Nvidia's response looks like to me. It's not real..... or is it?
 
I believe absolutely NO company that is selling video cards, and I now will only listen to Gabe, because they tested things clean and clear, without bells & whistles. I am sooo glad they called out Nvidia; I bought 2 of their cards and neither lived up to its hype. I've been ATI ever since and very happy! Hello, Radeon 9800 PRO!!
 
w00t! Radeons rule!!! I wonder how my 9700 will perform in HL2....
 
Shame! Shame on Nvidia for crying foul over their driver version. That's lame, and I'm probably not going to buy another Nvidia graphics card for a while.
 
I think it's safe to say that Nvidia got pwned! Just look at anandtech.com's benchmarks....
 
It's not lies...

Nvidia is basically saying this (loose translation):

"We here at Nvidia thinks DirectX9 over little quality difference from DirectX8. So therefor we do not want our customers to use DirectX9. This is The Way Its Meant To Be Played after all."

In about 2 weeks, I expect another paper will come from Nvidia showing that DirectX 7 is both faster AND has negligible quality losses, and that the new NV40 will be highly tuned for it.
 
And my contrast is at 100%, brightness 50%, and I fiddled with them all, the monitor, the desktop, and the D3D settings, to try to get a good balance... but when I finally do, and then play another game, I have to do it again... and yes, I save the profile for the game I finally set. It's just a pain that I never had before with my GF card.

Never had any problems with my 9700np. Everything looks perfect, and it runs excellently. Try removing the monitor drivers.
 