Futuremark 340 patch results...

Originally posted by Habibapotamus
The patch seems to slow down nVidia quite a bit!

You mean it stops Nvidia's "optimizations" [read: cheats] from working.
 
Not news to me... Nvidia has "optimised" so much for 3DMark that it's useless as a comparison tool anymore.

I still think it's funny when people are like "I just lost 40 3DMarks with these new drivers... they must be garbage".

It becomes more useless every day... a lot of review sites don't even use it anymore.
 
Naw, not really news. We KNEW Nvidia did some very dubious optimisations... Even if the shaders do produce a fairly accurate image, they have to be specifically tweaked for Nvidia hardware.

And of course, there's not really any point in comparing the new drivers anyway. The new patch only shows the 3DMark-specific optimisations being removed, not the global optimisations in the new drivers.

Here's another: http://www.beyond3d.com/articles/3dmark03/340/
Be sure to read to the end, with comments from both ATI and Nvidia.

I love this one:
At one point we asked Derek how this sat with the optimisation guidelines that were given to the press by NVIDIA, specifically the guideline that suggests "An optimization must accelerate more than just a benchmark". To which Derek's reply was "But 3DMark03 is only a benchmark" -- we suggested that this particular guideline should read "An optimization must accelerate more than just a benchmark, unless the application is just a benchmark"!
 
It definitely will be interesting to see where Nvidia goes from here. Everyone knew they had been cheating for a long time, and now FM is trying to save face by showing some more evidence. Of course Nvidia can just skip around 3DMark now, continue to optimize for other applications, and leave 3DMark behind. It would do them no good, but they could avoid having to change a lot of code. I think the simplest thing for them to do now is to own up to everything and start over.
 
Best run ever... see my specs, ran my computer at that, all defaults, no fancy stuff...

Got a score of 1998, that's right, almost 2000, and they said no DirectX 8.1 or lower card gets to 2000 or more... I'll prove them wrong!!!
I love my card, it's an Albatron, and it scores better than any other result anyone has posted on Futuremark... I've seen like two scores above me, at most like 80 points more (they had a P4 @ 3.4GHz, and my Athlon 2.2GHz was right up there). I can only imagine what my beauty could do with a DX9 card ^.^ :D
 
Originally posted by Washuu
Best run ever... see my specs, ran my computer at that, all defaults, no fancy stuff...

Got a score of 1998, that's right, almost 2000, and they said no DirectX 8.1 or lower card gets to 2000 or more... I'll prove them wrong!!!
I love my card, it's an Albatron, and it scores better than any other result anyone has posted on Futuremark... I've seen like two scores above me, at most like 80 points more (they had a P4 @ 3.4GHz, and my Athlon 2.2GHz was right up there). I can only imagine what my beauty could do with a DX9 card ^.^ :D


This kind of thing was exactly my point...

...why do you care about 80 points on this all-but-useless test?


Do you want to know why your CPU was so close to that P4? Because this test has almost nothing to do with your CPU... I could run it on a P3 500MHz and get the same score if I had the same graphics card.
 
Damn... I never realized 3DMark03 was becoming so irrelevant these days. I think it's mainly 'cause it looks soooooo damned good (looks like it was designed by 2Advanced.com... not sure, though).

Is AquaMark3 any good?
 
Originally posted by NSPIRE
Damn... I never realized 3DMark03 was becoming so irrelevant these days. I think it's mainly 'cause it looks soooooo damned good (looks like it was designed by 2Advanced.com... not sure, though).

Is AquaMark3 any good?
AquaMark3 is nearly as bad a benchmark.

The problem with 3DMark is that it isn't a game engine. It really doesn't reflect game performance in any way, now or in the future. It's fancy effects, and that's it.

The problem with AquaMark3 is that, for starters, only a few of its 100+ shaders are DX9, so it isn't the modern benchmark it claims to be. Further, the developer is VERY doubtful when it comes to Nvidia performance, because their previous games were TRUE TWIMTBP titles, with lots of Nvidia optimisations already in the engine. What does AquaMark3 have? No one knows.

Though it's only nearly as bad, AquaMark3 wins over 3DMark because it can test more, and its performance is slightly more 'game accurate'.
 
From what I've seen, AquaMark3 can give a nice indication of the power of your CPU and GPU. But comparing 0.1 FPS differences is rather pointless, as real performance can differ greatly from game to game.

The visual scenes in AquaMark3 are great for impressing your friends, though. Same with 3DMark03.
 
Hmm. So what would you guys suggest as the best benchmarking tool out there?

'Cause I'm currently downloadin' AquaMark3, 'cause I dunno what else I should download :).
 
Originally posted by NSPIRE
Hmm. So what would you guys suggest as the best benchmarking tool out there?

'Cause I'm currently downloadin' AquaMark3, 'cause I dunno what else I should download :).
Every application is a benchmark :)
The best benchmarking is done with a variety (10+) of applications and settings, and by analysing the resulting images. There is no other way.
And if Nvidia has their way, ALL benchmarks (including games) will be useless, because any application using pixel shaders (standard in all benchies) will be 'optimised' on the fly. A benchmark is only a benchmark as long as both cards are put through the same test. With Nvidia's system, that is impossible.
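The "variety of applications" idea can be made concrete: normalize each card's score against a baseline card per application, then take the geometric mean of the ratios so no single test (and no single-test "optimisation") dominates the verdict. A minimal sketch, where all scores and card names are made up for illustration:

```python
from math import prod

# Hypothetical scores per application for two cards (higher is better).
# Spreading the comparison over many applications dilutes the effect of
# any one benchmark being gamed by a driver.
scores = {
    "3DMark03":  {"card_a": 6100.0,  "card_b": 5400.0},
    "AquaMark3": {"card_a": 44500.0, "card_b": 46100.0},
    "UT2003":    {"card_a": 182.0,   "card_b": 175.5},   # fps
    "Halo":      {"card_a": 41.2,    "card_b": 44.8},    # fps
}

def geomean_ratio(scores, card, baseline):
    """Geometric mean of card/baseline ratios across all applications."""
    ratios = [apps[card] / apps[baseline] for apps in scores.values()]
    return prod(ratios) ** (1 / len(ratios))

rel = geomean_ratio(scores, "card_a", "card_b")
print(f"card_a is {rel:.2f}x card_b overall")
```

With these invented numbers the two cards come out within a couple of percent overall, even though individual tests disagree by 10% or more either way, which is exactly the point of aggregating.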
 
Originally posted by NSPIRE
Hmm. So what would you guys suggest as the best benchmarking tool out there?

'Cause I'm currently downloadin' AquaMark3, 'cause I dunno what else I should download :).

Download AquaMark and 3DMark 2003... I mean, all you're going to do is compare scores anyway.
AquaMark is VERY CPU intensive, while 3DMark is pretty much all GPU.
My AquaMark is like 44,500 and my 3D03 is 6100. Seems good enough for me; it's not like the benchmarks tell you that you have a good PC or anything.
Let us know what you score!
 
Alright, thanks for all the advice. I'll be downloadin' AquaMark3 later today. :p

PS - I'll let ya' know what I score :D.
 
Originally posted by nsxownzme

Let us know what you score!

No... the whole point of this thread is that those scores don't mean jack shit.

Why is that so hard to comprehend?
 
From the HardOCP article:


What about NVIDIA?

NVIDIA has been caught with their hand in the cookie jar this year. They got busted cheating at 3DMark2003. I am not sure if there were upper level management involved in the decision to "aggressively optimize" for 3DMark2003, but surely it was done regardless of motive. While the synthetic video card benchmark was already headed down a rocky and treacherous road, NVIDIA stepped in with their cheating and drove the bus right off the cliff and into a blazing fireball. Yes, NVIDIA killed the synthetic benchmark in my eyes. Did we know the benchmark could be compromised? Of course we did. Did we ever think any company would have the guts to twist the results the way NVIDIA did? Honestly, I did not. How naive I was.

NVIDIA is far from being "clean" on this whole deal, but I am unsure whether Futuremark has stepped too far in the other direction this time. Is it OK for Futuremark to say, we like the way ATI does things with their drivers so we are going to leave that alone? Then look at NVIDIA and decide that NVIDIA is doing it "wrong"?

I could care less if NVIDIA has to hire 100 times more engineers to get the same results as ATI. I do not care if they have to optimize. I do not care if they have to do more work than ATI to get the same image quality. It really makes no difference to me and really it will not make much difference to the end user buying the card. He just wants the games he plays to perform well with stunning visual quality.



Our Thoughts:

A benchmark is worthless to me if overnight the results can change by 15% without proper explanation as to why exactly that happened. Futuremark knows very well what exactly has changed with their benchmark but has not filled in the public that pays attention to their tool. That is inexcusable. Shame on you Futuremark. If they are going to take such actions, they should be accountable for them beyond a spineless PR ramble that tells us nothing.

Who knows what NVIDIA is doing to twist the benchmark results? We know from their track record that they are not above cheating. Is that what they are doing here again? I do not know and I would not count on Futuremark to ever tell us the truth as we saw them eat a lot of crow last time they came out and said NVIDIA was cheating. Right or wrong, NVIDIA has more money and a lot more lawyers. I do not much believe anything that I am told by NVIDIA anymore. NVIDIA has lost their credibility this year and that is not something that is easy to regain. They need to stop “PRing” us to death about products that don’t deliver and start getting technology into the hands of board partners that will sell itself once again. Is what they said above true about the latest 3DMark2003 instance? I do not know that either, and I really do not care either as it does not impact realworld gameplay. NVIDIA has shoveled tons of cash Futuremark’s way. They helped build the beast and they are still helping keep it alive. My thought is “deal with it”.

As we said in the opening, games are getting so very complex and diverse that a single tool such as 3DMark2003 is of little or no value. Aside from all the cheating and optimization, 3DMark2003 is simply too narrow of a look at gaming to accurately represent the big picture. Is it a fun tool that can easily be used by the enthusiast? Of course it is and it can be a ton of fun to use and watch or compare data with other enthusiasts. It has a great value there. Should hardware buyers, all the way from you and me to giants like Dell and Compaq/HP, be using 3DMark2003 as a tool for making a purchase? No, we should not. I think it is irresponsible to do so.



Our Options?

If you check our last couple video card reviews (here and here), you will see that we no longer "review" video cards using traditional benchmarks. It came to me one night that what we were doing was all wrong. As a computer hardware reviewer, we are still stuck in that 3dfx frames per second mindset, even though I thought I had already come to terms with that. It seems we were in denial or truly did not understand the whole issue, more likely being the latter.

My thought was that we should NOT be reviewing video cards, but rather evaluating the experiences they provide while gaming. What value is a 3DMark2003 score of an incredible 10,000 points, if playing games on my new video card simply sucks? Exactly, it is of no value.

We have gotten away from the normal "benchmarks" and started focusing on actual performance and image quality in real gameplay with retail games and demos. Yes, the same exact ones that you will buy or download. Yes, and the same drivers that you will have access to as well. There will still be some instances that we have product and drivers that are not public yet, but we are going to stringently focus on retail product. Performance and IQ are the two things that really matter, with driver stability and hardware compatibility being a close third and fourth.


Conclusions & Delusions:

Bottom line is that no company is beyond reproach when it comes to your money being on the table. You need to be able to make an informed decision about the hardware you are buying and that decision is becoming more difficult and not as clear as it used to be. We are going to try our best to help you make that informed decision. My thought is that no current synthetic benchmark is going to tell you really what you need to know when it comes to the gaming experience that a video card and driver will deliver.

If you take but one thing away from our editorial, remember this. The gaming experience supplied by a video card is of the utmost importance for most of you purchasing a video card. So logic would tell us that we should base our decisions on that. Games don’t lie.

Be smart and steer clear of brand loyalty as the landscape can change quickly. Vote with your wallet, it is the only real voice we have when it comes to computer hardware.

Couldn't agree more.
 
Hahaha. Stupid nVidia.... oh well, at least they TRIED. >__<
 
I couldn't care less if it lowers the score... as long as real game performance is good, that's all that matters, and the image quality is getting better, so there's no problem!

Anyhow, you can't realistically "compare" results, because everything in everybody's computer is different: the power supply, the memory, the memory timings, the CPU, HELL, EVEN THE GRAPHICS CARDS DO THINGS DIFFERENTLY!

Nobody's computer is the same, so you CANNOT call it a "BENCHMARK".

"A standard by which something can be measured or judged" - dictionary.com

And as nobody's computer is built to a "STANDARD", everything is different in each PC... different software on there and different drivers.

So the "benchmark" is a load of rubbish, because every computer isn't the same, so you can't really compare them based on graphics cards...

TheRook
 
Originally posted by TheRook
I couldn't care less if it lowers the score... as long as real game performance is good, that's all that matters, and the image quality is getting better, so there's no problem!

Anyhow, you can't realistically "compare" results, because everything in everybody's computer is different: the power supply, the memory, the memory timings, the CPU, HELL, EVEN THE GRAPHICS CARDS DO THINGS DIFFERENTLY!

Nobody's computer is the same, so you CANNOT call it a "BENCHMARK".

"A standard by which something can be measured or judged" - dictionary.com

And as nobody's computer is built to a "STANDARD", everything is different in each PC... different software on there and different drivers.

So the "benchmark" is a load of rubbish, because every computer isn't the same, so you can't really compare them based on graphics cards...

TheRook



The whole point is to see how different setups/components compare to each other under a similar set of circumstances...

Nvidia currently bypasses those circumstances (and admits to it) to make their hardware look faster than it really is.
 
Well... you see, this benchmark may not be the best tool out there, as it doesn't represent real game performance, but it is a benchmark.
You have to use it as one, though. Sites have the same machine and switch out one thing: the video card. Everything else is the same. As for 3DMark's database listing, they include the specs and driver info for a reason. You can find a similar setup and get an idea of how a comparable system will perform. But that is only in 3DMark. We don't play 3DMark, and every game is set up differently.

So it is a benchmark, but it is best to look to benchmarks of the games that you play for a performance indication, or to have many games benchmarked.
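The advice to look at the games you actually play can be illustrated with a toy comparison: the card that "wins" one title can lose another, so a single aggregate score hides the flip. All numbers and card names below are invented for the sake of the example:

```python
# Hypothetical fps results: which card "wins" depends entirely on the game,
# so a buyer should weight the titles they actually play.
fps = {
    "Game A (DX8)": {"card_x": 95.0, "card_y": 88.0},
    "Game B (DX9)": {"card_x": 38.0, "card_y": 47.0},
}

for game, results in fps.items():
    winner = max(results, key=results.get)
    margin = abs(results["card_x"] - results["card_y"]) / min(results.values()) * 100
    print(f"{game}: {winner} leads by {margin:.0f}%")
```

A single combined score would call these cards roughly equal, while a DX9 gamer would clearly want card_y in this made-up scenario.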

ATI also has had a compiler since CAT 3.6...
 
Originally posted by Asus
ATI also has had a compiler since CAT 3.6...
Yeah, I read that today... Apparently the drop is NOT because of Nvidia's fancy shader compiler; the patch doesn't disable the compilers. Meaning it's probably shader replacement we are talking about... which, of course, is not a good way to handle it. Still just theories. Maybe some day someone will crack open the driver and have a peek :)
 
Originally posted by Asus
Well... you see, this benchmark may not be the best tool out there, as it doesn't represent real game performance, but it is a benchmark.
You have to use it as one, though. Sites have the same machine and switch out one thing: the video card. Everything else is the same. As for 3DMark's database listing, they include the specs and driver info for a reason. You can find a similar setup and get an idea of how a comparable system will perform. But that is only in 3DMark. We don't play 3DMark, and every game is set up differently.

So it is a benchmark, but it is best to look to benchmarks of the games that you play for a performance indication, or to have many games benchmarked.

ATI also has had a compiler since CAT 3.6...

This benchmark, no.

I was speaking in general.
 
Yeah, I was responding to TheRook.
I only added that ATI comment at the end after you posted. ;)
 
Not sure if it's related, but I'm sure I read an article which showed the 9800XT reducing image quality in AquaMark3 where the FX5900 did not.
 
What you read was speculation.
It has been figured out that the scene is exactly the same; ATI just rendered it darker. The same load and work went into that scene, and hence the same score.
 