Synthetic Benchmarks Are Crap

SidewinderX

Well, more evidence is mounting against Futuremark and their line of 3DMark0x programs. Some people swear by them as benchmarks to judge their systems by. They need to learn something. Synthetic benchmarks, such as 3DMark03, are supposed to show the performance of your system as a whole. From Futuremark's website:
By combining full DirectX®9.0a support with completely new tests and graphics, 3DMark03 Pro continues the legacy of being the industry standard benchmark. The high quality game tests, image quality tests, sound tests and others give you an extremely accurate overview of your system’s current gaming performance.

However, in a recent test by Ace's Hardware, a Pentium II 350 MHz with a 9700 Pro video card was able to score as well as a P4 2.8 GHz machine with a 9600 Pro, and as well as a 1.4 GHz Celeron with a 9700 Pro, in 3DMark03. (In the three DirectX 9 tests, that is; in Wings of Fury, the DX7 test, the scores were as expected.)
However, in real game benchmarks, the PII was not able to even pretend to keep pace with the 1.4 GHz and the 2.8 GHz PCs.

Does 3DMark03 really test your system as a gaming machine? No.

linky

After testing both 3DMark03 and a variety of real-world games based on DirectX 7, 8.1, and 9, we suspect that 3DMark03 is really more of a video card benchmark than anything else. It seems that some of the game tests, especially Mother Nature, measure video card performance almost in isolation from the rest of the system.

For a synthetic graphics accelerator benchmark, this is an ideal situation. But this is not how 3DMark03 has been represented. It is stated to provide an overview of your "system's current gaming performance." Yet from our results, it has failed in that charge twice, by grossly exaggerating the performance of a 350 MHz Pentium II with a Radeon 9700 Pro and minimizing the performance of a 2.8 GHz Pentium 4 with a Radeon 9600.
 
How is that possible? That a 350 MHz PII scores as well as a P4 2.8 GHz? Because synthetic benchmarks are crap.
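Think of it with a toy model (all numbers made up, just to illustrate): the CPU and GPU largely work in parallel, so per-frame time is roughly max(CPU time, GPU time). If the GPU is the wall, the CPU drops out of the math entirely:

```cpp
// Toy model (invented numbers): frame time ~ max(cpu_ms, gpu_ms)
// when CPU and GPU work overlap. In a GPU-bound test the CPU
// barely moves the score.
#include <algorithm>
#include <cstdio>

int main() {
    const double gpu_ms = 30.0;        // same 9700 Pro in both machines
    const double slow_cpu_ms = 25.0;   // 350 MHz box: slow, but still under 30
    const double fast_cpu_ms = 3.0;    // 2.8 GHz box
    std::printf("slow CPU: %.1f fps\n", 1000.0 / std::max(slow_cpu_ms, gpu_ms));
    std::printf("fast CPU: %.1f fps\n", 1000.0 / std::max(fast_cpu_ms, gpu_ms));
    // Both lines print 33.3 fps: the GPU is the bottleneck, so the
    // benchmark literally cannot tell the two CPUs apart.
    return 0;
}
```

Real games do a lot more on the CPU per frame (AI, physics, draw call setup), which is why the scores diverge there.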
 
Agree with you, Sidewinder... staring ourselves blind at synthetic benchmark scores is very stupid!
 
I started to believe Futuremark was crap once they took back the statement that nVidia was cheating.
 
...The main problem with any 3D benchmark ( synthetic or game based )
is that the results can be totally screwed by drivers that are "optimized"
for the benchmark.

"Optimized" drivers do not give you any idea of the real speed of the video-card.

i.e. What happens when I play an old game or a new game that the
video driver was not optimized for ? How bad will my $500 card suck ?

This is why I'm looking forward to new video-card reviews that ONLY use
custom benchmarks, or benchmarks that have cheat detection built-in to
them so I can trust the results.
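Conceptually, an app-specific "optimization" is just detection plus a swap, something like this (pure hypothetical pseudo-code, NOT real driver source):

```cpp
// Hypothetical sketch of what benchmark detection in a driver boils
// down to -- not from any real driver, just illustrating the idea.
#include <string>

enum class RenderPath { Honest, BenchmarkSpecial };

RenderPath pick_render_path(const std::string& exe_name) {
    // Recognize the benchmark by its executable name and quietly swap
    // in cut-down shaders, static clip planes, etc.
    if (exe_name == "3DMark03.exe")
        return RenderPath::BenchmarkSpecial;  // the inflated score
    return RenderPath::Honest;                // what your games actually get
}
```

Rename the benchmark's executable and the "optimization" vanishes, which is exactly why custom or renamed benchmarks catch this stuff.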


G.F.
 
But still, synthetic benchmarks only benchmark your video card, and in 3DMark03's case, on technologies that aren't being used in any planned games (PS 2.0)
 
...The main problem with any 3D benchmark ( synthetic or game based )
is that the results can be totally screwed by drivers that are "optimized"
for the benchmark.

"Optimized" drivers do not give you any idea of the real speed of the video-card.
But... those drivers can be similarly "optimized" for a game like UT2K3.

This horrid CPU scaling you're talking about is five months old now. What we will have to do is wait at least one to two more years to see if things improve. They may say it's a real-world test, but in reality almost 100% of modern games scale with every part of your hardware. It was almost the same case with 3DMark 2001 when that was released; give it a little bit of time before bashing it.
 
those drivers can be similarly "optimized" for a game like UT2K3
...But what happens when you play a game that the driver has no "optimization" for ?

i.e. I don't plan on buying ONLY the games the driver has "optimizations" for.

"Optimizations" can fool you into thinking an average video card is a ROCKET
with games X / Y / Z, while it might be a total SLUG with everything else.

Take away ALL the "optimizations", and you get a better idea of how fast a
card is with all 3D games/apps.

i.e. "OPTIMIZATIONS" = CHEATING


G.F.
 
But still, synthetic benchmarks only benchmark your video card, and in 3DMark03's case, on technologies that aren't being used in any planned games (PS 2.0)
...So you have to realize what 3Dmark 2003 is useful for.

Is it useful for testing overall system game performance ? NO

Is it useful for testing DX 7 video card performance ? NO

Is it useful for testing DX 9 video card performance ? YES

Until we have a bunch of DX 9 games to test with, 3Dmark 2003 is one
of the few DX 9 benchmarks we have.

It matters because many of the new games in 2004 will use the same
features. Vertex and Pixel shader performance is going to be a very big
deal with future games ( e.g. Half-Life 2 ), so it's good to find out now if
the card you're planning on buying will hack it.


G.F.
 
Originally posted by GordN FreeLoadR
...But what happens when you play a game that the driver has no "optimization" for ?

i.e. I don't plan on buying ONLY the games the driver has "optimizations" for.

"Optimizations" can fool you into thinking an average video card is a ROCKET
with games X / Y / Z, while it might be a total SLUG with everything else.

Take away ALL the "optimizations", and you get a better idea of how fast a
card is with all 3D games/apps.

i.e. "OPTIMIZATIONS" = CHEATING


G.F.
That is generalizing. Optimisation is not necessarily cheating. Optimising for a benchmark is, but not for a game. I wouldn't count anything that preserves image quality in a game and still raises fps as a cheat... However, the UT2K3/nVidia deal does NOT fall under this, as it obviously messed seriously with quality (really seriously).
But what are you going to do when the game devs appear to ignore something the driver devs could fix? NWN is an obvious example. The pixel-shaded water is STILL broken, and performance is horrific on ATI cards (I'm starting to think there is an if (graphics_card_ID != Nvidia) insert_lag(graphics_card); in it). The devs aren't doing anything, and the driver devs are saying they will look into it. So will ATI cheat in NWN? They might optimise.
 
Optimisation is not necessarily cheating. Optimising for a benchmark is, but not for a game.
..."Optimizations" wouldn't bother me as much if companies like nVidia
published a list of all the apps their drivers are "optimized" for. Then I
would know which games their cards might be great for, and which
benchmarks to ignore because the optimizations were screwing up the
results.

I doubt that I will ever see such a list from nVidia.


G.F.
 
Well, those optimisations wouldn't need to be public. Mostly it's just bugfixes and small tunings. And the rest... they don't want you to see them, because there they are doing something against their own guidelines for the drivers ;)

Benchies should never be tweaked for, though. A benchmark is supposed to be raw code, not a contest in who can compress it the most for the fastest performance.

But in general, if you see a "The Way You're Meant To Get Fooled" sign on a game, it's got nVidia code in it. Well obviously, it wouldn't run otherwise :D
 
Originally posted by GordN FreeLoadR
...So you have to realize what 3Dmark 2003 is useful for.

Is it useful for testing overall system game performance ? NO
Is it useful for testing DX 7 video card performance ? NO
Is it useful for testing DX 9 video card performance ? YES
Until we have a bunch of DX 9 games to test with, 3Dmark 2003 is one
of the few DX 9 benchmarks we have.
It matters because many of the new games in 2004 will use the same
features. Vertex and Pixel shader performance is going to be a very big
deal with future games ( e.g. Half-Life 2 ), so it's good to find out now if
the card you're planning on buying will hack it.
G.F.

I'm going to point out a few things that are very wrong with what you said.
1) 3DMark03 is not a DirectX 9 benchmark, or a vertex/pixel shading benchmark for that matter. Why? Wings of Fury, test 1 of 4, is a DirectX 7 test, and nothing uses DirectX 7 anymore. The next two tests use PS 1.4 extremely heavily, an advanced DX8 shader version that no game ever used; most games used PS 1.3, and the new games use PS 2.0 (I was mistaken earlier). Mother Nature is the only test in 3DMark03 that uses PS 2.0, and it uses it sparsely. Thus 3DMark03 is neither a good gaming benchmark nor a good DX9 benchmark.
 
I'm going to point out a few things that are very wrong with what you said.
...I've looked into it, and you're absolutely right. I had always thought that
3DMark 2003 used 2.0 shaders for most of its benchmarks, but this is
not the case.

Here's a quote from an Extremetech 3DMark article from last February.
http://www.extremetech.com/article2/0,3973,1088795,00.asp
This test is supposed to be a DX9 showcase, but in fact, only one-third (three out of nine) shader rendering operations use DX9 PS 2.0 shader techniques. The other six, as in the other two shader-focused tests, use PS 1.4 if compatible hardware is present, or falls back to PS 1.1 if not. What we would like to see is all shader tests in this particular test using PS 2.0 shader techniques. FutureMark seems to have taken a design approach that implements PS 2.0 effects as "special effects" only on specific parts of the scene, and when considered in this way, the approach does seem more reasonable. It's also reasonable to foresee games that will implement PS 2.0 shader effects incrementally, rather than use them in an entire scene.
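The fallback the quote describes is basically a caps check. Here's a rough sketch of that logic against the D3D9 API (assumes you already created an IDirect3D9* with Direct3DCreate9; error handling omitted):

```cpp
// Sketch of a PS 2.0 -> 1.4 -> 1.1 fallback via the D3D9 caps query.
#include <d3d9.h>

const char* pick_shader_profile(IDirect3D9* d3d) {
    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);
    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        return "ps_2_0";  // the DX9 path (the Mother Nature effects)
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 4))
        return "ps_1_4";  // the DX8.1 path most of 3DMark03 leans on
    return "ps_1_1";      // lowest common denominator
}
```

So a card's 3DMark score depends heavily on which of those branches it takes, not just on raw speed.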
OK, so now that 3DMark 2003 is in the dust-bin, what's a good DirectX 9
benchmark ?


G.F.
 
Still, it's fun to have a pissing contest, no?
 
OK, so now that 3DMark 2003 is in the dust-bin, what's a good DirectX 9
benchmark ?
Aquamark 3 is coming soon... But it's originally VERY nVidia-biased and specific, so I don't know if I can trust it. They say they won't optimise for nVidia, but if the base engine already is... well, you get the idea.

And of course, wait a while and we've got Half-Life 2!!
 