Angry at Valve for the first time :|

epmode said:
if I'm not mistaken, the 9800 runs the DirectX 9 codepath while the FX 5950 defaults to 8. Your version should look better, at least.

DX 8.1. ;)
 
DX 9.0 level, 1024x768, default settings, 4xAA, 8x aniso: ~40 FPS (39.xx)
DX 8.0 level, 1024x768, default settings, 4xAA, 8x aniso: ~60 FPS (59.xx)

Taken with an Athlon XP 2200+, 1GB DDR266 (dual-channel mode), and a Radeon 9800 Pro running at default speeds on the AGP 8X bus.
 
CB | Para said:
This has been debated a dozen bazillion times :|

And does it really matter if X card beats Y card by 5fps?
It's 20 fps, fool. Read before you post :hmph:
 
Fisker, how did you force the 9800 into DX8? Did you just use the -dx8 command-line parameter?
 
-dxlevel 80. Anyhow, I ****ed up; I'd misread the bench. It was 49.17, not 59. (Redoing for confirmation.)

Also, there seems to be more water and specularity in the DX8 test.
 
Someone force a 5900 series into DX9 mode. That will be fun :D

Which looks better, DX8 or DX9, or is there no difference?
 
The water in the DX8 level is more distorted, and the textures don't look as good, at least on the rotating thingies.

Btw, there's also more water in the DX8 stress test; I don't know if it's a rendering difference or what might cause it.

Here's the new tests:

DX 9.0 level, 1024x768, default settings, 4xAA, 8x aniso (-heapsize 256000 -dxlevel 90): 42.5x FPS max
First try: 42.5x
Second try: 41.7x
Third try: 41.9x

DX 8.0 level, 1024x768, default settings, 4xAA, 8x aniso (-heapsize 256000 -dxlevel 80): ~49.9 FPS max
First try: 49.xx
Second try: 49.92
Third try: 49.xx

I had only done the earlier number from memory because I expected a performance increase, but I didn't get one.

Tests done on an Athlon XP 2200+, 1GB of DDR266 in dual-channel mode, and a Radeon 9800 Pro at default speeds on AGP 8X.

Also, after using -dxlevel 80 it stays forced until you launch with -dxlevel 90 (just deleting the parameter won't revert it).

Edit: Someone should do some tests without heapsize.
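Since these runs get averaged by eye anyway, here's a throwaway little script to do it properly (purely my own illustration - the FPS numbers are just the three DX9 tries from above):

```python
# Average a handful of benchmark runs and report the spread.
# The FPS readings are just the three DX9 tries quoted above.

def summarize(runs):
    """Return (mean, min, max) for a list of FPS readings."""
    mean = sum(runs) / len(runs)
    return mean, min(runs), max(runs)

dx9_tries = [42.5, 41.7, 41.9]  # first, second, third try
mean, low, high = summarize(dx9_tries)
print(f"mean {mean:.2f} FPS (min {low}, max {high})")
```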
 
Good job, Fisker. I have an Athlon 2800+ & 9800 Pro; I might get CZ today and run those tests.
 
New benchmark; I only did one run, just for the laughs, but still:

DX 7.0 level, 1024x768, default settings (-heapsize 256000 -dxlevel 70):
92.32 FPS!

I'll try to get some screenshots of it.

Edit: Got a page-not-found error :( I'll try again.
 
:LOL: :LOL: :LOL:

The last two are here.

Edit: And compression isn't the only reason it looks like crap. That's to be expected, though.
 
psyno said:
I believe you are mistaken. Valve has the 5900 Ultra & 5950 Ultra make full use of DX 9.0. I don't know about the other 5900 models (SE, XT, whatever), but I'm almost certain those two do.


No, it's well documented that this test defaults to the DX 8.1 codepath for all FX cards. The R300 and up and the NV40 all default to the DX9 path. This is well known; just check around the forums. I'm not sure what they'll do in the final game, though.
 
thehunter1320 said:
I can see the difference between 30 fps and 60 fps... I see a difference between 60 fps and 90 fps (although slight)... I guess I'm not human :borg:

Bring on the Hattori Hanzo sword. Slaaaaaaaaaaaaaaaash :D

I'm going to get an X800 XT PE with 512MB of memory soon [OCTOBER].
 
Well, it seems I've hit rock bottom, since my FPS is actually starting to decrease!

DX 6.0 level, 1024x768, default settings (-heapsize 256000 -dxlevel 60):
82.xx FPS :(

Screenshots of Dust and the hardware test are being uploaded.

It's pretty nice that it actually scales down to DX6-level acceleration, though. Now if only I had a true DX6-generation card, I might find out whether it's the card or HL that's acting up with the low FPS.

(I actually got about 50 FPS in de_dust.)

Anyway, the screenshots
 
Wow - at DirectX 7 and 6, Half-Life 2 looks like crap. Sheesh.

I pity the people who are still stuck with those kinds of cards.
 
DX 6.0 level, all whistles blowing!11 :LOL: :LOL:

DX 6.0 level, 1024x768, 6xAA, 16x aniso, full detail (yes, even the shader detail, shadows etc., even though it doesn't support them :D) (-dxlevel 60):
76.xx FPS :LOL:

I have two screenshots, with and without all whistles blowing (i.e. some of the same ones I just uploaded).

I removed heapsize since it had no effect on DX 6.0 performance :naughty:
 
Sorry, but I can't help but LOL @ the idea that NVIDIA people would rather play a game on a DX8 codepath, which produces uglier graphics, for the sake of having the fastest frame rate possible.

This thread reminds me of back in the day when NVIDIA users would put down $350 for a GF2 Ultra, NVIDIA's top-of-the-line graphics card at the time, and then go play Quake 3 at 640x480 with all detail turned down to a minimum, and they'd start humping their chair because they were getting 120+ FPS while the game looked about as impressive as Super Mario Brothers.

The fact is, on the DX9 codepath, ATI beats NVIDIA hands down in HL2. But Valve apparently underestimated the base of NVIDIA users, who would rather play with all the eye candy turned down so they can get a faster frame rate.
 
Well, I guess the answer's simple then: make a quick £400 and get an X800. :hmph:

...I guess I could pleasure 100 beautiful chicks for £4 each :naughty:
...or 4 ugly chicks for £100 each...
... :frown: don't look at me like that: ugly chicks need love too
...but they gotta pay. :p
 
Andy said:
Well, I guess the answer's simple then: make a quick £400 and get an X800. :hmph:

...I guess I could pleasure 100 beautiful chicks for £4 each :naughty:
...or 4 ugly chicks for £100 each...
... :frown: don't look at me like that: ugly chicks need love too
...but they gotta pay. :p

moahahahaha :naughty:

£400, I'm paying it no matter what!!!!!!!!!!! I AM getting a new card!!!!!!!!!!1
 
A single real multiplayer game screenshot with DX 6.0 level.

The others are pretty much the same.

And then something i just couldn't resist on doing.

About CS: S in DX 6.0

Besides the obvious mipmapping changes and the lowered world detail, model detail is actually kept up, and it doesn't look all that bad - well, at least not compared to the world detail.

The flashbang is only a white flash, though. The smoke grenade still works, and the lighting effects (flashlight, muzzle flash etc.) aren't as good.
 
Why are all you spoiled brats complaining that one TOP OF THE LINE card does a little better than some other TOP OF THE LINE card? You need a good healthy smack in the mouth if you're going to bitch over this.

If your game runs at a steady 24 FPS and up, you are playing the game smoothly and you should shut the hell up. My computer will barely run HL2, but I should get 24.

Some of you need some chill pills. Enjoy the game, not the "ooohhh look at my !337 FPS".
 
wonkers said:
Why are all you spoiled brats complaining that one TOP OF THE LINE card does a little better than some other TOP OF THE LINE card? You need a good healthy smack in the mouth if you're going to bitch over this.

Yes, because all spoiled brats can buy top-of-the-line cards... I mean, they're all spoiled! :rolleyes:

I'll bitch about this if I want. I paid for the damn shit, SO I WILL BITCH!
 
About this "the human eye can only see x FPS" fight: it really has come up again and again. There was even a nice 3D app written specifically to settle these fights. I'll see if I can find it.
 
wonkers said:
If your game runs at a steady 24 FPS and up, you are playing the game smoothly and you should shut the hell up. My computer will barely run HL2, but I should get 24.

But most of us aren't happy with 24 FPS (I simply wouldn't play at that frame rate).

If these benchmarks are to be trusted, it's not looking too good for the 9800 Pro. Given a choice between 60 FPS with DX 8.1 and 40 with DX 9, I'd take the higher frame rate without hesitation.

Let's face it - there's not much difference in image quality anyway. There is, however, a massive difference between 40 and 60 FPS (the difference between choppy and smooth gameplay). Not only does a lower frame rate look worse, it also detracts from the immersion - the game world feels less believable, and aiming is harder (40 wouldn't be enough for multiplayer, etc.).

Of course, it's worth waiting for further benchmarks - these results do seem rather dubious. (If they do turn out to be true, I'll be trading my 9800 Pro for a 5950 :/ something I never thought I'd say.)
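The 40-vs-60 gap is easier to appreciate in frame-time terms - a quick sketch of the standard 1000/FPS conversion (my own illustration, nothing to do with the benchmark itself):

```python
# Milliseconds per frame: going from 40 to 60 FPS shaves ~8.3 ms
# off every frame, which is the "choppy vs smooth" difference you feel.

def ms_per_frame(fps):
    return 1000.0 / fps

for fps in (24, 40, 60):
    print(f"{fps:3d} FPS -> {ms_per_frame(fps):5.2f} ms/frame")
```

At 40 FPS each frame sits on screen for 25 ms; at 60 it's about 16.7 ms.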
 
Ah, here it is! See for yourself
http://www.intternetti.net/~jiri/fpscmp02/

[ What is FPS Compare ]--------------------------------------------------------

This is an attempt at a small tool to settle the 'can you see the difference between XX fps and YY fps' argument that seems to arise now and then. The idea is taken from an old demo by 3dfx, which showed one half of the screen running at 30 fps and the other at 60 fps. In this version you can select the fps for both sides yourself, using the 'Q' and 'A' keys to increase and decrease the fps for the left side, and 'W' and 'S' for the right side.
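The core trick of that demo is simple enough to sketch - this is my own toy reconstruction, not the tool's actual code: assume a 120 Hz display and only hand each half a new frame once its own frame interval has elapsed:

```python
# Toy model of the split-screen FPS comparison: tick a 120 Hz display
# for one second and count how many *new* frames each half presents.

REFRESH_HZ = 120  # assumed display refresh rate

def frames_shown(target_fps, refresh_hz=REFRESH_HZ):
    """Count distinct frames presented during one second at target_fps."""
    shown = 0
    next_frame_due = 0.0
    for tick in range(refresh_hz):
        now = tick / refresh_hz
        if now >= next_frame_due:      # time for a fresh frame?
            shown += 1
            next_frame_due += 1.0 / target_fps
        # otherwise this refresh just repeats the previous frame
    return shown

left, right = frames_shown(30), frames_shown(60)
print(f"left half: {left} new frames/s, right half: {right} new frames/s")
```

The 30 fps half repeats each frame for four refreshes in a row, the 60 fps half for two - that repetition is what your eye picks up as the lower frame rate.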
 
Warbie said:
But most of us aren't happy with 24 FPS (I simply wouldn't play at that frame rate).

If these benchmarks are to be trusted, it's not looking too good for the 9800 Pro. Given a choice between 60 FPS with DX 8.1 and 40 with DX 9, I'd take the higher frame rate without hesitation.

Let's face it - there's not much difference in image quality anyway. There is, however, a massive difference between 40 and 60 FPS (the difference between choppy and smooth gameplay). Not only does a lower frame rate look worse, it also detracts from the immersion - the game world feels less believable, and aiming is harder (40 wouldn't be enough for multiplayer, etc.).

Of course, it's worth waiting for further benchmarks - these results do seem rather dubious. (If they do turn out to be true, I'll be trading my 9800 Pro for a 5950 :/ something I never thought I'd say.)
Remember, that was with FSAA and aniso enabled. The last test I ran was 75-ish.
 
OMG... you ARE all spoiled. If you paid for your own computer then I suppose you can complain. Otherwise... no.
 
I don't see what graph you were looking at; the one I see is ATI beating NVIDIA??
 
wonkers said:
Why are all you spoiled brats complaining that one TOP OF THE LINE card does a little better than some other TOP OF THE LINE card? You need a good healthy smack in the mouth if you're going to bitch over this.

If your game runs at a steady 24 FPS and up, you are playing the game smoothly and you should shut the hell up. My computer will barely run HL2, but I should get 24.

Some of you need some chill pills. Enjoy the game, not the "ooohhh look at my !337 FPS".
Well, when YOU pay $300 for a top-of-the-line video card, you will be pissed too. 24 frames isn't going to make 95% of the people here happy. Just because you don't have a nice system doesn't mean others shouldn't. Calling people "spoiled" because they have a better computer than you shows how little credibility you have.
 
eyesore said:
I don't see what graph you were looking at; the one I see is ATI beating NVIDIA??
I thought this too but you have to look at the colors very closely.
 
lazicsavo said:
I'm not questioning that; I'm asking why Valve kept saying that ATI cards will have an advantage, and now we see midrange NVIDIA cards having a 20 FPS advantage in the stress test.


Perhaps because, last year when those claims were made, the 5900 Ultra and 5950 were NOT AVAILABLE YET.

Also, NVIDIA got off their collective asses and fixed DX9 in their cards.

Also, they used the Cat 4.8 drivers, which had a MASSIVE image quality boost. It's like playing with AA on when there is none on. I noticed a decline in EVERY game's FPS, but much nicer looking too... :D
 
I did look at the colors very closely; it still looks like ATI winning to me.
 
eyesore said:
I did look at the colors very closely; it still looks like ATI winning to me.

Perhaps - but there's nothing that would make any real difference in gameplay/enjoyment.
 
I get 90 FPS in it without AA on, and that's fine with me, everything else turned on high!
 
EC said:
Actually, that is very misleading. The video cards at the top of the statistics part are at the bottom of the legend. I was very confused at first too. I was wondering how the 5950 was beating the crap out of the 6800 GT.

I'm sure someone has already said this, but I got here late. Anyway... EC, you are right. (I have said this in so many ****ing forums....) NVIDIA cards do not support 24-bit; they only support 32-bit and 16-bit. HL2 happens to be programmed in 24-bit (o_O!!), therefore NVIDIA cards are recommended to run in DX8 mode; if they don't, the 5900 for example will run a shitload slower than the 9600 XT in DX9 mode. Valve has reprogrammed some parts of the game into 16-bit, but won't reprogram the whole thing. These 16-bit parts will only work on NVIDIA hardware, so we ATI owners are lucky not to have to use 16-bit.

Therefore NVIDIA owners will be running the game in DX8 mode. :x
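For what it's worth, the "24-bit" being argued about is shader precision - ATI's R300 parts run pixel shaders at FP24 while the FX series only offers FP16 and FP32 modes. A rough back-of-the-envelope illustration of what fewer mantissa bits costs (my own sketch, nothing from the engine):

```python
# Quantize a value to a given mantissa width and measure the worst
# relative error over random samples. FP16 carries a 10-bit mantissa,
# FP24 a 16-bit one, and FP32 a 23-bit one.
import math
import random

def quantize(x, mantissa_bits):
    """Round x to the nearest value with only mantissa_bits of mantissa."""
    m, e = math.frexp(x)              # x = m * 2**e, with 0.5 <= m < 1
    scale = 2 ** mantissa_bits
    return math.ldexp(round(m * scale) / scale, e)

def worst_error(mantissa_bits, samples=10_000, seed=1):
    rng = random.Random(seed)
    return max(abs(quantize(x, mantissa_bits) - x) / x
               for x in (rng.uniform(1e-3, 1e3) for _ in range(samples)))

for name, bits in (("FP16", 10), ("FP24", 16), ("FP32", 23)):
    print(f"{name}: worst relative error ~{worst_error(bits):.1e}")
```

The error drops by roughly a factor of 64 per step (6 extra mantissa bits each time), which is why FP16 banding could show up in DX9 shaders while FP24 generally held up.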
 
ATI4EVER! said:
Therefore NVIDIA owners will be running the game in DX8 mode. :x

That may be so - but DX8 with a high frame rate > DX9 with a lower one.

(I really hope these benchmarks are wrong and we 9800 users will get DX9-quality visuals at a decent frame rate.)
 
Warbie said:
That may be so - but DX8 with a high frame rate > DX9 with a lower one.


Keep fooling yourself.

DX8 with a high frame rate < DX9 with a high frame rate.

100+ FPS with DX9 is nothing to sneeze at.
 