Angry at Valve for the first time :|

Valve has been saying that ATI cards will be better for HL2 for as long as I can remember, and that's the main reason I bought my 9800.

You can guess I was surprised when I went to this site, which has the first semi-official stress test benchmarks of the CS:S beta:
http://www.vr-zone.com/?i=1193&s=1

Once AA and AF are turned on, the NVidia 5950 beats the ATI 9800 by about 20 fps. Is it just me, or does anybody else find that ridiculous? Valve, if you say that ATI is going to perform better, why do we see a 20 fps difference between the midrange cards?

The only hope I have is that these are not timedemo benchmarks and reflect nothing about HL2 gameplay, but still, Valve, you should really stop making announcements if you keep disappointing us.
 
Actually, that is very misleading. The video cards at the top of the statistics are at the bottom of the legend. I was very confused at first too; I was wondering how the 5950 was beating the crap out of the 6800 GT.
 
If I'm not mistaken, the 9800 runs the DirectX 9 codepath while the FX 5950 defaults to 8. Your version should look better, at least.
 
Why have all the benchmarks released so far been bogus? First those God-awful ones from Driver Heaven, and now these? These Gamers Depot ones I found are the only ones that seem halfway believable:

http://www.gamers-depot.com/hardware/video_cards/source/002.htm

Don't worry, ATI cards will run HL2 better than comparable Nvidia cards. Which is too bad, considering I own a 6800 GT.
 
People, please read my post. They are fairly accurate, just misleading.
 
OK, the NV 5950 gets ~60 and the 9800 series gets ~40 at 1024, 4xAA, 8xAF. That's what this benchmark is claiming. And if it's true in actual gameplay, I'll be ****ing pissed.
 
epmode said:
If I'm not mistaken, the 9800 runs the DirectX 9 codepath while the FX 5950 defaults to 8. Your version should look better, at least.
I believe you are mistaken. Valve has the 5900 Ultra & 5950 Ultra make full use of DX 9.0. I don't know about the other 5900 models (SE, XT, whatever), but I'm almost certain those two do.
 
Nvidia is the more stable, hassle-free, and glitch-free video card on the market.
 
RMachucaA said:
Nvidia is the more stable, hassle-free, and glitch-free video card on the market.

I'm not questioning that; I'm asking why Valve keeps saying that ATI cards will have an advantage when we now see midrange Nvidia cards with a 20 fps advantage in the stress test.
 
I thought Valve made NV3x cards use the DX8 path (5950, 5900, etc.) and the 9800 series use the DX9 path. Besides, I don't know why everyone is so mad about this; after the game is released there will be time for driver optimizations, patches, etc.

And why would you be mad at Valve? They were paid close to 7 million by ATI; of course they are going to say ATI runs better / is the card of choice for HL2... Please don't start any Nvidia vs. ATI flame wars. I think we've seen too many of these :cheers: It's not like they can wave a magic wand over the code, as I'm sure it's been optimized to death for both Nvidia and ATI. You should be mad at ATI (though I still don't see why you'd be mad; just accept the facts).

Both the 6800 and X800 series are very good cards, and as I said, I'm sure there will be room for optimization on both ends, especially for Nvidia.

What I'm hoping for is that Valve eventually adds SM 3.0 support to HL2 (or did they already?), since ATI cards get 3Dc compression support to help them along.

You also seem to forget that you will be able to adjust the graphics settings in the full version of Half-Life 2, so you can customize it for your specific rig.

Cheer up :E

(Sorry for the double spacing between sentences; I don't know why it's doing that.)
 
The human eye can only see 60 FPS! It doesn't even need to be any higher. Your ATI 9800 will run Half-Life 2 just fine. In fact, Half-Life 2 was run on an ATI 9800 Pro 128MB during E3 2003.
 
Meh, if Valve just said that ATI will run better because it was paid to, then I guess id at least has integrity, since D3 actually ran better on Nvidia hardware, as id said it would.
 
Kschreck said:
The human eye can only see 60 FPS! It doesn't even need to be any higher.

here we go again :p :LOL: :LOL:

and a kitty :cat:
 
Hehehehehehe

Maybe Valve meant that an ATI Radeon X800 would run it better than a GeForce 1.

Hehe
 
Valve said that the FX 5900 series cards use partial precision, which is half DX8, half DX9.
 
lazicsavo said:
I'm not questioning that; I'm asking why Valve keeps saying that ATI cards will have an advantage when we now see midrange Nvidia cards with a 20 fps advantage in the stress test.

One set of test by one person proves nothing.
 
jpr6130 said:
What graph are you reading? The ATI cards outperform the Nvidia cards except in two instances: one was a tie between the 5900 and the 9800 at 16x12, and the 6800 Ultra slightly outperformed the X800 at 1600x1200. The graph is misleading because the legend does not follow the same order.

OK, the NV 5950 gets ~60 and the 9800 series gets ~40 at 1024, 4xAA, 8xAF. That's what this benchmark is claiming. And if it's true in actual gameplay, I'll be ****ing pissed.

I double-checked the colours; this is right.
 
Kschreck said:
The human eye can only see 60 FPS!


This is the start and END of the human-eye FPS discussion.

It has been proven many times that this is not the case. Don't derail this thread with stupid bickering.
 
urseus said:
This is the start and END of the human-eye FPS discussion.

It has been proven many times that this is not the case. Don't derail this thread with stupid bickering.

Exactly, because we all know the human eye can only see 30 fps.
 
Kschreck said:
The human eye can only see 60 FPS! It doesn't even need to be any higher. Your ATI 9800 will run Half-Life 2 just fine. In fact, Half-Life 2 was run on an ATI 9800 Pro 128MB during E3 2003.

lol dude, the human eye can only see 20 fps
 
Actually, we can't see anything at all; it's all in our minds... if we even have minds. oO

Well, I have an FX 5900 128MB, and I score about 77 fps in the stress test at 1024.
 
OK, so basically the "ATI: preferred card for HL2" line was a complete load of shit to boost ATI card sales.

GG capitalist ****wits.
 
poseyjmac said:
Exactly, because we all know the human eye can only see 30 fps.

I can see the difference between 30 fps and 60 fps... I see a difference between 60 fps and 90 fps (although slight)... I guess I'm not human :borg:
 
thehunter1320 said:
I can see the difference between 30 fps and 60 fps... I see a difference between 60 fps and 90 fps (although slight)... I guess I'm not human :borg:

I can see the difference between 85 and 100 fps (barely).
 
OK, let me get this straight: the 5900 will run faster, but it will lack some of the effects of the 9800 series. Am I getting this right?

Just read the article posted by ASUS, and I'm a little relieved. The 5900 runs faster because it defaults to DirectX 8.1, whereas the 9800 does not (i.e., it runs the game in glorious DirectX 9).
 
lazicsavo said:
OK, let me get this straight: the 5900 will run faster, but it will lack some of the effects of the 9800 series. Am I getting this right?
And what effects would those be? Does the 5900 miss out on shiny pebbles or something?
 
Before we get into a big "I can see more frames than you" debate, I found this article which explains it all fairly well.

http://www.daniele.ch/school/30vs60/30vs60_3.html

BTW, anybody who claims they can distinguish anything above 85 Hz should have some real evidence to back it up. I've seen many cases where people think they can see the difference between 60 fps and 85 fps while their monitor's refresh rate is only 60 Hz, which makes it impossible for the monitor to draw more than 60 frames per second. In fact, I'm pretty sure you wouldn't want your monitor over 85 Hz, because monitors tend to get hotter at higher refresh rates and have a shorter lifespan as a result.
 
Arrrrrggggg!!!!

There is almost no set limit. Air force pilots are trained to identify enemy planes from images flashed for as little as 1/300th of a second.
 
Cyanide said:
Before we get into a big "I can see more frames than you" debate, I found this article which explains it all fairly well.

http://www.daniele.ch/school/30vs60/30vs60_3.html

BTW, anybody who claims they can distinguish anything above 85 Hz should have some real evidence to back it up. I've seen many cases where people think they can see the difference between 60 fps and 85 fps while their monitor's refresh rate is only 60 Hz, which makes it impossible for the monitor to draw more than 60 frames per second. In fact, I'm pretty sure you wouldn't want your monitor over 85 Hz, because monitors tend to get hotter at higher refresh rates and have a shorter lifespan as a result.

No, only with vsync on. With vsync off, even at 60 Hz you can achieve framerates well above 60 fps. And I suppose you've never used a quality monitor, because many good monitors recommend 100 Hz+ at certain resolutions.
 
No, only with vsync on. With vsync off, even at 60 Hz you can achieve framerates well above 60 fps.

No, that's not how vsync works. Vsync (vertical synchronization) forces the video card to wait until the monitor has finished drawing a complete frame before it begins the next one. This prevents screen "tearing": if you play a game with vsync off, you sometimes get a split on your screen where the top part of the image doesn't match up with the bottom part, because the video card fed the monitor new pixel data while it was in the middle of drawing a frame. Vsync prevents this.

The refresh rate you set your monitor to is the refresh rate it will run at. Your video card can render 1000 frames per second of Quake at 640x480, and an fps counter (like Fraps) will tell you that you're running at 1000 frames per second, but an 85 Hz monitor will only ever draw 85 of them, vsync or not.
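The point above can be put as a toy calculation (purely illustrative; the 1000 fps and 85 Hz numbers come from the example in the post):

```python
def displayed_fps(rendered_fps, refresh_hz):
    # A monitor refreshes refresh_hz times per second, and each refresh
    # shows at most one new frame, so it can never display more unique
    # frames per second than its refresh rate -- vsync or not.
    return min(rendered_fps, refresh_hz)

print(displayed_fps(1000, 85))  # card renders 1000 fps, monitor shows 85
print(displayed_fps(45, 60))    # here the card, not the monitor, is the limit
```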
 
lazicsavo said:
Ok, NV 5950 gets ~60 and the 9800 series gets ~40 at 1024, 4AA, 8AF. That's what's this benchmark is claiming. And if it's true in actual gameplay, I'll be ****ing pisses

That's not a proper benchmark.

The FX 5950 is running the DirectX 8 codepath, while the 9800 Pro/XT run DirectX 9.

Wait for the official benchmarks, which will show the performance of both cards in DirectX 8 and 9 separately.
 
Cyanide said:
No, that's not how vsync works. Vsync (vertical synchronization) forces the video card to wait until the monitor has finished drawing a complete frame before it begins the next one. This prevents screen "tearing": if you play a game with vsync off, you sometimes get a split on your screen where the top part of the image doesn't match up with the bottom part, because the video card fed the monitor new pixel data while it was in the middle of drawing a frame. Vsync prevents this.

The refresh rate you set your monitor to is the refresh rate it will run at. Your video card can render 1000 frames per second of Quake at 640x480, and an fps counter (like Fraps) will tell you that you're running at 1000 frames per second, but an 85 Hz monitor will only ever draw 85 of them, vsync or not.

Um, I didn't describe vsync, so how could I be wrong, genius? I know how vsync works. The point you missed was that with it off you can get as many frames as the renderer can draw. Where in the hell did you pull 85 fps from? Have any proof? Until you produce this unattainable proof, please stop spreading misinformation.
 
Um, I didn't describe vsync, so how could I be wrong, genius? I know how vsync works. The point you missed was that with it off you can get as many frames as the renderer can draw. Where in the hell did you pull 85 fps from? Have any proof? Until you produce this unattainable proof, please stop spreading misinformation.

You're wrong because I said this:
I've seen many cases where people think they can see the difference between 60 fps and 85 fps while their monitor's refresh rate is only 60 Hz.
And then you said this:
No, only with vsync on. With vsync off, even at 60 Hz you can achieve framerates well above 60 fps.

Implying that somehow having vsync off automagically removes the limit on how many frames per second your monitor can draw. By definition, a hertz (Hz) is one cycle per second; in the case of your monitor, that cycle is the drawing of a complete image on the screen. If your monitor is set to 60 Hz, it will draw a complete image 60 times every second, no matter what. And if your monitor can only draw 60 complete images every second, there is no way you can tell your video card is producing 85 frames per second, because you will never see more than 60 of them.

Yes, your card can render well over 60 fps, and depending on the graphical demands of the game, well over the maximum refresh rate of any conventional monitor. And yes, with vsync off you may get a few extra fps if your card isn't already rendering at or above your refresh rate. But it's unlikely your card will stay synchronized with your monitor, and you will get image tearing, which really hurts visual quality. If you think 45 frames of torn images beat 30 frames of solid images, disable vsync; if not, leave it on. Personally, I'd rather not have the tearing.
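The "45 torn frames vs. 30 solid frames" trade-off comes from how classic double-buffered vsync snaps the frame rate down to whole divisors of the refresh rate. A rough sketch of that snapping (a simplified model, assuming every frame takes equally long to render):

```python
import math

def vsync_fps(render_fps, refresh_hz):
    # With vsync on (classic double buffering), a finished frame is held
    # until the next refresh, so each frame occupies a whole number of
    # refresh intervals and the rate snaps down to refresh_hz / n.
    frame_time = 1.0 / render_fps
    refresh_interval = 1.0 / refresh_hz
    intervals = math.ceil(frame_time / refresh_interval)
    return refresh_hz / intervals

print(vsync_fps(45, 60))  # 30.0 -- a card managing 45 fps gets snapped to 30
print(vsync_fps(75, 60))  # 60.0 -- anything above the refresh rate caps at 60
```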

BTW, here's a nice article about vsync.
http://www.d-silence.com/feature.php?id=255

I'll not be responding again, because you obviously don't know how to handle a disagreement in a civil manner. All you had to do, if you wanted to discredit my claims, was find some proof of your own.
:cheers:
 
This has been debated a dozen bazillion times :|

And does it really matter if X card beats Y card by 5fps?
 
Kschreck said:
The human eye can only see 60 FPS! It doesn't even need to be any higher. Your ATI 9800 will run Half-Life 2 just fine. In fact, Half-Life 2 was run on an ATI 9800 Pro 128MB during E3 2003.

Yeah, that's true, but when the fps changes rapidly from 40 to 60 you notice it, hard. So fps stability is more important than a high fps.
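The stability point is easiest to see in frame *times* rather than frame rates (toy numbers, just to illustrate):

```python
def frame_time_ms(fps):
    # Time each frame stays on screen: 1000 ms divided by frames per second.
    return 1000.0 / fps

# Bouncing between 40 and 60 fps means frame times swing by ~8 ms,
# which is what reads as stutter even though both rates look "fine".
print(round(frame_time_ms(60), 1))  # 16.7 ms per frame at a steady 60 fps
print(round(frame_time_ms(40), 1))  # 25.0 ms per frame at 40 fps
print(round(frame_time_ms(40) - frame_time_ms(60), 1))  # 8.3 ms swing
```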
 
What does it matter if a 5950 is slightly faster than a 9800 Pro? They compete at different price levels; a 5950 should be compared with a 9800 XT.
 
Well, with the 5950 in DX8 mode you don't get the HDR effects, which are DX9-only. That's the high-dynamic-range lighting. It would be interesting to see how the 9800 performs in DX8; maybe useful for extra fps in CS: Source. But for HL2 I want the eye candy!!
 