which vid card is better. see inside.

gegam
Newbie · Joined Jul 21, 2004 · Messages: 274 · Reaction score: 0
i bought a x800xtpe pcix, and i'm not planning on upgrading soon, but.

at the time, ATI was kicking ass with the x800xtpe.

so now with the 7800GTX out, topping 8k in 3DMark05, i bet it beats my card. i'm not complaining, though.

so right now, which card developer is ahead in the race? which provides better eye candy as far as Half-Life 2 HDR is concerned? those demos of the mermaids and shit on the nvidia site are looking pretty nice, but they could be fooling, i don't know. you tell me.
 
nVidia seems to be in the lead...they really could have done without the 'mermaids and shit' (lol) though...it's silly.
 
gegam said:
so right now, which card developer is ahead in the race? which provides better eye candy as far as halflife2 HDR is concered.
ATI was never really in the lead; the 6800 Ultra beats it.
 
That One Guy said:
ATI was really never in the lead. the 6800 Ultra beats it.

No, the X850 was ahead in nearly everything except Doom 3.

And I've never heard of any mainstream ATI or NVIDIA GPU on PCI-X.
 
holydeadpenguins said:
No, the x850 was ahead in nearly everything, except doom 3.

The ATI X850 XT got a few more frames in some games at very high resolutions with a lot of AA (I am talking about maybe 5 frames more), and you could run an NVIDIA 6800 Ultra or GT at the same settings you would play at with that ATI card and not notice the difference in frame rate. So ATI was not really ahead; in the previous generation they were pretty much equal. ATI still lost the race in that generation of cards, mainly because NVIDIA had much better availability of their top cards, and their cards were more future-proof with options like Shader Model 3.0. (You can argue that games are only now really starting to utilize that feature, but that is not the point: they were the first to introduce those cards, and it paid off.) Plus, NVIDIA introduced SLI configurations for the high-end enthusiast market willing to invest a lot of money in their rigs.

So like I said before, you can't say "ahead," because 5 more frames in some situations doesn't mean it's the better GPU.
 
gegam said:
i bought a x800xtpe pcix, im not planning on updating soon

which provides better eye candy as far as halflife2 HDR is concered.

sorry to break this to you, man, but you won't be able to run Half-Life 2 in HDR with that card. ATI doesn't currently have a card that supports HDR.
 
jellydoughnut217 said:
sorry to break this to you man but you wont be able to run Halflife 2 in HDR with that card. ATI doesnt currently have a card that supports HDR.

WTFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFF

who are you!!!!!!!! WHY ARE YOU LYING TO ME!!!!!! LIAR!!!!!!! tell me the truth now!!!!!!!!!! DIE!!!!! ALL OF YOU!!!!!!! i didn't spend a shitload for it not to run!
 
gegam said:
WHY ARE YOU LIEING TO ME!!!!!! LIER!!!!!!!
it will work, don't worry.
 
gegam said:
holy crap, you gave me a scare. my eyes are teary.

you'll run it at SM 2.0, not 3.0. sure, you won't get 100% of all the jazzy stuff, but it will look great.
 
bryanf445 said:
youll run it at sm 2.0 not 3.0, sure you wont get %100 of all the jazzy stuff but it will look great.
they're doing an SM3 version?
 
X800s run with SM2.0b, which is very, very close to SM3.0 except for OpenEXR rendering. and umm, wasn't ATI in the lead when they were pwning the FX series with their 9800?
 
HDR & You...

To be able to experience the Best Quality setting, you will need a graphics processor capable of 16-bit floating-point filtered textures, 16-bit floating-point render targets with alpha blend support, and Shader Model 3.0. The lower quality mode will require 16-bit fixed-point filtered textures and Shader Model 2.0 support.

This means that the preferred cards to run HDR are as follows:

Low Quality HDR:

ATi 9600 Series cards
ATi 9700/9800 Series cards
ATi X600 Series cards
ATi X700/X800 Series cards
ATi X850 Series cards
nVidia 6600 Series Cards
nVidia 6800 Series Cards
nVidia 7800 Series Cards

High Quality HDR:

nVidia 6600 Series Cards
nVidia 6800 Series Cards
nVidia 7800 Series Cards
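For anyone wondering why the "Best Quality" mode needs 16-bit floating-point render targets: a traditional 8-bit fixed-point target clamps everything to the 0–1 range, so "brighter than white" light (sunlight, specular glints) is lost before the tone-mapping pass ever sees it. Here's a toy sketch in Python (NumPy standing in for GPU storage formats; the luminance values are made up):

```python
import numpy as np

# Hypothetical linear scene luminance; values above 1.0 are over-bright light.
luminance = np.array([0.05, 0.5, 1.0, 4.0, 60.0], dtype=np.float32)

# 8-bit fixed-point render target: clamp to [0, 1], quantize to 256 levels.
fixed8 = np.round(np.clip(luminance, 0.0, 1.0) * 255) / 255

# 16-bit floating-point (half) render target: over-bright values survive.
fp16 = luminance.astype(np.float16)

print(fixed8)  # the bright values all collapse to 1.0 -- overexposure info is gone
print(fp16)    # 4.0 and 60.0 are preserved for a later tone-mapping/bloom pass
```

That preserved headroom is what the HDR effects (bloom, blinding-light adaptation) are computed from, which is why the card's render-target format matters, not just its shader model.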
 
... and you know this how? It will probably look almost the same.
 
i know this through a game known as Far Cry, which utilized HDR way before Valve even considered HDR for Half-Life 2.
 
jellydoughnut217 said:
i know this through a game known as far cry which utilized HDR way before valve even considered HDR for half-life 2

there is no low-quality HDR in Far Cry. maybe you don't have the hardware to support it (6-series video card).
 
Yo jellydoughnut...
I think you should stop posting things that you don't really know are true...
Ex.
you wont be able to run Halflife 2 in HDR with that card. ATI doesnt currently have a card that supports HDR.
...that's forgivable; you probably saw something about ATi not having support for 'high quality' HDR and just read it wrong...

low quality HDR blows ass, its barely noticible
...and that's just talking out your ass...

i know this through a game known as far cry which utilized HDR way before valve even considered HDR for half-life 2
Um... how do you know when Valve planned HDR for Half-Life 2? In fact, I'd bet they were planning it ever since they started working on Source. They just hadn't finalized it until now...

Ahem. What I'm trying to say is you sounded like an ass, but I'm not saying you are, and I just thought you should know. You don't want to sound like an ass, do you?!? :O
mmkay. :)
 
Valve has been playing with HDR since before HL2 came out. ATI was showing Valve how to do HDR with SM2.0 over that summer before 'Sept 30th'. Far Cry uses OpenEXR HDR, which IMO doesn't look anywhere near as good as Valve's results. Maybe it's just how they tweaked it in Far Cry that made it look like that. It looks way overdone.
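Whatever the shader model, the last step in any of these HDR renderers is a tone-mapping pass: the scene's unbounded luminance has to be squashed into the display's 0–1 range. A minimal sketch using the classic Reinhard operator, L / (1 + L) — just an illustration of the general technique, not Valve's or Crytek's actual shader:

```python
import numpy as np

def reinhard(luminance, exposure=1.0):
    """Map unbounded linear luminance into the displayable [0, 1) range."""
    scaled = luminance * exposure
    return scaled / (1.0 + scaled)

# Made-up HDR luminance samples spanning several orders of magnitude.
hdr = np.array([0.1, 1.0, 10.0, 100.0])
ldr = reinhard(hdr)
print(ldr)
```

Note the very bright values compress smoothly toward 1.0 instead of clipping hard, so a blinding light still reads as brighter than plain white; the `exposure` knob is what eye-adaptation effects animate over time.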
 
yo asus, so what do you think? you think my x800xtpe will give me the best of HDR? i mean blinding light and shit like that? or is it going to be toned down because it does not have the HDR stuff that NVIDIA has?
 
the HDR used in Far Cry isn't as high-quality as the HDR in Half-Life 2, making low-quality HDR in Half-Life 2 quite comparable to the HDR in Far Cry.
 
the video of HDR was run on 6800s in SLI mode.

Valve is a bitch for this; it says a lot.

they stress HL2 is for ATI cards and blah blah blah, BUTTTTT

will HDR run better on SM3.0 or 2.0?
 
obviously 3.0.

The fact is you need Shader Model 3.0 to do some of the fancy lighting effects and shit, so ATI cards will not have this, only NVIDIA 6600 and up. Until the R520 is released, anyway.

So no, you will not have the best quality HDR.
 
The ATI cards can't do 16-bit alpha blending, MUAHAHAHA. Too bad NVIDIA doesn't really support 24-bit color. :-(
 