DirectX 8 vs. DirectX 9 shader images

You've never seen reflective surfaces? Haven't you ever played Splinter Cell, or Earth & Beyond? I'm sad to say this, but if my GeForce FX 5600 Ultra can't even run HL2 with decent fps, then I'm going to go buy an ATI 9600 Pro, because I saw the benchmark scores and I don't like what I see. This isn't like NVIDIA; they have always put out good video cards, while ATI has only just started putting out video cards I would even think about buying. NVIDIA has just lost their way. Don't worry, they will pick back up.
And about this whole thing with ATI owners and NVIDIA owners:
Who the hell cares? ATI cards will do better for now; NVIDIA will catch back up, no problem. They just encountered a bump in the road. Hopefully these new 51.75 Detonator drivers will pick up more fps. And as for that whole image quality stuff, you'd have to have two computers, one with a GeForce FX 5900 Ultra and the other with an ATI 9800 Pro, to tell the difference in image quality. So I'm going to build another computer to see the true difference: I'm going to leave this computer alone and build an AMD Athlon 2200, 1 GB of DDR RAM, and an ATI 9600 Pro,
and I'll give you the results when the HL2 benchmark comes out. There will be stock core and freq, and overclocked. =P
 
It's called a period. It ends sentences and greatly improves readability.

Who the hell cares? ATI cards will do better for now
Because the point of buying a graphics card upgrade is to buy the fastest one available relative to price. Who would buy an NVIDIA card, apparently greatly inferior to even the 9600 Pro in DX9 (and with a much higher price tag), just because "oh, NVIDIA's not doing too well, but they'll come back strong!!!"? That's called brand loyalty. It's a bad thing. And about the image quality difference... have you even seen the screenshots? The water shots aren't that great to compare, since the difference is hard to appreciate in a static shot, but every other comparison has blatant quality differences. Or do they not matter because NVIDIA's doing poorly, so we should therefore blame ATI for making a better graphics card? o_o
 
Yeah, that's compared with a screenshot. Would you even care if you saw the comparison? The reason I say NVIDIA will come back strong is that they have to, or they will go out of business, because people will stop buying their products and switch over to ATI. I don't want that, because I like a variety of video card makers to choose from, and there are a lot of people with NVIDIA cards in their machines powering their games.
 
Originally posted by Direwolf
To my knowledge any DX8 card is DX8.1 too... can anyone confirm?
Didn't see anyone note it, but I don't think so. DX8.1 calls for PS1.4 to exist, I believe. DX8 is only PS1.3. The GeForce4 Ti only supports up to PS1.3.
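For context on what separates those versions at the API level: Direct3D encodes shader versions as packed integers that a game compares against the device caps. Here's a minimal sketch of the idea in Python; the packing mirrors my understanding of the `D3DPS_VERSION` macro from the Direct3D headers, and the GeForce4 Ti caps value is an illustrative assumption, not real driver output.

```python
# Packs a pixel shader version the way Direct3D's D3DPS_VERSION
# macro does (an assumption based on the d3d9caps.h definition).
def ps_version(major, minor):
    return 0xFFFF0000 | (major << 8) | minor

def supports(caps_value, major, minor):
    """True if a device reporting caps_value can run ps_{major}.{minor} shaders."""
    return caps_value >= ps_version(major, minor)

# Hypothetical caps value for a GeForce4 Ti, which tops out at PS1.3:
gf4ti_caps = ps_version(1, 3)

print(supports(gf4ti_caps, 1, 3))  # True  -> a DX8-level part
print(supports(gf4ti_caps, 1, 4))  # False -> not a DX8.1 (PS1.4) part
```

So a card can "be" DX8 without being DX8.1: the 8.1 label effectively just means the reported caps clear the PS1.4 bar.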
 
So... wtf are 8.1 cards then?
Do any even exist?

I thought it went straight from DX8 technology, PERIOD, to DX9.
 
Lol, I just hope this is all a big lie to make ATI look better or something... but if it ain't, I gotta get a 9600 Pro then. =P
 
Originally posted by Stiler
So... wtf are 8.1 cards then?
Do any even exist?

I thought it went straight from DX8 technology, PERIOD, to DX9.
The FX is DX8.1 (very much so, since NVIDIA advises using PS1.4 before PS2.0 :)).
Some of the Radeons are too: the 9000 or the 9200, or both (I'm not exactly sure).
And that's about it, I believe.

Edit: btw, read this: http://www.beyond3d.com/forum/viewtopic.php?t=7957

The FX couldn't even do HDR even if it could do DX9.
 
Originally posted by dawdler
The FX couldn't even do HDR even if it could do DX9.
Not true. I grabbed this little HDR test program from another thread: http://www.daionet.gr.jp/~masa/rthdribl/ . It runs on my FX card with Det45 drivers and clearly shows the glare effects in action. It gets low fps, but it definitely works.
 
Originally posted by Arno
Not true. I grabbed this little HDR test program from another thread: http://www.daionet.gr.jp/~masa/rthdribl/ . It runs on my FX card with Det45 drivers and clearly shows the glare effects in action. It gets low fps, but it definitely works.
Yes, I know that's kind of odd... But somehow I trust those saying the drivers don't work more than those saying they do. At least they got technical :)
So the question is what exactly it's doing... Still, it's a crawl on any FX.
 
I ran that same HDR program and got 20 fps with my GeForce FX 5200 Ultra; when I ran it with my GeForce FX 5600 Ultra I got 36 fps. My cards are overclocked.
 
Originally posted by BoRn[nBk]
I ran that same HDR program and got 20 fps with my GeForce FX 5200 Ultra; when I ran it with my GeForce FX 5600 Ultra I got 36 fps. My cards are overclocked.
That sounds very hard to believe... I know that the first time we tried it here (months ago), a 5900 Ultra scored 15 fps against my 32 fps on a 9700 Pro.
As far as I know, no drivers have upped performance that much. Either that, or you didn't use standard settings.
 
Lol, are you sure you ran those tests correctly? Because that's what I got as average fps.
 
Originally posted by dawdler
Yes, I know that's kind of odd... But somehow I trust those saying the drivers don't work more than those saying they do. At least they got technical :)
The people who posted on that forum are probably all ATI card owners making wild guesses.
Look at the bottom of this page for a benchmark of the FX5600 running that very same HDR program. It performs poorly, but at least that DX9 feature is available.
 
Originally posted by BoRn[nBk]
Lol, are you sure you ran those tests correctly? Because that's what I got as average fps.
We must not be talking about the same program, since the 5600 Ultra above got 5.4 fps at the standard resolution. Getting 36 fps is obviously impossible at standard settings :)

Arno, the reason for the extremely poor performance (it's not realtime anymore) might actually be that it doesn't fully support the features used. We are looking at EXTREME differences here, not just the "NVIDIA is half as fast in PS2.0/DX9" thing. But that's another ATI fan guessing, hehe :)
 
Originally posted by dawdler
Arno, the reason for the extremely poor performance (it's not realtime anymore) might actually be that it doesn't fully support the features used. We are looking at EXTREME differences here, not just the "NVIDIA is half as fast in PS2.0/DX9" thing. But that's another ATI fan guessing, hehe :)
My guess is that the FX chip wasn't really designed with DX9 features in mind. I do believe that the Detonator drivers are capable of rendering all the DX9 effects, but since the hardware is not really intended for DX9 use, the performance is poor.

There's a difference between supporting a feature with poor performance and not supporting it at all, which is what I was trying to point out.
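For what it's worth, the visible part of what that demo does, the glare and the way very bright areas don't just clip to white, comes down to tone mapping: compressing HDR luminance into the displayable range. Here's a minimal sketch of the idea using the well-known Reinhard operator L/(1+L); this is illustrative only, not taken from the rthdribl source.

```python
def tone_map(luminance):
    """Compress an HDR luminance value into the displayable [0, 1) range
    using the Reinhard operator L / (1 + L)."""
    return luminance / (1.0 + luminance)

# Bright HDR values (e.g. a light source at 8x white) get squeezed
# toward 1.0 instead of clipping:
for L in (0.5, 1.0, 8.0):
    print(L, "->", round(tone_map(L), 3))
```

Doing this per pixel (plus the bloom/glare blur on the bright parts) is cheap on hardware with fast float shader throughput, which is exactly where the FX series struggles.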
 
With my GeForce FX 5200 Ultra I was averaging 20 fps. Keep in mind that it's overclocked as fast as it can go; I have two fans to keep it cooled down.
 
A few things....

1) Since NVIDIA has slipped up and sort of fallen behind ATI, we have seen probably the slowest incremental releases by ATI in a fair few years. Over a year ago I could buy the 9700 Pro, the best money could get, and right now that is the 9800 Pro: same chip, slightly faster clock, very little difference. What happened to Moore's Law? NVIDIA has been busier with their FX line, but they haven't really improved (and in some cases de-proved?!) over the Ti4600.

2) I read a while back that ATI and NVIDIA have come to an understanding and signed a contract to slow down the release of their new technology: the R&D costs are so damn high that they want to milk their cards longer. This is bad news :-/

3) NVIDIA, what did they do wrong? Hire too many 3dfx engineers? Too much marketing, not enough engineering? Spend too much time developing their new NVIDIA-only graphics API? (What a waste of time; Glide did nothing more than piss me off as a non-3dfx owner back in the day.) They should wake up and realize that, for now, plain old DX and OGL performance is the way to go.

It's amazing how quickly product loyalty slips away; I think the majority of PC users are quite clued in. The fanboy king will ALWAYS be the best-performing card that still retains image and driver quality. In the couple of weeks since the HL2 performance numbers were released, I think NVIDIA has slipped from number 1 (albeit a slowly-falling-from-grace number 1) to almost 3dfx-like obscurity. I doubt anyone is planning to go NVIDIA for their next card.
 
Originally posted by mikesux
2) I read a while back that ATI and NVIDIA have come to an understanding and signed a contract to slow down the release of their new technology: the R&D costs are so damn high that they want to milk their cards longer. This is bad news :-/
Isn't that kind of deal illegal? Doesn't matter though; it's still bad.
 
ATI is slowing down their production cycle by up to 6 months. It should go into effect after their next line of video cards is released.

Last I heard, nVidia wasn't making any changes to their schedule.
 