ATI DX9 card, worried about compatibility.

Ghostdog

In a recent thread it was stated that at least some GeForce FX cards are forced by HL2 to run in DirectX 8 mode. But what about older DirectX 9-compliant cards from ATI? I have a Radeon 9700 Pro, and so far I'm not experiencing any bad slowdowns in games. Doom 3 even ran a lot better than I had expected.

So, my question is: will Valve force me to run Half-Life 2 in DirectX 8 mode as well? Midrange GeForce FXs perform about the same as my card.
 
ATi cards are all gravy. I have a 9800 Pro :)
 
Your Radeon 9700 Pro will run HL2 in DirectX 9.

The Nvidia FX series performs about the same as a Radeon 9700 Pro in 3D games, except in DirectX 9. They simply suck in DirectX 9.
 
Well, my 6800 GT works hunky-dory in CS:S. I think it's just the "bodged" 5xxx series...
 
All ATi cards use the standard Microsoft spec version of DX9, as do Valve and most game makers when they write code for their games, so you won't have to worry about it if your card supports DX9.
 
If you have anything above a 5800, though, I think you should force dxlevel 9; it should run just as fast.
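
For anyone who hasn't done it before, this is roughly the drill (menus from memory, so double-check): right-click the game in Steam, open Properties > Launch Options, and add

-dxlevel 90

Then you can type mat_dxlevel in the in-game console to see which level actually took.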
 
WaryWolf said:
If you have anything above a 5800, though, I think you should force dxlevel 9; it should run just as fast.

Thing is, though, the 5xxx Nvidia series seems to have an affinity for reverting back to DX8.1. It's a bug... hope Valve fix it, for those peeps with the GFX!
 
Since the nVidia 5xxx series uses a non-standard (read: shitty, half-assed) version of DX9, it's not a bug. nVidia dropped the ball and now you're paying for it.
 
DavE0r said:
Thing is, though, the 5xxx Nvidia series seems to have an affinity for reverting back to DX8.1. It's a bug... hope Valve fix it, for those peeps with the GFX!

No, it's not a bug. Nvidia 5XXX FX cards suck at DirectX 9. You can force it, but don't whine about low FPS if you do.
 
I play CS:Source fine on my FX5600, but it doesn't look anywhere near as good as on my other PC, which has a 9800Pro.
 
I love how these threads weed out the idiots. People who understand computers can think at least at college level. Then there are the people who don't know anything but would like to post as if they do.

It's fun to read. Personally, though, how do you guys spend so much time learning about all the latest features and specs? Do you actually read everything the ATi and Nvidia sites publish about their latest and greatest?

I usually don't do that unless I am looking for something specific.
 
Wait, so my new Leadtek 6800 that I just installed won't run in DX9??

I noticed that it was running in DX8 last night when I installed it and ran the stress test, but that's because I'm forcing it to do so and haven't yet removed the option from the launch parameters.
 
I'm hoping to get a 9600XT to replace my crappy MX 440. HL2 should run in DX9, right? (With the 9600XT)
 
A 6800 should run in DX9 mode... what was your previous g-card? Or is this a clean install of Steam?
 
Megalomaniac said:
I'm hoping to get a 9600XT to replace my crappy MX 440. HL2 should run in DX9, right? (With the 9600XT)

Yes, it will.
 
Previous card was a Ti4200 128MB.

I was running the stress test with different DX versions forced to see what the differences were.
 
It probably still has the settings from your old card (which would have chosen DX 8.1). Back up first, then delete the config file here:

C:\Program Files\Valve\Steam\SteamApps\marcspillman\counter-strike source\cstrike\cfg\config.cfg

Then relaunch. Someone please confirm this is safe?
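
Alternatively, if I remember right (verify before trusting me on this), there's a launch option that rebuilds default video settings for your detected hardware:

-autoconfig

Remove it again after one launch, though, or it will keep overriding your saved settings.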
 
OK. I'm still holding off on my purchase. I want to be surprised by the super graphics.
 
Well, for some anecdotal evidence: going from a GeForce4 Ti4200 128MB to a GeForce 6800 128MB, I saw a huge increase in FPS while being able to max out all the detail options.

Under the Ti4200, at 1280x1024, I had no AA options, simple reflective water detail, trilinear filtering, and everything else on high, and was getting ~30 FPS.

On the 6800 with everything maxed out, I'm getting ~78 FPS, and my, doesn't it look smooth.

Keep in mind this is a DX8.1-to-DX8.1 comparison. I've yet to run the stress test on the 6800 in DX9 mode.
 
A GeForce *6*800 doesn't fit the "5XXX FX" pattern I listed in my post. (They default to DX9.)
 
Gotter said:
Since the nVidia 5xxx series uses a non-standard (read: shitty, half-assed) version of DX9, it's not a bug. nVidia dropped the ball and now you're paying for it.

Obviously you haven't done your research.
Nvidia was not present for the decisions on what the DX9 standard would include, and ATI was, so Nvidia didn't know exactly what features to build into their cards.
As a result, Nvidia cards actually support more DX9 features than ATI does (SM3.0, anyone?), since ATI only implements the core features of the DX9 standard.
Hence Nvidia actually has bulkier, overdone DX9 compatibility, not a half-assed one.
This means that when DX10 comes out, some of its features will already be available on the early Nvidia DX9 cards (although by that time those cards will be slow-ass anyway). Because Nvidia overdid it, they actually suffer a small performance hit, with information having to pass through more channels.

I have an ATI and I prefer them, but that's no reason to talk crap about Nvidia.
 
I don't know. I have a 9700 Pro as well, and it used to run fine, but after I had to reinstall Windows when my HD failed, it now runs all Steam games (I don't have CS:S, just CS, HL, Sven, etc.) horridly slowly: about 30 FPS max, and 10 in firefights, no matter what resolution. This is in D3D. In OpenGL (which HL supposedly runs better on) it's a constant 5 FPS or so. I don't know what happened. I used to get about 120 FPS. For comparison, I get 45 FPS in Doom 3 on medium. My comp also restarts randomly a lot now. I don't know if my card is fried, or if it's the Catalyst 4.10, or what. Help if you can, please.

ASUS A7N266-E (nForce), BIOS v1.004
512MB RAM
DX 9.0c
Catalyst 4.10 with CCC and AI
 
Nvidia was not present for the decisions on what the DX9 standard would include, and ATI was, so Nvidia didn't know exactly what features to build into their cards.
You could be right, but I remember hearing the rumours a bit differently. Nvidia tried to cram more functionality into the first DX9 shader model, and the other parties in the DX workgroup (not just Microsoft and ATI) didn't all support the notion. I doubt anyone here knows more than what they've been reading on rumour and news sites, so we can't be completely sure how the work on DX9 progressed.

Nvidia then built "over-functionality" into their first DX9 cards but couldn't provide enough performance for standard DX9 shader code, so they instead tried to get developers to adopt Nvidia's own shading language, Cg, at an early stage. Bottom line: Nvidia's first DX9 cards didn't run DX9 code as well as ATI cards did. This has of course been fixed with the new architecture (NV40), which in my opinion kicks ass in terms of features and performance.

Nvidia cards actually support more DX9 features than ATI does (SM3.0, anyone?)
Shader Model 3.0 is part of the DX9 specification, and the possibilities and limitations of SM3.0 (code-wise) have been known to the public ever since the DX9 SDK was released publicly, a long time ago. SM3.0 isn't over-functionality; it's standard DX9.
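
You can see that in the SDK itself: its fxc shader compiler takes the shader model as an ordinary target profile (commands from memory, and shader.hlsl/main are just placeholder names; check fxc's help output):

fxc /T ps_2_0 /E main shader.hlsl
fxc /T ps_3_0 /E main shader.hlsl

Same standard toolchain, just a different DX9 profile.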
 
RoyaleWithCheese said:
Geforce FX cards :)

Another reason I threw my FX card in the ****ing trash and bought a new comp with a nice Sapphire ATi Radeon ULTIMATE 9600XT card, 512MB RAM, 120GB of hard disk space, and a 2.8GHz CPU.
 
I heard that DX9 was supposed to use 24-bit internal precision, but the FX cards used 32-bit precision (which was slow) and had an alternate 16-bit precision (which is twice as fast but looks crappier).
 
I heard that DX9 was supposed to use 24-bit internal precision, but the FX cards used 32-bit precision (which was slow) and had an alternate 16-bit precision (which is twice as fast but looks crappier).
Yep, ATI and Microsoft were apparently happy with 24-bit being the lowest official full precision. Nvidia aimed higher (32-bit) but were unable to get their hardware to perform well enough at it. The fallback 16-bit precision was fast enough, however.
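
For the curious, this is roughly how it shows up at the shader level; a minimal DX9 HLSL sketch written from memory, not anything taken from Valve's actual shaders:

// ps_2_0-style pixel shader: 'float' requests full precision
// (FP24 on Radeons, the slow FP32 path on GeForce FX), while 'half'
// is the partial-precision hint (FP16 on the FX; ATI ignores the hint
// and runs FP24 regardless).
sampler2D baseTex;

float4 main(float2 uv : TEXCOORD0) : COLOR
{
    float4 full = tex2D(baseTex, uv); // full-precision texture read
    half4 fast = (half4)full;         // drops to FP16 on NV3x hardware
    return fast;                      // implicitly widened back to float4
}

That half hint is exactly the knob developers leaned on to keep the FX series competitive.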
 