More of the same: Nvidia and DX9

I'm really (x100) glad that I got an ATI Radeon 9800. It runs games so smoothly, even at max settings.
 
Too bad Nvidia has to use 'optimized' drivers to get a better score in 3DMark03. I'm buying ATI from now on.

Read this
 
Howdilee ho!

Just bought a Hercules 9800 Pro! And it feels like I OC'ed the CPU, it's just that good! Games run with 16x AF and 4x AA at max settings! FPS AHOY!!!
 
Hehe, that Halo benchmark IS really ironic... I mean, if one considers the development base for it, the ATI cards should have a snowball's chance in hell of being faster. It has to be so tuned for GeForce-type chips it's ridiculous at its core. And ATI is MUCH faster... It's not even comparable. It's like comparing 200 MHz to 2 GHz, two entirely different classes of power.
 
Originally posted by )[eVo]( Para
Howdilee ho!

Just bought a Hercules 9800 Pro! And it feels like I OC'ed the CPU, it's just that good! Games run with 16x AF and 4x AA at max settings! FPS AHOY!!!


only 4x? :(
 
TrueWeltall:
Read this

Also, without the mipmap bands/gamma changes, you cannot tell any difference in image quality.

Dawdler: The Halo engine used in the PC version is almost completely different from the one used in the Xbox version, seeing as it uses many DirectX 9 shaders (pixel shaders, vertex shaders, etc.), and since ATi does DX9 better, that's expected.
 
Originally posted by SidewinderX143
Also, without the mipmap bands/gamma changes, you cannot tell any difference in image quality.
That is very subjective, and highly dependent on the situation. That screenshot from UT2k3 with girders going across the screen (from an old HardOCP 9800-vs-5900 review, I don't know which) shows the VAAAAAAAAAAAAAAAAAAAST superiority of ATI FSAA over Nvidia FSAA, at twice the speed of course (at equal FSAA sampling, not counting the quality-for-speed tradeoff). And that is probably an understatement. However, some screens can show a clear advantage for Nvidia AF in games with lots of terrain (e.g. Morrowind and the like, though a game with a longer view range shows it better). It's not as clear-cut as the FSAA comparison though; maxing AF on ATI brings it to near-equal quality 90% of the time in 90% of all games.

And yeah, I noticed that about the Halo engine... just another pointer that 3DMark03 was right after all, and is representative of game speed :p
 
LOL... Yeah, the guy who said he can run things maxed out on his 5900... try HL2 and see who wins then!? lol. Also, the ironic thing about Halo is that it was "optimized" for Nvidia hardware, since the Xbox uses a GeForce 3 Ti! Even bigger LOL!
 
/me waits for price drops so he can buy a 9800 Pro.
 
I know... but still, it IS an Nvidia... no doubt about THAT, correct?
 
Originally posted by TrueWeltall
Xbox uses a variant of the GeForce 3, FYI

Yes, I know... but the engine was redone from the ground up to use DX9 shaders and tools, which ATi is better at. It no longer has any optimizations for nVidia, seeing as it uses as many DX9 shaders as it can.
 
The only problem there is that only Nvidia cards run the game at full quality... If you have the game and update it, the readme states:



Splinter Cell has 3 different rendering pipes:

Class 2 Graphic Adaptors:
NV2x/NV3x chips
Dynamic Lighting system = Shadow Buffer
Vertex position modifiers = Yes
Light beams stopped by depth texturing = Yes
Pixel Shader effects/filters/water = Yes
Reflection/Details texturing/Specular = Yes

Class 1 Graphic Adaptors:
R2xx/R3xx/Parhelia/Xabre 200/Xabre 400/Xabre 600 chips/Creative P9
Dynamic Lighting system = Shadow Projector
Vertex position modifiers = No
Light beams stopped by depth texturing = No
Pixel Shader effects/filters/water = Yes
Reflection/Details texturing/Specular = Yes

Class 0 Graphic Adaptors:
R1xx/NV1x chips
Dynamic Lighting system = Shadow Projector
Vertex position modifiers = No
Light beams stopped by depth texturing = No
Pixel Shader effects/filters/water = No
Reflection/Details texturing/Specular = No

Class 2 adaptors can run as Class 2, Class 1 or Class 0 adaptors while Class 1 adaptors can run as Class 1 or Class 0 adaptors. Class 0 adaptors are only able to run Splinter Cell as Class 0 adaptors.
You can force a class 1 or class 2 adaptor to run as a different class by editing the splintercell.ini file in the \system directory. Uncomment “ForceShadowMode = 0” to force the card to run as class 1 adaptor (if able to) or change “EmulateGF2Mode=0” to “EmulateGF2Mode=1” to run as a class 0 adaptor.

Why does Splinter Cell have a special mode for NV2x/NV3x graphic chips?

Splinter Cell was originally developed on Xbox™. Features only available on NV2x chips were used, and it was decided to port them to the PC version even if these chips would be the only ones able to support them. Considering the lighting system of Xbox™ was well validated, it was easy to keep that system intact.

So, whenever you hear about ATI outperforming here, it is either an uneducated or a biased statement. Surprisingly, not many have noticed this.
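
For anyone who wants to test this, the readme's tweak comes down to two settings in splintercell.ini (just a sketch of the edit it describes; the exact file layout may differ per install):

    ForceShadowMode = 0   ; uncomment to make a Class 2 card render as Class 1
    EmulateGF2Mode=1      ; change from 0 to 1 to force Class 0 on any card

Forcing a Class 2 card down to Class 1 is an easy way to compare the two shadow systems on the same Nvidia card.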
 
Originally posted by NYHoustonman
The only problem there is that only Nvidia cards run the game at full quality... If you have the game and update it, the readme states:



So, whenever you hear about ATI outperforming here, it is either an uneducated or a biased statement. Surprisingly, not many have noticed this.

That's one instance. It's a well-known major flaw in that game...

I wouldn't base performance on a single game or bench (except maybe HL2 :p).

There really isn't much doubt... Nvidia dropped the ball with the FX series. The 5900 is slightly faster in some DX8 games; I don't think even ATI fanboys dispute that.

Just about everything I've seen DX9-wise shows the 5900 more on par with a 9500/9600 Pro...

The only thing I've seen that's odd is the Doom 3 bench... which shows the 9800 Pro trailing...

Anyone care to shed some light on this for me? cough*dawdler*cough...
 
Originally posted by crabcakes66
The only thing I've seen that's odd is the Doom 3 bench... which shows the 9800 Pro trailing...

Anyone care to shed some light on this for me? cough*dawdler*cough...
Doom III is OpenGL, if you remember :)
And just like Splinter Cell, it uses a separate Nvidia path with special Nvidia extensions (which are obviously faster, but lacking in quality). For ATI it uses the standard OpenGL ARB path, which doesn't cut any corners as far as I know (actually ARB2 for the DX9 cards; I don't know the exact difference).
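
To illustrate (just a sketch in C, not id's actual code; the function name is made up), an OpenGL engine typically picks its back-end by probing the driver's extension string, which is how you end up on different paths per vendor:

    #include <string.h>
    #include <GL/gl.h>

    /* Hypothetical back-end selection along the lines described above:
       prefer the vendor-specific Nvidia fragment path when its extension
       is exposed, otherwise fall back to the portable ARB paths.
       Assumes a current OpenGL context. */
    static const char *pick_render_path(void)
    {
        const char *ext = (const char *)glGetString(GL_EXTENSIONS);
        if (ext && strstr(ext, "GL_NV_fragment_program"))
            return "NV30"; /* Nvidia extensions: faster, cuts precision corners */
        if (ext && strstr(ext, "GL_ARB_fragment_program"))
            return "ARB2"; /* standard DX9-class path, no corners cut */
        return "ARB";      /* basic fallback for older chips */
    }

So the same game can literally run different shader code depending on whose card is in the box, which is why the numbers aren't apples to apples.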

The thing with Splinter Cell is that it COULD have nice shadows on the ATI cards too. The shadowing technique (UltraShadow or whatever they call it) is just a name... ATI cards can do the same thing. Check out the TRAOD default settings again. The ATI cards do A LOT more shadows than the 5900 (12 vs 1 for the 9800), and that's using PS 2.0 when the 5900 doesn't use it. And it's still twice as fast.
 
Also, because Doom III is OpenGL, Carmack and the D3 team write separate optimizations per chipset, and (as far as I know) their nVidia code is better than their ATi code.
 
No, no, no, going from a Ti 4400 to the 9800 Pro, the shadows don't look nearly as good. So that benchmark should be thrown out, or at least have an asterisk next to it, IMO. Nvidia is still in OK shape overall, though (wait until the Cat 3.8s :)). The FX 5900 is turning into a good value.
 
All I have to say is that Nvidia ****ed up with the first batch of FXs, and ATI did the 9x00 right several months before the GeForce FX ever came out. ATI still has some driver issues, but it seems to be steadier performance-wise than Nvidia at this point. Honestly, I don't care. The next video card I'm getting is going to be PCI-X.
 
ATI is no doubt in the lead. Nvidia has a LONG way to go to catch up.
 
So many fangirls of ATI. But why?

BTW, I own both cards (Radeon 9800 'n' FX 5900 ^_^)
I bet you don't :O
 
Originally posted by KaoS87
So many fangirls of ATI. But why?

BTW, I own both cards (Radeon 9800 'n' FX 5900 ^_^)
I bet you don't :O
No, 'cause I would never be so stupid as to first get a 5900 and then a 9800. I would simply get a 9800 first and save myself $400.

Stupid posts get stupid replies :rolleyes:
 
Actually, dawdler... I got the Radeon for free. By the way, all you do in these forums is talk ****, hate on people, and flame the poor people who put up a thread that has already been discussed. Go jump off a bridge and do us ALL a favor. kthnx.


/me is sooo scared.
 
Sorry I had to unload on you, though I wouldn't have to if you would, like... stop being so gay. !_!
 
LOL :p
How many of my posts have you read? Let me guess... 1? 2? I've made somewhere around 1000.
And I won't even comment on the complete void of logic in your posts :dozey:
 
Originally posted by KaoS87
So many fangirls of ATI. But why?

How is everyone who likes the ATI cards better a "fangirl"? Just because they see that the ATI cards are priced lower and perform better doesn't mean they're fanboys/girls; it means they have common sense. Too many people on these forums confuse being a fanboy with having an opinion.
 
Yes, the 9800 might be the better buy, but I think my 5900U performs a bit (meaning just a bit) better. As for quality... ATI does have the upper hand.

My 2 systems are: P4 2 GHz, Radeon 9800, 512 MB SDRAM (sux).
P4 2 GHz, GFFX 5900U, 256 MB SDRAM (getting new mobo/RAM).

As for dawdler, I apologize, my almightyness. I will > u in HL2 :O
 
Originally posted by KaoS87
Yes, the 9800 might be the better buy, but I think my 5900U performs a bit (meaning just a bit) better. As for quality... ATI does have the upper hand.

My 2 systems are: P4 2 GHz, Radeon 9800, 512 MB SDRAM (sux).
P4 2 GHz, GFFX 5900U, 256 MB SDRAM (getting new mobo/RAM).

As for dawdler, I apologize, my almightyness. I will > u in HL2 :O


You're actually one of the few (only?) people I've seen dawdler insult...

You need to pull your head out of your ass or go to a different forum...

If you want to give reasons why the 5900 is better, do it.

Otherwise you're no better than the people who say "Nvidia sucks, go ATI".


In my educated opinion... you will see just how good your 5900 is when we get the HL2 benchmark.
 
ROFL, did you not read my apology? Before you go off telling people to get their "heads out of their ass," you might wanna do it yourself first. lawl
 
Originally posted by KaoS87
ROFL, did you not read my apology? Before you go off telling people to get their "heads out of their ass," you might wanna do it yourself first. lawl


Yes, I read it... and what my post says is still my opinion.
 