GF FX vs ATI

CSDEKS

Guest
Hello guys!
Is it true that I must get an ATI graphics card because it has D3D, while the GeForce FX doesn't have D3D but does have OpenGL 2.0 (I mean the GF FX 5900 Ultra)?
What type of graphics card should I buy, and why? Thank you!
 
LOL omg.. that post cracked me up, I'll take a wild guess that this guy's primary language isn't English

Anyhow, to answer your question: the ATi cards are still the leaders for current-generation graphics. I'd go with an ATi card
 
And don't laugh at me, English is not my first language! Thank you!
 
I'm not entirely sure what you're saying, but both cards support OpenGL and D3D. However, in currently released benchmarks ATI seems the clear winner, and if you buy a Radeon 9600/9800 XT you get a copy of Half-Life 2 for free when it's released.
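
For what it's worth, here is a minimal sketch of how a program can confirm this at runtime; it assumes the DirectX 9 SDK (d3d9.h, linked against d3d9.lib) and is only an illustration, not anything from a real game. Both vendors' drivers expose Direct3D; the real DX9-era distinction is the pixel shader version reported in the device caps.

#include <d3d9.h>
#include <cstdio>

int main()
{
    // Any DX9-capable driver, ATI or nVidia, provides the IDirect3D9 interface.
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d)
        return 1; // DirectX 9 runtime not installed

    // Query the capability bits of the default adapter's hardware (HAL) device.
    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    // DX9-class hardware (Radeon 9500 and up, or GeForce FX) reports pixel shader
    // 2.0 or higher; older parts (GeForce 4 Ti, Radeon 9200) top out at 1.x.
    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        std::printf("Pixel Shader 2.0 class card\n");
    else
        std::printf("Pre-DX9 class card\n");

    d3d->Release();
    return 0;
}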
 
In my country I don't get a copy of Half-Life 2 for free!
 
I kinda have this Yatta/supertrooper feeling about this guy..
 
Or do you mean that ATI cards will be sold with a copy of HL2?
 
CSDEKS said:
Or do you mean that ATI cards will be sold with a copy of HL2?
A voucher to get the game once it's out, yes
 
CSDEKS said:
How much does the ATI 9800 cost?
$499 USD :p

Edit: That's for the 9800XT

The 9600XT, which includes an HL2 voucher as well, is $200 USD
 
nVidia produces technically superior cards, but ATi had a rather large hand in the DirectX 9 specs. ATi cards perform much better in DX9 games than the equivalent nVidia cards, but nVidia cards have superior performance in OpenGL games (few though they may be).
 
Also, many of the high-end FX cards take up two slots (the AGP slot, plus a fan that blocks the top PCI slot).
 
I can add a bunch of features to something and "call" it technically superior... but if those features don't work properly and aren't even used in whatever job that "something" is doing...


well... I guess it's not so superior.
 
If you want a superior DX9 gaming card, go with an ATI card.
The DX9 graphics cards range from the 9500 up to the 9800XT.
128MB or 256MB does not really matter much at all.

From my experience, you want the standard model (e.g. 9600) or a Pro (e.g. 9600 Pro). There are also XT models which are even better, but do not buy an SE model (e.g. 9200 SE), as you might as well buy the Pro from the lower model line (e.g. 9000 Pro).

If you are looking at the price range of the ATI Radeon 9200 or the Nvidia FX 5200, then I would look into the GeForce 4 Ti cards (not the MX). They do not have DX9, but they have great performance.
The Nvidia FX 5200 is not worth your time.

If you are looking into OpenGL applications (non-gaming applications) or 2D performance then I might look to Nvidia's FX for that.

Or you could wait til the new cards come out this spring. ;)
 
The definition stands; whether or not the features are used is beside the point, which is why I said technically superior. Most would agree that in real life the nVidia cards aren't superior (as far as DX is concerned), but if you pit a top-spec nVidia card against a similar ATi one in an OpenGL app, the nVidia card will shine.
 
Errr... I beg your pardon?

I thought that the general consensus was that ATI cards were better at DX9?

And everyone I know who's getting HL2 is going to use AA/AF if it is at all possible...
 
Dave said:
http://forums.overclockers.co.uk/showthread.php?s=&threadid=150919

As you can see.. Nvidia are the clear winners unless you're using AA/AF.

And you probably won't be using AA/AF for HL2.

Also worth noting that Nvidia cards have the best DX9 performance, therefore it makes sense they would provide the best performance with HL2.

You shouldn't rely on just one site.

Everyone knows what happened with Tom's Hardware...
 
Shuzer said:
And yet you link to a picture of a Radeon 9800XT.

He's talking about the high-end nVidia FX cards.
I think he saw that picture and linked to it before actually reading the address.
This is what he meant to link to. link

You cannot go by one review. You must look at a few good articles that might have different viewpoints.
From what I have seen, depending on the benchmark either Nvidia or ATI may have the highest frame rate or the highest average, but ATI stays fairly steady throughout the benchmark, while Nvidia's frame rate on a graph looks like an earthquake, jumping up and down very quickly. I shouldn't even need to talk about quality or AA/AF.
 
Dave said:
http://forums.overclockers.co.uk/showthread.php?s=&threadid=150919

As you can see.. Nvidia are the clear winners unless you're using AA/AF.

And you probably won't be using AA/AF for HL2.

Also worth noting that Nvidia cards have the best DX9 performance, therefore it makes sense they would provide the best performance with HL2.
You only have one problem in that statement: NONE of the benchmarks there show that Nvidia has the best DX9 performance, as NONE of them are anywhere close to DX9 graphically. Calling Gunmetal DX9 is just a big lie. Aquamark 3 has what, 4-5 DX9 shaders out of 100+?

There still isn't a game that can truly test DX9 features... The few that come close have rather poor engines, designed from the start for the Xbox and GeForce, and are STILL faster on ATI. (fun note: and often rendered wrong by the FX :))

HA!!! I still got it :p
 
Also, in that test the only high-end card was a 9800 soft-modded to a 9800 Pro... there wasn't a 9800 Pro or a 9800 XT, yet strangely there was a 5950?
 
MaxiKana said:
there wasn't a 9800 Pro or a 9800 XT, yet strangely there was a 5950?
No, that was an FX5900 LX, which is basically a stripped-down version of the FX5900 non-Ultra.

That review is pretty dumb anyway. Only a handful of people would care whether Quake3 runs at 200 or 250 fps. What a good hardware review needs is popular, modern games that really get the most out of a graphics card.
 
Well, most people will be running games with AA enabled, and that's when ATI pwns every nvidia card.
 
Arno said:
No, that was an FX5900 LX, which is basically a stripped-down version of the FX5900 non-Ultra.

That review is pretty dumb anyway. Only a handful of people would care whether Quake3 runs at 200 or 250 fps. What a good hardware review needs is popular, modern games that really get the most out of a graphics card.
No, it's actually the opposite. Modern games will still rely on brute force, since the DX9 era is very young. The next generation of games will take advantage of the card, and possibly the second generation will use it to the fullest.
The best example of this progression is Max Payne and Max Payne 2. Then again, many games don't follow this. KOTOR hasn't learned crap from NWN's mistakes.
 
dawdler said:
(fun note: and often rendered wrong by the FX :))

HA!!! I still got it :p


Use the flashlight in Halo on a 9800XT, then compare it with an nVidia card. ATi is not completely innocent of driver cheats - they just haven't been caught yet.
 
Dave said:
http://forums.overclockers.co.uk/showthread.php?s=&threadid=150919

As you can see.. Nvidia are the clear winners unless you're using AA/AF.

And you probably won't be using AA/AF for HL2.

Also worth noting that Nvidia cards have the best DX9 performance, therefore it makes sense they would provide the best performance with HL2.

that is one craptastic review.... :sleep:

jonbob said:
Use the flashlight in Halo on a 9800XT, then compare it with an nVidia card. ATi is not completely innocent of driver cheats - they just haven't been caught yet.

Halo is not a very good example.
 
ATI is the way to go these days, it's like picking between an Xbox and a PS2.. :)
 
ATi are the current favourite, to be sure - if you want to buy a top card now, buy an ATi one. But keep your eye on nVidia for the future.
 
jonbob said:
Use the flashlight in Halo on a 9800XT, then compare it with an nVidia card. ATi is not completely innocent of driver cheats - they just haven't been caught yet.
HA!!! No, I actually still got it :E

The flashlight in Halo is rendered CORRECTLY on ATI.
It is rendered INCORRECTLY on NVIDIA.

How can odd people like me tell, you say? It's easy. We have a reference: the Xbox. And the Xbox renders it like ATI; Nvidia does not. It's actually quite ironic.

Edit: Whether you think the FX *looks* better is a completely different thing. I actually can't decide, I like them both.
 
Firstly, I want to apologise for the quality of the nVidia shot - my V500 is the closest thing I have to a digital camera - but here you may be able to see what I'm referring to. Notice that the ATi card renders the flashlight as a single hard circle, while the nVidia chip in the Xbox renders it with a soft edge and further lighting around the main beam. I'll link to the images to avoid inconveniencing anyone who isn't particularly bothered.

ATi Image
nVidia Image (XBox)

You might also want to compare these two:
ATi Image
nVidia Image

I'm not saying that one method of rendering the flashlight is the right way, but the nVidia one certainly seems more faithful to what a flashlight actually looks like. And I'm not saying that nVidia cards are better, because in truth the current range most certainly isn't. I merely want to illustrate the point that ATi aren't perfect; they have been known to cheat too - just not as much. Don't start calling me a fanboy; to be honest, minor discrepancies in image quality don't matter to me, because most of them don't affect how I enjoy the game. All I care about is getting a card that performs well for a fair price. This whole debate seems to have descended to the level of Jobs vs. Gates, where Jobbites say he can do no wrong, and Gates types reckon all the Jobbites are just stupid.

(Oh, and this should prove quite conclusively that the XBox does not render the light like an ATi card)
 
Dave said:
http://forums.overclockers.co.uk/showthread.php?s=&threadid=150919

As you can see.. Nvidia are the clear winners unless you're using AA/AF.

And you probably won't be using AA/AF for HL2.

Also worth noting that Nvidia cards have the best DX9 performance, therefore it makes sense they would provide the best performance with HL2.

Nope. The games they tested are DX9 (some of 'em at least), but none of them use PS 2.0, the weak point of the NV3x FX range. HL2, on the other hand, is VERY PS 2.0 intensive. In ShaderMark, which does use PS 2.0, ATI is always the clear winner. So prove me wrong.
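
To give a concrete idea of what "PS 2.0 intensive" means, here is a rough sketch that compiles a small per-pixel lighting shader against the ps_2_0 profile. It assumes the DirectX 9 SDK with D3DX (d3dx9.h, linked against d3dx9.lib), and the shader itself is just an illustrative snippet, not anything from HL2 or ShaderMark.

#include <d3dx9.h>
#include <cstdio>
#include <cstring>

static const char* kShader =
    "sampler base : register(s0);\n"
    "float4 main(float2 uv : TEXCOORD0,\n"
    "            float3 n  : TEXCOORD1,\n"
    "            float3 l  : TEXCOORD2) : COLOR0\n"
    "{\n"
    "    // Per-pixel normalize() and pow() are beyond ps_1_x hardware;\n"
    "    // this kind of math is where the NV3x parts fall behind.\n"
    "    float3 N = normalize(n);\n"
    "    float3 L = normalize(l);\n"
    "    float diff = saturate(dot(N, L));\n"
    "    float spec = pow(saturate(dot(N, normalize(L + float3(0, 0, 1)))), 32.0);\n"
    "    return tex2D(base, uv) * diff + spec;\n"
    "}\n";

int main()
{
    LPD3DXBUFFER code = NULL;
    LPD3DXBUFFER errors = NULL;

    // Compile the HLSL source above against the ps_2_0 profile.
    HRESULT hr = D3DXCompileShader(kShader, (UINT)std::strlen(kShader),
                                   NULL, NULL, "main", "ps_2_0", 0,
                                   &code, &errors, NULL);

    if (SUCCEEDED(hr))
        std::printf("Compiled to %lu bytes of ps_2_0 code\n",
                    (unsigned long)code->GetBufferSize());
    else if (errors)
        std::printf("Compile failed: %s\n",
                    (const char*)errors->GetBufferPointer());

    if (code)   code->Release();
    if (errors) errors->Release();
    return 0;
}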
 
jonbob said:
Firstly, I want to apologise for the quality of the nVidia shot - my V500 is the closest thing I have to a digital camera - but here you may be able to see what I'm referring to. Notice that the ATi card renders the flashlight as a single hard circle, while the nVidia chip in the Xbox renders it with a soft edge and further lighting around the main beam. I'll link to the images to avoid inconveniencing anyone who isn't particularly bothered.

ATi Image
nVidia Image (XBox)

You might also want to compare these two:
ATi Image
nVidia Image

I'm not saying that one method of rendering the flashlight is the right way, but the nVidia one certainly seems more faithful to what a flashlight actually looks like. And I'm not saying that nVidia cards are better, because in truth the current range most certainly isn't. I merely want to illustrate the point that ATi aren't perfect; they have been known to cheat too - just not as much. Don't start calling me a fanboy; to be honest, minor discrepancies in image quality don't matter to me, because most of them don't affect how I enjoy the game. All I care about is getting a card that performs well for a fair price. This whole debate seems to have descended to the level of Jobs vs. Gates, where Jobbites say he can do no wrong, and Gates types reckon all the Jobbites are just stupid.

(Oh, and this should prove quite conclusively that the XBox does not render the light like an ATi card)
Then it's very odd indeed that both ATI driver developers and a multitude of gamers have confirmed that ATI is indeed rendering it correctly, don't you think? How a flashlight looks in reality also depends on the flashlight :)
And I would like to know what you are referring to when you say "they have been known to cheat too - just not as much". If it's the Quack cheat, it was so long ago that hardly anybody remembers it.

At any rate, I agree this issue is a very minor discrepancy that doesn't matter. The FSAA quality is all the more important, and ATI reigns supreme, hehe. Freelancer, smooth as a baby's butt :) (note: and that's 10x7, not 16x12)
 