Sorry, had to brag - look what I just bought.

Are we still arguing about this......


The 5900 and 9800 are both great cards...

For my money I'd go with the 9800... picture quality being my main concern when you're already playing a game at 60-100 fps.

The 5900 is a little faster... and has some nice features.

But people keep saying that one is so much better than the other...

All the nvidiots and fanATIcs (however you say that), please report to crabcakes (the big guy with the baseball bat) in the parking lot.
 
Originally posted by Shad0hawK
Yes, the machine was built with the 9800 Pro, so no previous drivers were interfering; then we tried the 5900U afterwards, once we got it in stock.




(In most cases) I blame the card - more specifically, I blame the driver. If the game is really old, that might be the suspect. I am running a Ti4400 tweaked to the max right now, and am waiting with great anticipation for my 5900.




I see no problem with the FX cards; they run great without any problems. Of course, I would not sell a 5200 to a gamer... as far as high end goes, I have seen for myself that the 5900 is tops. The 5600 Ultra is not a bad card either; in fact, with the new driver release the 5800 is actually good too.



It is 8:15 PM here - have a good night! :)


Sorry for being so hostile yesterday; I wasn't in my best mood.

The compatibility issue could be due to a mix of bad game optimization and lousy drivers. Still, their drivers have improved, but I've heard their drivers for Linux are even worse - much worse.


I do. The fact that Nvidia themselves have made a video where they joke about the sound and heat of the FX5800 doesn't really make one want to buy it. The FX5800 is a big flop; I see the ATI counterpart, the 9700, as the better card there for its better performance and lower price (9700 non-Pros can be found for as low as $250 here).
I agree about the FX5200; I would never ever recommend it to a gamer. Even a GF4 MX card is sometimes faster.
I admit that the FX5900 is the better card, but I would not buy one because I will never pay over $400 for a graphics card, and my other posts clearly show my other reasons.
But I think that ATI has the best mid-range cards; the 9500 PRO and 9600 PRO are really, really good for a very good price. And as I wrote before, a 9700 can be found for a very low price, which makes it one hell of a card for the money.
I myself plan to get a 9600 PRO with my new computer - take a look:
http://www.mycom.se/produkter/436842

Edit: Isn't that the wrong box in the picture? :p
 
Originally posted by Draklyne
I would just like to say that the Doom 3 test was completely organized and hosted by Nvidia. Nvidia had the Doom 3 build used as a benchmark prior to the test, and could have made any number of changes or optimizations. Meanwhile, ATI was also screwed over in that test because their drivers either didn't make use of the extra 128MB of RAM, or there was a different driver bug that flatlined the Radeon's FPS at 10 frames per second, forcing HardOCP to use older drivers.


Exactly!
 
I've just spent the last 30 or so minutes reading the last six pages and checking the links in all of the posts...

and..


I


have...


come to the...


conclusion...


that...



I hate you all.
:D

And also that I will be buying a 9700/9700 Pro.
Thanks for the info.
-gl.
 
My GeForce2 MX400 will not own all, unlike Nostradamus's, because PNY ripped me off and I ended up with a GeForce2 MX 100/200, which scores about 1.2k 3DMarks lower than the MX400.
 
Originally posted by A.A
Exactly!

Yes... exactly!

Guess what? Optimizations for various games are written into every new driver release Nvidia and ATI put out (that is, after all, most of the point of making a new driver...), so an optimization that makes the Q3 or Doom benchmark run faster will also make the GAME run faster as well. It is a matter of perspective: an optimization written to make a game run better will also produce better scores on the benchmarks that run on the same engine...

Also, that quote about HardOCP using older drivers is proof of what many of us already know: ATI drivers blow.

Have a good one! I have to actually work now :(
 
Originally posted by Northwood83
I was going to go with the 9800 Pro 256 until the 5900 came out. It's clear that the 5900 beats the 9800 Pro in almost every game besides Splinter Cell.

Gainward always has the best cards, especially if it's in the "Golden Sample" line. Those are highly overclockable cards that really perform well.

Not a rich one - I just sold, like, every old console I owned for that, lol. I'm planning on getting a new CPU and mobo once I save up enough cash. An Abit IC7, and on the CPU I'm not too sure; it really depends on my situation.

Well, it's kind of early to conclude your card will run better with HL2...
 
Originally posted by Shad0hawK
Yes... exactly!

So an optimization that makes the Q3 or Doom benchmark run faster will also make the GAME run faster as well. It is a matter of perspective: an optimization written to make a game run better will also produce better scores on the benchmarks that run on the same engine...

Also, that quote about HardOCP using older drivers is proof of what many of us already know: ATI drivers blow.

Have a good one! I have to actually work now :(


But that is not what happened; it was no optimisation.
Would you, for example, see lowering IQ as an optimisation, or what they did in the UT2003 benchmark, never rendering what you did not see? That's more like adapting the game to get a higher score than adapting to the game itself.

Does it? I couldn't disagree more.
 
Originally posted by A.A
But that is not what happened; it was no optimisation.
Would you, for example, see lowering IQ as an optimisation, or what they did in the UT2003 benchmark, never rendering what you did not see? That's more like adapting the game to get a higher score than adapting to the game itself.

Does it? I couldn't disagree more.


In which game? IQ settings are adjustable in the properties menu. I have seen sites that display screenshots magnified 10-40 times, expounding on how much better the ATI looks, but if you have to magnify something that much to see the difference, does it really mean anything? I am not talking about 3DMark, but about the games themselves.

Another thing many people do not consider is what file type the screenshot is saved as. For example, a site taking a screenshot and saving it as a TIFF, then comparing it to a screenshot saved as a JPEG or a bitmap, is really no comparison, since TIFFs look better than either, and the actual game will look better than all of them. File type and 2D settings are vastly more of a factor than 3D settings, even if the screenshot is from a 3D source.
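Here is a minimal sketch of that idea, assuming Pillow is installed and you have a lossless screenshot saved as screenshot.png (a hypothetical filename): it re-saves the shot as a JPEG and measures how much the lossy copy deviates from the original, which is error baked into a comparison before any video-card difference even enters the picture.

```python
# Minimal sketch: quantify how much lossy JPEG re-saving changes a screenshot.
# Assumes Pillow is installed and "screenshot.png" (a hypothetical name) exists.
from PIL import Image, ImageChops, ImageStat

original = Image.open("screenshot.png").convert("RGB")

# Re-save the same image with typical lossy JPEG compression.
original.save("screenshot.jpg", quality=75)
lossy = Image.open("screenshot.jpg").convert("RGB")

# Per-pixel absolute difference between the lossless and lossy copies.
diff = ImageChops.difference(original, lossy)
mean_error = ImageStat.Stat(diff).mean  # average error per channel (R, G, B)

print("Mean per-channel difference introduced by JPEG:", mean_error)
```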

You can test this yourself pretty easily: take your favorite game, take a screenshot, alt-tab back and forth, and compare the two. That 2D screenshot will not look nearly as good as your actual live game screen, no matter what video card you have.

Considering that all the screenshots people see on the net really do is compare 2D performance and whatever program was used for the texture conversion, the only real way to judge actual 3D image quality is to have identical machines with identical monitors running the same game with the settings as close as possible... something I have done myself, and the truth of it is the ATI does look marginally better... if you stand perfectly still in the game and get about an inch from the screen with 4x AA, some of the edges are a few pixels different.

Ahh, lunchtime!!
 
Originally posted by Shad0hawK
In which game? IQ settings are adjustable in the properties menu. I have seen sites that display screenshots magnified 10-40 times, expounding on how much better the ATI looks, but if you have to magnify something that much to see the difference, does it really mean anything? I am not talking about 3DMark, but about the games themselves.

Another thing many people do not consider is what file type the screenshot is saved as. For example, a site taking a screenshot and saving it as a TIFF, then comparing it to a screenshot saved as a JPEG or a bitmap, is really no comparison, since TIFFs look better than either, and the actual game will look better than all of them. File type and 2D settings are vastly more of a factor than 3D settings, even if the screenshot is from a 3D source.

You can test this yourself pretty easily: take your favorite game, take a screenshot, alt-tab back and forth, and compare the two. That 2D screenshot will not look nearly as good as your actual live game screen, no matter what video card you have.

Considering that all the screenshots people see on the net really do is compare 2D performance and whatever program was used for the texture conversion, the only real way to judge actual 3D image quality is to have identical machines with identical monitors running the same game with the settings as close as possible... something I have done myself, and the truth of it is the ATI does look marginally better... if you stand perfectly still in the game and get about an inch from the screen with 4x AA, some of the edges are a few pixels different.

Ahh, lunchtime!!

No one cares... most people have already decided the 9800 is the better card overall.
 
Originally posted by crabcakes66
No one cares... most people have already decided the 9800 is the better card overall.


I disagree - many people already know the 5900 is better. At least nvidia cards don't crash my games all the time because of crap drivers! ARRRG!!!!

At least that is a factor I consider... as for other people, let them be blissfully ignorant.

:)
 
I've had more problems with nVidia drivers and compatibility than with ATI lately.

A card that performs as well (if you turn up the quality of the AA/AF on the nVidia card), looks better, has equal or better compatibility (lately ATI has been MUCH better than they used to be in terms of drivers), and costs a bit less is a better card in my book.
 
... and the whole controversy over the 3DMark scores was that the drivers were purposely overriding the IQ settings, setting them to the bare minimum so that the benchmark would score higher.

If you are using the drivers with the "optimizations", try this:
Turn up the IQ settings all the way.
Take a picture in 3DMark03.
Rename 3DMark03.exe to something like 3DMurk03.exe (anything other than what it is supposed to be) and take another picture of the same scene.

The picture from "3DMurk" should look a lot better than the one from "3DMark"... now that is what I call cheating to sell a few more video cards.
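If you want something more objective than eyeballing the two shots, here's a minimal sketch, assuming Pillow is installed and the two screenshots were saved as 3dmark_shot.png and 3dmurk_shot.png (both hypothetical filenames); it flags whether the rename changed the rendered output at all and writes out a difference image.

```python
# Minimal sketch: check whether the renamed-exe screenshot differs from the original.
# Assumes Pillow is installed; "3dmark_shot.png" and "3dmurk_shot.png" are hypothetical names.
from PIL import Image, ImageChops

mark = Image.open("3dmark_shot.png").convert("RGB")  # shot taken with the real exe name
murk = Image.open("3dmurk_shot.png").convert("RGB")  # shot taken after renaming the exe

diff = ImageChops.difference(mark, murk)  # per-pixel absolute difference

if diff.getbbox() is None:
    print("Identical output - no name-dependent behaviour detected.")
else:
    diff.save("diff.png")  # highlights the regions where rendering changed
    print("Output differs; see diff.png for the affected areas.")
```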
 
I have a 9800 Pro, and my friend just installed his 5900 Ultra 256MB card yesterday, and we did some image tests for the heck of it. I don't have screenshots for proof, so just take my honest word. My friend hates ATI and is the biggest Nvidia fan I know, and he honestly had to admit that in side-by-side comparisons (on two identical monitors) the IQ was better on the 9800. I was actually speechless to hear him say it. I would be happy with either card, but the simple fact is that for the money you can't beat a 9800 Pro.

Just my .02 bits
 
Originally posted by Shad0hawK
I disagree - many people already know the 5900 is better. At least nvidia cards don't crash my games all the time because of crap drivers! ARRRG!!!!

At least that is a factor I consider... as for other people, let them be blissfully ignorant.

:)


...you're full of shit :)
 