Sorry, had to brag - Look what I just bought.

Northwood83

http://www.newegg.com/app/ViewProduct.asp?description=14-128-159

Just got done ordering it; I'll provide some benchmarks when it arrives. It's not gonna be record-breaking though, since my computer specs are:

1.8GHz Northwood-A
Abit TH7-II motherboard
512MB of Rambus (RDRAM) at 800MHz

Gonna have to upgrade those next, but they can wait. As long as I can see all the pretty graphics HL2 has to offer, I'm content. :bounce: :bounce:
 
Cool card. I think you might want to get a better CPU though, to get all you can out of it. ;)
 
Personally I would have gone for a 9800 Pro (128MB) and then a 2-2.5GHz processor.

Ah well, it's your money. Play HL2 on max for me :(
 
I was going to go with the 9800 Pro 256 until the 5900 came out. It's clear that the 5900 beats the 9800 Pro in almost every game besides Splinter Cell.

Gainward always has the best cards, especially in the "Golden Sample" line. Those are highly overclockable cards which really perform well.

Not a rich one - I just sold like... every old console I owned for that, lol. I'm planning on getting a new CPU and mobo once I save up enough cash for it. Abit IC7, and the CPU I'm not too sure about; it really depends on my situation.
 
Well, not to spoil your fun, but realistically speaking, the 5900 Ultra isn't better than the 9800 Pro. I'm not saying this because I've got a 9800 Pro or anything; actually I don't have either card, but I'm familiar with both, being someone with lots of PC enthusiast friends. Believe me or not, but my friend and his brother have a 9800 Pro and a 5900 Ultra, right in two computers next to each other. I tell you this with zero fanboyism and zero opinion: the 9800 Pro is a better choice than the 5900 Ultra. Even if the 5900 Ultra gets like 5-10 more FPS at high res, its picture quality is not better than the 9800 Pro's. Avidly playing with both of these cards since their release is what I base this on. Typically when I use the 9800 Pro at 1024x768 it has pretty much the same fps as the 5900 Ultra (BF1942, UT2k3, RTCW), but when you turn AA and AF up, you most certainly get a better picture with the 9800 Pro, with an average 10fps drop for me.

So to sum it up: the 5900 Ultra will give you higher fps at 1600x1200, but it's only like 5-10 fps higher than the 9800 Pro. Go to 1024x768 and both cards get over 100 fps on high detail, so fps doesn't matter there, but with AA and AF on you get a better picture, i.e. nicer graphics. And I think the 5900 Ultra is more money, so in my opinion it's an obvious choice between these two cards. Just to clear this up, I never said the 5900 Ultra was bad. I just wanted to say this stuff because Northwood said that the 5900 Ultra beats the 9800 Pro in most games, which is false, and so are the benchmarks. Either way you go you're going to end up with an amazing card, but umm... the 5900 doesn't outperform the 9800 Pro in any game I play, as a matter of fact.


Think of it like this:

Would you rather have 110fps with a 5900 Ultra in a game, or would you rather have 103fps with a 9800 Pro and better picture quality?

;) BTW, you can't tell the difference between 75fps and 100+fps...
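(If you want the arithmetic behind that claim: frame time is just 1000 / fps, so at high framerates the gap between cards is tiny in absolute terms. A quick illustrative Python calc using only the numbers from this thread, nothing card-specific:)

# Frame time in milliseconds is 1000 / fps; at high framerates the
# difference between cards shrinks to fractions of a millisecond.
for fps in (75, 103, 110):
    print(fps, "fps ->", round(1000 / fps, 1), "ms per frame")
# 75 fps -> 13.3 ms, 103 fps -> 9.7 ms, 110 fps -> 9.1 ms,
# so 103 vs 110 fps is only ~0.6 ms per frame.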
 
Nvidia bumped up the fps by lowering the IQ; that's why it wins fps pissing contests. Anywho, if you want your 3-5 fps gain for lower IQ, it's all yours. I woulda saved around 100 bucks by getting Sapphire's 9800 Pro 128, but like someone stated above... it's your money :dozey:

I'm not an ATI fanboy; actually I was an Nvidia one not too far back. The FX line was shit till the 5900, even though the 5900 is still twice as loud as the 9800 Pro.
 
I knew all this when I made my purchase. However, I only disagree with one point you made: that AF has better quality on the 9800 Pro as opposed to the 5900. That is simply not true. Check out the AF test at http://www.hardocp.com/ . It shows that Nvidia's is superior.

I will admit that the 9800 Pro's AA is better than Nvidia's, but that can be fixed with a simple driver tweak. I'm willing to wait for that, so it doesn't matter to me.

As for lowering the IQ, show the proof. Lots of people say it, yet none show proof. They did not lower the IQ just to get more fps; have you even looked at the clock speeds on the 5900 Ultra? Also, I'm assuming you heard this at a forum where they were complaining about Nvidia's optimizing. Well, if you had been following it through these past days, you would see that the new drivers remove the optimizations.

I got the 5900 because it performs a whole hell of a lot better in Doom 3, which is one of the games I bought this for, as well as STALKER and HL2.
 
Look, I've seen sites which I could find if I tried, but I don't feel like it, because I know what I know. When you bring up AF and AA, the 9800 Pro is more consistent than the 5900 Ultra; in fact, the 5900 Ultra drops far further than the 9800 Pro.
 
[image: Doom 3 benchmark chart, 1600x1200]


Question: Where are you reading your reviews? Rage3d.com? lol.
 
Um... wtf... those are Doom 3 benchmarks!??!? Why? Not only does the current Doom 3 test alpha have horrible fps, but why would you quote benchmarks from a game that's not gonna be out till next year? Or maybe more... The right kind of benchmarks to get would be BF1942, UT2k3, etc., but even then, Nvidia still lies and uses unfair tweaked drivers.

Oh yeah, put the res at 1024x768 and the 9800 Pro magically rises above the 5900 Ultra... Hey, I don't know about you guys, but I don't plan on playing any game at 1600x1200.
 
OK, you posted the one from 1600x1200, which I've seen in this review, but what about the other charts from 1280 and 1024 with the AA and AF? WTF, do you even have a monitor that can support that resolution anyway? Besides, all those fps suck for games. 1600 is not for Doom 3.

Hunter: the accused cheating was for mark3d, and it was talked over and they are no longer accused. Besides that, the Doom 3 benchmark was released to that review site by the devs, using the latest build of Doom 3.
 
This isn't the Doom 3 alpha build. This is the officially sanctioned Doom 3 test which was released to reviewing sites. For the record, the 5900 beats the 9800 at 1024. If you checked the link before running your mouth, you would see that.

I'm done trying to explain all this. If you guys want to keep your heads buried in the sand and believe whatever some forum trolls tell you, then go ahead.

Yes, I do have a monitor that can support 1600x1200.

And at your request:
[image: Doom 3 benchmark chart]
 
OK, that's great that your monitor can support it, but do you use it for games? Not if you want your precious fps...
 
lol... I think 94.6 FPS is fine. Yes, I'll use 1600. What's wrong with you? 1600 not for Doom? Why not?

It's 3DMark, not "mark3d", and yes, they only optimized for 3DMark, but those optimizations were later removed. Some people speculated they might do it with games, but it never happened. You shouldn't believe everything someone posts on a forum as the truth.
 
Why are you not showing the 1280 and 1024 AA and AF charts? Most of us don't use 1600.
 
Look, honestly, I'm not trying to be pushy. I just believe what I want, and when someone says something I don't agree with, I tend to type with an attitude.

Both cards are good and both have their ups and downs. I personally have my heart set on the 9800 Pro 128MB, but the 5900 Ultra can be just as good or better (256MB, mind you) than the one I'm looking at, and going by the benchmarks from that reviewer, better than a no-name 9800 256 card in most cases. :dozey:
 
Sorry here too. I'm just getting sick and tired of people spreading false information (Ridic with the cheats in games, etc.). They are both great cards, and I was a bit too harsh with some of these posts. Friends? :cheers:
 
Nvidia ROCKS!!!! That's all I wanted to say.
(And I also had to sell about $200 worth of old video editing equipment to get my 5600.)
 
Cheats in games... wtf are you talking about? I said Nvidia cheats on the benchmarks, and that's no lie.
 
Ridic, that statement is no longer correct. Like I said, if you had been following the situation, they have removed the "cheats" from the current drivers. And where do you get off adding a plural to benchmark? Only 3DMark was "optimized".
 
Originally posted by Northwood83
Ridic, that statement is no longer correct. Like I said, if you had been following the situation, they have removed the "cheats" from the current drivers. And where do you get off adding a plural to benchmark? Only 3DMark was "optimized".

There were issues with games, and there continue to be. Even in the BFG review that Brent did, he kind of forgot to mention it, but the card isn't doing trilinear filtering in UT2k3. If you go to hardforum.com you can read the seven-page thread regarding it. Before that, in Splinter Cell the 5900 Ultras weren't doing all the shaders they were supposed to (you could visibly notice the missing detail). In UT2k3 timedemos there were "optimizations", and hmm, well, yeah, the trilinear thing is troubling. Head over to the beyond3d.com forum and read the thread about it there as well :)
 
I'm getting a new video card before Sept. 30th! As you can see in my sig, I only have a GF4 MX440 64MB... which will not cut it for HL2.
 
I have one question... WHY do people think that more is always better? There is no reason to buy a 256MB card.

Just a waste of money... I don't get it.

edit: do people even research things before they buy?
 
Xtasy, I was reading about the whole thing last night. Over at the Anandtech forums, one of the reviewers even said:

"In any case, I think one of NVIDIA's engineers mentioned to me a while back that trilinear buffering is disabled in certain games (this could apply to certain levels in UT2K3) as it'll significantly degrade fps for no good reason (i.e. no IQ gain). Though I'll have to confirm that info."

This whole issue is just disgusting. But what can I do now? I already ordered it, so I'm just going to try and help out with getting Nvidia to stop this whole thing.

It appears, though, that the leaked drivers (44.71) are clean of these issues.

crab, I researched, and you should too. I play games at very high resolutions. You don't even need AA at such resolutions, which is why I do it. The extra 128MB helps out quite a bit; in most cases it helps by 15-20 fps.
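(Rough back-of-the-envelope sketch of where the extra VRAM goes at high res - a Python toy calc under simplified assumptions I'm making here: 32-bit color, 32-bit depth/stencil, double buffering, and AA that scales the buffers by the sample count. Real cards manage memory differently, so treat the numbers as illustrative only:)

# Estimate framebuffer memory at a given resolution and AA level.
# Assumes two 32-bit color buffers plus one 32-bit depth/stencil buffer,
# all scaled by the AA sample count (supersample-style); textures are extra.
def framebuffer_mb(width, height, aa_samples=1):
    pixels = width * height * aa_samples
    color = 2 * pixels * 4   # front + back buffer, 4 bytes per pixel
    depth = pixels * 4       # 24-bit depth + 8-bit stencil
    return (color + depth) / 2**20

for res in ((1024, 768), (1600, 1200)):
    for aa in (1, 4):
        print(res, str(aa) + "x AA:", round(framebuffer_mb(*res, aa), 1), "MB")
# 1024x768 runs about 9 MB (1x) to 36 MB (4x AA); 1600x1200 runs about
# 22 MB (1x) to 88 MB (4x AA) - on a 128MB card that last case leaves
# little room for textures, which is where the extra 128MB could matter.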
 
256MB cards are clocked lower, and with 128MB more RAM on a video card you would not see that much of an increase, considering no games require that much at the moment.
 
Not impressed at all :p

Would rather have taken a Barton 2500+, an Epox or Abit nForce2 mobo, and 2x256MB PC3200 memory.

Planning to buy that in the near future :thumbs:
 
Originally posted by TrueWeltall
256MB cards are clocked lower, and with 128MB more RAM on a video card you would not see that much of an increase, considering no games require that much at the moment.

Sorry, but they aren't; I think you're confusing the 5900 non-Ultra with the 5800. I bought the card for future games, not just for current games.
Non-Ultra clock speeds = 400/850
Ultra clock speeds = 450/850

Because it's a Golden Sample and it's got 2.2ns DDR, it should be a great overclocker.
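(Quick sanity check on that, using the usual rule of thumb that a memory chip's rated clock is roughly 1000 divided by its access time in nanoseconds - a paper spec, not a guarantee of what any given card reaches:)

# 2.2ns DDR implies a rated clock of about 1000 / 2.2 = ~455 MHz,
# i.e. ~909 MHz effective DDR, vs. the 850 MHz effective it ships at.
access_time_ns = 2.2
rated_clock_mhz = 1000 / access_time_ns   # ~454.5 MHz actual clock
rated_ddr_mhz = 2 * rated_clock_mhz       # ~909 MHz effective DDR
headroom_mhz = rated_ddr_mhz - 850        # ~59 MHz of on-paper headroom
print(round(rated_ddr_mhz), round(headroom_mhz))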
 
I personally wouldn't buy a new card from a company that doesn't hesitate to cheat and lower IQ for more FPS.
They never even admitted their cheating, though it was extensive.
And they cheated in several benchmarks, for example UT2003 and 3DMark.
It was also found that ATI had a little cheat, but it was almost insignificant; they confronted the accusation, came clean, and told everything about it. It was shown to be only a small, small optimization, and they deleted it in the new drivers.
It's a shame that some people just don't get this.

And where do you get your info? Nvidiot.com?
Who the f*ck cares about those D3 benchmarks? It's just PR, and the D3 build in that benchmark is presumably better optimized for Nvidia cards.
 
Show me proof they lowered IQ in games for FPS gains; I have yet to see it. All I've heard is wild speculation about UT2k3 and its trilinear filtering, but I've seen no proof. A reviewer at Anandtech has emailed Nvidia about the situation, and everyone is awaiting a reply before they go mouthing off as if it's the gospel. I'm waiting for it as well.

I get my info from Beyond3D and a number of other sites. I'm sorry you don't like my purchase decision, but why try and flame me? It's obvious from your use of "Nvidiot" that you're an ATI fanboy, which throws almost all of your credibility out the window.

I care about Doom III benchmarks, as it was the main reason I bought this new card.

Please try to remember I'm asking for proof of lowering IQ in GAMES and not BENCHMARKS. I agree that "cheating" in benchmarks is a lowly practice, but it's not going to make me weep, as I don't base my purchase on benchmarks alone.

BTW, you seem to forget/not know about the Quack3.exe fiasco. If you haven't heard of it, I suggest you go look it up; it was quite amusing. ATI isn't as clean as you think. And before you go calling me an Nvidia fanboy: I have an ATI TV card inside my PC.
 
9800 Pro ****ing owns you... yeah, let's pay $XX more for 2fps with worse image quality!
Yeah, even trade-off!! I'm so smart..!
 
Originally posted by nsxownzme
9800 Pro ****ing owns you... yeah, let's pay $XX more for 2fps with worse image quality!
Yeah, even trade-off!! I'm so smart..!

$479.00 - Radeon 9800 Pro
$483.99 - Nvidia 5900 Ultra

The price is going to vary depending on which companies you compare; however, I selected the top manufacturer for each card - Sapphire for ATI and Gainward for Nvidia.

I'm sure a video card owns a human being. [sarchasm] Wow, you're right! A $4.99 difference! So much!! [/sarchasm]

Perhaps you should stick to simpler debates, such as why the blue ball at the daycare is superior to the red ball.
 
It's sarcasm. Spell check is your friend. Red balls goes faster, don't you know? ;)
 
Originally posted by TrueWeltall
It's sarcasm. Spell check is your friend. Red balls goes faster, don't you know? ;)

I think you mean "Red balls go faster", right? You should spellcheck your own sentence when poking fun at typos.
 
256MB cards are good for games if you play at, say... 1600x1200, which in my opinion is ridiculous and unnecessary. 1024x768, or the res above that, are fine... and if you don't play at 1600x1200, with the 256MB card you get like 2 fps higher, if any. A 256MB card should not be bought until, say... a year and a half or two from now, when it's needed.
 
Ridic, want to cite a source for that information? If not, then quit spreading rumors; it's just hearsay. It happens with every video card release until someone gets the card and proves it wrong. Unless you own the card and have personally seen the things you say, you shouldn't even be posting the information as if it were fact.

As for 1600: since there are currently AA issues with HL2 on both Nvidia and ATI hardware, it would make sense to run the game at 1600 to minimize the jaggies. You have stated your opinion on the matter and I respect it, but I do not agree. Can't we just drop it?
 
Wow, you're starting to piss me off with your "proof"-asking bullshit... It's been said plenty of times on numerous sites. Why don't you show me some damned proof that the 5900 Ultra doesn't lose image quality? Or why don't you give me some proof that says you get better performance with the 256MB cards at resolutions other than high ones? God almighty. Why don't YOU show some damn proof, and when you do show proof... it's most likely false benchmark information...
 