there is now virtually no IQ difference between the 5900 and the 9800

Originally posted by Northwood83
Yeah, I know about the 8x thing, but I decided to throw that in just for reference. The fact remains, you have current hardware all around and I do not. I don't think you have any experience with a system like mine. I'm running an RDRAM board as well.

So far I've only seen systems with a 100MHz FSB advantage over me getting 5890, which is what I think you mistook my system for. If you compare my system with others with my specs, you'll see that they score the same because they are using Northwood A's, and not B's or higher.

I cannot give you a comparison (simply because yours is the only one I can find). What I can give you, though, is this:

Running at only 300MHz core (which means it's underclocked) and on a 1.4GHz processor - 5489 pts
http://service.futuremark.com/compare?2k3=1040586

Also 133MHz FSB and "only" 4x AGP. If you still don't believe me, I'll see to it that the comparison of the system I told you about gets put up on the Futuremark comparison server.
 
Originally posted by reever2
Hey, easy, look at the driver revision of the AnandTech tests. Coincidentally, those are the ones which enable trilinear filtering in applications specifically made to test aniso quality, but switch the filtering to bilinear once game benchmarks are used, which lowers the quality...

Hrmm, never heard this, hehe, I'll have to do more research then. I am using the 3.5s and so far everything "overall" seems OK, yet I don't have every game under the sun to test with, so I can't really see my results and compare them to others :(.
As far as tests go, I personally don't put much stock in 3DMark, as it's just a synthetic test. Sure, you can say it's built around gaming and so on, yadda yadda, I've heard just about everything, but it can be adjusted way too much to get way too many different scores. I myself have run 3DMark with the same exact settings and the same exact resolution, and I get 14k one time, 12k the next, then 11k, then 13k again. So how is this a good test? It's not; it's too up and down. Is there another test? Dunno. I just know I don't put much faith in 3DMark at all, because it and fps do NOT reflect real gameplay. I have seen a game run at 20-30fps and be absolutely smooth as silk (the game, btw, was Dungeon Siege). So saying that one person gets 400fps in this game and 20 in that game doesn't hold much water, since how smooth the game actually feels and plays is what matters, not fps or 3DMark scores.
People need to realise it's fine to use 3DMark as a "test", but to treat it as the gospel truth and the holy grail of testing is kinda overboard. I think people hold fps and 3DMark too high on the list and use them to the point of overkill, instead of what they're meant for, which is just to test against what you had before or to test driver against driver. Even then I still don't put much faith in it, because if a driver set can improve quality while losing a few thousand 3DMarks, but gameplay is STILL smooth, then why should I worry about what 3DMark says?
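Just to put a rough number on that spread, here's a quick back-of-the-envelope sketch (the four scores are the ballpark figures I mentioned above, not a saved log, so treat the exact percentage loosely):

```python
# Rough run-to-run spread of repeated 3DMark runs at identical settings.
# Scores are ballpark figures, not an actual benchmark log.
from statistics import mean, stdev

scores = [14000, 12000, 11000, 13000]

avg = mean(scores)                 # ~12500
spread = stdev(scores)             # ~1290
variation = spread / avg * 100     # ~10% swing between runs

print(f"average: {avg:.0f} 3DMarks, std dev: {spread:.0f} ({variation:.0f}% run-to-run)")
```

A swing of roughly 10% between identical runs is exactly why I don't treat a single 3DMark number as meaning much on its own.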
 
Originally posted by theHATRED
I cannot give you a comparison (simply because yours is the only one I can find). What I can give you, though, is this:

Running at only 300MHz core (which means it's underclocked) and on a 1.4GHz processor - 5489 pts
http://service.futuremark.com/compare?2k3=1040586

Also 133MHz FSB and "only" 4x AGP. If you still don't believe me, I'll see to it that the comparison of the system I told you about gets put up on the Futuremark comparison server.

Most likely his hardware was detected wrong, something that is common when you overclock a lot. Also, the core overclock on 5900s in the ORB is broken; it displays the default speed no matter what (I think it detects the 2D clock).

Also try expanding my search with this:
Select Pentium 4 from the CPU
CPU clock range to 1800-2050
Chipset to 5900 Ultra
Resolution to 1024
OS to any

You'll notice the two lower scores are running Northwood A's.
As for the guy ahead of mine, I don't know what the hell he is doing, but that's a major difference. Could it be a cheat? Then again, he could also have a higher Rambus clock than mine, like 1066 or something.
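For what it's worth, that search is really just a filter over the ORB result list. Here's a rough Python sketch of the same criteria; the record fields and the example entries are made up for illustration and are not the real ORB/Futuremark data format:

```python
# Hypothetical ORB-style result records; field names and values are invented
# for illustration, not Futuremark's actual export format.
results = [
    {"cpu": "Pentium 4", "cpu_mhz": 1800, "gpu": "GeForce FX 5900 Ultra",
     "res": "1024x768", "os": "Windows XP", "score": 5450},
    {"cpu": "Pentium 4", "cpu_mhz": 2000, "gpu": "GeForce FX 5900 Ultra",
     "res": "1024x768", "os": "Windows 2000", "score": 5890},
    {"cpu": "Athlon XP", "cpu_mhz": 2167, "gpu": "GeForce FX 5900 Ultra",
     "res": "1024x768", "os": "Windows XP", "score": 6100},
]

def matches(r):
    # Same criteria as the search above: Pentium 4, 1800-2050MHz, 5900 Ultra, 1024 res, any OS.
    return (r["cpu"] == "Pentium 4"
            and 1800 <= r["cpu_mhz"] <= 2050
            and "5900 Ultra" in r["gpu"]
            and r["res"].startswith("1024"))

for r in sorted((r for r in results if matches(r)), key=lambda r: r["score"], reverse=True):
    print(r["cpu_mhz"], "MHz,", r["gpu"], "-", r["score"], "3DMarks")
```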
 
Originally posted by Northwood83
Most likely his hardware was detected wrong, something that is common when you overclock a lot. Also, the core overclock on 5900s in the ORB is broken; it displays the default speed no matter what (I think it detects the 2D clock).

Also try expanding my search with this:
Select Pentium 4 from the CPU
CPU clock range to 1800-2050
Chipset to 5900 Ultra
Resolution to 1024
OS to any

You'll notice the two lower scores are running Northwood A's.
As for the guy ahead of mine, I don't know what the hell he is doing, but that's a major difference. Could it be a cheat? Then again, he could also have a higher Rambus clock than mine, like 1066 or something.

A cheat? No, I don't think so. I'm not saying that cheating is impossible, but let's not be naive. One thing you notice, if you take a look at the comparison between your system and mine, is that you actually get more CPU marks. My processor was acting weird when I took that test, but still, it's what we're comparing with. RAM does make a difference, but not that much.
 
Originally posted by Northwood83
BTW about cheating, check this out lol http://service.futuremark.com/compare?2k3=1074604 .

HAHAHAHAHAHA!!!!!!!!!!!!! :eek: :dozey: :eek: :cheese:

That's insane: 57,449 3DMarks. Pfft, 57 THOUSAND 3DMarks. CHEATER!!!!!!! :cheers: :)

I've seen that cheat with 3DMark, simple HTML editing or whatever. Hehe, pretty krewl though, you got the highest score ever, man, YOU RULE!!!! You're my heeeeero! :)
 
Sometimes the fps counter in 3DMark gets messed up. There are lots of scores with people getting 600+ fps in Game 1; it happens to me a lot too, where for no reason the counter will report something like 400 fps.
 
Well, it looks like you were right, I had drivers mixed in with others. Here's my new bench with a clean driver: http://service.futuremark.com/compare?2k3=1075195 . A 1000 point increase! I must thank you, man. Without you I would have gone around with these messed-up drivers for a while :cheers: :cheers: :cheers: :cheers:
 
Which one shall I buy?

MSI FX 5900 Ultra or Radeon 9800 Pro?
:flame:
 
Originally posted by Gorgon
Which one shall I buy?

MSI FX 5900 Ultra or Radeon 9800 Pro?
:flame:

I think most of the people here would tell you to go with a 9800 Pro.
 
Well, I'd say get a 9800, simply because I still to this day, no matter what Nvidia people are saying, think the ATI cards have better IQ overall. More crisp, clear, and no blur.
Also, I'm not sure if this helps any, but here is a little info I got straight from an ATI guy himself:
We have gamma-correct AA; sample points are completely programmable; we have a rotated grid for 4x AA, which you won't get on a GeForce card; lastly, we support 6x MSAA.
Not sure if that helps or not, hehe, but if so, glad I could help. I still haven't seen anyone with an Nvidia card comparing ATI's and Nvidia's visual difference in games.
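To show what that rotated grid actually buys you, here's a small sketch comparing an ordered-grid and a rotated-grid 4x pattern. The sample positions are textbook-style illustrations, not ATI's or Nvidia's actual layouts; the point is that the rotated pattern covers four distinct horizontal and vertical positions instead of two, so near-vertical and near-horizontal edges get more intermediate coverage steps:

```python
# Illustrative 4x AA sample positions within one pixel (coordinates in 0..1).
# These are generic textbook patterns, not the exact layouts used by ATI or Nvidia.
ordered_grid = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]
rotated_grid = [(0.125, 0.625), (0.375, 0.125), (0.625, 0.875), (0.875, 0.375)]

def coverage_steps(samples):
    # Count distinct x and y positions: more distinct positions means more
    # intermediate shades on near-vertical / near-horizontal edges.
    xs = {x for x, _ in samples}
    ys = {y for _, y in samples}
    return len(xs), len(ys)

print("ordered grid:", coverage_steps(ordered_grid))  # (2, 2)
print("rotated grid:", coverage_steps(rotated_grid))  # (4, 4)
```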
BTW, I found out a little info today I never knew; in case anyone wants to know, heh, and if not, oh well, you're getting it anyways...
Anand's high quality performance setting
On the page above, the second set of pictures (the ones below the top set) was taken with the performance mode settings, which is basically bilinear.
I also found out that Nvidia has a new "optimization" where they don't run full trilinear at all anymore. I cannot verify this for sure, but it seems that it's true, because I heard it from more than one source.

Also, how do you figure that the FX 5600 is not the same GPU as the 5900 Ultra, Shadow? It's the SAME GPU, only the 5900 Ultra is clocked higher, just like the 9500 and 9700 Pro are the same GPU, with one clocked higher than the other depending on which card it is. They are the SAME GPU though; they are all based off the ATI R300 chip. However, the 9600 and 9800 are both R350 GPUs. Dunno why ATI used the R350 on the 9600, hehe, but oh well.
With ATI, the 9500 and 9700 are the same exact GPU; the only differences are as follows:
9500 = 4 pipes, 128-bit bus
9500 Pro = 8 pipes, 128-bit bus
9700 and 9700 Pro = 8 pipes, 256-bit bus
and each one is clocked higher than the one before.

Other than that, they are the same GPU and they all do DX9 (rough bandwidth math for the bus-width difference below).
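The bus width is the difference that really shows up, since it roughly doubles the theoretical memory bandwidth. A quick back-of-the-envelope, with memory clocks that are approximate stock figures (board vendors varied, so treat the exact numbers loosely):

```python
# Theoretical peak memory bandwidth = bus width in bytes * 2 (DDR) * memory clock.
# Memory clocks below are approximate stock values and vary by board vendor.
def bandwidth_gb_s(bus_bits, mem_clock_mhz):
    return bus_bits / 8 * 2 * mem_clock_mhz * 1e6 / 1e9

cards = {
    "9500 Pro (128-bit, ~270MHz)": (128, 270),
    "9700 Pro (256-bit, ~310MHz)": (256, 310),
}
for name, (bits, mhz) in cards.items():
    print(f"{name}: ~{bandwidth_gb_s(bits, mhz):.1f} GB/s")

# Prints roughly 8.6 GB/s vs 19.8 GB/s - same GPU, very different memory headroom.
```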
Unless you can show me otherwise, that the 5600 is not the same GPU as the 5900, I'd say do some studying before you say something is not the same GPU. If you can show me it's not, then I will humbly apologize.
 
My God - 7 pages of battle, and all I can say is check Newegg: the 256MB 5900U is over $100 more than the 128MB 9800 Pro. Is it really worth it?

The difference in performance in all of the reviews everyone pointed to seems almost negligible in the real world.

I've had nVidia cards for the past 4 years, and tonight I ordered my first ATI card in all that time (9800 Pro 128MB). I'm sure I'll be quite happy with it when that Half-Life 2 logo graces my screen for the very first time, rendered in all its glory. Mmmmm...

Cheers,


EnochLight
 
I get around 85 fps constant in 1942 with max AF on
It's fun when people try to prove something and end up disproving themselves, eh? I get 85fps with 4xFSAA and 16xAF with all details maxed, and that is with a 325MHz core compared to your 518 (if you ran it overclocked, that is). That's a difference of what, 193MHz? You could run a SECOND card on that!!!!!! That's why it's so lousy. It's like using a nuclear power plant to run a golf cart.
Oh, and those IQ tests at Tom's and Anand: they show absolutely NOTHING about who has the best IQ, or whether they even have equal IQ. NOTHING. NADA. INGET. They don't even have in-game shots, for god's sake. All they do is base it on the "theory" that Nvidia looks better. And sure, they did change things in the drivers, but not for the better. All they did was shuffle the names a bit. Application (which reviewers NEVER used before; they tested Nvidia Performance vs ATI Quality, which is obviously totally wrong, as the ATI Quality is 10 times better and does 10 times more work) became Quality, like on ATI. The other modes have changed slightly, to have sharper edges, which shows in games: you see banding. You do not see this with ATI's smooth-transition star.
 
All I know is, I hear all this: "Nvidia's IQ is equal to ATI's", "Nvidia's IQ is far better than ATI's", "I don't see a difference between Nvidia's and ATI's IQ". From ALL the research I have done, trying OH so hard not to pay attention to the stupid reviews, comparing MY shots with online shots or with shots from people with FX cards (and I'm sure I am forgetting some of the things I've done and researched), out of ALL this, in the last 2 days, I can't say I can actually see that Nvidia's IQ is better than ATI's, or even equal to it. Even from those shots that were shown: Shadow, or anyone else that has an Nvidia card or access to one, can you tell me the shots with the 44.xx drivers don't look blurry compared to the 43.xx drivers? I know my 9700 looks NOTHING like that; it looks quite a bit better, and that's not because I am flaming Nvidia and praising ATI, it is because the card I have, be it Nvidia or ATI, is doing its job and showing crisp, clear IQ and visuals, without the blur.

I realise this is not exactly the BEST game to use as far as graphics go, but here is a pic I took of BF1942.
Bf:1942 on a 9700pro aiw card
Now granted, it's not the best game, as I said, but it's all I have at the moment. Tell me what you think. Keep in mind it IS BF1942, and BF doesn't exactly have the BEST IQ :).
Also, I do realise I am showing 60fps, but I haven't really tweaked anything yet, and since I am not one to worry about fps anyway, well, the fps counter doesn't even mean much anymore.
 
Exactly... Notice that the terrain, even at a distance, is still crisp (for example, by the fence). NONE of the "Nvidia high quality" pics have shown anything close to being as crisp and clear.
 
Originally posted by dawdler
It's fun when people try to prove something and end up disproving themselves, eh? I get 85fps with 4xFSAA and 16xAF with all details maxed, and that is with a 325MHz core compared to your 518 (if you ran it overclocked, that is). That's a difference of what, 193MHz? You could run a SECOND card on that!!!!!! That's why it's so lousy. It's like using a nuclear power plant to run a golf cart.
Oh, and those IQ tests at Tom's and Anand: they show absolutely NOTHING about who has the best IQ, or whether they even have equal IQ. NOTHING. NADA. INGET. They don't even have in-game shots, for god's sake. All they do is base it on the "theory" that Nvidia looks better. And sure, they did change things in the drivers, but not for the better. All they did was shuffle the names a bit. Application (which reviewers NEVER used before; they tested Nvidia Performance vs ATI Quality, which is obviously totally wrong, as the ATI Quality is 10 times better and does 10 times more work) became Quality, like on ATI. The other modes have changed slightly, to have sharper edges, which shows in games: you see banding. You do not see this with ATI's smooth-transition star.

I wasn't trying to prove anything, just simply stating my experience and showing that the 44.03s have IQ issues. Sorry you take offense when someone posts what frames per second they get, but hey, if you want to go and compare your system to mine when you don't even list your specs, and try to say that the ATI card is better, then you're kidding yourself.

I don't think anyone is naive enough to believe your claims of superiority over my 5900 when you don't even have the sense to list your specs. I wouldn't have minded if you left it at that, but you claim that the 5900 sucks because your system is better than mine? Try running a 5900U on whatever your system is, then get back to me. I'd like to know your specs as well, as it's highly doubtful you get that amount of frames with those settings even on today's highest-end hardware.

For reference:
Pentium 4 1.8GHz Northwood A
Abit TH7-II motherboard
512MB of Rambus PC800
Gainward GeForce FX 5900 Ultra Golden Sample

I wouldn't even advise anyone to try to compare their cards against one another, as there are too many variables (RAM, CPU, mobo, etc.) to get an accurate reading. Try sticking to hardware sites that run the cards on the same hardware.
 
Damn.... another "ATI vs Nvidia" topic.... forums all over the world seem infected by this plague :cheese:
People... don't flame each other, don't argue.... just wait and see what happens.

I've heard somewhere that both ATI and Nvidia will have a whole new line of graphics cards ready by the end of this year, and then all of a sudden everyone will stop bickering about the FX and 9x00 (Pro) cards and skip over to the next generation.

A good way to be sure which card produces the best framerates and image quality is to wait for good, almost bug-free drivers and wait until Half-Life 2 is out. Then you can all see for yourselves in forums like these. All this bickering is giving me a headache - I just read all the pages :D lol

My opinion about ATI / Nvidia: I think in the past year 3D graphics card evolution has been occurring too fast, forcing one of the two into launching a product too soon and not well tested, resulting in disappointment all over and a lot of arguing like we see here.
Obviously the Radeons fare very well, especially the 9700, which has been a great success and is overall a very good card.
Nvidia has made some mistakes, but I think they'll make up for it with their new line of GPUs coming up ;)

I for one am going to sit back, watch how prices keep going down, keep my eye on benchmarks a little, and wait until I've saved up enough money to buy a next-gen polygon pusher :D
 