Sorry NVIDIA owners

Originally posted by crabcakes66
Well, I started to quote you a bunch and reply to all the things you said...

I'm not going to... I'll let someone else do it.



1. You sound like a fanboy for saying things that are completely unfounded and simply not true.

2. You are definitely an nvidia "fanboy" (I hate that word), although you are somewhat subtle about it. I'll give you that much.


I checked the benchmarks on that; if you notice, at the very end on defaults it all evens out to within about 10 fps of each other. However, it does match what I asked for, thank you. I have no real interest in Tomb Raider, but the features comparison was definitely a good read. ATI did in fact have better performance with more features on defaults, and with everything on high it did have a significantly higher score on everything else. I still hate Tomb Raider, and it remains to be seen whether similar results will show up in Doom 3 and Half-Life 2. New drivers could increase ATI's lead, or decrease it. If I can play Half-Life 2 with full features at 30 fps I'll be happy. I just wish I could find some benchmarks on the non-Pro/Ultra versions of everything. A comparison of the 9800 vs. 5900 would be more useful to me at those settings, because the Ultra/Pro versions were never an option for upgrading for me. Tomb Raider also uses a different engine than either of the two games I'm interested in playing.


Also



1. If I'm wrong, prove me wrong; don't just say I'm wrong. If somewhere along the line I've gained bad information, I'd like to know it. So far you have shown me a benchmark where ATI is ahead in one game. You mentioned quoting me a lot but decided to take a more minimalist approach. Feel free to correct anything I've said that is incorrect, as long as it is backed up.

2. Only nvidia fanboy I've ever heard of who would put a 9800 Pro in their Half-Life 2 dream machine. <shrug> Not exactly sure how that works out.
 
Folks, when the only way you can tell which graphics card is "better" is with benchmark numbers rather than through actual gameplay experience, it's time to stop worrying and simply buy the card you want. People are acting like the Radeon 9800 is getting 100 FPS while the nVidia 5900 is only getting 10 FPS in an identical situation, when the truth is they differ by mere single digits in most cases, making the differences indiscernible without benchmark software.

Bottom line: just go with the one you want, as it will make little practical difference in the long run.
 
Originally posted by Detharin


1. If I'm wrong, prove me wrong; don't just say I'm wrong.

I apologize on that part. I've been through this argument so many times my brain turns to mush when I think about it.

I'm just tired of repeating the same thing over and over.

Dawdler never gets tired of it, it seems. So I'll just let him do all the typing.

Edit: he is the preacher :)
 
Originally posted by Detharin
I checked the benchmarks on that; if you notice, at the very end on defaults it all evens out to within about 10 fps of each other. However, it does match what I asked for, thank you. I have no real interest in Tomb Raider, but the features comparison was definitely a good read. ATI did in fact have better performance with more features on defaults, and with everything on high it did have a significantly higher score on everything else. I still hate Tomb Raider, and it remains to be seen whether similar results will show up in Doom 3 and Half-Life 2. New drivers could increase ATI's lead, or decrease it. If I can play Half-Life 2 with full features at 30 fps I'll be happy. I just wish I could find some benchmarks on the non-Pro/Ultra versions of everything. A comparison of the 9800 vs. 5900 would be more useful to me at those settings, because the Ultra/Pro versions were never an option for upgrading for me. Tomb Raider also uses a different engine than either of the two games I'm interested in playing.
"Tomb raider also uses a different engine that either of the two games im interested in playing". This little line is funny. Cause if you read all the DX9 benchmarks, you will see that some have mailed both DoomIII and HL2 devs and asked if this is what they see. They both said yes, they see the very poor FX performance... In DoomIII however, Nvidia will use a default poor quality driver path to match and overcome ATI. In HL2, results will most likely be more similar to TRAOD...

Edit: and yes crabby, I love to preach. I'm the High Priest Of The Order Of Computology. ALL HAIL PIXEL!!!!!!!
 
Originally posted by crabcakes66
I apologize on that part. I've been through this argument so many times my brain turns to mush when I think about it.

I'm just tired of repeating the same thing over and over.

Dawdler never gets tired of it, it seems. So I'll just let him do all the typing.

Edit: he is the preacher :)

No problem. Hey, check this out as more of an FYI; these come from a performance review of two 5900 cards, Ultra and non-Ultra. Both show higher benchmarks than the ones in the 9800 vs. 5900 tests. The last benchmark also shows that disabling one feature (depth of field) almost triples performance (first link), and there is a benchmark of Tomb Raider with the new drivers showing a rather large increase over the last tests. Still less than ATI, but a good increase nonetheless for just a driver update (second link).


http://www.beyond3d.com/reviews/asus/v9950/index.php?p=13


http://www.beyond3d.com/reviews/asus/v9950/index.php?p=19
 
Originally posted by dawdler
"Tomb raider also uses a different engine that either of the two games im interested in playing". This little line is funny. Cause if you read all the DX9 benchmarks, you will see that some have mailed both DoomIII and HL2 devs and asked if this is what they see. They both said yes, they see the very poor FX performance... In DoomIII however, Nvidia will use a default poor quality driver path to match and overcome ATI. In HL2, results will most likely be more similar to TRAOD...

Edit: and yes crabby, I love to preach. I'm the High Priest Of The Order Of Computology. ALL HAIL PIXEL!!!!!!!


As I've stated before, I am only concerned with currently available games. Speculating on things not yet released is pointless. Nvidia has already shown a rather large increase in performance on Tomb Raider with the current drivers. I still hate Tomb Raider, mind you, but it does show that ATI's higher performance may be due not so much to hardware as to software. Do not take this as a fanboy "behold the glory of nvidia, soon it will release uberdriver 5X.XX that will bury ATI", but more as a statement of the performance increase currently shown over previous tests.

Would you care to discuss any fallacies in my previous posts, as crabby has preferred to let you handle it?
 
Originally posted by Detharin
No problem. Hey, check this out as more of an FYI; these come from a performance review of two 5900 cards, Ultra and non-Ultra. Both show higher benchmarks than the ones in the 9800 vs. 5900 tests. The last benchmark also shows that disabling one feature (depth of field) almost triples performance (first link), and there is a benchmark of Tomb Raider with the new drivers showing a rather large increase over the last tests. Still less than ATI, but a good increase nonetheless for just a driver update (second link).


http://www.beyond3d.com/reviews/asus/v9950/index.php?p=13


http://www.beyond3d.com/reviews/asus/v9950/index.php?p=19

I'm just interested to see how this is going to affect people in HL2, and whether Nvidia's new 50.x drivers can improve things some more.

If disabling a few things makes the game playable with decent fps... that's fine, but will it really have any major visual impact?

I feel sorry for the people who dropped $500 on a 5900 Ultra 256 when they first came out, if this is going to be the case. I would be a little pissed at nvidia myself.
 
Originally posted by crabcakes66
I'm just interested to see how this is going to affect people in HL2, and whether Nvidia's new 50.x drivers can improve things some more.

If disabling a few things makes the game playable with decent fps... that's fine, but will it really have any major visual impact?

I feel sorry for the people who dropped $500 on a 5900 Ultra 256 when they first came out, if this is going to be the case. I would be a little pissed at nvidia myself.


It will definitely be interesting to see how things pan out once Half-Life 2 gets released. It's been nice talking to you; I'm heading back home to get some rest.
 
Originally posted by Detharin
Would you care to discuss any fallacies in my previous posts, as crabby has preferred to let you handle it?
I can comment on this:
"So far you have shown me a benchmark where ATI is ahead in one game."

Yes, that is kind of true. TRAOD is the only new game using these kinds of technologies. There is Halo PC too, but that is beta... It shows the same thing: the Radeons are 2-3 times faster (though it was the laptop cards being tested).

Everything is still speculation for the average game using DX9. But you see, the Human Brain (TM) is a marvellous thing. It can piece together information using Logic (TM). And some of the so-called "fanboys" are quite technically skilled; I am not one of the ones in it so deep. They took the 3DMark2k3 results, showing the Radeon to be much faster in the DX9 parts. They tore the technical specs apart, finding major flaws in the Nvidia boards. We saw DX9 benches like ShaderMark, showing the Radeons to be much faster. We saw minor applications (i.e. real-time lighting using DX9) run twice as fast on Radeons. Then we started seeing new games that used DX9 parts proving to be faster on Radeons. Then comes the really new game, showing DX9 to be A LOT faster on the Radeons. And a beta indicating the same. And in conjunction with this, we get confirmation from both Carmack and Gabe that they see the same in their unreleased games: the Radeons are much faster in the DX9 parts.
During all this time, Nvidia has denied any error in the drivers, and claims to have the fastest cards available.
Use the Logic (TM) and piece the information together into a whole. What should the outcome be in new DX9 games? Will they be faster? Will they be slower? Use the Human Brain (TM) to figure it out yourself.
 
Wow, I have to get me one of those Human Brains (TM); they sound pretty amazing!

It's a shame I probably can't afford one after buying an FX 5900. :(




:cheese:
 
For the record, nVidia has a long history of gaining substantial performance improvements through driver optimization. Just something to think about.
 
I've got a GeForce Ti 4600 128MB running on a 2.53 GHz Pentium 4.

With all this Nvidia talk I'm scared about how well HL2 will run on my comp, but with a setup like that anything should run well with all settings maxed out.
 
"I believe I can fly... I believe I can fly... I believe I can fly..."
 
Originally posted by dawdler
I can comment on this:
"So far you have shown me a benchmark where ATI is ahead in one game."

Yes, that is kind of true. TRAOD is the only new game using these kinds of technologies. There is Halo PC too, but that is beta... It shows the same thing: the Radeons are 2-3 times faster (though it was the laptop cards being tested).

Everything is still speculation for the average game using DX9. But you see, the Human Brain (TM) is a marvellous thing. It can piece together information using Logic (TM). And some of the so-called "fanboys" are quite technically skilled; I am not one of the ones in it so deep. They took the 3DMark2k3 results, showing the Radeon to be much faster in the DX9 parts. They tore the technical specs apart, finding major flaws in the Nvidia boards. We saw DX9 benches like ShaderMark, showing the Radeons to be much faster. We saw minor applications (i.e. real-time lighting using DX9) run twice as fast on Radeons. Then we started seeing new games that used DX9 parts proving to be faster on Radeons. Then comes the really new game, showing DX9 to be A LOT faster on the Radeons. And a beta indicating the same. And in conjunction with this, we get confirmation from both Carmack and Gabe that they see the same in their unreleased games: the Radeons are much faster in the DX9 parts.
During all this time, Nvidia has denied any error in the drivers, and claims to have the fastest cards available.
Use the Logic (TM) and piece the information together into a whole. What should the outcome be in new DX9 games? Will they be faster? Will they be slower? Use the Human Brain (TM) to figure it out yourself.

With new drivers, new benchmarks done by the same people who did the original Tomb Raider benchmarks show ATI's 70/20 lead is now around 70/56; check the links, it's rather interesting. Notebook video cards (calling them laptops will get you sued if you make the damn things; stupid idiots burning themselves, but I digress) and desktop video cards are VERY different. I would never apply a notebook benchmark to a desktop card, and neither should you. Heck, my current video card is about as big as the motherboard in my notebook.

The human brain will never be objective, and logic is always open to interpretation. Personally, I'm going to use my brain and do the following:
A. Ignore ATI fanboys ripping Nvidia's card design and benchmarks apart to show what they want them to show.
B. Ignore Nvidia fanboys ripping ATI's benchmarks and past history apart to show what they want them to show.
C. Use my own jaded brain to compare what the cards do, from multiple sources, and decide which card is right for me, which I did. I got a 5900 not because I love Nvidia; the price was right and it is better at doing what I need it to do. Note that I have stated my dream "only gets to play Half-Life 2" machine would have a 9800 Pro in it.

Actually, what HAS been stated is that what Gabe has seen is pretty much what the current benchmarks were showing. Those benchmarks were done a driver release back, and Nvidia is already working on more drivers. Will drivers fix their issue? Will ATI be bought out by Matrox? Will Bill and Marsha ever get back together? We will just have to tune in once the game is released to find out. Until then it's all speculation based on now, and the game isn't out now.


And for God's sake, man, what do you expect Nvidia to say? "Yes, our drivers suck at something in DX9; tune in next release for a possible fix"?
The company line is always "we do everything right, and best, and fastest, and most cost-effectively". They will deny the situation until they fix it, then tout the fix as "NEW DRIVERS INCREASE SPEED 80%". ATI would do the same; it's business.

As for Carmack, last I heard (read: cared enough to listen to), he was talking about how at least a GeForce 3 was the next greatest bestest friend ever for running Doom 3 at high visuals. (Check back a couple of years to when the alpha demo [by demo I mean used to show the engine, not download and play] was first put out.)
 
Originally posted by Space Cowboy
I've got a GeForce Ti 4600 128MB running on a 2.53 GHz Pentium 4.

With all this Nvidia talk I'm scared about how well HL2 will run on my comp, but with a setup like that anything should run well with all settings maxed out.

Ignore these forums. They are infested with underinformed fanboys with too much time on their hands, ATI and Nvidia alike. Remember, the minimum for this game is 733 MHz and, I think, a GeForce 2. It should run fine on most systems. Most of the argument is about the high end: everything turned on, can I get every last drop of performance out of this system. These are guys who, if their fps drops below 60, claim the game's unplayable.

If you believe everything ATI fanboys are saying about Nvidia's performance, it's time to get your spare change and Reeboks in order.

Course, don't believe the Nvidia fanboys either. Check the benchmarks (new and old) and check prices in your area IF you're thinking of upgrading. Then take the best advice I can give you: wait until they release the game, and if you find you want more visuals, check the NEW benchmarks and buy whatever card gives you the performance you want.
 
Originally posted by Detharin
With new drivers, new benchmarks done by the same people who did the original Tomb Raider benchmarks show ATI's 70/20 lead is now around 70/56; check the links, it's rather interesting.
Haven't seen it. Link? The original still shows the 9800 to be twice as fast...

Actually, what HAS been stated is that what Gabe has seen is pretty much what the current benchmarks were showing. Those benchmarks were done a driver release back, and Nvidia is already working on more drivers. Will drivers fix their issue?
You never get something from nothing. A lot of Nvidia's speed increases have coincided with decreases in IQ (image quality). Even some of ATI's. Keep in mind, the FX was delayed A LOT. They have had lots of time to streamline drivers.

And for God's sake, man, what do you expect Nvidia to say? "Yes, our drivers suck at something in DX9; tune in next release for a possible fix"?
The company line is always "we do everything right, and best, and fastest, and most cost-effectively". They will deny the situation until they fix it, then tout the fix as "NEW DRIVERS INCREASE SPEED 80%". ATI would do the same; it's business.
Why shouldn't they? Concerning the drivers, ATI admitted to a 2% increase in 3DMark2k3 from rearranging some pixel shaders. They also quickly agreed that it was wrong and removed it in later drivers. Nvidia still hasn't commented on their insane increase through extremely questionable methods. Don't drag up the old Quack thing; it was long, long, long ago. The online communities for both companies are now 1000x bigger, and with that comes 1000x more demand on them.

As for Carmack, last I heard (read: cared enough to listen to), he was talking about how at least a GeForce 3 was the next greatest bestest friend ever for running Doom 3 at high visuals. (Check back a couple of years to when the alpha demo [by demo I mean used to show the engine, not download and play] was first put out.)
Yes, of course. Why not? It's not like he could have said the 5900 Ultra was gonna rock with it :dozey:
 
Guys, you want the Dream Machine? Buy this:

Athlon 64 FX + 9800 Pro, non-Pro, or XT :cheese:
 
Wow, I'm really getting sick of people ripping on nvidia, just like they do to the people who post threads about delays. I have a Ti4600, and I also have twice the money right now to buy a 9800 or whatever Pro, but I obviously don't give a shit, because my game will run just as well as you ATI fanhoes.
 
This is awesome: there doesn't seem to be a single forum where the ATI/nVidia fanboys don't spam their useless info around.

It's the ****ing penis-comparison game of the Internet, it seems, yay...
 
Originally posted by rebb
This is awesome: there doesn't seem to be a single forum where the ATI/nVidia fanboys don't spam their useless info around.

It's the ****ing penis-comparison game of the Internet, it seems, yay...
Maybe I should edit my sig to replace the games with the hardware that runs them :dozey:
 
Originally posted by rebb
It's the ****ing penis-comparison game of the Internet, it seems, yay...
No it isn't; that isn't even in question: we already know the FX is the biggest. Nothing has beaten the DustBuster yet, and there are some really LARGE cooling solutions for FX cards, like wrapping them in metal.

But then you come to the fact: size doesn't matter, it's the way you use it :p
 
Sheesh, this thread got too big too fast.

If HL2 is truly very slow on nvidia cards, I might buy an ATI.
The only performance hit people are talking about is on DX9 cards from Nvidia (the FX series) using pixel shaders, not GF4 cards. GF4 cards do not use DX9 features and therefore don't have that performance hit, but they do run slower just because they are older cards.
That brings up a question... are there any benchmarks highlighting vertex shaders?

ATI has 3 types of cards off the 9800:
9800 128MB
9800 Pro 128MB
9800 Pro 256MB

Nvidia has 2 off the 5900:
5900 128MB
5900 Ultra 256MB

You shouldn't compare the 9800 Pro 128MB to the 5900 Ultra 256MB.
Both 256MB cards are similar in price (in fact, I found the Nvidia card for $430 but only found ATI's at $450, btw not on sale).
The 9800 Pro 128MB has very similar performance to its 256MB version, yet at a much lower cost.
The 5900 128MB vs. 5900 Ultra 256MB has a greater performance difference.
But I wouldn't get an OEM: I've heard ATI's partners can change the clock speeds on OEM products, and Nvidia's partners can change the clock speed on non-Ultra products.
The 9800 Pro 128MB would be the best deal: low price (like the 5900) but high performance (very similar to its 256MB version).

Btw, if you enjoy Nvidia products/drivers, I would suggest you wait until this Christmas, when both companies' new cards are out and Nvidia's cards have proper DX9 performance, not to mention even higher frame rates in the newest games.
If you enjoy or don't mind ATI, you don't have to wait to get proper DX9 performance.
 
Originally posted by SilentKilla
Wow, I'm really getting sick of people ripping on nvidia, just like they do to the people who post threads about delays. I have a Ti4600, and I also have twice the money right now to buy a 9800 or whatever Pro, but I obviously don't give a shit, because my game will run just as well as you ATI fanhoes.

Actually, yours won't run as fast as the "ATI fanhoes'". Your card has no DirectX 9 features to speak of, so the game's graphics will be shoved down a notch, while the ATI users will be running with all features on, and, I'm sure, at higher framerates.
 
Originally posted by Asus
Sheesh, this thread got too big too fast.


The only performance hit people are talking about is on DX9 cards from Nvidia (the FX series) using pixel shaders, not GF4 cards. GF4 cards do not use DX9 features and therefore don't have that performance hit, but they do run slower just because they are older cards.
GF4 cards use pixel shaders too; it's just not DX9 PS (2.0), they use 1.4. And there is of course a performance hit from using any pixel shaders. It's just that it's quite fast with fixed-function 1.4 (so is the FX, which is the reason many times you *think* it's using DX9 when in fact it has fallen back to DX8 technology).
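
To make that concrete: a Direct3D 9 game typically picks its rendering path from the capabilities the driver reports, not from what the box says. Here's a minimal sketch of that selection logic; the ShaderPath enum and ChooseShaderPath function are made-up names for illustration, though D3DCAPS9 and the D3DPS_VERSION macro are the real D3D9 API.

```cpp
// Hypothetical sketch of caps-based render-path selection (names invented).
// Build against the DirectX 9 SDK: d3d9.h / d3d9.lib.
#include <windows.h>
#include <d3d9.h>

enum ShaderPath { PATH_FIXED, PATH_PS11, PATH_PS14, PATH_PS20 };

ShaderPath ChooseShaderPath(const D3DCAPS9& caps)
{
    // PixelShaderVersion packs major/minor version numbers;
    // D3DPS_VERSION(m, n) builds a value that compares correctly with >=.
    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        return PATH_PS20;  // full DX9 pixel shaders (Radeon 9500+, FX)
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 4))
        return PATH_PS14;  // DX8.1-class shaders (Radeon 8500)
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1))
        return PATH_PS11;  // DX8-class shaders (GeForce 3 / 4 Ti)
    return PATH_FIXED;     // no pixel shaders: fixed-function path
}
```

A game (or a driver swapping shaders behind the game's back) that routes a card through one of the 1.x branches will look like it is "running DX9" while actually executing DX8-class shader code, which is exactly the fallback being described above.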
 
AFAIK, the GeForce 4 family supports up to PS 1.3.
The ATi R200 family was the only hardware that supported DX8.1 (PS 1.4).
 
What's the difference between 1.4 and 1.3? I thought the 8500 had the same feature set as the GF4.
 
I know that my 5900 Ultra will run Half-Life 2 perfectly, although I know that the Radeon will run the shaders better.
 
Originally posted by Arnaldo42
I know that my 5900 Ultra will run Half-Life 2 perfectly, although I know that the Radeon will run the shaders better.

Could you tell me how HL3 will run on DX9 cards? Oh, I'm sorry, I thought you were from the future, since you obviously benchmarked HL2 with your cheese-ball FX® card.
 
Originally posted by shapeshifter
Actually, yours won't run as fast as the "ATI fanhoes'". Your card has no DirectX 9 features to speak of, so the game's graphics will be shoved down a notch, while the ATI users will be running with all features on, and, I'm sure, at higher framerates.

Well, I really feel bad for you guys who spend half a day setting up all your shit. I think it is easier to put the disc in the tray, install it, and click New Game, 'cause I'll bet my life I'll have no problems.

/edit: I think the best bet would be to close this thread; it has become an "I'm jealous because you got a $400 card" thread. It's just free posts for the spammers.
 
One thing I'm confused about: why can't nvidia come out with drivers that will support DX9 on older cards? I just don't understand that.

I have a Ti4800, but I don't think it matters. I don't need to be able to see every one of my ass hairs reflecting off the water in HL2 to think that the graphics are good enough for my high standards of living. I mean, maybe the next really popular game that comes out will be better supported by nvidia; who cares. I think 3dfx was doing the best until they got creamed. It all depends on what you want. I mean, just because I have a Ti4800 doesn't mean that the physics will be any worse, or the gameplay/level design; it's just what stuff looks like. And besides that, you either have a GeForce 4 / ATI 9700+ card, which runs the game great, or a GeForce 3 or lower / ATI 9600 or lower, which can barely run the game. Am I right?
 
Originally posted by SuperFat
One thing I'm confused about: why can't nvidia come out with drivers that will support DX9 on older cards? I just don't understand that.

I have a Ti4800, but I don't think it matters. I don't need to be able to see every one of my ass hairs reflecting off the water in HL2 to think that the graphics are good enough for my high standards of living. I mean, maybe the next really popular game that comes out will be better supported by nvidia; who cares. I think 3dfx was doing the best until they got creamed. It all depends on what you want. I mean, just because I have a Ti4800 doesn't mean that the physics will be any worse, or the gameplay/level design; it's just what stuff looks like. And besides that, you either have a GeForce 4 / ATI 9700+ card, which runs the game great, or a GeForce 3 or lower / ATI 9600 or lower, which can barely run the game. Am I right?

No, you're not. You're missing the whole point of the article I posted a link to on the first page of this thread. DX9 shaders are one of the cornerstones of Half-Life 2's graphical capabilities, and NVIDIA's current hardware just cannot support them well. It doesn't matter what the drivers do; the hardware is still limited. I agree that your card will run the game just fine, but it will NOT if you have these shaders enabled. Then again, I'm not sure whether a Ti4800 supports DX9 features or not.
 
Uh, that's like asking why my Voodoo 1 with 6MB of RAM can't run Morrowind. It's hardware; drivers have zippo to do with it. You can't just add "use DX9 calls" to the drivers for something that was built to run, say, DX7; it just won't happen.
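
For anyone wondering what "it's hardware" means in practice: the driver reports the shader versions the silicon actually implements, and Direct3D just passes them along. Below is a minimal caps-probe sketch, assuming the DirectX 9 SDK is installed (d3d9.h, linked against d3d9.lib); no driver update will make it report PS 2.0 on a DX7- or DX8-class chip.

```cpp
// Minimal caps probe: ask the driver which shader versions the GPU
// hardware implements. Real D3D9 API throughout (DirectX 9 SDK assumed).
#include <windows.h>
#include <d3d9.h>
#include <cstdio>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps))) {
        d3d->Release();
        return 1;
    }

    // Major/minor version numbers live in the low word of each field.
    std::printf("Pixel shaders:  %lu.%lu\n",
                (caps.PixelShaderVersion >> 8) & 0xFF,
                caps.PixelShaderVersion & 0xFF);
    std::printf("Vertex shaders: %lu.%lu\n",
                (caps.VertexShaderVersion >> 8) & 0xFF,
                caps.VertexShaderVersion & 0xFF);

    d3d->Release();
    return 0;
}
```

On a GeForce 4 Ti this reports pixel shader 1.3; on a Radeon 9700 or an FX it reports 2.0. A driver can reorder or substitute shaders, but it can't report a version the transistors don't implement.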
 
No, I don't think so, but oh well, I guess I just got stuck buying a bad card? But then again, I don't care.
 