Sorry NVIDIA owners

Originally posted by shapeshifter
Uh, that's like asking why my Voodoo 1 with 6 MB of RAM can't run Morrowind. It's hardware; drivers have zippo to do with it. You can't just add "use DX9 calls" to the drivers for something that was built to run, say, DX7. It just won't happen.

I didn't get that at all....
 
It's not that it's a bad card, just that it wasn't designed to run tomorrow's games with all the features they have.

What's hard to understand? OK, let's compare it this way: why can't my old black-and-white TV give me shows in colour?
 
Originally posted by SuperFat
No, I don't think so, but oh well, I guess I just got stuck buying a bad card? Then again, I don't care.

What do you mean you don't think so? Read the damn article! It is a fact, for god's sake. John Carmack, one of the most brilliant people in the gaming industry states this fact regarding NVIDIA's capabilities with DX 9 shaders. If you're going to believe someone regarding video card tech, believe him! He knows what's up.
 
Originally posted by spitcodfry
What do you mean you don't think so? Read the damn article! It is a fact, for god's sake. John Carmack, one of the most brilliant people in the gaming industry states this fact regarding NVIDIA's capabilities with DX 9 shaders. If you're going to believe someone regarding video card tech, believe him! He knows what's up.

I mean no, I don't think it supports DirectX 9 features. Wow, chill out...
Sorry if I angered the almighty?
 
Originally posted by SuperFat
Oh ok, well that's why I hate big business crap.

EDIT: Sorry, simple misunderstanding. I retract my previous statements, then. I apologize for getting all riled up. Some users on this board tend to exude their ignorance a little too much, and I was mistaking you for one of those hooligans.
 
OK, understood, but I'm still not sure what role shaders play in graphics. Are they really important, or just itty-bitty extras?
 
There will be a difference in a variety of areas; one of the most noticeable will be with water.
 
Yes, I know, which is why I said I didn't need XSTreem Watr GraFX (cool spelling).
 
I think Pr()ZaC is right.
Nvidia GF4 cards will use PS 1.3.
They won't perform badly, because they won't take the performance hit that the Nvidia FX cards get when using PS versions 1.4-2.0.
Besides, even if they did use 1.4, the GF4 and FX cards are built differently, so their performance with PS 1.4 may differ.
And besides that, you either have a GeForce 4 / ATI 9700+ card, which runs the game great, or a GeForce 3 or below / ATI 9600 or below, which can barely run the game. Am I right?
lol why are you pairing those cards together?
 
What I meant was you either have a really good card that can run the game well or an old card where you will notice a big difference in graphics. I wasn't trying to compare them.

AKA, I'm too tired right now to see or care about what I'm typing.
 
The ATI 9600/9500 is leaps and bounds above GeForce 3 tech; it's above GeForce 4 tech too. Those cards are meant to compete with the GeForce FX 5600 (which they do quite nicely).
 
Originally posted by Asus
I think Pr()ZaC is right.
Nvidia GF4 cards will use PS 1.3.
They won't perform badly, because they won't take the performance hit that the Nvidia FX cards get when using PS versions 1.4-2.0.
Besides, even if they did use 1.4, the GF4 and FX cards are built differently, so their performance with PS 1.4 may differ.

lol why are you pairing those cards together?
Oh yeah, that's true, the GF4 Ti only goes up to PS 1.3... Thought it was 1.4, dang. Anyway, yeah, the 5900 Ultra at normal DX9 settings will probably perform worse than the GF4 Ti at normal DX8 settings :)
 
Yes, but what is 1.3 compared to 1.4? I thought they were both DX8, so what's the biggie? Is there a big difference? Any articles around the web dealing with it?
 
Originally posted by shapeshifter
Yes, but what is 1.3 compared to 1.4? I thought they were both DX8, so what's the biggie? Is there a big difference? Any articles around the web dealing with it?
Probably lots of them, just search :)
The difference isn't that big... And 1.4 is DX8.1, I think. It's mostly just speed increases, if I remember correctly. It's 2.0 that is the bigger change.
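
If anyone's curious what "supporting PS 1.4 or 2.0" actually looks like from the programming side, here's a rough sketch of checking the pixel shader version a card reports through the Direct3D 9 caps. It's just an illustration I threw together, not code from any actual game:

[code]
// Sketch: query the highest pixel shader version the card reports via Direct3D 9.
// Illustrative only -- error handling trimmed to the essentials.
#include <d3d9.h>
#include <cstdio>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d)
        return 1;

    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    // PixelShaderVersion packs the major/minor shader version the hardware supports.
    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        std::printf("Card supports PS 2.0 (DX9-class shaders)\n");
    else if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 4))
        std::printf("Card tops out at PS 1.4 (DX8.1, e.g. Radeon 8500)\n");
    else if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1))
        std::printf("Card tops out at PS 1.1-1.3 (DX8, e.g. GeForce 3/4)\n");
    else
        std::printf("No pixel shader support (DX7-class card)\n");

    d3d->Release();
    return 0;
}
[/code]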
 
OK yeah, found this by John Carmack about Doom 3:

The fragment level processing is clearly way better on the 8500 than on the Nvidia products, including the latest GF4. You have six individual textures, but you can access the textures twice, giving up to eleven possible texture accesses in a single pass, and the dependent texture operation is much more sensible. This wound up being a perfect fit for Doom, because the standard path could be implemented with six unique textures, but required one texture (a normalization cube map) to be accessed twice. The vast majority of Doom light / surface interaction rendering will be a single pass on the 8500, in contrast to two or three passes, depending on the number of color components in a light, for GF3/GF4

http://www.webdog.org/cgi-bin/finger.plm?id=1&time=20020211165445
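
To put some numbers on the "one pass vs. two or three passes" part: the standard Doom light interaction he describes needs 7 texture accesses (six unique textures, with the normalization cube map read twice). An 8500 can do up to 11 accesses in a pass, while a GF3/GF4 only has 4 texture units per pass, so it has to redraw the geometry. Quick toy calculation (the 4-units figure for GF3/GF4 comes from the cards' specs, not from the .plan):

[code]
// Toy illustration of why per-pass texture limits turn into extra rendering passes.
#include <cstdio>

int passesNeeded(int accessesNeeded, int accessesPerPass)
{
    // Round up: any leftover accesses force another full pass over the geometry.
    return (accessesNeeded + accessesPerPass - 1) / accessesPerPass;
}

int main()
{
    // Standard Doom light/surface interaction per Carmack's .plan:
    // six unique textures, one (the normalization cube map) accessed twice.
    const int doomLightAccesses = 7;

    std::printf("Radeon 8500 (up to 11 accesses per pass): %d pass(es)\n",
                passesNeeded(doomLightAccesses, 11));
    std::printf("GeForce 3/4 (4 texture units per pass):   %d pass(es)\n",
                passesNeeded(doomLightAccesses, 4));
    return 0;
}
[/code]

That's where the "two or three passes" comes from; lights with more colour components push the GF3/GF4 to a third pass, while the 8500 usually stays at one.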
 
Originally posted by dawdler
In Doom III, however, Nvidia will use a default poor-quality driver path to match and overcome ATI.
Sorry to reply to a post from a few pages back, but this .plan file from Carmack gives some more inside info about Doom III's rendering paths. Beware, it's quite techy. The .plan is from February, and at that time there were 6 rendering paths coded into Doom III:

ARB2 - works on both GeForce and Radeon and is fully featured
ARB - works also on both GeForce and Radeon, but with less features
R200 - specifically for Radeon
NV30 - specifically for the GeForce FX
NV20 - specifically for GeForce 3/4
NV10 - specifically for older GeForce

Both the GeForce FX and the Radeon can render in ARB2 mode. In this case the GeForce FX runs with 32-bit precision and the Radeon with 24-bit precision. The Radeon is much, much faster in this mode.
However, the GeForce FX has the option of running in NV30 mode in order to produce higher framerates than the Radeon. In NV30 mode basically the same effects are rendered as in ARB2, only at lower precision (12-bit or 16-bit). Carmack doesn't mention whether the lower precision is visible to the naked eye, so we can only guess.
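
To make the "rendering path" idea a bit more concrete, here's a rough guess at how an engine might pick a path by looking at the OpenGL extensions the driver reports. The extension names are the real ones those paths are built on, but the selection logic is just my own sketch, not id's code (and, as above, the FX can be pointed at NV30 instead of ARB2 for speed):

[code]
// Sketch: choose a rendering path from the extensions the OpenGL driver exposes.
// Requires a current OpenGL context; a real engine also weighs speed and user
// settings (e.g. preferring the lower-precision NV30 path on a GeForce FX).
#include <GL/gl.h>
#include <cstring>

const char* pickRenderPath()
{
    const char* ext = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    if (!ext)
        return "none";

    if (std::strstr(ext, "GL_ARB_fragment_program"))
        return "ARB2";  // fully featured; FX runs it at 32-bit, Radeon at 24-bit
    if (std::strstr(ext, "GL_NV_fragment_program"))
        return "NV30";  // GeForce FX specific, can drop to 12/16-bit precision
    if (std::strstr(ext, "GL_ATI_fragment_shader"))
        return "R200";  // Radeon 8500/9000 class
    if (std::strstr(ext, "GL_NV_register_combiners"))
        return "NV20";  // GeForce 3/4
    return "NV10";      // older GeForce, basic multitexture fallback
}
[/code]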

I realise the text I just wrote is quite complicated, but I hope it informed some people.
 
I think one thing people on both sides of the fence keep forgetting is that it's gonna rock regardless of what card you get. In 6 months it won't matter if it's Nvidia or ATI; I'd be very surprised if both didn't run HL2 at full quality and high resolution with very good framerates, if current hardware doesn't already...

When people talk about benchmarks, things always get blown out of proportion (especially with motherboards, where people bitch over the absolute smallest things).

Bottom line is, if you have the fastest card from your current favorite company, it's gonna rock...
 
When you're playing the game, you aren't going to stop and ask the other guy what graphics card he's using, are you?
 
I would like to see fanboys (err, 'supporters') collide on that forum. Sorry, I'm just a little pissed off today.
 
Originally posted by Axxron
I would like to see fanboys (err, 'supporters') collide on that forum. Sorry, i'm just a little upset today.

Why's that?
 
Originally posted by spitcodfry
Guess you don't want to talk about it. Sorry for asking.
The "Haha!" is just something that goes around in our family. It started with my cousin (Miranda) when she was a toddler.

note to self: find that video
 
Wow, what a huge thread. Rather than going into statistics/comparisons/yadda yadda, I'll just take Gabe at his word when he says take ATI over Nvidia. I have a GeForce 4 and I have no problem upgrading; if something is better, it's better. I won't stick with a company just because I like their logo or because I've always bought from them. Performance speaks volumes, and from what the HL2 devs say, ATI is the way to go.

Wish I didn't have to upgrade, but apparently that's the case. Oh well, no harm done; life goes on...
 
Originally posted by Shocky
Wow, what a huge thread. Rather than going into statistics/comparisons/yadda yadda, I'll just take Gabe at his word when he says take ATI over Nvidia. I have a GeForce 4 and I have no problem upgrading; if something is better, it's better. I won't stick with a company just because I like their logo or because I've always bought from them. Performance speaks volumes, and from what the HL2 devs say, ATI is the way to go.

Wish I didn't have to upgrade, but apparently that's the case. Oh well, no harm done; life goes on...

Wait for the 9800 XT to come out and for the price of the 9800 Pro 128 to drop, then buy that one.
 
I like pepperoni on my pizza, not ham. I like Ford, not Chevy. I like Rage Against the Machine better than John Mayer.

It's all about opinions. I have a 5900, which does very well, thank you very much. I'm getting about 100+ FPS more in 'Solitaire' with my new card than with my old TNT2. And I still believe that the content of a game is much more important than the graphics. I have bought many games that were technically amazing yet blew when it came to gameplay (Unreal II, anyone?). I think that Half-Life 2 will be amazing with or without the eye candy. So, do I regret my choice? No. I prefer to be comfortable with what I have rather than covet my neighbor's GPU. To all of you ATI owners: Cheers! And to my NVIDIA brethren: Cheers as well.
 
I too have a 5900; I'll just have to see what happens when the game comes out.
 
Originally posted by dawdler
Haven't seen it. Link? The original still shows the 9800 to be twice as fast...


You never get something for nothing. A lot of Nvidia's speed increases have come with a decrease in IQ (image quality). Even some of ATI's. Keep in mind, the FX was delayed A LOT. They have had lots of time to streamline drivers.


Why shouldn't they? Concerning the drivers, ATI admitted to a 2% increase in 3DMark03 from rearranging some pixel shaders. They also quickly agreed that it was wrong and removed it in later drivers. Nvidia still hasn't commented on their insane increase from extremely questionable methods. Don't drag up the old Quack thing; it was long, long ago. The online communities for both companies are now 1000x bigger, and with that comes 1000x more demand on them.


Yes, of course. Why not? It's not like he could have said that the 5900 Ultra was gonna rock with it :dozey:


Cut and paste of a post I made on page 6 before you came in. Quote:
--------------------------------------------------------------------------------
Originally posted by Detharin
No problem. Hey, check this out as more of an FYI: these come from a performance review of two 5900 cards, Ultra and non-Ultra. Both show higher benchmarks than the ones in the 9800 vs 5900 tests. Also, the last benchmark shows that disabling one feature (DoF, depth of field) almost triples performance (first link), and there's a Tomb Raider benchmark with the new drivers that shows a rather large increase over the last tests. Still less than ATI, but a good increase nonetheless for just a driver update (second link).


http://www.beyond3d.com/reviews/asu.../index.php?p=13


http://www.beyond3d.com/reviews/asu.../index.php?p=19
--------------------------------------------------------------------------------


It takes time to streamline drivers, especially when the problems have only recently begun to show; first they have to track down where, and it looks like the DoF is the problem. Especially considering that even at higher detail settings the Nvidia benchmarks aren't really affected; they stay about where they were. Bad, but consistent.

1000x bigger, true. But a lot of ATI fanboys like to forget history and proclaim ATI's spotless record as proof of their superiority. Besides, it was funny, and it's worth laughing about.

I'll wait for real-world performance. In the end the benchmarks for ATI and Nvidia could end up equal, but the way the engine handles each could be totally different.
 
Let's never forget the time ATI 'optimised' their drivers for Quake 3 a long while back, when everyone knew it was used for benchmarks. ATI had to take a lot of sh*t from the fans before they took out the optimisations.
Can be found here: http://www.hardocp.com/article.html?art=MTEx

Anyone remember the ASUS drivers? Wow did that piss everyone off.

Now Nvidia goes out and 'optimises' 3DMark03. I guess history doesn't apply to them? They took more sh*t from the community than ATI and ASUS combined. Take a look at the HardOCP editorial titled "Is NVIDIA on a mission to see just how badly they can upset the enthusiast community that helped put them in the position they are in today?"
It can be found here: http://www.hardocp.com/article.html?art=NTAz

No one has a spotless record. Nvidia's is just more spotted now than it was a few months ago. That's why I bought an Xtasy 9800 Pro 128 :)

And you know what, I love it!
 
A lot of nVidia's problems come not only from the drivers (which seem not to work as well as the old ones), but possibly also from the architecture of the card itself, which is limited compared to the ATI card. (BTW, I'm sure this has been said many times already; just adding it again.)
 