ATI or Nvidia?!

ATI, or Nvidia?!

  • ATI (State Which Card.)

    Votes: 80 (77.7%)
  • Nvidia (State Which Card.)

    Votes: 22 (21.4%)
  • Other (State Which.)

    Votes: 1 (1.0%)

  • Total voters
    103
Matrox G400 8 MB, here too.

It pwns all you shitty nvidia and Ati fanboys.
 
I'm currently running the GeForce4 MX-440... It's a piece of poop.

I'm considering getting either a 6800 Ultra Extreme or X800 XT PE, can't decide, although the 6800's technological advancements seem to be swaying me in their direction recently.

Anybody wanna post smack against my choices and tell me what I *SHOULD* be getting? :)
 
Have a 9500 Pro flashed and OC'ed past a 9700 Pro... but I've been thinking of getting something new. Have a bit of extra cash.

Leaning toward the X800 Pro... but it's hard not to look at the GeForce 6800GT. Roughly the same price (give or take a few bucks), and it has those extra features.

But those are a lot of money and I may just keep what I have or go with the 9800pro.
 
Kiwwa said:
I'm currently running the GeForce4 MX-440... It's a piece of poop.

I'm considering getting either a 6800 Ultra Extreme or X800 XT PE, can't decide, although the 6800's technological advancements seem to be swaying me in their direction recently.

Anybody wanna post smack against my choices and tell me what I *SHOULD* be getting? :)

go to page 1 of this thread and read Shuzer's post.
its informative if u are considering the 6800 :p
but like i also said on page 1, be careful of which brand of the 6800 u purchase :p
 
Nooooooooooooooooooooo!!!!!!!!!!!!!!!!!!!!!!!!!!!!111111111111
 
Dr. Freeman said:
go to page 1 of this thread and read Shuzer's post.
its informative if u are considering the 6800 :p
but like i also said on page 1, be careful of which brand of the 6800 u purchase :p

Yeah, shy away from the regular 6800; go with the 6800GT or 6800 Ultra, as they are on par with the best from ATI. I just got my 6800GT and it is a one-slot/one-MOLEX solution like the ATI cards, and it performs right on par with the X800 Pro. The differences that swayed me were support for Shader 3.0 (some people claim this doesn't matter, usually people who are jealous; I think it's always better to have a card with support for future enhancements), and I feel that Nvidia's displays have always looked as if the color was a bit deeper and cleaner.
 
ATI's been good to me, and I don't have any real reason to switch now, since there's nothing better about any of the Nvidia cards (at least not enough to sway me).
My X800 pro is in the mail as we speak. I'd suggest ZipZoomFly.com to anyone who hasn't been able to find them in stock at a good retailer.
 
Radeon 9800 Pro right here.

I remember back in the days of nVidia being the uncontested king of the gfx world, and ATi were the lost puppy dogs following them with bad driver support.

Now, ATi seems to be just pwning nVidia left and right. Not to mention ATi has always had the cheaper cards. So now they're less expensive and better!
 
I'm getting the 6800 Ultra, but it says it has support for future enhancements. What does this mean, what enhancements? And what does this 2-slot thing mean?
 
Darkknighttt said:
I'm getting the 6800 Ultra, but it says it has support for future enhancements. What does this mean, what enhancements? And what does this 2-slot thing mean?

Future enhancements means it has support for Shader 3.0. SM 3.0 allows for more complex shaders as well as for instruction branching within the shader code. ATi's newest generation of cards, on the other hand, is still limited to Shader Model 2.0 support. A quote from guru3d.com: "It's now known that the X800 series from ATI (which we recently tested) does not support DirectX 9.0c Shader Model 3.0. The discussion we see from a lot of ATI fans goes like this: 'Hey, who needs Pixel and Vertex Shader 3.0? There is not one game that supports it, and that won't happen this year either.'

That's so wrong people. The step from 2.0 towards 3.0 is a small one and most Shader Model 2.0 games can easily be upgraded towards Model 3.0, which means more candy for the eyes. When DirectX 9 is updated we are going to see a lot of support for 3.0 Shaders. Is it a huge visual advantage over 2.0? Personally I question that fact. Any technological advantage is always welcome and preferred over a previous generation development. The general consensus for developers is to use as low a shader version as possible. Shaders 3.0 will be used only in several critical places where it gives a performance boost or significantly boosts image quality.

ATi makes you believe that Shader Model 3.0 is not important right now. True... it's not that important right now. But throw this thesis at it. Product A is extremely fast with SM2 and has awesome SM3 support. Product B has only SM2 support yet is only slightly faster. Looking at the future, we will see more and more games using SM3 even if it is purely for better performance, then what product has the advantage... Product A of course. In this case the 6800 outclasses the Radeon x800 from ATI."
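The branching point in the quote above can be sketched with a toy example. This is plain Python standing in for shader logic, not real HLSL; the function names and "costs" are invented for illustration. SM 2.0-class hardware has no true dynamic branching, so a conditional typically means computing both paths and selecting one result, while SM 3.0 hardware can genuinely skip the unused path:

```python
def expensive_lighting(pixel):
    return pixel * 2  # stand-in for a costly shader path


def cheap_lighting(pixel):
    return pixel + 1  # stand-in for a cheap shader path


def shade_sm2(pixel, in_shadow):
    # SM 2.0 style: no real branching, so both paths are
    # evaluated and one result is selected (predication).
    lit = expensive_lighting(pixel)
    dark = cheap_lighting(pixel)
    return dark if in_shadow else lit


def shade_sm3(pixel, in_shadow):
    # SM 3.0 style: a genuine branch skips the unneeded work.
    if in_shadow:
        return cheap_lighting(pixel)
    return expensive_lighting(pixel)
```

Both versions produce the same pixel; the SM 3.0-style one simply avoids running the path it doesn't need, which is where the performance claims come from.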


As far as 2 slots go, that was a concept NVidia took from 3dfx. Back in the day you could run two of their PCI add-in cards (Voodoo 2), and as long as the memory and manufacturer were the same, the cards would work together, each filling every other line of the screen, also known as SLI (Scan-Line Interleave). So when PCI Express makes its debut this year, you will theoretically be able to run two graphics cards together and double your PC's graphics horsepower.
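As a rough illustration of the scan-line split described above (plain Python with made-up names, not any real driver API), each of two cards renders every other line of the frame:

```python
def render_line(card, y, width):
    # Pretend "rendering" a line: record which card drew each pixel.
    return [(card, y, x) for x in range(width)]


def render_frame_sli(height, width):
    # Scan-Line Interleave: even lines go to card 0, odd lines to card 1,
    # so each card only does half the per-frame fill work.
    frame = []
    for y in range(height):
        card = y % 2
        frame.append(render_line(card, y, width))
    return frame


frame = render_frame_sli(4, 2)
# Lines alternate between the two cards.
```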
 
Okay, thanks for the info, it really helped. I'm not just buying a new card that I want to be the best for HL2; I want a good card that will be better for future games too. In this case that's the 6800 Ultra, so I'm gonna stick with it. Thx.
 
Darkknighttt said:
I think I'm going Nvidia again. I'm going to get a GeForce 6800 Ultra. I know that the X800 series is probably better with HL2, but I'm going with Nvidia because I have never seen a company with ATI-preferred graphics on it, and I want my new card to last a long time and be good with future games. And the 6800 Ultra has DX9.0c and the X800 doesn't. So I think I'm going with Nvidia.

I've worked at 2 high-end gaming rig companies... and THEY all preferred ATI graphics, and oddly enough... so did their customers... Hmmm, interesting!

Andy
 
A-Train said:
That's so wrong people. The step from 2.0 towards 3.0 is a small one and most Shader Model 2.0 games can easily be upgraded towards Model 3.0, which means more candy for the eyes. When DirectX 9 is updated we are going to see a lot of support for 3.0 Shaders. Is it a huge visual advantage over 2.0? Personally I question that fact. Any technological advantage is always welcome and preferred over a previous generation development. The general consensus for developers is to use as low a shader version as possible. Shaders 3.0 will be used only in several critical places where it gives a performance boost or significantly boosts image quality.
Shader 3 HARDLY gives ANY (if it gives any whatsoever) image quality enhancements. If anyone notices, some FarCry screenshots "comparing" PS2 and PS3 side by side are actually a PS1 vs. PS2/3 comparison, but they say they're comparing PS2 and PS3 (as both are included, they can get away with it).

As for performance improvements, yes there will be some, but at the moment with FarCry it's only a 3 fps improvement (I'm not impressed), and by the time any game takes full advantage of PS3 your 6800 will be well out of date and deemed a "mid-range" card. So to be honest it's just used as a marketing tool by Nvidia (gotta hand it to 'em, it's a good one...), but it has no benefits for current and near-future games (the next year or so).

Andy
 
In my experience, as soon as you turn on some AA and AF on an Nvidia card, the image quality suffers BADLY, and although you expect a hit in performance, NV cards never handled that extra work very well. I don't know how these new 6800 jobbies do when stretched in that manner?

I'm not an ATI fanboy - but I do like the way their hardware deals with AA and AF. We shall see!!
 
X800XT, simply tha best... I'm not completely sure about that though... I think I'll buy it because of tha catchy name...
 
wakkywheel said:
X800XT, simply tha best... I'm not completely sure about that though... I think I'll buy it because of tha catchy name...


It's the best for HL2, but not necessarily for other games. The 6800 Ultra has support for DX9.0c, so it'll be better for other future games. Also, the X800 XT is only better by a hair for HL2.
 
Someone said:
Have a 9500 Pro flashed and OC'ed past a 9700 Pro...

Just how exactly did you do that?

If I'm not wrong, the 9500 is a DirectX 8 card, so how exactly did you OC it past a 9700 Pro?

I've heard people OC'ed their 9600 Pros past the 9800 Pro - but that's believable, I mean both are DX9 cards...
 
What I love is when people assume that because one card has support for something and the other doesn't, it will be 'better'.
You aren't from the future.
Given today's example of FarCry moving to SM3.0, performance hardly improves except in intense lighting conditions, and then it sits right back down when AA/AF is enabled.

95% marketing
5% performance

Don't buy because of SM3.0...Buy because you like the card for other reasons.
 
Darkknighttt said:
It's the best for HL2, but not necessarily for other games. The 6800 Ultra has support for DX9.0c, so it'll be better for other future games. Also, the X800 XT is only better by a hair for HL2.


shhhhhhh. go back to your cage, knight... it's ok. You're allowed to come out when you understand DX9.0c isn't any better than 9.0b.
 
Excuse me while I clean up some BS that was floating around in this thread.

Darkknighttt said:
Also the X800 XT is only better by a hair for HL2.
Unless you work for Valve or own a time machine, you really can't be certain about that.

guinny said:
you're allowed to come out when you understand DX9.0c isn't any better than 9.0b.
DX9.0c contains all the stuff that DX9.0b has, plus some extra stuff like shader branching and geometry instancing. It can replace certain complex PS2.0 shaders with shorter, easier and faster SM3.0 shaders. Clearly, DX9.0c is better than DX9.0b. You can argue about the significance of these extra additions, but to claim that DX9.0c offers nothing over DX9.0b would be ignorant.
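One way to picture geometry instancing, one of the DX9.0c additions mentioned above: instead of one draw call per object, a single call submits the same mesh once along with per-instance data. This is a hedged sketch in plain Python with invented names, not the actual Direct3D API; the "transforms" are simple offsets standing in for per-instance matrices:

```python
def draw(mesh, transform):
    # Pretend API call: draw one copy of the mesh with one transform.
    return [v + transform for v in mesh]


def draw_without_instancing(mesh, transforms):
    # One "API call" per object: high per-call CPU overhead.
    return [draw(mesh, t) for t in transforms]


def draw_instanced(mesh, transforms):
    # One "API call" for all instances; the driver/GPU loops over
    # the per-instance data instead of the application.
    return [[v + t for v in mesh] for t in transforms]
```

Both paths produce identical geometry; the win from instancing is fewer API calls, not different output.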
 
I've been using Nvidia cards since the TNT, but I plan to put an X800 XT PE in my new rig because of cost, reduced power consumption, and general performance.
 
guinny said:
shhhhhhh. go back to your cage, knight... it's ok. You're allowed to come out when you understand DX9.0c isn't any better than 9.0b.
Arno said:
DX9.0c contains all the stuff that DX9.0b has, plus some extra stuff like shader branching and geometry instancing. It can replace certain complex PS2.0 shaders with shorter, easier and faster SM3.0 shaders. Clearly, DX9.0c is better than DX9.0b. You can argue about the significance of these extra additions, but to claim that DX9.0c offers nothing over DX9.0b would be ignorant.
DX9.0c is better than DX9.0b in a number of ways but I think he was saying that Nvidia's performance with DX9.0c is not any better than ATI's performance with DX9.0b, give or take some.
 
eh... you guys didn't really catch what I meant, but Asus got the drift... sorta...
 
Asus said:
DX9.0c is better than DX9.0b in a number of ways but I think he was saying that Nvidia's performance with DX9.0c is not any better than ATI's performance with DX9.0b, give or take some.
Ah, thanks for clearing that up. If that's what guinny meant to say, then I can agree with that.
 