6800 GT for HL2

MoMo

Guest
I'm thinking of buying a 6800 GT for my new PC, but my friend says that since HL2 was developed for ATI's shaders, the effects will look better on an ATI card.
Is this true? And how much of a difference would there be?
 
well the required card for HL2 is an ATi card (boo...) i have a 6800 Ultra and i think it will run fine cause the x800 and 6800 are very close
 
Well the 6800 is a very fine card.
So it will play the game very nicely!
I would not be concerned at all...
Besides, the idea of developing for one brand of card is just sales hype.
 
It would most likely be about the same quality overall, as they are both really fast cards.
Though it is tuned for ATI and is supposed to be 30% faster on ATI cards, so I would get an X800; it also requires less power to run.

I probably wouldn't say the effects would look much better on an ATI card, most likely similar, but I would think the ATI card would get faster fps.
 
the visual differences between a 6800 gt and an x800 pro will essentially be imperceptible. the only thing you'll likely notice is a framerate gap. and even that won't be so substantial, due to hl2 being less demanding than, say, far cry.
 
frusion said:
well the required card for HL2 is an ATi card (boo...) i have a 6800 Ultra and i think it will run fine cause the x800 and 6800 are very close
No, an ATI card isn't required, dude...just a DirectX-7 compatible one, if I remember correctly.
 
He_Who_Is_Steve said:
No, an ATI card isn't required, dude...just a DirectX-7 compatible one, if I remember correctly.
well i meant they prefer ATi cards, u know what i mean lol
 
I just got a GeForce 6800 GT yesterday. i used to have a 9600 Pro and i gotta tell you the GeForce card is crazy. i used to get about 30-40 fps, now i get 90-100 fps in CS:S. it's a good card. but whether you go for the ATI x800 or the 6800, they are almost the same thing
 
Half-Life 2 was simply developed on ATi-based machines. You will see a bit of visual enhancement while playing on ATi cards, but the Nvidia card drivers, as well as overall performance, are a lot better; but then again that brings up a whole new ATi versus Nvidia debate...
 
I have a 6800GT, and this card ROCKS. Wide bandwidth, handles anything.
 
My bro has a 6800 GT and it runs great I think.
 
Developed on ATI machines? Are you serious?!

Valve, if anything, made use of special instructions in the ATI software that allow for different types of graphical effects. It's not known if nVidia can produce the same effects as efficiently.
 
I'm getting a 6800gt in a few weeks. I will stick with nvidia cuz I've been using it since the TNTs... never had a problem with them!
 
MoMo said:
I'm thinking of buying a 6800 GT for my new PC, but my friend says that since HL2 was developed for ATI's shaders, the effects will look better on an ATI card.
Is this true? And how much of a difference would there be?

The 6800GT is beating the X800Pro in CS:S benchies--if this is an indication of HL2, then the 6800GT is a better buy at $400 than the X800Pro. Here is some info from a recent post of mine:

ATi and Valve (the makers of HL2/Source) are pretty close. And the Radeon 9xxx series is MUCH better than the GeForce FX 5xxx series--the 9xxx series is faster AND runs in DX9, while the FX series runs in DX8.1 (or a mixed mode). But this is not true of the new cards--ATi is not the clear winner against nVidia when comparing the 6800 and X800 series.

The 6800Ultra beats the X800XTPE more times than not in CS:S @ 1600x1200 with 4xAA and 16xAF (links below).

Yet when we compare the X800Pro and 6800GT, the GT is the clear winner by an average of almost 10 frames per second at 1600x1200 with 4x AA and 16x AF.

Cobble 1600x1200 with 4xAA / 16xAF
http://firingsquad.com/hardware/counter-strike_source/page5.asp
GT: 83.9
Pro: 70
Difference: 13.9

Aztec 1600x1200 with 4xAA / 16xAF
http://firingsquad.com/hardware/counter-strike_source/page8.asp
GT: 52.4
Pro: 41.8
Difference: 10.6

Dust 1600x1200 with 4xAA / 16xAF
http://firingsquad.com/hardware/counter-strike_source/page11.asp
GT: 62.6
Pro: 49.2
Difference: 13.4

Italy 1600x1200 with 4xAA / 16xAF
http://firingsquad.com/hardware/counter-strike_source/page14.asp
GT: 53.2
Pro: 50.8
Difference: 2.4

Every test Firingsquad did favors the GT over the Pro. The average difference is 10.075 frames per second in favor of the 6800GT. And since the 6800GT also does well in DX9 and clearly outperforms the X800Pro in OpenGL, it would seem at the $400 price range the 6800GT is a better buy.
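If anyone wants to double-check that average, here is a quick sketch that recomputes the per-map gaps straight from the raw GT/Pro scores quoted above (rather than from the hand-written difference lines, which is where a typo can slip in):

```python
# Recompute the 6800GT's average lead over the X800Pro from the
# FiringSquad figures quoted above (1600x1200, 4xAA/16xAF).
results = {
    "Cobble": (83.9, 70.0),
    "Aztec": (52.4, 41.8),
    "Dust": (62.6, 49.2),
    "Italy": (53.2, 50.8),
}

# Per-map lead of the GT over the Pro, in fps.
diffs = {name: gt - pro for name, (gt, pro) in results.items()}
average = sum(diffs.values()) / len(diffs)

for name, d in diffs.items():
    print(f"{name}: GT leads by {d:.1f} fps")
print(f"Average 6800GT lead: {average:.3f} fps")
```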

We will have to wait to see HL2 in motion, but from what CS:S shows, the X800Pro is not faster than the 6800GT in most cases in Source--the opposite is true. And what would you expect? The 6800GT has 16 rendering pipelines, while the X800Pro has 12.

Note: The X800XTPE does better against the 6800Ultra at times in CS:S, although it seems at 1600x1200 with AA/AF on the 6800Ultra beats the X800XTPE more times than not (at least in firingsquad's tests).

And for the record, the X800 and 6800 series of cards are all excellent cards and none are really a bad investment. Claims that the X800Pro is much better than the 6800GT, specifically in Source games, do not jibe with the current evidence.

And finally, none of this considers features. The 6800 series is DX 9.0c compliant, meaning it has SM 3.0. SM 3.0 (Shader Model, referring to pixel and vertex shaders) allows better shader performance in the future and adds some new features, namely HDR lighting (HDRL) and Geometry Instancing. ATi does Geometry Instancing in hardware, but it is not technically SM 3.0 compliant as it does it in a different way. Whether developers will use it is another question. ATi's X800 chips are pretty much the same as the R3xx (Radeon 9xxx) series with slight tweaking. Most notable is 3Dc compression for normal maps.

So while how cards perform today is important, knowing what features may be used in the future is also handy. The 6800 series has the edge in features, but as FarCry 1.3 (which uses SM 3.0) shows, some features like HDRL are HOGS and it won't be until next year that cards can do it WELL.

Again, this is all very exciting, and ATi and nVidia users alike should be happy--the X800 and 6800 series are all great cards.

Based on what we know now, the 6800GT is a better card for Source games and most other games (and it really outperforms in OpenGL, where the X800 is a good 50-70% slower in D3... and the D3 engine will be used in lots of games).

Btw, I have not seen new benchies with the new drivers. THAT should be interesting.
 
punjabpolice said:
I just got a GeForce 6800 GT yesterday. i used to have a 9600 Pro and i gotta tell you the GeForce card is crazy. i used to get about 30-40 fps, now i get 90-100 fps in CS:S. it's a good card. but whether you go for the ATI x800 or the 6800, they are almost the same thing

how much is an ATI x800? damn, 30-40 to 90-100 FPS is a friggin awesome increase. my 9600XT 128 MB seems to be holding out decently... i get about 35-45 fps in Source... i want to upgrade but don't have much money.
 
Btw, many people are confusing the issues with last year's cards and this year's.

The R3xx (e.g. 9500-9800) does HL2 in DX9. The NV3x (FX series, 5200-5950) cards do it in an 8.1 mode. The R3xx series is faster.

The new cards (X800/R420 and 6800/NV40) are a different animal. nVidia solved their FX line issues and created a new product with new features. The 6800GT runs HL2 in DX9 just like the X800Pro. No reviews I have seen have said the X800 series has a better IQ (image quality) over the 6800 series. So, faced with paying $380 for a 6800GT or $380 for a X800Pro I would take the 6800GT (and did). Why? Same eye candy, but better performance.

And I am not a fanboy. My last card was a Radeon 9700, which was a GREAT card. I LIKE ATi a lot. But the 6800GT performs better in CS:S and most other games... it is a better $400 investment based on what we know NOW. That could change for HL2... but the 6800GT is a better all-around card nonetheless.
 
It'll work better on an X800, but perfectly fine on a 6800.
 
As long as you have an nvidia card above the FX series (5600-5950 etc) you will get all the effects (so you'd need a GeForce 6 series card).

And for ATi, as long as you have a 9000 series card (anything 9500 and up) you will get all the effects, and of course the x300 and up will give you all the effects too.
 
the nvidias will run it fine.

i must say, i had a 6800gt, but just sold it for a x800xt.

in cs:S the GT would sometimes drop to 35fps in action.


the x800xt is ALWAYS 85-218fps

1280x1024
4xAA 8xAF


also if you want to read benches, i think xbit labs did a report on 11/04 with updated ati drivers.
 
bizzy420 said:
in cs:S the GT would sometimes drop to 35fps in action.

I have never had this happen, the framerate on my GT is rock solid. Did you do a clean HD install when you got the card?
 
Acert93 - great info. i have one issue - i am pretty sure the general consensus of reviews was that ATi's antialiasing IQ was better than nvidia's (but only marginally).

i have an x800XT and a 6800GT for testing my code on. i would recommend anyone who expects their card to last more than 6 months get the 6800 of the current crop. SM3, full 32bit floating point blending, instancing - all these 6800 features will become more important in 6 months.

EDIT: to the original poster - images should look the same, as the card will be running the EXACT SAME shaders.

the only image quality difference will be in the antialiasing, and this will affect EVERY game you play, not just HL2.

by all means, get the GT.
 
So many uninformed, ignorant people spreading propaganda.

The 6800GT performs excellent in CS:S, better than the X800 Pro.

Please, do not listen to the other smart people in this thread saying that the X800 is better
 
Subatomic said:
So many uninformed, ignorant people spreading propaganda.

The 6800GT performs excellent in CS:S, better than the X800 Pro.

Please, do not listen to the other smart people in this thread saying that the X800 is better

The newest 4.12s increase performance across ATi's entire line of cards; for the x800s it's apparently 10+ fps for some people, which puts them above the 6800gt in quite a few cases.

And these are just betas, can't wait for the full release next month :thumbs:
 
Thanks fragShader.

I think ATi uses a rotated grid AA technique, and I have read that the AA can look a little better at the same sample count (i.e. 2x vs 2x, 4x vs 4x). It is not super noticeable, but it is there.

Shapeshifter: Anytime you get a 10FPS jump you need to ask what is going on. Not all 10FPS are the same. 10FPS in Quake 3, when the game already runs at 200+ FPS, is not relevant. 10FPS in CS:S would be HUGE (20% in some cases) and makes you immediately wonder: what optimizations are they doing to get this? What effect will it have on graphical integrity?
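To put a flat 10 fps swing in perspective, here is a quick back-of-the-envelope sketch (not from any benchmark, just arithmetic against the X800Pro FiringSquad scores quoted earlier in the thread):

```python
# What a flat 10 fps gain means relative to the X800Pro's
# FiringSquad scores quoted earlier (1600x1200, 4xAA/16xAF).
baselines = {"Cobble": 70.0, "Aztec": 41.8, "Dust": 49.2, "Italy": 50.8}

for name, fps in baselines.items():
    gain_pct = 10 / fps * 100
    print(f"{name}: +10 fps on {fps} fps is a {gain_pct:.0f}% jump")
```

At ~50 fps baselines (Dust, Italy) a 10 fps jump really is about 20%, which is why it would be a big deal in CS:S but noise in Quake 3.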

The 6800GT leads the X800Pro by a fairly wide margin in CS:S (almost 10FPS), even if ATi can close the gap nVidia will turn right around and do the same thing. I like ATi, but the 6800GT is a better buy at $400 than the X800Pro overall (there are always exceptions).

Anyhow, when paying $400 it is best to do some research. Tomshardware.com, Xbitlabs.com, Firingsquad.com, Anandtech.com, Hardocp.com, and so forth have good info on all of this. Have fun :)
 
Acert93 said:
Thanks fragShader.

I think ATi uses a rotated grid AA technique, and I have read that the AA can look a little better at the same sample count (i.e. 2x vs 2x, 4x vs 4x). It is not super noticeable, but it is there.

Shapeshifter: Anytime you get a 10FPS jump you need to ask what is going on. Not all 10FPS are the same. 10FPS in Quake 3, when the game already runs at 200+ FPS, is not relevant. 10FPS in CS:S would be HUGE (20% in some cases) and makes you immediately wonder: what optimizations are they doing to get this? What effect will it have on graphical integrity?

The 6800GT leads the X800Pro by a fairly wide margin in CS:S (almost 10FPS), even if ATi can close the gap nVidia will turn right around and do the same thing. I like ATi, but the 6800GT is a better buy at $400 than the X800Pro overall (there are always exceptions).

Anyhow, when paying $400 it is best to do some research. Tomshardware.com, Xbitlabs.com, Firingsquad.com, Anandtech.com, Hardocp.com, and so forth have good info on all of this. Have fun :)

:dozey: Before you start pointing the "optimization" finger at ati, maybe you should look at the history of Nvidia.
 
I have a 6800GT, i run at 1280x960 with 2xAA 16xAF - the game looks amazing. roughly 50-70 fps in action, 100 when I'm alone on a map.
 
I have never had this happen, the framerate on my GT is rock solid. Did you do a clean HD install when you got the card?

of course i did man. anyways, what settings u running on?
how much aa and af? for example in bombsite when the bomb blows up, gt = sometimes 30fps. this was a gt @ ultra speeds also.
same spot XT=65fps.


in aztec 16vs16 gt = heavy action 35-40fps
XT= 65 in heavy action.


and people talkin about benches, seriously look at the newer, more recent tests with the new ati drivers.


the old tests didn't use 256mb of the vid ram on the x800's.

gt @ultra 3dmark05 = 5100
xt stock = 5699
 
bizzy420 said:
the old tests didn't use 256mb of the vid ram on the x800's.

gt @ultra 3dmark05 = 5100
xt stock = 5699

Which is one of the bigger improvements in the 4.12 betas - the ram bug is fixed. Along with some other things.
 
the 6800 card gets 50 fps average with all aa/af, 1600x1200, with everything on high. I think I will get that card.. ;)

i'll be playing with no aa/af at 1024x768, so fps will probably be.. hmm, 70? the 6800 gt got at least 70 on the same settings. but i'm going for the 6800 oc because it's from bfg and it's $249
 
Sweet, a good ATI vs nVidia thread. I love these. Let me add my 2 cents, well maybe 4 cents.

I have never and will never buy an ATI card, and here are a few reasons why.

1. Their drivers have a horrible track record. I think high school interns wrote drivers for ATI before they got serious about gaming. I admit that they are better now but they still have a long way to go to catch up with nVidia in my book. Where I work if we hit an nVidia driver bug we can call the nVidia guys on campus and get a quick response and a driver fix. The same scenario with ATI usually takes us two weeks. It has gotten to the point where we don't count bugs that only happen on ATI cards. We just ignore them until we are close to beta.

2. nVidia is better at developer support. Sure, ATI can throw a lot of cash Valve's way, but I doubt that really helped them work out the issues they were having during development. nVidia actually works with development partners to help get bugs resolved before launch, so nVidia card owners don't have to play beta tester for the first week a game is out.

3. There are better engineers over at nVidia. ATI took the lead in the last round because they bought out talent. I am not knocking that strategy - it worked well for MS to buy Bungie, and it worked out well for ATI to buy ArtX and slap their stuff in a card as fast as possible. However, that approach doesn't always yield good results.

I am now rambling and I don't even care what card you buy.

PS. If you are short on cash and can't buy a new generation card for HL2 then I would recommend the ATI cards of the past generation. They are faster, cheaper and the driver issues have mostly been worked out. If you have cash to spend on your ideals like I do then get the 6800 Ultra. You will actually be able to turn on all the DX9x effects that will show up in the next batch of games without having to buy a new card again.
 
I gotta say though, Nvidia has bought out stuff too *cough* 3dfx *cough* and look now, SLI is out.
 
oh, and for you nvidia buffs: drivers WILL increase performance in half-life 2, I am sure nvidia is already working on this. :cheers:
 
Alientank said:
I gotta say though, Nvidia has bought out stuff too *cough* 3dfx *cough* and look now, SLI is out.

You are correct. The FX in the Geforce FX series is from 3dfx. The difference is they took the good engineers from 3dfx and the concepts they were working on and started "building" them into the Geforce architecture. This is a little different from just buying a company and placing their chips on cards and calling it a new series. nVidia started from the ground up with those concepts.
 
i just find it funny how ATI, while still using older technology, can keep on par with nvidia, who has newer technology. Nvidia cards are big and power hogs. ATI's are small and compact. You don't have to give up a PCI slot for your ati card, unless you have a custom cooling system. ATi alone has shown how their drivers can increase performance dramatically without destroying image quality.

Im sorry, but the FX line was a joke. It was totally outclassed by the 9600s, 9700s, and 9800s. These new nvidia cards are great, but FPS isn't everything. You have to look at the other factors. Do you want a large, power-guzzling, noisy card with a 10 fps gain? or a quiet, small, compact card that will do just as well? 10fps does not matter when the FPS range is between 45 and 60.

When ATI comes up with something new other than recycling their tech, they are going to give nvidia a run for their money again, as was done a year ago.
 