So the X800XT kills the 6800U - Stats, Benchmarks etc.

Did I say anything about what's better before your fanboyism outrage... I simply showed you the real truth, and if the PRO will retail at $450 and the 6800 Ultra at $500, I know which I'd prefer any day... durr durr durr :monkee: ATI obviously doesn't care much about OpenGL, and if you really think the Doom 3 engine will be used once, for Doom 3, then you are naive. The Hayabusa is the fastest production road bike (top speed); it doesn't mean for a second that an R1 wouldn't rip it to sh1t overall. ATI is like the Hayabusa... they can only be the best at one thing and crap at the other.
 
gh0st: It shows that the cards are really closer than Anandtech says.


Besides, I wouldn't trust that one from Anandtech, as NO one else had differences that large between the two.
 
Alig said:
I've got something that will make a difference.

You can't show that, because this contradicts exactly what you're posting.

http://www.tomshardware.com/graphic/20040504/ati-x800-30.html

That's outdoor and indoor with no AA/AF at 1024 res, and the X800XT doesn't even come close to the 1280x1024 pic they have up on Anandtech... so obviously someone is talking crap, and I would say it's Anandtech.

Erm.... Call of Duty is based on the Q3 engine (OpenGL)... of course NVIDIA is going to be faster.




....and I don't understand what people are talking about with ATI's poor OpenGL performance. What is the difference between 3000fps and 4000fps? :hmph:
 
Alig said:
Did I say anything about what's better before your fanboyism outrage... I simply showed you the real truth, and if the PRO will retail at $450 and the 6800 Ultra at $500, I know which I'd prefer any day... durr durr durr :monkee: ATI obviously doesn't care much about OpenGL, and if you really think the Doom 3 engine will be used once, for Doom 3, then you are naive. The Hayabusa is the fastest production road bike (top speed); it doesn't mean for a second that an R1 wouldn't rip it to sh1t overall. ATI is like the Hayabusa... they can only be the best at one thing and crap at the other.


Ahahah... talk about the exact opposite of reality.
 
*Peeks in.
What in the hell is going on in here? Seriously, I mean, what is wrong with anyone arguing over this stuff, especially when it was all just first revealed a day or two ago? This stuff doesn't have any immediate impact anyway, so there's plenty of time to see how it pans out. Wait until all the facts are nailed down.
*Goes away
 
Direwolf said:
*Peeks in.
What in the hell is going on in here? Seriously, I mean, what is wrong with anyone arguing over this stuff, especially when it was all just first revealed a day or two ago? This stuff doesn't have any immediate impact anyway, so there's plenty of time to see how it pans out. Wait until all the facts are nailed down.
*Goes away

You don't understand... it's a ritualistic thing between nerds :LOL:
 
Alig said:
Did I say anything about what's better before your fanboyism outrage... I simply showed you the real truth, and if the PRO will retail at $450 and the 6800 Ultra at $500, I know which I'd prefer any day... durr durr durr :monkee: ATI obviously doesn't care much about OpenGL, and if you really think the Doom 3 engine will be used once, for Doom 3, then you are naive. The Hayabusa is the fastest production road bike (top speed); it doesn't mean for a second that an R1 wouldn't rip it to sh1t overall. ATI is like the Hayabusa... they can only be the best at one thing and crap at the other.

If you are going to buy a graphics card for one game, be my guest. But if you're going to be a hypocrite, when virtually every hardware site says ATI's new cards are better than NVIDIA's, and ignore what everyone else is saying, then you are just stupid. And if you think the X800XT won't run Doom 3, then you're kidding yourself, because I'm damn sure my 9800 Pro will run it.

And just out of curiosity, how would a slower bike rip a faster bike to shit? Does the slower bike have bigger wheels and a leather seat?
 
Alig said:
I've got something that will make a difference.

You can't show that, because this contradicts exactly what you're posting.

http://www.tomshardware.com/graphic/20040504/ati-x800-30.html

That's outdoor and indoor with no AA/AF at 1024 res, and the X800XT doesn't even come close to the 1280x1024 pic they have up on Anandtech... so obviously someone is talking crap, and I would say it's Anandtech.
1. They use different recorded demos that may stress the cards differently.
2. They used different test platforms:
THG = Intel 3.2GHz
Anandtech = AMD 3400+

Although you can see that those numbers do soar once AA/AF is enabled.
Link
 
OCybrManO said:
If it is supported in Doom 3 it might either close the gap in performance between ATi and nVidia or make the ATi version look better than the nVidia version
I've read that 3Dc actually costs a slight bit of performance, so if it's used in Doom 3, only the IQ will improve.
The reason I added "remains to be seen" to the 3Dc and SM3.0 features is that right now it's all marketing, with no immediate advantage.
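To picture where that slight 3Dc cost comes from: the format stores only the X and Y components of each tangent-space normal, and the third component has to be rebuilt per pixel in the shader. A rough sketch of that reconstruction in Python (illustrative only, not ATI's actual hardware path):

```python
import math

def reconstruct_normal(x: float, y: float) -> tuple:
    """Rebuild a unit normal from the two channels 3Dc stores.

    3Dc keeps only X and Y; Z is recomputed in the pixel shader,
    which is the small extra ALU cost mentioned above.
    """
    # The clamp guards against rounding pushing 1 - x^2 - y^2 below zero.
    z = math.sqrt(max(0.0, 1.0 - x * x - y * y))
    return (x, y, z)

# Example: a normal tilted along X.
print(reconstruct_normal(0.6, 0.0))  # -> (0.6, 0.0, 0.8)
```

The trade is a few extra shader instructions in exchange for storing two channels instead of three, which is why it improves image quality per byte of texture memory rather than raw speed.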

crabcakes66 said:
....and i dont understand what people are talking about ATI's poor OGL performance? what is the differance beetween 3000fps and 4000fps :hmph:
Wolfenstein: ET (1600x1200 4xAA 8xAF) :
6800U : 73.9 fps
X800XT: 65.8 fps

Call Of Duty (1600x1200 4xAA + 8xAF) :
6800U : 82.7 fps
X800XT: 71.3 fps

And these are average framerates; the minimum framerate will be lower.
The difference will most likely be bigger in Doom 3, as the NVIDIA card can do 32 Z/stencil-buffer operations per clock cycle and has UltraShadow.

I've read somewhere that ATI has started to rewrite their OpenGL code from the ground up, so perhaps things will improve in this area.
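For scale, the gaps in those two benchmarks work out to roughly 12% and 16% in the 6800U's favour. A quick sanity check, using the numbers quoted above:

```python
def pct_lead(a: float, b: float) -> float:
    """Percentage by which framerate a leads framerate b."""
    return (a - b) / b * 100.0

# Wolfenstein: ET, 1600x1200 4xAA 8xAF
print(round(pct_lead(73.9, 65.8), 1))  # -> 12.3
# Call Of Duty, 1600x1200 4xAA 8xAF
print(round(pct_lead(82.7, 71.3), 1))  # -> 16.0
```

So "a few fps" here is really a double-digit percentage lead at that resolution, even if both cards stay well above playable.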
 
Being beaten by a couple of fps is not "pwned". Anyway, NVIDIA is not sitting around on this... they should be releasing the GeForce 6800 Ultra Extreme sometime soon. All I can say is it's going to get interesting these next couple of months...
 
crabcakes66 said:
Erm.... Call of Duty is based on the Q3 engine (OpenGL)... of course NVIDIA is going to be faster.




....and I don't understand what people are talking about with ATI's poor OpenGL performance. What is the difference between 3000fps and 4000fps? :hmph:


Right then... so you just landed yourself in it. If a few fps isn't anything to get fussed about, then why are you in this thread? The ATI only beats the NVIDIA by a 'few' fps. Nuff said.

And just out of curiosity, how would a slower bike rip a faster bike to shit? Does the slower bike have bigger wheels and a leather seat?

Because it is only fastest in a straight line... roads aren't straight, race tracks aren't straight, and it would still lose off the mark against a 600cc bike. It is geared long, so it can get up to stupidly fast speeds, 220+, whereas an R1 tops out at about 180-190 but takes a mere 2.5 seconds or so to get to 60 and doesn't stop pulling hard for a long time. That's how a faster bike gets ripped to shit, not to mention how big and weighty a 1300cc Hayabusa is when an R1 weighs about 180kg.
 
Besides, as soon as we get a review of a Gainward Golden Sample, things will change :p
 
SidewinderX143 said:
Besides, as soon as we get a review of a Gainward Golden Sample, things will change :p

Heh, not when the Asus, Hercules and Sapphire versions of the x800 get released.
 
oD1Nz said:
I find X-bit Labs' previews are always pretty good.

http://www.xbitlabs.com/articles/video/display/r420-2.html.

Plus, them naughty boys are using the stolen HL-2 build to benchmark with: http://www.xbitlabs.com/articles/video/display/r420-2_12.html.

And stalker. :)

I don't play OpenGL games myself except those based on the Quake 3 engine, which means I won't be scraping the bottom in fps. That's why a few fps doesn't matter to me in that area.

When fps matters is in the new or current games I'll be playing.
In new DX9 games I don't have very many fps to spare, and the X800 comes out on top. In the other games I do play, the X800 wins the majority, and basically wins them all once AA/AF is enabled and still playable.
It still holds the quality shaders in Far Cry etc. (Digit-Life has pics).

So do I want 40fps with poor shader IQ in that new DX9 game and 250fps in that OpenGL game,
or 70fps with awesome shader IQ in the new DX9 game and 230fps in the OpenGL game?
If we are talking Wolf: ET, then 180 vs 160.
Then I can add AA/AF with almost no performance hit, or Temporal AA. ;)

That's my perspective, and why the 6800 hits its strong points in something I don't use, while the X800 is all over my needs.
 
http://www.hexus.net/content/reviews/review.php?dXJsX3Jldmlld19JRD03NTgmdXJsX3BhZ2U9MTY=

I've already posted this, but I'm going to post it again because I really, really like it.


On a side note, in 80% of the shader tests I've seen ATI wins by a significant margin, but then there are those where NVIDIA wins by a significant margin; whether or not those are that important is debatable. On game tests it seems to vary from review to review, but I would say they are equal. However, NV's 8x AA mode is unusable, whereas ATI's 6x is, which is something to bear in mind.
 
I'm still deciding which card I'm gonna get... even though I'm an NVIDIA fanboy, the X800 looks impressive... I wanna know how well the NV40U will handle Doom 3, HL2, HL3, the HL2 expansion pack... mods... CZ... etc... If someone's got an NV40U, post a pic of the fps you get in that DX9 shader test.
 
XenoSpirit said:
I'm still deciding which card I'm gonna get... even though I'm an NVIDIA fanboy, the X800 looks impressive... I wanna know how well the NV40U will handle Doom 3, HL2, HL3, the HL2 expansion pack... mods... CZ... etc... If someone's got an NV40U, post a pic of the fps you get in that DX9 shader test.
Then wait. ;)
Unless you wanna judge performance based on leaks, alphas, or betas.
2 Highly Anticipated Next Generation DX9 Games

mrchimp said:
http://www.hexus.net/content/reviews/review.php?dXJsX3Jldmlld19JRD03NTgmdXJsX3BhZ2U9MTY=

I've already posted this, but I'm going to post it again because I really, really like it.


On a side note, in 80% of the shader tests I've seen ATI wins by a significant margin, but then there are those where NVIDIA wins by a significant margin; whether or not those are that important is debatable. On game tests it seems to vary from review to review, but I would say they are equal. However, NV's 8x AA mode is unusable, whereas ATI's 6x is, which is something to bear in mind.
I saw that, hehe.
Pretty interesting for being an intensive test.

I just think the X800 matches most gamers' needs better.
There is nothing 'wrong' with the 6800 (except those Far Cry shader pics), but it just doesn't shine where most gamers want it to.

Ruby wireframes
 
XenoSpirit said:
I'm still deciding which card I'm gonna get... even though I'm an NVIDIA fanboy, the X800 looks impressive... I wanna know how well the NV40U will handle Doom 3, HL2, HL3, the HL2 expansion pack... mods... CZ... etc... If someone's got an NV40U, post a pic of the fps you get in that DX9 shader test.

You won't have to worry; these next-generation cards won't begin to ship until the end of May. It will probably be a while before there is enough supply to meet demand.
 
blahblahblah said:
You won't have to worry; these next-generation cards won't begin to ship until the end of May. It will probably be a while before there is enough supply to meet demand.
Well, they are shipping to stores like Best Buy and CompUSA ASAP and should be on shelves around the 20th-26th at the latest.


ATI's Richard Huddy - Some answers on the Radeon X800
DW: The whole "8 extreme pipelines" and all the "extreme pipeline" talk was just a smokescreen to give the rumor mill something to chew on, right? Could you elucidate a bit on it? (BTW-It worked great, you have no idea how many nights I spent trying to figure out what that means....but I forgive ya.)

Richard Huddy: It was always a smokescreen. It takes so long to build this hardware that you'd be amazed it managed to fool anyone. There are people who believe that we cut and pasted the extra 4 pipelines in about a month ago. They're wrong - by many months!

It was very important to us that NVIDIA did not know exactly where to aim. As a result they seem to have over-engineered in some aspects creating a power-hungry monster which is going to be very expensive for them to manufacture. We have a beautifully balanced piece of hardware that beats them on pure performance, cost, scalability, future mobile relevance, etc. That's all because they didn't know what to aim at.

And you wouldn't believe how few people inside ATI knew the exact data. If you want to keep it as a secret, you need to keep it secret.
Interview with ATI's Dave Orton

Doom3 Tidbits
HardOCP said:
This puts the latest-gen ATI (9700+) hardware and the new silicon from NVIDIA on "even ground" which will make the people more interested in benchmark numbers than actual gameplay very very happy.
Gamers-Depot said:
After talking with quite a few top-level developers, all of whom are working on future titles, we're not thoroughly convinced that SM3.0 will be a must-have feature anytime soon - certainly not for the remainder of 2004.

Some of these might have already been linked before.
Oh and I'm bored, if you can't tell. :)
 
Arno said:
I've read that 3Dc actually costs a slight bit of performance, so if it's used in Doom 3, only the IQ will improve.
The reason I added "remains to be seen" to the 3Dc and SM3.0 features is that right now it's all marketing, with no immediate advantage.


Wolfenstein: ET (1600x1200 4xAA 8xAF) :
6800U : 73.9 fps
X800XT: 65.8 fps

Call Of Duty (1600x1200 4xAA + 8xAF) :
6800U : 82.7 fps
X800XT: 71.3 fps

And these are average framerates; the minimum framerate will be lower.
The difference will most likely be bigger in Doom 3, as the NVIDIA card can do 32 Z/stencil-buffer operations per clock cycle and has UltraShadow.

I've read somewhere that ATI has started to rewrite their OpenGL code from the ground up, so perhaps things will improve in this area.


Yeah, but realistically, how many people run Q3/OpenGL games at that resolution?

I hate 1600x1200... I can't see a damn thing, and that's on a 21" monitor.

Less than 10% would run anything at that res, even with a newer card.

I'd be much happier running 1280x1024 with 6xAA 16xAF.
 