Gabe claimed the X800 is 40% faster than

X800 = dunno
X800 Pro = $399 (out now, go to Best Buy on Monday ;))
X800 XT PE = $499
GeForce 6800 GT = dunno
GeForce 6800 Ultra = $399 or $499
GeForce 6800 Ultra Extreme = dunno or $499
 
40% faster... sounds like Gabe's talking out of his arse again.
 
kendo said:
40% faster... sounds like Gabe's talking out of his arse again.

And you would know this how?

2 posts in 10 months and not 1 positive comment :)

EDIT: sorry for going OT there, but I'm stressed today :)
 
CoreyGH said:
No one said anything about the recent benchmarks being "a significant victory for the 6800". There have, however, been several comments about the X800XT being the absolute best at everything which is simply not true.

And as for "not trusting" the Anandtech review simply because it tests a larger number of games than the others reveiw; I submit that THAT is absurd. Anandtech was one of (if not THE) first sites to point out Nvidia's poor AA and AF quality back in the early 9700 days. Their reveiws are highly informative, technical, and non-biased.


Is that why the majority of games benched are TWIMTBP games...?
 
CoreyGH said:
No one said anything about the recent benchmarks being "a significant victory for the 6800". There have, however, been several comments about the X800XT being the absolute best at everything which is simply not true.
Sure, I'll acknowledge that, and I did say the feature-set is a potential winner regardless. It just shits me when people try to "balance" victories in more meaningful benchmarks of games which people actually play by citing a string of relatively modest 'wins' in titles which are of mostly academic interest because they are either ancient, shit games, already past 100 FPS on both cards, or a repetition of an already acknowledged win in a particular engine. Admittedly it is up to the reader to determine what's shit, but Aquanox and its Krass engine really get a better run than they deserve. It is basically a crap game written under heavy influence from Nvidia - who gives a shit. Now take STALKER – there is a game written under Nvidia's direction and TWIMTBP which actually deserves to be a persuasive benchmark…

In regards to repetition of engines - well, if you know how to read benchmarks and reviews then that is good, and you can take my inflammatory comment with a grain of salt; however, for Joe Bloggs it is a different story. Most people just aggregate the numbers, regardless of which benchmarks should be considered more influential, and so I do consider a numerically unbalanced benchmark selection to be misleading without appropriate commentary.

I guess it also reminds me a bit of when the R300 was just cutting up the FX and people were citing the FPS of Q3 @ 640x480 as a counter-point. FFS, who cares whether you get 300 FPS or 275 - it is entirely irrelevant... I can understand why some sites do it, but seriously it is of very questionable value to the end-user interested in getting a card that performs well at the resolutions they play at, in the games they play atm and in the future.

And as for "not trusting" the Anandtech review simply because it tests a larger number of games than the others reveiw; I submit that THAT is absurd. Anandtech was one of (if not THE) first sites to point out Nvidia's poor AA and AF quality back in the early 9700 days. Their reveiws are highly informative, technical, and non-biased.
Anandtech was very good, and still is most of the time - I'll give you that, but they're hardly perfect. The only unbiased site regarding graphics cards I've ever encountered is Beyond3D, though the detail is probably a bit beyond your average netizen and I’m not being condescending either – it is a pretty damn hardcore community.
 
When Gabe says, "ATi product X is 40% faster", what he really means is, "ATi is giving me a big fat bonus to pimp their products."
 
Wilco said:
And you would know this how?

2 posts in 10 months and not 1 positive comment :)

EDIT: sorry for going OT there, but I'm stressed today :)

Actually I agree with him, I just wouldn't put it so bluntly.
 
Mountain Man said:
When Gabe says, "ATi product X is 40% faster", what he really means is, "ATi is giving me a big fat bonus to pimp their products."

or proves nVidia sucks
 
I think he's just doing this because of their contract from last year.
 
Perhaps nVidia cards just run below par on HL2?

No vid yet. Seems that creamhackered still hasn't got his firewire. :p
 
Letters said:
Graphics cards being $500 doll0rz = poopoo!

my PC, my ENTIRE, HL2-ready PC... cost $100 more than this 1 part... BUT JUST LOOK AT THOSE NUMBERS!!! :: drools ::
 
so how many hundreds of dollars will this card run?
 
from Creamhacked, who reported the Neowin story:

creamhacked said:
Hopefully we’ll have the half life 2 video off the camera by the end of today and it will be going straight to bit torrent sites etc. No promises on that but we’re working on it.

eeek!
 
Can't they just tell us what happens in the vid in more detail? How about they draw us pictures in ASCII art. :)
 
Sorry, off topic, but Apos I PMed you about the 9800 Pro, please hold it for me, I'll take it!!!
 
chu said:
so how many hundreds of dollars will this card run?

ATI X800 Pro = $399
Nvidia 6800 GT = $399

X800 XT Platinum Edition = $499
Nvidia 6800 Ultra = $499

Nvidia will have another card called the 6800 Ultra Extreme which might go for $599.

I would go for the X800 Pro, which holds up against the 6800 Ultra for $100 less, but the only thing stopping me is that ATI's cards are not future-ready: they don't have 32-bit floating-point precision or DirectX 9.0c (Shader Model 3.0) support, which future games might use.
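
As an aside on the Shader Model 3.0 point: a game can ask the driver what the installed card actually exposes through the Direct3D 9 caps structure, which is how an engine would decide whether to enable an SM 3.0 path. This is only a minimal sketch of how such a check might look (it assumes the DirectX 9 SDK headers and linking against d3d9.lib; the messages are made up for illustration):

```cpp
// Minimal sketch: query the Direct3D 9 caps to see whether the installed
// card exposes Shader Model 3.0 (build against the DirectX 9 SDK, link d3d9.lib).
#include <d3d9.h>
#include <cstdio>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d)
        return 1;

    D3DCAPS9 caps;
    if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
    {
        // PixelShaderVersion / VertexShaderVersion encode the highest shader
        // model the driver reports; SM 3.0 parts report 3.0, while the X800
        // series tops out at the 2.0 profile.
        bool sm3 = caps.PixelShaderVersion  >= D3DPS_VERSION(3, 0) &&
                   caps.VertexShaderVersion >= D3DVS_VERSION(3, 0);
        std::printf(sm3 ? "Shader Model 3.0 path available\n"
                        : "Falling back to a Shader Model 2.0 path\n");
    }

    d3d->Release();
    return 0;
}
```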
 
OK here is how the cards will perform in Half Life 2:

[benchmark chart: dx1ud_1600_pure.gif]


even at 1600 and 8xAA and 16 AF the 6800 Ultra only lags 10 frames, so where did that 40% claim come from?
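
For what it's worth, a quick sanity check on the arithmetic (the chart's exact numbers aren't quoted here, so the baselines below are purely illustrative): the relative advantage is (ATI fps minus Nvidia fps) divided by Nvidia fps. A 10-frame gap only works out to "40% faster" if the 6800 Ultra were sitting around 25 fps (10 / 25 = 0.40); at 60 fps the same gap is roughly 17% (10 / 60 ≈ 0.17).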
 
darkmistx said:
OK here is how the cards will perform in Half Life 2:

[benchmark chart: dx1ud_1600_pure.gif]


even at 1600 and 8xAA and 16 AF the 6800 Ultra only lags 10 frames, so where did that 40% claim come from?

God dammit, leave the stolen build alone. Stupid XBit Labs needs to STOP benchmarking it.
It's crap, not optimized properly, has very few of the promised HL2 features, etc.; it's NO indication of how well the final product will run.
 
I love how this thread looks like it is entitled "Gabe claimed the X800 is 40% faster than G0rgon"

Yeah, take that, you slowpoke G0rgon!
 
Just for clarification, where does it say they will be releasing this 3-minute video?
 
That's so out! Releasing another X800 card for HL2, and I just got my 9800 XT!!!
 
Oooh, sorry, I do, lol, sorry :) I didn't know, but will the 9800 XT still be great for the game?
 
Just because the next range of cards comes out doesn't make the last range obsolete. I'd say anything above a 9600 Pro is gonna get a full feature experience; the frame rates will be better the higher you go, of course, but I shouldn't worry :O
 
Lobster said:
I'd say anything above a 9600 Pro is gonna get a full feature experience; the frame rates will be better the higher you go, of course

That's what I pray for
 
If I get the X800 XT, will I experience any bottlenecks with the following specs:

AMD Athlon 64 FX-51 2.2GHz
1024 MB ECC Registered PC3200 RAM
420 Watt Power Supply
ASUS SK8N mobo

Keep in mind that even though my processor is 2.2 GHz, it is 64-bit and has a small edge over the Pentium 4 Extreme Edition despite being 1 GHz slower in clock speed.
 
Yes Moto, a top-of-the-line processor/RAM will be a huge bottleneck for a top of the line graphics card :x
 
darkmistx said:
even at 1600 and 8xAA and 16 AF the 6800 Ultra only lags 10 frames, so where did that 40% claim come from?
I told you, it came from Gabe's desire to help ATi sell graphics cards. To put it another way, you could say Gabe pulled the number from a part of his anatomy where the sun usually doesn't shine.
 
Gabe's promoting ATI, there's no doubt in my mind.

What I wonder is why ATI in particular? Have they and Valve done promotions together before? Couldn't Nvidia just as easily have made a deal to promote their cards?

All this talk makes my 9600p look like junk. :|
 
Yeh, lol, this is pretty funny. WTF are we nattering about that we haven't before? These new GPUs are easily able to handle any top game, whichever game, on the highest graphical settings. Either card will handle games to come, easily, over the next year at least.

So, who gives a carrot anymore? Whichever card will do :angel: , both cards from both companies exceed the capacity to play the game in good enough detail for it to be enjoyed. So I say, WHICHEVER ONE IS CHEAPER ;)
 