Gabe claimed the X800 is 40% faster than the GeForce 6800 in Half-Life 2.

Shuzer said:
Yes Moto, a top-of-the-line processor/RAM will be a huge bottleneck for a top-of-the-line graphics card :x

Well, someone said that 2.4 GHz was barely enough to get by, and mine is 2.2 GHz... I was wondering if there would be an exception with 64-bit CPUs.

Well, thanks for calling it top-of-the-line. :naughty:
 
Hmm, seems a bit pitiful that they test-rendered it at 640 by 480. At least, that's what the sample image is telling me.

I'd much prefer to see some high-resolution comparisons.
 
Should I circle the 640x480 for you on these pics too?
Link
 
lol, :p, anyway that's not my point, I did it just in case anyone can't read the numbers, hehe.

For all I know the performance changes completely at 1600 by 1200. Who knows?
 
clarky003 said:
lol, :p, anyway that's not my point, I did it just in case anyone can't read the numbers, hehe.

For all I know the performance changes completely at 1600 by 1200. Who knows?

Maybe that's the only res the 6800 shows up on the chart at? :p
 
-Viper- said:
Gabe's promoting ATI, there's no doubt in my mind.

What I wonder is why ATI in particular? Have they and Valve done promotions together before? Couldn't Nvidia just as easily have made a deal to promote their cards?

[Sarcasm detection]Searching...searching...searching...None found...[/Sarcasm detection]
 
Varsity said:
The cards are out now.

No, they aren't. The X800 Pro is out (released on the 4th) and the X800 XT is being released on the 21st of this month (May).

Spiffae said:
don't forget that the FX6800 has TWO molex power connectors and requires a 500w power supply minimum... it can draw 100w of power!

the X800 draws less power than a 9800 Pro, and less power = less heat = quieter cooling.

and it's only one slot, vs. 2 for the 6800 ultra.

It's the GeForce 6800, not the FX6800 :p
Also, the minimum recommended PSU for the 6800U is 480W, not 500W

But apart from me being a retarded nit-picking useless swine, I completely agree. Can't wait to purchase the X800 XT :)
 
Oh, I thought it was the GeForce FX 6800 Ultra Extreme Edition (Platinum Pro)
 
Nope, the FX series was really the GeForce 5 series, and the new 6800 cards are all GeForce 6 series. The FX name easily throws everyone off; I hate how they did that. Damn marketing :(
 
clarky003 said:
lol, :p, anyway that's not my point, I did it just in case anyone can't read the numbers, hehe.

For all I know the performance changes completely at 1600 by 1200. Who knows?

At 1600x1200 my 128MB graphics card runs out of memory; at about 1280x960 it goes down to about 11fps from 38fps. If I switch off DOF and multisampling then it goes back up to around 24-25fps.

I would say the X800 XT would run at about 45-50fps at 1280x960 if it's anything like the 9800. As it's based on the R3xx core and its performance patterns are very similar, I would say that's a fair assumption.
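
To put rough numbers on why a 128MB card chokes at 1600x1200 with multisampling on, here's a back-of-the-envelope sketch. The buffer formats and the 4x MSAA level are my assumptions, not measured figures:

[code]
# Estimate how much of a 128MB card the framebuffer alone eats at a given
# resolution and multisample level (illustrative assumptions, not measurements).

def framebuffer_mb(width, height, msaa_samples=1):
    color = width * height * 4 * msaa_samples   # RGBA8 color buffer
    depth = width * height * 4 * msaa_samples   # 24-bit depth + 8-bit stencil
    return (color + depth) / (1024 * 1024)

for width, height in [(1280, 960), (1600, 1200)]:
    mb = framebuffer_mb(width, height, msaa_samples=4)
    print(f"{width}x{height} @ 4x MSAA: ~{mb:.0f} MB before any textures")
[/code]

At 1600x1200 with 4x MSAA that's roughly 59 MB gone before a single texture is loaded, so it's no surprise a 128MB card starts swapping.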
 
Abom said:
There's unfortunately a lot of variation in performance between the different reviews. HardOCP's review shows the X800 XT absolutely smoking the 6800U, while something like Anand's review shows a minor victory for the X800 XT. To be honest, at this time I actually trust the HardOCP review the most. They go into much deeper detail on the architecture, features, etc. than any of the other reviews.


How can you say that? I'm not saying you shouldn't trust the HardOCP review, but the part about it going into much deeper detail on the architecture seems off. I thought Anandtech did a pretty nice review; there were about seven pages before they showed benchmarks.


Quick question: should I go with a 9800 Pro 128MB now, and then buy something more advanced, let's say, near the end of the year?
 
Death.Trap said:
How can you say that? I'm not saying you shouldn't trust the HardOCP review, but the part about it going into much deeper detail on the architecture seems off. I thought Anandtech did a pretty nice review; there were about seven pages before they showed benchmarks.


Quick question: should I go with a 9800 Pro 128MB now, and then buy something more advanced, let's say, near the end of the year?

I'm personally going to wait until all these new cards are released and then see how much the prices of the 9800 and 5900 come down.
 
Moto-x_Pat said:
Well, someone said that 2.4 GHz was barely enough to get by...
Apparently, that "someone" doesn't know what he's talking about. Half-Life 2 will be more dependent on your graphics card than anything else.
 
Death.Trap said:
How can you say that? I'm not saying you shouldn't trust the HardOCP review, but the part about it going into much deeper detail on the architecture seems off. I thought Anandtech did a pretty nice review; there were about seven pages before they showed benchmarks.


Quick question: should I go with a 9800 Pro 128MB now, and then buy something more advanced, let's say, near the end of the year?

It's the way that HardOCP went into further comparisons on the AA and AF aspects, providing a full page or so of information. They also provided screenshot comparisons as well as tables, and commented on how they differed... just stuff like that, allowing them to delve a little deeper.
 
Mountain Man said:
Apparently, that "someone" doesn't know what he's talking about. Half-Life 2 will be more dependent on your graphics card than anything else.

Either Rick E or Gabe said that after about 1.2GHz it doesn't make much difference... or was it 2.2... I can't remember, but the GFX card will make the most difference. You won't be able to hit a CPU bottleneck at 1600x1200 with about 2.2GHz under the hood (especially if it's an AMD :LOL: )
 
I just got this great answer from Valve about whether they were going to support PS3.0.

I like ATI's X800 because

- normal map compression
- support for longer shader programs
- it's in stores next week

Solving real problems and actually being available to buy are pretty cool.



--------------------------------------------------------------------------------
From: Wilco [mailto:[email protected]]
Sent: Wednesday, May 05, 2004 12:04 PM
To: Gabe Newell
Subject:


Hi.



There's a lot of excitement around the new NVidia and ATi graphics cards, and obviously a lot of fanboyism. NVidia's 6800U supports DX9.0c and Shader Model 3.0, unlike ATi's.

Now we all know you're fond of ATi, but will you be updating Source to use Shader Model 3.0 in the near-ish future (like, say, before ATi bring out their next-gen card :) ) and take advantage of the new features, or do you think they don't really add much to graphics performance?



Thanks in advance.



Wilco.

Lovely PR talk; it neatly sidesteps the actual question and manages to praise ATi at the same time!

And NVidia's 6800U also supports longer shaders (and branching etc., which ATi's doesn't?).

So they'll probably implement PS3.0 shaders for NVidia hardware, but do it on the quiet so nobody notices.
 
Mountain Man said:
Apparently, that "someone" doesn't know what he's talking about. Half-Life 2 will be more dependent on your graphics card than anything else.

The quote was taken out of context. What the guy said was that a 2.4 GHz CPU should be enough to get by without causing a bottleneck on the X800 (not sure if this is true), not that it'll take a 2.4 GHz CPU to run HL2.
 
mrchimp said:
Either Rick E or Gabe said that after about 1.2GHz it doesn't make much difference... or was it 2.2... I can't remember, but the GFX card will make the most difference. You won't be able to hit a CPU bottleneck at 1600x1200 with about 2.2GHz under the hood (especially if it's an AMD :LOL: )
Yeah, it was 1.2 for no further performance increase or whatever. It was 2.4 for not getting a bottleneck with an X800.
 
mortiz said:
The quote was taken out of context. What the guy said was that a 2.4 GHz CPU should be enough to get by without causing a bottleneck on the X800 (not sure if this is true), not that it'll take a 2.4 GHz CPU to run HL2.

I have no idea if it's true at all. I was just estimating. If you want a top-of-the-range video card, you're going to need to have other hardware that can match it.
 
Abom said:
There's unfortunately a lot of variation in performance between the different reviews. HardOCP's review shows the X800 XT absolutely smoking the 6800U, while something like Anand's review shows a minor victory for the X800 XT. To be honest, at this time I actually trust the HardOCP review the most. They go into much deeper detail on the architecture, features, etc. than any of the other reviews.

But yeah, I agree with your last point. Both cards are running on uncertified beta drivers... I'd like to see a rematch once official drivers are released.


How can you trust HardOCP? They're the only site that shows the X800 XT totally smoking the 6800 Ultra like it's an MX 64MB card... they talk crap, because every other site shows nothing of the sort.

Plus, when the 9700 Pro came out, HardOCP were crowned the ATidiots and Anandtech the nVidia fanbois... if I want a review I go to Tomshardware, not some fanboi site.

The X800 XT may be better than the 6800 Ultra, but I would hardly call it smoked. Plus nVidia will run the Doom engine better and ATi the HL2 engine; it's tit for tat all the way, and as ATi are shite for support I know which one I'll be buying, benchmarks aside. And as for stability issues in games... well, the ATi gets smoked.
 
How can you trust HardOCP? They're the only site that shows the X800 XT totally smoking the 6800 Ultra like it's an MX 64MB card... they talk crap, because every other site shows nothing of the sort.

They show it doing better, yes, but the way they've drawn the table makes it look more impressive. It's just a few resolutions higher, with maybe some extra AA and AF in a few games, but it's still the leading card. There's no denying that.

Plus, when the 9700 Pro came out, HardOCP were crowned the ATidiots and Anandtech the nVidia fanbois... if I want a review I go to Tomshardware, not some fanboi site.

So that's the same tomshardware that's renowned for being incredibly biased towards nVidia? Okay.

The X800 XT may be better than the 6800 Ultra, but I would hardly call it smoked. Plus nVidia will run the Doom engine better and ATi the HL2 engine; it's tit for tat all the way, and as ATi are shite for support I know which one I'll be buying, benchmarks aside. And as for stability issues in games... well, the ATi gets smoked.

Make your mind up. You start the paragraph by saying that the X800 XT is the better card, then go and put it down shamelessly. ATi are no longer 'shite' for support; they make drivers that are on par with (if not better than) nVidia's now. They've really cleaned up their act, and as for stability issues in games, I honestly have no damn idea what you're talking about. Start making sense.
 
Two things...

You don't know if the NVIDIA card will run Doom 3 better, since the game isn't out, and when JC said that it would, the Radeons hadn't even dropped, so no one knows now.

Second... I think the Ruby demo sucked.
 
Start making sense.

Or better yet... try making a Radeon 9x00+ series card running Catalyst 4.4 drivers crash or become "unstable."

Sounds to me like your last ATI experience was the Rage 128 Pro.
 
amneziac85 said:
Two things...

You don't know if the NVIDIA card will run Doom 3 better, since the game isn't out, and when JC said that it would, the Radeons hadn't even dropped, so no one knows now.

Second... I think the Ruby demo sucked.

Doom 3 = OpenGL
nVidia cards run the OpenGL codepath better than the Radeon counterparts

Therefore we can assume that nVidia's cards will run better than the Radeon card equivalents
 
Shuzer said:
Doom 3 = OpenGL
nVidia cards run the OpenGL codepath better than the Radeon counterparts

Therefore we can assume that nVidia's cards will run better than the Radeon card equivalents

Ahh, I see, that seems to have slipped my mind. But still, it's always an assumption until some benches come in. NVIDIA probably will, but it's still unknown.
 
It's not like many people are buying cards just for DOOM3 or HL2 anyway. (I mean normal people, not people in this forum :D )
 
Well, they should be. The Doom 3 and HL2 engines are going to be the two big heavyweight engines for the next couple of years.

My bet goes on ATI, just because it has normal map compression (3Dc).
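
For a rough sense of why 3Dc matters, here's a quick bit of arithmetic. The 4:1 ratio over uncompressed 32-bit storage is the commonly quoted figure, so treat this as an estimate, not a spec sheet:

[code]
# Rough savings from 3Dc normal-map compression (assumed 4:1 vs. RGBA8).
width = height = 1024
uncompressed_mb = width * height * 4 / 2**20   # RGBA8: 4 bytes per texel
compressed_mb = uncompressed_mb / 4            # 3Dc: ~1 byte per texel
print(f"1024x1024 normal map: {uncompressed_mb:.0f} MB uncompressed, "
      f"~{compressed_mb:.0f} MB with 3Dc")
[/code]

A 4 MB map shrinking to about 1 MB means either more normal maps resident in video memory, or higher-resolution ones at the same cost.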
 
Shuzer said:
Doom 3 = OpenGL
nVidia cards run the OpenGL codepath better than the Radeon counterparts

Therefore we can assume that nVidia's cards will run better than the Radeon card equivalents

"That is a falsity, Judge!" "A falsity? Oh, you mean like an untruthitude" Homeworld 2 is an OpenGL game and the X800XT runs it much better than any Nvidia hardware.
 
CoreyGH said:
"That is a falsity, Judge!" "A falsity? Oh, you mean like an untruthitude" Homeworld 2 is an OpenGL game and the X800XT runs it much better than any Nvidia hardware.

Well, like I said, we can assume... plus Carmack has said D3 runs better on nVidia hardware, but looks better on ATi.
 
These 'my company's dick is bigger than your company's dick' threads always amuse me. I don't see why some people feel the need to pledge allegiance to one company; in the end it comes down to the cards, not the companies. As consumers we have the ability to compare two products, judge which is the better and make our purchases accordingly.

Why do people feel the need to defend their 'chosen' company's honour? Let the products speak for themselves. So please, for the sanity of us all, stop the company fanboi-ism; it just makes you look retarded.
 
Alig said:
How can you trust HardOCP? They're the only site that shows the X800 XT totally smoking the 6800 Ultra like it's an MX 64MB card... they talk crap, because every other site shows nothing of the sort.

You should really look at the games these sites used to benchmark...
HardOCP used mostly new and popular games, while THG used games from Nvidia's "The Way It's Meant to Be Played" program, and Anandtech did some as well.
Xbit had the most diverse set of games.
Also pay attention to the drivers and testbeds they used.
That all adds into the POV you get when looking at these GFX card reviews.

Every site I saw basically said the X800 was today's winner.
Only Anandtech rode the fence, pointing out things we already knew. Either they really were split or they didn't know which side of fanboyism to join. hehe
 
Hmmm... the last Doom 3 benchmarks show the 9800XT and FX5900 running at nearly 60 FPS. That benchmark was taken nearly a year ago.

http://www.tomshardware.com/graphic/20030512/geforce_fx_5900-12.html

The new 6800U and X800 XT run about twice as fast as the 9800XT/5950U. So even if John Carmack and Co. decided to go crazy on normal maps and dynamic lighting, these new cards should be able to handle Doom 3 at high resolutions with 4x AA.

If you extrapolate further, you would get 120 FPS on high quality at a resolution of 1024x768 (rough arithmetic sketched below). Even though ATI may not have as good an OpenGL driver, that is a pretty high framerate for a "state of the art" graphics engine. So even down the road a year or two from now, the X800 will still have decent-to-great OpenGL performance.

On the Direct3D front it appears that the X800 may win the current battle. Its FP24 shaders are really fast in games. Plus, DX9 games are still a rare bunch. We have no idea what SM3.0 will do in games and when it will actually be incorporated into games fully (Far Cry/Stalker don't count; they weren't designed around 3.0).

Then you can't forget that HL2 runs faster on the new X800. ;) But both cards are extremely fast and will run any game in the upcoming future.
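
Here's that extrapolation written out. The 2x generational speedup and the "twice as heavy" final game are the post's own estimates, not measured numbers:

[code]
# Extrapolating the year-old THG Doom 3 numbers to the new cards.
old_fps = 60            # 9800XT / FX5900 in the old benchmark
generation_speedup = 2  # rough 6800U / X800 XT gain over 9800XT/5950U
new_fps = old_fps * generation_speedup
print(f"Estimated 1024x768 High: ~{new_fps} FPS")  # ~120 FPS

# Even if the shipping game ends up twice as heavy as that old build:
engine_cost_factor = 2
print(f"Worst case: ~{new_fps / engine_cost_factor:.0f} FPS")  # still ~60 FPS
[/code]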
 
Can anyone tell me if it's possible to get the Ruby exe demo to run on a 9600XT to compare speeds? I'm guessing not, because Ruby uses new pixel shaders, no?
 
The X800 isn't worth my money even if it comes out on top of the Ultra; the 6800 price will come down slightly if that happens. And if the Ultra will run a game at around 70 fps, :) great. If the X800 pushes it up to 90 or 100 for the same settings, that's cool too. But to be honest it's not gonna look that much better, and at around 60 fps and above the game is beautifully playable anyway, so it'll be a fanboy thing at the end of the day ;) like Mortiz said... who's got the biggest willy, hormones and all that... bleh, lol
 
They will most likely remain the same price, bud.
The FX took a price drop last year because its DX9 performance was revealed, and yet most still went with ATI.
The 6800 doesn't have that same weakness (at least not to that extent), and it has the onboard hardware encoder, so demand as a whole won't drop (just possibly from gamers).
Online prices may adjust slightly based on which is better for games, but not retail, and nothing like last year.
I'd still rather have the X800 even if it was a few bucks extra.
 
clarky003 said:
The X800 isn't worth my money even if it comes out on top of the Ultra; the 6800 price will come down slightly if that happens. And if the Ultra will run a game at around 70 fps, :) great. If the X800 pushes it up to 90 or 100 for the same settings, that's cool too. But to be honest it's not gonna look that much better, and at around 60 fps and above the game is beautifully playable anyway, so it'll be a fanboy thing at the end of the day ;) like Mortiz said... who's got the biggest willy, hormones and all that... bleh, lol


I thought our eyes could only see up to 30fps? Obviously there are benefits to high fps, i.e. when the game gets busy the framerate drops but it isn't noticeable (i.e. 100fps to 50fps), but that's it, surely?
 
high frame rates = future proofing
high frame rates = allows you to use AA
high frame rates = justification for spending obscene amounts of money
 
jonnyapps said:
I thought our eyes could only see up to 30fps? Obviously there are benefits to high fps, i.e. when the game gets busy the framerate drops but it isn't noticeable (i.e. 100fps to 50fps), but that's it, surely?

Yeh, 30 fps, but 60 is basically flawlessly smooth at a constant, and both cards achieve that. Does the X800 need a 650-watt power supply like the 6800U?

high frame rates = future proofing
high frame rates = allows you to use AA
high frame rates = justification for spending obscene amounts of money

most importantly, high frame rates = a good gaming experience

I'm saying a cheaper card that can perform to modern and future games' maximum requirements will be the favourite, as the majority of us aren't loaded with cash. :dozey:
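
One way to see why headroom above 30fps still matters: the same fps drop costs very different amounts of frame time depending on where you start. A quick sketch (the number pairs are illustrative, with the last pair taken from the 1280x960 figures earlier in the thread):

[code]
# Convert fps drops into added milliseconds per frame.
def frame_time_ms(fps):
    return 1000.0 / fps

for hi, lo in [(100, 50), (60, 30), (38, 11)]:
    extra = frame_time_ms(lo) - frame_time_ms(hi)
    print(f"{hi} -> {lo} fps: each frame takes {extra:.1f} ms longer")
[/code]

A 100-to-50 drop adds 10 ms per frame; a 38-to-11 drop adds over 60 ms, which is why the latter feels like a slideshow.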
 