ATi X800 runs 30% faster than Nvidia's

Head-Cracker
http://pc.ign.com/articles/535/535268p1.html

Minimum system requirements appear to be a 1.2 gigahertz processor with 256 megabytes of RAM and a DirectX 7 video card. Of course, those of you interested in the full experience will want a 2 gigahertz or higher processor, 512 megabytes of RAM and a DirectX 9 video card. Doug specifically stated that the ATI X800 was the card of choice amongst many of the testers, as it ran roughly 30% faster than Nvidia's best cards.

ATi owned Nvidia again? :naughty:
 
GreasedNeut said:
Companies lie often; until I see benchies, I'm skeptical.

Plus, if it's like 200fps to 170fps, I don't care :)


Well, that would only be about a 15% gap, but I understand what you're saying. I don't think it's a full 30% quicker either, and I think it's wrong for developers to make their game for a certain card company.
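For what it's worth, here's a quick sanity check on the percentages being thrown around; the 200 fps and 170 fps figures are just the hypothetical numbers from the post above, not real benchmarks.

```python
# Rough sketch: how the "15%" and "30%" figures compare,
# using the made-up 200 fps vs 170 fps example from above.

def percent_faster(fps_a: float, fps_b: float) -> float:
    """How much faster card A is than card B, as a percentage."""
    return (fps_a / fps_b - 1.0) * 100.0

print(percent_faster(200, 170))   # ~17.6% -- the faster card's lead
print((170 - 200) / 200 * 100)    # -15.0% -- the slower card's deficit
print(170 * 1.30)                 # 221.0  -- what a true 30% lead over 170 fps would mean
```

So whether the gap reads as 15% or closer to 18% depends on which card you treat as the baseline, and either way it's well short of 30%.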
 
This may be a tad off-topic, but Doom 3 was supposed to only run at full detail on the next-gen Nvidia cards, or that's what the interviews and benchmarks said, and I run it fine with everything at highest on my 9800. So I'd definitely agree that benchmarks are, for the most part, a load of crap.
 
What res are you running, and are you running with Ultra settings?
 
Nvidia runs Doom 3 better than ATI, and according to Valve, ATI runs HL2 better than Nvidia. Oh well. In the benchmarks that came out for HL2, Nvidia was using drivers that weren't optimised for HL2; it's almost like Valve want Nvidia to look bad. A tad evil, I say.
 
If you're still wondering why :rolleyes:

Doom 3 = Nvidia money involved
Half Life 2 = ATI money involved

Now, I forget how much ATI paid, but I think it was 30 million (I'm probably wrong). But you don't pay that kind of money and get nothing (HL2 bundled vouchers, etc.), so of course they are going to say it runs 30% better. (I'm not counting AA/AF or 3Dc vs DXTC.)

I'm not sure what they mean by "Nvidia's best cards", but I really doubt they have included the 6800 GT and 6800 Ultra. =/ :hmph:

Until I see some benchmarks/results from the actual game from forum-goers/actual players, I'll take everything with a grain of salt :p
 
I believe the deal was closer to 6 or 8 million... not 30.
 
If it's beta/alpha then I don't give a damn.

But I think every HL2 benchmark we've seen out on the net is from a beta, so it's not as accurate as the full game.
 
Nvidia cards have more raw power than ATI cards, but they lack the complicated DirectX 9 shader support and texturing that ATI cards do so well with. There you go, that's the difference. Of course, with the newest generation of cards that's changed somewhat, with Nvidia finally catching on to that.

For me ATI Radeons are king. Radeon X800 XT all the way, baby! :p
 
"Doug also stated that there wouldn't be any more screenshots" :) wtf valve have some strange customer relation methods
 
mutt said:
"Doug also stated that there wouldn't be any more screenshots" :) wtf valve have some strange customer relation methods
Indeedily doobly.
 
mutt said:
"Doug also stated that there wouldn't be any more screenshots" :) wtf valve have some strange customer relation methods


They have shown us enough already.
 
Well, that's false, since Greg posted three new screenshots AFTER Dougy boy said that.
 
I don't own CS:CZ and I never entered my ATI voucher number for the premier pack or anything. Would the CS:S beta still preload, or would I have to enter my ATI voucher number somewhere for it to preload?

BigNamek...
 
BigNamek said:
I don't own CS:CZ and I never entered my ATI voucher number for the premier pack or anything. Would the CS:S beta still preload, or would I have to enter my ATI voucher number somewhere for it to preload?

BigNamek...
Just THINK about it... it's not hard.
 
In no other DX9 game do we see differences of 30%; Valve must have optimised for ATi.

If ATi don't get their video cards out soon, though, nVidia will win this battle just on pure availability. I know a massive number of people who've gone for nVidia simply because their cards are out, and with the performance differences they didn't see much point in waiting.
 
HaVoK said:
If you're still wondering why :rolleyes:

Doom 3 = Nvidia money involved
Half Life 2 = ATI money involved

Now, I forget how much ATI paid, but I think it was 30 million (I'm probably wrong). But you don't pay that kind of money and get nothing (HL2 bundled vouchers, etc.), so of course they are going to say it runs 30% better. (I'm not counting AA/AF or 3Dc vs DXTC.)

I'm not sure what they mean by "Nvidia's best cards", but I really doubt they have included the 6800 GT and 6800 Ultra. =/ :hmph:

Until I see some benchmarks/results from the actual game from forum-goers/actual players, I'll take everything with a grain of salt :p


You've got it all mixed up, buddy; it goes more like this:

- Doom 3 = Nvidia's obviously better OpenGL driver.
- HL2 = ATI money involved.
- Far Cry = Nvidia money involved.
 
king John I said:
Nvidia cards have more raw power than ATI cards, but they lack the complicated DirectX 9 shader support and texturing that ATI cards do so well with.

Now the opposite is the case.

(At the moment I think NVIDIA offer the best value for money with the 6800 GT. So, for me, they've won this round.)
 
Excellent! My Raddy 9600 Pro will run HL2 well. I'm not getting D3, so I don't care how my GC runs it. Oh, I'm new, by the way. Hello!
 
There was no optimizing for Nvidia or ATI cards in either game.
Both ATI and Nvidia bid on the bundle with HL2. ATI put more down for the bid and won.

150 FPS at 1024x768 would be great. The card will make a difference at 1600x1200, especially with AA and AF.
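As a rough, back-of-the-envelope illustration of why the resolution jump matters (assuming a purely pixel-bound workload, which a real game never quite is):

```python
# Pixel-count comparison between the two resolutions mentioned above.
# This ignores CPU limits and the extra cost of AA/AF, so treat it as
# an upper-bound sketch, not a benchmark prediction.

low_res = 1024 * 768      # 786,432 pixels per frame
high_res = 1600 * 1200    # 1,920,000 pixels per frame

ratio = high_res / low_res
print(ratio)              # ~2.44x more pixels to shade per frame

# If a card were purely pixel-bound at 150 fps at 1024x768:
print(150 / ratio)        # ~61 fps at 1600x1200, before AA/AF costs
```

In practice the scaling is rarely that clean, but it shows why a comfortable frame rate at 1024x768 can shrink fast at 1600x1200 once AA and AF are turned on.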
 
velcommen unt da haef laef 2 forums!

Ja, yer 9600p'll be fine, me matey.

(no i'm not drunk.)
 
No Nvidia money went into Doom 3? Oh sorry, my mistake. But yeah, Nvidia excels at OpenGL.

But to whoever said that Nvidia lacks all the complicated DirectX 9 shader support and texturing: is there any comparison I can look at between the 6800 series and the X800 series for shader support, etc.? :x I don't see how Nvidia would make a new card series and then not include any support for complicated DX shaders and texturing.

I thought the 6800 series supported DX9 Shader Model 3.0 and full 64-bit texture filtering and blending :x, while the X800 series supports up to SM2.0 and has 24-bit precision.
 
ATI's shader engine is more powerful than Nvidia's.
Both ATI's and Nvidia's new cards pushed pixel shader possibilities.

Tomb Raider was one of the games that used many DX9 shaders and effects (even depth of field). It's actually one of the more advanced DX9 games.
Link - you can see the performance numbers there.
 
Someone has to actually get an X800 to be able to run it 30% faster. :p
 