official doom 3 benchies

Well, it separates the cards' price/performance a lot more than NVIDIA did. If you're going to buy NVIDIA, there's no point in getting an Ultra IMO, because the GT can easily be overclocked to its speeds. John said something about having problems overclocking video cards with Doom 3. I'd like to see someone go more in depth on that, but I can't find anything.
 
blackeye said:
Well, it separates the cards' price/performance a lot more than NVIDIA did. If you're going to buy NVIDIA, there's no point in getting an Ultra IMO, because the GT can easily be overclocked to its speeds. John said something about having problems overclocking video cards with Doom 3. I'd like to see someone go more in depth on that, but I can't find anything.

I concur, my friend... :cheers:
 
A-Train said:
Seems ATI made a mistake on the X800 Pro by not giving the card 16 pixel pipelines instead of 12... it takes an ass-whooping in most of those benchmarks from the 6800GT.

Oh Noes! I can only play Doom 3 at 1024 by 768 with 4xAA and 8xAF on High Quality at 42 FPS with my X800 Pro! What shall I ever do?

/me faints.
 
It's nice to have the extra speed, because that way you can run the game at high resolutions with AA/AF on while still playing at a decent frame rate. That's why I care so much about speed: image quality is everything to me.
The faster my video card, the better my image quality.

Is this the article you're speaking of?
No, I was talking about an article that will show an overclocked card vs. a non-overclocked card in Doom 3. I want to know if this will affect Q4 too, as it's on the same engine as Doom 3.
 
I have to feel a bit sad for X800 Pro owners, as the benchmarkers state: "Turning on 4X Antialiasing certainly does impact the frame rates by a good deal. In fact, judging by pure frame rate alone, we would have to say that the ATI X800Pro is giving an unacceptable level of performance while the rest of the three still stay above our imaginary 30FPS mark. The NVIDIA GeForce 6800Ultra surely stands head and shoulders above all cards in this particular run when you calculate in the price point on the 6800GT compared to our other two leading cards."
 
You can still run the game with an X800 Pro, but the image quality won't be as good as with the other cards (X800 XT or a 6800).

But Doom 3 will probably look great at 800x600 from what I've heard, so anyone with a GeForce 3 should be happy with the game.
 
The demos I linked to from Anandtech ran SM2.0 for the X800 cards and SM3.0 for the NVIDIA cards.
They were both run at 1600x1200, whereas the Tom's Hardware benchmarks where the 6800GT passed the X800 XT PE were at a lower resolution. Plus, the two demos are not exactly the same. The demos I linked to from Anandtech were recorded by Anandtech themselves and were not from NVIDIA. Plus, Tom probably used an Intel system.

And again we have TechReport's review.

What bothered me in some of the games with my 6800GT was some flickering with AA/AF enabled. Rails, thin wires, or lights in both GTA:VC and BFV flickered as I moved. I have the optimizations disabled in the control panel.
My 9800 Pro never did that.
 
I will run Doom 3 with everything on lowest (although I can go high) under one condition: will people just shut up about all this benchmarking performance crap and wait for the game to come out, then complain? There is so much more to gaming than just worrying about whether you can run a game at 100x AA :)

Sorry, I always get revved up about these issues.
 
I agree with you, man. I could play the game at the lowest settings, and I have before, but if I can play at 30 FPS with everything on high, I'm going to. I don't give a shit if it's 30 FPS or 60, just as long as the image quality is good.

The faster my video card, the better my image quality. The better my image quality, the easier I can see a mercenary through a bush in Far Cry. The easier I can see a mercenary in the bush, the happier I am. The happier I am..............

This was taken from the HardOCP forums.
Thanks to all you guys that appreciate the work. We are at id's offices now working on the hardware guide. It will contain all that good stuff that you have come to expect from [H] evaluations.
(And BTW, the game friggin ROCKS.)

Lucky bastards get to play the game already.

there is so much more to gaming than just worrying about whether you can run a game at 100x AA
You have to remember, people like me buy a card and it's got to last two, sometimes three years, so we need every last bit of performance out of it. I can't buy a new card every year just to get my 30 FPS. That's why a lot of people pay such close attention to the benchmarks.
 
Don't get me wrong: I'm in university, my budget is McDonald's and pizza, and I still bought a 9800 Pro half a year ago. I like games, but I always get angry at those people who are obsessed with game performance even before they see how it runs on their own computers.

A funny example, my friend just installed a watercooling system in his computer (don't know the technical name), but now he's too afraid to close the computer in case it might leak :)
 
:LOL: Why would he even buy one, then?

--------------------Quoted from someone else------------------------
In response to the above poster, of course people buy a card for a game; they get excited and want very much to play a game at the highest quality they can. This has always sold games. The 'game' to buy cards for before this announcement was Far Cry (or, for those more hopeful, Half-Life).

It isn't like this card WON'T PLAY OTHER GAMES, lol. In the case of NVIDIA, it plays other games very well, usually within a statistical dead heat with the XT. So when one card really trounces another card and feeds it baby milk like a momma... people take note.

This should do a lot to help the GT's already great sales.
----------------------------------------------------------------------
That was off the HardOCP forums.
 
In all fairness, and considering that these are OpenGL benchmarks, the X800XT PE is doing a decent job.
Granted, the NVIDIA cards do better than ATI cards in OpenGL, but shouldn't companies make cards that are good on both sides of the table? Shouldn't that be what drives the companies? Not to have people say, "oh, well, this is the fastest card out there, but it sometimes takes a beating in OpenGL (or DirectX, either way)."


Asus: As for your problem... hopefully that'll go away with a release of WHQL drivers.

Wait until HL2; I'm more than positive it will own the 6800U in most areas, especially AF.

I'd say that's a pretty big statement to make, considering there haven't been any huge differences between the cards. The biggest differences I've seen between the cards are these D3 benchmarks and a few (like Asus's) Far Cry ones, although the Far Cry ones have mostly been differences at points where I doubt you'll notice (100 FPS vs. 87 FPS, for example). Valve staffers have said that if you're buying one of these top-of-the-line cards, you won't be disappointed. Of course, we'll have to wait until the game is released.

That's not to say differences won't be noticed when Valve starts adding on... (SM3.0 support, 3Dc, etc)
 
I'm not too sure about this one, so I'm going to ask you guys. Does the 6800 series support 32-bit colour or something like that, and the X800 only 24-bit? Would we notice an image quality difference in Doom 3 with that or not? I think the 6800s were benched at 32-bit and the X800s at 24-bit.
I'm not sure, just checking with you guys.
 
blackeye said:
I'm not too sure about this one, so I'm going to ask you guys. Does the 6800 series support 32-bit colour or something like that, and the X800 only 24-bit? Would we notice an image quality difference in Doom 3 with that or not? I think the 6800s were benched at 32-bit and the X800s at 24-bit.
I'm not sure, just checking with you guys.

In about three years, yes, you will notice artifacting with 24-bit color. However, right now 24-bit is more than enough. That should not be a selling point for the 6800 series; by the time Unreal 3 (or whatever) comes out, the 6800 is going to be too slow to properly use 32-bit calculations.

For reference, DX9.0b mandates a minimum of 24-bit floating-point calculation, while DX9.0c mandates full 32-bit floating-point calculation. So X800 and 9800 owners will not get screwed in the future, because ATI cards meet and exceed the DX9.0 requirements.
 
And it isn't 24- or 32-bit color. We would have some issues there. hehe
It's internal precision in the chip. ;)

24-bit precision is the minimum for DX9. Interesting how NVIDIA still kept the 16- and 32-bit combination for their card even with DX9.0c...
Both ATI and NVIDIA will have 32-bit precision cards for the next series, and ATI will include DX9.0c then. Should be interesting.
 
Whoops, typing error there. :eek:

Yeah, it is done for internal calculations only. Doom 3 will be one of the first games to run into floating-point precision problems on older hardware. Apparently they did a good job of hiding those problems, though.
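
Just to make the precision point concrete, here's a toy back-of-the-envelope sketch in Python (my own illustration, not how either chip actually works internally): round every intermediate result to a narrower mantissa and the error piles up, which is what eventually shows up as banding or artifacts.

import math

# Toy illustration only, not real GPU behavior: round a value to a given
# mantissa width to mimic a shader running at lower internal precision.
def quantize(x: float, mantissa_bits: int) -> float:
    if x == 0.0:
        return 0.0
    exp = math.floor(math.log2(abs(x)))
    step = 2.0 ** (exp - mantissa_bits)
    return round(x / step) * step

# Accumulate many tiny lighting contributions, as a long shader pass might.
fp32ish = fp24ish = fp16ish = 0.0
for _ in range(1000):
    fp32ish += 0.001                         # Python's FP64 stands in for "plenty of precision"
    fp24ish = quantize(fp24ish + 0.001, 16)  # ATI's FP24 has a 16-bit mantissa
    fp16ish = quantize(fp16ish + 0.001, 10)  # FP16 has a 10-bit mantissa

print(fp32ish, fp24ish, fp16ish)  # the FP16-ish total drifts furthest from the ideal 1.0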
 
lazicsavo said:
Don't get me wrong: I'm in university, my budget is McDonald's and pizza, and I still bought a 9800 Pro half a year ago. I like games, but I always get angry at those people who are obsessed with game performance even before they see how it runs on their own computers.

A funny example, my friend just installed a watercooling system in his computer (don't know the technical name), but now he's too afraid to close the computer in case it might leak :)

I'm not bothered that I can't play Doom 3 at 1600x1200 at full AA/AF with my X800 Pro; however, I am grumbling that I just spent £350 on a card that I thought would last me 1-2 years when, in all fairness, this X800 Pro won't get me through to next year if games only advance beyond Doom 3. I play at 1280x1024 res and they don't show that res, but the X800 Pro is an absolute rip-off... I knew I shouldn't have let all the fanboyism and praising of ATI make me move to ATI and push my patience to the limit.

How much does anyone think I could sell a Sapphire X800 Pro, fully boxed and four weeks old tomorrow, for? I want a 6800GT, like I always planned on getting until the X800 Pro came in stock. So the NVIDIAs get a _few_ FPS less in DX, but the ATIs get hammered in OpenGL... there is a big difference between the two, and with Doom 3 now almost out, expect a lot of OpenGL games running the Doom 3 engine.
 
Well, considering that we all used to not use AA/AF and just run games at a good resolution and quality settings, I think we may be getting spoiled. Don't ya think?
That card will last you 1-2 years unless you demand AA and AF at high res, in which case the 6800GT won't last either.

I agree the 6800GT is a better deal but it isn't that big of a difference.
 
At 1280 by 1024 with 8xAF, the X800 Pro should have an average FPS of 52. With AA enabled, the FPS drops to 33 at the same resolution.
 
What about the X800 XT at 1280 by 1024 with 4xAA and 8xAF?
 
There is this great little program called Microsoft Excel. I use it for a lot of things; from statistics to charts to data organization, Excel does it. I am basically making a scatter plot and then connecting a line between the two data points. This is reasonably accurate, since performance scales roughly linearly with resolution. However, actual results may vary.

Anyway, the X800 XT PE at 1280 by 1024 with 4xAA and AF should run at about 48 FPS.
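
For anyone who wants to play with the same trick without Excel, here's a minimal Python sketch of that two-point interpolation. The endpoint FPS figures in the example are made-up placeholders for illustration, not real benchmark results.

# Minimal sketch of the Excel method: fit a line between two
# (pixel count, FPS) benchmark points and read a third resolution off it.
# The linear-scaling assumption is rough; actual results may vary.
def estimate_fps(res_a, fps_a, res_b, fps_b, res_target):
    # Each res is a (width, height) tuple; interpolate FPS by pixel count.
    px_a = res_a[0] * res_a[1]
    px_b = res_b[0] * res_b[1]
    px_t = res_target[0] * res_target[1]
    slope = (fps_b - fps_a) / (px_b - px_a)
    return fps_a + slope * (px_t - px_a)

# Placeholder endpoints (NOT real benchmark numbers): 60 FPS at 1024x768
# and 34 FPS at 1600x1200 put 1280x1024 at roughly 48 FPS.
print(round(estimate_fps((1024, 768), 60, (1600, 1200), 34, (1280, 1024))))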
 
Allow me to quote what I said in the other thread:

By the way, people - these are benchmarks, so they're made to stress the cards to their maximum capacity. Hell, it even says in the article that the actual gameplay won't be so demanding. Stop getting so worried.
 
;)

hardocp.com said:
One thing that you really must keep in mind when looking at timedemo results such as these, is that DOOM 3’s AI and physics engine are not being used as they would in a real-world gaming experience. That considered, we would guess our average framerates shown here to be a bit higher than if you were actually playing the game.

Look at this quote. :O

:p
 
blahblahblah said:
;)

Look at this quote. :O

:p
I refuse to believe that they'd be testing these cards on demos that won't stress the cards to the max. Besides, AI and physics calculations will hardly cut into the overall frame rate, since they're more CPU-dependent than anything.
 
blahblahblah said:
Anyway, the X800 XT PE at 1280 by 1024 with 4xAA and AF should run at about 48 FPS.

What about 1280 by 1024 with no AA and 4x AF?
 