SM2.0 vs SM3.0 1.2 Far Cry patch.

Pretty interesting review. I'd like to see something more in-depth, though.
Remember, this is an NVIDIA provided demo and highlights the performance benefits of the new rendering path.
So the first 2 benchmarks were custom and the remaining benchmarks were provided by Nvidia...
The exact reason HardOCP presents their reviews like they do. All custom with no standard demos that can be optimized for.
 
Interesting. Would have preferred to see more unbiased demos than the four that Nvidia gave Anandtech to test. Granted, Anandtech did create a couple of custom demos, but a few more would have made for a nicer balance.

I'm disappointed that my graphics card did not own the 6800 GT, but I am happy that there is competition in the graphics card industry right now. Can't wait for HL2 to use 3Dc, and to see how it beats up on the 6800GT. :D
 
Asus said:
The exact reason HardOCP presents their reviews like they do. All custom with no standard demos that can be optimized for.
I can't say I like HardOCP's review. An X800 matching/beating a 6800U when everybody else shows a GT beating a Pro? Just too shady IMO. But if you like them, so be it. I can't force you not to like them, but you should take this into consideration.

Also, when we were informed that this patch was coming down the pipe, NVIDIA sent along a couple demos to test the performance difference with the new SM3.0 path enabled. Ubisoft is going to include these 4 demos with their patch, but we were obviously a little wary of just throwing these numbers up. We took a close look at the demos, and we are including them ALONGSIDE our original custom demo and a new custom demo that WE recorded for this article. The reason why we are including the NVIDIA provided demos is that they are definitely sections of the game that are really parts of the gameplay. Whether these are representative of overall gameplay or not, there are definitely experiences in the single player mode of the game that are represented by the demos.

and...
One of the largest caveats about benchmarking in FarCry is that demos don't work like one would expect. For example, in Unreal Tournament 2004, we can start a game, play our hearts out, start and stop recording somewhere in the middle, and we have a very cool little benchmark of the action. This repeatable benchmark is a fair representative of gameplay as far as benchmarks go. This seems reasonable for a game with a built-in demo mode.

FarCry, on the other hand, will record the movement of the player through a level without recording any of the players actions (like firing a weapon or pressing a button to open a door), and none of the other characters in the level are recorded either. When a demo recorded in single player mode is viewed, all the AI controlled characters man the same posts that they would in the game; only they ignore the player moving through the level when the demo is running.

This means that some demos have instances of passing through locked doors, and AI bots making their normal rounds regardless of the player (and can be in different locations for different runs of the benchmark). And as if this wasn't enough, the worst part of the whole experience (have you figured it out yet?) is that demos are entirely absent of any fighting, conflict, or gunfire.

We tried many times to benchmark this game using FRAPS, but our ability to be repeatable was worse than what demo playback gives us.

So, why are we at all OK with using FarCry's built-in demo mode? Because much of FarCry game play has to do with sneaking around, walking through the levels, and taking in the scenery. No, it's not the all-encompassing perfect benchmark, but it isn't the worst thing that we've seen either (*cough* - 3dmark - *cough*). We've compared the demo mode to our very non-repeatable FRAPS benchmarks of walking around levels and we are comfortable with the reliability of the scores that we get from the demo for that purpose.
btw, [H]ard|OCP used FRAPS, if that means anything.

edit: doesn't mean I don't like the [H], I post at their forums pretty much every day, and have been for a while. It's just something about their reviews I find fishy. They say the 6800U tops out at playable gameplay of 1280x1024 w/ AA/AF, when I've seen 6800GT users that post on the [H] who play at 1600x1200 w/ AA/AF.
 
I can almost promise you that those custom Nvidia demos are not the entire length of a level in Far Cry. I bet they are a specific segment where Nvidia cards shine. I bet ATI could put out its own custom Far Cry demos and show where the X800 beats the 6800s into submission.

You should take them with a grain of salt.

Anyways, when is the 6800 GT supposed to be available? It's been a week since they "supposedly" hit the streets. I'm tired of ATI and Nvidia paper-launching their cards.
 
:O I saw no image differences; performance-wise Nvidia's cards did better, but the demos WERE from Nvidia. All I'm concerned with is whether I should get an Nvidia or an ATI card now. Who knows if SM3.0 will be offered in some upcoming games, and whether it will bring increases in performance and image quality.
 
Yeah, you're probably right on that.

ATI assures us that they have also been working with CryTek on their efforts. Since we have seen a performance improvement with the latest driver and new 1.2 patch, we don't have any reason to think that anything extraordinarily fishy is going on behind the scenes between NVIDIA and Crytek. We would obviously like to see this texturing issue fixed.

Umm... I think the GT is available; some people around the forums already have them. But as for acquiring one, it will probably be hard finding a store with them in stock, because with both Nvidia and ATI cards, when they hit a store in small supply they seem to sell out like hotcakes. So I'm not 100% sure about getting one on the streets.

ailevation said:
:O I saw no image differences; performance-wise Nvidia's cards did better, but the demos WERE from Nvidia.
The first two were not.

But you're right as well. The Nvidia demos probably had some big optimizations going on, but I'm guessing this is just a taste, not the whole glory yet. There are a lot more SM3.0 titles coming out that I would like to see, because Far Cry definitely is not the best game of the year ;)

This is another NVIDIA-supplied demo, and it shows the largest performance gains that we see in the SM3.0 render path. From the looks of our other benchmarks, these numbers are not typical, but they do happen: our own exploration of this level proved to reflect the numbers that we see in this demo.
 
Looks like Nvidia took the lead. In the SM 3.0 tests the 6800GT won 7 and the X800XT won 5.
 
If these tests hadn't come from Nvidia I would probably believe them, but as they did, I'm not going to get pissed off that I didn't wait that little bit longer for a 6800GT instead of the X800 Pro I've already bought, because Nvidia fanny about too much...
 
3.0 doesn't look much better than 2.0, so I don't care... just another option trying to get people to buy Nvidia and the game. Speaking of which, on the Far Cry forum you guys should see the people dissing ATI because the X800 doesn't have SM3.0 support, and most of them don't even have a 6800U, or the series for that matter; mostly 5950s hehe :)
 
x84D80Yx said:
edit: doesn't mean I don't like the [H], I post at their forums pretty much every day, and have been for a while. It's just something about their reviews I find fishy. They say the 6800U tops out at playable gameplay of 1280x1024 w/ AA/AF, when I've seen 6800GT users that post on the [H] who play at 1600x1200 w/ AA/AF.

If you read the disclaimer before all their reviews, you'll see that that's the optimal playing resolution for their hardware. Obviously if you had an FX-53 and a gig of RAM, you could probably play higher.
 
DiSTuRbEd said:
3.0 doesn't look much better than 2.0, so I don't care... just another option trying to get people to buy Nvidia and the game. Speaking of which, on the Far Cry forum you guys should see the people dissing ATI because the X800 doesn't have SM3.0 support, and most of them don't even have a 6800U, or the series for that matter; mostly 5950s hehe :)

It isn't meant to look any different. It's the performance difference.
 
Everyone's making way too big a deal over SM3.0... there's pretty much nothing it offers other than "look, we're using a bigger number than ATI, so buy our card." It's clear the XT PE has the potential and clearly can kick the shit out of the 6800 Ultra. No, I'm not going by HardOCP; it's clear in the results on Anandtech and a couple dozen other reviews I've read.

/waits patiently for alig to flame me :thumbs:
 
guinny said:
Everyone's making way too big a deal over SM3.0... there's pretty much nothing it offers other than "look, we're using a bigger number than ATI, so buy our card." It's clear the XT PE has the potential and clearly can kick the shit out of the 6800 Ultra. No, I'm not going by HardOCP; it's clear in the results on Anandtech and a couple dozen other reviews I've read.

/waits patiently for alig to flame me :thumbs:

ATi fanboy, by any chance?
 
More like a logical thinker with common sense.

Nvidia fanboy, by any chance?
 
blahblahblah said:
I can almost promise you that those custom Nvidia demos are not the entire length of a level in Far Cry. I bet they are a specific segment where Nvidia cards shine. I bet ATI could put out its own custom Far Cry demos and show where the X800 beats the 6800s into submission.
Apparently the SM3.0 path gains the most performance in indoor areas, where it can use single-pass lighting. Both custom AnandTech demos were taken outside, where most of the shaders are PS1.1 and thus gain little from SM3.0.
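For anyone wondering why the indoor levels benefit most: here's a toy Python sketch (not real shader code, and the scene sizes are made-up numbers) of the difference between multi-pass lighting (one geometry pass per light, the SM2.0-style fallback) and single-pass lighting (one shader that handles all lights, which SM3.0's longer shaders and dynamic branching make practical):

```python
# Toy model: count how many times scene geometry must be submitted
# under each lighting approach. All numbers are hypothetical.

def multi_pass_submissions(num_meshes: int, num_lights: int) -> int:
    # Multi-pass: every mesh is re-drawn (and re-transformed) once per light.
    return num_meshes * num_lights

def single_pass_submissions(num_meshes: int, num_lights: int) -> int:
    # Single-pass: every mesh is drawn once; the shader accumulates
    # the contribution of every light in that one pass.
    return num_meshes

if __name__ == "__main__":
    meshes, lights = 200, 4  # hypothetical indoor scene with 4 lights
    print(multi_pass_submissions(meshes, lights))   # 800 geometry passes
    print(single_pass_submissions(meshes, lights))  # 200 geometry passes
```

Outdoors with mostly PS1.1 shaders and few dynamic lights, the two approaches collapse to nearly the same work, which would fit AnandTech's outdoor results.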

guinny said:
More like a logical thinker with common sense.
People with common sense don't post flame bait on this forum.
 
The reason HardOCP uses FRAPS in Far Cry is because demos do not represent playing the game. They do not replay the AI, bullets, physics, etc.

Again, I would like a more complete review with FRAPS, or at least custom demos.
Since only two of the demos were not supplied by Nvidia, and so far there's only one review whose benchmarks fall so optimally in favor of the company that supplied them, we can't take these results as meaningful.
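For what it's worth, "repeatable" can actually be measured. Here's a toy Python sketch (all the fps numbers are made up, not from any real test) of how a site could compare run-to-run spread between scripted demo playback and manual FRAPS runs, using the coefficient of variation:

```python
from statistics import mean, stdev

def cv_percent(fps_runs):
    """Coefficient of variation: stdev as a percentage of the mean
    across repeated runs. Lower = more repeatable benchmark."""
    return 100.0 * stdev(fps_runs) / mean(fps_runs)

# Hypothetical numbers: demo playback is scripted, so runs cluster tightly;
# manual FRAPS runs vary with where you walk and what the AI happens to do.
demo_runs  = [61.2, 60.8, 61.0, 61.4]
fraps_runs = [58.0, 66.5, 52.3, 63.1]

print(f"demo  CV: {cv_percent(demo_runs):.1f}%")
print(f"fraps CV: {cv_percent(fraps_runs):.1f}%")
```

That tradeoff is exactly the argument: demos are more repeatable but less representative, FRAPS is more representative but noisier, so you'd need more runs to trust the numbers.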
 
guinny said:
Everyone's making way too big a deal over SM3.0... there's pretty much nothing it offers other than "look, we're using a bigger number than ATI, so buy our card." It's clear the XT PE has the potential and clearly can kick the shit out of the 6800 Ultra. No, I'm not going by HardOCP; it's clear in the results on Anandtech and a couple dozen other reviews I've read.

/waits patiently for alig to flame me :thumbs:

Wait no longer, little boy.

It is right there in front of your _****ING EYES THAT THE 6800ULTRA BEATS THE XT PE_, ffs man. Either wake up, or go the **** back to sleep.

You're the sort of person that buys a Vauxhall Corsa 1.1 and says "my car is better than a Toyota Supra."

Say whatever you want, guinny... everyone knows you will never back down from ATI being worse than Nvidia, because you have put your money down on an XT PE. That's where you and I differ: I've already spent £350 getting my X800 Pro delivered, but that doesn't mean I'm gonna phone up ATI and ask if I can stick my dick in their arsehole.

In fact, just because you are so blatantly arrogant/ignorant/twat-like, you're going on ignore. I can't stand you. :flame:
 
TechReport has a less biased review. ;)
Version 1.2 of Far Cry will apparently come with four built-in demos for benchmarking. Those demos take place on the four levels mentioned in the NVIDIA presentation. Rather than use those pre-recorded demos, however, we elected to record five of our own—one on each of the four levels NVIDIA mentioned, and one on the "Control" level. The demos are downloadable via a link below.
They pushed Nvidia's demos aside but benchmarked the same levels.

I personally don't think it was worth it for Nvidia, considering what is affected by adding the extra transistors for PS3.0.
Techreport comments

Here is an interesting demo based on PS3.0, and another for 3Dc.
 
It's not whose card is faster, but whose card can benefit you the most. Don't buy a card that beats another one by 3 frames a second with features you're never going to use. I would be happy with a card that is just as fast but will also benefit me more. I'm not going to buy a card so I can brag on the forums saying I got a card that gets 74 FPS when someone else got 71. It all comes down to personal choice and what will benefit you the most.
 
guinny said:
Everyone's making way too big a deal over SM3.0... there's pretty much nothing it offers other than "look, we're using a bigger number than ATI, so buy our card." It's clear the XT PE has the potential and clearly can kick the shit out of the 6800 Ultra. No, I'm not going by HardOCP; it's clear in the results on Anandtech and a couple dozen other reviews I've read.

/waits patiently for alig to flame me :thumbs:



I haven't seen any reviews where the XT PE "kicks the shit" outta the 6800U. Yes, it's done better... but I'd say the 9700 Pro vs. 5800U was "kicking the shit". These aren't anywhere near "kicking the shit" differences to me.

And yes, you're an ATi fanboy; that's fine. The part I don't like is how you act like a stuck-up rich boy. I mean, half a year ago you were buying a P4 system with a 9800 Pro and stuff, and you still act like you know more than everyone else. I think Asus here has demonstrated he is superior in terms of tech knowledge to you :p
 
guinny, no offence, but in every thread where Nvidia has some kind of positive measure, you always have to come in with some statement like "oh, it gets to the point where it's just too much" or there's "no big deal" about it.

Please give it up. You make it hard for me not to call you a fanboy when you say you are not.

anyways.

asus, about that Humus demo: he works for ATI, so I'm sure he's not above trying to show massive improvements over the competitor, just like Nvidia.

Even his first page reads...
Nvidia can consider themselves pwned.

His method has been used before in games, but why hasn't it been picked up like this if it offers such a huge performance increase?

Here is a DIRECT quote from Humus on his demo, posted at the Rage3D forums, which I also post on.
Humus said:
Yup. In real world applications you likely won't see as much performance increase.

But his method is interesting; if it brings a performance increase for whichever brand, so be it! Even small performance increases are good in my book.

I gotta admit I'm kinda surprised Nvidia is keeping up with this new architecture running 32-bit mode. I'm also impressed how much Nvidia's drivers have improved in such a small period of time while offering the same IQ (or better, as some have noticed). Since when are gains, even small ones, bad?

Here's a quote from somebody who went to an ATI/Nvidia tech dev party of some sort.
SM3 is used to implement geometry instancing (batching of identical meshes), which gives up to a 40% gain but more usually 0-20%. They used this headroom to push out the point at which they render sprites (impostors) rather than meshes. They stated that this made vegetation rendering look better.
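To make the instancing bit concrete, here's a toy Python sketch (the microsecond costs are made up, not from any real driver) of why batching identical meshes into one draw call helps: every draw call carries a fixed CPU/driver overhead, and instancing pays it once instead of once per tree or rock.

```python
# Hypothetical per-call and per-instance costs, in microseconds.
DRAW_CALL_OVERHEAD_US = 50  # fixed CPU/driver cost of submitting one draw call
PER_INSTANCE_COST_US = 2    # per-mesh GPU setup cost either way

def naive_cost(num_instances: int) -> int:
    # One draw call per identical mesh: overhead paid every time.
    return num_instances * (DRAW_CALL_OVERHEAD_US + PER_INSTANCE_COST_US)

def instanced_cost(num_instances: int) -> int:
    # One instanced draw call: overhead paid once, per-instance data
    # (e.g. transforms) comes from a second vertex stream.
    return DRAW_CALL_OVERHEAD_US + num_instances * PER_INSTANCE_COST_US

if __name__ == "__main__":
    n = 1000  # hypothetical vegetation count
    print(naive_cost(n), instanced_cost(n))  # 52000 vs 2050 microseconds
```

Which would also square with the quote: the gain is biggest in scenes stuffed with identical meshes (vegetation) and near zero elsewhere, hence "up to 40% but more usually 0-20%".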

I believe Anandtech had the NV40 tested with AF optimizations ON, if that means anything. Why would a gamer turn those off if the IQ is the same?

I really don't want to discuss [H]ard's review anymore because it's been discussed to death at their forums; obviously many over there find it suspicious, and the avid [H] fans see nothing wrong, so why debate it.

I don't know why you call Anandtech's review biased, though. They showed where it was useful, showed where it had no use, showed why they used Nvidia's demos, and showed why they used their own.

FiringSquad used (similar to TechReport's numbers; they also used an Athlon 64 3800+):
Athlon 64 3800+
1gb RAM
DX 9.0b <- ??
WinXP SP1

Anandtech used
Athlon 64 3400+
Unknown amount of RAM
DX 9.0c
WinXP SP2 RC2

Bit of a difference.

Crytek even took the time to fix that banding bug in the 1.2 patch, heh, if that means anything or not. Like I said, this is new architecture, so it probably takes time, and so far they have shown impressive stuff. To me, at least.

blackeye said:
It's not whose card is faster, but whose card can benefit you the most. Don't buy a card that beats another one by 3 frames a second with features you're never going to use. I would be happy with a card that is just as fast but will also benefit me more. I'm not going to buy a card so I can brag on the forums saying I got a card that gets 74 FPS when someone else got 71. It all comes down to personal choice and what will benefit you the most.

I agree 100%!!


Ah well, I'll shut up now :)
 
Here is xbitlabs review.
Using a company-sponsored and company-created demo is not the right way to do a fair review. You can surely make your own demo on the same maps, with the same indoor-lighting advantages. Anandtech's review stands out next to the other reviews. I wonder why. ;)

Both are good cards. I just want to make sure people don't look without thinking, and don't just assume. Read reviews and think for yourself.

Nvidia has a decent boost in performance, especially on the levels with more complex lighting. I'm glad they fixed the shader issues the 6800 had when it was first released. The shader effects on the floor were pretty ugly.

I don't like this trend. How long do you want to wait for them to patch visual PS3.0 updates into other games like HL2, Doom 3 and STALKER after the game is released? Would you wait until you can buy a 6800U, and then wait even longer for the game to actually support the features you waited for, after you've already beaten the game?

I just don't think PS3.0 is the future. I see PS3.0 as basically making up for a lack of detail and poly count for now. Tim has even stated that in UE3 they don't need displacement mapping, since the poly count is high enough to display the detail.
 
You have your opinion and you are entitled to it.

Asus said:
Here is xbitlabs review.
Using a company-sponsored and company-created demo is not the right way to do a fair review. You can surely make your own demo on the same maps, with the same indoor-lighting advantages.
THEY USED 2 CUSTOM DEMOS! ...did you not see that part? Big or small performance increase either way, you can't dismiss the fact that they didn't use ALL of Nvidia's demos for their tests. I don't see how you can call them biased for trying to do something that isn't corporate-given.

:\

Asus said:
I don't like this trend. How long do you want to wait for them to patch visual PS3.0 updates into other games like HL2, Doom 3 and STALKER after the game is released? Would you wait until you can buy a 6800U, and then wait even longer for the game to actually support the features you waited for, after you've already beaten the game?
I really don't see how it's a trend; it's just one game doing this. As far as I know, most of the SM3.0 titles coming later this year will be shipping with SM3.0 features.

Asus said:
Anandtech's review stands out next to the other reviews. I wonder why. ;)
I could say the same thing about [H]'s standing out next to the other reviews, but you seem to support them just fine?

Why argue about it? We are who we are, and personal preference involves personal opinion.
 
Alig said:
Wait no longer, little boy.

It is right there in front of your _****ING EYES THAT THE 6800ULTRA BEATS THE XT PE_, ffs man. Either wake up, or go the **** back to sleep.

You're the sort of person that buys a Vauxhall Corsa 1.1 and says "my car is better than a Toyota Supra."

Say whatever you want, guinny... everyone knows you will never back down from ATI being worse than Nvidia, because you have put your money down on an XT PE. That's where you and I differ: I've already spent £350 getting my X800 Pro delivered, but that doesn't mean I'm gonna phone up ATI and ask if I can stick my dick in their arsehole.

In fact, just because you are so blatantly arrogant/ignorant/twat-like, you're going on ignore. I can't stand you. :flame:

Time for one of my numbered owning sessions.

1. Try looking at more than one review site.
2. I haven't put my money down on anything yet; I'm still A) deciding and B) waiting for the cards to come out.
3. An ATI fanboy is someone who completely disregards facts in order to stick up for ATI. Me saying an ATI Rage 128 is better than a 6800 GT would be fanboy material.
4. Idk why you're thinking about sticking your dick in ATI's asshole? :-\
5. I love watching you get so angry when you're clearly wrong. :naughty:
6. Idk why you call me a stuck-up rich boy; my family's income isn't even 80,000, and just because I work and can pay for things doesn't make me rich.
7. badboy, thanks for holding off on calling me a fanboy; I respect that. People aren't looking at the wider picture. The XT PE has the POTENTIAL to be "slightly better" (don't want to piss anyone off by saying "kicking the shit", cuz then you'll get all technical on me) than the 6800.
 
Who was the person that said their parents gave them the choice of getting a new car or a computer?
 
Guinny, seriously, if you still don't have your X800, STFU; no one wants to hear your ATI fanboyism... Both cards rule, just be quiet for Christ's sake.
 
Rumors are flying around saying that Anandtech did not enable AA for Nvidia's cards, while ATI's cards were running AA.
With the drivers they used to benchmark Nvidia's cards, you have to enable AA in-game for Far Cry, not in the control panel.
With ATI's drivers, you can enable it via the control panel.
People think Anandtech enabled AA through the control panel, which would mean Nvidia's cards were not actually running AA in those select benchmarks.

Some comments at Beyond3D

That would also help to explain the differences in the benchmarks between sites. Anandtech's review might have been a little sloppy. :\

Hexus' review
I don't think I will be upgrading to the 1.2 patch for my 9800 Pro. That is, if I play the game again, since I have already beaten it...
 
There should be a separation between game developers and graphics card companies. It looks like patch 1.2 is a performance drop for everything except the new 6800 series, which sounds a little suspect to me. Though I can't wait for the 1.3 patch with HDR and other goodies. Hopefully they actually take the time to optimize for other chipsets besides the 6800.

Now the Anandtech review makes sense; they seriously borked the AA tests.
 
blahblahblah said:
Now the Anandtech review makes sense; they seriously borked the AA tests.
Hehe.
Also, the 6800U EE is not even an Nvidia product.
It's an OCed 6800U. I wonder how it got in there.
 
Asus said:
Rumors are flying around saying that Anandtech did not enable AA for Nvidia's cards, while ATI's cards were running AA.
With the drivers they used to benchmark Nvidia's cards, you have to enable AA in-game for Far Cry, not in the control panel.
With ATI's drivers, you can enable it via the control panel.
People think Anandtech enabled AA through the control panel, which would mean Nvidia's cards were not actually running AA in those select benchmarks.

Some comments at Beyond3D

That would also help to explain the differences in the benchmarks between sites. Anandtech's review might have been a little sloppy. :\
If it's true, that's a huge mess-up. I hope Anandtech fixes those graphs accordingly if it turns out to be true.


But I honestly can't see the AA difference between these two shots... can you?
ati
http://www.anandtech.com/video/show...reviews/video/nvidia/farcrysm3/volcanoati.jpg
nvidia
http://www.anandtech.com/video/show...reviews/video/nvidia/farcrysm3/volcanosm3.jpg

^images where AT showed the biggest increase in performance.

Here's an interesting Swedish SM3.0 review; I can't really read it, but the numbers are interesting:
http://www.nordichardware.se/Artiklar/?skrivelse=252
^found in that same thread.
Asus said:
Hehe.
Also, the 6800U EE is not even an Nvidia product.
It's an OCed 6800U. I wonder how it got in there.
Not exactly Nvidia, but eVGA, Gainward, and BFG exclusives. Partners were allowed to change things with the revision boards.

eVGA 6800 Ultra Extreme (450 core, 1200 memory)
http://www.evga.com/articles/public.asp?AID=188

Gainward 6800 Ultra Extreme (450 core, 1200 memory)
http://translate.google.com/transla...=UTF-8&ie=UTF-8&oe=UTF-8&prev=/language_tools

BFG 6800U OC Watercooled (470 core, 1100 memory)
http://www.chumbo.com/info.asp?s=MGX-10129072

True, it's an OC, but not even the BFG Ultra OC (non-watercooled) has those stock clocks, so it is a real card.
 
Damn, that watercooled BFG 6800U is ****ing nice looking :)

But that Gainward looks weird; how does it cool itself?
 
DiSTuRbEd said:
But that Gainward looks weird; how does it cool itself?
It says...
This GeForce 6800 Ultra, for which Gainward guarantees a frequency of 450 MHz for the GPU and 1200 MHz for the memory, is cooled by an all-copper "CopperCooler™" heat sink with dual fans.
The two fans, which reach a maximum speed of 4500 rpm, generate noise that can stay below 25 dB at their fallback speed.

But I still don't know what that is lol. Some kind of copper heat sink or something :rolling:
 
Looks like Anandtech updated the review. The X800XT and 6800U exchange blows throughout, trading off: sometimes the X800XT PE wins, sometimes the 6800U wins. The GT wins more than the Pro, and even beat the XT PE in some benches.

Anyways, to me gains are gains, big or small; I'll take 'em. Looking forward to more SM3.0 and 3Dc titles, rather than just Far Cry.
 
Ah ha. Interesting.
It was the AA issue.
I'm glad they fixed their mistakes. :thumbs:
Except it is still uneven; I wouldn't include the 6800U EE in standard benchmarks. It should be in its own review as a special card.
They should OC the X800XT PE otherwise. ;)
jk

I looked through it comparing the no-AA/AF graphs first, then went back and only looked at the 4xAA/8xAF graphs comparing the X800XT PE to the 6800U EE. A very good review now.
blahblahblah, your X800 Pro looks like it would be very playable with AA/AF and the res up high.
The X800XT PE doesn't lose once to its competitor, the 6800U, with AA/AF enabled.
 
Asus said:
Except it is still uneven; I wouldn't include the 6800U EE in standard benchmarks. It should be in its own review as a special card.
They should OC the X800XT PE otherwise. ;)
jk

lol, I don't see why not; it's a real card. I'm sure if there were an 'X800XT PE OC' they would. If you think about it, it pretty much is an OCed X800XT (non-PE).

:dork:
 
Hehe.
Well, for instance, when Gainward had an OCed GF4 back in the day, they only benchmarked it in its own review, never in a driver-update review or anything like that. They should only show the standard cards, which compare directly to each other.

The X800XT PE is just as much an OC as the 6800U.
 
Asus said:
Hehe.
Well, for instance, when Gainward had an OCed GF4 back in the day, they only benchmarked it in its own review, never in a driver-update review or anything like that. They should only show the standard cards, which compare directly to each other.

The X800XT PE is just as much an OC as the 6800U.

Well, if you think about it...

x800pro - 6800GT
x800xt - 6800U
x800xt-pe - 6800U EE ?

But just because Nvidia doesn't make the EEs directly, they aren't compared like that; that, and the different price range. It's just more exclusive for those partners. But I don't think that means they shouldn't be benchmarked, since you can acquire them.
 
The X800XT PE is a $500 card just like the 6800U.
The X800 Pro is a $400 card just like the 6800GT.
X800 pricing, naming and specs
6800U EE

“There is lots of confusion going on. NVIDIA is not launching anything new. NVIDIA does not have a product called the ‘GeForce 6800 Ultra Extreme,’” an NVIDIA spokesman told X-bit Labs on Thursday.
This is why I don't think they should be benchmarked in normal reviews.
 
lol, I just said that Nvidia didn't make the EE,
and I just mentioned the price-range differences.
Maybe you didn't read my post lol.
Obviously Anandtech thought it was OK to review it.

Never mind... it's just a personal opinion, so there's no use debating it.
 