Lost Coast: The nVidia levels?

TheSmJ said:
They can't? What about Splinter Cell: CT? Farcry? They play smooth as silk for me with SM3. :rolleyes: :upstare:


suuure they do. sli doesn't count either.

So how does enabling Shader Model 3.0 actually provide a better gaming experience in Splinter Cell: Chaos Theory? Soft Shadows showed the least amount of difference; whether they were turned on or off, we honestly did not notice much difference in shadow quality as we were playing through the game. We did notice some differences with Parallax Mapping, but the differences weren't earth-shattering. We will definitely be looking for more examples in the game of differences with Parallax Mapping.

The only quality setting that really made a difference in gameplay was HDR. HDR is simply fantastic, and in this game, it is implemented very well. Unfortunately, it seems that even the BFGTech GeForce 6800 Ultra OC just isn’t fast enough to play with HDR enabled at high resolutions. In addition, Anti-Aliasing does not work with HDR; therefore, aliasing really becomes a problem at the 1024x768 resolution we had to play at to get acceptable performance. Since you can’t crank up the resolution to 1600x1200 with HDR enabled, it becomes a feature you’d rather disable to either enable AA or turn up the resolution. Both AA levels and resolution have a large impact on image quality in this game as well.

[H]

smooth as silk at 10x7 with no aa/af. have fun :hmph:
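
For what it's worth, the reason AA and HDR clash on that generation of hardware is that the FP16 render target used for HDR can't be multisampled on the GeForce 6 series, so the game has to pick one or the other. Here's a minimal D3D9 sketch of the kind of check an engine could do (the function name is mine, not Ubisoft's or Valve's actual code):

Code:
#include <d3d9.h>

// Ask D3D9 whether the 64-bit FP16 colour format used for HDR can be
// multisampled on the installed card. On a GeForce 6800 this reports
// "not available", which is why AA gets forced off when HDR is on.
bool CanMsaaWithFp16Hdr(IDirect3D9* d3d, D3DMULTISAMPLE_TYPE samples)
{
    HRESULT hr = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        D3DFMT_A16B16G16R16F,   // FP16 render target format
        TRUE,                   // windowed mode
        samples, NULL);
    return SUCCEEDED(hr);
}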
 
Either way, my X800 XT PE is on Ebay and the new 6800 Ultra is being posted next week :)
 
I'm sorry but you seem to be acting like a bunch of dumbasses. (no offense meant...really!)

How do you even think Valve would develop Lost Coast if there wasn't any hardware they could run it on to check whether it works or not? Besides, the differences between SM2.0 and SM3.0 aren't trivial, AND "visually the same" is a stupid argument, since it's the coders/artists writing the shaders who determine the visual look of the game, not the techniques underneath.
 
Para said:
I'm sorry but you seem to be acting like a bunch of dumbasses. (no offense meant...really!)

Hehe, that's an oxymoron if I ever saw one.
 
The reason you "need" a Pixel Shader 3.0 graphics card is the specular lighting, as well as everything else that will be in the level. However, I think of it this way: if FarCry works, The Lost Coast will work.
 
nvrmor said:
suuure they do. sli doesn't count either.



[H]

smooth as silk at 10x7 with no aa/af. have fun :hmph:

I played it at 1280x1024 w/ 8X AF. It was smooth.

Pissed cause you can't play it in SM3? :upstare:
 
TheSmJ said:
I played it at 1280x1024 w/ 8X AF. It was smooth.

Pissed cause you can't play it in SM3? :upstare:


no, i'd rather have high rez with aa and a high framerate than a minimal iq upgrade.

what are your specs btw? and what do you consider a smooth framerate?
 
CPU: 2400+ Mobile OCed to 2.44GHz (231x11)
Motherboard: ABIT NF7-s REV. 2.0
Video Card: eVGA 6800 GT OCed to Ultra
RAM: 512MB Kingston HyperX DDR3500
Hard Drive: 2X WD800JB (80GB, 8MB cache) in RAID-0
Heatsink/Fan: SLK-947u + Enermax 92mm adjustable fan
Case: SX1040BII
PSU: Antec TruePower 430W
Mouse: Logitech MX700
Optical drives: 16X DVD-ROM, 16X DVD-RW
Monitor 1: Compaq P1100 (Trinitron 21")
Monitor 2: KDS VS 195 (19")
UPS: APC BackUp XS 1000

I consider 30+ FPS "smooth".

At that resolution there was no real need for AA IMO. It looks nice in HL2 at that rez, but SC is too dark for the jagged edges to be noticeable.
 
nvrmor said:
nvidia putting sm3.0 in cards that cant run it well is pretty pointless.
If that's the case, why the hell does my 6800GT run Far Cry FASTER when using 3.0 shaders? Quick answer: because 3.0 lets more shader work get done per pass than 2.0, or even 2.0b, so the engine can fold several lighting passes into one.

When it's leveraged strictly for shader performance (it can be used to beef up IQ as well; it's up to the developer how they use it), it's possible to get a LOT higher performance when running in 3.0.

Unless they've recently changed the definition of "fast", your statement is incorrect.
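
To put the "more work per pass" point in concrete terms, here's a minimal sketch of the kind of decision an engine can make: detect SM3.0 through the D3D9 caps and fold several per-light passes into one long shader, falling back to one pass per light on SM2.0 hardware. This is just an illustration (the draw helpers are hypothetical, not Crytek's actual code):

Code:
#include <d3d9.h>

// True when the HAL device exposes both vertex and pixel shader model 3.0.
bool SupportsSM30(IDirect3D9* d3d)
{
    D3DCAPS9 caps = {};
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return false;
    return caps.PixelShaderVersion  >= D3DPS_VERSION(3, 0) &&
           caps.VertexShaderVersion >= D3DVS_VERSION(3, 0);
}

void RenderLighting(IDirect3D9* d3d, int lightCount)
{
    if (SupportsSM30(d3d))
    {
        // One long shader loops over every light in a single pass;
        // SM3.0's longer programs and dynamic branching make this possible.
        // DrawSinglePassLighting(lightCount);   // hypothetical helper
    }
    else
    {
        // SM2.0 fallback: redraw the geometry once per light and blend
        // the results additively, which costs far more vertex work and fill.
        for (int i = 0; i < lightCount; ++i)
        {
            // DrawAdditiveLightPass(i);         // hypothetical helper
        }
    }
}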
 
TheSmJ said:
I consider 30+ FPS "smooth".

Even though I quite agree with you on this, expect all the people getting 200fps in HL2 who complain that dropping to 150fps is unacceptable and unplayable to flame you for this :)

I don't know how they can even tell the difference, unless they have SUPER GO-FAST ROBOT eyes.
 
Duracell said:
I don't know how they can even tell the difference, unless they have SUPER GO-FAST ROBOT eyes.

Eye/visual synchronization. What this means is that the eye's "FPS" is generally around 25-30 (FPS is the wrong term here because of the structure and behaviour of the eye, but it's close enough to explain what's going on), and anything that is a multiple of those values looks smoother than rates that aren't anywhere near one. That's why 60fps (30*2) actually looks smoother than 85fps (25*3=75, close but not exactly smooth; 30*3=90, also close but not close enough). There's of course some flux in the numbers, but the basic principle should be clear now.

Oh, and running a game at 100+ FPS on a monitor that only puts out 85Hz is useless, since a monitor's Hertz count tells you how many times per second it refreshes the picture; in other words, Hz = monitor FPS. Running at non-synced framerates will cause choppiness in the output.
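
To put the pacing part in numbers, here's a tiny standalone sketch (illustrative only, not tied to any game) that counts how many newly finished frames land in each monitor refresh. When the framerate and refresh rate line up, the pattern is perfectly regular; when they don't, the count jumps around, and that irregular delivery is the choppiness you see:

Code:
#include <cmath>
#include <cstdio>

// Print how many new frames complete during each refresh interval.
static void Pacing(double renderFps, double refreshHz, int refreshes)
{
    int lastTotal = 0;
    std::printf("%g fps on a %g Hz display: new frames per refresh =", renderFps, refreshHz);
    for (int r = 1; r <= refreshes; ++r)
    {
        // Total frames finished by the end of refresh number r.
        int total = static_cast<int>(std::floor(renderFps * r / refreshHz));
        std::printf(" %d", total - lastTotal);
        lastTotal = total;
    }
    std::printf("\n");
}

int main()
{
    Pacing(60.0, 60.0, 12);   // regular: exactly one new frame every refresh
    Pacing(30.0, 60.0, 12);   // regular: a new frame every second refresh
    Pacing(85.0, 60.0, 12);   // uneven: sometimes 1, sometimes 2 per refresh
    Pacing(100.0, 85.0, 12);  // the case from the post: mostly 1, with an extra frame slipping in now and then
    return 0;
}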
 
Para said:
Eye/visual synchronization. What this means is that the eye's "FPS" is generally around 25-30 (FPS is the wrong term here because of the structure and behaviour of the eye, but it's close enough to explain what's going on), and anything that is a multiple of those values looks smoother than rates that aren't anywhere near one. That's why 60fps (30*2) actually looks smoother than 85fps (25*3=75, close but not exactly smooth; 30*3=90, also close but not close enough). There's of course some flux in the numbers, but the basic principle should be clear now.

Oh, and running a game at 100+ FPS on a monitor that only puts out 85Hz is useless, since a monitor's Hertz count tells you how many times per second it refreshes the picture; in other words, Hz = monitor FPS. Running at non-synced framerates will cause choppiness in the output.

I understand this; I was making a joke about how stupid these people are when they complain that 150fps is unplayable while they're probably only running their monitor at 85Hz :p
 
Duracell said:
Even though I quite agree with you on this, expect all the people getting 200fps in HL2 who complain that dropping to 150fps is unacceptable and unplayable to flame you for this :)

I don't know how they can even tell the difference, unless they have SUPER GO-FAST ROBOT eyes.

i hate having 60 fps, you wanna know the only reason in cs?

the crosshair lag. :bonce: :eek:
 
Hmmm... I think that, of all the parts for my new compy, I'll get the graphics card last. (It would be common sense to do so, anyways...)
 
Heh, yea, I've seen that before... how the eye only sees up to a certain FPS level. So all the people with those SLI boards w/ 2 X800's in them are pwnt.
 
Yea, ATI is releasing their own version of the "SLI"

Anyways, interesting to say the least (original post). I do think you're right that VALVe is holding back because ATI doesn't have a 3.0-capable card released that I know of. Kinda sucks though that the X800 people won't get to enjoy the Best Quality (am I right on that?)

Either way, HDR will be awesome, no matter which card you have!
 