Lost Coast: The nVidia levels?

Narcolepsy

Newbie
Joined
Sep 13, 2003
Messages
1,715
Reaction score
0
Well, isn't this an ironic twist of fate. For the best quality on the upcoming Lost Coast levels, you need a card with Pixel Shader 3.0. (That was the supposed "fluff" feature that everyone ignored on nVidia's 6xxx line of cards.) There are no current ATI cards that meet this spec - we'll be waiting until ATI's new line for that.

Click here for the gruesome details.

What does this mean?
1.) 9800 Pro users can still play the levels, just at lower quality.
2.) [speculation] We'll have to wait for the R520 cards to come out for Valve to release the levels due to the marketing deal with ATI.
 
They aren't called nVidia levels; they were called ATi levels. Just because nVidia have the only PS3.0 cards out doesn't mean the R420 or whatever can't do just as well in the levels.
 
I wonder if PS2.0b will work as well since it has many of the features PS3.0 has, just not to the same extent. Both are much beyond 2.0. No matter, I'm prepared!

Again, it's not like The Lost Coast is an expansion such as Aftermath. It's just a level; it won't make or break your choice of gfx card. ;)
Thanks for the heads up, will be looking for new info to come.
 
PS2.0b should work better than 2.0a, but it is still not the same as 3.0, and I suspect that it will not fall into the high graphics setting category. Also, I hate to say it, but to the people who thought it was not important: I told you so. PS3.0 is a benefit and will allow high-quality HDR through what now seems to be the only HQ implementation.

"ATI levels" would be an appropriate name if HL2 were 2 years or so older, but not until the new PS3.0 cards come out, which might put ATI back on top. Good thing people like Asus and I chose correctly. However, I would like to see the performance difference between HQ and LQ.
 
DiSTuRbEd said:
They aren't called nVidia levels; they were called ATi levels. Just because nVidia have the only PS3.0 cards out doesn't mean the R420 or whatever can't do just as well in the levels.
So did you read the post?

Asus said:
Thanks for the heads up, will be looking for new info to come.
No problem. It sure does seem like info and excitement are scarce these days. I hardly ever venture into General Discussion anymore.
 
It does seem strange that Valve haven't employed a type of shader that ATI can currently support. I can almost guarantee they will use the Lost Coast level as a demonstration of ATI's 'ultimate new graphics card' with PS3.0.
 
J_Tweedy said:
It does seem strange that Valve haven't employed a type of shader that ATI can currently support. I can almost guarantee they will use the Lost Coast level as a demonstration of ATI's 'ultimate new graphics card' with PS3.0.
Exactly. It would be good marketing to show off a shiny new ATi card with the Lost Coast level, and a good time to do this is just around the corner - E3!
 
Whew, I was getting kind of worried about my 6600GT, because if you type "mat_info" in the console, it states that it's not capable of HDR (which I know is not the case), but I thought maybe the Source engine didn't pick up the new 6 series cards. Glad to hear the news.
 
I'm not worried, my 512 meg of pc2100 ram and 9600XT will shit itself if I play the Lost Coast, so I'm gonna find someone who CAN play it and either watch them, or have em FRAPS it.
 
Yeah, this will be something you could just watch and 'oooh' and 'ahhh' at; less of a game than a demonstration of technology. I have doubts that my A64 3200+ with 1GB RAM and 6800GT will even run it very well. If I find out it won't, I'd rather not download it and just watch it on a computer that will run it smooth as butter.
 
I can't believe I'm going to have to buy yet another graphics card just to see the best quality for Lost Coast! I've only just bought one!
 
T.H.C.138 said:
J_Tweedy, you should be fine with that..I hope so, cause that's the same as my PC..

there is a HDR demo here that might give you some indication of how it (HDR) will run on your PC

http://www.daionet.gr.jp/~masa/rthdribl/

That's a demo with a couple of shiny balls and a HDRI background, a long shot from a game environment with 20x the complexity and interactions.
 
it's still a little taste..and a little test..and like I said, SOME indication..at least of whether the video card blows up or not..;)

here's an idea! run the HDR demo thingy AND run the NovodeX physics demo..with Winamp playing

THAT would be a CLEAR indication of a PC being able to run the Lost Coast..or melt down..

one or the other :naughty:
 
Is that the same one with the skull? If so, I've tried it; did some tweaking with bloom and depth and got some very respectable framerates. Looks frickin' awesome.
 
that's the one! PvtRyan is right though, it isn't truly a good indicator of much beyond the vidcard aspect of a PC..

like my post above yours says, try running the physics demo, the HDR and Winamp for max audio-visual CPU stressage!!!

I am kidding about all that...although don't let that stop me...I mean you from trying it out;)
 
Methinks Narcolepsy may have an nVidia card...
 
J_Tweedy said:
It does seem strange that Valve haven't employed a type of shader that ati can currently support. I can almost guarantee they will use the lost coast level as a demonstration of ati's 'ultimate new graphics card' with ps3.0.

This is most likely the case. Besides, ATI is already going to use Remedy's Alan Wake (Remedy are the guys who made Death Rally and both Max Paynes) as some sort of graphical presentation medium at E3 so what Tweedy said would only be logical.
 
Well, as far as I know, Shader Model 3.0 requires at least 32-bit floating point, which as far as I know is why nVidia could say they supported SM 3.0, since their cards have 16-bit and 32-bit floating point support.

ATi only has 24-bit support, if I'm correct.
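The precision gap being argued over here is easy to demonstrate. Below is a minimal sketch using Python's standard `struct` module, comparing IEEE half precision (16-bit) with single precision (32-bit); ATI's 24-bit internal format has no standard software equivalent, so it isn't shown:

```python
import struct

def roundtrip_fp16(value):
    # Pack into IEEE 754 half precision (16-bit) and read it back.
    return struct.unpack('<e', struct.pack('<e', value))[0]

def roundtrip_fp32(value):
    # Pack into IEEE 754 single precision (32-bit) and read it back.
    return struct.unpack('<f', struct.pack('<f', value))[0]

# Half precision has an 11-bit significand, so integers above 2048
# can no longer be stored exactly; 32-bit has no trouble here.
print(roundtrip_fp16(2049.0))  # 2048.0
print(roundtrip_fp32(2049.0))  # 2049.0
```

The same effect in a framebuffer means fewer distinct shades, which matters once HDR stretches the brightness range.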
 
PS3.0 isn't the issue. Anything you can do in PS3.0 can basically be done in PS2.0.

The issue is floating point blending, which ATI cards don't have, meaning they can't natively support the floating point colour format that HDR needs. So for the ATI cards, Valve is converting that floating point format to an integer format, storing all that colour information in a 16 bit integer buffer. However, this greatly restricts the colour range, leading to banding and other artefacts.

The high-end HDR uses PS3.0 because it's a lot easier to program for, and all the cards that support floating point blending also support PS3.0.
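To make the banding point above concrete, here is a toy sketch (not Valve's actual pipeline; the 16.0 luminance cap is an invented example value) of what happens when HDR colour data gets squeezed into a fixed-range 16-bit integer buffer:

```python
def store_int16(value, max_luminance=16.0):
    # Simulate a 16-bit integer buffer: the representable range is
    # fixed up front, values beyond it are clamped, and everything
    # inside it snaps to one of 65536 evenly spaced steps.
    clamped = min(max(value, 0.0), max_luminance)
    step = round(clamped / max_luminance * 65535)
    return step / 65535 * max_luminance

# Anything brighter than the chosen range is simply clamped...
print(store_int16(40.0))  # 16.0
# ...and distinct dark shades collapse onto the same step - banding.
print(store_int16(0.00010) == store_int16(0.00012))  # True
```

A floating point buffer avoids both problems because its steps get finer near zero and its range isn't capped in advance.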
 
after a beer or two you can't tell jack from PS2.0 or PS3.0.

Man, no wonder the consoles hate us!
 
And here I thought ATI was trying to make everyone believe PS3.0 (and true 32-bit floating point) are useless features.
 
lovin' my 6800GT right now! But I bet it'll chug with only a gig of RAM and an A64 3200+.
 
maybe, maybe not.

HDR in Far Cry and Splinter Cell are pretty demanding but my 6800gt runs them both fine (my other specs are 512mb RAM and a p4 2.4ghz). But then again Valve are saying they're pushing the limit with bump-maps and details etc. so we'll see.
 
Who wants to bet the release of the Lost Coast will coincide with the release of ATI's new graphics cards?
 
wilka91 said:
ps3.0 is just a little faster, visually exactly the same
Have you played Splinter Cell: Chaos Theory? ps 3.0 makes a huge difference with HDR Rendering and other fancy stuff.
 
Correct me if I'm wrong, but wasn't "3Dc" ATI's version of SM 3.0? It makes you wonder why Valve would say you should use SM 3.0 for their "ATI" levels, especially after announcing that HL2 would make use of 3Dc. Which has yet to happen. :(
 
ShmengeTravel said:
Correct me if I'm wrong, but wasn't "3Dc" ATI's version of SM 3.0? It makes you wonder why Valve would say you should use SM 3.0 for their "ATI" levels, especially after announcing that HL2 would make use of 3Dc. Which has yet to happen. :(

3Dc is just a normal map compression technology.

If I recall correctly, the next-gen nVidia card will also have something like it.
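A quick aside on why 3Dc can compress normal maps so aggressively: a tangent-space normal is a unit vector, so only two of its three components need storing, and the third can be rebuilt on the fly. A rough sketch of that reconstruction (plain Python, not actual shader code):

```python
import math

def reconstruct_z(x, y):
    # 3Dc keeps only the X and Y components of a unit normal;
    # Z comes back from the constraint x^2 + y^2 + z^2 = 1.
    # max() guards against tiny negative values from compression error.
    return math.sqrt(max(0.0, 1.0 - x * x - y * y))

# A normal tilted partway off the surface:
z = reconstruct_z(0.6, 0.0)
print(round(z, 6))  # 0.8
```

Dropping a whole channel, then compressing the remaining two, is where the bandwidth savings come from.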
 
ShmengeTravel said:
Correct me if I'm wrong, but wasn't "3Dc" ATI's version of SM 3.0? It makes you wonder why Valve would say you should use SM 3.0 for their "ATI" levels, especially after announcing that HL2 would make use of 3Dc. Which has yet to happen. :(


Because ATI was caught with their pants down when nV came out with a card supporting SM3.0 and ATI had not. The fact that ATI is so gung-ho about having SM3 support on their new card (and having Valve push it as well) proves it.
 
TheSmJ said:
Because ATI was caught with their pants down when nV came out with a card supporting SM3.0 and ATI had not. The fact that ATI is so gung-ho about having SM3 support on their new card (and having Valve push it as well) proves it.


and when NV40 cards choke on the Lost Coast, ATI will be proven correct that the hardware at the time wasn't capable of running it properly. nVidia putting SM3.0 in cards that can't run it well is pretty pointless.
 
somehow I don't think that NV40 cards will "choke" on this..they may not be as |337 as the NEW ATi cards, but when nVidia comes out with the next one, ATi cards will "choke" on whatever game is out at the time...rinse, wash and repeat forever

just change company names and products as needed
 
nvrmor said:
and when NV40 cards choke on the Lost Coast, ATI will be proven correct that the hardware at the time wasn't capable of running it properly. nVidia putting SM3.0 in cards that can't run it well is pretty pointless.


They can't? What about Splinter Cell: CT? Far Cry? They play smooth as silk for me with SM3. :rolleyes: :upstare:
 