thehunter1320 · Newbie · Joined: Jul 9, 2003 · Messages: 3,361 · Reaction score: 0
Abom said: http://mbnet.fi/elixir/NV40/
As I've already said, it's impressive, but I was kind of expecting more... the frame rates are higher, but not an incredible amount higher than the 9800XT. I'm waiting to see what ATi can do.
Netherscourge said: I think the great thing about all of this is that ATI FORCED Nvidia to make this card.
So we, the gamers, are the true winners. Well, let's hope the price is decent...
*cough* <$500 *cough*
Asus said: A week or so back I was hearing $299 for the 6800 (non-Ultra) and $399 for the 6800 Ultra. But the source was from Europe, so I have some doubts about those prices being in USD.
Asus said: OOo thx for the extra links.
I forgot to check Firingsquad.
No problem bro, glad to help.
Asus said: Something I forgot, though.
ATI may match Nvidia at the high end, but the battle between the X800 and 6800 will be interesting because the X800 will have fewer pipelines than the X800 XT.
The 6800, I would think, would be a clear winner.
Again, we have to wait to find out.
Netherscourge said: http://www.anandtech.com/video/showdoc.html?i=2023
How will this affect Valve, HL2, and Gabe Newell going with ATI as the endorsed video card manufacturer? HL2 is not even gold yet, with no end in sight. And here Nvidia will have their 6800 cards out before HL2, and their card is effectively two times more powerful than the 9800 XT, now sporting 16 pipelines as opposed to ATI's 8 pipes.
But to be fair, ATI also announced 16 pipelines in their new card, which is yet to be fully revealed.
Will the mighty Gabe be eating any crow? Or is this just all a result of falling behind in the development process due to the source code fiasco?
Please, Discuss!
Asus said: Something I forgot, though.
ATI may match Nvidia at the high end, but the battle between the X800 and 6800 will be interesting because the X800 will have fewer pipelines than the X800 XT.
The 6800, I would think, would be a clear winner in certain areas (AA/AF performance?).
Again, we have to wait to find out.
Majestic XII said: The new shader thingy is just for show-off atm. The card will be outdated by the time games use/need 3.0... it's a marketing thing.
Majestic XII said: You really don't need the huge amount of data shader model 3.0 gives you yet. It will take some time until we get games that really make it more than just a cool feature.
Majestic XII said: You can call me an ATI fanboy, but I think it's better that someone doesn't say "OMG LOOK AT ALL THE NUMBERS!! F0XXorS 9800 XT!" like all the others.
The card doesn't overclock much either, because of the big die size. I've seen ~11% on the core and 6% on the mem... not really impressive if you ask me (but then again... I'm an ATI fanboy...).
We don't know much about ATI's cards. We know that the die size will be smaller (a better overclocker), with higher clock speeds and 16 pipelines (just like the new GeForce).
Like I said before... it will be very interesting.
The prices seem to be OK too (for both parts).
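For what it's worth, those headroom percentages are easy to put in MHz terms. A quick sketch, assuming the commonly cited 6800 Ultra stock clocks of 400 MHz core / 550 MHz memory (my assumption, not stated in the post):

```python
# Stock clocks below are assumed (400 MHz core / 550 MHz memory for the
# 6800 Ultra); the ~11% / 6% headroom figures come from the post above.
def overclocked(base_mhz, headroom_pct):
    """Clock reached with the given percentage of overclocking headroom."""
    return base_mhz * (1 + headroom_pct / 100)

core = overclocked(400, 11)   # ~444 MHz
mem = overclocked(550, 6)     # ~583 MHz
print(f"core: {core:.0f} MHz, memory: {mem:.0f} MHz")
```

So even granting those assumptions, the gains are a few tens of MHz, which is why the headroom looks modest.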
Ok, here are some of my report notes (from the speech):
NVidia poured $1 billion of R&D into the NV40 lineup, covering high end down to entry level, all set to be rolled out this year.
Focused on 4 priorities:
Priority 1: make sure NV40 was fastest on market, period.
Priority 2: advance shader technology, support ps3.0
Priority 3: prosumer home theatre video, Hi-Def
Priority 4: Film quality fx in games, real time
6800 Ultra model with 512MB of GDDR-3 coming in June.
Introduced Nalu, video sample.
Slide: Superscalar 16-pipe GPU
NV40 is completely new architecture
NV40 quadrupled # of pipes of NV3x, doubled ALUs
(Democoder note: probably more significant, that bugs in NV3x with FP32 and register limits were also fixed)
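For a rough sense of what quadrupling the pipes means on paper, here is a back-of-envelope fill-rate sketch. The clocks are my assumptions (400 MHz for the 6800 Ultra, 475 MHz for the 5950 Ultra), not figures from the talk, and this counts one pixel per pipe per clock:

```python
# Back-of-envelope theoretical pixel fill rate. Clock speeds are assumed
# (400 MHz NV40, 475 MHz NV35/5950 Ultra), not from NVidia's presentation.
def fill_rate_gpix(pipes, core_mhz):
    """Theoretical pixel fill rate in Gpixels/s: one pixel per pipe per clock."""
    return pipes * core_mhz / 1000

nv40 = fill_rate_gpix(16, 400)  # 6.4 Gpix/s
nv35 = fill_rate_gpix(4, 475)   # 1.9 Gpix/s
print(f"NV40: {nv40} Gpix/s, NV35: {nv35} Gpix/s, ratio: {nv40 / nv35:.1f}x")
```

Real-game gains are smaller than the raw ratio, of course, since games are rarely pure fill-rate limited.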
Chart: raw lab data from performance lab (see image I sent), graph of suite of games at every resolution/setting, comparing NV40 vs NV35
Slide: 2X NV40 is 2x faster than 5950 in games across the board (DC Note: and probably WAY faster in games like HL2)
Demos: Nalu
150,000 vertices in hair alone
A single shader for entire model, uses branches to determine whether to write skin or mermaid scales
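The single-shader-with-branches approach can be sketched as a toy model: instead of binding one compiled shader for skin and a second for scales, one shader body picks the path per fragment at runtime (the SM3.0 dynamic-branching idea). All names and flags below are illustrative, not from the demo:

```python
# Toy model of SM3.0-style dynamic branching: a single shader chooses the
# code path per fragment, rather than the app binding a different compiled
# shader per material. Names here are made up for illustration.
def shade_skin(fragment):
    return ("skin", fragment["uv"])

def shade_scales(fragment):
    return ("scales", fragment["uv"])

def mermaid_shader(fragment):
    """One shader for the whole model; branches on a per-fragment flag."""
    if fragment["is_scale"]:
        return shade_scales(fragment)
    return shade_skin(fragment)

frags = [{"is_scale": False, "uv": (0.1, 0.2)}, {"is_scale": True, "uv": (0.5, 0.5)}]
print([mermaid_shader(f)[0] for f in frags])  # ['skin', 'scales']
```

The payoff is fewer shader switches and state changes, at the cost of branch overhead inside the shader.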
Demos: Asteroid Field (Geometry instancing)
Shows thousands and thousands of asteroids, 200,000 polygons. 11fps on the 6800, very jerky. Uses 11,000+ DrawIndexedPrimitive calls.
Switch on geometry instancing -> butter smooth (vertical sync locked?) at 29.99fps. 23 DrawIndexedPrimitive calls.
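The draw-call arithmetic behind that jump can be sketched like so. The split of 23 unique asteroid meshes across roughly 11,000 total instances is my assumption, chosen only to match the demo's numbers:

```python
# Why instancing slashes draw calls: without it, every asteroid is its own
# draw call; with it, you issue one call per unique mesh and the GPU
# replicates it per instance. Mesh/instance split is assumed for illustration.
def draw_calls(instances_per_mesh, instancing):
    """instances_per_mesh: list of instance counts, one entry per unique mesh."""
    if instancing:
        return len(instances_per_mesh)  # one call per unique mesh
    return sum(instances_per_mesh)      # one call per individual instance

meshes = [480] * 23  # assume 23 unique meshes, ~11,000 instances total
print(draw_calls(meshes, instancing=False))  # 11040
print(draw_calls(meshes, instancing=True))   # 23
```

Since each draw call carries fixed CPU/driver overhead, cutting 11,000+ calls to 23 is exactly the kind of change that turns 11fps into a vsync-capped 30.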
Demos: Timbury. Bugs-life type movie. Impressive. Demonstrates HDR rendering.
Demos: EQ2. (See video)
Demos: FarCry PS3.0. Uses displacement mapping all over the place (not "offset mapping" like the Unreal3 engine). Also dynamic shadows everywhere. They showed side-by-side images. It looks *WAY* better hands-on compared to current FarCry.
Demos: Lord of the Rings RTS (canned video, playable version at E3)
Demos: UnrealEngine3
Music Video: Set to Evanescence's "Bring Me to Life": Nalu, Dawn, Dusk, and indeed nearly every NVidia demo character dancing (just fancy editing).
Our card is using 256MB of 2ns Samsung GDDR3 SDRAM rated at 500MHz. So the fact that the memory is running at 550MHz by default means it is "overclocked" past the rated speed of these modules. Looking at Samsung's website, we find that there are faster modules in this series. It seems there are GDDR3 modules from Samsung capable of running at 1.25ns, delivering 800MHz. So the future looks bright, with the possibility of seeing faster chips out there.
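Those module ratings follow directly from simple cycle-time math (f in MHz = 1000 / t in ns), and the bandwidth at the default clock follows if you assume the 6800 Ultra's 256-bit memory bus (my assumption, not stated above):

```python
# Cycle time -> rated clock, and peak bandwidth at a given clock.
# The 256-bit bus width is an assumption about the 6800 Ultra, not
# something stated in the quote above.
def rated_mhz(cycle_ns):
    """Rated clock in MHz for a module with the given cycle time in ns."""
    return 1000 / cycle_ns

def bandwidth_gbs(clock_mhz, bus_bits=256):
    """Peak bandwidth in GB/s: clock * 2 (double data rate) * bus width in bytes."""
    return clock_mhz * 1e6 * 2 * (bus_bits / 8) / 1e9

print(rated_mhz(2.0))      # 500.0 MHz -- the 2 ns modules on the card
print(rated_mhz(1.25))     # 800.0 MHz -- the faster modules on Samsung's site
print(bandwidth_gbs(550))  # peak GB/s at the card's 550 MHz default
```

By that math, the jump from 550MHz to 800MHz-class modules would take peak bandwidth from about 35 GB/s to roughly 51 GB/s on the same bus.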