The Nvidia 6800 and 6800 Ultra

http://mbnet.fi/elixir/NV40/

As I've already said, it's impressive, but I was kind of expecting more... the frame rates are higher, but not an incredible amount higher than the 9800XT. I'm waiting to see what ATi can do.
 
Abom said:
http://mbnet.fi/elixir/NV40/

As I've already said, it's impressive, but I was kind of expecting more... the frame rates are higher, but not an incredible amount higher than the 9800XT. I'm waiting to see what ATi can do.

ya, i now realize i'm a little late to the punch... i was so proud of myself, too!
 
Here is the best review I've read. And this thing absolutely blows away the 9800 XT in anything that isn't totally CPU-limited. I need one for graduation :p .

New best review I've read: Here
 
Whoa... I really don't want that piece of metal in my computer... it'll break and blow away half of it in a sec.

I'm waiting for ATI's response to this... it will be very interesting.

Edit: Nvidia sounds like Intel now. Get as many transistors as you can on a small area, feed it with A LOT of power ("NVIDIA indicated (in the reviewers guide with which we were supplied) that we should use a 480W power supply in conjunction with the 6800 Ultra.") and screw the temp.... just slap a bigger cooler on it!

I think I'll pass on this one and go with ATI instead...
 
Enjoy ;)

Reviews:

Tom's Hardware
Anandtech
[H]ard|OCP*
Beyond 3D

*[H]ard|OCP benchmarks by playable results; resolution, AA, and AF may differ between models and benchmarks based on performance.

I will say. I am very impressed.
I now await ATI's response at the end of April.
 
New Nvidia Card SMOKES the ATI 9800 XT - Should Valve Reconsider ATI Endorsement?

http://www.anandtech.com/video/showdoc.html?i=2023

How will this affect Valve, HL2 and Gabe Newell going with ATI as the endorsed video card manufacturer? HL2 is not even Gold yet, with no end in sight. And here Nvidia will have their 6800 cards out before HL2, and their card is effectively two times more powerful than the 9800 XT, now sporting 16 pipelines as opposed to ATI's 8 pipes.
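As a rough sanity check on the pipeline comparison above, theoretical pixel fillrate scales as pipelines times core clock. The clock figures below are assumed round numbers for illustration; they aren't quoted anywhere in this thread:

```python
# Back-of-the-envelope pixel fillrate: one pixel per pipeline per clock.
# Core clocks are assumed figures (6800 Ultra ~400MHz, 9800 XT ~412MHz).

def fillrate_mpix(pipelines, core_mhz):
    # theoretical peak fillrate in megapixels per second
    return pipelines * core_mhz

nv40 = fillrate_mpix(16, 400)   # 6400 Mpix/s
r360 = fillrate_mpix(8, 412)    # 3296 Mpix/s
print(nv40 / r360)              # just under the "2x" claimed above
```

Real-world gaps are smaller, of course, since memory bandwidth and CPU limits cap both cards well before peak fillrate.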


But to be fair, ATI has also announced 16 pipelines in their new card, which is yet to be fully revealed.

Will the mighty Gabe be eating any crow? Or is this just all a result of falling behind in the development process due to the Source Code fiasco?

Please, Discuss!
 
christ almighty, some people love to stir it up, don't they?
 
we're already discussing this in the UE 3.0 thread.

EDIT: the x800 will go faster in HL2 :thumbs:
 
Merged with another thread in the HARDWARE forum.

EDIT:

Two seconds. My bad
 
Moved to Hardware.

Edit: Okay, merged properly :p
 
Too many links. ;) hehe

The card's performance is awesome, but shader quality could be an issue.
Luckily it's a driver issue that COULD be changed. Let's see if it gets changed or if they keep it low.

Plus ATI's card isn't even out. You cannot claim anything yet.
It should be better than the 9800XT or they would look bad. Especially with that power requirement. ;)
ATI shouldn't be any better for power though. Both will have 2 molex connectors.
Until we get to PCI Express, they need power from an external source.
 
I think the great thing about all of this is that ATI FORCED Nvidia to make this card.

So we, the gamers, are the true winners. Well, let's hope the price is decent...

*cough <$500 cough*
 
Netherscourge said:
I think the great thing about all of this is that ATI FORCED Nvidia to make this card.

So we, the gamers, are the true winners. Well, let's hope the price is decent...

*cough <$500 cough*

Think they're looking at a $400 pricetag for the 6800U, which isn't too bad.
 
[benchmark chart]


<expletive>
 
A week or so back I was hearing $299 for the 6800 (non-Ultra) and $399 for the 6800 Ultra. But the source was from Europe, so I have some doubts about those prices being in USD.
 
FiringSquad and Tom's Hardware had the best reviews, hands down.
http://www.firingsquad.com/hardware/nvidia_geforce_6800_ultra/

im greatly impressed. and with the NV40 so focused on 3.0 shader support (along with dozens of game developers), im pretty sure the ATI 420XT will be the only thing faster, but definitely not smarter.

if you saw the unreal 3 engine video... it was amazing.
one of the character models has the same data in texture and graphic size as an entire level in the original UT.

more reviews than you can handle: :D

http://www.beyond3d.com/previews/nvidia/nv40/
http://www.extremetech.com/article2...,1567272,00.asp
http://www.firingsquad.com/hardware...rce_6800_ultra/
http://www.guru3d.com/article/Videocards/128/
http://www.hardocp.com/article.html?art=NjA2
http://www.hardwareanalysis.com/content/article/1708/
http://www.hardware.fr/articles/491/page1.html
http://www.k-hardware.de/artikel.php?s=&artikel_id=2782
http://techreport.com/reviews/2004q...ra/index.x?pg=1
http://www.tomshardware.com/graphic/20040414/index.html

but again i think Tom's Hardware and FiringSquad have the best detailed reviews. ati is going to have to pull something mighty fine out to beat the NV40's performance and many usable features.
 
If ATI's card matches it or comes close while holding its image quality and shader quality, I would probably pick them over Nvidia's card.

If they match or are close in performance and Nvidia fixes the shader quality, then I'd go w/Nvidia.
If ATI falls short in performance, I am lost. hehe

Performance or Quality...
Have to wait and see I guess.

OOo thx for the extra links.
I forgot to check Firingsquad.
 
Asus said:
A week or so back I was hearing $299 for the 6800 (non-Ultra) and $399 for the 6800 Ultra. But the source was from Europe, so I have some doubts about those prices being in USD.

the original source was from korea. and the official prices of the cards are:

6800 Ultra: 500 USD
6800 non-Ultra: 300 USD

it's confirmed on a lot of review sites and even yahoo confirmed it.
only place that said an odd number is FiringSquad, which is kinda weird, maybe a typo?

Asus said:
OOo thx for the extra links.
I forgot to check Firingsquad.
no problem bro, glad to help :D
 
btw i'd like to add:
im glad that nvidia is finally bucking up and taking their cards seriously. ati is proving to be a great competitor, and these two cards (6800 Ultra + X800 XT) are going to seriously raise the stakes. and it's all for the better of the consumer: better prices as competition gets stiffer, and better graphics and performance with next-gen cards.

can't wait to play far cry, doom III, hl2, and stalker on these puppies :D
 
Something I forgot though.

ATI may match Nvidia at the high end, but the battle between the X800 and 6800 will be interesting because the X800 will have fewer pipelines than the X800 XT.
The 6800 I would think would be a clear winner in certain areas (AA/AF performance?)

Again, we have to wait to find out.
 
Asus said:
Something I forgot though.

ATI may match Nvidia at the high end, but the battle between the X800 and 6800 will be interesting because the X800 will have fewer pipelines than the X800 XT.
The 6800 I would think would be a clear winner.

Again, we have to wait to find out.

It's the balance between technology and price. Because it has fewer pipelines, it'll probably be much cheaper, so more people may buy it... it'll be interesting to see how it all turns out.
 
The fact that ATI didn't incorporate Pixel/Vertex Shader 3.0 in their upcoming cards is what's really gonna hurt them this generation.
 
Netherscourge said:
http://www.anandtech.com/video/showdoc.html?i=2023

How will this affect Valve, HL2 and Gabe Newell going with ATI as the endorsed video card manufacturer? HL2 is not even Gold yet, with no end in sight. And here Nvidia will have their 6800 cards out before HL2, and their card is effectively two times more powerful than the 9800 XT, now sporting 16 pipelines as opposed to ATI's 8 pipes.


But to be fair, ATI has also announced 16 pipelines in their new card, which is yet to be fully revealed.

Will the mighty Gabe be eating any crow? Or is this just all a result of falling behind in the development process due to the Source Code fiasco?

Please, Discuss!

Why do you compare a chip that's about 24 hours old with a 1-year-old chip? That doesn't make any sense to me.

The nvidia card SHOULD be better than the 9800 XT. The 9800 XT is old; wait for the X800 XT card before you say anything.

The new shader thingy is just for show atm. The card will be outdated by the time games use/need 3.0... marketing thing.

I want to see ATI's before I say anything. The card is impressive, but we don't know how impressive it is yet.
 
Asus said:
Something I forgot though.

ATI may match Nvidia at the high end, but the battle between the X800 and 6800 will be interesting because the X800 will have fewer pipelines than the X800 XT.
The 6800 I would think would be a clear winner in certain areas (AA/AF performance?)

Again, we have to wait to find out.

the real thing is between the X800 XT and the 6800 Ultra.
they both have 16 pipelines.
the X800 XT is supposedly clocked slightly higher, but doesn't have the shader support for the new games, so it won't be fully DX9.0c+ ready.

so it's more of: do you want more fps? or better shader + image quality?

the other battle is the X800 Pro and the 6800 non-Ultra.
both 12 pipelines, but again the ati series doesn't have the new shader support.

then there's the X800 SE, which has no next-gen competitor. its closest rival is probably the 5950 Ultra, because at 8 pipelines it won't be much faster, but it might cost more.
 
Majestic XII said:
The new shader thingy is just for the show off atm. The card will be outdated when games use/need 3.0... marketing thing.

hate to burst your bubble, but there's a SHITLOAD of games coming out with Shader 3.0 support, which the ati next-gen series won't have, ALL within the next 6 months.....

New Shader Model 3.0 Titles

Lord of the Rings, Battle for Middle-earth
STALKER: Shadows of Chernobyl
Vampire: Bloodlines (valve's engine, so im sure hl2 will also)
Splinter Cell X
Tiger Woods 2005
Madden 2005
Driver 3 (aka Driv3r)
Grafan
Painkiller
FarCry
...and many more

if you want the image, i'll upload it for you.

edit: looks like mrchimp beat me lol
 
That's what I meant. X800 Pro. :)

I originally thought the 6800 would have 16 pipelines but I guess not.
Should be interesting anyway.
 
Majestic is an ATI fanboy....

But look at it this way: when the 9700 Pro came out, it beat the TI4600, but not by doubling its scores....

I'd say this is pretty fecking impressive for nVidia.

Kudos, thanks for coming back into the game.

Time to wait for ATi's next, to see if they can match it.
 
Firingsquad.com mentioned the forums in their article.
"If you think back to last summer’s Half-Life 2 AA fiasco that erupted on the halflife2.net forums, chances are you’ve heard of centroid sampling. Just in case though we’ll provide a quick refresher."
 
You can call me an ATI fanboy, but i think it's better that someone doesn't say "OMG LOOK AT ALL THE NUMBERS!! F0XXorS 9800 XT!" like all the others.

You really don't need the huge amount of data shader model 3.0 gives you yet. It will take some time until we get games that really make it into something more than just a cool feature.

The card doesn't overclock much either, because of the big die size. I've seen ~11% on the core and 6% on the mem... not really impressive if you ask me (But then again... im an ATI fanboy...)

We don't know much about ATI's cards. We know that the die size will be smaller (better overclocker), with higher clockspeeds and 16 pipelines (just like the new GeForce).

Like i said before... it will be very interesting.

The prices seem to be OK too (both parts).
 
Majestic XII said:
You really dont need the huge amount of data shader model 3.0 gives you yet. It will take some time until we get games that really makes it into something than just a cool feature.

there's more to it than increased shader sizes. Take dynamic branching, for example, which in theory would be quicker than static branching: static branching carries out all the instructions and then chooses the right result, instead of just executing the right one.
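A toy sketch of that difference, with made-up per-path instruction costs (nothing here is real shader code):

```python
# Static branching / predication: every instruction on both sides of the
# "if" executes, and the right result is selected afterwards.
def shade_static(in_shadow, lit_cost=10, shadow_cost=2):
    return lit_cost + shadow_cost  # pays for both paths, always

# Dynamic branching (Shader Model 3.0): only the taken path executes.
def shade_dynamic(in_shadow, lit_cost=10, shadow_cost=2):
    return shadow_cost if in_shadow else lit_cost

# Instruction cost over 1000 pixels, 700 of them in shadow:
pixels = [True] * 700 + [False] * 300
static_total = sum(shade_static(p) for p in pixels)    # 12 * 1000 = 12000
dynamic_total = sum(shade_dynamic(p) for p in pixels)  # 2*700 + 10*300 = 4400
print(static_total, dynamic_total)
```

In practice dynamic branches carry their own overhead on the GPU, so the win only shows up when the skipped path is long enough.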
 
Wow.... :O
Very impressive.

Between all the dozens of benchmarks, these struck me the most: Call of Duty @ Tom's Hardware.
The FX5950U and 9800XT look like low-end cards next to the GeForce 6800U. CoD doesn't have any pixel shaders of course, so I'm very much looking forward to benchmarks with a fixed version of Far Cry.

I, like everyone else here, am also eagerly awaiting ATI's response to this.
 
I don't trust Tom's Hardware; those results conflict with some other sites, although I think they're only slightly exaggerated.
 
Majestic XII said:
You can call me an ATI fanboy, but i think it's better that someone doesn't say "OMG LOOK AT ALL THE NUMBERS!! F0XXorS 9800 XT!" like all the others.

You really don't need the huge amount of data shader model 3.0 gives you yet. It will take some time until we get games that really make it into something more than just a cool feature.

The card doesn't overclock much either, because of the big die size. I've seen ~11% on the core and 6% on the mem... not really impressive if you ask me (But then again... im an ATI fanboy...)

We don't know much about ATI's cards. We know that the die size will be smaller (better overclocker), with higher clockspeeds and 16 pipelines (just like the new GeForce).

Like i said before... it will be very interesting.

The prices seem to be OK too (both parts).

lemme just give you this quote from a guy who went to the nvidia lan party and saw far cry on shader 3.0 (sounds like a cool feature to have over just higher fps to me)

Ok, here are some of my report notes (from speech):

NVidia poured $1 billion R&D into NV40 lineup, including highend down to entry level, all set to be rolled out this year.
Focused on 4 priorities:

Priority 1: make sure NV40 was fastest on market, period.
Priority 2: advance shader technology, support ps3.0
Priority 3: prosumer home theatre video, Hi-Def
Priority 4: Film quality fx in games, real time

6800 Ultra model with 512MB of GDDR-3 coming in June.

Introduced Nalu, video sample.

Slide: Superscalar 16-pipe GPU
NV40 is completely new architecture

NV40 quadrupled # of pipes of NV3x, doubled ALUs
(Democoder note: probably more significant, that bugs in NV3x with FP32 and register limits were also fixed)

Chart: raw lab data from performance lab (see image I sent), graph of suite of games at every resolution/setting, comparing NV40 vs NV35
Slide: 2X NV40 is 2x faster than 5950 in games across the board (DC Note: and probably WAY faster in games like HL2)

Demos: Nalu
150,000 vertices in hair alone
A single shader for entire model, uses branches to determine whether to write skin or mermaid scales

Demos: Asteroid Field (Geometry instancing)
Shows thousands and thousands of asteroids, 200,000 polygons. 11fps on 6800, very jerky. Uses 11,000+ DrawIndexedPrimitive calls
Switch on Geometry Instancing -> butter smooth (vertical sync locked?) 29.99fps. 23 DrawIndexedPrimitive calls

Demos: Timbury. Bugs-life type movie. Impressive. Demonstrates HDR rendering.

Demos: EQ2. (See video)

Demos: FarCry PS3.0. Uses displacement mapping all over the place (not "offset mapping" like Unreal3 engine) Also dynamic shadows everywhere. They showed side by side images. It looks *WAY* better hands on compared to current FarCry.

Demos: Lord of the Rings RTS (canned video, playable version at E3)

Demos: UnrealEngine3

Music Video: Set to Evanescence "Bring me to Life", Nalu, Dawn, Dusk, and indeed, mostly every NVidia Demo character dancing (just fancy editing)
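The asteroid demo in those notes (11,000+ draw calls collapsing to 23) is the classic instancing win: instead of one draw call per object, instances are grouped by mesh and submitted in one call per unique mesh. A sketch, assuming the 23 calls correspond to 23 unique asteroid meshes (a guess; the notes don't say):

```python
from collections import defaultdict

def draw_calls_naive(objects):
    # one DrawIndexedPrimitive-style call per object
    return len(objects)

def draw_calls_instanced(objects):
    # group instances by mesh; one call per unique mesh, with the
    # per-instance transforms batched into a second vertex stream
    batches = defaultdict(list)
    for mesh_id, transform in objects:
        batches[mesh_id].append(transform)
    return len(batches)

# 11,000 asteroids built from 23 unique meshes:
objects = [(i % 23, ("transform", i)) for i in range(11000)]
print(draw_calls_naive(objects), draw_calls_instanced(objects))  # 11000 23
```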

the card doesn't overclock much??
omfg it's a revision board, they are not even on the shelves yet, and different manufacturers and distributors OC differently. and those haven't even hit the concepts yet! (just for gigabyte and asus so far)

hardocp.com says they don't even know the die size measurements on the revision boards.

i'll give you that we don't know much about the ati boards, but we do have some facts laid down. you can see them on the hardocp.com main site.

shader 3.0 will also increase fps for cards enabled with it. so it's not like it's all just show and tell.

and i'll give you that it will be interesting. it's going to take a lot for ati to top the NV40. it kinda tells you something when almost 40% of the games benched are CPU-limited (bottlenecked by the CPU) at 1600x1200 on the 6800 Ultra
:upstare:

edit:
i know im sounding like a broken record, but lemme also say:
the memory found on the revision boards of the 6800 (per hardocp again) was old memory chips, and the newer chips can clock up to 800MHz, making it 1600MHz effective. and it's very possible for the manufacturers to add this memory within the month before shelf time.

so the nvidia card might not be at full power yet, but once manufacturers get that memory and use it, it's not a crazy thought that any cpu lower than 4.0GHz might be a bottleneck.

Our card is using 256MB of 2ns Samsung GDDR3 SDRAM rated at 500MHz. So the fact that the memory is running at 550MHz by default means it is "overclocked" past its rated speed for these modules. Looking at Samsung's website we find that there are faster modules in this series. It seems there are GDDR3 modules from Samsung capable of running at 1.25ns, delivering 800MHz. So the future looks bright with the possibility of seeing faster chips out there.
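The memory numbers in that quote line up with simple DRAM arithmetic: the rated clock is the reciprocal of the access time, and GDDR3 transfers data twice per clock. A quick check (illustrative only):

```python
# Rated clock from access time, and the DDR "effective" rate for GDDR3.
def gddr_speeds(access_time_ns):
    clock_mhz = 1000 / access_time_ns   # e.g. 2ns -> 500MHz rated clock
    effective_mhz = 2 * clock_mhz       # two transfers per clock
    return clock_mhz, effective_mhz

print(gddr_speeds(2.0))    # the 2ns modules on the review boards
print(gddr_speeds(1.25))   # Samsung's faster 1.25ns parts
```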
 
When will this be in the stores :D ?

I need to have it!!!
 
I think that ATi's response will come in the following days or weeks. Since my mom just said I could get a new video card for graduation, I need to determine which card will be better, the X800 XT or the 6800 Ultra.
 
I think I'll wait till the ATI response and subsequent war....

Maybe the Radeon X890X or GF 6950 Ultra will be good..
 