The Nvidia 6800 and 6800 Ultra

As a Radeon 9800PRO owner myself, I didn't want to see an nVidia card kick my ass, but from every benchmark I've seen the new nVidia card is opening up a can of whoopass on ATI's flagship card. I've been doing research into this thing and its capabilities are just amazing. nVidia has set a high standard for ATI; it will be interesting to see what they do.
 
Majestic XII said:
Whoa... I really don't want that piece of metal in my computer... it will break and blow away half of it in a sec.

I'm waiting for ATI's response to this... it will be very interesting.

Edit: Nvidia sounds like Intel now. Get as many transistors as you can on a small area, feed it with A LOT of power ("NVIDIA indicated (in the reviewers guide with which we were supplied) that we should use a 480W power supply in conjunction with the 6800 Ultra.") and screw the temps... just slap a bigger cooler on it!

I think I'll pass on this one and go with ATI instead...

-snifsniff- reeks of ati fanboyism
 
2 comments and 2 questions from me:

1. Awwww... is that another power connector?! Wait... maybe I can spare my hard drives... well... no... maybe my DVD drive can go... oops... looks like I will have to install another power supply... lol. What is this? An over-amp-the-cards contest?! ATi and nVidia have used up one, but now they are going to take two of my power connectors?! :rolling:

2. Fascinating, and just in time for PCI-E; too bad this one is AGP... lol. It would be nice if ATi and nVidia made their next cards for both AGP and PCI-E, that way you can change to PCI-E and save 2 freakin' power connectors... rofl.

Q1: What is the next series for ATi?

Q2: Will they even continue the RADEON series with the "new" core?

Other than that, great topic :thumbs:
 
Why do people call someone an "ATi fanboy" if they say they are going to wait to see what ATi can do before they buy a next-gen video card? I think doing anything else is being a stupid consumer. You can't wait another month? If nVidia is still beating ATi at the end of the month (or whenever their card comes out) you can come back in here and bash ATi... but don't jump the gun. If ATi happens to pull this off and passes nVidia again you'll just look like a fool... and a hypocrite.

Wait for both cards to hit mass production before you make any conclusions. Remember, both companies promised cards that were more than twice as fast as their predecessors and so far one of them has lived up to their claims. Who's to say the other won't be able to do it as well?

Now, about the card itself... one word:
Amazing
 
OCybrManO said:
Why do people call someone an "ATi fanboy" if they say they are going to wait to see what ATi can do before they buy a next-gen video card? I think doing anything else is being a stupid consumer. You can't wait another month? If nVidia is still beating ATi at the end of the month (or whenever their card comes out) you can come back in here and bash ATi... but don't jump the gun. If ATi happens to pull this off and passes nVidia again you'll just look like a fool... and a hypocrite.

Wait for both cards to hit mass production before you make any conclusions. Remember, both companies promised cards that were more than twice as fast as their predecessors and so far one of them has lived up to their claims. Who's to say the other won't be able to do it as well?

Now, about the card itself... one word:
Amazing

Point taken; also, wait for stable-driver benchmarks before making any final comments. :LOL:
 
Interesting times ahead. :eek: I'm mostly waiting for scores with both new cards in:
3DMark04 (which should be just around the corner)
HL2, Stalker & Doom 3.
I can't remember, but was Doom 3 supposed to be a DX9 game?
Scores with the older games are irrelevant to me (of course the card is fast).

We don't know yet how important, if at all, VS & PS 3.0 support will be in the next 18 months. There will be a whole new card inside that time frame. But are there any interesting games? One interesting aspect may arise from the direction of Softimage: if the new version of Softimage XSI (I'm guessing 4.0) already supports those new shaders, there would be a true benefit from the HW support.

As someone mentioned, ATi forced Nvidia to produce this extremely interesting piece of HW. I would like to see power usage as smart as with Pentium M processors. Now I have to buy an extra-expensive power supply unit (it has to be silent). And I'm guessing there isn't any ATX power supply with 88-95% efficiency. This card will cost you plenty in the form of an increased electricity bill. :angry: x 5
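A rough back-of-envelope way to put a number on that electricity bill (a sketch only; the extra wattage, daily hours, and price per kWh below are all assumptions for illustration, so plug in your own figures):

```python
# All three inputs are assumed values, not measurements.
extra_watts = 25       # assumed extra draw of the new card vs. the old one
hours_per_day = 4.0    # assumed daily gaming time
price_per_kwh = 0.10   # assumed electricity price in $/kWh

# Energy over a year, converted from watt-hours to kilowatt-hours.
kwh_per_year = extra_watts * hours_per_day * 365 / 1000
cost_per_year = kwh_per_year * price_per_kwh
print(f"{kwh_per_year:.1f} kWh/year, ${cost_per_year:.2f}/year")
# prints "36.5 kWh/year, $3.65/year" with these assumed inputs
```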
 
nonexite said:
Now I have to buy an extra-expensive power supply unit (it has to be silent). And I'm guessing there isn't any ATX power supply with 88-95% efficiency. This card will cost you plenty in the form of an increased electricity bill. :angry: x 5

You're like one of those guys who complain but DON'T READ REVIEWS!!!

lemme get this quote for you :\
Basically, ATi's Radeon 9800XT and NVIDIA's GeForce FX 5950 have very similar power requirements, although the ATi card proves to be just a touch more economical. The GeForce 6800 Ultra, on the other hand, obviously enjoys taking a good swig from the power socket, but less than one might originally expect upon seeing the dual aux power connectors. In the end, the new card draws about 24 Watts more than its predecessor. Adjusted for the 69% efficiency of our power supply unit, that would make it 17 Watts more than the GeForce FX 5950, even though the new card uses more power-efficient GDDR3 memory.

http://www.tomshardware.com/graphic/20040414/geforce_6800-20.html

Even with 2 Molexes, the 6800 Ultra only requires a few more watts than the 9800XT and 5950 Ultra.

HardOCP tested this card with a 430W PSU, and it ran perfectly fine.
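The efficiency adjustment in that quote is easy to verify. A minimal sketch of the arithmetic (the 24 W and 69% figures come straight from the Tom's Hardware quote above; nothing else is assumed):

```python
# Tom's Hardware measured power at the wall, i.e. on the AC side of the PSU.
wall_delta_w = 24.0    # extra draw at the wall vs. the predecessor card
psu_efficiency = 0.69  # efficiency of their power supply unit, per the quote

# The card itself sits on the DC side, so the extra draw attributable to
# the card is the wall-side delta scaled by the PSU's efficiency.
card_delta_w = wall_delta_w * psu_efficiency
print(round(card_delta_w))  # 17, matching the "17 Watts more" in the review
```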
 
x84D80Yx said:
You're like one of those guys who complain but DON'T READ REVIEWS!!!

lemme get this quote for you :\


http://www.tomshardware.com/graphic/20040414/geforce_6800-20.html

Even with 2 Molexes, the 6800 Ultra only requires a few more watts than the 9800XT and 5950 Ultra.

HardOCP tested this card with a 430W PSU, and it ran perfectly fine.

Sorry to disappoint you, but I read plenty of reviews. I read that Tom's review too. Time has taught me that one should read between the lines. Example: Tom's Hardware is probably partially funded by MSI (MSI products always get the recommendations), or they hold MSI shares. And their review policies are sometimes questionable too (mainly where graphics cards are concerned). But their reviews are still among the most comprehensive ever.

And when it comes to power supplies: please tell me if any of you know a site that always checks efficiency as part of the review. That would be quite useful. And I mean a site that doesn't give every product a score of 9/10 or 10/10; those sites are of no real use, I'd say. www.silentpcreview.com is one good site, but they test so little these days.

BTW, here is a nice review of a product that might go nicely hand in hand with the new breed of cards: the Enermax Noise Taker 475.
http://www.silentpcreview.com/article149-page1.html

For a deeper review of the card (6800), I recommend:
http://www.beyond3d.com/ :thumbs:
 
I would love to buy this card but I don't have enough money ;(
I guess I'll buy it in the fall or something.
 
Do you know when that next Far Cry patch will be out? :) Looks sexy.

nonexite said:
For a deeper review of the card (6800), I recommend:
http://www.beyond3d.com/ :thumbs:

Check the 2nd page of this thread; read it and many others.

Tom's isn't as biased as you think. I and many others on many hardware forums (B3D, HardOCP, nvnews, Hexus, AnandTech, 3DGameMan) all thought Tom's was pretty decent... most just thought the worst part was that it had too many pages.

And like I said, many other reviewers tried lower-end 400-430W PSUs and had NO problems. Nvidia just recommends a 480W PSU to be safe.
 
I am getting one of them for graduation!
EDIT: The next Far Cry patch should be either next week or the week after. It should go along with the release of the NV40. :naughty: :afro: :)
 
I am waiting for the 6800 Ultra 512MB and the X800 512MB (if they have a 512MB version) in June.

Call me insane for even thinking about a 512MB card, but then I'll just have to call you insane for not thinking about the future.

By June I'll have saved enough to get another 2x256MB of RAM and two 21-inch monitors, along with whichever ATI/Nvidia card I choose. (I've only got 512MB with two 17-inchers.)
 
x84D80Yx said:
http://www.beyond3d.com/previews/nvidia/nv40/
Check the 2nd page of this thread; read it and many others.
So what? It's still a review worth recommending.
x84D80Yx said:
Tom's isn't as biased as you think. I and many others on many hardware forums (B3D, HardOCP, nvnews, Hexus, AnandTech, 3DGameMan) all thought Tom's was pretty decent... most just thought the worst part was that it had too many pages.
Well, their business is money, and when money is involved there is always greed. Whether it matters much or not, the point remains: think about what you read, or something like that. (If you are interested, my point is proven if you check their reviews.)

x84D80Yx said:
And like I said, many other reviewers tried lower-end 400-430W PSUs and had NO problems. Nvidia just recommends a 480W PSU to be safe.

I really hope so, as I have a Zalman 400W.
 
If the X800 XT comes anywhere near the 6800 Ultra, Nvidia really needs to fix that shader quality if it expects to compete fully.
 
Open your eyes!

First Look: FarCry with Pixel Shader 3.0!
http://www.pcper.com/news.php

What is even more remarkable is that Crytek, the developers of the FarCry engine, told the audience that it took only three weeks to implement these changes.

Before
FarCry_before3_small.jpg


After
FarCrafter3_small.jpg
 
x84D80Yx said:
I am waiting for the 6800 Ultra 512MB and the X800 512MB (if they have a 512MB version) in June.

Call me insane for even thinking about a 512MB card, but then I'll just have to call you insane for not thinking about the future.

By June I'll have saved enough to get another 2x256MB of RAM and two 21-inch monitors, along with whichever ATI/Nvidia card I choose. (I've only got 512MB with two 17-inchers.)

you DO know the 512 MB version is PCI-E only.... right?
 
Zerox said:
First Look: FarCry with Pixel Shader 3.0!
http://www.pcper.com/news.php



Before
FarCry_before3_small.jpg


After
FarCrafter3_small.jpg
Thanks, man.
I told you guys Shader 3.0 is nothing to just ignore.

thehunter1320 said:
you DO know the 512 MB version is PCI-E only.... right?
I doubt this to the fullest.
The Nvidia LAN party specifically promoted the NV40, and at the party they specifically said the NV40 will have 512MB this summer.

And nowhere have I seen that the NV40 is a PCI-E card.
Prove me wrong and then we can talk.
 
Asus said:
If the X800 XT comes any where near the 6800 Ultra, Nvidia really needs to fix that shader quality if it expects to compete fully.
The drivers that Tom's Hardware used didn't have Shader Model 3.0 enabled yet, so Far Cry defaulted to PS 1.1/2.0 mixed mode for the NVidia card. The screenshots that ray_MAN showed give a hint of the true shader quality of the NV40 (although I wish the screenshots weren't scaled down in size).

That's 32-bit Shader Model 3.0 graphics. :p

The R420 is stuck with 24-bit Pixel Shader 2.0 and thus won't be able to compete with the NV40 as far as shader quality is concerned.
 
x84D80Yx said:
I doubt this to the fullest.
The Nvidia LAN party specifically promoted the NV40, and at the party they specifically said the NV40 will have 512MB this summer.

And nowhere have I seen that the NV40 is a PCI-E card.
Prove me wrong and then we can talk.

Only the 512MB version of the card is PCI-E; the 128 and 256 can be either AGP or PCI-E, consumer's choice.

I saw it in... I think it was AnandTech's review... actually, I don't remember... maybe in one of the PDFs...
 
Arno said:
The drivers that Tom's Hardware used didn't have Shader Model 3.0 enabled yet, so Far Cry defaulted to PS 1.1/2.0 mixed mode for the NVidia card. The screenshots that ray_MAN showed give a hint of the true shader quality of the NV40 (although I wish the screenshots weren't scaled down in size).

That's 32-bit Shader Model 3.0 graphics. :p

The R420 is stuck with 24-bit Pixel Shader 2.0 and thus won't be able to compete with the NV40 as far as shader quality is concerned.

Ah, true. Didn't think of that.
I won't pass judgement until both cards are reviewed and hit the shelves, with further reviews to follow.
I'm very open. ;)
 
Just imagine how juicy it will get when we start seeing benchmarks for the Radeon cards... so far the 6xxx series looks very promising, and also of note, there are obviously none of the problems of the past 5xxx series :D
 
Arno said:
The R420 is stuck with 24bit PixelShader 2.0 and thus won't be able to compete with the NV40 as far as shader quality is concerned.

Hm, seeing the before and after pics from the pixel shaders, I think I might go from ATI to nVidia. That picture looks amazing. Why did ATI stick with 2.0 shaders anyway? Oh well, I'll wait a little longer to see how good ATI's cards are.
 
It would be nice if that before pic was PS2.0 rather than with no shaders/reduced precision.
 
Asus said:
It would be nice if that before pic was PS2.0 rather than with no shaders/reduced precision.

Exactly. They apparently used low-quality everything on the first one.
 
Yep, that's what I thought; the first pictures looked pretty shitty compared to others I have seen.
 
Do you guys know when the next ATI/NVIDIA video cards with a $200 price tag are coming?
 
Don't know, but I always go with Nvidia... half the people I know with Radeons have problems with their cards... anyways, once my momma gets the cash I'll upgrade my 'puter.
 
nonexite said:
Does anyone have a link to a better quality vid (UNREAL 3 engine vid)? NV40 in action!
http://www.scifience.net/unreal3_0002.wmv - save as. :eek:

Have fun.
I agree, Unreal 3.0 PWNZ. I mean, wow... U3.0 has one f'kin badass graphics engine that totally surpasses HL2. But HL2 is HL2; it has that gameplay feeling, the story, and, well, there's something about it I just can't explain... I love it though.
 
Radeon X800PRO will beat NV40 Ultra, maybe?

The canaries are singing that a 12-pipeline, 475MHz/950MHz card with 96-bit precision and PS 2.0 shaders only will end up faster than a 16-pipeline, 400MHz/1100MHz card with 128-bit precision and the PS 3.0 shader model.

http://www.the-inquirer.com/?article=15406

So don't go and preorder a GF6800U just yet, because the Inq might be right.


:thumbs:
 
Zerox said:
So don't go and preorder a GF6800U just yet, because the Inq might be right.
Not to mention that you can't just preorder a random GF6800U. You have to pick a certain brand. Different manufacturers use different cooling solutions and clock speeds on their cards. For example, it's said that the Albatron GF6800U will be clocked at an amazing 600/1000MHz. :O
 
thehunter1320 said:
you DO know the 512 MB version is PCI-E only.... right?

Wrong. Tom's Hardware: "This time, NVIDIA has chosen to play it safe, and so the NV40 will use the tried and true AGP interface. The remaining members of the NV4x family will all be native PCI Express chips, meaning their AGP variants will all use the HSI bridge chip as well, translating from PCI Express to AGP this time."

Which means all NV40-based cards are AGP (regardless of memory size) unless they have the HSI bridge allowing them to be used in PCI-E. Anything beyond the NV40 (i.e. NV41, 42, 43, whatever's next) will be native PCI-E, but still available in AGP form with the HSI bridge.
 
Zerox said:
Radeon X800PRO will beat NV40 Ultra, maybe?



http://www.the-inquirer.com/?article=15406

So don't go and preorder a GF6800U just yet, because the Inq might be right.


:thumbs:


...but I am an idiot and I need to own the fastest thing available this week... screw next week when the competition will have something better. I'll just ask my daddy for another $500.

:sleep:
 