NVIDIA "G80" Retail Specs Unveiled

DigiQ8

8800GTX
575MHz Core Clock
900MHz Mem Clock
768MB GDDR3 memory
384-bit memory interface (86GB/s)
128 unified shaders clocked at 1350 MHz
38.4 billion pixels per second theoretical texture fill-rate
450W PSU Recommended
Hard launch in second week of November

8800GTS
500MHz Core Clock
900MHz Mem Clock
640MB GDDR3 memory
320-bit memory interface (64GB/s)
96 unified shaders clocked at 1200 MHz
?? billion pixels per second theoretical texture fill-rate
400W PSU Recommended
Hard launch in second week of November


http://www.dailytech.com/article.aspx?newsid=4441
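For what it's worth, the quoted bandwidth figures are just bus width times effective memory data rate. A quick sanity-check sketch (assuming the listed GDDR3 clocks are physical clocks, i.e. double-pumped, so 900 MHz means 1.8 GT/s effective; the GTS rate is an assumption on my part):

```cpp
#include <cstdio>

// Theoretical memory bandwidth in GB/s:
// (bus width in bits / 8) bytes per transfer * effective transfer rate in GT/s.
static double bandwidth_gbs(int bus_width_bits, double effective_gtps) {
    return (bus_width_bits / 8.0) * effective_gtps;
}

int main() {
    // 8800GTX: 384-bit bus, 900 MHz GDDR3 double-pumped to 1.8 GT/s -> ~86.4 GB/s,
    // which matches the quoted 86 GB/s.
    std::printf("8800GTX: %.1f GB/s\n", bandwidth_gbs(384, 1.8));

    // 8800GTS: a 320-bit bus at the same 1.8 GT/s would give 72 GB/s, so the
    // quoted 64 GB/s implies a lower effective rate of about 1.6 GT/s.
    std::printf("8800GTS @ 1.8 GT/s: %.1f GB/s\n", bandwidth_gbs(320, 1.8));
    std::printf("8800GTS @ 1.6 GT/s: %.1f GB/s\n", bandwidth_gbs(320, 1.6));
    return 0;
}
```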

Can't wait to get one!


Edit: more info:
The GeForce 8800 will feature 128-bit HDR + 16X AA (it will be the first card to offer 16X AA on a single GPU).
 
HOORAY, UNIFIED SHADERS, WOHOOOooo... I think. Any word on the price?
 
Woohoo, more RAM than we can actually make use of, woooo.

:E

It's almost as if they have too much RAM lying around, so they have to get rid of it by sticking unnecessary amounts on new cards.

Still, exciting stuff despite that.
 
That's really awesome news. I didn't think Nvidia was going to fare well against ATI with the previously rumored specs. Now I wonder how ATI will keep up. lol
I'm not buying anything yet, though. I'll probably wait until after ATI releases their cards, and for the second-revision cards.
 
Yeah, same here. I'll probably hold off until there are offerings from both companies, plus extensive reviews of both. I don't want to go buying another "X1800XT" only to find them release something better a month or two later!
 
Looking good.

I read the G80 will be more powerful than any SLI or CrossFire setup. I'm not bothered about DX10 features, but I would like to see how it rivals current cards in DX9.
 
It will put the last-generation cards in the trash in DX9 games.
 
But by how much? Enough to justify the extra cost over an X1900XT for the games we'll be playing over the next 6 months? (By which time better and cheaper DX10 cards will be available.)

I've no problem spending money on a graphics card, but I want to get the most out of it.
 
Doh, I might only consider one if they sell for under €350 at launch (I'm getting a new PC mid-November). But I'll probably just get the X1900XT anyway.
 
It would be nice to get a DX10 card...

But I just got my new PC, so I can't justify swapping my X1900GT for a DX10 card until games start taking full advantage.

And I think that will be a good year away.
 
I'm gonna build a new rig once this series of cards comes out.
 
128-bit HDR? What does that mean, anyway? I know what HDR is, but what's with the 128 bits?

By the way, I heard that DX10 won't be fully backwards compatible? Is there any meat to this?
 
By the way, I heard that DX10 won't be fully backwards compatible? Is there any meat to this?
Think of it like this:
You install DX9 on XP, and DX9 still allows game programmers to call DX6, DX7, and DX8 commands.

DX10 gets rid of all this. In other words, you can't make DX9, DX8, DX7, or DX6 calls through DX10 the way you could with every earlier version. However, DX10 is Vista-only, and Vista also fully supports DX9 alongside DX10, so older titles keep working.

DX10 graphics cards will fully support both DX10 and DX9.
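To put that in code terms: on Vista, a DX9 title still goes through the separate D3D9 runtime, while a DX10 title uses the new D3D10 entry points, and a D3D10 device exposes none of the older interfaces. A minimal sketch (Windows + DirectX SDK assumed, error handling omitted; illustrative, not a complete init path):

```cpp
#include <d3d9.h>   // legacy Direct3D 9 runtime (present on XP and Vista)
#include <d3d10.h>  // Direct3D 10 runtime (Vista only)

// DX9 path: older games keep working on Vista because the D3D9 runtime
// ships alongside D3D10.
IDirect3D9* CreateD3D9()
{
    return Direct3DCreate9(D3D_SDK_VERSION);
}

// DX10 path: a clean-slate API; an ID3D10Device has no way to issue
// DX9/8/7/6-style calls, unlike earlier DirectX versions which kept
// exposing the older interfaces.
ID3D10Device* CreateD3D10()
{
    ID3D10Device* device = nullptr;
    D3D10CreateDevice(nullptr, D3D10_DRIVER_TYPE_HARDWARE, nullptr,
                      0 /*flags*/, D3D10_SDK_VERSION, &device);
    return device;
}
```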
 
128-bit HDR? What does that mean, anyway? I know what HDR is, but what's with the 128 bits?

Okay, so you know what HDR is: it is the technique of internally rendering an image with a much wider range of brightness values (and more precision) than the monitor can display. Often, when displaying such an image, tricks such as dynamically changing the mapping from HDR space to LDR space (resulting in "changing the aperture" of the camera) or applying bloom to sections of the image too intense to be displayed on a monitor are employed, but these aren't really what HDR is: they are just techniques used to display an HDR image on a monitor.

I'd thought that current HDR solutions have four 32-bit floating-point channels (3 colors + alpha), which are then projected onto a 32-bit colorspace (well, 24-bit + alpha) to be presented on the screen. That would make them 4 * 32 = 128-bit. Maybe there's an optimization in modern real-time HDR that I didn't know about. Maybe they just use 16 bits per channel. That would actually make a lot of sense, since it's a little overkill to project a 32-bit space down to 8 bits.

Oh, god, did I start rambling?
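To make the arithmetic above concrete: four 32-bit float channels are 128 bits per pixel, and getting that onto an 8-bit-per-channel display means squashing the floats down somehow. A toy sketch (the Reinhard-style x/(1+x) curve here is just an example operator, not necessarily what any game or the G80 actually uses):

```cpp
#include <cstdint>
#include <cstdio>

// One HDR pixel: 4 channels x 32-bit float = 128 bits.
struct HdrPixel { float r, g, b, a; };
// One LDR framebuffer pixel: 8 bits per channel = 32 bits.
struct LdrPixel { uint8_t r, g, b, a; };

// Squash an unbounded HDR value into [0,1] with a Reinhard-style curve,
// then quantize to 8 bits. Real tone mappers add exposure, gamma, bloom, etc.
static uint8_t tonemap_channel(float x)
{
    float mapped = x / (1.0f + x);
    return static_cast<uint8_t>(mapped * 255.0f + 0.5f);
}

static LdrPixel tonemap(const HdrPixel& p)
{
    return { tonemap_channel(p.r), tonemap_channel(p.g), tonemap_channel(p.b),
             static_cast<uint8_t>(p.a * 255.0f + 0.5f) };  // alpha passed through
}

int main()
{
    HdrPixel sun = { 40.0f, 35.0f, 20.0f, 1.0f };  // far brighter than "white"
    LdrPixel out = tonemap(sun);
    std::printf("HDR pixel: %u bits, LDR pixel: %u bits\n",
                (unsigned)(sizeof(HdrPixel) * 8), (unsigned)(sizeof(LdrPixel) * 8));
    std::printf("tonemapped: %d %d %d %d\n", out.r, out.g, out.b, out.a);
    return 0;
}
```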
 
I bet these cards would run BF2 at maximum settings at around 600-700 frames per second :O
 
Good thing I didn't go SLI with my 7800GT. There was little benefit with those anyway; I never slowed down to begin with.
 
Looks like with the 9-inch minimum length, I'm gonna have to bump that hard drive up a slot to make room for the mammoth card.
 
I bet these cards would run BF2 at maximum settings at around 600-700 frames per second :O

Lol. :P

I don't think the engine would allow anything higher than... 120 fps?

It's an absolute turd, tbh.
 
If you can reach 428 frames per second with a 7900GT, why can't you get above 600 with a GeForce 8?

[attached image]
 
Oh. So you have to buy a 500 quid CPU lol :P
 