NV40... looks like ATi are in trouble

mrchimp

And what is also interesting is that they are bridging their cards (which use AGP) to PCI Express.
The AGP versions will be 8x AGP as normal, but the cards with the PCI Express connector will use a 16x AGP interface internally, so the bandwidth will, in theory, be the same and not hurt by being AGP.
But AGP is one way at a time while PCI Express is both ways. Could be interesting.
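The half-duplex vs. full-duplex difference can be sketched with quick arithmetic. The peak figures below are the commonly quoted marketing numbers, not measured throughput:

```python
# Peak rates commonly quoted at the time (GB/s); assumptions, not measurements.
AGP_8X = 2.1    # one shared channel: traffic goes up OR down at any instant
PCIE_X16 = 4.0  # per direction, and both directions can run at once

def transfer_time(up_gb, down_gb):
    """Seconds to move up_gb upstream and down_gb downstream on each bus."""
    agp = (up_gb + down_gb) / AGP_8X                  # directions share the bus
    pcie = max(up_gb / PCIE_X16, down_gb / PCIE_X16)  # directions overlap
    return agp, pcie

agp_t, pcie_t = transfer_time(1.0, 1.0)  # 1 GB each way
```

With equal traffic in both directions, the full-duplex bus finishes several times sooner even though its one-way peak is only about double.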

Shader processing matters, and if they nail shaders then Nvidia really did "catch ATI off-guard".
Not saying ATI's will be a bad card, just that the competition isn't what they had planned for, and ATI might try extra hard to get more clock speed out of their chips.
 
yeah i think you're right...7x faster would be like 70fps :D
 
Asus said:
And what is also interesting is that they are bridging their cards (which use AGP) to PCI Express.
The AGP versions will be 8x AGP as normal, but the cards with the PCI Express connector will use a 16x AGP interface internally, so the bandwidth will, in theory, be the same and not hurt by being AGP.
But AGP is one way at a time while PCI Express is both ways. Could be interesting.

Shader processing matters, and if they nail shaders then Nvidia really did "catch ATI off-guard".
Not saying ATI's will be a bad card, just that the competition isn't what they had planned for, and ATI might try extra hard to get more clock speed out of their chips.

A bridge chip is always a bad idea. YADITP. Basically you can think of it as having an older AGP brain with a translator that lets it function with PCI Express. Clearly this is not ideal. I don't know if NV40 is going to use a bridge chip or do PCIe natively, though. Maybe someone has a useful link?

It's very unlikely that they will "nail shaders". It seems to me that this is why they're using a 16-pipeline solution: whatever problem they're having, they can't fix it now without a major redesign, so they're just doubling the number of slow pipes to compensate.

Finally, even with the move to a 0.11-micron fab, nVidia is going to have a hard time pushing clock speeds like they've been doing, given this massive increase in transistors.

As for ATi pushing for higher clock speeds... I don't know if they'll try this. It's not really something they've emphasized in the past.

The next-generation nVidia hardware may turn out to be faster, but it seems to me that it's becoming more and more a collection of hacks.
 
Nvidia will have 210 million transistors.

ATI will have around 170 million transistors.

Geez, I wonder which one will be cheaper? I don't feel like giving Nvidia an arm and a leg to have the privilege of using their newest graphic cards.
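As a rough illustration of why transistor count feeds into price, here's a toy Poisson yield model. The die areas and defect density are made-up numbers for illustration, not real figures for either chip:

```python
import math

# Toy yield model: fraction of good dies = exp(-D0 * A),
# where D0 is defect density and A is die area. All numbers are assumptions.
def poisson_yield(area_cm2, defects_per_cm2):
    return math.exp(-defects_per_cm2 * area_cm2)

D0 = 0.5                           # defects per cm^2 (assumed)
area_big, area_small = 2.9, 2.35   # hypothetical areas scaling with ~210M vs ~170M transistors

y_big = poisson_yield(area_big, D0)
y_small = poisson_yield(area_small, D0)
# The larger die yields fewer good chips per wafer, so each good chip costs more.
```

The exact numbers don't matter; the point is that yield falls off exponentially with die area, so a ~25% bigger chip can be disproportionately more expensive.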
 
The NV40 will be 8x AGP.
The next model (for the PCI Express 16x slot) will use an AGP-to-PCI Express bridge. It will use the same AGP interface for the core, but the actual connection will be PCI Express. It won't be natively PCI Express like ATI's.

They increased the AGP interface on the bridged model to smooth out the difference in transfer rates. But again, PCI Express goes up and down at the same time while AGP goes up OR down.
Whether AGP vs. PCI Express turns out to be like 4x vs. 8x AGP will be an interesting question.
Whether the AGP 16x interface on the bridged Nvidia card will do much to smooth out the difference (if any) is another matter.

Nvidia increased their shaders a lot from the FX.
ATI's new card will be a slight improvement in shaders over the 9800 but we don't know how the two cards compare in shader performance yet.

If the new Nvidia card matches shader performance with ATI's card, or is pretty close, then the 16 pipelines will give them a slight lead depending on clock. Texture fillrates will be the same, though: 8x2 vs. 16x1 = 16.
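The 8x2 = 16x1 texel-rate point checks out arithmetically; a quick sketch (the clock speed here is a placeholder, not an announced spec):

```python
# Texture fillrate = pipelines * TMUs per pipe * clock (MHz) -> Mtexels/s.
def texture_fillrate(pipes, tmus_per_pipe, clock_mhz):
    return pipes * tmus_per_pipe * clock_mhz

# Pixel fillrate = pipelines * clock -> Mpixels/s.
def pixel_fillrate(pipes, clock_mhz):
    return pipes * clock_mhz

CLOCK = 400  # MHz, hypothetical
# Same texel rate at equal clocks...
assert texture_fillrate(8, 2, CLOCK) == texture_fillrate(16, 1, CLOCK)
# ...but 16x1 writes twice the pixels per clock, which matters for
# single-textured or shader-limited passes.
assert pixel_fillrate(16, CLOCK) == 2 * pixel_fillrate(8, CLOCK)
```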
 
Saying what kind of performance improvements either side has made with the new cards right now is based solely on nVidia's and ATI's word... which means nothing. Don't bother getting hyped up about hardware until you see actual real-world benchmarks.
 
Considering that modern AGP cards are almost negligibly affected by the AGP transfer rate, I don't see how it will help them. They can increase the bandwidth all they want; it won't help the latency problem introduced by the bridge chip.
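The latency-vs-bandwidth point can be made concrete with a toy transfer model; the bridge latency and transfer rates below are assumptions for illustration, not measured values:

```python
# Time (us) to complete one transfer: fixed latency plus size / bandwidth.
def transfer_us(size_bytes, gb_per_s, latency_us):
    bytes_per_us = gb_per_s * 1000.0  # 1 GB/s ~= 1000 bytes per microsecond
    return latency_us + size_bytes / bytes_per_us

SMALL = 1024  # bytes: small transfers are where latency dominates
direct = transfer_us(SMALL, 2.1, 0.0)   # straight AGP, no bridge
bridged = transfer_us(SMALL, 4.2, 1.0)  # double the bandwidth, +1 us bridge latency

# Even with twice the bandwidth, the added per-transaction latency
# makes the small transfer slower through the bridge.
assert bridged > direct
```

For small, frequent transactions the fixed latency term swamps the bandwidth term, which is the commenter's point: raising the peak rate can't buy back the bridge's overhead.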

I'm not a graphics card engineer or anything, that's just how I see it from what I know.

Anyway, count me out of the hype for this round. I'm not in the market for an upgrade anyway. :p
 
Asus said:
The NV40 will be 8x AGP.
Nvidia increased their shaders a lot from the FX.
ATI's new card will be a slight improvement in shaders over the 9800 but we don't know how the two cards compare in shader performance yet.
What we do know is the claim that it's 2-3 times faster than the current top card in CM4, in the same style as Nvidia's "7 times" claim. And keep in mind, the 9800XT is already almost 2 times faster than its Nvidia counterpart in that game...
 
We won't see the benefits of PCI Express till around the NV50 mark, since the NV40 and NV45 have the same GPU. But in the meantime, it looks like nVidia were too secretive this time, and ATi underestimated the NV40. Looks like nVidia will have the crown once again, for a few months at least.

You never know, maybe XGI will make something good and beat them both :p
 
Pobz said:
We won't see the benefits of PCI Express till around the NV50 mark, since the NV40 and NV45 have the same GPU. But in the meantime, it looks like nVidia were too secretive this time, and ATi underestimated the NV40. Looks like nVidia will have the crown once again, for a few months at least.

You never know, maybe XGI will make something good and beat them both :p
Problem is, they'd need to become a Voodoo wannabe with 6 GPUs in parallel or something, since even their top-of-the-line dual-GPU card is so pitifully slow it's an embarrassment to all GPUs out there.
 
Native PCI Express is only required for HDTV editing. In the last benchmark I saw there was only a difference between 4x and 8x on a 128MB card in a theoretical test, so I don't think bandwidth is a problem.

What really matters is how many pixels and vertices can be processed per second, and with 16 pipelines and better processing technology Nvidia are in with a chance. However, I think my next graphics card will be from ATi because of price.
 
The whole idea behind the bridge between AGP and PCI Express is to start the transition: get people working on this and try to make a buck off it while you can, then fully transition to native PCI Express.

The 8x2 is different from the 16x1 architecture, from what I understand. The 16x1 actually has 16 pipelines, while the 8x2 has a sort of dual flow, I think. I don't really remember how it worked because the article I read was sort of shitty.

Lol, Nvidia making a card 7 times faster... that's laughable. I don't think Nvidia will be holding the crown for very long. They need a minor miracle after all the crap that happened over the past year. And the number of transistors doesn't really make that huge a difference; it's all in the language the card is made for. That's Nvidia's problem: having to translate the shader code, because they were too stupid to initially design the FX cards around the DX language like ATI did.

As much as I like to see the fighting among the graphics card giants, it will take A LOT to sway me from buying ATI from now on. All the cheating and poor numbers shown by the FX series really left a black mark on Nvidia for me. I will have to be thoroughly impressed with this new lineup.
 
Every time you forget line breaks in a post, god kills a kitten...


Please, insert line breaks...
 
It's still too early to tell anything for sure. I'm just going to wait till they start benchmarking the cards.
 
Yeah, I can't wait for the 500 hardware sites that use Q3 and 3DMark03 and call it a day.

"looks like nvidia wins by 5000% folks"
 
By the time the NV40 is released there should be some very interesting benchmarks out there:

e.g http://www.halflife2.net/forums/showthread.php?t=19639

I'm just going to ignore any Q3 benchmarks; I'm not really bothered which card performs at 400fps and which one performs 40fps less. Personally I think if they want to test an OpenGL game they should find a way of benchmarking NWN or CoD.
 
I'm glad I haven't gone Radeon yet. Maybe ATI will look like crap when the NV40 is released... if so, I'm upgrading to the next NV ASAP lol
 
I'm glad I haven't gone Radeon yet. Maybe ATI will look like crap when the NV40 is released...
How does ATI looking like crap sometime in the future make not buying a Radeon now a better decision? They are currently the best video cards. Do you only buy cards now from companies that you think will be ahead later on even if they perform worse than their contemporaries? Why not just buy the best card no matter what? Brand loyalty sucks.
 
Except for 3dfx brand loyalty. 3dfx pwns you all. lol. (Btw, if you do have a Voodoo that you love or simply want to get working, we'd love to have you come to our message board on www.voodoofiles.net.)

Well, since ATi actually has a native PCIe solution, I for one am really excited to see what they're going to be doing with their All-in-Wonder series. Just...imagine... Well I don't think they've actually mentioned anything about it yet, but I hope they do soon.

You guys are missing a really important concept here... ATi has the better pixel shader architecture. A "16x1" pipeline configuration is a temporary solution. At the end of the day, ATi is still more efficient. What does this mean? Next generation, ATi follows suit with the pipeline config, and nVidia is left dead in the water as far as performance and heat output go. What are they going to do then? 32 pipes? Bump up the clock speeds more? I don't think so.

One thing I noticed though was that nVidia is going to adopt GDDR3 for their next gen cards. This will definitely help them in the performance and heat output of the memory. Does anyone know if ATi and XGI will be doing the same?
 
I believe the high-end ATI will use GDDR3, not sure though.
 
I have read that GDDR3 is no more than high-end DDR2 memory for graphics cards, rather than the actual DDR3 memory that will follow DDR2.

DDR2 is to GDDR3
as
DX9.0b is to "DX9.1"
 
psyno said:
Except for 3dfx brand loyalty. 3dfx pwns you all. lol. (Btw, if you do have a Voodoo that you love or simply want to get working, we'd love to have you come to our message board on www.voodoofiles.net.)

Well, since ATi actually has a native PCIe solution, I for one am really excited to see what they're going to be doing with their All-in-Wonder series. Just...imagine... Well I don't think they've actually mentioned anything about it yet, but I hope they do soon.

You guys are missing a really important concept here... ATi has the better pixel shader architecture. A "16x1" pipeline configuration is a temporary solution. At the end of the day, ATi is still more efficient. What does this mean? Next generation, ATi follows suit with the pipeline config, and nVidia is left dead in the water as far as performance and heat output go. What are they going to do then? 32 pipes? Bump up the clock speeds more? I don't think so.

One thing I noticed though was that nVidia is going to adopt GDDR3 for their next gen cards. This will definitely help them in the performance and heat output of the memory. Does anyone know if ATi and XGI will be doing the same?

You're just assuming that Nvidia haven't made any major changes to their GPU architecture, when in actual fact they probably have. ATi may have also made some big improvements, but it's hard to tell right now. However, we do know Nvidia has more pipelines, which should make ATi worried.

Also, ATi have demonstrated realtime HDTV editing, which is only possible over PCI Express as far as the PC architecture is concerned. So you don't have to imagine what the ATi All-in-Wonder cards are going to be capable of.
 
You're right, I'm assuming that. I don't think they'll go all the way back and redesign it. I could be wrong, I don't have any special knowledge, but it would be a major undertaking costing a lot of money. You really think they redesigned their architecture and doubled the amount of pipelines? Maybe. I doubt it. Why do both?

Cool about the ATi thing. As you can see I don't really keep up with the hype...just what interests me. I'll google for this. Thanks.
 
psyno said:
You're right, I'm assuming that. I don't think they'll go all the way back and redesign it. I could be wrong, I don't have any special knowledge, but it would be a major undertaking costing a lot of money. You really think they redesigned their architecture and doubled the amount of pipelines? Maybe. I doubt it. Why do both?

Cool about the ATi thing. As you can see I don't really keep up with the hype...just what interests me. I'll google for this. Thanks.
The main thing is that two different teams made the core; even if it shares some design, NV40 can be radically different from NV30. Personally I doubt that too, though. At any rate, ATI will reign supreme in the low-end/high-performance market with the [by then] cheap-ass 9600/9800 cards when the new cards come.
 
Yeah, looking back at Nvidia's reign, I'd say they'll probably come out on top with the next-gen cards. There's no need to be angry; I'm sensing ATI owners are defending and justifying their position, and so they should, they're great cards too.

But there's no denying it, Nvidia's got some cool stuff up their sleeve too, and all the more important as HL2 isn't out till whenever. Summer time, I guess :p.
 
I don't think we can possibly decide which card is better until they're out next month (hopefully?). After some benchmarking, we will then see who is king of the cards. I don't think I'll be getting one because I'm not getting another computer until this time next year, but it's always good to follow the GPU market so you know what's going on.
 
Well, it is most likely better than a 9800 Pro. After all, Nvidia is a huge corporation with plenty of good scientists. They got surprised by ATI's 9700 chip design; its 256-bit bus made Nvidia's cards look bad, so I figure they gave their design people the message to design something that could "crush ATI".

But I'm more concerned about the fact that it's the CPU that limits games nowadays anyway. My 9800 Pro gets me 100 fps at 1280x960 with 4x AA and 4x anisotropic, but when you put in AI enemies, X numbers of players, and so on, I'd say it's the CPU that drags the frames down. I'll only upgrade my CPU next year; games like Painkiller have shown me that my graphics card can push plenty. It's the CPU that is holding my system back nowadays. We need an XP 5000!
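The CPU-bound argument boils down to a simple bottleneck model: each frame waits on whichever of the CPU or GPU takes longer. A minimal sketch with made-up frame times:

```python
# Frame rate limited by the slower of CPU and GPU per-frame work (ms).
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

# GPU-bound scene: a faster GPU raises the frame rate.
assert fps(cpu_ms=5.0, gpu_ms=10.0) < fps(cpu_ms=5.0, gpu_ms=8.0)

# CPU-bound scene (AI, many players): doubling GPU speed changes nothing.
assert fps(cpu_ms=20.0, gpu_ms=10.0) == fps(cpu_ms=20.0, gpu_ms=5.0)  # both 50 fps
```

This is why a high-end card can post huge numbers in a timedemo yet gain nothing in a crowded multiplayer scene.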
 
Well, it is most likely better than a 9800 Pro
..."most likely"? If the NV40 couldn't compete with a card from the previous generation all of the guys working on the NV40 would be fired.
 
OCybrManO said:
..."most likely"? If the NV40 couldn't compete with a card from the previous generation all of the guys working on the NV40 would be fired.


Lol, that's right... if you can't make a decent card, you're fired. Me? I'm just gonna buy the best on the market.
 
magnetmannen said:
But I'm more concerned about the fact that it's the CPU that limits games nowadays anyway. It's the CPU that is holding my system back nowadays. We need an XP 5000!

Picture your next PC with all PCI Express slots (4x for general use and 16x for graphics), no FSB but rather a PCI Express or HyperTransport link with an on-die memory controller, and a dual-core CPU design. That is what you'll be gaming on in a couple of years.
 
2 things.

1. If no games use pixel and vertex shader 3.0 for a few years, I really don't care.

2. If Nvidia's cards have them but run like shit (a la the current-generation FX series), I don't care.
 
You know nVidia will try to throw it in ATi's face if they don't support PS/VS 3.0 in their next video card. They will try to get all of the games that have the TWIMTBP logo plastered on them to add new fancy shaders that will only run on nVidia cards in order to make the technology seem more popular and useful than it really is.

I hope that ATi does support them (I'm all for technological advancements), but I wouldn't be angry if they didn't... because PS/VS 2.0 games are just starting to come out, and games that support the new standard probably won't start coming out until the end of the new cards' life cycles.

If they can get enough of a boost in the performance over nVidia in terms of DX9 games with 2.0 shaders it might be worth it... but if nVidia has equal performance in 2.0 shaders and supports 3.0 shaders ATi will be in deep trouble.
 
I wouldn't worry so much if it's true... Games hardly need PS 2.0; most of the stuff today can easily be done with PS 1.3-1.4 or even 1.1. How many would *really* need PS 3.0 within the next, say, 2 years? On the contrary, really solid 2.0 support could be better.

Features don't always count, obviously. Everyone ignores that the current Nvidia cards miss some features the ATI cards have, yet no one appears to use them. Same the other way around. Nvidia supports 2.0+ and nearly reaches PS 3.0 in terms of technological prowess; how much has that helped?
 