nVidia internal ATi Bashing

SidewinderX

I'm not taking any sides on this, only saying that nvidia makes some good points.


nvidia slides:
http://www.forum-3dcenter.org/vbulletin/showthread.php?s=&postid=1845416#post1845416


What HardOCP reports:
hardocp said:
NV Comments On ATi?
Ronald Gasch, the main man over at 3DCenter, sent me a link to what is supposed to be an internal document from NVIDIA that is criticizing the ATi X800. Obviously all of this has to be taken with a few really big grains of salt because the “document” has several flaws in it which draw its authenticity into question. We are contacting NVIDIA for their comments and will update you as soon as we hear back from them.

Kyle's Update: I just got off the phone with NVIDIA PR front man Derek Perez. He did confirm that the slides linked above are NVIDIA's creation. The slides are just a few from a set of "20 or 30" that were developed by NVIDIA's USA Product Marketing team that were to be used only as a sales tool internally. Derek did say, "I am not going to apologize for them." He went on to explain how they need to outfit their sales teams as they see fit. Now that we are through that, we are hoping to see retail GeForce 6 hardware soon.
 
"Why pay $499 for last year's technology?"

Why, because it performs and looks better, of course!

(I have no facts to back that up so don't pay attention to me)
 
and if you go back about a year, ATi's leaked slides made good points about nvidia's next-gen hardware :upstare:
 
Strike Back:

-Nvidia's GF6 is a massive card requiring lots of power, while the X800 actually requires the same power as, and is more efficient than, the 9800.

-Why should we believe that SM 3.0 performance will be so awe-inspiring when they couldn't get SM 2.0 right!?

-9800 aniso trickery?! How about explaining your obvious image quality decrease and already-exploited performance increase cheats!?

etc etc
 
I enjoy reading stuff like that for some reason. :)

Fight! Fight! Fight!
 
........wow.


They might as well call ATI poopie heads.....


Is nvidia run by 14 year olds or what.... I knew their marketing department was scum (most are) but c'mon....
 
Wow..that just made me even more proud to want to buy an X800 XT Platinum. To watch dumbasses who buy the 6800u cry when they pale in performance, quality, power saving, and card temperatures.
 
guinny said:
Wow..that just made me even more proud to want to buy an X800 XT Platinum. To watch dumbasses who buy the 6800u cry when they pale in performance, quality, power saving, and card temperatures.

Quality huh? Not that I really disagree with you, but from what I've seen they are neck and neck.
 
All I want is the toned down card before HL2 (X800 SE or whatever nVidia's going to offer as a cheaper solution).. of course I'll wait for benchmarks, but hm. That whole internal document was kinda funny though
 
Thx for the good laff.
The "An Act of Desperation!" might be true, but for who?
It's always fun to watch marketing give points and reason...actually it can be painful.
 
Those slides are a good reason why marketing people should never be in a position of power. :D

I don't understand one thing: how do they expect SM3.0 games to come out within 6 to 12 months? SM2.0 has been out over a year now and we barely have any SM2.0 games. Even the major upcoming game releases (Doom3, HL2, Stalker) are designed around SM2.0.
 
Marketing teams are always interesting (including Doug according to many on this site).

The strongest argument is the lack of PS3.0 support, but they do bring up a few other interesting points.
 
Interesting, though in the unlikely event that it's genuine, the majority of the comments were supposedly made by either reviewers or ATI themselves. And only the black box remarks are NVIDIA's.
 
That was a fun read. It shouldn't be taken too seriously, though. Quotes are easily taken out of context and of course it doesn't get more biased than this.

blahblahblah said:
I don't understand one thing: how do they expect SM3.0 games to come out within 6 to 12 months? SM2.0 has been out over a year now and we barely have any SM2.0 games. Even the major upcoming game releases (Doom3, HL2, Stalker) are designed around SM2.0.
Supposedly it's easier to code for SM3.0 compared to SM2.0, because SM3.0 looks more like a normal programming language (with loops and branches, instead of long sequential code). It's not unlikely that future patches of HL2 and Stalker will include SM3.0 support. Remember that only a tiny part of all the shaders in a game have SM2.0/3.0 features, so it's not a big deal to rewrite those shaders.
 
LOL, funny... by the time SM3.0 is fully used (shaders with many instructions) the 6800 Ultra will already be far too slow to take advantage of it.

It's a waste of time implementing SM3.0 when SM2.0 is barely fully used (the first game to fully exploit it being HL2).
But saying 'Our card has the brand new Pixel and Vertex Shader Model three point zero' does sound a lot more impressive than 'Our card fully supports SM2.0, technology that's actually USED in games'. It's cool for the marketing people, but I don't see any advantages for the consumers.
 
PvtRyan - What cards do you expect the developers to create SM3.0 software with, if graphics manufacturers like nVidia don't implement it in the first place?

ATi couldn't care less about new technology or development of it, all they care about is shipping fast cards to kids. If nVidia had the same tactics they'd probably beat ATi, but the developers would not get their new technology, and nothing would evolve.
 
'An act of desperation'... I think that slide show was more of an act of desperation than the X800 XT. Even if it is last year's technology, it competes neck and neck with the Ultra Extreme. I think nVidia made that slide show because they are in desperate need of some business.
 
I'm so sick of fanbois on both sides of the fence. Everyone should just wait until these cards are properly tested on next-gen games with proper drivers AND using the retail models.

People should stop making assumptions.

As for the slides, well, it is possible that SM 3.0 will be around way before many of you predict. NVidia will probably be pushing its 'partners' to code the SM 3.0 support, like CryTek has already done with FarCry, and I can't see GSC far behind them with S.T.A.L.K.E.R.. I doubt VALVe will create PS 3.0 support for HL2 until ATi's next range of cards that support SM 3.0 are released. However, since NVidia's hardware has not been tested properly with SM 3.0, we don't know how it performs yet. It's no use having the architecture to run SM 3.0 if it runs it like a dog (such was the way with the FX line).
 
Vigilante said:
'An act of desperation'... I think that slide show was more of an act of desperation than the X800 XT. Even if it is last year's technology, it competes neck and neck with the Ultra Extreme. I think nVidia made that slide show because they are in desperate need of some business.


i agree, i was thinking the same exact thing. I don't see ATI making comments about the competition.
 
While PS3.0 hardware needs to be out on the market before companies will write for it, I think it is a bit too early as a model for an entire line. You currently need a card that uses a lot of power in order to really meet ALL of the specs for PS3.0.

While the NV40 is understandable and people can dismiss the power requirements for a top-end card, bringing PS3.0 down through the mainstream and low-end cards will be interesting. I wonder if those cards will end up like the 5200, and to what extent the mainstream cards will be affected. Probably not any more than the FX, I suppose.

I think it would have been wise, from a power/cost standpoint, to wait on PS3.0 like ATI has done. The FX and Radeon DX9 cards came out at the same time (PS2.0 hardware). Nvidia always comes out with the more power-hungry card, though. ATI will have a PS3.0 card out when it's needed. IMO Nvidia should have waited. Although Nvidia lacked in shader performance (with PS2.0), by supporting PS3.0 they will benefit from the efficiency and match ATI's PS2.0 performance in PS3.0 games. PS2.0 shader performance is still lacking.

While Nvidia says 'Why buy old tech hardware?' (even though it performs tops),
I say 'Why buy hardware with new features that are in a way a waste?' They take up more resources today with no benefit until tomorrow. Seeing how demanding future games will be, games that will most likely be based on PS3.0 (UE3.0 anyone? Even though that is more than a year off, you get my point), and how gfx cards increase in performance each year by leaps and bounds, why buy a new tech feature today when it will be slow in tomorrow's games? Similar to how the 5200 'supports' DX9 but performs very slowly in those games.
 
I don't care about them and their PS3.0. I have had a lot of bad experiences with their video cards and I won't be buying their NV40. They lie way more than ATI does and then they pick on ATI because they don't have PS3.0. It doesn't make a difference. Nvidia got owned. This is just propaganda for their fanboys.
 
wow... i'm sorry, but i have to admit, this is one of the most misinformed forums about the next gen hardware. *not all of you, just a lot of you seem to not have the facts straight*

not gonna name names ;)

the retail 6800, DX9.0c, and an sm3.0 game/patch aren't even out yet, so how can you guys bash what you DO NOT KNOW!? we have no idea how the 6800 does in 3.0 scenarios. WE DON'T HAVE THE SM3.0 BENCHMARKS YET! rofl

and the 6800 power requirements? those were revision boards with beta drivers that were tested. they fixed the power issue and the requirement is a 350w psu, or a 480w power supply if you're overclocking.

this is just insanely funny to me hearing you guys saying stuff that we have no idea about...please go on! i have had a bad day, and i need some entertainment :)

:thumbs:

sure you can say i'm an nvidiot or w/e but you can ask disturbed, a friend of mine in person who goes to these forums, that i've been buggin him telling him i'll be gettin an x800xt. so i'm not biased. it's just funny when you guys don't have your facts correct.

both are great cards. it just depends on you. i don't care what card you get, just make sure your facts are correct before you cash out $400+ on one.
 
it's also insanely funny when an idiot posts trying to say other forum members don't know what they're talking about by posting info that's not only wrong but contradicts himself when he says "u guys have no idea what ur talking about"

what we're saying is sm3.0 is USELESS. and it's a 420w psu for the 6800. and 3.0 is killing the card's fps for a feature that is completely useless. thanks for taking the time to read this post and now that you've been served u may kindly reply in an angry or witty im-gonna-prove-you-wrong-oh-wait-no-im-not post.

edit:

:thumbs:
 
:sleep: Nvidia did some bashing way back with the Geforce2 MX400 vs. PowerVR's KYRO. The KYRO lacked some features that Nvidia had, like a T&L engine, and yet in a lot of cases the KYRO was faster than the Geforce. All this shit talk from Nvidia towards other companies is just making me look at them in a negative way, but i'm not biased toward any company... i'll settle for whatever card is the best on the market.
 
guinny said:
it's also insanely funny when an idiot posts trying to say other forum members don't know what they're talking about by posting info that's not only wrong but contradicts himself when he says "u guys have no idea what ur talking about"

what we're saying is sm3.0 is USELESS. and it's a 420w psu for the 6800.

Earlier this week Gamespot posted an interview with NVIDIA's GM of Desktop GPUs, Ujesh Desai. We have met Ujesh on several occasions and found him to be very forthright in his answers to our questions. The interview outlined NVIDIA's actions in moving their 480 watt PSU specification for the 6800U to a 350 watt PSU which will of course be much more user friendly for many gamers out there.

A good quality 350W power supply with a sufficient 12V rail pull can support the 6800 Ultra standard clocks of 400/550.

Ujesh also went on to make this statement that we thought would of course be interesting to the hardware enthusiast.

For people that do not want to overclock, the 480W power supply and second power connector combination is overkill.
http://hardocp.com/article.html?art=NjE0
the board they tested was a revision board, not retail. nvidia said the problem is fixed in the retail board, and in the shipping bios.


guinny said:
what we're saying is sm3.0 is USELESS. and 3.0 is killing the card's fps for a feature that is completely useless.

how would you know this... THERE ARE NO SM3.0 BENCHMARKS OUT FOR GOD'S SAKE. the 6800 isn't retail yet, DX9.0c isn't out yet, and an SM3.0 game/patch isn't out yet to test.


:thumbs:

edit:

i said i wasn't gonna name names, but as you can tell, this kid's info is FUD

:thumbs:
 
x84D80Yx said:
sure you can say i'm an nvidiot or w/e but you can ask disturbed, a friend of mine in person who goes to these forums, that i've been buggin him telling him i'll be gettin an x800xt. so i'm not biased. it's just funny when you guys don't have your facts correct.

Yeah the 6800 is def a good card, but it has a lot of things I won't ever touch, which is one reason I went with the X800 Pro myself. Plus HL2 will have a slight advantage on ATi cards. :thumbs:
 
heya disturbed, or should i say steve ;)

yeah like i said, it just depends on you. and me and disturbed are going with the x800 series.

i just hate to see people putting out false info on a good product.
 
Don't assume anything from my post. I never referred to anything that you specifically brought up.

PS3.0 is slightly more efficient. We shall see if Nvidia can match it when DX9.0c is released. Since ATI's PS2.0 implementation is very powerful, I doubt the efficiency from PS3.0 will flip Nvidia on top given their poor shader implementation.

Sure, we don't have the benchmarks, but ATI runs PS2.0 games faster so far, even as Nvidia runs their PS1.1 special path in Farcry. Link

I am also talking about their measured WATTS from reviews,
not about the old 480W PSU requirement or the new 350W PSU requirement.
Link
 
x84D80Yx said:
heya disturbed, or should i say steve ;)

yeah like i said, it just depends on you. and me and disturbed are going with the x800 series.

i just hate to see people putting out false info on a good product.

Yeah, Nvidia has really done a great job with the 6800, don't get me wrong. Sounds like a great idea, I just want a card where I know I will use everything that comes with it. Not a smart idea to blow $500+ on some features I won't use, like I said.

Plus I will be building another pc within the next 6 or so months, so I am sure there will be a better card out then. :)
 
Asus said:
Don't assume anything from my post. I never referred to anything that you specifically brought up.

Asus, i didn't name you or anybody specifically. you put up some good points and i have nothing against them.

one thing tho, on those watt tests in the techreport.com link you gave, those are also revision boards of the 6800's. they are not retail yet. nvidia supposedly says it's a bios fix. and as far as i know, the 6800 ultra extreme isn't even happening. (if you want some links i can clear that up for you)

good stuff tho.
should be interesting to see how the retail 6800 series does on their own stuff (SM3.0) when the benchmarks come.
 
x84D80Yx said:
Asus, i didn't name you or anybody specifically. you put up some good points and i have nothing against them.
Yeah, I know you didn't name names and I didn't know who you were talking to exactly, so I just clarified in case you were. ;)

Sadly we will have to wait for DX9.0c and a new benchmark on the 6800 to see about PS3.0 performance, and they will have to take another stab at power consumption measurements.
By then PCI Express Cards will be out! w00t
 
Asus said:
Sadly we will have to wait for DX9.0c and a new benchmark on the 6800 to see about PS3.0 performance, and they will have to take another stab at power consumption measurements.
By then PCI Express Cards will be out! w00t

yups. next few months should be very interesting
:p
 
6800Ultra Retail card.
Link

X800 Pro Retail card.
Link

I wish he could just mail those overseas to my house!
 
nice links bro :)
i need me one of those translators now lol
great hi-res pics tho
 
Sorry to say, I wouldn't want that 6800U in my computer, just way too big for me :p
 
Now that is a slick unit in comparison with the other. lol
I hope more companies try to get by with 1 slot cooling solutions.
How many 5900 Ultras took up two slots compared to 1 slot solutions? I know Asus had a single slot 5900 but it wasn't an ultra.
 