Nvidia CEO: "We're going to open a can of whoopass" referring to Intel

CptStern

Any notion that Intel and NVIDIA have common ground in the graphics industry can now be easily dismissed

NVIDIA's already candid CEO Jen-Hsun Huang had more than a few things to say during the company's financial analyst meeting today. An hour into the call Huang began to ad lib; clearly something was on his mind.

"We're going to open a can of whoop ass," he told analysts, whom quickly broke out into laughter.


For the past two weeks Intel and NVIDIA have been playing a game of cloak and dagger with technology press, complete with secret slide shows and secret slide show rebuttals. At the heart of this covert battle is the integrated graphics market, and some of the claims attached to it.

Intel senior vice president Pat Gelsinger fired the first volley at the Intel Developer Forum last week in Shanghai. "First, graphics that we have all come to know and love today, I have news for you. It's coming to an end. Our multi-decade old 3D graphics rendering architecture that's based on a rasterization approach is no longer scalable and suitable for the demands of the future," he said.

interesting read


http://www.dailytech.com/NVIDIA+CEO+Were+Going+to+Open+a+Can+of+Whoop+Ass/article11448.htm
 
Yeah, Intel stated earlier that they were going to do raytracing. This could end up pretty interesting.
 
Yeah, Intel stated earlier that they were going to do raytracing. This could end up pretty interesting.

Intel has indeed been saying that for a while.

Unfortunately, I'm not knowledgeable enough about rasterization and ray tracing to say which is truly better. From all the images and white papers I've glanced at, though, ray tracing does seem to produce better output, just at a much slower speed (and when each frame has to take no more than about 0.017 seconds to hit 60 FPS, everything counts).
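To put that frame budget into perspective, here's some rough back-of-the-envelope arithmetic. The resolution and rays-per-pixel numbers below are just assumptions I picked for illustration, not figures from the article:

```python
# Toy frame-budget arithmetic for the "~0.017 s per frame at 60 FPS" point above.
# All specific numbers here are illustrative assumptions, not measured data.

FPS_TARGET = 60
frame_budget_s = 1.0 / FPS_TARGET          # ~0.0167 seconds per frame

width, height = 1920, 1200                 # assumed display resolution
pixels = width * height                    # ~2.3 million pixels

# If the entire frame budget went to per-pixel work, this is the time per pixel.
budget_per_pixel_ns = frame_budget_s / pixels * 1e9

# A ray tracer fires at least one primary ray per pixel, plus extra rays for
# shadows and reflections, so the per-pixel workload multiplies quickly.
rays_per_pixel = 4                         # assumed: 1 primary + 3 secondary rays
rays_per_frame = pixels * rays_per_pixel
rays_per_second = rays_per_frame * FPS_TARGET

print(f"Frame budget:     {frame_budget_s * 1000:.1f} ms")
print(f"Per-pixel budget: {budget_per_pixel_ns:.1f} ns")
print(f"Rays per frame:   {rays_per_frame:,}")
print(f"Rays per second:  {rays_per_second:,}")
```

Even under those assumptions you end up needing hundreds of millions of ray-scene intersection tests per second before any shading happens, which is why rasterization still wins on raw speed for now.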

*Edit* Also, I heard through the grapevine from a very reliable source about nvidia's next-gen architecture. It has more than 120 cores, can do somewhere around 4 teraflops, and has about 1.2 billion transistors on it, or something like that. So yes, nvidia really is ready to open a can of whoop ass. :)
 
It's interesting to note that both corporations now own a company that specializes in physics processing: Intel with Havok and Nvidia with Ageia. Nvidia has already enabled Ageia's PhysX on the 8 series and up. Once Intel gets into the mainstream GPU market, I really fear for ATI, as they're having enough trouble competing against just Nvidia.
 
this news just...

i can't wait to experience the future of video games.


So we can expect a huge jump in graphics coming soon? Because I'm quite bored with graphics; anything less than CoD4 just looks bad to me (except classics and old fun games).

I wonder how well the next generation of cards that specialize in ray tracing will be able to render today's games, though. What if they can't do it well?

Maybe they'll throw a miniaturized 8800 onto their new cards, the way old technology gets shrunk down (like putting a PS1 chip into the PS2 to make it backward compatible).
 
Maybe Intel can now make up for their past sins, selling underpowered stock GPUs...

Oh wait. PC GAMING IS ALMOST DEAD BECAUSE OF THEM! :flame:
 
Please Explain.
Intel sells chips. Lots of chips. And along with those chips, it sells some of the worst graphics cards to ever see daylight. As a result, the vast majority of computers out there won't even run any game newer than HL2 or FEAR, let alone at a good framerate. Intel is almost singlehandedly responsible for the tiny market for new PC games.
 
Yeah, but you get what you pay for. If nvidia wants to put in better integrated graphics for a similar price then fine, but I kind of doubt it wouldn't add to the cost.

Besides, the first thing I do is put in a video card and disable the on-board graphics.
 
Yes, Intel are terrible

*cuddles his core 2 duo*

Just terrible...
 
Intel sells chips. Lots of chips. And along with those chips, it sells some of the worst graphics cards to ever see daylight. As a result, the vast majority of computers out there won't even run any game newer than HL2 or FEAR, let alone at a good framerate. Intel is almost singlehandedly responsible for the tiny market for new PC games.

I think that's a bit of a stretch, tbh. They are chips designed to fulfil the minimum display requirements of computers, i.e. pretty much just to show the desktop. Anyone using them doesn't want to play games, so they end up in office PCs, old people's PCs, or go unused in PCs that get given proper cards.
Besides, Intel has stated that they aren't happy with how low their minimum actually is and do plan to raise it, although of course not to the level of 'proper' graphics cards.
 
Yeah, but you get what you pay for. If nvidia wants to put in better integrated graphics for a similar price then fine, but I kind of doubt it wouldn't add to the cost.
nvidia and ATI are both working on low-end cards for regular PCs, but those won't be out for a while.

Besides, the first thing I do is put in a video card and disable the on-board graphics.
Because you are a gamer. But the vast majority of computer users will never even be introduced to new games because their business laptops/home PCs can't run anything more than Quake 3.

Just 4 years ago, the average computer could play any game at its lowest settings. Thanks to Intel's refusal to embrace newer GPU technology, the gap between gaming rigs and the average PC has widened dramatically.
 
I think that's a bit of a stretch, tbh. They are chips designed to fulfil the minimum display requirements of computers, i.e. pretty much just to show the desktop. Anyone using them doesn't want to play games, so they end up in office PCs, old people's PCs, or go unused in PCs that get given proper cards.
Besides, Intel has stated that they aren't happy with how low their minimum actually is and do plan to raise it, although of course not to the level of 'proper' graphics cards.

Actually it's very true.

Most people don't even know what a graphics card is, how to install one, etc...

They buy a computer and expect that when they buy a game it will run. Yes, that's an ignorant consumer approach, but that's what a lot of people do.

Or they buy a laptop, don't feel like paying extra for a top-of-the-line graphics card, and are like my roommate, who now has a pretty much piece-of-shit non-gaming laptop since it has an Intel GPU. No way to upgrade those.

Intel makes up a lot of the desktop GPU market with their integrated GPUs, which are pure shit. They can't do anything but run Windows, essentially. Intel is now trying to make up for it by making a somewhat competitive card, while keeping it much cheaper than $300 high-end desktop GPUs.

Once Intel gets those cards mainstream and they start replacing all of their other integrated GPUs, you are going to see a spike in the number of pre-built computers that can play games. So when your mom, dad, or non-tech-savvy friend goes to play a game, it will "just work".
 
In every store's computer section there should be a huge sign that says "WANT TO PLAY THE LATEST COMPUTER GAMES? READ THE FU*KING GUIDE WE HAVE OVER HERE!" And have a guide that explains sh17.
 
In every store's computer section there should be a huge sign that says "WANT TO PLAY THE LATEST COMPUTER GAMES? READ THE FU*KING GUIDE WE HAVE OVER HERE!" And have a guide that explains sh17.

Best idea I've heard. People need to know about this; it could help further the PC industry.
 
Or we just raise the lower-end of the GPU market.

Intel is really pushing hard for low-cost but powerful GPUs, sub-$45.

It may raise the price of computers by a little bit if they make this the new integrated GPU, which I hope they do since it's worth it.

And I also think that developers need to take more advantage of the Vista rating system! It was designed to let people know whether their computer can run certain applications. Nobody even cares about that number, though, which is unfortunate.
 
well, you might be glad to pay extra for it, but I'm not paying extra for it. I won't use it.


It probably won't affect me though, because I'll be building my own computers.


I know that Intel's integrated graphics aren't even good enough to run Windows very well. Slow page drawing, waiting while browsing photos. It's unacceptable, IMO.
 
They buy a computer and expect that when they buy a game it will run. Yes, that's an ignorant consumer approach, but that's what a lot of people do.
People still understand the basics of system requirements, even if they don't understand what all the numbers mean.
To say that the supposed decline of PC gaming is "singlehandedly" Intel's fault for selling low-cost, crap graphics cards is a mistake, IMO.
 
But I think there are more people out there who would pay for cheaper mid-range cards, because most people you meet IRL haven't built their own PC, whereas people who have want nothing less than top of the line, or somewhere near it.
 
Well, if the minimum is raised to a good standard then won't the rest get raised too? Meaning in a few years we might be back in this situation even though the cards would have gotten better?
 
People still understand the basics of system requirements, even if they don't understand what all the numbers mean.
To say that the supposed decline of PC gaming is "singlehandedly" Intel's fault for selling low-cost, crap graphics cards is a mistake, IMO.

People understand that there are system requirements. Usually that's the extent of their knowledge (no idea if they meet them).

It's not totally Intel's fault, but they are a major factor, along with rampant piracy, which, whatever anyone says, is hurting the PC industry, if not in sales (which I think it does hurt), then in reputation.
 
So, intel is shit because they don't market their integrated graphics to play top of the line games and you guys want them to?
 
So, intel is shit because they don't market their integrated graphics to play top of the line games and you guys want them to?

Integrated video cards - AMD (780G) vs Intel (G35).
I'm pretty sure the G35 is Intel's newest and supports Shader Model 4. :)

Who cares if Intel doesn't perform as fast as the low end of video cards? AMD's onboard GPU seems to do well enough. It would just be nice if the volume of PCs shipped with Intel onboard graphics could at least do something besides tile the desktop.
 
It would just be nice if the volume of PCs shipped with Intel onboard graphics could at least do something besides tile the desktop.

This. Honestly, people, wake up. You want more PC games, right? Well, there's a huge hole in the potential market for PC games because practically everyone is running Intel integrated graphics.
 
"We're going to open a can of whoopass... and... and then we're going to fashion that can into one of our heat-sinks."
 
Wow at this article, its ****ing WAR!!! Very interesting times to be a gamer.
 
This. Honestly, people, wake up. You want more PC games, right? Well, there's a huge hole in the potential market for PC games because practically everyone is running Intel integrated graphics.
I reiterate this reiteration. All my friends have been avid gamers in the past, but that passion has been focused exclusively on consoles for the last five or six years, since they buy underpowered PCs in ignorance and then can't use them for gaming. If the standard of integrated graphics were such that prebuilts could at least play not-too-ancient games fairly well, the PC gaming market would be orders of magnitude larger than it is at the moment.
 
Woho! More stupid games with extremely useless, good graphics.

Can't wait for the new "GTA XXI: The same shit all over again, but with extremeley gut ggrafix ant illegal soft erotic mod".

Oh, don't forget "NFS 15: Underground-Extreme pink edition".

Call me when games are actually interesting for at least 3 hours of play.
 
Does not compute. GTA is hundreds of hours of entertainment with OK graphics. :p
 