Nvidia to abandon mid- and high-end video card market?

CptStern
Nvidia kills GTX285, GTX275, GTX260, abandons the mid and high end market
Full on retreat, can't compete with ATI

NVIDIA IS KILLING the GTX260, GTX275, and GTX285 with the GTX295 almost assured to follow as it (Nvidia: NVDA) abandons the high and mid range graphics card market. Due to a massive series of engineering failures, nearly all of the company's product line is financially under water, and mismanagement seems to be killing the company.

Word from sources deep in the bowels of 2701 San Tomas Expressway tells us that the OEMs have been notified that the GTX285 is EOL'd, the GTX260 is EOL in November or December depending on a few extraneous issues, and the GTX275 will be EOL'd within 2 weeks. I would expect this to happen around the time ATI launches its Juniper-based boards, so before October 22.


http://www.semiaccurate.com/2009/10...x275-gtx260-abandons-mid-and-high-end-market/


the lack of competition isn't good for the consumer
 
What? I've only bought Nvidia cards for the past 5 years. I didn't know they were in trouble.
 
From a source that doesn't announce its potential inaccuracy in its domain name?
 
I'm a bit skeptical since they only reference themselves in the article.
 
the writer is a regular on the Inquirer... they're what, accurate 50% of the time? Probably less.
 
I don't instantly believe it, but I think it's a very bad time for Nvidia right now.

With ATI's very high 5800 yields (and their pricing), and reading about Nvidia's problems getting their new GT300 to work properly (with better yields), it's gonna be pretty hard for them.

I've always favoured Nvidia (generally better quality, and less clumsy drivers, especially if your card is more than 1 year old!), so it's sad to hear this.
 
SemiAccurate.com? Dude, the author on that site is the de facto Nvidia hater. He's the one who broke the story about the <2% yields on GT300, the "faked" board, and all sorts of other crap about them.

I don't believe it for a freakin' second, and that site should be banned for presenting baseless, speculative articles as "facts".

It's possible they could go through a reorg if things are bad, but that's only IF this is at all true!
 
I seriously hope this isn't true. Wouldn't this mean they would stop supporting existing video cards as well? I have a GTX 260 and would hate to have support pulled from my model.
 
SemiAccurate.com? Dude, the author on that site is the de facto Nvidia hater. [...] I don't believe it for a freakin' second, and that site should be banned for presenting baseless, speculative articles as "facts".
You cannot read this site, like Fudzilla, and think you are reading facts. They report on rumors within the industry. Some are made up completely. Some are true but don't turn out quite as predicted because things change behind the scenes. And sometimes they are dead on.
Just an FYI, this is the guy who said AMD would buy ATI.

It is always an interesting read, just like the sites that speculate about rumors of soon-to-be-released products (Apple's next big thing at their show, MS, Intel, etc.). Sometimes those are right on and sometimes nothing is right in the end.
 
"Patently untrue," says Nvidia on the PC World website. "We are not phasing out the products you list below [the GTX 260, GTX 275, and GTX 285] by the end of this year. We are in the process of transitioning to Fermi-class GPUs and, in time, phasing out existing products,” says Nvidia on ZDNet. What is true though is that their focus will be more on business applications and less on gaming GPU's.
 
"Patently untrue," says Nvidia on the PC World website. "We are not phasing out the products you list below [the GTX 260, GTX 275, and GTX 285] by the end of this year. We are in the process of transitioning to Fermi-class GPUs and, in time, phasing out existing products,” says Nvidia on ZDNet. What is true though is that their focus will be more on business applications and less on gaming GPU's.

D: Do not want!
 
I'm afraid this is another big step backwards for PC gaming, after losing great PC exclusives like Crysis (please don't start discussing Crysis's gameplay now; at least everyone will agree the graphics rock).
I think ATI isn't going to try to produce much better GPUs since they don't have to be afraid of Nvidia making a new monster GPU. And if they have a monopoly on them... D:
 
what if it's because the resources to create the chipsets are running low?? I mean, look at the PS2: there was a mineral that only African children could mine, and think of how many hours some kid had to scrape away so that another kid could have a big smile on his face. I'm just throwin' that out there as a possible cause.
 
"Patently untrue," says Nvidia on the PC World website. "We are not phasing out the products you list below [the GTX 260, GTX 275, and GTX 285] by the end of this year. We are in the process of transitioning to Fermi-class GPUs and, in time, phasing out existing products,” says Nvidia on ZDNet. What is true though is that their focus will be more on business applications and less on gaming GPU's.

Makes sense. You can't sell many $400 graphics cards for playing WoW.

It pisses me off that over the past 5 years I have seen PC games in all the local shops go from rows, to small sections, to 2 or 3 games. Though it's true that many PC gamers just buy online or pay subscriptions now, more so than 5 years ago, one could assume.
 
Nvidia dropping Nforce chipsets:

Nvidia drops Nforce chipsets

Nvidia has confirmed that the company has essentially placed its Nforce chipset line on hiatus, given the legal wrangling between itself and Intel.
According to Robert Sherbin, the lead corporate communications spokesman for Nvidia, the company will "postpone further chipset investments".
Sherbin also dismissed a report that Nvidia was pulling out of the mid-range and high-end GPU market as "patently untrue". But Nvidia's recent chip introductions do imply that a shift in the graphics company's traditional stance is underway.

http://www.pcmag.com/article2/0,2817,2353939,00.asp?kc=PCRSS03069TX1K0001121
 
That article reminded me of 3Dlabs, which it says were just into workstation GPUs and not desktop gaming GPUs. They haven't done that since around 2006, though; they're just doing low-power embedded stuff and are owned by Creative now.

So Nvidia is trying to bridge the workstation and desktop GPU markets more (Quadro + GeForce + Tesla) than they were before (Quadro + GeForce).

It would be interesting for workstations if they could get an x86 or non-x86 CPU going under their own name and pair it with their Tesla or Quadro GPUs for doing the bulk of the parallel processing. They would have to produce their chipsets again, but with different specs.
 
I'm afraid this is another big step backwards for PC gaming [...] And if they have a monopoly on them... D:

Hopefully someone will step up.
 
Not sure what to think about all this, but whatever. I've been a faithful NVIDIA user for a long time now (the past 5 or so generations) and I'd hate to have to switch to ATI because NVIDIA isn't willing to try anymore. Not that I'd really mind ATI; I've just had great experiences with my NVIDIA cards.

I wouldn't worry too much about ATI slacking off if NVIDIA does step down from the desktop GPU game; if they get too lax or stop improving, new competitors will arise... so they won't.
 
Yeah, at this point I'm far from worried about ATI slacking because of a lack of competition. Prices, however, won't drop until there is a rival card (from Nvidia, or a refresh from ATI). But that is normal; in the past both ATI and Nvidia have held their cards' prices until the other catches up from a delayed launch.

If I were ATI I'd just be glad they are picking the lower-power route, which also costs less money per chip. Then they can earn more without gouging customers for an expensive card, and put that into R&D for a better chip later.
 
I'd have Nvidia ahead in this game, especially considering that ATI has still yet to put even a window-dressing feature like physics processing onto the GPU. But I'm an ATI user nonetheless...
 
Not sure what to think about all this, but whatever. [...] I wouldn't worry too much about ATI slacking off if NVIDIA does step down from the desktop GPU game [...]

Well I know what you can think. You can think this article is a whole load of BS and FUD...

Honestly guys, Nvidia isn't going away or abandoning mid/high-end GPUs. Don't panic.

As posted earlier, probably the only fact in the article is that Nvidia is slowing shipments of its current chips to start draining supply ahead of Fermi. It's quite normal.

Also, Fermi IS a high-end GPU. Yes, it has some cool things in there that are geared towards business users (ECC, for example), but all of the CUDA cores and just the raw increase in computing power will help games immensely, especially with DirectCompute coming in DX11.

The sky is not falling, guys. If Fermi comes out and it turns out to be a huge flop, then you can start to worry!
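
For anyone who hasn't seen what that CUDA/DirectCompute style of work actually looks like, here's a minimal sketch in CUDA: one lightweight thread per array element, which is exactly the kind of job thousands of CUDA cores chew through. Toy code only; the kernel name and numbers are made up for illustration, not taken from any real game or driver.

Code:
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// One thread per element: the data-parallel model that both CUDA and
// DX11 DirectCompute expose. Purely illustrative toy kernel.
__global__ void scale_and_add(const float* a, const float* b, float* out, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // global thread index
    if (i < n)
        out[i] = 2.0f * a[i] + b[i];                // one element per thread
}

int main()
{
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    // Host buffers
    float* ha = (float*)malloc(bytes);
    float* hb = (float*)malloc(bytes);
    float* hout = (float*)malloc(bytes);
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    // Device buffers, classic cudaMalloc/cudaMemcpy style
    float *da, *db, *dout;
    cudaMalloc((void**)&da, bytes);
    cudaMalloc((void**)&db, bytes);
    cudaMalloc((void**)&dout, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;  // round up
    scale_and_add<<<blocks, threads>>>(da, db, dout, n);

    cudaMemcpy(hout, dout, bytes, cudaMemcpyDeviceToHost);
    printf("out[0] = %.1f (expect 4.0)\n", hout[0]);

    cudaFree(da); cudaFree(db); cudaFree(dout);
    free(ha); free(hb); free(hout);
    return 0;
}

Roughly the same kernel could be written as a DX11 compute shader; the point is that the per-thread work is trivial and the card's whole job is running millions of those threads.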
 
I'd have Nvidia ahead in this game, especially considering that ATI has still yet to put even a window-dressing feature like physics processing onto the GPU. But I'm an ATI user nonetheless...
Actually, ATI did support Havok on the GPU, although now Intel owns Havok...
Nvidia happened to buy Ageia (PhysX), so that is theirs to put on their cards.

But do you think that, in the end, a proprietary type of physics from one player in the market will win, or something from an open standard like OpenCL or a future version of Microsoft's DirectX?
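
Whichever API ends up winning, the work itself looks the same: thousands of independent particles updated in parallel every frame. Here's a toy CUDA sketch of a per-particle step, CUDA only because that's the toolkit I've played with; this is illustrative, not the actual PhysX or Havok API.

Code:
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Toy per-particle update: gravity, an explicit Euler step, and a crude
// ground bounce. Every particle is independent, which is why this kind
// of physics maps so well onto a GPU. Not real PhysX/Havok code.
__global__ void step_particles(float* pos_y, float* vel_y, int n, float dt)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    vel_y[i] -= 9.81f * dt;        // apply gravity
    pos_y[i] += vel_y[i] * dt;     // integrate position
    if (pos_y[i] < 0.0f) {         // hit the ground plane
        pos_y[i] = 0.0f;
        vel_y[i] *= -0.5f;         // bounce, losing half the speed
    }
}

int main()
{
    const int n = 65536;
    const size_t bytes = n * sizeof(float);

    float* h_pos = (float*)malloc(bytes);
    float* h_vel = (float*)malloc(bytes);
    for (int i = 0; i < n; ++i) { h_pos[i] = 10.0f; h_vel[i] = 0.0f; }  // drop from 10 m

    float *d_pos, *d_vel;
    cudaMalloc((void**)&d_pos, bytes);
    cudaMalloc((void**)&d_vel, bytes);
    cudaMemcpy(d_pos, h_pos, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(d_vel, h_vel, bytes, cudaMemcpyHostToDevice);

    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    for (int step = 0; step < 120; ++step)              // ~2 seconds at 60 Hz
        step_particles<<<blocks, threads>>>(d_pos, d_vel, n, 1.0f / 60.0f);

    cudaMemcpy(h_pos, d_pos, bytes, cudaMemcpyDeviceToHost);
    printf("particle 0 height after 2 s: %.2f m\n", h_pos[0]);

    cudaFree(d_pos); cudaFree(d_vel);
    free(h_pos); free(h_vel);
    return 0;
}

An OpenCL or DirectCompute version would be the same idea with different boilerplate, which is why the open-standard question is really about tooling and vendor lock-in rather than what the hardware can do.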
Well I know what you can think. You can think this article is a whole load of BS and FUD... [...] probably the only fact in the article is that Nvidia is slowing shipments of its current chips to start draining supply ahead of Fermi. It's quite normal. [...] Also, Fermi IS a high-end GPU. [...]
You seem to be a bit paranoid that we forgot what is real and what isn't.
Stock of certain Nvidia GPUs is drying up. There are theories as to the reasons, but ignoring those, Nvidia is still not in a good spot if that is the case (even if it is normal). If Nvidia's new product is delayed and production of current cards is slowing because of a bad prediction of the new product's arrival, how can that be good?

The title of the article (killing mid/high-end parts) is about current GPUs, which may be seeing a shortage (who cares why at the moment). The title is not about Fermi. Fermi is not Nvidia's high-end part because it is not here yet. Fermi WILL be a high-end part when it launches later this year or in early 2010. But time will tell what the consequences of this GPU shortage, with no new card in sight, will be for Nvidia.

And yes, Fermi is a desktop 3D gaming card. But its design is a big departure from how previous cards have been (ATI and Nvidia) and looks like something that might be meant to meet Intel's Larrabee halfway (a CPU/GPU combination).
That would also fit the rumors about Nvidia being more prominent in the workstation market in a couple of years than it is now, while still having a foot in the desktop market. But it could mean Nvidia won't hold quite the same position it currently enjoys in gaming, because the design of the card is drastically different. We shall have to see how they choose to balance their goals for each market. ;)
 
As a 3D artist, I kind of like how they're going to be taking a step towards workstation functionality with their normal cards, because I've had lots of problems with both Nvidia and ATI cards in my 3D apps, and I play too many games to want to buy a real workstation card.

As a gamer, I am afraid this is another stab into the open wound of PC gaming.
 
All I know is that I was thinking of buying an 8800 GT or something, and the card was nowhere to be found at Newegg.
 
Seems like our Chinese friends are really busy; my 5850 got delayed another week. :/

What's going on?
 