Was it dumb of me to buy a graphics card now?

bbson john

As you all know, a new version of Windows, i.e. Windows Vista, is going to be released, and it will come with DirectX 10. The bad news is that I recently bought a graphics card, a GeForce 7900GT, which supports DirectX 9 and cost $350. Do you think this card will support DirectX 10? If not, have I just wasted $350 on a card that doesn't support the new technology? Should I have held off buying any graphics card until Windows Vista is released? :(
 
I'm pretty sure one of the latest cards out atm can handle DirectX 10.

But hell, I should have tried to convince you to send me the card in the mail. ****ing 5700. :<
 
No, a 7900GT does not support DirectX 10. The new DX10 chips from ATI and nVidia have yet to be released. It's not really a big mistake though; that card will be excellent for at least 1.5 years.
 
The Brick said:
No, a 7900GT does not support DirectX 10. The new DX10 chips from ATI and nVidia have yet to be released. It's not really a big mistake though; that card will be excellent for at least 1.5 years.

DirectX 10 is a new technology, so it makes sense that my card does not support it. But future development of graphics drivers will follow the DirectX 10 road rather than DirectX 9, which means the graphics technology in my card is not going to get any more upgrades. I am kind of pissed off. :x
 
7900GT. Pissed off. DOES NOT COMPUTE.
 
You won't need DX10 for about a year at least. 99% of DX10 games will also support DX9. And the biggest change from DX9 to 10 is performance efficiency; it won't look any different.
 
The Brick said:
And the biggest change from DX9 to 10 is performance efficiency; it won't look any different.

Does that mean there will be less lag, or what?
 
It's just that DX10 will perform slightly better than DX9 on the same hardware, but tests will have to prove that once it's out. Trust me, you made a good choice.
 
The Brick said:
It's just that DX10 will perform slightly better than DX9 on the same hardware, but tests will have to prove that once it's out. Trust me, you made a good choice.

Doesn't DX10 support features that DX9 can't handle?

I wouldn't say he made a good choice, but it's not the end of the world either. I'm waiting till DX10 is released with mid-range cards before I upgrade my PC. Reason being, there's nothing on the market right now that requires something as ridiculously powerful as a 7900 GT for DX9 games; I'd be better off saving the money and investing it in something with better long-term potential.

:|
 
If AMD prices are right coming up, I'm thinking of buying an AMD S939 system with the 7900GT.

Two reasons:
1) I don't want to buy 1st-generation/revision stuff (DX10, Conroe, AM2) and mess with motherboard bugs or 1st-gen performance. 2nd-gen products will perform better, have fewer problems, and be more relevant (more DX10 games out).
2) The 7900GT costs a lot less than other high-end cards, and a lot less than any new card will cost when it comes out. Performance/cost ratio = high.
I can live with that. ;)
 
DX10 is not just a little better than DX9. The API itself is a very big improvement, and that's not the only change: the drivers for Vista have to be significantly reworked and take a much longer time to develop.

DX10 introduces Pixel Shader 4.0, pushes for unified shaders, decreases CPU overhead significantly (the CPU will be less of a bottleneck), adds more and bigger instruction sets, removes the need to check against DX9-or-lower capabilities, etc.
The next thing is driver support. With XP, developers may have to rewrite something 2-4 times to get it working on most graphics cards. With Vista they will only have to write it once, because of the new driver model. This will reduce things like texture corruption, things looking different on nVidia/ATI cards, bugs, and development time for games. It will also bring an increase in speed.
DX10 is significantly faster than DX9.

What "The Brick" is talking about is this: ATI is only implementing DX10 to be as fast as DX9, because they are more concerned with DX9 than DX10; most games are still being made for DX9 and will be until Vista has been out for some time. Nvidia has been tight-lipped, but consider that their card is coming out earlier and doesn't entirely meet the DX10 recommended specifications (no unified shaders).
 
I'm still thinking about getting a Conroe E6600 and a DX10 card by the end of this year. The E6600 seems to be faster than an FX-62 and overclocks really well. And it's only about €300 :naughty:

But I have yet to see how DX10 turns out. I can imagine that even then I'll go for a 7900GT.
 
DX10 is not specifically for "1337 gr@phX"; it basically has better performance, which allows developers to add more detail and more complex effects. The better graphics come from developers utilizing the "performance boost" DX10 gives them. You still might get low framerates if they overdo it with the fancy effects, but the fact of the matter is that they get more effects and detail with fewer of the restrictions imposed by the hardware.
 
Why are people waiting for the first-generation DX10 cards and bitching about the current generation? I say stick to the 7900 GT, because most games fully use the DX9 architecture. The 7900 GT is pretty good and is cheap. Upgrade to DX10 when you REALLY have to, like in two years' time or so, when DX9 games become obsolete...
 
DirectX 10 isn't really that different from DirectX 9, just a few new features and that's it, unlike the jump from 8.1 to 9. Also, I would wait a year like everyone else said, so the bugs are sorted out with Vista, games, drivers, etc.
 
I disagree, giant; I believe DX10 is a significant improvement over DX9.
From the drivers, to PS 4.0, to being optimized for unified shaders, a geometry shader integrated into the drawing pipeline (which should dramatically increase performance on various things), particle systems and data dumping without CPU intervention, very optimized new functions for procedurally generated graphics, and a hell of a lot less CPU intervention on everything else.

I can easily see it proving much faster.
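
To illustrate the "particle systems and data dumping without CPU intervention" part: here is a rough sketch of how a DX10 stream-output particle system works (my own illustration; the buffer layout is made up and the simulation/render shaders aren't shown). The key is DrawAuto(), which redraws however many vertices the GPU itself wrote last time, so the CPU never reads the particle data back or even knows the particle count:

// Sketch only: ping-pong particle buffers updated entirely on the GPU via
// D3D10 stream output. The caller swaps readBuf/writeBuf every frame and
// seeds the very first frame with an ordinary Draw() call.
#include <d3d10.h>

struct Particle { float pos[3]; float vel[3]; float age; };

ID3D10Buffer* CreateParticleBuffer(ID3D10Device* dev, UINT maxParticles)
{
    D3D10_BUFFER_DESC desc = {};
    desc.ByteWidth = maxParticles * sizeof(Particle);
    desc.Usage     = D3D10_USAGE_DEFAULT;
    // Usable both as geometry input and as a stream-output target.
    desc.BindFlags = D3D10_BIND_VERTEX_BUFFER | D3D10_BIND_STREAM_OUTPUT;

    ID3D10Buffer* buf = NULL;
    dev->CreateBuffer(&desc, NULL, &buf);
    return buf;
}

void UpdateAndDrawParticles(ID3D10Device* dev,
                            ID3D10Buffer* readBuf,   // last frame's particles
                            ID3D10Buffer* writeBuf)  // this frame's output
{
    UINT stride = sizeof(Particle), offset = 0;

    // Pass 1: a geometry shader (bound elsewhere) simulates the particles and
    // streams the survivors, plus any newly emitted ones, into writeBuf.
    dev->IASetVertexBuffers(0, 1, &readBuf, &stride, &offset);
    dev->SOSetTargets(1, &writeBuf, &offset);
    dev->DrawAuto();   // draws however many particles exist; the GPU keeps the count

    // Unbind the stream-output target before using writeBuf as input.
    ID3D10Buffer* nullBuf = NULL;
    dev->SOSetTargets(1, &nullBuf, &offset);

    // Pass 2: render this frame's particles with the normal drawing shaders.
    dev->IASetVertexBuffers(0, 1, &writeBuf, &stride, &offset);
    dev->DrawAuto();
}

Under DX9 the CPU had to either run the simulation itself or read results back to know how many particles to submit; here it just issues the same handful of calls every frame.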
 
I got a 7800GT a few months ago, amazing card. I refuse to upgrade until I can't play games at 1680x1050 with max settings, which I can with every game so far, including precious Oblivion. Or, you know... until I can't play at max settings at SOME resolution.
 
I keep looking at the X1900XT and then back at the 7900GT for $100 less. The X1900XT performs better than the 7900GT, but it also costs more. The 7900GT still has very high performance (even if it doesn't quite match the X1900XT), uses a lot less power, is a single-slot card, and is $100 less. Yeah, I think I'll be going with the 7900GT. Too many positives.
 
7900 GT FTW!!! I played Most Wanted on that GPU... it was really beautiful... it runs Lost Coast pretty well...
 
hneaz said:
7900 GT FTW!!! I played Most Wanted on that GPU... it was really beautiful... it runs Lost Coast pretty well...

It still lags occasionally, though.
 
I'm getting a 7600 GT, and after that I'm planning to get a DX10 card. I'm only heading for mid-range 2nd-gen cards, though.
 
I'm waiting to see if DX10 can do real-time weapon change.
 
Minerel said:
Nvidia has been tight-lipped, but consider that their card is coming out earlier and doesn't entirely meet the DX10 recommended specifications (no unified shaders).

Unified shader architecture is NOT part of the DX10 spec, so even though nVidia is currently planning a non-unified design (32 pixel, 16 vertex/geometry pipelines), we can only tell whether ATi or nVidia made the right choice when next year rolls around with benchmarks.

BTW, OP (original poster... I don't think I've seen that abbreviation used around here much), no, you did not really make a bad choice... I mean, hell, your card will get you awesome performance in every game released this year. Next year some games will start to use DX10 (not completely, right away), but your card will still run those games; I am very confident developers will ship a DX9 version, unless they want their game to fail because most people can't run it.

Probably mid-to-late next year (just pay attention during GDC and E3 next year) you will start seeing a lot more games using DX10 (EVE Online has already started rewriting its graphics system to be pure DX10... also, WoW will eventually get a graphics overhaul: "We know that sometime in WOW's lifetime, we're going to upgrade the graphics engine. We don't know when that will be, but we built the game from the ground up knowing that it's going to be around a long time and needs to be upgradable." Source: http://www.gamespot.com/pc/rpg/worldofwarcraftexp1/news.html?page=2&sid=6151428&q )

Anyway, your decision is fine; it's just that next year you won't be getting the top-of-the-line experience, but everything will still run.
 
If the Nvidia G80 does not have a unified shader architecture, that does not mean it is out of DX10 spec, although most people coding for DX10 will be optimizing for a unified shader architecture. And Minerel did say "not entirely meeting the DX10 recommended spec," which probably is not the same as the required spec.

FYI, some rumors say it will be unified after all. :O
link
"We have previously mentioned that G80 is likely to take on the Unified Shader approach and supports Shader Model 4.0."
 