Zyphria (Newbie, joined Jul 4, 2003)
Disclaimer: This post isn't about how much Nvidia sucks. Rather, it's a look at how games can alter an entire industry. Much like George Lucas' quest to get digital projectors into every theatre, it amazes me how individuals (or smaller companies, in the case of Valve) can alter how billions of dollars will be spent, how future games will be designed, and how innovation takes a great step forward.
For anyone who recalls a game called Unreal, you'll probably also remember how new and refreshing it felt when it was released, around the same time as Half-Life (please correct me if I'm wrong here). One of the biggest features of the game was its support for 32-bit color textures, which let the developers enhance textures in a way they had been very restricted from doing in the past. There was a problem, though. When played on a 3dfx card, one flaw was fairly obvious: banding. You could look at a texture (the sky was the most outstanding example) and tell right from the get-go where you were losing out. If you want to see the phenomenon for yourself, take a 32-bit color wallpaper and drop your display down to 16-bit color. Yeck.
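If you'd rather not toggle your desktop color depth, here's a rough Python sketch of the same effect (my own illustration, using NumPy and Pillow, with made-up file names): render a smooth sky-like gradient at 8 bits per channel, then quantize it down to 16-bit RGB565 and compare the two images.

```python
# Rough illustration of 16-bit banding: render a smooth "sky" gradient,
# then quantize it to RGB565 (a common 16-bit format) so the bands show.
# NumPy/Pillow and the output file names are just this example's choices.
import numpy as np
from PIL import Image

h, w = 256, 512
t = np.linspace(0.0, 1.0, h)[:, None]        # 0 at top, 1 at bottom

# Smooth vertical blue-sky gradient in full 8-bits-per-channel color.
sky = np.zeros((h, w, 3), dtype=np.float64)
sky[..., 0] = 60 + 120 * t                    # red ramps up slightly
sky[..., 1] = 90 + 140 * t                    # green ramps up
sky[..., 2] = 180 + 60 * t                    # blue stays strong
full = sky.astype(np.uint8)

def quantize(channel, bits):
    # Snap an 8-bit channel to the nearest of 2**bits levels, then scale back.
    levels = (1 << bits) - 1
    return np.round(np.round(channel / 255.0 * levels) / levels * 255.0).astype(np.uint8)

# RGB565: 5 bits red, 6 bits green, 5 bits blue.
banded = np.stack([quantize(full[..., 0], 5),
                   quantize(full[..., 1], 6),
                   quantize(full[..., 2], 5)], axis=-1)

Image.fromarray(full).save("sky_32bit.png")    # smooth gradient
Image.fromarray(banded).save("sky_16bit.png")  # visible banding steps
```

The 16-bit version should show roughly the same stair-step bands you'd see in Unreal's skies on a 3dfx card.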
At the time, though, the 3dfx cards still got the most fps. Multiplayer was limited (mostly due to some rather chunky netcode that wasn't fixed until Unreal Tournament came out a year and a half down the line), but the game did have bots, so framerates still mattered if you played that way. However, the gaming community as a whole started to realize how much had been sacrificed to get that extra 25%-33% more frames per second. The game just didn't look...good (for want of a better word).
That poor handling of the 16-bit versus 32-bit color issue, coupled with some fatal mistakes by 3dfx's management, helped send the company under in 2000, just two and a half years later.
As I stated in another post, I don't think Nvidia will make those same poor decisions, but they do need to come to grips with an entire failed line of video cards. By all appearances, the Radeon 9700 Pro looks to be the GeForce 2 of tomorrow: a solid baseline video card that all games in the years to come will build towards. (For anyone wondering, the Radeon 9700 Pro core is essentially being shrunk down for the RV380 and made very cost effective, just like the GeForce 2 MX 200/400.)
DX9 isn't like DX7 or DX8, where game developers largely didn't focus on features like bump mapping or pixel shaders to enhance their maps. The "Making of Half-Life 2" movie had two parts, one on Alyx and one on "The Wall"; the latter truly demonstrated the real power of DX9 as a means of boosting graphical content while significantly reducing load and boosting framerates in the overall picture. Just as hardware transform and lighting (often referred to as T&L) has become a core part of almost all modern games, so will DX9 features like Pixel Shader 2.0.
So we're at a crossroads, one with great significance not only for the short term, but most likely for the mid and possibly the long term of the gaming industry as a whole. Sounds a bit grandiose, doesn't it? Time will always tell, but from some objective analysis of the pieces laid out in front of me, it looks fairly clear.
Comments, corrections, and constructive criticism are more than welcome. As for the rest of the posts calling me an ATI fanboi or some such, please just keep it to yourselves. I've seen enough of it already.