Unreal 1 : Half Life 2 is to 3Dfx : Nvidia

Zyphria

Disclaimer: This post isn't about how much Nvidia sucks. Rather, it's a look at how games can alter an entire industry. Just as George Lucas' quest to get digital projectors into every theatre reshaped film, it amazes me how individuals (or smaller companies, in the case of Valve) can alter how billions of dollars will be spent, how future games will be designed, and how innovation takes a great step forward.

For anyone who recalls a game called Unreal, you'll also probably remember how new and refreshing it was when it was released, which was about the same time as Half-Life (please correct me if I'm wrong here). One of the biggest features of this new game was the inclusion of 32-bit textures. This allowed the developers to enhance textures in a way they had been very restricted from in the past. There was a problem, though. When played on a 3Dfx card, there was a fairly obvious issue: banding. You would look at a texture (the sky was the most outstanding example) and you could tell right from the get-go where you were losing out. If you want to see this phenomenon for yourself, take a 32-bit color wallpaper, then drop your display down to 16-bit color. Yeck.
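If you want to see the arithmetic behind that banding, here's a minimal sketch (assuming Python with NumPy is available; nothing here is from Unreal or any driver, just an illustration): a smooth 8-bit-per-channel gradient collapses to a handful of shades once a channel only gets the 5 or 6 bits a 16-bit mode allows, and those jumps are the bands you see in the sky.

```python
# Minimal sketch: how many distinct shades survive when an 8-bit channel
# is squeezed into the 5 bits it would get in a 16-bit (RGB565) mode.
import numpy as np

# A smooth horizontal gradient, 8 bits per channel ("32-bit color" style).
gradient = np.linspace(0, 255, 1024).astype(np.uint8)

def quantize_channel(values, bits):
    """Reduce a channel to 'bits' of precision and scale back to 0-255."""
    levels = (1 << bits) - 1
    return np.round(np.round(values / 255.0 * levels) / levels * 255.0).astype(np.uint8)

red_16bit = quantize_channel(gradient, 5)   # red and blue get only 5 bits in RGB565

print(len(np.unique(gradient)), "shades at 8 bits per channel")   # 256 smooth steps
print(len(np.unique(red_16bit)), "shades at 5 bits per channel")  # 32 -> visible bands
```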

At the time, though, the 3dfx cards still got the most fps. While multiplayer was limited (mostly due to some rather chunky netcode that wasn't fixed until Unreal Tournament came out a year and a half down the line), the game did have bots, so framerates still mattered. However, the gaming community on the whole started to realize how much had been sacrificed to get that extra 25%-33% more frames per second. The game just didn't look...good (for want of a better word).

Coupled with some fatal management mistakes and their poor handling of the 16-bit versus 32-bit color issue, 3dfx went under in 2000, just two and a half years later.

As I stated in another post, I don't think Nvidia will make those same poor decisions, but they do need to come to grips with an entire failed line of video cards. By all appearances, the Radeon 9700 Pro looks to be the Geforce 2 of tomorrow, in terms of a solid baseline video card that all games in years to come will build towards. (For anyone wondering, the Radeon 9700 Pro core is essentially being resized down for the RV380 and made very cost effective, just like the Geforce 2 MX 200/400.)

DX9 isn't like DX7 or DX8, where game developers didn't really focus on things like bump mapping or pixel shaders for enhancing maps. The "Making of Half Life 2" movie had two parts, one on Alyx and one on "The Wall", and the latter truly demonstrated the real power of DX9 as a means of boosting graphical content while significantly reducing load and improving framerates in the overall picture. Just as hardware transform and lighting (often referred to as T&L) has become a core part of almost all modern games, so will DX9 features like Pixel Shader 2.0.

So we're at a crossroads, one which has great significance not only in the short term, but most likely the mid and possibly long term, for the gaming industry as a whole. Sounds a bit grandiose, doesn't it? Time will always tell, but from some objective analysis of the pieces laid down in front of me, it looks fairly clear.

Comments, corrections, and constructive criticism are more than welcome. As for the rest of the posts calling me an ATI fanboi or somesuch, please just keep it to yourself. I've seen enough of it already.
 
good post.

i was thinking today, valve has been working on halflife2 for 5 years. they have had more than enough time to let nvidia know what the source engine is doing and how to make it run fast, yet the nvidia line can't handle it nearly as well as ATI's cards. why is this? what is a possible explanation besides some conspiracy theory that ati made a deal with valve to shut out nvidia? bad planning on nvidia's part?
 
Originally posted by poseyjmac
good post.

i was thinking today, valve has been working on halflife2 for 5 years. they have had more than enough time to let nvidia know what the source engine is doing and how to make it run fast, yet the nvidia line can't handle it nearly as well as ATI's cards. why is this? what is a possible explanation besides some conspiracy theory that ati made a deal with valve to shut out nvidia? bad planning on nvidia's part?

The bad performance for NVidia was not wholly unexpected. Both the Halo benchmark and the Tomb Raider: Angel of Darkness benchmark showed the relatively poor DX9 performance of NVidia cards.

Now, unless ATi bought off both of these, this is NVidia's fault.
 
Yikes, I didn't realize how long that post was until I looked at it :) !

I think the Radeon 9700 Pro caught Nvidia so entirely off guard that they were not able to get the R&D for DX9 down pat when they pushed the already-late GeforceFX out the door.
 
Originally posted by Zyphria
DX9 isn't like DX7 or DX8, where game developers didn't really focus on things like bump mapping or pixel shaders for enhancing maps. The "Making of Half Life 2" movie had two parts, one on Alyx and one on "The Wall", and the latter truly demonstrated the real power of DX9 as a means of boosting graphical content while significantly reducing load and improving framerates in the overall picture. Just as hardware transform and lighting (often referred to as T&L) has become a core part of almost all modern games, so will DX9 features like Pixel Shader 2.0.
There's no doubt in my mind that your post is one of the best posts ever written on this forum, but I have to comment on this part. The feature in "The Wall" movie you're talking about is called "normal mapping" and is in fact a DX7 feature. I got that info from this page. Since it's a DX7 feature, NVidia cards can handle it quite well.
Good examples of pure DX9 features are shown in that latest HDR tech movie.
 
Good post. I remember the stuff with Unreal and the problems with 3dfx back then.
 
Originally posted by Zyphria
Yikes, I didn't realize how long that post was until I looked at it :) !

I think the Radeon 9700 Pro caught Nvidia so entirely off guard that they were not able to get the R&D for DX9 down pat when they pushed the already-late GeforceFX out the door.

That's an interesting point. :bounce:
 
Originally posted by Feath
The bad performance for NVidia was not wholly unexpected. Both the Halo benchmark and the Tomb Raider: Angel of Darkness benchmark showed the relatively poor DX9 performance of NVidia cards.

Now, unless ATi bought off both of these, this is NVidia's fault.

Yeah, I'm sick of people blaming Valve or ATI for what happened with the FX line.

The 5800 Ultra failed, so they rushed out a quick solution, a la the 5900 Ultra. Furthermore, I can't believe they had the balls to advertise 5600s as DX9 cards.

A lot of people paid good money for these "top-end" cards (I was almost one of them). Nvidia wasn't the only culprit, though. Various hardware sites, especially the pro-Nvidia ones, called the card the "fastest GPU on the planet" without real testing in DX9 applications (albeit there weren't too many of those available at the time). They were working from misinformation (i.e. Nvidia calling the FX line "DX9++" compatible, suggesting it was beyond mere DX9).

I'm just glad I did my homework before I bought. All of the ATI reviews said that the Radeons were faster in pixel and vertex shader applications (i.e. games of the future), and that's what I based my decision on.
 
yes i have to agree, good post, i'd almost forgotten about the first unreal and the 3dfx problems. It strikes remarkable similarities with the current nvidia and ati happenings now.
It's just something that nvidia hadn't concentrated on; they probably never realised how completely the new dx9 features could be incorporated in games. Ati, obviously, having developed the 9700 at their own pace, could sit back and just program them in as they saw fit. As has already been stated, nvidia were late, over budget and under-teched with the FX series. They ended up rushing the product out; even before it hit the shops, analysts were questioning its performance capabilities.
Hence the 5900: it's what the FX series should have been to begin with. they are now nearly a generation behind as a result. This was still an ugly "patch", but it was the best they could do without re-organising the actual structure of the card.

One last word: i'm an nvidia fanboy if you like, but at the end of the day, they haven't kept themselves in the market, and hl2 is making them pay severely for it. I have a gf3 now and i was looking at a 5900, but tbh there isn't really any point. i usually skip every other generation of card to keep costs down, eg. get gf2, skip gf3, get gf4 etc., but in this case i'll just get the 9800, since i need it to last past the following generation. it's sad to say, but the 5900 hasn't fully made it into this generation yet.

/edit spelling
 
Normal Mapping is DX7? Phew, that's alright by me :cheers:
 
I still have a voodoo2. I e-mailed Gabe and he said the voodoo2 performs amazingly on HL2, can't wait.
 
yeah the voodoo 2 sits real nice on the hl2 box, keeps the dust off it.
 
TBH, i'm VERY pissed at nvidia

it was false advertising, no other way around it. they created drivers that would artificially boost their scores in synthetic benches like 3dmark03, which had become so prevalent, far beyond the actual, real-game capabilities of the card, then sold those scores as real performance.

in doing so, they conned thousands upon thousands into buying their cards based on cooked information. they claimed their cards were fully DX9 capable, yet in every DX9 game so far, and even one OpenGL game with high-level shaders (Doom 3), the cards have either performed terribly or have had to have special, sub-DX9-standard codepaths built for them. if you're not running the games at the standards that the DX9 API calls for, you're not running the game in DX9. there is no way you can justify or weasel your way out of that, no way.
 
Yeah, the FX series are fully dx9 capable, they are just very very poor at performing under it; there is a difference between being able to do something and being able to do it well :(

The biggest reason why Gabe didn't want the det 50 drivers used in benchmarking was because in one of the hl2 levels, they totally removed the fog. This is totally unacceptable; nobody likes a driver or card removing a game feature.
 
Originally posted by MaDMaXX
Yeah, the FX series are fully dx9 capable, they are just very very poor at performing under it; there is a difference between being able to do something and being able to do it well :(

The biggest reason why Gabe didn't want the det 50 drivers used in benchmarking was because in one of the hl2 levels, they totally removed the fog. This is totally unacceptable; nobody likes a driver or card removing a game feature.


That's the problem....they are NOT fully DX9 capable....

What's the point of saying the card supports a feature if it won't even give you playable fps?...........


It's nothing but nvidia blatantly lying to their customers.....

Just because a product has something poorly built into it....doesn't mean it's capable of it.


It's like ABS brakes on a car......a manufacturer may say a car has ABS.....but if the ABS doesn't function correctly then it isn't ABS.
 
Originally posted by MaDMaXX
Yeah, the FX series are fully dx9 capable, they are just very very poor at performing under it; there is a difference between being able to do something and being able to do it well :(

The biggest reason why Gabe didn't want the det 50 drivers used in benchmarking was because in one of the hl2 levels, they totally removed the fog. This is totally unacceptable; nobody likes a driver or card removing a game feature.
Add a few more "very"s to that :)

At any rate, I think Gabe knew what Nvidia was doing in the 50 drivers. The 51.75s show a huge loss of image quality compared to the old drivers; even reviewers are surprised and appalled. These drivers are meant to increase speed while totally ignoring IQ. Thus, they are very very very very unsuited for benchmarking.
 
I think all the people who bought an FX card should ask for their money back, stating they didn't get what they paid for.
 
ATi should make a policy that for every FX card they get sent to them they'll return a radeon at half price! They would destroy nvidia's market share in a matter of days!

omg I'm brilliant! :D

e.g.
FX5900 -> radeon 9800 pro
FX5200 -> radeon 9200 pro or 9600 non pro
 
I'll never forget the days when I first bought my voodoo 3 (I was originally on a voodoo banshee). The performance of those cards back then was unbeatable. I have to admit that back then I was a die-hard 3Dfx fan, and I can always remember whispers of this new Nvidia company and their 'Geforce 2', but at the time 3Dfx were about to release their supposedly 'phenomenal' new card, the Voodoo 5. I remember reading all the hype they had put out about it being a dual processor and how it had a fullscreen AA mode! I'll never forget the day I went out and bought that card! After forking out nearly £400 I fired up 3DMark 2001 and was deeply disappointed :(( . Not only did it not run any of the new tests (T&L, bump mapping etc.), but its frame rates on the normal tests were no better than my old Voodoo 3!! I couldn't believe it.

Later that week I went round to a friend's; he had a new Geforce 2 he got for less than half the price of mine, and I remember taking the piss out of his feeble card and challenging him to run 3DMark to prove how feeble it was. Well, I was shocked!!! I couldn't believe what I was seeing: this card out-performed my crappy Voodoo by miles and was running all the tests!! Since then I totally converted to Nvidia and have put my faith in them ever since. But now it seems history is repeating and yet another company has stolen the crown. Luckily I learned my lesson, so I have been watching the benchmarks of Nvidia's FX cards closely and decided not to go out and buy the 5900 straight away. Thank god I didn't, because now I'm going to buy the Radeon 9800 Pro for less than Nvidia's rip-off card and have better performance all round too.

I just hope that ATi doesn't go the same route as Nvidia.
Keep it real ATI!!
 
Originally posted by Incitatus
ATi should make a policy that for every FX card they get sent to them they'll return a radeon at half price! They would destroy nvidia's market share in a matter of days!

omg I'm brilliant! :D

e.g.
FX5900 -> radeon 9800 pro
FX5200 -> radeon 9200 pro or 9600 non pro


what would they do with all the FX cards...?

1. Play FX baseball

2. load them into trucks and dump them in nvidia's parking lot

3. Sell them to gateway or dell

4. dump them in Afghanistan.
 
Originally posted by crabcakes66
what would they do with all the FX cards...?

1. Play FX baseball

2. load them into trucks and dump them in nvidia's parking lot

3. Sell them to gateway or dell

4. dump them in Afghanistan.

That's so LoL material! haha :D

Here's #5: take the transistors and chips apart, build the mega FX5900 card, and install it on Deep Blue!
 
Originally posted by Tequila
Normal Mapping is DX7? Phew, that's alright by me :cheers:

bump mapping is DX7, not normal mapping. you don't have to worry though, because I think normal mapping is very rarely used for walls, and if it is, it can be substituted with bump mapping on older machines.

Normal mapping can be used to create an entirely 3D object using none other than normal maps, but bump mapping just uses height maps, which hold x and y coordinates, unlike normal maps which hold x, y and z coordinates. :afro:
 
Bump maps store height.

Normal maps store the direction the surface is facing (the normal is the direction perpendicular to the face) by using different color values to represent rotations in different axes.

Normal mapping creates more realistic shading... it does not actually make the 3d object. That would be done using a heightmap.
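If anyone wants to see what that encoding actually looks like, here's a minimal sketch (assuming Python with NumPy; the function and array names are just for illustration, not from any game or tool mentioned in this thread). It derives a normal map from a height map with finite differences and packs each pixel's surface direction into RGB, which is why flat areas of a normal map come out as that familiar light blue.

```python
# Minimal sketch: bump/height map stores one height per pixel; a normal map
# encodes the surface direction (x, y, z) as an RGB colour per pixel.
import numpy as np

def height_to_normal_map(height, strength=1.0):
    """Convert a 2D height map (floats) to an RGB normal map (uint8)."""
    # Slope of the surface in x and y, from neighbouring heights.
    dz_dx = np.gradient(height, axis=1) * strength
    dz_dy = np.gradient(height, axis=0) * strength

    # The surface normal points "up" and away from the slope.
    nx, ny, nz = -dz_dx, -dz_dy, np.ones_like(height)
    length = np.sqrt(nx**2 + ny**2 + nz**2)
    nx, ny, nz = nx / length, ny / length, nz / length

    # Pack the -1..1 normal components into 0..255 RGB, the typical encoding.
    rgb = np.stack([nx, ny, nz], axis=-1)
    return ((rgb * 0.5 + 0.5) * 255).astype(np.uint8)

# A flat surface with a single bump in the middle.
y, x = np.mgrid[-1:1:64j, -1:1:64j]
bump_height = np.exp(-(x**2 + y**2) * 8)
normal_map = height_to_normal_map(bump_height, strength=4.0)
print(normal_map.shape, normal_map[32, 32])  # flat-ish centre encodes roughly (127, 127, 255)
```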
 
I'm sure I've seen normal maps used to, say, turn a flat wall into a flat wall with a body coming out of it. I'm also sure a normal map is called a normal map because it moves the positions of the normals in the object, changing its 3d representation.
 
Re: Re: Unreal 1 : Half Life 2 is to 3Dfx : Nvidia

Originally posted by Arno
There's no doubt in my mind that your post is one of the best posts ever written on this forum, but I have to comment on this part. The feature in "The Wall" movie you're talking about is called "normal mapping" and is in fact a DX7 feature. I got that info from this page. Since it's a DX7 feature, NVidia cards can handle it quite well.
Good examples of pure DX9 features are shown in that latest HDR tech movie.

Actually, it's DX8.
As far as I know, DX8 was the first to support the kind of "movie shaders" used in 3D modeling programs for creating films like Toy Story, and the first to implement these per-pixel shaders in realtime.
 
Bump maps, height maps (just another name for a map that can be used as a bump map or displacement map), normal maps: none of them change the physical shape of the object. The only map that does is a displacement map, and those aren't used in games because, as you can imagine, they require an enormous poly count to get a good result.
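A rough sketch of that difference (again assuming Python with NumPy; the helper name and the tiny triangle data are made up for illustration): displacement actually pushes the vertices along their normals by the sampled height, so the geometry and silhouette really change, which is exactly what bump and normal maps never do.

```python
# Minimal sketch: displacement mapping moves the vertices themselves along
# their normals by the sampled height, so the mesh's real shape changes.
import numpy as np

def displace_vertices(vertices, normals, heights, scale=0.1):
    """Push each vertex outward along its normal by its height-map sample."""
    return vertices + normals * heights[:, None] * scale

# Three vertices of a flat patch, all with normals facing +Z.
verts = np.array([[0.0, 0.0, 0.0],
                  [1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0]])
norms = np.array([[0.0, 0.0, 1.0]] * 3)
hts = np.array([0.0, 1.0, 0.5])          # values sampled from a height map

print(displace_vertices(verts, norms, hts))
# The second vertex rises 0.1 and the third 0.05 in Z: the geometry itself
# changed, unlike with bump or normal mapping, which only alter shading.
```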
 
Originally posted by crabcakes66
what would they do with all the FX cards...?

1. Play FX baseball

2. load them into trucks and dump them in nvidia's parking lot

3. Sell them to gateway or dell

4. dump them in Afghanistan.

Who gives a **** what they would do with them; burn them is my guess. But they would be off the market and everybody would be using ATi cards. :bounce:

Lol that's the best -> 2. load them into trucks and dump them in nvidia's parking lot :cheers:
 
Somewhere on the ATi developer site they have a picture of a tire using bump mapping for the treads and a picture of a tire using normal maps for the treads. On the one using normal mapping, the treads look 3d, as opposed to the bumped version, which looks more like the picture in the link you gave me.
 
Do you think nVidia will collapse like 3Dfx did if they don't sort out their problems?

I read a whole buncha benchmarks, and I chose the Radeon 9800. I'll be buying it this week.
 
I'm also buying my Radeon 9800 this week. I used Nvidia boards for a long time, and 3dfx before that, but Confucius say:

"A wise man changes his mind... a fool, never!"

Considering all this bad publicity happened in such a short time-span, I think Nvidia will have a hard time restoring their reputation. They show many signs of that whole 3dfx syndrome. The stock is driven by speculation and not necessarily by the quality of the product... and currently they have neither going for them against the juggernaut that is ATI.
 
I am afraid Nvidia will die like 3dfx. Because of their bad support for the cards.
 
Originally posted by Sirdenchalot
I am afraid Nvidia will die like 3dfx. Because of their bad support for the cards.
Don't be afraid just yet... They have a MASSIVE budget behind them to handle times like this. Sure, it's not good, but they will still survive. If the next series, NV4x, goes just as badly, they are in trouble... If the one after that goes equally badly, THEN you can be afraid. Very afraid. The worst part is still that they simply don't listen to people. I mean, come on, we have been dragging up the IQ issue for a long time now. And it's become important. We have two big companies, and one has to outdo the other. Nvidia would stand a big chance of doing just this. And what happens? They create a blurry world of poorly lit pixels just to gain a few fps...
 
You are right, it is kinda...scary? how one company can alter the vision of millions of people. But in this case, by buying a Radeon 9700 card you don't lose fps... you gain fps, and you can make HL2 run with a lot of nasty gfx features... so let's be honest, I'm not saying nvidia ****ed up, but it's either that or Ati just outclassed them :)
 
LOL, Nvidia won't die just yet. Maybe when the quadro series and the Nforce series go down the bin, and they lose every penny they have in a lawsuit, and nobody buys the company.

NV are still much, much richer and bigger than ATi. NV are here to stay.
 
I'm using an Nforce² chipset based mainboard, at least nvidia did do something right =)
 