ATI filtering article...

crabcakes66

http://www.3dcenter.org/artikel/2003/11-21_a_english.php


kind of funny... since 9 times out of 10, ATI's filtering looks NOTICEABLY better than Nvidia's...


isn't that what really counts? in-game quality...

why does it matter if they do it with less precision, if it looks better anyway...


doesn't make sense... sounds like these guys have their heads in the sand.
 
The article is pointless.

For starters, they don't show any in-game images. We KNOW that ATI often shows higher-quality AF filtering even though Nvidia should be *technically* better.

And the most moronic thing: they use the NV25 as the comparison. That's the GeForce 4, if I'm not mistaken, meaning all the image quality comparisons to the R300 are completely pointless. The NV3X does NOT have the same image quality as the NV25; that has been PROVEN. Furthermore, you can actually run games with AF on the ATI card. On the GF4, it would mean a loss of performance.

Crappy article, with a crappy comparison.
 
I have both a GF4 and an ATI 9800, and the 9800 looks a lot better in-game with AF switched on or off.

I can't believe they would compare the NV25 to the R300. I never used AF when I had my GF4 except in HL and other really old games; it was nice quality, but not quite as good as on my new 9800.
 
I think some of you are missing the point of this article. The title is "ATI's filtering tricks"; this is not an ATI vs NVIDIA article. They already covered the IQ issues with the GeForceFX series in another article, and here the focus is on ATI. They don't compare ATI and NVIDIA directly; rather, they compare each of them to the "official" IQ requirements for DirectX and OpenGL in separate articles.

The authors of this article are obviously image quality purists. With articles like this, they want to push ATI and NVIDIA into shipping drivers that follow the official IQ requirements more closely. THAT is the true point of this article. The point is not to make hardware recommendations for customers. They don't present the NV25 as an example of a good buy; they use it as an example because its image quality is close to the actual IQ requirements and is thus useful for their IQ analysis.

Originally posted by dawdler
For starters, they don't show any in-game images.
So? There are no in-game images of NVIDIA's brilinear filtering in existence either, but that never stopped you from criticizing it to the max. I sense a little bias here.
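
For anyone unfamiliar with the term: "brilinear" is a mix of bilinear and trilinear, where the full trilinear blend only happens in a narrow band around each mip transition and the rest of the range gets pure bilinear. Here's a rough sketch of the idea in Python; the function names and the band width are made up for illustration and don't claim to match NVIDIA's actual numbers:

```python
import math

def trilinear_weight(lod):
    # Full trilinear: the blend between mip level floor(lod) and the next
    # level ramps linearly across the entire fractional LOD range.
    return lod - math.floor(lod)

def brilinear_weight(lod, band=0.25):
    # "Brilinear" (illustrative model, not NVIDIA's exact implementation):
    # stay on a single mip level (pure bilinear) for most of the range and
    # only blend inside a narrow window around the level transition.
    f = lod - math.floor(lod)
    lo, hi = 0.5 - band / 2, 0.5 + band / 2
    if f <= lo:
        return 0.0                   # pure bilinear from the lower mip
    if f >= hi:
        return 1.0                   # pure bilinear from the upper mip
    return (f - lo) / (hi - lo)      # steep ramp across the narrow band

for lod in (2.0, 2.2, 2.5, 2.8):
    print(f"lod={lod}: trilinear={trilinear_weight(lod):.2f}, "
          f"brilinear={brilinear_weight(lod):.2f}")
```

The narrower the band, the closer it gets to plain bilinear, and the more visible the mip transitions become.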

Personally, I don't care about optimizations that aren't visible to the naked eye. But apparently the authors of this article do, and they're free to do so. I also see a point in investigating and publishing these analyses. After all, somebody has to keep an eye on all the optimizations in case companies try to push them too far.
But in this case, my conclusion would be that the optimizations are too minor to care about.

Wow, big post. I hope I made myself clear. :|
 
Originally posted by Arno
So? There are no in-game images of NVIDIA's brilinear filtering in existence either, but that never stopped you from criticizing it to the max. I sense a little bias here.
Huh? How do you think they found out it existed? Yep, that's right: in-game images (UT2K3 in this case)!

And you sense bias on my account? How about this little line from the article:
"FP24's precision however is always as good or higher than the CineFX proprietary FX12, FX16 and of course FP16."
He obviously has no idea how big the difference in color precision is between FP16 and FP24 (not to mention between FX12 and FP24!).

Or what about this line:
"This is how R300 offers unmatched performance, but doesn't deliver the best image quality. From an "ethics" point of view (whatever that means to ATI and Nvidia) the competition can easily reduce image quality through drivers, to squeeze a bit of extra performance out of the chips and keep up."
9 out of 10 article writers would disagree.


The article is pointless because it's just... pointless. It's what we see that counts. This article says the filtering quality is not as good as Nvidia's. Then WHY does the final image look better?! And WHY ON GOD'S GREEN EARTH would this justify lowering the POORER GF4 image quality to match ATI's "lower quality"!? It's so braindead it's unbelievable. That's not bias. Just read the lines, think about it, and patch that up with what you have read in other articles.
 
Originally posted by dawdler
Huh? How do you think they found out it existed? Yep, that's right: in-game images (UT2K3 in this case)!
No, you're talking about images from UT2K3 where the textures are replaced with very bright colors to visualize the mipmaps. Nobody plays games like that. I'm talking about screenshots from a gamer's perspective.
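
To be clear about what those shots are: each mip level of the texture gets replaced with a flat, saturated color, so the mip transitions (and any brilinear band between them) become plainly visible. A minimal sketch of the trick, assuming a NumPy-style setup; the helper name and color choices are mine, not from the article or any real tool:

```python
import numpy as np

# Solid colors to assign to successive mip levels (illustrative choices).
COLORS = [(255, 0, 0), (0, 255, 0), (0, 0, 255), (255, 255, 0), (255, 0, 255)]

def colored_mip_chain(base_size=256):
    # Build one flat-colored image per mip level, halving the size each time.
    levels = []
    size, level = base_size, 0
    while size >= 1:
        color = COLORS[level % len(COLORS)]
        levels.append(np.full((size, size, 3), color, dtype=np.uint8))
        size //= 2
        level += 1
    # Each array would then be uploaded as the corresponding mip level of
    # the texture (e.g. via glTexImage2D with the matching level argument).
    return levels
```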

Originally posted by dawdler
And you sense bias on my account? How about this little line from the article:
"FP24's precision however is always as good or higher than the CineFX proprietary FX12, FX16 and of course FP16."
He obviously has no idea how big the difference in color precision is between FP16 and FP24 (not to mention between FX12 and FP24!).
So, in your opinion, he should have said "always better", instead of "always as good or higher". I'm aware of the slight difference in meaning, but I don't see the big deal.

Originally posted by dawdler
This article says the filtering quality is not as good as Nvidia's. Then WHY does the final image look better?!
Their analysis tries to explain this, but they fail to do so. That's a bit disappointing, but it's logical that ATI does its best to keep it a secret.

And they're not suggesting that the GF4 should lower its precision. They want the R300 and GeForceFX to raise their precision, which I disagree with, as the differences will not be noticeable.

And once again: this is a technical analysis, not a consumer review. They only look at certain technical aspects and not at the gameplay value of the hardware.
 
Originally posted by Arno
No, you're talking about images from UT2K3 where the textures are replaced with very bright colors to visualize the mipmaps. Nobody plays games like that. I'm talking about screenshots from a gamer's perspective.
Nope, I'm talking about actual in-game images from it. We can see the mipmap discrepancies without visualizing them; hence, in-game images from a gamer's perspective.


Originally posted by Arno
So, in your opinion, he should have said "always better", instead of "always as good or higher". I'm aware of the slight difference in meaning, but I don't see the big deal.
The term "better" should not be used at all. Of course higher precision is always better, the point is that he is obviously underestimating the differences. Its MUCH larger difference from FP16 to FP 24 than to FP24 to FP32. Its just vast. It doesnt matter for us gamers, that's true, but as you pointed out, this is a technical article, and as such should matter quite alot.
 