I have an ATI Radeon 9800 and I was wondering what the Force Z-Buffer Depth option in the OpenGL compatibility settings is. What is it? What does it do? And is it better to have it on or off? Thanks.
Hm. Maybe. Does anyone else know what it is? I asked partly because I wanted to know what the setting does, and also about DirectX games: does the driver look at the OpenGL settings or the Direct3D settings to decide what to use for things like AA?
A z-buffer is a 3D hardware buffer that stores a depth value for every pixel in memory, so things that are behind other things actually get rendered behind them, where they should be. Without a z-buffer, programmers would have to organize their code to always draw the back stuff first (a real pain, and slow too). With one, they can splat it all on the screen in any order and let the z-buffer take care of it: each time a pixel is about to be drawn, its depth is compared against the depth already stored there, and it only gets written if it's closer.

The more precise your z-buffer is, the better things look where polygons intersect. Ideally, the z-buffer records the depth of every pixel of every surface with enough precision to give pixel-perfect occlusion and clean intersections. The more precise it is, however, the more memory and bandwidth it eats. A cheaper z-buffer stores each depth with fewer bits, so nearby depth values get rounded to the same number. That almost certainly produces z-fighting: flickering or jagged seams where surfaces meet or cut through each other. I would imagine that setting on your card forces the bit depth (precision) of the z-buffer in OpenGL mode, e.g. 16-bit vs. 24-bit, regardless of what the game asks for. Try turning it all the way down or off and looking carefully at where objects intersect in your OpenGL games, then turn it up and compare.
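If you want to see what the depth test actually does, here's a minimal sketch in C of a software z-buffer. The names (zbuf, framebuf, W, H, plot) are made up for illustration; real hardware does the same comparison per pixel as it rasterizes:

    #include <float.h>

    #define W 640
    #define H 480

    static float        zbuf[W * H];      /* one depth value per pixel */
    static unsigned int framebuf[W * H];  /* one color value per pixel */

    /* Clear the depth buffer to "infinitely far" before each frame. */
    void clear_zbuffer(void)
    {
        for (int i = 0; i < W * H; i++)
            zbuf[i] = FLT_MAX;
    }

    /* Draw one pixel at (x, y) with depth z: keep it only if it is
       closer than whatever was drawn there before.  This is why
       polygons can be submitted in any order and still occlude
       each other correctly. */
    void plot(int x, int y, float z, unsigned int color)
    {
        int i = y * W + x;
        if (z < zbuf[i]) {      /* the depth test */
            zbuf[i]     = z;
            framebuf[i] = color;
        }
    }

In OpenGL the game does the same thing by calling glEnable(GL_DEPTH_TEST) and clearing the buffer each frame with glClear(GL_DEPTH_BUFFER_BIT); the driver setting you're asking about would just control how many bits each of those stored depth values gets.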