Gabe claimed the X800 is 40% faster than

blahblahblah said:
On the Direct3D front it appears that the X800 may win the current battle. Its FP24 shaders are really fast in games. Plus, DX9 games are still a rare bunch. We have no idea what SM3.0 will do in games or when it will actually be incorporated into games fully (Far Cry/Stalker don't count; they weren't designed around 3.0).
I don't see either PS2.0 or SM3.0 being used fully in any game soon. Far Cry has PS1.1 all over the place. Only certain lighting effects in indoor sections are PS2.0, and they tax the GPU heavily.
 
Nope. The X800 needs about as much power as the 9800 XT, or less.
 
jonnyapps said:
Can anyone tell me if it's possible to get the Ruby exe demo to run on a 9600 XT to compare speeds? I'm guessing not, 'cos Ruby uses new pixel shaders, no?

Check out www.elitebastards.com; people are working on it. It's not quite working yet, but it should be soon.
 
You can see well beyond 30fps.
60fps gets most of it.
Link

Your monitor's refresh rate limits how many FPS you actually see.
If a game is running at 100FPS, then depending on the refresh rate you will only see so many of those frames:
60Hz = 60FPS
85Hz = 85FPS
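To put rough numbers on that (purely an illustrative sketch, not a measurement), the frames you actually see are capped by the lower of the render rate and the refresh rate:

Code:
# Illustrative sketch: the display can only show as many distinct frames per
# second as its refresh rate, no matter how fast the game renders.
def visible_fps(render_fps: float, refresh_hz: float) -> float:
    return min(render_fps, refresh_hz)

print(visible_fps(100, 60))  # 60 -> a 60Hz monitor shows at most 60 of the 100 rendered frames
print(visible_fps(100, 85))  # 85 -> at 85Hz you get to see 85 of them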
 
What if you put it on a projector screen with a digi projector? I'm planning on doing that in my garage, like a mini cinema screen :cool:. But I'm not too sure what HL2 will be like framerate-wise when it's projected onto a sheet of white board. Will there be any difference?
 
Dark Auro said:
Nope. The X800 needs about as much power as the 9800 XT, or less.

It needs less, since it has a smaller heatsink than the 9800 XT (it doesn't cover up the RAM).

http://www.tomshardware.com/graphic/20040504/ati-x800-09.html

I find these findings interesting, since they show that the 9800 XT and the 6800 use about the same amount of power (the graphs are of the entire system, but everything's the same except the graphics card).
 
Arno said:
I don't see either PS2.0 or SM3.0 being used fully in any game soon. Far Cry has PS1.1 all over the place. Only certain lighting effects in indoor sections are PS2.0, and they tax the GPU heavily.

Hardware features are always ahead of hardware capability. I remember buying a TNT card, and it said on the box that it supported anti-aliasing and bump mapping, and I thought that was cool. Little did I know that it would be around 4 years before the hardware was capable of using those features on a regular basis.
 
clarky003 said:
What if you put it on a projector screen with a digi projector? I'm planning on doing that in my garage, like a mini cinema screen :cool:. But I'm not too sure what HL2 will be like framerate-wise when it's projected onto a sheet of white board. Will there be any difference?
I'm pretty sure digital projectors are 60Hz. You could probably adjust a DLP projector like a monitor, possibly up to 85Hz. They probably have a range like 43-85Hz or something.
 
clarky003 said:
What if you put it on a projector screen with a digi projector? I'm planning on doing that in my garage, like a mini cinema screen :cool:. But I'm not too sure what HL2 will be like framerate-wise when it's projected onto a sheet of white board. Will there be any difference?

Sounds like a good idea.
But be sure to do some research before buying.
There are A LOT of different digital projection systems out there, each with their own good and bad points.
In all of these there is one constant: a medium-priced projector is not built to overcome ambient light, so you'll have to make sure the room can be made dark enough to get good results.
Also be sure about what you want before buying anything. Do you only want to play games? Maybe some movies too? TV?
If you're buying a somewhat older LCD projector, you're bound to run into some problems: things like LCD response time (slow response times mean you won't be able to play fast-paced action games), bulb availability, light output and black levels (not really an issue, but if you have a black scene and it's grey you're going to notice).
Newer projectors like DLP, LCOS, and QUALIA fix most of these problems, but also create new ones.
Just find out what works for you, and do proper research (try www.avsforum.com for a start). Also keep in mind that your initial purchase isn't the only cost; bulbs cost a lot of money ($250 and up) and don't last that long (2000-3000 hours).

I did this research last year and went for option C: I bought a CRT projector. It's a little harder to set up, but worth the effort (click the link below to see it).

When you've done all that, just sit back and enjoy the ride, 'cause nothing can beat a screen measured in feet, not inches! :)
 
Asus said:
I'm pretty sure digital projectors are 60Hz. You could probably adjust a DLP projector like a monitor, possibly up to 85Hz. They probably have a range like 43-85Hz or something.

LCD-based projectors don't really have a refresh rate; like any fixed-pixel display, they just switch pixels on and off.
DLPs with a single display element, on the other hand, use a "colour wheel" to create all the colour from that single monochrome display element (which also adds a display anomaly known as the rainbow effect). It's been reported that if you increase the refresh rate on a DLP projector, the colour wheel also speeds up.
Personally I can't stand DLPs; I tried one before I bought my current projector, and after a movie I had a splitting headache. You can get a three-display-element DLP, though, which removes that problem, but they cost $5000 and up, which was a bit on the pricey side for me.
 
lol, yeah, woot! It's all hooked up to the PC for movies and games, and my garage has no windows and is being converted into a recreation room. The only window is on the door, but it's a small patterned-glass cross section that can be covered easily. So yeah, the room is pretty much pitch black in the day if the windows on the door are covered.

So a digi projector is fine. I'm actually borrowing it from a friend for a couple of weeks, for chilling out and inviting friends round for just general wall-sized entertainment. If it's good fun I might consider buying one. Heck, I'm considering buying one anyway. :)

As for refresh rate, I'll fiddle till I find something that works well.
 
clarky003 said:
lol, yeah, woot! It's all hooked up to the PC for movies and games, and my garage has no windows and is being converted into a recreation room. The only window is on the door, but it's a small patterned-glass cross section that can be covered easily. So yeah, the room is pretty much pitch black in the day if the windows on the door are covered.

So a digi projector is fine. I'm actually borrowing it from a friend for a couple of weeks, for chilling out and inviting friends round for just general wall-sized entertainment. If it's good fun I might consider buying one. Heck, I'm considering buying one anyway. :)

As for refresh rate, I'll fiddle till I find something that works well.

Best. Fun. Evar!
We have movie night every Saturday, and it totally rocks!
I don't know why everyone doesn't have one; it's a great way to have fun with your friends! Also, hooking up a PS2 with Gran Turismo really blew my socks off. Highly recommended!
 
Kool, that just makes me want to do it even more. Gonna go out and buy bean bags and lay them out to sit on, but I mostly want to play Far Cry, HL2 and Stalker on it. It'll be like watching an interactive movie :) (that's if you're not playing). But I'm gonna do all this when we upgrade our comp, and that'll be when the 6800 Ultra comes down in price a little, or the X800, whichever gets cheaper quicker ;)
 
"Your eye can only see 30"?? Yea...but I dont think our eyeballs come with VSYNC enabled.

I play my games at 1024 and turn my refresh rate up to 85Hz. Then I enable VSYNC if I know I can maintain 85fps or higher. I currently have an AIW 9800 Pro, 512 MB of PC3500 and an AMD 2800+, so 85fps isn't that hard to hit.

If you play your games with 6xFSAA, playing above 1024 is a worthless loss of framerate, because in comparisons I've done personally there is NO difference. So 1024 w/ 6xFSAA is what everyone with high-end stuff should play at.
 
The problem with your argument is that you've done them "personally." Everybody is different.
 
The whole "Your eyes can't see faster than 24 fps" is a common misconception. That is in fact the slowest speed at which the eye sees the illusion continuous motion rather than individual frames. I believe scientific studies have shown that the human eye can detect framerates well into the hundreds of frames per second. When it comes to video games, however, most people seem to be comfortable somewhere in the 60 fps range.
 
junco said:
Hmm? From those benchmarks I got the reverse impression. Seems like the 6800U consistently beat out the X800 XT except in a few of them.

The 6800U is faster when you benchmark them with no AA or AF. But once you turn on AA and AF, the 6800U's frame rates drop like a rock. And when you spend $500 on a graphics card, you'd better be turning on AA and AF; otherwise you are wasting your money.
 
junco said:
Hmm? From those benchmarks I got the reverse impression. Seems like the 6800U consistently beat out the X800 XT except in a few of them.
Try to block out the "GeForce 6800 Ultra" results and look at the "GeForce 6800 Ultra (opts. off)" ones, as those have the trilinear optimizations turned off.
 
blahblahblah said:
The problem with your argument is that you've done them "personally." Everybody is different.

Yea, but facts are facts. Being a Photoshop professional, I tend to notice things visually that others don't. VISUALLY, if you look at the edges, they look the same with 6xFSAA at any resolution of 1024 or higher.

I can personally add 2+2 and get 4. Does that make it wrong or up for debate? No.
 
I agree there's no need for blowing the image up to silly resolutions that don't really make a difference quality-wise (sharpness, maybe), unless you're putting it on a massive screen, like, measured in feet :p. On 17-inch screens it's hard on the eyes, even on a 21-inch.
 
The whole "Your eyes can't see faster than 24 fps" is a common misconception. That is in fact the slowest speed at which the eye sees the illusion continuous motion rather than individual frames. I believe scientific studies have shown that the human eye can detect framerates well into the hundreds of frames per second. When it comes to video games, however, most people seem to be comfortable somewhere in the 60 fps range.
The highest number I've heard of was well into the mid-200s, by fighter pilots... but that doesn't really mean anything.

Technically, no actual maximum framerate that the eye can detect has been established so far. The framerate you need to play a game depends entirely on the speed of the objects moving relative to the player's view and the amount of dependence on reflex/reaction time. If you are playing a point-and-click RPG where you don't move very fast, you might feel comfortable at 30fps (possibly even less). If you are playing a racing game or a fast-paced FPS, you'll probably want to keep your framerate at or above 60fps... and even then you'll still easily be able to discern the discrete distances that objects move between frames.
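To put rough numbers on that "discrete distances" point (purely illustrative, made-up figures): the gap an object jumps between consecutive frames is just its on-screen speed divided by the framerate.

Code:
# Illustrative sketch with made-up numbers: faster on-screen motion means
# bigger jumps between consecutive frames at a given framerate.
def per_frame_jump(speed_px_per_s: float, fps: float) -> float:
    """Distance (in pixels) an object moves between two consecutive frames."""
    return speed_px_per_s / fps

for fps in (30, 60, 100, 120):
    print(f"{fps:3d} fps -> {per_frame_jump(2000, fps):5.1f} px jump per frame")
# 30 fps -> 66.7 px, 60 fps -> 33.3 px, 100 fps -> 20.0 px, 120 fps -> 16.7 px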

The light detectors in the eye build up and discharge light energy over time, unlike the way computer games render the exact status of each pixel at a given instant... and the sun does not give off light at any equivalent of a "refresh rate" the way a strobe light or a monitor would; instead, it gives off light constantly. Those two factors mean that even if the brain only checked what your eye detected at a given interval, it would pick up all of the little movements between said intervals in the form of motion blurring (the oldest data being the most faded out).

A simple way to illustrate this is to wave your hand around in natural light and then wave it in front of your computer monitor (while it's displaying any bright screen, like a maximized Notepad). Notice how in natural light it seems to be one fluid motion, whereas in the light from the monitor it seems to be a series of pictures of your hand with blank space in between the intervals.

In Counter-Strike (with V-Sync enabled) I was able to tell the difference between 100fps and 120fps... and I only stopped there because that is as high as my monitor goes. Even at 120fps there were still noticeable jumps between frames... though far fewer than at 60fps (the "acceptable" framerate for online gaming). If you want to be able to live with merely "acceptable" framerates, I recommend that you do not try this experiment. Once you play a game at 120fps, it is torture to play it at 60fps.
 
http://www.users.on.net/triforce/ruby/r300rubyrap.zip

This should work for all 128 MB Radeon 9500 and above cards.

A 128 MB card is REQUIRED! It will NOT run on 64 MB cards.

If you have a 128 MB card, you should have your AGP aperture size set to 256 MB. You should also change resolution to 640x480 and multisampleType to 1 in Sushi.ini.

You 'really' need more than 512 MB of memory to run this; otherwise speed will probably be terrible, if it runs at all.

To decrease the size of the shaders so they would work on R300-class cards, certain effects are now gone. These include Depth of Field and many specular highlights. A future version may attempt to put these back in.
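If you'd rather script those two Sushi.ini changes than edit them by hand, here's a hedged sketch: the key names (resolution, multisampleType) come from the instructions above, but the exact layout of Sushi.ini is an assumption, so check the file afterwards.

Code:
# Hedged sketch: rewrite the two Sushi.ini settings mentioned above.
# Key names are taken from the post; the file's exact layout is assumed,
# so verify the result before running the demo.
import re
from pathlib import Path

def patch_sushi_ini(path: str = "Sushi.ini") -> None:
    text = Path(path).read_text()
    text = re.sub(r"(?im)^(\s*resolution\s*=\s*).*$", r"\g<1>640x480", text)
    text = re.sub(r"(?im)^(\s*multisampleType\s*=\s*).*$", r"\g<1>1", text)
    Path(path).write_text(text)

if __name__ == "__main__":
    patch_sushi_ini()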
 
G0rgon said:
http://www.users.on.net/triforce/ruby/r300rubyrap.zip

This should work for all 128 MB Radeon 9500 and above cards.

A 128 MB card is REQUIRED! It will NOT run on 64 MB cards.

If you have a 128 MB card, you should have your AGP aperture size set to 256 MB. You should also change resolution to 640x480 and multisampleType to 1 in Sushi.ini.

You 'really' need more than 512 MB of memory to run this; otherwise speed will probably be terrible, if it runs at all.

To decrease the size of the shaders so they would work on R300-class cards, certain effects are now gone. These include Depth of Field and many specular highlights. A future version may attempt to put these back in.


G0rgon... you're just the man. Plain and simple.
 
I don't believe that the sites are biased. Take one benchmark common to all sites and see if they all show the same thing. To say a site is biased implies that it doctors the results of the benchmarks. Using different test beds means nothing. Imagine if the drivers performed differently on an Intel or an AMD system; just imagine how much of a media frenzy that would be. People would know about it.
Remember when HardOCP made a correction and people noticed?
Remember when people found out that a certain video card manufacturer (can't remember which one) had application-specific optimizations? Renaming the quake3 executable would yield different benchmark results. The string "quake3" was also found in the drivers.


People find out about these sorts of things. So stop saying these sites are not credible/biased. It gives me the impression that you are a cynical zealot.
 
retrofitter, there is a lot more to quality hardware reviewing than verifiable numbers that have rough parity across a number of sites. These sites receive a lot of money for what they do, so it is a bit naive of you to ignore the various ways in which results can be spun through commentary, choice of benchmarks, and testbed...

Just going by the numbers is kinda like trusting Fox News to give a balanced editorial about the US Government. :p

FYI - the testbed can make a big difference to results. For instance, if you choose a CPU much slower than a P4EE 3.2 or A64 FX-53, then the results you get are going to hide performance differentials in games which are CPU-limited.
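To make the CPU-limiting point concrete (hypothetical numbers, purely to illustrate): the framerate you measure is roughly capped by whichever of the GPU or the CPU is slower, so a slow test-bed CPU hides the gap between two fast cards.

Code:
# Hypothetical illustration: measured FPS is capped by the slower of the
# GPU-limited and CPU-limited rates. All numbers below are made up.
def measured_fps(gpu_limited_fps: float, cpu_limited_fps: float) -> float:
    return min(gpu_limited_fps, cpu_limited_fps)

card_a, card_b = 140, 110        # what each card could render, GPU-limited
fast_cpu, slow_cpu = 200, 80     # what the rest of the system can feed them

print(measured_fps(card_a, fast_cpu), measured_fps(card_b, fast_cpu))  # 140 110 -> gap visible
print(measured_fps(card_a, slow_cpu), measured_fps(card_b, slow_cpu))  # 80 80   -> gap hidden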
 