Absolute worst review...ever

amneziac85

http://www.hardocp.com/article.html?art=NjI4LDE=

Wow... you guys should read this review; it's truly horrible. I don't mean the results, I just mean the quality of the whole thing.

This guy runs two ATI cards at 1280 vs. an NVIDIA card at 1024 and actually seems to wonder why the NVIDIA card is keeping up...

The worst review of ANYTHING I have ever, ever, ever read. It's so funny I want to laugh, and then write a letter to HardOCP about handing this guy and the guy who hired him a pink slip.
 
HardOCP is pursuing a different technique for graphics card reviews. I personally rather like it. I'm sure they'd appreciate your feedback on it, either by e-mail or in their feedback thread here: http://www.hardforum.com/showthread.php?p=1026141455. The article clearly states the author and editor.

The objective of the new [H] style is to determine which card delivers the best quality. How high can they crank up the resolution, AA, and AF with decent frame rates? It's not FPS-centric, and it's not meant to be.
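
Roughly, what they're doing is a search over quality settings instead of fixing them up front. Here's a little Python sketch of that idea (my own illustration, not [H]'s actual procedure; measure_min_fps is a hypothetical stand-in for actually playing the game at those settings):

```python
# A sketch of the "highest playable" idea (my illustration, not [H]'s
# actual procedure): walk the settings ladder from heaviest to lightest
# and keep the first combination whose worst-case frame rate is still
# playable. product() varies resolution first, then AA, then AF, so that
# is the order of preference for image quality.
from itertools import product

RESOLUTIONS = [(1600, 1200), (1280, 1024), (1024, 768)]  # heaviest first
AA_LEVELS = [4, 2, 0]
AF_LEVELS = [8, 4, 0]
PLAYABLE_MIN_FPS = 35  # playability threshold; a judgment call

def measure_min_fps(res, aa, af):
    """Hypothetical stand-in for playing a real level at these settings.
    Here it just fakes a score: heavier settings give a lower minimum FPS."""
    load = (res[0] * res[1]) / (1024 * 768) + 0.15 * aa + 0.05 * af
    return 50.0 / load

def highest_playable():
    for res, aa, af in product(RESOLUTIONS, AA_LEVELS, AF_LEVELS):
        if measure_min_fps(res, aa, af) >= PLAYABLE_MIN_FPS:
            return res, aa, af
    return None  # nothing playable on this card

print(highest_playable())  # -> ((1024, 768), 2, 0) with this fake model
```

That's also why two cards can legitimately end up tested at different resolutions: the search simply stops at a different rung of the ladder for each card.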
 
As psyno said, it's a new "method" of benchmarking.
Still, where in this review does he "wonder" why the nVidia card is keeping up?
 
It's meant to compare cards with gameplay in mind, not numbers on a graph. You don't benchmark games, you play them. It shows you how far each card dips during gameplay rather than just an average. You don't learn the minimum FPS, or how often it dips, by going to Tom's or Anandtech's.

Reread the comments with that in mind.
 
It's OK because it shows you the maximum settings you can run with good FPS on each card. It's kind of misleading, though, when you're trying to compare two cards.
 
I liked the screenshot of the FC menu. ;)
The guy actually took a shot of the screen... literally!
 
I liked that review, though I wish they had used a 6800 GT in the comparison. They even gave a workaround for the Far Cry bug. w00t! If you actually read that review, I think it is clear that either card, from ATI or Nvidia, will satisfy your gaming needs for the next couple of years.
 
The new style of their reviews is awesome, in my opinion.


If you lack the intelligence to appreciate what they are doing, then go look at Bob's Q3 benchmarks and 3DMark scores... we all know how much those tell us. :dozey:
 
amneziac85 said:
This guy runs two ATI cards at 1280 vs. an NVIDIA card at 1024 and actually seems to wonder why the NVIDIA card is keeping up...


Yeah, that is a bit of an unbalanced setup there.
 
I simply don't understand why you would ever run 1280 on one card and 1024 on another and then compare the two. It just doesn't make sense.

The problem I have with it is this... I got halfway through the damn review before I just stopped and said, "wait... something's wrong...". Then I saw that they're running the NVIDIA card at 1024. :rolling:

I knew my stuff well enough to determine that something was wrong with the benchmarks, but what if I wasn't that deep into it and had just come by looking for a review... would I have caught it? Maybe...

I actually read the first few pages of the review and still didn't catch it; I just assumed it would be run at the same resolution, as anyone would.

My thing is, to the untrained eye this gives a completely unfair advantage to the card being run at 1024. It doesn't matter if it's ATI or NVIDIA; they need to chuck that new style of theirs because it smells like pig shit.
 
amneziac85 said:
I simply don't understand why you would ever run 1280 on one card and 1024 on another and then compare the two. It just doesn't make sense.
The problem I have with it is this... I got halfway through the damn review before I just stopped and said, "wait... something's wrong...". Then I saw that they're running the NVIDIA card at 1024. :rolling:
I knew my stuff well enough to determine that something was wrong with the benchmarks, but what if I wasn't that deep into it and had just come by looking for a review... would I have caught it? Maybe...
I actually read the first few pages of the review and still didn't catch it; I just assumed it would be run at the same resolution, as anyone would.
My thing is, to the untrained eye this gives a completely unfair advantage to the card being run at 1024. It doesn't matter if it's ATI or NVIDIA; they need to chuck that new style of theirs because it smells like pig shit.
Did you even bother to read their bold disclaimer on the 2nd page right above the benchmarks? The only way to read a review is to read it.
Please be aware we test our video cards a bit different from what is the norm. We concentrate on examining the real-world gameplay that each video card provides. Gameplay includes performance and image quality evaluation. We have two sections, “Highest Playable” and “Apples to Apples”. The Highest Playable section shows the best Image Quality delivered at a playable frame rate. Following the Highest Playable section we have a brief Apples to Apples performance section for those that find benefit of framerates with matching IQ. More background on our reasoning behind our evaluation can be found in our Cheating the Cheaters editorial.

Simply though, we feel that finding the Highest Playable quality settings benefit gamers the most as compared to synthetic and canned benchmarks that often do not represent true gaming at all. We use a high performance system, with a very fast CPU in order to remove CPU bottlenecking.
 
Well, if you would "read", you'd see I already said I read it.

The entire concept is silly. Why don't they run all the cards at 1024? At least then it would be even. Or how about not trying to be different, and just doing both?

While we're at it, let's just sit a GeForce 3 next to the X800 and watch the sparks fly as my game zooooms along at 640! WEEEEEEE!

It's dumb. Just show both 1024 and 1280 for both cards, or default the faster card to the slower one. At least then you wouldn't have to worry about people who want to buy these things as gifts thinking, "That NVIDIA card's numbers were just as good, and for a lot cheaper!"

Duh. And there's no need to get testy like a damn woman.
 
They're not comparing FPS though. I encourage you to read their explanation. If you want FPS comparisons, there are a gazillion other sites out there that still do it like that.
 
The purpose of the review was to let you know what type of image quality (through resolution, AA and AF) was playable with each card.

They lowered the resolution for the 6800 because Far Cry was not playable at 1280x1024 with 2AA and 8AF. So they bumped the resolution down and gave you the knowledge that, on that card, Far Cry is best played at 1024x768 (with 2AA and 8AF). Now that is helpful to the average consumer.

Telling a consumer that a 6800 Ultra has an average frame rate of 35 FPS in Far Cry does nothing for them, because little, if any, of the game time is ever spent at 35 FPS. A more useful statistic is the average FPS together with the minimum FPS and how much time was spent there. That is exactly what this review did. Like I said above, Far Cry was best playable at 1024 resolution with the 6800 Ultra. Giving consumers the hard facts is much more useful than the fairytale numbers provided by most review sites.
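
To put rough numbers on that, here's a quick Python sketch (my own illustration with made-up frame times, not data from the review) of how a healthy-looking average can hide serious dips:

```python
# Made-up per-frame render times (seconds): a smooth stretch, a nasty
# dip, then a decent stretch. A real benchmark log would look similar.
frame_times = [1 / 60.0] * 500 + [1 / 18.0] * 50 + [1 / 45.0] * 200

total_time = sum(frame_times)
avg_fps = len(frame_times) / total_time   # frames divided by wall time
min_fps = 1.0 / max(frame_times)          # the single worst frame

SLOW = 25.0                               # "dip" threshold in FPS
time_below = sum(t for t in frame_times if 1.0 / t < SLOW)
pct_below = 100.0 * time_below / total_time

print(f"avg {avg_fps:.1f} FPS, min {min_fps:.1f} FPS, "
      f"{pct_below:.1f}% of play time below {SLOW:.0f} FPS")
```

With those invented numbers you get about a 48 FPS average but an 18 FPS minimum, with nearly a fifth of the play time spent under 25 FPS. An average alone would never show you that.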
 
amneziac85 said:
The entire concept is silly. Why don't they run all the cards at 1024? At least then it would be even. Or how about not trying to be different, and just doing both?
You should skip to the second half of the review; they do Apples to Apples just like every other site.
 
amneziac85 said:
The entire concept is silly. Why don't they run all the cards at 1024? At least then it would be even. Or how about not trying to be different, and just doing both?

While we're at it, let's just sit a GeForce 3 next to the X800 and watch the sparks fly as my game zooooms along at 640! WEEEEEEE!

It's dumb. Just show both 1024 and 1280 for both cards, or default the faster card to the slower one. At least then you wouldn't have to worry about people who want to buy these things as gifts thinking, "That NVIDIA card's numbers were just as good, and for a lot cheaper!"

Duh. And there's no need to get testy like a damn woman.

Maybe if you actually read it instead of looking at the pretty pictures, you wouldn't be so confused.
Well, if you would "read", you'd see I already said I read it.


bullshit... :|
 
amneziac85 said:
And there's no need to get testy like a damn woman.

I know several women who would happily kick your arse without a second thought. Cut the misogynistic crap.
 
Yeah, personally I don't see the problem.

I read the review and concluded that the 6800 is less powerful, but I'm not really going to notice much of a difference. The X800, though, is currently somewhat hamstrung by graphical glitches.

It doesn't bother me either way; I have a 9600 XT and can't afford owt else :)
 
Although I prefer the normal "apples to apples" type of benchmarking myself, I can appreciate that HardOCP is experimenting with new review methods. You can't blame them for trying.
 