Terrible performance on a 4 year old game

This is mystifying me. I have no explanation. I've got that Chronicles of Riddick game, and somehow my fps is completely in the gutter, loading screens take minutes (even TF2 is faster), and don't get me started on the cutscenes.

Utterly unplayable, yet I'm way above the minimum system requirements.
I checked http://www.systemrequirementslab.com and I even pass the recommended requirements!!
Yet my fps is about 0.2 at best. I have no idea what's going on. Here's a screenshot to prove this should be running like a dream:



The only thing I can't explain is what "1.7 GHz performance rated at 4.32 GHz" means in that performance report. I'm not a real hardcore hardware sort of guy, so it's probably something obvious.

It took me many minutes (even the menu lags!) but I managed to get all the settings as low as possible: windowed mode, shaders at 0.5 (whatever that means), lowest resolution, and no change in playability whatsoever! Using one core or both has no effect, and neither does setting the priority to high (in the Ctrl+Alt+Del task manager).
I did some research, and apparently it uses the same lighting system as F.E.A.R., yet that one ran clean as a whistle! What am I missing here?

Yes, I shut down all other processes. Riddick's the only thing running.

Yes, I do have a 2-3 year old laptop, but even my Source engine games run like a charm. What's wrong with this game? What can I do?
 
Your laptop has the ATI Radeon Xpress 200M graphics processor, which by today's standards is extremely low end. According to this graph it performs worse than Intel's GMA 950. It doesn't seem much better than the GPU in my laptop (IGP 320M), which can't even run a game as old as AVP2.

http://www.notebookcheck.net/Intel-Graphics-Media-Accelerator-950.2177.0.html

If you are able to get to the settings menu, then switch everything to the absolute lowest, turning off any features that you can and running it at 640x480.
 
Hmm, well I already knew that I don't have the best stuff by "today's standards", but Riddick wasn't exactly "today".
Somehow I can run much newer and higher-quality games without any issues? TF2, Portal, Episode Two, up to 60-70 fps in TF2 when I let it use both cores, even when things are crowded, with high textures and models and stuff (I don't care for AA or HDR very much, so those are turned off).
Even F.E.A.R., as I said, runs beautifully, as does Far Cry 1, which is from right around when Riddick was made.
Even Halo 2 ran alright (there's a way to get it to run on XP).

Also why does that System Requirements program say I can run this very well, if that's evidently not the case?
 
A game's age doesn't always have a bearing on how well it will run on newer systems, especially if it's a weaker system, and especially if the game is a console port that may have some optimisation issues. Way back in the day I had a fairly weak desktop PC that could still run the PC version of FF7 fluidly and without a hitch. Some time afterwards, and after a graphics card upgrade, I tried FF8 - it barely ran at all, despite being an aged game even then, and not looking half as good as FF7 did on my machine.

Also, that System Requirements Lab site is very misleading in my experience. It should be used more as a guideline as to whether you can play the game at all, rather than as a measure of what kind of performance you'll get. Most likely the only thing really holding you back from good performance is your graphics card, yet SR Lab measures all this stuff like the OS and soundcard and gives it equal weight, as if it all had a significant bearing on performance. Then when it comes to the graphics card, it just confirms whether or not you have the necessary VRAM and whether your card supports DX9, then pops up a fairly arbitrary little green gauge.
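
To illustrate why that equal weighting is misleading (purely hypothetical numbers here, not SR Lab's actual formula): an averaged checklist can look fine even when one component is dead weight, whereas real-world fps tends to follow the weakest link.

```python
# Hypothetical component "scores" (0-100, made up for illustration only).
scores = {"cpu": 80, "ram": 75, "os": 90, "soundcard": 85, "gpu": 15}

# Equal-weighted average: roughly how a checklist-style requirements test reads.
average = sum(scores.values()) / len(scores)

# Bottleneck view: in practice the slowest relevant part dominates in-game fps.
bottleneck = min(scores.values())

print(f"averaged score:   {average:.0f}/100")  # ~69 -> looks "good to go"
print(f"bottleneck score: {bottleneck}/100")   # 15 -> the GPU tells the real story
```

The averaged number says "should run fine"; the bottleneck number is much closer to what you're actually seeing in Riddick.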

I have no idea why FEAR runs beautifully for you if your card really is only around the level of an Intel integrated chip (and these two links indicate that it is). It may simply be that FEAR is better optimised for lower-end systems than Riddick, despite having higher requirements across the board. In any case, you can't necessarily expect much from a card that a hardware comparison site describes as 'not really suited for actual gaming'. Sorry dude.

EDIT: One last-ditch thing you can try is to change your graphics card drivers. Because your card is a little aged, I have my doubts that the newer Catalysts from the ATI site would be any good to you at all, since they won't have been created with your card in mind. Instead, go to your laptop manufacturer's page and see if there are any other drivers you can try based on your model of laptop. If there are none, or if there is no significant improvement, then try the latest Catalysts (after ensuring they support your card - I'm not sure what ATI's policy is on mobile support). Either that, or try researching to find out what the best drivers are for a Radeon Xpress 200M and then download them from ATI's archive. But honestly, don't get your hopes up; drivers can't work magic, and you could be on a hiding to nothing, but equally there's a slim hope you may be able to wring out enough fps to make it barely playable.
 
System Requirements Lab isn't always spot on... besides, that game might just not run well with that card.

I've got a Q6600, 2 gigs of RAM, and right now I'm stuck with an Nvidia GeForce 5500 PCI card.

I tried running Soldier of Fortune 2, a game from 2002 I believe, and it would just not run well at all. On the lowest of low settings at a resolution of 640x480 I couldn't get anything higher than 15 fps. No idea why; the game just did not like my setup or something.
 
FEAR probably ran well since it wasn't on maximum settings; I think it autodetected what I could use for best performance (always a nice feature).
I tried updating the drivers, even though I think they were already current, restarted, and there was no change. I expected as much.
Oh well, I guess a demo is still the best way to find out how well something will run before buying it?

Just one last thing: what did System Requirements Lab mean by my CPU speed being rated at 4.32 GHz when it's obviously not? Is that what overclocking is all about? Liquid nitrogen, gels and other such things?
 
They are trying to assure you that your 1.7 GHz CPU performs faster than you might think if you compare it to a Pentium 4. (Your CPU is good. Don't worry.) You can't really use GHz to compare CPUs with different designs, though. The confusion comes in when people assume they can, and when others claim that one GHz on one CPU equals one GHz on another.

Up until the Pentium 4, all mainstream CPUs had a similar design, and clock speed increases translated fairly directly into performance. Then Intel made the Pentium 4 and clocked it very high. Consumers bought into the high clock speed without knowing that it did less work per GHz; it simply made up for that by running more cycles per second. Customers didn't notice anything different, but they loved the bigger GHz number. So when AMD's (and later Intel's own) faster CPUs came out, designed smarter and needing fewer GHz to perform even better, customers were confused trying to compare them to their 'smoking' Pentium 4s with the higher GHz number.

The best way to compare CPUs is to run something on them and see which one does better. GHz are just numbers...
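
If it helps to see the idea with your actual numbers: that "rated at 4.32 GHz" figure is presumably just your real clock multiplied by how much more work your chip does per cycle than some baseline CPU (most likely a Pentium 4). That's a guess at their method, not anything SR Lab documents, and the ratio below is made up so the arithmetic comes out.

```python
# Sketch only: expressing a CPU's throughput as the clock speed a baseline
# chip (e.g. a Pentium 4) would need to match it.
# throughput ~= clock speed * work done per clock cycle

def rated_clock_ghz(actual_clock_ghz, work_per_clock_vs_baseline):
    """Clock the baseline CPU would need to match this CPU's throughput."""
    return actual_clock_ghz * work_per_clock_vs_baseline

# A 1.7 GHz chip doing ~2.54x the per-cycle work of the baseline
# comes out "rated" at roughly 4.32 GHz in baseline terms.
print(round(rated_clock_ghz(1.7, 2.54), 2))  # 4.32
```

So no, nothing to do with overclocking or liquid nitrogen; your chip still runs at 1.7 GHz, it just gets more done each cycle.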
 

Thanks for the explanation. So numbers aren't everything then. I'll upgrade someday, and this will all be moot... :rolleyes:
 
Well, I bought more RAM (now just under 2 gigs) and it ran like a charm! Just beat it today. Noticed a slight increase in TF2's performance as well! So happy ending. :)
 