Unfocused
I used to think that a mouse's DPI determined how accurate it is, i.e. that a higher-DPI mouse wouldn't 'skip' pixels and so on.
Now I know that's not exactly what DPI means. My question is: what's the difference between hardware-determined DPI and software-determined sensitivity?
Does 1600 DPI / 1.0 sensitivity = 800 DPI / 2.0 sensitivity? (See the sketch below.)
I'm a low-sensitivity player. Would it be better for me to play at a high DPI and compensate with a low in-game sensitivity, or the other way around?
I'm using a regular 800 DPI mouse atm (a Creative Mouse Optical 3000), will be switching to a Razer DeathAdder, and am wondering whether I should even use the extra DPI it provides.
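
To make the 1600/1.0 vs. 800/2.0 comparison concrete, here's a minimal Python sketch of the effective-sensitivity (eDPI) arithmetic. It assumes a Source-engine-style m_yaw of 0.022 degrees per count (the CS default); the helper names are just illustrative, not from any actual driver or game API.

```python
# Sketch of the eDPI arithmetic: in-game turn speed is the product of
# hardware DPI and software sensitivity, so 1600/1.0 and 800/2.0 cover
# the same rotation per inch of mouse travel -- but the step size
# (granularity) of each mouse count differs.

M_YAW = 0.022  # degrees of rotation per mouse count (Source-engine default; assumed)

def degrees_per_inch(dpi: float, sensitivity: float) -> float:
    """Total rotation produced by moving the mouse one inch."""
    return dpi * sensitivity * M_YAW

def degrees_per_count(sensitivity: float) -> float:
    """Smallest possible turn: a single hardware count at this sensitivity."""
    return sensitivity * M_YAW

for dpi, sens in [(1600, 1.0), (800, 2.0)]:
    print(f"{dpi} DPI @ {sens} sens: "
          f"{degrees_per_inch(dpi, sens):.1f} deg/inch, "
          f"smallest step {degrees_per_count(sens):.4f} deg")

# Both configurations print 35.2 deg/inch, but the 800 DPI setup's
# smallest step (0.044 deg) is twice as coarse as the 1600 DPI one's
# (0.022 deg).
```

So the two settings feel identical in overall turn speed, but the lower-DPI setup quantizes aim in coarser steps, which is the usual argument for running high DPI and compensating with low in-game sensitivity.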