Best answer (as chosen by trunks_cscs)
Check and see
Most monitors have a "native" resolution they work best at, and the video card likewise has a set of modes it supports. Typically the monitor's native setting is the one to use, provided the video card supports it. If the setting is demanding or extreme in some way, the quality of both devices comes into play. You can sort this out by trial and error: if a setting causes grief, settle for one that doesn't. Sometimes the cure is to replace a device with one better suited to the job. ATI and NVIDIA both provide their own video settings manager (Catalyst Control Center and the NVIDIA Control Panel, respectively) for adjusting or testing settings before final use; both offer to test a setting before accepting it, so see if yours does this. I'm excluding any actual hardware fault, which can confuse things. Generally speaking, the specs tell you what works best, i.e. the expected defaults.
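If you'd rather look up what the hardware reports than guess, here's a minimal sketch, assuming a Linux/X11 machine with the `xrandr` utility installed (on Windows, the vendor control panels above show the same information). In xrandr's listing, a `+` after a refresh rate marks the display's preferred (native) mode and a `*` marks the mode currently in use; the parsing here is deliberately simple and the output names (HDMI-1 etc.) depend on your setup:

```python
import subprocess

def native_modes():
    """Print each connected output's preferred (native) mode.

    xrandr flags the preferred mode with '+' and the mode
    currently in use with '*'.
    """
    out = subprocess.run(["xrandr"], capture_output=True,
                         text=True, check=True).stdout
    output_name = None
    for line in out.splitlines():
        if " connected" in line:
            # Output header line, e.g. "HDMI-1 connected 1920x1080+0+0 ..."
            output_name = line.split()[0]
        elif line.startswith(" ") and "+" in line and output_name:
            # Indented mode line, e.g. "   1920x1080  60.00*+  50.00"
            mode = line.split()[0]
            print(f"{output_name}: preferred/native mode {mode}")

if __name__ == "__main__":
    native_modes()
```

If the mode this prints differs from what you're running, switching to the preferred mode is usually the first thing to try before replacing hardware.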