Originally Posted by thenotsogoodtrickster
I have the HD4850 ATI card, and overclocking makes all GPU related settings worse. It's weird, I OC the GPU clock and memory clock and things load slower and there's more disappearings. So unless there's a way around that, I don't think OC'ing cards does much difference.
It makes a difference. But OC'ing this and that by 50 MHz won't do shit unless you know what you're doing.
If you're seeing artifacts, that means you're overdoing something or frying your card.
When OC'ing I always have RivaTuner's in-game monitor enabled so I can keep a close eye on the GPU stats.
And in the end it's down to the GPU and the vendor. My card does 1 GHz on memory, up from the shitty ~850 MHz it's set to stock, plus a core boost of 50-75 MHz depending on the game, and I always max the fans out at 100% from the 50% default. Made Crysis playable on Very High at least.
Now one of my friends can't even go above 10 MHz on the shader core before he gets artifacts. Two people with the same product can get different results, which makes it hard to give a 100% sure answer, only estimates.
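The "bump it a little, stress test, back off at the first artifact" routine basically comes down to this. A rough sketch, not any real tool's API: find_max_stable_clock and is_stable are made-up names, and the "card" here is simulated since every card tops out somewhere different, as said above.

```python
# Hypothetical sketch of incremental overclocking: raise the clock in
# small steps, test stability each time, and keep the last value that
# passed. is_stable() stands in for whatever your stress test /
# artifact check actually is (RivaTuner monitoring, a benchmark loop, etc.).

def find_max_stable_clock(stock_mhz, limit_mhz, is_stable, step_mhz=10):
    """Step the clock up until the test fails or we hit the limit,
    then return the last known-good clock."""
    best = stock_mhz
    clock = stock_mhz + step_mhz
    while clock <= limit_mhz and is_stable(clock):
        best = clock
        clock += step_mhz
    return best

# Simulated card: stable up to 690 MHz, artifacts beyond that.
simulated_card = lambda mhz: mhz <= 690

print(find_max_stable_clock(625, 800, simulated_card))  # prints 685
```

Small steps matter here: with a 10 MHz step you land close under the real ceiling instead of blowing past it and frying something.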
And it's down to the game. In most games I've played I didn't have any issues and got a decent FPS boost with better performance. I haven't tried OC'ing on GTA IV because it's pointless anyway. I've got a Dual-Shit CPU and a card with only 640MB, with abysmal performance compared to what you people have now. GTA IV will run like shit no matter the settings.