GPU VRAM
Memory seems to be one of the most over-marketed products in the PC enthusiast’s arsenal of componentry. Whether it’s SSD cache being used to speed up ageing hard drives, enticingly large supplies of GPU memory on a $150 card, or desktop DDR4 with frequencies up into the 4,000MHz range, memory tends to be sold behind a barrage of baseless promises.
For example, MSI recently stated that by increasing DDR4 frequencies from 2,400MHz to 4,000MHz, you’d see an in-game improvement in The Witcher 3, with average frame rates rising from 94.1fps to 114.1fps. In tests in our March 2016 issue, we saw zero difference in frame rates across 10 memory kits with an 800MHz difference between them.
But my issue lies with graphics memory, and how much is “enough.” We had an argument with Asus’s PR team regarding the appropriate level of VRAM for various cards. The 4GB 1050 Ti doesn’t make sense to us when the 3GB 1060 costs only about $30 more, yet performs almost three times better. Asus’s argument is that games require ever larger VRAM caches, so you’d be future-proofing yourself by getting a 4GB 1050 Ti over a 3GB 1060.
We disagree. If you game at 1080p, 3GB of VRAM is plenty for any game today. Larger textures don’t provide any more detail, and if you’re looking to upgrade to higher resolutions, the GPU will become the bottleneck. Don’t fall for marketing lies—buy a card with an appropriate amount of VRAM for the resolution and games you play, and build a system around that.
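To put some rough numbers behind that claim, here’s a hypothetical back-of-envelope sketch (not vendor data) of how much VRAM the frame buffers themselves consume at common resolutions, assuming a triple-buffered RGBA8 swap chain. The point: the buffers tied to resolution are tens of megabytes, a sliver of a 3GB card; it’s texture quality settings, not resolution alone, that eat VRAM.

```python
def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    """Rough swap-chain footprint in MB: width x height x 4 bytes (RGBA8),
    times the number of buffers (3 = triple buffering). Illustrative only."""
    return width * height * bytes_per_pixel * buffers / (1024 ** 2)

if __name__ == "__main__":
    resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
    for name, (w, h) in resolutions.items():
        print(f"{name}: ~{framebuffer_mb(w, h):.0f}MB for the swap chain alone")
```

Even at 4K, the swap chain sits under 100MB; long before a 3GB card runs out of memory at higher resolutions, the GPU itself becomes the bottleneck.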