> GTA Performance > $300 Graphics Cards > Thunderbolt Pains
Dual Graphics?
Greetings Doc. My rig consists of an ASRock FM2A88X-ITX+, an AMD A10-7850K APU, two Crucial 240GB SSDs in RAID 1, and 8GB of Kingston HyperX DDR3 at 2400 MT/s. My question is: which would be the better graphics solution? Install a Radeon R7 250 in the PCIe slot and enable AMD’s Dual Graphics tech, or forget the on-die graphics and install a Radeon R9 (or something comparable) in the PCIe slot?
—Rick Stephenson
THE DOCTOR RESPONDS: If you’re using a Mini-ITX mobo, Rick, the Doc can’t help but wonder if your small form factor chassis can accommodate a dual-slot PCI Express card. Assuming it can, though, and that you have plenty of room (and airflow) in your case, the Doc would almost certainly bypass the on-die graphics and spring for something like a Radeon R9 270. You should be able to find one for around $150.
Why go that route when you already have a Radeon R7-class engine with 512 shader cores in your APU? As a rule, one graphics processor is better for compatibility than two or more. This is doubly true for AMD’s Dual Graphics, which relies on up-to-date CrossFire profiles for increased performance. Even then, higher benchmark results aren’t always indicative of a better experience with Dual Graphics. The Radeon R9 270, on the other hand, is still relatively affordable, gives you more processing power than the APU and R7 250 combined, and will almost assuredly support the newest games before a Dual Graphics configuration. Plus, it’s well-balanced with the two Steamroller-based modules in your A10-7850K—paying more for an even faster card may not be worthwhile if it’s bottlenecked by the APU.
GTA V Frame Rates
I started out by writing a long letter explaining the workarounds I’ve tried. But I’ll distill it down to this: my frame rates in GTA V are horrible—15 to 20fps. I’ve done everything short of nuking my machine and reinstalling Windows. Given my system specs, I expect to see 30-40fps nominally. I’m not running at some insane resolution; I’m at 1920x1080.
Before pulling the trigger, there’s one item I wanted to run by you to see if it could be the culprit. My motherboard (GA-Z68XP-UD3-iSSD) has “Dual PCI Express 2.0” printed on the board just above the PCIe slot. GPU-Z also shows “PCIe 16x 2.0.” However, the Gigabyte website says the PCIe slot is 3.0-capable. Since third-gen PCIe offers nearly double the throughput of 2.0, could this be a bottleneck causing the frame rate to suffer? If I have to get a new board, so be it. I just want to know what’s going on.
—Michael Guimond
THE DOCTOR RESPONDS: Performance issues can be very frustrating to troubleshoot, Michael. But the Doc will focus on your PCIe question so you can move on to other possible causes before venting your frustrations on the denizens of Los Santos. Gigabyte’s Z68XP-UD3-iSSD may support PCIe 3.0 transfer rates through its x16 slot, but because the controller it communicates with is built into your CPU, that component needs to support PCIe 3.0 as well. Unfortunately, Intel’s Core i5-2500K is limited to PCIe 2.0, so you’d need something like a Core i5-3570K (or another Ivy Bridge-based CPU) for third-gen throughput. If you upgrade, be sure to flash the latest BIOS before removing your i5-2500K.
The good news is that PCIe bandwidth probably isn’t your problem. Even a modern card like the GTX 970 doesn’t move enough information over the PCI Express bus to saturate an 8GB/s link. You might want to try overclocking your i5 to see if that helps, though. GTA V is notoriously processor-bound, so higher clock rates could improve performance. Also, make sure you have the latest patch. Rockstar recently published a handful of updates that made a big difference.
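To put those link speeds in perspective, here’s a quick back-of-the-envelope calculation (a sketch the Doc didn’t run in the column; the function name and structure are illustrative). It models per-direction bandwidth from each generation’s signaling rate and line encoding: PCIe 2.0 runs at 5 GT/s per lane with 8b/10b encoding, while 3.0 runs at 8 GT/s with the leaner 128b/130b encoding—which is where the “nearly double” figure comes from.

```python
def pcie_bandwidth_gb_s(gen: int, lanes: int) -> float:
    """Approximate usable PCIe bandwidth per direction, in GB/s."""
    if gen == 2:
        # 5 GT/s per lane, 8b/10b encoding -> 500 MB/s usable per lane
        per_lane_bytes = 5e9 * (8 / 10) / 8
    elif gen == 3:
        # 8 GT/s per lane, 128b/130b encoding -> ~985 MB/s usable per lane
        per_lane_bytes = 8e9 * (128 / 130) / 8
    else:
        raise ValueError("only generations 2 and 3 are modeled here")
    return per_lane_bytes * lanes / 1e9

print(f"PCIe 2.0 x16: {pcie_bandwidth_gb_s(2, 16):.2f} GB/s")  # 8.00
print(f"PCIe 3.0 x16: {pcie_bandwidth_gb_s(3, 16):.2f} GB/s")  # 15.75
```

So a 2.0 x16 slot still offers a full 8GB/s in each direction—far more than a mid-2010s GPU actually pushes over the bus during gameplay.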
Shopping For Balance
Hey Doc, I received a Mini-ITX case (Cooler Master Elite 130) from a buddy and decided to build a system with it. I dropped in a Gigabyte GA-Z97N-WIFI motherboard with a quad-core Intel Core i5-4590 at 3.3 GHz. It’s liquid-cooled by a Cooler Master Seidon 120V, and I maxed out the RAM using a G.Skill Ripjaws X Series 16GB kit. I have a Samsung 850 EVO 250GB SSD for the OS and a Western Digital Blue WD10EZEX 1TB for data that I repurposed from an old machine. The whole setup is powered by a Cooler Master G550M 550W PSU.
The issue I’m having is that the integrated graphics just isn’t doing it for me. Can you suggest a graphics card that will work with my system and really kick butt? I have about $300 to spend and have been looking at the Radeon R9 290 and GeForce GTX 900-series. I should have enough available power to support either, but I’m not sure if one card offers an advantage over the other. I trust you and MaximumPC to give me an honest and unbiased recommendation. —Steve
THE DOCTOR RESPONDS: It’s a privilege to have your trust, Steve. That’s an otherwise highly capable PC you have, and your Cooler Master Elite 130 does in fact have room for any high-end graphics card you want to install (even though it’s a Mini-ITX enclosure).
The Radeon R9 290 is a great card at around $250. Stepping up to a GeForce GTX 970 would be even better (it’s faster, uses less power, and generates less heat). But that would put you over your budget. If 3D performance is your top priority, nothing beats AMD’s R9 290 for less than $300.
Office Uproar
My company’s IT group just forced me to “upgrade” to Office 2013. Every time they do these upgrades, everyone spends months figuring out how to undo the changes Microsoft made to its latest version. One I can’t figure out is how to move the star icon for Favorites from the right-hand side of the screen back to the left, where I’m used to it being. Can you help? Also, is there a place we can send messages to Microsoft, telling them to stop making all these stupid changes?
THE DOCTOR RESPONDS: The Doc must assume this update also involved a new version of Internet Explorer, which now has its “View favorites, feeds, and history” tab in the upper right-hand corner between Home and Settings. There are a couple of ways to access this. First, you can turn on the Favorites toolbar by clicking “View > Toolbars > Favorites bar.” This will put any link you’ve specifically added to the Favorites bar right under the Menu bar across the top. Or, if you’d like to enable the Explorer bar, which is the pane on the left-hand side of the screen with all of your favorites, click “View > Explorer bars > Favorites.” Of course, if the Doc is misunderstanding the favorites to which you’re referring, please feel free to send clarification. If you’d like to submit feedback, check out connect.microsoft.com.
The company publishes a list of software currently accepting bugs and suggestions. IE is on the list, but Office is not.
Thunderbolt Anger
I look forward to your column every month and finally have a problem that requires the Doctor’s attention.
For some time, I’ve wondered why Windows is not embracing Thunderbolt tech. While some might covet its video capabilities, I’ve always been interested in Thunderbolt’s data transfer speed. When Asus introduced its X99-based mobos, I thought it was time to give Thunderbolt a try. One of the features flaunted was that the X99 Deluxe worked with its ThunderboltEX II PCIe cards. I bought the X99 Deluxe, a ThunderboltEX II/DUAL, a Core i7-5930K CPU, and the rest of the hardware necessary for my new build. The build was uneventful, until it was time to install the Thunderbolt card. Although I’d read that the X99 Deluxe and the ThunderboltEX II/DUAL were compatible, it turns out the card has a nine-pin header, while the mobo has a five-pin Thunderbolt header. The cable that came with the card had nine pins on both ends and simply wouldn’t work.
I quickly found out the ThunderboltEX II cards were initially intended for the Z87-series motherboards, which had nine-pin Thunderbolt headers. Apparently, some later versions shipped with a nine-to-five-pin adapter cable that allowed them to be used with the Z97 and X99 mobos. I began an exchange with Asus to get the correct cable, leading to my most frustrating tech support experience ever!
First, I was told the card and board weren’t compatible with each other, and that I should get a new motherboard. Then, it was call the Asus eStore (I did; they didn’t know what I was talking about). Next, I was told to order a newer version of the motherboard that came with the cable (no X99 boards do). How about looking on eBay? Or trying a third-party Thunderbolt card with a five-pin cable?
At this point I’ve had it with Asus. But I’m stuck and would like to know if there’s any other way to get the card to work with the X99 Deluxe, or if a third-party card is available that will work with the five-pin header. Thanks for all you do.
—Tony Paradowski
THE DOCTOR RESPONDS: This isn’t the first complaint the Doc has fielded about Asus’s Thunderbolt products, Tony. But you’re right—that mobo and the ThunderboltEX II/DUAL should work together.
Company representatives say that customers with the older kit who contact the Asus service department are eligible to receive an adapter. Indeed, they claim many of these have already shipped out. The Doc knows some folks over at Asus who should be able to rectify the situation for you. Expect an email follow-up from one of them in the days to come.