Nvidia's GeForce GTX 970 GPU was launched alongside the GTX 980 late last year. It was seen as the value-focused choice, and cards based on it were surprisingly inexpensive. GTX 980 cards cater to those who want only the best and are willing to pay any amount to play today's games at greater-than-HD resolutions with all the settings bumped up to the limit. The GTX 970, on the other hand, is a more sensible option for most people. It should have enough power for gaming at 1080p or greater resolutions with reasonably high quality settings. It's still a high-end GPU, but with an element of moderation.
The GTX 970 and 980 GPUs are based on the same silicon - a processor codenamed GM204. The M stands for Maxwell, Nvidia's internal name for its current-generation graphics architecture. Maxwell improves upon its predecessors, Kepler and Fermi, in terms of not only performance but also power efficiency.
As is common industry practice, GTX 970s are essentially identical to GTX 980s, but with certain subsystems disabled. Thanks to the highly parallel nature of graphics processing, this simply means fewer resources are available, reducing overall performance rather than breaking anything. Apart from making the manufacturing process a lot easier, this allows Nvidia to get some use out of chips with minor defects in specific areas, which would otherwise have had to be discarded.
Graphics cards based on both GPUs launched to much critical acclaim, and the GTX 970 in particular has sold in huge numbers, primarily because of the apparently minimal differences between it and the powerhouse GTX 980 (Review). However, it has recently emerged that there is in fact one significant difference between the two, which Nvidia did not disclose until users discovered it and posted their findings online. We'll factor that into our review as well.
The Gigabyte G1 Gaming GTX 970
Graphics cards come in a surprising variety of shapes and sizes; even ones based on the exact same GPU. The most common thing to do is to use a custom-designed cooler rather than the GPU vendor's stock design, and companies put a lot of effort into making their coolers look good and perform well.
Gigabyte has been promoting its Windforce cooling systems for quite a few graphics card generations now. The current design uses specially shaped fan blades intended to reduce turbulence and noise without compromising on airflow. Solid copper heatpipes make direct contact with the GPU, which Gigabyte says improves heat dissipation. There's no vent on the rear, so keep in mind that all hot air is expelled inside your PC cabinet.
This is one hefty card. At 312mm, it's a lot longer than stock cards, so you should make sure you have enough clearance for it in your case. The GTX 970 probably doesn't need such an elaborate cooling system, but it leaves headroom for overclocking, and Gigabyte knows that gamers like to have bragging rights. The black sheath has very aggressive styling and is designed to show off the fans, aluminium fins and copper heatpipes.
The Windforce logo on top lights up in blue, and you can control lighting effects with a bundled Windows utility. There's also a G1 Gaming logo printed on the black backplate. All in all, this is a card that was designed to be shown off.
You'll need one 6-pin and one 8-pin PCIe power connector, as opposed to the reference design's requirement of two 6-pin connectors. Gigabyte includes Molex adapters for both, just in case you're using an older power supply. There are two SLI connectors for multi-card setups in the usual spot on the upper edge of the circuit board.
On the back of the card, you'll find quite a large selection of video outputs. There are three modern DisplayPorts (v1.2), one HDMI port, one dual-link DVI-D port and one dual-link DVI-I port which allows for a VGA monitor using an adapter (not included).
Gigabyte says that up to four screens can be used at the same time using various output combinations. Gigabyte ships the card with plastic dust caps for the HDMI and DisplayPort outputs, which we thought was a great touch.
Specifications and the VRAM issue
Nvidia's stock GTX 970 specification calls for a base clock speed of 1.05GHz and maximum boost speed of 1.178GHz. Gigabyte has stepped up to 1.178GHz for the base speed and 1.329GHz for the boost speed, out of the box. There's also the assurance that Gigabyte tests GPUs individually and reserves the best-performing ones for its G1 Gaming series cards. This should provide a nice bump compared to stock models. The card has 4GB of GDDR5 RAM running at 7GHz on a 256-bit bus.
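Those memory numbers alone fix the card's rated peak bandwidth, which is the same as the GTX 980's. A quick back-of-the-envelope calculation (a Python sketch, purely for illustration):

    # Peak memory bandwidth = effective data rate x bus width
    effective_rate_gbps_per_pin = 7.0    # 7GHz effective GDDR5 speed, i.e. 7Gbps per pin
    bus_width_bits = 256                 # width of the memory bus

    total_gbps = effective_rate_gbps_per_pin * bus_width_bits   # gigabits per second
    total_gBps = total_gbps / 8                                  # gigabytes per second
    print(total_gBps)                                            # 224.0 GB/s, the same rated peak as the GTX 980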
As we saw in our review of the GeForce GTX 980, the 2,048 programmable CUDA cores in the GM204 chip are divided into four clusters of four SM units each. Whereas all 16 SMs are functional in chips that become GTX 980s, three are disabled for a total of 13 functioning SMs in GTX 970 chips. This is just as expected, and there should be a perfectly proportional difference in power between the two.
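The SM arithmetic is easy to check: each GM204 SM carries 128 CUDA cores, so disabling three of them leaves the GTX 970 with 1,664. A small sketch:

    # Shader resources lost when three of GM204's 16 SMs are disabled
    total_cores = 2048                         # CUDA cores in a full GM204 (GTX 980)
    total_sms = 16
    cores_per_sm = total_cores // total_sms    # 128 cores per SM

    active_sms = 13                            # GTX 970: three SMs disabled
    gtx970_cores = active_sms * cores_per_sm   # 1,664 CUDA cores
    print(gtx970_cores, gtx970_cores / total_cores)   # 1664, 0.8125 (about 81 percent of a GTX 980)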
However, things get murky in another part of the GPU. Nvidia originally claimed that there was no corresponding reduction in the number of raster operations pipelines (ROPs) or in the associated amount of L2 cache memory. Previous-generation cards would have had these numbers reduced proportionately as well, but Nvidia stated that Maxwell allows for finer control than its predecessors did.
As it turns out, that wasn't true at all. The GTX 970 has only seven-eighths the ROP and cache strength of the GTX 980. There are actually 56 ROPs, not 64, and 1.75MB of L2 cache, not 2MB. Since the GPU's pathway to its RAM goes through the cache, this means only 3.5GB of the VRAM is addressed directly - one-eighth less than the claimed 4GB.
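All of those figures are the same seven-eighths cut applied across the board, as a quick check confirms:

    # One of the GPU's eight ROP/L2 blocks is disabled, so everything scales by 7/8
    fraction = 7 / 8
    print(64 * fraction)    # 56 ROPs instead of 64
    print(2.0 * fraction)   # 1.75MB of L2 cache instead of 2MB
    print(4.0 * fraction)   # 3.5GB of VRAM with a direct, full-speed path instead of 4GB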
Nvidia's cleverness lies in the way it masked this fact and continued selling GTX 970 cards with 4GB of VRAM. Cards do have 4GB, but partitioned into two different pools: seven-eighths running normally through seven L2 cache blocks, with the last one-eighth of it accessible through a connection spliced on to the second-last L2 cache block. The result of this is an inefficient pathway from the GPU to the VRAM, and a hierarchy wherein part of the memory is a lot slower to access. Things work just fine as long as only 3.5GB of VRAM is needed, but as soon as that threshold is crossed, the performance hit resulting from one L2 block serving two pools of RAM can be massive.
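To see why crossing that threshold matters, here is a deliberately simplified model. The per-segment bandwidth figures below are approximations based on the widely reported split (roughly 196GB/s for the 3.5GB segment and 28GB/s for the final 0.5GB, which sits behind a single memory controller); actual behaviour depends on how Nvidia's driver places data, so treat this strictly as an illustration.

    # Toy model of effective bandwidth once a game's working set spills past 3.5GB
    FAST_GB, FAST_GBPS = 3.5, 196.0   # approximate, illustrative figures
    SLOW_GB, SLOW_GBPS = 0.5, 28.0

    def effective_bandwidth(working_set_gb):
        # Crude weighted average, assuming all data is touched equally often
        fast = min(working_set_gb, FAST_GB)
        slow = min(SLOW_GB, max(0.0, working_set_gb - FAST_GB))
        seconds = fast / FAST_GBPS + slow / SLOW_GBPS
        return (fast + slow) / seconds

    for gb in (3.0, 3.5, 3.8, 4.0):
        print(f"{gb:.1f}GB working set -> ~{effective_bandwidth(gb):.0f}GB/s")
    # ~196GB/s up to 3.5GB, dropping to ~133GB/s at 3.8GB and ~112GB/s at 4.0GB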
Nvidia might have hoped no one would notice, or that we'd all be fine with its "error" in communicating the ROP count at launch time, which is what it now claims happened. The company also points out that there is still technically 4GB of VRAM, and asserts that stepping down to 3.5GB would have reduced performance further. Even giving Nvidia the benefit of the doubt that this was a mistake rather than deliberate misdirection, we're disappointed that it didn't bother issuing a correction till it was caught.
Performance
We tested the Gigabyte G1 Gaming GeForce GTX 970 with an Intel Core i7-4770K CPU and 16GB of DDR3-1600 RAM from Adata, all plugged in to an Asus Z87-Pro motherboard. Our test system also used a 1,000W power supply and closed-loop Nepton 280 CPU cooler from Cooler Master, an Adata SX910 SSD, and a Dell U2711 1440p monitor. The driver version installed was 347.25.
Installation in our test system was a breeze, though people with even regular-sized ATX cases might find that they need to move their hard drives around in order for this card to fit. We started off with 3DMark's Fire Strike test and recorded an overall score of 10,104. This is only slightly lower than the 10,690 points scored by the GTX 980. Of course, it should be kept in mind that our Gigabyte G1 Gaming test card is factory overclocked.
Tomb Raider's built-in benchmark had posed no challenge to the GTX 980, which ran comfortably at the 60fps cap even at 2560x1440 with settings at Ultra and 16xAF. The GTX 970 performed essentially identically, with the minimum frame rate never dropping below 58fps. This is a pretty good sign that you could save a lot of money by going with the GTX 970 over its bigger sibling and not see any significant degradation in reasonably modern games on a single monitor.
Star Swarm is an extremely demanding benchmark that simulates thousands of objects moving around in 3D space at high speed; the GTX 970 managed an average frame rate of 63.55fps here. While the GTX 980 delivered 74.2fps, it's interesting to note that Nvidia's previous-generation flagship, the GTX 780, scored 63.77fps in the same test using the same hardware.
We then moved on to real-world game tests, measuring actual play sessions with the FRAPS reporting tool and analysing both frame rates and frame times. While the frame rate gives us a single, averaged number representing raw performance, frame time addresses the variance between frames to account for intermittent stutters and tears that degrade the gaming experience.
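The two metrics are directly related (a 17.1ms average frame time works out to roughly 58fps), and percentiles fall out of the same frame-time log. A minimal sketch of the kind of calculation involved - the sample values are placeholders, not our actual FRAPS data:

    # Convert frame times (in ms) to an average frame rate and a 99.9th percentile figure
    frame_times_ms = [16.2, 17.1, 18.0, 16.8, 21.9, 17.4]   # placeholder values, not a real log

    avg_ms = sum(frame_times_ms) / len(frame_times_ms)
    avg_fps = 1000.0 / avg_ms                                # 1,000ms per second divided by ms per frame

    ordered = sorted(frame_times_ms)
    index = round(0.999 * (len(ordered) - 1))                # simple nearest-rank percentile
    p999_ms = ordered[index]

    print(f"average: {avg_ms:.1f}ms ({avg_fps:.0f}fps), 99.9th percentile: {p999_ms:.1f}ms")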
Battlefield 4 ran at an average of 59fps at 2560x1440 using the Ultra quality preset and 4xMSAA. The average frame time was 17.1ms, and 99.9 percent of frames came in at under 22.1ms. With the resolution turned down to 1920x1080 and using the High quality preset, the average frame rate zoomed to 136fps, with much tighter timings of 7.37ms (average) and 11.1ms (99.9th percentile).
Crysis 3 can really push today's hardware, so we tried it out at 2560x1440, Very High quality and 8xMSAA. We saw only 23fps, with an average frame time of 43.7ms and a 99.9th percentile time of 57.5ms. This was below what anyone would consider smooth playability, and so some compromises were necessary. At 1920x1080, High quality and 4xMSAA, the game ran at a much smoother 62fps, with 16ms and 22.3ms timings respectively.
While power consumption is low thanks to Nvidia's engineering, we were concerned about the noise that Gigabyte's Windforce cooler would make with its three fans. We're happy to report that even under heavy stress, it wasn't too loud. It definitely wasn't whisper-silent, but it was still much better than the GTX 780 we tested with its stock cooler pushing hot air out through the rear.
Verdict
Nvidia has modified the ROP and L2 figures in its official spec following widespread outcry online, and card manufacturers are expected to relabel their products going forward. Those who have already bought GeForce GTX 970 cards would be justified in feeling as though important information was withheld. There is no flaw or manufacturing defect here, but this is a design with significant drawbacks that buyers were not told about, and as such Nvidia has been negligent at best.
The GTX 970 is still a very powerful GPU and cards such as the Gigabyte model we reviewed are still very attractive in terms of price, performance, power consumption, heat and noise compared to AMD's current offerings. Users will run into the 3.5GB wall if they push their cards hard, such as in multi-monitor or 4K setups. Games releasing over the next few years will be demanding, and it isn't unreasonable to expect a high-end graphics card purchased today to be able to handle them at high quality settings. It remains uncertain how well the GTX 970 will hold up.
On the product side, Gigabyte has done a pretty admirable job of souping up the GTX 970 GPU. The G1 Gaming model we reviewed is certainly powerful - consider that its scores are not too far off from those of the GTX 980. This model is definitely way more expensive than vanilla GTX 970 cards available from other manufacturers, so it's only worth it if you value good looks, low power consumption and features such as the flexible multi-screen support.
Street prices are lower than MRPs and fluctuate often. If the Gigabyte G1 Gaming's tag seems too steep, any other GTX 970 card is also still a good buy. On the AMD side of the fence, Radeon R9-290 cards are also in the same neighbourhood in terms of price and performance but will run hotter and be noisier.