NVIDIA GeForce GTX 980 Review – GIGABYTE G1 Gaming

NVIDIA GeForce GTX 980

Beloved readers, we are gathered here today in the presence of these witnesses, to join NVIDIA GeForce GTX 980 and the hard core gamers in holy PC Gaming union, which is commended to be honorable among all men; and therefore is not by any to be entered into unadvisedly or lightly, but reverently, discreetly, advisedly and solemnly. Into this holy estate these two come to be joined. If any person can show just cause why they may not be joined together, let them speak now or forever hold their peace. Monstru, September 19, 2014

No, no, even though my big beard and my deep singing voice would recommend me for the job, I didn’t become a minister in the last few days. I did, however, get married recently, so maybe that is why those words stuck in my head. Or… who knows, maybe it is because the union between a gamer and his graphics card can be described as a short marriage. Don’t laugh, boys, I am serious. Unlike long-lasting marriages, which cannot be taken into consideration in this situation, short marriages have more similarities to this relationship than anything else.

When a new GPU is out, you will research it, think about it all day, look for it, want it with passion and struggle to get it. Just like when you see a girl you like… After that, at the beginning, all you will do is play with your new GPU, see what it can do and brag about it to your friends. Just like with a new girlfriend. Then the crazy passion will start to fade away, but you will still be happy with your nice, reliable GPU. And even if some other GPU is launched, you will still be faithful to your old, beloved GPU. Just like you do in the first months or years with your new girl. But here is where the difference from a long-lasting marriage occurs. In that case, this is the point where we say “they lived a long and wonderful life together”.

But as I said, in the case of a gamer and his GPU, it’s not like that. No sir… After the peaceful years, the gamer begins to crave something else. Sure, that GPU you saw last year didn’t make you wish to get rid of your old GPU, but this new one… Oh boy, it’s so sexy and powerful, and it has all those features your GPU doesn’t… Just like it happens to some guys after the first years of marriage. You had your happy years, and now you are looking for something sexier, newer, more exciting and… oh well, you get the idea. So you dump your old faithful GPU and get a new one. And the cycle repeats.

And you are in luck, because compared to a marriage, you didn’t lose half of your belongings to the old GPU. All you had to pay was that classic “699 USD” or something like that and bam… you got a new GPU. Hmm… maybe it’s better to stick to your wife or girlfriend when it comes to your love life, and only switch GPUs once every year, what do you say? That sounds like the more sensible thing to do, no? I mean, when we look at it like this, it’s a small price to pay for that new thrill you get when you buy it, take it out of the box for the first time and then start enjoying it with full passion. And just to be clear, I was talking about the new GPU, not a girl…

So why are we here today? Well, well, well… It’s time for a new GPU, boys! Get your wallets ready, keep your hopes high and get ready to crave something new. ‘Cause Nvidia just launched the new Maxwell-based GeForce 9xx generation and you boys are in for a new treat. Or treats, if you have the cash. Because unlike wives, you can actually have two, three or four GPUs at the same time! Are you ready??? Do you really want it??? Well… here it goes….


GeForce & beyond

… but not before we talk about GeForce for a bit. I know, I am such a tease. He, he, he… What is there to talk about GeForce, you would ask; we all know GeForce. Yes, of course we all know GeForce – I think it is one of the best-known brands among tech enthusiasts, along with the likes of Pentium, Windows, Google or iPhone. And you know I am right: GeForce is a strong name that is synonymous with graphics processing, powerful video cards and PC gaming in general. And I think that is something to appreciate, because it is not easy to turn a product, a brand or an idea into something that anyone even slightly interested in an area will instantly recognize and associate with a desirable product.

Maybe you don’t realize it, because you are hard-core enthusiasts, but when you say “Pentium” to a less informed person looking for a PC, they will still consider it the best, because of the huge brand that Pentium is to people. The same thing happens with GeForce… It literally does not matter if we talk about a GT 440 or a GTX 780 Ti… It’s GeForce, so it must be good! Why am I addressing this issue in this review, if all of us hard-core tech gurus know it already? Because I think it means something. And because I think it tells us something about the company behind the brand and the people behind the company. And I think these are aspects that matter now more than ever, because we are literally sitting at a crossroads for PC gaming, PCs and technology in general.

No, no, I am not with that scared bunch that dooms our beloved PC… I know that when it comes to technology, most of the time you get parallel evolutions or synergies. Parallel evolution means that PC gaming and console gaming have been evolving together since the 70s and everything is fine and dandy, with consoles hogging the AAA titles and the PC dominating the eSports arena. Synergy means that many times I have heard that laptops would kill desktop PCs and tablets would kill laptops. And here we are now, happily using desktops and laptops and tablets. All of the above-mentioned segments grew in the last few years, except the “399” desktop, which honestly I am happy is slowly fading away, for obvious reasons.

In the last 20 years we have seen many revolutions (the internet, the smartphone, etc) and our lives have changed completely, from the way we work to the way we interact or relax. And all of that was possible because of all these new technologies, materialized in the form of hardware or software breakthroughs. But one can only wonder… was this it? Are we to expect a period of refinement of existing technologies and no more revolutions in the near future? Well, knowing how the business works I would say no. Because the people behind the brands know very well that they have to keep innovating. They know the success equation is constantly changing and that there will always be someone there to come with the next revolutionary thing, if they don’t do it first.

And I think that the folks at Nvidia are very well aware of that, so in the last few years the company has embarked on a journey where building high-end GPUs is not the only goal, and offering graphics performance and quality is not the only mission. That is why we have seen Nvidia-powered mobile devices, Nvidia portable consoles and Tegra-powered cars. There is something more to Nvidia than just building graphics cards, and I think that has become pretty obvious in recent years. So every move the Santa Clara giant makes must be looked at with that in mind. And yes, today we will be looking at a new GPU launch, but also at some new technologies and features. And this new GPU and these new technologies and features may become part of some greater ecosystem at one point…




Our readers are very well aware that every new GPU launch also brings tons of new technologies, things that each company tries to turn into a personal advantage. And being a serious, results- and performance-oriented website, we don’t pay much attention to the marketing gimmicks. However, we always give you some insight into the new tech, because sometimes the new technologies do have an impact. That is the case today, so I will briefly take you through some concepts like G-Sync, Dynamic Super Resolution, MFAA and VXGI.

Of course, G-Sync needs no further introduction, since my colleague Matei presented the technology one year ago, after attending the Montreal conference where it was introduced. I just wanted to stress that G-Sync is a technology that really improves the gaming experience, and it does so without affecting performance. With G-Sync you can finally say good-bye to tearing and stuttering, and a new, smooth gaming experience can be attained thanks to the synchronization between the frames rendered by the GPU and the frames shown by the monitor. Of course, at the moment there are two trade-offs – it is Nvidia proprietary and the monitors can get pretty expensive, at least in the case of high-end models like the 4K G-Sync Acer XB280HK. However, a 1080p G-Sync monitor is not that expensive and can really improve your gaming experience.
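The stutter that G-Sync removes is easy to see with a little arithmetic. The toy model below (my own illustration, not anything from Nvidia) compares when frames reach the screen on a fixed 60 Hz V-Sync display versus a variable-refresh one: V-Sync quantizes every frame to the next refresh boundary, while a variable-refresh panel shows the frame the moment it is ready.

```python
# Toy timing model: when does each rendered frame actually reach the screen?
# Fixed-refresh V-Sync waits for the next refresh boundary; a variable-refresh
# display (the G-Sync idea) scans out as soon as the frame is ready.
# All numbers here are illustrative, not Nvidia's implementation.
import math

REFRESH = 1000 / 60  # 16.67 ms per refresh at 60 Hz

def vsync_display_times(frame_times_ms):
    """Frame becomes visible at the first refresh boundary after it finishes."""
    out, t = [], 0.0
    for ft in frame_times_ms:
        t += ft                                       # GPU finishes here
        out.append(math.ceil(t / REFRESH) * REFRESH)  # wait for next refresh
    return out

def adaptive_display_times(frame_times_ms):
    """Display refreshes on demand: the frame is shown the moment it is done."""
    out, t = [], 0.0
    for ft in frame_times_ms:
        t += ft
        out.append(t)
    return out

frames = [14, 20, 17, 25, 15]          # uneven GPU frame times in ms
v = vsync_display_times(frames)
a = adaptive_display_times(frames)
# Gaps between displayed frames: V-Sync quantizes them to multiples of
# 16.67 ms (the visible stutter); adaptive refresh preserves the true pacing.
print([round(b - c, 1) for b, c in zip(v[1:], v)])
print([round(b - c, 1) for b, c in zip(a[1:], a)])
```

Note how a frame that takes 20 ms misses the 16.7 ms boundary and is held back a whole extra refresh, so the viewer sees a 33 ms gap followed by uniform ones, while the adaptive display simply reproduces the GPU's own pacing.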

Dynamic Super Resolution

Unlike G-Sync, which you already know, DSR is a new concept introduced with Maxwell and it addresses image quality in a different way. Taking advantage of the enormous extra processing power we have already seen on the likes of the GTX 780 Ti, DSR renders the image in 4K (3840×2160), then downscales it to 1080p and applies a 13-tap Gaussian filter. One could think that this is just another gimmick to make you use that GPU more than you should, but in reality DSR does bring improvements in certain scenarios, like rendering grass. Why do I say that?

Well, take a look at the scene below. Because there are not enough samples to correctly render each blade of grass in 1080p, when you move around the image will have an annoying flicker-like feeling, because each individual blade will appear or disappear according to the number of samples that intersect the place where it should be. If you render the scene in 4K, you will have many more samples, so each individual blade will be rendered closer to reality even after you downscale to 1080p. The resolution is called 3840×2160 DSR and it can be set using the game menu or GeForce Experience.
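The supersample-then-filter pipeline is simple enough to sketch. The snippet below renders a thin "grass blade" at double resolution per axis, low-passes it with a Gaussian, and decimates back down – the same idea as DSR, though the 5-tap kernel here is a generic stand-in, not Nvidia's actual 13-tap filter:

```python
# Minimal sketch of the DSR idea: render at 2x the target resolution per axis
# (4x the pixels), low-pass with a Gaussian, then decimate back to the target.
# The kernel is a generic 5-tap Gaussian, not Nvidia's 13-tap filter.
import numpy as np

def gaussian_kernel(radius=2, sigma=1.0):
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-x**2 / (2 * sigma**2))
    return k / k.sum()

def dsr_downscale(hi_res, factor=2):
    """Separable Gaussian blur followed by decimation (one sample per cell)."""
    k = gaussian_kernel()
    pad = len(k) // 2
    img = np.pad(hi_res, pad, mode="edge")
    # Horizontal then vertical pass of the separable filter.
    img = np.apply_along_axis(lambda r: np.convolve(r, k, mode="valid"), 1, img)
    img = np.apply_along_axis(lambda c: np.convolve(c, k, mode="valid"), 0, img)
    return img[::factor, ::factor]

# Thin "grass blade": a 1-pixel-wide bright line on a dark field at high res.
hi = np.zeros((8, 8)); hi[:, 3] = 1.0
lo = dsr_downscale(hi)
print(lo.shape)   # (4, 4): the blade survives as a softened, stable edge
```

Sub-pixel detail that a plain 1080p raster would hit or miss frame by frame instead contributes partial intensity after the blur, which is why the grass stops flickering.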







Boy, does Nvidia like to play with AA… It seems that once every two launches they introduce a new AA mode, and this time it is Multi-Frame Sampled AA (MFAA). What does it do? Well, first of all, we all know what AA does and how it works. But we also know that 4x MSAA or 8x MSAA can take a real performance toll in some games, even with the most powerful graphics cards. So Nvidia went and developed a new AA method in order to get image quality similar to MSAA at a much lower performance cost.

How does that work? Well, it is not that complicated. Instead of multi-sampling each frame independently, MFAA is a temporal effect that takes into consideration what happens in two consecutive frames. For instance, the example below shows how MSAA 4x works and looks, and how MFAA 4x works and looks. MSAA 4x takes 4 samples for each pixel in order to properly define edges. Depending on the number and position of covered samples, a pixel can be black, white or grey. MFAA does the same but uses only 2 samples per pixel, changing their positions from one frame to the next. That means that, in theory, you get an effect similar to MSAA 4x but with a performance hit comparable to that of MSAA 2x. This looked pretty convincing in Nvidia’s presentation; now I am curious how it will look in real games.
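The idea above can be sketched in a few lines. In the toy example below (sample positions are made up for illustration, not Nvidia's real pattern), two alternating 2-sample patterns, averaged across two frames, give the same edge-coverage estimate as a single 4-sample pattern:

```python
# Toy illustration of the MFAA idea: instead of 4 coverage samples per pixel
# every frame (MSAA 4x), use 2 samples whose positions alternate each frame,
# then combine the two most recent frames to approximate the 4-sample result.
# Sample positions are invented for illustration, not Nvidia's real pattern.

def coverage(edge_x, samples):
    """Fraction of sample points lying left of a vertical geometry edge."""
    return sum(1 for sx, _ in samples if sx < edge_x) / len(samples)

MSAA4 = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]
PATTERN_A = [(0.25, 0.25), (0.75, 0.75)]   # frame N
PATTERN_B = [(0.75, 0.25), (0.25, 0.75)]   # frame N+1 (shifted positions)

edge = 0.5  # edge crosses the middle of the pixel
msaa = coverage(edge, MSAA4)
mfaa = (coverage(edge, PATTERN_A) + coverage(edge, PATTERN_B)) / 2
print(msaa, mfaa)   # both 0.5: the temporal blend matches the 4x estimate
```

The per-frame cost is that of 2 samples, which is exactly the trade Nvidia is advertising; the catch, of course, is that the blend assumes the two frames show roughly the same geometry.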






VXGI, or Voxel Global Illumination, is a bit more complicated, but the final implications could drastically improve the way computer graphics are done. In layman’s terms, VXGI uses a combination of voxels, path tracing, ambient occlusion, and diffuse and specular effects in order to simulate real-time global illumination with dynamic geometry and lights. Basically, this is achieved by overlaying a voxel map on the scene and using each voxel as a container for light intensity, color and direction. Again, we’ll see how this works in new game engines.
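The "voxel as a light container" part is the easiest piece to illustrate. The sketch below (my own simplification – it stores only intensity and color per voxel, and real VXGI cone-traces a mip-mapped voxel structure rather than doing point lookups) shows light bounces being injected into a coarse grid and shading later reading the grid instead of tracing every light path:

```python
# Minimal sketch of the voxel-container idea behind VXGI: overlay a coarse
# grid on the scene, deposit bounced light into voxels, then let shading
# queries read from the grid instead of tracing every light path.
# Purely illustrative; real VXGI cone-traces a mip-mapped voxel structure.
import numpy as np

GRID = 4                     # 4x4x4 voxels over a unit cube
intensity = np.zeros((GRID, GRID, GRID))
color = np.zeros((GRID, GRID, GRID, 3))

def voxel_of(p):
    """Map a point in the unit cube to its voxel index."""
    return tuple(min(int(c * GRID), GRID - 1) for c in p)

def inject(p, rgb, power):
    """Deposit a light bounce at world position p into its voxel."""
    v = voxel_of(p)
    intensity[v] += power
    color[v] += np.asarray(rgb) * power

def sample(p):
    """Cheap indirect-light lookup: average the voxel's stored color."""
    v = voxel_of(p)
    return color[v] / intensity[v] if intensity[v] > 0 else np.zeros(3)

inject((0.10, 0.1, 0.1), rgb=(1.0, 0.2, 0.2), power=2.0)  # reddish bounce
inject((0.12, 0.1, 0.1), rgb=(1.0, 1.0, 1.0), power=1.0)  # white, same voxel
print(sample((0.05, 0.05, 0.05)))  # blended light stored in that voxel
```

Because lights and geometry only touch the grid, both can move every frame and the indirect lighting follows – which is the "dynamic geometry and lights" claim in a nutshell.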




Oculus VR

Of course, Oculus VR is not an Nvidia technology, but it is a technology of the future for PC gaming and virtual reality, so Nvidia is working closely with them in order to ensure a smooth experience. Personally, I didn’t want to try Oculus in its first stages, because I knew that in the beginning the technology would not be close to its real potential and I might be disappointed. And I had the right approach. Back at Nvidia’s Editors’ Day I had the chance to try Eve Valkyrie and Unreal Engine 4 with the GTX 980 and the second-generation Oculus goggles, and I was not disappointed.

The teams have been working together on a new technology called Asynchronous Warp, which prepares the frame for the direction you are about to look in, reducing latency in such a way that the whole experience feels more realistic. Both the Eve Valkyrie and Unreal Engine 4 demos look impressive with Oculus, and honestly speaking I was quite impressed with the UE4 one, because you really get the feeling that you are there. And that’s what VR is all about! I am eager to see how this technology evolves, together with the upcoming DX12 support in Unreal Engine 4.






We already tasted the highly efficient Maxwell architecture when we reviewed the GeForce GTX 750 Ti, based on the GM107 GPU. It was pretty obvious even then that we would see this architecture scaled up, because the potential was considerable.


The GM204 core that powers the new GeForce GTX 980 / 970 is based on the same architecture, but is now equipped with 16 SMMs, totaling 2048 CUDA cores with 16 geometry units and 128 texture units. To compete with GTX 780 and R9 290 class graphics cards, the GM204 features extensive memory architecture optimizations that reduce the bandwidth needed. This allowed NVIDIA to further reduce costs and use only a 256-bit memory subsystem, albeit running at a hefty effective frequency of 7.0GHz.


Another big improvement in terms of performance is the doubling of ROPs from 32 to 64 units; in the past, this count was reserved for 512-bit cards. Don’t forget that the GTX 980 is the successor of the GTX 680, and we now have SMs with twice the performance per watt and 40% better performance per CUDA core. The big picture shows us that the GM204 has twice the performance of the GK104 (GTX 680 / GTX 770).


The GM204 itself is more complex than its predecessor, having 5.2 billion transistors compared to just 3.54 billion for the GK104. PCB complexity remains about the same, while the TDP went down from 195W to 165W even though the GPU clocks went up by more than 100MHz. A great engineering masterclass from the NVIDIA team in terms of performance per watt, if you ask me. Can’t wait to see what the Maxwell-based replacement for the GTX 780 Ti will look like, and also AMD’s response to this…
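It is worth running the quoted numbers through some quick arithmetic. Taking the transistor counts and TDPs above, plus the "twice the performance of GK104" claim at face value, the performance-per-watt gain works out to roughly 2.4x:

```python
# Quick sanity arithmetic on the figures quoted above (GM204 vs GK104):
# transistor budget grows ~47%, TDP drops ~15%, and the claimed ~2x
# chip-level performance works out to roughly 2.4x performance per watt.
gk104 = {"transistors_b": 3.54, "tdp_w": 195, "perf": 1.0}
gm204 = {"transistors_b": 5.2,  "tdp_w": 165, "perf": 2.0}  # ~2x per the claim

transistor_growth = gm204["transistors_b"] / gk104["transistors_b"]
tdp_ratio = gm204["tdp_w"] / gk104["tdp_w"]
perf_per_watt = (gm204["perf"] / gm204["tdp_w"]) / (gk104["perf"] / gk104["tdp_w"])

print(f"{transistor_growth:.2f}x transistors, "
      f"{tdp_ratio:.2f}x TDP, {perf_per_watt:.2f}x perf/W")
```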




                     GTX 780      R9 290X      GTX 980      GIGABYTE GTX 980   GTX 780 Ti

GPU clock            863 MHz      1000 MHz     1126 MHz     1228 MHz           876 MHz
GDDR clock           1500 MHz     1250 MHz     1750 MHz     1750 MHz           1750 MHz
Boost clock          900 MHz      1000 MHz     1216 MHz     1329 MHz           928 MHz
Memory               3072 MB      4096 MB      4096 MB      4096 MB            3072 MB
Bus                  384 bit      512 bit      256 bit      256 bit            384 bit
Shader processors    2304         2816         2048         2048               2880
Process              28 nm        28 nm        28 nm        28 nm              28 nm
Power connectors     6+8 pin      6+8 pin      6+6 pin      8+8 pin            6+8 pin
TDP                  250 W        290 W        165 W        165 W              250 W
Die size             550 mm²      438 mm²      398 mm²      398 mm²            550 mm²
Transistor count     7.1 billion  6.2 billion  5.2 billion  5.2 billion        7.1 billion


As you know, Nvidia launched two cards today, namely the GTX 970 and GTX 980; however, we will only focus on the GTX 980 in this review, and the GTX 970 will be tested in a future article. This does not mean that we are testing only one card today, no sir. We will be testing Nvidia’s reference design, but also a cool implementation from GIGABYTE, the first member of the G1 Gaming graphics card series, the GIGABYTE GTX 980 G1 Gaming, code name GV-N980G1 Gaming-4GD.

Based on the new GM204 Maxwell core, the Nvidia GTX 980 is an interesting beast that reminds us more of the GTX 680 than the GTX 780. I say this because it is pretty obvious that the GTX 980 is a very relaxed implementation of Maxwell, and there is still room for a much more monstrous chip. The GTX 980 has 2048 shader processors and 64 ROPs, and while the memory quantity is bigger than what we saw on the GTX 780 and GTX 780 Ti (4096MB vs 3072MB), the bus is reduced to 256 bit, compared to 384 bit.

However, the clocks are higher, with a 1126 MHz base clock, 1216 MHz boost clock and 1750 MHz GDDR5 for the reference version. The GIGABYTE GTX 980 is clocked even higher, with a 1228 MHz base clock and a 1329 MHz boost clock, so it will be fun to see how it performs in our tests.

GM204 is manufactured on 28nm, has a die size of 398 mm² and a transistor count of 5.2 billion. With a TDP of only 165W, I think it is pretty obvious that there are some monstrous Maxwell versions in a drawer in Santa Clara, just waiting to be launched. Just like it was with Kepler…




Usually we don’t talk about the packaging for the reference design, because it has no relevance to what you will find in retail, but this time I decided to include a picture of it, to show that even the box is slimmer and more elegant than what we have seen in the last two years. The devil is in the details, and Nvidia didn’t leave any detail untouched with this launch.


The weapon

And here it is… the new, the extraordinary, the never-seen-before Nvidia GTX 980… Ah… wait… it has been seen before, many times, because this is the same cooling system as the one on the Titan, the GTX 780 and many other Nvidia cards from the 7xx family. Oh well, at least the I/O shield is different and there is a back-plate. And we also notice two more differences – only two PCI-E 6-pin connectors and a new set of video connectors on the back, namely one DVI, one HDMI and three DisplayPort connectors. Honestly, this grey design is truly sexy, and I think it will be very difficult to find a better-looking solution in the future.






Cooling system

The cooling system is similar to what we have seen on the GTX 780 Ti, GTX 780 and GTX 770, and knowing that the card has only a 165W TDP, we can expect lower temperatures or lower noise levels on this beauty. The back-plate does not play an active role in cooling the card; it just helps protect the back of the card, but you can also remove its upper left part to further improve the airflow inside the card.




Under the hood




The naked card shows us a VRM that is similar to the one implemented on the GTX 680, borrowing some technology from its big brother, the GTX 780 Ti. To be precise, we have a 4-phase, On Semi 81174 based VRM with discrete components. However, similar to the GTX 780 Ti, there are actually more drivers and MOSFETs on the PCB; in this case we can see 6 independent phases but just 4 R22 chokes. I can only assume that NVIDIA is using some kind of phase-alternating technique to prevent overheating and overcharging, but don’t take my word for it.

The memory is fed by just one VRM phase, with a choke, 3 n-channel MOSFETs and filtering caps. The GDDR5 memory comes from Samsung and is rated at 0.28ns, which means 1750MHz right out of the box. Nice to see Samsung for a change, and thank God it’s not Elpida!






The GIGABYTE GTX 980 G1 Gaming is a different story, because it is a full retail product so we have a nice looking box with everything needed inside. In this case, by everything needed I mean the card itself and two PCI-E adapters.




The weapon

GIGABYTE continues to use its excellent WindForce cooling system, so you may already know what their GTX 980 looks like. There are some differences from the reference card if we look at the connectors, mainly because the GIGABYTE GTX 980 G1 Gaming has 2 x 8-pin power connectors, as well as 2 x DVI connectors besides the 3 x DisplayPort and 1 x HDMI.






Cooling system

WindForce is a very popular cooling system with our readers – you have seen it on many successful GIGABYTE implementations, and we all know it keeps the card quiet and cool. On the GIGABYTE GTX 980 G1 Gaming we have 6 massive heatpipes which transfer the heat to a solid copper base, which is connected to the aluminum radiator. The VRM is cooled directly by the radiator, while all the heat is “scared” away by 3 fans. For protection, the card is also fitted with a back-plate, even though it does not play an active role in the cooling of any component.




Under the hood

From the first look at the PCB we can see that this card means business, being equipped with 8 phases for the GPU VRM. The controller is the same ON Semi 81174, but GIGABYTE uses doublers to get the needed number of phases, and although I’m not a fan of this design, it gets the job done flawlessly and better than the reference model. Three MOSFETs, one choke and tantalum caps complete each phase.

On the memory side we have the same Samsung 0.28ns GDDR5 memory, but this time it’s powered by a 2-phase VRM with an On Semi 4901NF high-side / low-side MOSFET package and an On Semi 81172 controller. It’s pretty easy to see that, compared to NVIDIA’s reference design, this card is pretty beefed up VRM-wise.








CPU: Intel Core i7 4790K
Motherboard: GIGABYTE GA-Z97X-SOC Force
RAM: GEIL Evo Corsa 8GB DDR3-2133 CL10
Cooling: Noctua NH-D14 + 2 x Coolink SwiF 120P
SSD: Intel SSD 730 240GB
PSU: Seasonic P1200
Case: HSPC Top Deck Station
Room temp: 25°C
OS: Windows 7 Ultimate x64 SP1
AMD driver: Catalyst 14.4 WHQL
Nvidia driver: ForceWare 344.07 BETA
CPU clock: 4500 MHz
DDR clock: DDR3 2133
Timings: 9-11-9-28 1T

In order to get the best performance from such a beast, we used a PCI-E 3.0 platform based on a Devil’s Canyon CPU. The Intel Core i7 4790K was clocked at 4500MHz, cool and cozy at 1.2V under a Noctua NH-D14. We used a GIGABYTE GA-Z97X-SOC Force motherboard and a dual-channel GEIL Evo Corsa 8GB DDR3-2133 CL10 memory kit set at DDR3 2133 9-11-9-28. The PSU is the nice and shiny Seasonic P1200, while the “case” was an HSPC Top Deck Station bench table.

The OS, Windows 7 Ultimate x64 SP1, was installed on an Intel SSD 730 240GB. Of course, we used the latest AMD driver available at the time of testing (Catalyst 14.4 WHQL) and the latest Nvidia driver available (ForceWare 344.07 BETA). We use only original OS, benchmark and game copies, regularly updated.

We tested all graphics cards at our usual test resolutions (1920×1200 and 2560×1600), with two quality settings for each (8xAA and 2xAA), replacing them with 4xAA and NoAA when the game only offered those options. All details were set to maximum, as long as this meant we were using identical settings for the AMD and Nvidia cards.


3DMark Vantage

3DMark Vantage

3DMark Vantage GPU

The GTX 980 duo starts the battle at full force, surpassing the old GTX 780 Ti beast without much trouble. Of course, the difference is nothing to make noise about, but it is still impressive that a chip this small can outperform a beast like the GTX 780 Ti.

3DMark Vantage X

3DMark Vantage X GPU

3DMark 2011

3DMark 2011

3DMark 2011 GPU

The gap widens quite a lot in 3DMark 2011, and the “little” GTX 980 shows its real fangs. It’s true, though, that the base clock is pretty darn impressive to start with.

3DMark 2011 x

3DMark 2011 x GPU


3DMark Cloud Gate

3DMark Sky Diver

The difference in 3DMark varies between 10 and 20% and it is pretty clear that benchmark wise the GTX 980 duo definitely wins the battle.

3DMark Fire Strike

3DMark Fire Strike Extreme

Far Cry 2

FC2 2560 8x

FC2 2560 2x

We start the games with the old Far Cry 2, where the GTX 980 and the GIGABYTE GTX 980 manage to get a solid 10% lead over the GTX 780 Ti. Now it’s time to see something more stressful…

FC2 1920 8x

FC2 1920 2x

Crysis 2

C2 2560 4x

C2 2560 NoAA

Another solid win for the dynamic duo, and it is becoming pretty clear that Maxwell is one pretty darn efficient architecture. Yes sir… there is much force in this one.

C2 1920 4xAA

C2 1920 NoAA


Hitman Absolution

HA 2560 8xAA

HA 2560 2xAA

In Hitman Absolution the game changes, and we see the Radeon R9 290X surpassing all Nvidia solutions. And that would not be a problem in itself, but the GTX 980 does not perform that well compared to the GTX 780 Ti in this specific benchmark.

HA 1920 8xAA

HA 1920 2xAA

Alien vs Predator

AvP 2560 4xAA

AvP 2560 NoAA

Alien vs Predator is a pretty hard nut to crack, and the top is pretty clear here: GIGABYTE GTX 980 > GTX 980 > Radeon R9 290X > GTX 780 Ti > GTX 780.

AvP 1920 4xAA

AvP 1920 NoAA

Bioshock Infinite

BioSH i 2560

Apparently Bioshock is not really AMD-friendly, but that aside, the GTX 980 duo gets back in the game and outperforms the GTX 780 Ti once again.

BioSH I 1920

Batman Arkham Origins

Bat Ao 2560 8xAA

Bat AO 2560 2xAA

It is a close fight between the R9 290X and the GIGABYTE GTX 980 in Batman: Arkham Origins, with the GTX 980 and GTX 780 Ti following close behind.

Bat AO 1920 8xAA

Bat AO 1920 2xAA

Sleeping Dogs


SD 2560 8xAA

SD 2560 2xAA

Sleeping Dogs shows us the same situation we saw in the first games, with GIGABYTE GTX 980 and GTX 980 leading the squad.

SD 1920 8xAA

SD 1920 2xAA


FarCry 3

FC3 2560 8xAA

FC3 2560 2xAA

And we see the same situation again in Far Cry 3… Impressive I might add…

FC3 1920 8xAA

FC3 1920 2xAA

Battlefield 4

B4 2560 4xAA

B4 2560 NoAA

I personally find the difference between the GIGABYTE GTX 980, the GTX 980 and the other cards pretty striking in this benchmark. And no, it’s not a mistake – I double-checked against our older GTX 780 Ti BF4 results, and these really are better.

B4 1920 4xAA

B4 1920 NoAA

Metro Last Light

MLL 2560

We finish our performance tests with Metro Last Light, which shows exactly the same situation we have seen before – the GIGABYTE GTX 980 leads the platoon by far, and the GTX 980 is followed closely by the GTX 780 Ti.

MLL 1920

Power consumption




In order to measure the power consumption, we used a dedicated tool, leaving the system in idle for 10 minutes. Then we ran Crysis 2 in 1920×1200 4xAA. The numbers you see above represent the power consumption of the whole system, not just the VGA, and we are talking about the highest value recorded during our test.

That means that the average power consumption is considerably lower, but we are looking for the peak value, the worst case scenario, the maximum possible stress for our PSU. Obviously the total power consumption of the system can be even higher if you stress even more components (storage, RAM, etc), but we are looking for a maximum value relevant to daily use scenarios, aka gaming.

And there you have it, ladies and gentlemen: progress… The same performance as the GTX 780 Ti with 76W less power consumption makes us realize that either a GTX 990 or some GM110 monster is a very real possibility.




The temperature levels we reach are not representative of the temperatures you would get in a normal system, inside an enclosure. If you have a poorly ventilated enclosure you could get much higher temperatures, while a very well ventilated enclosure could give you better numbers. The purpose of this chapter is to compare one graphics card to another, not to get absolute values.

We used EVGA Precision to measure the GPU temperature and the fan speed, both in idle and under full load, in the same scenario used for the power consumption test. As you would expect, the GTX 980 is on par with the GTX 780 Ti and GTX 780 (same thermal threshold), but the GIGABYTE GTX 980 G1 Gaming really shines, with a mere 63°C under full load…


Noise levels




We measure the noise level using a professional Voltcraft meter, namely SL200. The meter is fixed on a tripod at 85cm from our test platform, above the CPU cooler. For this test, the CPU runs fanless at 3GHz. The only fan besides the one from the graphic card is the one from the PSU.

The tests are done in a soundproof room with a noise level below 32dB. Remember – the numbers you see here are relevant to our test platform, and you will get different results at home, in a case, etc. Our reference system is built so we can compare various graphics cards under the same conditions. The sound level is measured during 4 runs of Crysis 2 at 1920×1200 with 8xAA. The numbers you see in the graphs are the maximum levels measured during these tests.

The Nvidia GTX 980 is a pretty quiet board, maybe one of the quietest high-end VGAs ever built, but the GIGABYTE version still manages to get a slightly better reading from the sound meter.

Final thoughts


Well, this is one article where the conclusion is pretty simple to reach. We have a new, more efficient architecture implemented in a product which draws quite a lot less power than its predecessor, has similar temperatures and noise levels, yet in many cases delivers better performance – and all that at a lower price. The recommended price for the GTX 980 is 549 USD, the recommended price for the GTX 970 is 329 USD, and for the moment, until the GTX 960 arrives, the GTX 760 will be priced at 219 USD.

The Kepler GTX 780 Ti, GTX 780 and GTX 770 monsters will be discontinued effective immediately, so if you didn’t buy one until now, you could get a GTX 980 right away, or wait for some bigger GM210 version. Financially speaking, it does not make much sense to switch from a GTX 780 Ti to a GTX 980 right now – remember, guys, whether it’s CPUs, GPUs or smartphones, always skip one generation if you really want to get your money’s worth.

Whichever you prefer, one thing is pretty clear – Maxwell is much more efficient than anything built so far, and bigger (as well as smaller) versions are sure to come at one point or another. The price is also fair for what we get, so there is almost nothing to complain about when it comes to this launch.

I would like to point out, however, that the GIGABYTE GTX 980, besides having better performance due to the higher clocks, also has a more solid PCB, and I would like to see that on future Nvidia reference design cards. And speaking of GIGABYTE, I must say that besides the great performance, there is also the advantage of lower temperatures and noise levels, so even the more perfectionist consumers can find something to their taste. If you ask me, the GTX 980 is a good example of a launch done right.

Well, I hope you liked our journey through the exciting world of the GTX 980, and if you haven’t taken a look yet, be sure to check out Nvidia’s GAME24 event – there will be some cool announcements there and also the chance to win some awesome prizes!

