
GeForce GTX 780 Ti vs Radeon R9 290X


The GeForce GTX 780 Ti features a GPU clock speed of 875 MHz, and the 3072 MB of GDDR5 memory is set to run at 1750 MHz through a 384-bit bus. It also features 2880 SPUs, 240 TAUs, and 48 Raster Operation Units.

Compare all of that to the Radeon R9 290X, which also uses a 28 nm design. AMD has set the core frequency at 800 MHz. The GDDR5 RAM runs at a frequency of 1250 MHz on this model. It features 2816 SPUs as well as 176 TAUs and 64 Raster Operation Units.

(No game benchmarks for this combination yet.)

Power Usage and Theoretical Benchmarks

Power Consumption (Max TDP)

GeForce GTX 780 Ti 250 Watts
Radeon R9 290X 300 Watts
Difference: 50 Watts (20%)

Memory Bandwidth

Performance-wise, the GeForce GTX 780 Ti should in theory be just a bit superior to the Radeon R9 290X in general.

GeForce GTX 780 Ti 336000 MB/sec
Radeon R9 290X 320000 MB/sec
Difference: 16000 (5%)

Texel Rate

The GeForce GTX 780 Ti is considerably (about 49%) more effective at anisotropic filtering (AF) than the Radeon R9 290X.

GeForce GTX 780 Ti 210000 Mtexels/sec
Radeon R9 290X 140800 Mtexels/sec
Difference: 69200 (49%)

Pixel Rate

If running at high resolutions is important to you, then the Radeon R9 290X has a clear edge (about 22%) over the GeForce GTX 780 Ti.

Radeon R9 290X 51200 Mpixels/sec
GeForce GTX 780 Ti 42000 Mpixels/sec
Difference: 9200 (22%)

Please note that the above 'benchmarks' are all purely theoretical: the results were calculated from each card's specifications, and real-world performance may (and probably will) differ at least a bit.
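For what it's worth, the 'Difference' percentages above appear to be taken relative to the lower of the two values. A minimal sketch of that arithmetic in Python (the function name is mine):

```python
def pct_difference(a, b):
    """Percent gap between two spec values, relative to the smaller one."""
    lo, hi = sorted((a, b))
    return round(100 * (hi - lo) / lo)

print(pct_difference(336000, 320000))  # bandwidth, MB/sec   -> 5
print(pct_difference(210000, 140800))  # texel rate          -> 49
print(pct_difference(42000, 51200))    # pixel rate          -> 22
```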

Price Comparison

GeForce GTX 780 Ti

Radeon R9 290X

Please note that the price comparisons are based on search keywords, so they may sometimes show cards with very similar names that are not exactly the one chosen in the comparison. We do our best to filter out the wrong results.


Model                  GeForce GTX 780 Ti              Radeon R9 290X
Manufacturer           nVidia                          AMD
Year                   November 2013                   October 2013
Code Name              GK110                           Hawaii XT
Fab Process            28 nm                           28 nm
Bus                    PCIe 3.0 x16                    PCIe 3.0 x16
Memory                 3072 MB                         4096 MB
Core Speed             875 MHz                         800 MHz
Shader Speed           N/A                             N/A
Memory Speed           1750 MHz (7000 MHz effective)   1250 MHz (5000 MHz effective)
Unified Shaders        2880                            2816
Texture Mapping Units  240                             176
Render Output Units    48                              64
Memory Type            GDDR5                           GDDR5
Bus Width              384-bit                         512-bit
DirectX Version        DirectX 11.0                    DirectX 11.2
OpenGL Version         OpenGL 4.4                      OpenGL 4.3
Power (Max TDP)        250 watts                       300 watts
Shader Model           5.0                             5.0
Bandwidth              336000 MB/sec                   320000 MB/sec
Texel Rate             210000 Mtexels/sec              140800 Mtexels/sec
Pixel Rate             42000 Mpixels/sec               51200 Mpixels/sec

Memory Bandwidth: Bandwidth is the maximum amount of data (measured in megabytes per second) that can be transferred across the external memory interface in one second. The number is calculated by multiplying the card's interface width (in bytes) by its memory clock speed. If the card has DDR-type memory, the result is doubled; if it uses GDDR5, it is doubled again (four transfers per clock). The higher the memory bandwidth, the faster the card will be in general. It especially helps with anti-aliasing, High Dynamic Range and high resolutions.
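As a sketch of that calculation (assuming GDDR5's four transfers per clock, which is where the 'effective' speeds in the table come from), the bandwidth figures above can be reproduced like this; the function name is mine:

```python
def memory_bandwidth_mb_s(bus_width_bits, memory_clock_mhz, pumps=4):
    """Theoretical bandwidth in MB/sec.

    pumps = data transfers per clock: 2 for DDR, 4 for GDDR5.
    """
    effective_clock_mhz = memory_clock_mhz * pumps
    return (bus_width_bits // 8) * effective_clock_mhz

print(memory_bandwidth_mb_s(384, 1750))  # GTX 780 Ti -> 336000
print(memory_bandwidth_mb_s(512, 1250))  # R9 290X    -> 320000
```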

Texel Rate: Texel rate is the maximum number of texture map elements (texels) that can be processed in one second, measured in millions of texels per second. This figure is calculated by multiplying the total number of texture mapping units by the core clock speed of the chip. The higher the texel rate, the better the graphics card will be at texture filtering (anisotropic filtering - AF).
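That multiplication matches the table's texel rates exactly; a quick sketch (function name is mine):

```python
def texel_rate_mtexels_s(texture_units, core_clock_mhz):
    """Theoretical texel rate in Mtexels/sec: TMUs x core clock (MHz)."""
    return texture_units * core_clock_mhz

print(texel_rate_mtexels_s(240, 875))  # GTX 780 Ti -> 210000
print(texel_rate_mtexels_s(176, 800))  # R9 290X    -> 140800
```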

Pixel Rate: Pixel rate is the maximum number of pixels the graphics card could possibly write to its local memory per second, measured in millions of pixels per second. Pixel rate is calculated by multiplying the number of colour ROPs by the card's clock speed. ROPs (Raster Operations Pipelines - aka Render Output Units) are responsible for outputting the pixels (image) to the screen. The actual pixel fill rate is also dependent on quite a few other factors, especially the card's memory bandwidth - the lower the bandwidth, the lower the chance of reaching the maximum fill rate.
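And the same pattern for pixel rate (function name is mine):

```python
def pixel_rate_mpixels_s(rops, core_clock_mhz):
    """Theoretical pixel rate in Mpixels/sec: ROPs x core clock (MHz)."""
    return rops * core_clock_mhz

print(pixel_rate_mpixels_s(48, 875))  # GTX 780 Ti -> 42000
print(pixel_rate_mpixels_s(64, 800))  # R9 290X    -> 51200
```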


53 Responses to “GeForce GTX 780 Ti vs Radeon R9 290X”
Yan says:
really? go 780ti vs HD 7990 nab
Joshua A. says:
It's quite funny to be honest, the 780 Ti beats the AMD Radeon R9 290X with 1 GB less RAM than the AMD card.
junction says:
Ya know whats even funnier? The 250 dollar price difference. If you want to pay out the a$$ for slightly better performance, Nvidia is your goto card.
Pantelos says:
Ya know what's even funnier, the R9 290X costs $549 and the GTX Ti costs $600, so for +$10 I think it's worth the difference.

It's really funny to say a $250 difference lol...

Scott says:
Ya know what's even funnier, that is actually a $50 difference, not $10, but the price varies in different places. I live in Australia and the R9 290X costs $699 and the 780 Ti costs $849.
chisoi says:
I'm from Romania... GTX 780 Ti = 3,049.99 RON and R9 290X = 2,285.77 RON; R9 290 = 1,683.08 RON. Nice try, NVIDIA!
Gadiel says:
Both cards are close in 4K gaming. In the end it comes down to technologies; both companies have good ones, so you have to choose the right one for you.
gabriel says:
but in games the R9 290X will be best, because the R9 290 is better than a GTX 780 Ti in benchmarks
Mohannad says:
I have done long research on this. The outcome is screaming 290X. Anyways, AMD and Nvidia sit together and set specifications and pricing after consulting with their marketing departments to reach the mutual goal of squeezing as much money out of us as possible. Don't be a fanboy; be the person who wants the best for himself. Pursue happiness without a blinding agenda!
G Hanson says:
Seriously, the Titan/780 Ti got 7000 MHz memory speed..............
ya digg says:
LOL! 5000 MHz on such an expensive card... the 7000 MHz Titan just eats the 290X alive.
lacrate says:
[Translated from Portuguese] Honestly, AMD isn't that mercenary... I'd buy the R9 290X over any other card... cost-benefit above everything... I value my money!!!
jimmx says:
look at the memory bus width:
780 Ti 384-bit, R9 290X 512-bit
beat that, Nvidia!
Raspy says:
I see a lot of ppl going on about memory speed blah blah blah; that's only part of the complete card and makes little to no difference. OK, so you want a graphics card. What res: 1080p/2K/4K?
If you have lots of dosh, spend it on a 290X/780 Ti; neither of them will run 4K very well, that's dual-card-setup domain atm, just look at reviews.
If 1080p/2K, get a 780/290 max; they will do the job for years.
Forget synthetic benchmarks, look at real-world game FPS, then choose what you need. Too many fanboys who ain't got a clue posting here. Remember most screens run at 60 Hz, so they won't display any FPS above that. If you are lucky enough to have a higher-spec 120 Hz screen, chances are you won't be worried about the cost of a GPU and will just get the best.
Bottom line: don't be a fanboy, just get what makes sense for YOUR OWN use.
Jack says:
The first processor with 64-bit technology. AMD.
The first 2-core processor. AMD.
The first 4-core processor. AMD.
The first 6-core processor. AMD.
The first 8-core processor. AMD.
The first Accelerator with GDDR3. ATI.
The first Accelerator with GDDR5. ATI.
The first Accelerator DX10. ATI.
The first Accelerator DX11. ATI.
The first Accelerator with 512 bit memory bus. ATI.
The first dual GPU Accelerator. ATI
The first Accelerator with 4k technology.
History teaches us that AMD and ATI innovate all the time.
Where are Nvidia and Intel?
John says:
Really funny, all these fanboy fights. Nvidia has always launched expensive top-notch GPUs ($1000 Titan), but what really matters most is the mid-range (760/770 vs 280X/290), where AMD totally crushes Nvidia by a lot, including the price tag.
Jackson says:
Just get the best bang for your buck!
Vicious says:
Memory clock isn't important..? It's about the most important thing for GPUs. 5000 MHz is pathetic on a card with such a price tag.. the 780 Ti wins all around!
Kitten says:
Memory clock isn't everything; effective bandwidth is. The 290X has a wider bus (512-bit vs 384-bit) than the 780 Ti. 320 GB/s vs 336 GB/s isn't as massive a difference as the 2 GHz gap in memory clock would lead you to believe.

The 290X is better value, and custom cards will reinforce this. The 780 Ti is a more efficient, cooler-running card that is slightly faster. It all comes down to the price gap between them, which can be as much as £100 here in the UK.

Anonymous says:
@Jack Half of this is false. Let's just look back at the days when AMD and ATI didn't even exist and only Nvidia and Intel were around...
GDDR3 and GDDR5 technology are both from Intel/Nvidia; ATI just made an accelerator for them, same with DX10 and DX11. Although I admit a 512-bit memory bus is pretty good, it doesn't make a big difference compared to 384-bit.
And why didn't you mention
"The first accelerator with a 256-bit memory bus. Nvidia."?
Stop fanboying.

The fact is, the GTX 780 Ti won here,
but Nvidia+Intel is more of a high-performance "heavy tank" style for various tasks, while AMD+ATI (AMD+AMD nowadays) is especially for gaming.
This doesn't mean Nvidia+Intel is not good for gaming.
Also, Intel is using factory strength against the AMD "OC mania"; AMDs are better for overclocking, but Intel will function for a way longer time.
I mean an AMD/ATI will work for like 2-3 years while Nvidia/Intel will still work after 5-6+ years.

So keep that in mind. And about price, there is not much to it; they are in about the same range in both Intel vs AMD and Nvidia vs ATI:
the 8-core CPUs from AMD don't cost too much less than the i7s.


P.S.: Choose for your own desires...

anon says:
9300M is better
anon says:
The ZX Spectrum is better
Anonymous says:
you are wrong.
C64 is better!!!
Márcio Monteiro says:
It's amazing: a graphics card that costs almost half the GTX 780 Ti and has almost the same performance. I've always had Nvidias since the Riva TNT (16-32), but the truth is that AMD is currently by far the best choice. On the processor side, though, I still think they are not at Intel's level.
broman says:
For all AMD fans out there:
nVidia could crush AMD. They have probably already designed the chips for the 8xx and 9xx series, and they could release them even a month from now. Proof? The 780 Ti had been at gold status for months, but they waited for AMD to respond to the Titan. When that happened, all of a sudden they introduced it. Maxwell (the 8xx-series architecture) was announced in 2010, four years before they will have used it in GPUs. They are just waiting and watching AMD struggle to beat their 8-month-old card. AMD finally achieved it, by running at 95C constantly and sounding like a vacuum. Conclusion? AMD is a nooby tryhard; nVidia is a lazy pro. You have to decide which one suits you.
shoot-|-here says:
Watercooling the 290X shows what it is truly capable of (as with any card, really).
It really is a matter of taste these days... I have used both, and when it comes down to it: I choose to save 20% and miss out on 5-10% performance?
That is OK with me. Others will disagree; that is their opinion and they are entitled to it. :)
When it comes to crypto-currency mining, AMD has it all over Nvidia... if that matters.
nobody says:
The info provided in the majority of the posts here is a joke. People rave about cards they know very little about, or at least that's how it appears judging by the rubbish being posted. If you really want to know how the cards perform, get current, reliable info and stop spouting c**p.

m says:
[Translated from Portuguese] Hello everyone, if you look closely at how the R9 290X would do with its memory at 7000 MHz, it's obvious the R9 290X would beat the GTX 780 Ti; it's 5000 MHz
against 7000 MHz that leaves the R9 290X just one step behind the GTX 780 Ti...
wmp007 says:
@Anonymous says:
January 11, 2014 at 8:59 pm

"I mean an AMD/ATI will work for like 2-3 years while Nvidia/Intel will still work after 5-6+ years"

Really? I got a 10 year old AMD system with an ATI GPU and it still works.

Makes no sense that you would post that, because hardware, no matter where it's produced or what company makes it, can still fail.

Go look at reviews on newegg and see how many people get DOA hardware, from both Nvidia and AMD.

LOLZ says:
@Jack, ATI first dual GPU??? LOLZ, get your facts straight. The 3dfx Voodoo5 5500 was the first to bring you dual GPU, and dual video cards with SLI. Nvidia bought 3dfx's tech when they went bankrupt.
yucatan says:
Nvidia sucks! You Nvidia fanboys probably don't know about the Mantle API yet. The world's two big gaming consoles are running on AMD chips (PS4 & Xbox One), and they're still able to render those graphics buttery smooth. That's because AMD's GCN architecture is far more efficient than whatever Nvidia is using.
Nope says:
Mantle API is not faster than DX12 and good drivers. It is, in fact, a huge waste of time. Next.

NVidia is stupidly overpriced, AMD is stupidly underpowered. I wish this baby steps bullshit between the gpu manufacturers would stop so we can get a palpable increase in graphical power.

Papa Zane says:
Holy shit, this is hilarious. Guys, they're both good cards, and frankly this choice comes down to preference; fanboying over either is ridiculous. Yes, Nvidia ridiculously overprices their products, we've known this for years, and yes, AMD doesn't hold the same benchmark power, but the thing is they'll both hold up on any game for years to come, so just pick which one ya like.
Sam says:
Z80 is better!!!!!!!
janagn_marah says:
Hey guys, please fight for a cheap price, not a brand... I don't care who's good and best...
All I want is more power efficiency and less expense.
Vicious says:
It really depends on what setup you're going for. I have 2-way SLI GTX 780 Tis, and Nvidia has been doing an amazing job with their drivers. I've run this exact SLI for the past 3 months and have had zero problems. If some game doesn't work with SLI, I disable one of the cards and run on the other, which is still plenty of power. I am not a fanboy. My main rig is Intel-Nvidia. I also have a second rig that is AMD, and I'm most likely going to buy an R9 280X for it very soon. I currently have an OLD-ass GTX 260 running in my AMD build, and it is still working beautifully. It's not about buying the cheapest, you know. It's about whatever suits your setup best. You also have to make sure your CPU doesn't bottleneck a high-end card, so don't buy a high-end ATI card with a terrible CPU (that would be pointless).
That one guy says:
Quote Anonymous: "Half of this is false and the lets just look back in the days when AMD and ATI didn't even exist and only Nvidia and Intel were..."

AMD started in 1969.
Nvidia started in 1993.

Will says:
Well, you can't compare two cards with different prices; you have to go R9 290X vs GTX 780, not the Ti.
kevan says:
Nvidia wouldn't even exist, or have been able to excel into future graphics, without the purchase of 3dfx (Voodoo). For those of you who remember, that's where "SLI" came from: Nvidia is not an innovator, they simply took tech from another company (after they bought out Voodoo in its bankruptcy) and made it their own, whereas ATI always had stable technologies of its own, which got even better with the merger with AMD; it was win-win. Look it up if you don't believe me. Frankly it's like this: if you have Intel, go Nvidia; if you have AMD, go AMD/ATI. Seems as though that's what works best together at the moment.
Razorsharp says:
But can they run minesweeper?
admin says:
nah, need SLI or crossfire to get that to run acceptably.
Anonymous says:
Let's just stop this fanboyish argument. Let's put it simply:

Those are the two advantages. Personally, I prefer NVIDIA, but would take AMD if I had to. What I don't like about NVIDIA is the price: too expensive for the performance, but they have the most powerful GPUs out. What I dislike about AMD is the noise and heat; they run way too hot, but can be water-cooled for awesome performance.

What I don't find reasonable is that people buy the R9 290X and water-cool it. That takes away the whole AMD price advantage. You can buy an R9 290X for about $520; water cooling costs about $150, which comes to about $670. Yet the people who do that claim they bought AMD for the price, when for $650 you can buy a perfectly good 780 Ti. It doesn't make sense. And buying an R9 290X is questionable anyway, as the R9 290, which costs about $100 less, can basically be turned into an "X" version. And for that $400 for the 290, you can get a Windforce edition, which has superb cooling.

But I have to give props to AMD. People are complaining about the memory clock, 5 GHz vs 7 GHz, yet even with that handicap the AMD only performs 5-10% below NVIDIA. If it had a 7 GHz memory clock, I'd say it would be on the same level and maybe even outperform the 780 Ti.

But, to put it simply:

And if you do go AMD, buy the R9 290 Windforce and tweak the BIOS settings to basically make it a 290X. Either way, AMD/NVIDIA or Intel/AMD, you'll get superb performance out of both.

And one last thing: why are AMD fanboys going against Intel? I'm fine with going against NVIDIA, but Intel? It's funny, because AMD GPUs perform best with Intel CPUs; the best AMD combo is an R9 290 with an Intel i5-4670K. Just saying...

Nathan says:
The R9 290 is much worse than the 290X; the 290X has DOUBLE the render output processors, 15%+ more cores, and many other advantages.

Also, the memory clocks don't matter here, because the GTX 780 Ti has a 384-bit memory bus and the R9 290X has a 512-bit memory bus... When you overclock both to the limits, the R9 290X kicks ass, especially at 4K.

Anonymous says:
Just grab a 6670 and be done.
Billy Johnson says:
An R9 290X is full of sea-cucumbers. Proven fact!
Jett says:
Reading the arguments in this thread is just amazing. Might as well get some popcorn lol.
Anonymous says:
Alright fanboys, let's break it down here: the AMD one wins this one, and this from a guy who has a GTX 770. Not only do they have a wider bus, more memory for higher resolutions, and a better price, they do it with a bad stock cooling system (unless you get a brand like Asus); they use fewer cores, fewer TMUs, and a lower memory clock, YET they can compete with your "king". If AMD built a GPU with the same specs as the 780 Ti, they would obviously win; the only differences here are efficiency, money, and total speed.
Yes, I am an Nvidia boy; my first GPU was the 7500 LE, my first CPU was the E6400 from Intel, yet AMD does incredible things using older or less efficient tech, and their prices give us the best bang for the buck and a great gaming experience. Everything I wrote here, except my first hardware, can be seen in their products or on the screen.
Flauzer says:
I already have a stove, it is useless to buy another one
