
GeForce GTX Titan X vs Radeon RX Vega 56

Intro

The GeForce GTX Titan X is built on a 28 nm process. nVidia has set the core frequency at 1000 MHz. The GDDR5 memory runs at 1750 MHz (7000 MHz effective) on this card. It features 3072 SPUs as well as 192 Texture Address Units and 96 ROPs.

Compare that to the Radeon RX Vega 56, which has a core clock of 1156 MHz and an HBM2 memory frequency of 1600 MHz. It also uses a 2048-bit memory bus and is built on a 14 nm process. It features 3584 SPUs, 224 TAUs, and 64 ROPs.


Benchmarks

These are real-world performance benchmarks that were submitted by Hardware Compare users. The scores seen here are the average of all benchmarks submitted for each respective test and hardware.

3DMark Fire Strike Graphics Score

Radeon RX Vega 56 21011 points
GeForce GTX Titan X 17879 points
Difference: 3132 (18%)

Power Usage and Theoretical Benchmarks

Power Consumption (Max TDP)

Radeon RX Vega 56 210 Watts
GeForce GTX Titan X 250 Watts
Difference: 40 Watts (19%)

Memory Bandwidth

Performance-wise, the Radeon RX Vega 56 should in theory be considerably better than the GeForce GTX Titan X overall.

Radeon RX Vega 56 419430 MB/sec
GeForce GTX Titan X 336000 MB/sec
Difference: 83430 (25%)

Texel Rate

The Radeon RX Vega 56 is roughly 35% faster at anisotropic filtering (AF) than the GeForce GTX Titan X.

Radeon RX Vega 56 258944 Mtexels/sec
GeForce GTX Titan X 192000 Mtexels/sec
Difference: 66944 (35%)

Pixel Rate

If running at high screen resolutions is important to you, then the GeForce GTX Titan X is superior to the Radeon RX Vega 56 by a large margin.

GeForce GTX Titan X 96000 Mpixels/sec
Radeon RX Vega 56 73984 Mpixels/sec
Difference: 22016 (30%)

Please note that the above 'benchmarks' are purely theoretical - the results were calculated from each card's specifications, and real-world performance can and usually will differ.
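The percentage shown in each "Difference" line appears to be the absolute difference divided by the lower of the two values, rounded to the nearest whole percent. A minimal sketch assuming that convention (the function name is ours, not the site's):

```python
def percent_difference(a, b):
    """Absolute difference as a share of the smaller value, rounded to a whole percent."""
    low, high = sorted((a, b))
    return round((high - low) / low * 100)

# Fire Strike graphics scores from above: 21011 vs 17879
print(percent_difference(21011, 17879))    # -> 18, matching "Difference: 3132 (18%)"

# Memory bandwidth: 419430 vs 336000 MB/sec
print(percent_difference(419430, 336000))  # -> 25, matching "Difference: 83430 (25%)"
```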

Price Comparison

GeForce GTX Titan X
Check prices at: Amazon.com

Radeon RX Vega 56
Check prices at: Amazon.com

Please note that the price comparisons are based on search keywords - sometimes it might show cards with very similar names that are not exactly the same as the one chosen in the comparison. We do try to filter out the wrong results as best we can, though.

Specifications


Model                  GeForce GTX Titan X   Radeon RX Vega 56
Manufacturer           nVidia                AMD
Year                   March 2015            September 2017
Code Name              GM200                 Vega 10 XL
Memory                 12288 MB              8192 MB
Core Speed             1000 MHz              1156 MHz
Memory Speed           7000 MHz (effective)  1600 MHz (effective)
Power (Max TDP)        250 watts             210 watts
Bandwidth              336000 MB/sec         419430 MB/sec
Texel Rate             192000 Mtexels/sec    258944 Mtexels/sec
Pixel Rate             96000 Mpixels/sec     73984 Mpixels/sec
Unified Shaders        3072                  3584
Texture Mapping Units  192                   224
Render Output Units    96                    64
Memory Type            GDDR5                 HBM2
Memory Bus Width       384-bit               2048-bit
Fab Process            28 nm                 14 nm
Transistors            8000 million          12500 million
Bus                    PCIe 3.0 x16          PCIe 3.0 x16
DirectX Version        DirectX 12.0          DirectX 12.0
OpenGL Version         OpenGL 4.5            OpenGL 4.5

Memory Bandwidth: Memory bandwidth is the maximum amount of data (measured in MB per second) that can be moved across the card's external memory interface. It is calculated by multiplying the memory bus width (in bytes) by the effective memory speed. For DDR memory the base clock is doubled; for GDDR5 it is multiplied by 4. The higher the memory bandwidth, the better the card will perform in general. It especially helps with anti-aliasing, HDR and high resolutions.
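The formula above can be sketched as follows (function name is ours). It reproduces the Titan X figure exactly; for the Vega 56, (2048 / 8) × 1600 gives 409600 MB/sec rather than the 419430 MB/sec listed above, so the site presumably uses a slightly different effective clock for HBM2:

```python
def memory_bandwidth_mb_s(bus_width_bits, effective_memory_clock_mhz):
    """Bus width in bytes times the effective (post-DDR/GDDR5 multiplier) memory clock."""
    return (bus_width_bits // 8) * effective_memory_clock_mhz

# GeForce GTX Titan X: 384-bit bus, 1750 MHz GDDR5 -> 7000 MHz effective
print(memory_bandwidth_mb_s(384, 7000))  # -> 336000 MB/sec, matching the spec table
```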

Texel Rate: Texel rate is the maximum number of texture map elements (texels) that can be processed per second, measured in millions of texels. It is calculated by multiplying the card's total texture units by the core clock speed of the chip. The higher the texel rate, the better the card will be at texture filtering (anisotropic filtering - AF).
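A quick sketch of that calculation, using the texture-unit counts and core clocks from the spec table (function name is ours):

```python
def texel_rate_mtexels(texture_units, core_clock_mhz):
    """Texture units times core clock (MHz) gives Mtexels/sec."""
    return texture_units * core_clock_mhz

print(texel_rate_mtexels(192, 1000))  # GeForce GTX Titan X -> 192000 Mtexels/sec
print(texel_rate_mtexels(224, 1156))  # Radeon RX Vega 56   -> 258944 Mtexels/sec
```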

Pixel Rate: Pixel rate is the maximum number of pixels the graphics chip can write to its local memory per second, measured in millions of pixels. It is calculated by multiplying the number of ROPs by the card's core clock speed. ROPs (Raster Operations Pipelines, sometimes also called Render Output Units) are responsible for filling the screen with pixels (the image). The actual pixel rate also depends on several other factors, most notably the card's memory bandwidth - the lower the bandwidth, the harder it is to reach the maximum fill rate.
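The same pattern applies here, with ROP counts in place of texture units (function name is ours):

```python
def pixel_rate_mpixels(rops, core_clock_mhz):
    """ROPs times core clock (MHz) gives Mpixels/sec."""
    return rops * core_clock_mhz

print(pixel_rate_mpixels(96, 1000))  # GeForce GTX Titan X -> 96000 Mpixels/sec
print(pixel_rate_mpixels(64, 1156))  # Radeon RX Vega 56   -> 73984 Mpixels/sec
```

This is also why the Titan X leads on pixel rate despite losing elsewhere: its 96 ROPs outweigh the Vega 56's higher clock on 64 ROPs.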

