RTX Ada Linux Kernel Driver Benchmarks: NVIDIA RTX A2000 and A4000 compared using the MIT/GPL (open) versus proprietary kernel driver options of the NVIDIA R555 (555.58.02) Linux driver. Benchmarks by Michael Larabel for a future article.
HTML result view exported from https://openbenchmarking.org/result/2407084-PTS-RTXADAKE17&sor.
Test configurations: Proprietary and MIT GPL kernel driver, each tested with the RTX A2000 and RTX A4000. System components common to all four runs unless noted:

Processor: AMD Ryzen 9 7950X 16-Core @ 5.88GHz (16 Cores / 32 Threads)
Motherboard: ASUS ROG STRIX X670E-E GAMING WIFI (2007 BIOS)
Chipset: AMD Device 14d8
Memory: 2 x 16GB DRAM-6000MT/s G Skill F5-6000J3038F16G
Disk: Western Digital WD_BLACK SN850X 2000GB + 64GB Flash Drive
Graphics: NVIDIA RTX 2000 Ada Generation 16GB (RTX A2000 runs) / NVIDIA RTX 4000 Ada Generation 20GB (RTX A4000 runs)
Audio: NVIDIA Device 22be (RTX A2000 runs) / NVIDIA Device 22bc (RTX A4000 runs)
Monitor: DELL U2723QE
Network: Intel I225-V + Intel Wi-Fi 6 AX210/AX211/AX411
OS: Ubuntu 22.04
Kernel: 6.5.0-41-generic (x86_64)
Desktop: GNOME Shell 42.9
Display Server: X Server 1.21.1.4
Display Driver: NVIDIA 555.58.02
OpenGL: 4.6.0
OpenCL: OpenCL 3.0 CUDA 12.5.85
Vulkan: 1.3.278
Compiler: GCC 12.3.0
File-System: ext4
Screen Resolution: 3840x2160

Kernel Details: nouveau.modeset=0 - Transparent Huge Pages: madvise

Compiler Details: --build=x86_64-linux-gnu --disable-vtable-verify --disable-werror --enable-cet --enable-checking=release --enable-clocale=gnu --enable-default-pie --enable-gnu-unique-object --enable-languages=c,ada,c++,go,d,fortran,objc,obj-c++,m2 --enable-libphobos-checking=release --enable-libstdcxx-debug --enable-libstdcxx-time=yes --enable-multiarch --enable-multilib --enable-nls --enable-objc-gc=auto --enable-offload-defaulted --enable-offload-targets=nvptx-none=/build/gcc-12-ALHxjy/gcc-12-12.3.0/debian/tmp-nvptx/usr,amdgcn-amdhsa=/build/gcc-12-ALHxjy/gcc-12-12.3.0/debian/tmp-gcn/usr --enable-plugin --enable-shared --enable-threads=posix --host=x86_64-linux-gnu --program-prefix=x86_64-linux-gnu- --target=x86_64-linux-gnu --with-abi=m64 --with-arch-32=i686 --with-default-libstdcxx-abi=new --with-gcc-major-version-only --with-multilib-list=m32,m64,mx32 --with-target-system-zlib=auto --with-tune=generic --without-cuda-driver -v

Processor Details: Scaling Governor: amd-pstate-epp powersave (EPP: balance_performance) - CPU Microcode: 0xa601206

Graphics Details:
- Proprietary, RTX A2000: BAR1 / Visible vRAM Size: 16384 MiB - vBIOS Version: 95.07.47.00.05
- Proprietary, RTX A4000: BAR1 / Visible vRAM Size: 32768 MiB - vBIOS Version: 95.04.5c.00.0d
- MIT GPL, RTX A2000: BAR1 / Visible vRAM Size: 16384 MiB - vBIOS Version: 95.07.47.00.05
- MIT GPL, RTX A4000: BAR1 / Visible vRAM Size: 32768 MiB - vBIOS Version: 95.04.5c.00.0d

OpenCL Details:
- Proprietary, RTX A2000: GPU Compute Cores: 2816
- Proprietary, RTX A4000: GPU Compute Cores: 6144
- MIT GPL, RTX A2000: GPU Compute Cores: 2816
- MIT GPL, RTX A4000: GPU Compute Cores: 6144

Python Details: Python 3.10.12

Security Details: gather_data_sampling: Not affected + itlb_multihit: Not affected + l1tf: Not affected + mds: Not affected + meltdown: Not affected + mmio_stale_data: Not affected + retbleed: Not affected + spec_rstack_overflow: Mitigation of Safe RET + spec_store_bypass: Mitigation of SSB disabled via prctl + spectre_v1: Mitigation of usercopy/swapgs barriers and __user pointer sanitization + spectre_v2: Mitigation of Enhanced / Automatic IBRS; IBPB: conditional; STIBP: always-on; RSB filling; PBRSB-eIBRS: Not affected; BHI: Not affected + srbds: Not affected + tsx_async_abort: Not affected
Results overview. (Each ParaView configuration appears twice in the export; the paired rows are presumably the two metrics the ParaView test reports.)

Test | Proprietary RTX A2000 | Proprietary RTX A4000 | MIT GPL RTX A2000 | MIT GPL RTX A4000
v-ray: NVIDIA CUDA GPU | 1653 | 2340 | 1642 | 2340
v-ray: NVIDIA RTX GPU | 2037 | 2832 | 2020 | 2927
octanebench: Total Score | 284.830693 | 495.030899 | 280.903124 | 482.879547
indigobench: OpenCL GPU - Supercar | 24.802 | 38.125 | 24.396 | 36.981
indigobench: OpenCL GPU - Bedroom | 8.333 | 13.322 | 8.181 | 12.882
blender: BMW27 - NVIDIA CUDA | 22.57 | 13.30 | 22.76 | 13.50
blender: BMW27 - NVIDIA OptiX | 10.41 | 7.13 | 10.57 | 7.39
blender: Classroom - NVIDIA CUDA | 44.94 | 25.98 | 45.24 | 26.34
blender: Classroom - NVIDIA OptiX | 27.79 | 17.22 | 27.86 | 17.61
blender: Fishy Cat - NVIDIA CUDA | 47.06 | 26.12 | 47.28 | 26.30
blender: Fishy Cat - NVIDIA OptiX | 22.78 | 12.80 | 22.64 | 12.96
blender: Pabellon Barcelona - NVIDIA CUDA | 108.43 | 58.81 | 108.86 | 59.25
blender: Pabellon Barcelona - NVIDIA OptiX | 30.39 | 18.86 | 30.66 | 19.33
blender: Barbershop - NVIDIA CUDA | 186.23 | 105.87 | 187.86 | 107.41
blender: Barbershop - NVIDIA OptiX | 111.31 | 69.88 | 112.84 | 71.75
blender: Junkshop - NVIDIA CUDA | 37.77 | 22.54 | 38.22 | 23.08
blender: Junkshop - NVIDIA OptiX | 21.59 | 14.04 | 22.02 | 14.50
luxmark: GPU - Luxball HDR | 32886 | 53209 | 32501 | 51531
luxmark: GPU - Microphone | 25323 | 39074 | 25078 | 38207
luxmark: GPU - Hotel | 7181 | 12065 | 7183 | 11595
fluidx3d: FP32-FP32 | 1371 | 2159 | 1330 | 2049
fluidx3d: FP32-FP16S | 2488 | 4006 | 2416 | 3814
fluidx3d: FP32-FP16C | 2556 | 4280 | 2495 | 4065
specviewperf2020: 1920 x 1080 - CATIA-06 | 83.40 | 144.18 | 83.67 | 143.45
specviewperf2020: 1920 x 1080 - CREO-03 | 136.64 | 192.14 | 136.80 | 192.12
specviewperf2020: 1920 x 1080 - ENERGY-03 | 51.31 | 98.42 | 51.51 | 98.09
specviewperf2020: 1920 x 1080 - MAYA-06 | 371.37 | 540.09 | 375.02 | 539.31
specviewperf2020: 1920 x 1080 - MEDICAL-O3 | 103.52 | 142.54 | 103.80 | 142.75
specviewperf2020: 1920 x 1080 - SNX-04 | 334.05 | 495.75 | 334.87 | 496.61
specviewperf2020: 1920 x 1080 - SOLIDWORKS-07 | 200.59 | 336.92 | 200.81 | 340.81
specviewperf2020: 2560 x 1440 - CATIA-06 | 83.43 | 144.40 | 83.56 | 144.74
specviewperf2020: 2560 x 1440 - CREO-03 | 111.02 | 160.57 | 111.11 | 160.53
specviewperf2020: 2560 x 1440 - ENERGY-03 | 31.75 | 62.60 | 31.80 | 62.47
specviewperf2020: 2560 x 1440 - MAYA-06 | 273.51 | 416.35 | 273.74 | 414.44
specviewperf2020: 2560 x 1440 - MEDICAL-O3 | 65.69 | 97.00 | 65.84 | 97.22
specviewperf2020: 2560 x 1440 - SNX-04 | 286.27 | 421.61 | 287.19 | 423.50
specviewperf2020: 2560 x 1440 - SOLIDWORKS-07 | 183.88 | 304.80 | 184.18 | 308.53
unigine-super: 1920 x 1080 - Fullscreen - Low - OpenGL | 208.3 | 323.3 | 208.9 | 324.6
unigine-super: 1920 x 1080 - Fullscreen - Medium - OpenGL | 105.2 | 178.9 | 105.5 | 179.1
unigine-super: 1920 x 1080 - Fullscreen - High - OpenGL | 75 | 128.5 | 75.2 | 128.6
unigine-super: 1920 x 1080 - Fullscreen - Ultra - OpenGL | 30.9 | 51.2 | 31.1 | 51.1
unigine-super: 2560 x 1440 - Fullscreen - Low - OpenGL | 142.8 | 242.0 | 143.2 | 242.0
unigine-super: 2560 x 1440 - Fullscreen - Medium - OpenGL | 65.7 | 114.6 | 66.0 | 114.6
unigine-super: 2560 x 1440 - Fullscreen - High - OpenGL | 44.2 | 77.4 | 44.4 | 77.4
unigine-super: 2560 x 1440 - Fullscreen - Ultra - OpenGL | 17.9 | 29.6 | 17.9 | 29.6
breaking-limit: 1920 x 1080 - On | 33.09 | 61.98 | 33.23 | 61.89
breaking-limit: 1920 x 1080 - Off | 102.86 | 181.40 | 103.55 | 181.16
breaking-limit: 2560 x 1440 - On | 20.74 | 39.19 | 20.85 | 39.17
breaking-limit: 2560 x 1440 - Off | 70.22 | 125.55 | 70.63 | 125.39
paraview: Many Spheres - 1920 x 1080 | 49.86 | 101.92 | 49.87 | 101.88
paraview: Many Spheres - 1920 x 1080 | 4998.433 | 10217.673 | 4999.660 | 10214.160
paraview: Many Spheres - 2560 x 1440 | 48.77 | 98.36 | 48.81 | 98.41
paraview: Many Spheres - 2560 x 1440 | 4889.876 | 9861.257 | 4893.831 | 9865.540
paraview: Wavelet Contour - 1920 x 1080 | 253.24 | 448.72 | 253.15 | 448.11
paraview: Wavelet Contour - 1920 x 1080 | 2639.025 | 4676.206 | 2638.139 | 4669.861
paraview: Wavelet Contour - 2560 x 1440 | 203.83 | 370.36 | 204.16 | 368.43
paraview: Wavelet Contour - 2560 x 1440 | 2124.142 | 3859.581 | 2127.565 | 3839.488
paraview: Wavelet Volume - 1920 x 1080 | 448.44 | 701.98 | 448.76 | 701.45
paraview: Wavelet Volume - 1920 x 1080 | 7175.023 | 11231.590 | 7180.140 | 11223.158
paraview: Wavelet Volume - 2560 x 1440 | 301.50 | 513.10 | 301.44 | 503.14
paraview: Wavelet Volume - 2560 x 1440 | 4823.934 | 8209.650 | 4823.032 | 8050.292
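The export does not compute driver-to-driver deltas itself. A minimal sketch of how one could quantify the gap between the two kernel driver options, using the V-RAY "NVIDIA CUDA GPU" row from the results overview above (the `pct_delta` helper is purely illustrative):

```python
# Relative difference of the MIT GPL (open) kernel module result versus the
# proprietary kernel module result; values copied from the results overview.

def pct_delta(open_score: float, proprietary_score: float) -> float:
    """Percent change of the open-module score relative to the proprietary one."""
    return (open_score - proprietary_score) / proprietary_score * 100.0

# V-RAY NVIDIA CUDA GPU (vpaths, higher is better):
# RTX A2000: proprietary 1653, MIT GPL 1642
delta_a2000 = pct_delta(1642, 1653)
# RTX A4000: both kernel drivers scored 2340
delta_a4000 = pct_delta(2340, 2340)

print(f"RTX A2000: {delta_a2000:+.2f}%")  # about -0.67%, i.e. within noise
print(f"RTX A4000: {delta_a4000:+.2f}%")
```

The same calculation can be applied to any row; across this table the open and proprietary kernel modules generally land within about one percent of each other.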
Chaos Group V-RAY 6.0 - Mode: NVIDIA CUDA GPU (vpaths, More Is Better):
  MIT GPL: RTX A4000: 2340 (SE +/- 0.00, N = 3); RTX A2000: 1642 (SE +/- 3.67, N = 3)
  Proprietary: RTX A4000: 2340 (SE +/- 0.00, N = 3); RTX A2000: 1653 (SE +/- 7.00, N = 3)
Chaos Group V-RAY 6.0 - Mode: NVIDIA CUDA GPU (vpaths Per Watt, More Is Better):
  MIT GPL: RTX A2000: 32.46; RTX A4000: 31.99
  Proprietary: RTX A2000: 32.41; RTX A4000: 30.94
Chaos Group V-RAY 6.0 - GPU Power Consumption Monitor (Watts, Fewer Is Better):
  MIT GPL: RTX A2000: Min 4.29 / Avg 50.58 / Max 67.64; RTX A4000: Min 6.55 / Avg 73.14 / Max 108.42
  Proprietary: RTX A2000: Min 4.55 / Avg 51.01 / Max 68.75; RTX A4000: Min 6.67 / Avg 75.64 / Max 109.6
Chaos Group V-RAY 6.0 - GPU Temperature Monitor (Celsius, Fewer Is Better):
  MIT GPL: RTX A4000: Min 29 / Avg 55.71 / Max 71; RTX A2000: Min 43 / Avg 64.23 / Max 75
  Proprietary: RTX A4000: Min 31 / Avg 57.08 / Max 72; RTX A2000: Min 32 / Avg 59.69 / Max 74
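The per-Watt figures in these result files work out to the benchmark score divided by the average value of the matching GPU power consumption monitor. A small sketch reproducing two of the V-RAY CUDA numbers above (the `per_watt` helper is illustrative, not part of the test suite):

```python
# Performance-per-Watt as shown in the charts: benchmark score divided by the
# average GPU power draw recorded during the run.

def per_watt(score: float, avg_watts: float) -> float:
    return score / avg_watts

# MIT GPL RTX A2000: 1642 vpaths at an average of 50.58 Watts
print(round(per_watt(1642, 50.58), 2))  # 32.46, matching the chart above
# Proprietary RTX A4000: 2340 vpaths at an average of 75.64 Watts
print(round(per_watt(2340, 75.64), 2))  # 30.94
```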
Chaos Group V-RAY 6.0 - Mode: NVIDIA RTX GPU (vpaths, More Is Better):
  MIT GPL: RTX A4000: 2927 (SE +/- 10.33, N = 3); RTX A2000: 2020 (SE +/- 7.00, N = 3)
  Proprietary: RTX A4000: 2832 (SE +/- 29.16, N = 3); RTX A2000: 2037 (SE +/- 3.33, N = 3)
Chaos Group V-RAY 6.0 - Mode: NVIDIA RTX GPU (vpaths Per Watt, More Is Better):
  Proprietary: RTX A2000: 44.30; RTX A4000: 37.13
  MIT GPL: RTX A2000: 43.53; RTX A4000: 37.94
Chaos Group V-RAY 6.0 - GPU Power Consumption Monitor (Watts, Fewer Is Better):
  Proprietary: RTX A2000: Min 4.97 / Avg 45.98 / Max 63.95; RTX A4000: Min 14.35 / Avg 76.28 / Max 101.35
  MIT GPL: RTX A2000: Min 4.49 / Avg 46.41 / Max 61.88; RTX A4000: Min 7.78 / Avg 77.16 / Max 102.86
Chaos Group V-RAY 6.0 - GPU Temperature Monitor (Celsius, Fewer Is Better):
  MIT GPL: RTX A4000: Min 47 / Avg 61.91 / Max 69; RTX A2000: Min 54 / Avg 65.65 / Max 72
  Proprietary: RTX A4000: Min 49 / Avg 62.14 / Max 69; RTX A2000: Min 53 / Avg 65.03 / Max 72
OctaneBench 2020.1 - Total Score (Score, More Is Better):
  Proprietary: RTX A4000: 495.03; RTX A2000: 284.83
  MIT GPL: RTX A4000: 482.88; RTX A2000: 280.90
OctaneBench 2020.1 - Total Score (Score Per Watt, More Is Better):
  MIT GPL: RTX A2000: 4.237; RTX A4000: 4.208
  Proprietary: RTX A2000: 4.225; RTX A4000: 4.224
OctaneBench 2020.1 - GPU Power Consumption Monitor (Watts, Fewer Is Better):
  MIT GPL: RTX A2000: Min 4.76 / Avg 66.29 / Max 69.87; RTX A4000: Min 8.63 / Avg 114.76 / Max 127.42
  Proprietary: RTX A2000: Min 4.84 / Avg 67.42 / Max 69.96; RTX A4000: Min 17.94 / Avg 117.19 / Max 128.9
OctaneBench 2020.1 - GPU Temperature Monitor (Celsius, Fewer Is Better):
  MIT GPL: RTX A4000: Min 53 / Avg 72.57 / Max 79; RTX A2000: Min 56 / Avg 74.94 / Max 79
  Proprietary: RTX A4000: Min 54 / Avg 72.97 / Max 78; RTX A2000: Min 56 / Avg 75.38 / Max 79
IndigoBench 4.4 - Acceleration: OpenCL GPU - Scene: Supercar (M samples/s, More Is Better):
  Proprietary: RTX A4000: 38.13 (SE +/- 0.01, N = 3); RTX A2000: 24.80 (SE +/- 0.00, N = 3)
  MIT GPL: RTX A4000: 36.98 (SE +/- 0.00, N = 3); RTX A2000: 24.40 (SE +/- 0.03, N = 3)
IndigoBench 4.4 - Acceleration: OpenCL GPU - Scene: Supercar (M samples/s Per Watt, More Is Better):
  MIT GPL: RTX A2000: 0.425; RTX A4000: 0.400
  Proprietary: RTX A2000: 0.423; RTX A4000: 0.404
IndigoBench 4.4 - GPU Power Consumption Monitor (Watts, Fewer Is Better):
  MIT GPL: RTX A2000: Min 4.84 / Avg 57.35 / Max 65.74; RTX A4000: Min 9.58 / Avg 92.48 / Max 105.24
  Proprietary: RTX A2000: Min 5 / Avg 58.63 / Max 67.57; RTX A4000: Min 14.6 / Avg 94.46 / Max 106.75
IndigoBench 4.4 - GPU Temperature Monitor (Celsius, Fewer Is Better):
  Proprietary: RTX A4000: Min 56 / Avg 67.79 / Max 72; RTX A2000: Min 59 / Avg 72.23 / Max 76
  MIT GPL: RTX A4000: Min 55 / Avg 67.82 / Max 71; RTX A2000: Min 58 / Avg 71.3 / Max 75
IndigoBench 4.4 - Acceleration: OpenCL GPU - Scene: Bedroom (M samples/s, More Is Better):
  Proprietary: RTX A4000: 13.322 (SE +/- 0.002, N = 3); RTX A2000: 8.333 (SE +/- 0.001, N = 3)
  MIT GPL: RTX A4000: 12.882 (SE +/- 0.004, N = 3); RTX A2000: 8.181 (SE +/- 0.003, N = 3)
IndigoBench 4.4 - Acceleration: OpenCL GPU - Scene: Bedroom (M samples/s Per Watt, More Is Better):
  MIT GPL: RTX A2000: 0.136; RTX A4000: 0.130
  Proprietary: RTX A2000: 0.136; RTX A4000: 0.132
IndigoBench 4.4 - GPU Power Consumption Monitor (Watts, Fewer Is Better):
  MIT GPL: RTX A2000: Min 5.13 / Avg 60.22 / Max 67.46; RTX A4000: Min 9.12 / Avg 99.42 / Max 113.23
  Proprietary: RTX A2000: Min 5.08 / Avg 61.2 / Max 68.31; RTX A4000: Min 14.66 / Avg 100.99 / Max 115.19
IndigoBench 4.4 - GPU Temperature Monitor (Celsius, Fewer Is Better):
  Proprietary: RTX A4000: Min 55 / Avg 69.01 / Max 74; RTX A2000: Min 59 / Avg 72.77 / Max 77
  MIT GPL: RTX A4000: Min 55 / Avg 69.08 / Max 73; RTX A2000: Min 58 / Avg 71.9 / Max 76
Blender 4.1 - Blend File: BMW27 - Compute: NVIDIA CUDA (Seconds, Fewer Is Better):
  Proprietary: RTX A4000: 13.30 (SE +/- 0.02, N = 4); RTX A2000: 22.57 (SE +/- 0.01, N = 3)
  MIT GPL: RTX A4000: 13.50 (SE +/- 0.03, N = 4); RTX A2000: 22.76 (SE +/- 0.03, N = 3)
Blender 4.1 - GPU Power Consumption Monitor (Watts, Fewer Is Better):
  MIT GPL: RTX A2000: Min 7.14 / Avg 50.61 / Max 61.74; RTX A4000: Min 9.23 / Avg 79.88 / Max 107.92
  Proprietary: RTX A2000: Min 4.89 / Avg 51.77 / Max 62.98; RTX A4000: Min 14.6 / Avg 81.94 / Max 108.87
Blender 4.1 - GPU Temperature Monitor (Celsius, Fewer Is Better):
  Proprietary: RTX A4000: Min 56 / Avg 65.05 / Max 70; RTX A2000: Min 59 / Avg 70.36 / Max 74
  MIT GPL: RTX A4000: Min 56 / Avg 65.16 / Max 70; RTX A2000: Min 59 / Avg 69.61 / Max 73
Blender 4.1 - Blend File: BMW27 - Compute: NVIDIA OptiX (Seconds, Fewer Is Better):
  Proprietary: RTX A4000: 7.13 (SE +/- 0.01, N = 6); RTX A2000: 10.41 (SE +/- 0.06, N = 15)
  MIT GPL: RTX A4000: 7.39 (SE +/- 0.01, N = 6); RTX A2000: 10.57 (SE +/- 0.01, N = 5)
Blender 4.1 - GPU Power Consumption Monitor (Watts, Fewer Is Better):
  MIT GPL: RTX A2000: Min 6.91 / Avg 46.99 / Max 65.85; RTX A4000: Min 8.63 / Avg 69.74 / Max 107.1
  Proprietary: RTX A2000: Min 4.87 / Avg 48.92 / Max 67.77; RTX A4000: Min 14.19 / Avg 71.07 / Max 108.8
Blender 4.1 - GPU Temperature Monitor (Celsius, Fewer Is Better):
  Proprietary: RTX A4000: Min 53 / Avg 60.19 / Max 65; RTX A2000: Min 56 / Avg 67.45 / Max 72
  MIT GPL: RTX A4000: Min 53 / Avg 60.25 / Max 65; RTX A2000: Min 56 / Avg 65.42 / Max 70
Blender 4.1 - Blend File: Classroom - Compute: NVIDIA CUDA (Seconds, Fewer Is Better):
  Proprietary: RTX A4000: 25.98 (SE +/- 0.03, N = 3); RTX A2000: 44.94 (SE +/- 0.01, N = 3)
  MIT GPL: RTX A4000: 26.34 (SE +/- 0.00, N = 3); RTX A2000: 45.24 (SE +/- 0.02, N = 3)
Blender 4.1 - GPU Power Consumption Monitor (Watts, Fewer Is Better):
  MIT GPL: RTX A2000: Min 6.94 / Avg 58.43 / Max 65.19; RTX A4000: Min 8.42 / Avg 95.45 / Max 113.53
  Proprietary: RTX A2000: Min 4.87 / Avg 59.74 / Max 66.5; RTX A4000: Min 14.18 / Avg 97.57 / Max 114.29
Blender 4.1 - GPU Temperature Monitor (Celsius, Fewer Is Better):
  MIT GPL: RTX A4000: Min 52 / Avg 67.27 / Max 74; RTX A2000: Min 55 / Avg 71.61 / Max 76
  Proprietary: RTX A4000: Min 52 / Avg 67.27 / Max 72; RTX A2000: Min 56 / Avg 72.64 / Max 77
Blender 4.1 - Blend File: Classroom - Compute: NVIDIA OptiX (Seconds, Fewer Is Better):
  Proprietary: RTX A4000: 17.22 (SE +/- 0.03, N = 3); RTX A2000: 27.79 (SE +/- 0.06, N = 3)
  MIT GPL: RTX A4000: 17.61 (SE +/- 0.04, N = 3); RTX A2000: 27.86 (SE +/- 0.03, N = 3)
Blender 4.1 - GPU Power Consumption Monitor (Watts, Fewer Is Better):
  MIT GPL: RTX A2000: Min 7.08 / Avg 59.5 / Max 69.56; RTX A4000: Min 8.93 / Avg 92.63 / Max 117.55
  Proprietary: RTX A2000: Min 5.06 / Avg 59.66 / Max 69.61; RTX A4000: Min 14.23 / Avg 94.28 / Max 118.65
Blender 4.1 - GPU Temperature Monitor (Celsius, Fewer Is Better):
  Proprietary: RTX A4000: Min 54 / Avg 65.63 / Max 72; RTX A2000: Min 57 / Avg 71.78 / Max 76
  MIT GPL: RTX A4000: Min 54 / Avg 65.77 / Max 72; RTX A2000: Min 57 / Avg 71.24 / Max 76
Blender 4.1 - Blend File: Fishy Cat - Compute: NVIDIA CUDA (Seconds, Fewer Is Better):
  Proprietary: RTX A4000: 26.12 (SE +/- 0.01, N = 3); RTX A2000: 47.06 (SE +/- 0.04, N = 3)
  MIT GPL: RTX A4000: 26.30 (SE +/- 0.02, N = 3); RTX A2000: 47.28 (SE +/- 0.03, N = 3)
Blender 4.1 - GPU Power Consumption Monitor (Watts, Fewer Is Better):
  MIT GPL: RTX A2000: Min 7.13 / Avg 48.96 / Max 61.27; RTX A4000: Min 9.96 / Avg 71.27 / Max 104.34
  Proprietary: RTX A2000: Min 4.94 / Avg 49.72 / Max 61.77; RTX A4000: Min 14.25 / Avg 82.27 / Max 105.21
Blender 4.1 - GPU Temperature Monitor (Celsius, Fewer Is Better):
  MIT GPL: RTX A4000: Min 53 / Avg 63.98 / Max 71; RTX A2000: Min 57 / Avg 69.28 / Max 74
  Proprietary: RTX A4000: Min 54 / Avg 65.29 / Max 71; RTX A2000: Min 57 / Avg 70.01 / Max 75
Blender 4.1 - Blend File: Fishy Cat - Compute: NVIDIA OptiX (Seconds, Fewer Is Better):
  Proprietary: RTX A4000: 12.80 (SE +/- 0.02, N = 4); RTX A2000: 22.78 (SE +/- 0.28, N = 3)
  MIT GPL: RTX A4000: 12.96 (SE +/- 0.01, N = 4); RTX A2000: 22.64 (SE +/- 0.02, N = 3)
Blender 4.1 - GPU Power Consumption Monitor (Watts, Fewer Is Better):
  MIT GPL: RTX A2000: Min 6.97 / Avg 51.98 / Max 65.31; RTX A4000: Min 13.76 / Avg 80.4 / Max 111.14
  Proprietary: RTX A2000: Min 7.27 / Avg 52.64 / Max 66.49; RTX A4000: Min 14.19 / Avg 81.85 / Max 112.37
Blender 4.1 - GPU Temperature Monitor (Celsius, Fewer Is Better):
  MIT GPL: RTX A4000: Min 49 / Avg 61.97 / Max 70; RTX A2000: Min 55 / Avg 68.21 / Max 74
  Proprietary: RTX A4000: Min 52 / Avg 63.89 / Max 70; RTX A2000: Min 56 / Avg 69.01 / Max 75
Blender 4.1 - Blend File: Pabellon Barcelona - Compute: NVIDIA CUDA (Seconds, Fewer Is Better):
  Proprietary: RTX A4000: 58.81 (SE +/- 0.02, N = 3); RTX A2000: 108.43 (SE +/- 0.04, N = 3)
  MIT GPL: RTX A4000: 59.25 (SE +/- 0.03, N = 3); RTX A2000: 108.86 (SE +/- 0.02, N = 3)
Blender 4.1 - GPU Power Consumption Monitor (Watts, Fewer Is Better):
  MIT GPL: RTX A2000: Min 7.04 / Avg 57.16 / Max 61.32; RTX A4000: Min 14.19 / Avg 97.07 / Max 110.36
  Proprietary: RTX A2000: Min 7.38 / Avg 58.16 / Max 62.68; RTX A4000: Min 14.15 / Avg 99.6 / Max 111.35
Blender 4.1 - GPU Temperature Monitor (Celsius, Fewer Is Better):
  Proprietary: RTX A4000: Min 52 / Avg 69.64 / Max 74; RTX A2000: Min 56 / Avg 73.36 / Max 76
  MIT GPL: RTX A4000: Min 52 / Avg 69.68 / Max 74; RTX A2000: Min 55 / Avg 72.48 / Max 76
Blender 4.1 - Blend File: Pabellon Barcelona - Compute: NVIDIA OptiX (Seconds, Fewer Is Better):
  Proprietary: RTX A4000: 18.86 (SE +/- 0.00, N = 3); RTX A2000: 30.39 (SE +/- 0.05, N = 3)
  MIT GPL: RTX A4000: 19.33 (SE +/- 0.01, N = 3); RTX A2000: 30.66 (SE +/- 0.02, N = 3)
Blender 4.1 - GPU Power Consumption Monitor (Watts, Fewer Is Better):
  MIT GPL: RTX A2000: Min 6.98 / Avg 58.92 / Max 69.19; RTX A4000: Min 14.4 / Avg 93.59 / Max 119.14
  Proprietary: RTX A2000: Min 7.26 / Avg 59.61 / Max 69.33; RTX A4000: Min 14.41 / Avg 94.89 / Max 120.59
Blender 4.1 - GPU Temperature Monitor (Celsius, Fewer Is Better):
  MIT GPL: RTX A4000: Min 53 / Avg 65.14 / Max 71; RTX A2000: Min 57 / Avg 70.31 / Max 75
  Proprietary: RTX A4000: Min 54 / Avg 65.6 / Max 71; RTX A2000: Min 57 / Avg 70.98 / Max 75
Blender 4.1 - Blend File: Barbershop - Compute: NVIDIA CUDA (Seconds, Fewer Is Better):
  Proprietary: RTX A4000: 105.87 (SE +/- 0.08, N = 3); RTX A2000: 186.23 (SE +/- 0.30, N = 3)
  MIT GPL: RTX A4000: 107.41 (SE +/- 0.06, N = 3); RTX A2000: 187.86 (SE +/- 0.03, N = 3)
Blender 4.1 - GPU Power Consumption Monitor (Watts, Fewer Is Better):
  MIT GPL: RTX A2000: Min 7.13 / Avg 60.01 / Max 64.75; RTX A4000: Min 14.65 / Avg 100.95 / Max 113.97
  Proprietary: RTX A2000: Min 7.44 / Avg 61.26 / Max 66.3; RTX A4000: Min 14.4 / Avg 102.04 / Max 115.51
Blender 4.1 - GPU Temperature Monitor (Celsius, Fewer Is Better):
  Proprietary: RTX A4000: Min 53 / Avg 70.08 / Max 74; RTX A2000: Min 57 / Avg 74.28 / Max 78
  MIT GPL: RTX A4000: Min 53 / Avg 70.31 / Max 75; RTX A2000: Min 56 / Avg 73.38 / Max 76
Blender 4.1 - Blend File: Barbershop - Compute: NVIDIA OptiX (Seconds, Fewer Is Better):
  Proprietary: RTX A4000: 69.88 (SE +/- 0.09, N = 3); RTX A2000: 111.31 (SE +/- 0.07, N = 3)
  MIT GPL: RTX A4000: 71.75 (SE +/- 0.08, N = 3); RTX A2000: 112.84 (SE +/- 0.03, N = 3)
Blender 4.1 - GPU Power Consumption Monitor (Watts, Fewer Is Better):
  MIT GPL: RTX A2000: Min 7.03 / Avg 60.34 / Max 66.04; RTX A4000: Min 14.77 / Avg 98.25 / Max 111.89
  Proprietary: RTX A2000: Min 7.61 / Avg 61.96 / Max 67.74; RTX A4000: Min 14.47 / Avg 99.6 / Max 114.26
Blender 4.1 - GPU Temperature Monitor (Celsius, Fewer Is Better):
  Proprietary: RTX A4000: Min 54 / Avg 68.91 / Max 73; RTX A2000: Min 58 / Avg 74.04 / Max 77
  MIT GPL: RTX A4000: Min 54 / Avg 69.08 / Max 73; RTX A2000: Min 57 / Avg 72.94 / Max 77
Blender 4.1 - Blend File: Junkshop - Compute: NVIDIA CUDA (Seconds, Fewer Is Better):
  Proprietary: RTX A4000: 22.54 (SE +/- 0.02, N = 3); RTX A2000: 37.77 (SE +/- 0.03, N = 3)
  MIT GPL: RTX A4000: 23.08 (SE +/- 0.03, N = 3); RTX A2000: 38.22 (SE +/- 0.04, N = 3)
Blender 4.1 - GPU Power Consumption Monitor (Watts, Fewer Is Better):
  MIT GPL: RTX A2000: Min 7.03 / Avg 47.56 / Max 57.57; RTX A4000: Min 14.84 / Avg 75.59 / Max 100.49
  Proprietary: RTX A2000: Min 7.6 / Avg 48.63 / Max 58.89; RTX A4000: Min 14.67 / Avg 76.87 / Max 102.06
Blender 4.1 - GPU Temperature Monitor (Celsius, Fewer Is Better):
  Proprietary: RTX A4000: Min 55 / Avg 62.88 / Max 67; RTX A2000: Min 59 / Avg 68.09 / Max 72
  MIT GPL: RTX A4000: Min 56 / Avg 63.22 / Max 68; RTX A2000: Min 58 / Avg 67.3 / Max 71
Blender 4.1 - Blend File: Junkshop - Compute: NVIDIA OptiX (Seconds, Fewer Is Better):
  Proprietary: RTX A4000: 14.04 (SE +/- 0.03, N = 4); RTX A2000: 21.59 (SE +/- 0.04, N = 3)
  MIT GPL: RTX A4000: 14.50 (SE +/- 0.02, N = 4); RTX A2000: 22.02 (SE +/- 0.05, N = 3)
Blender 4.1 - GPU Power Consumption Monitor (Watts, Fewer Is Better):
  MIT GPL: RTX A2000: Min 7.06 / Avg 47.52 / Max 60.51; RTX A4000: Min 14.62 / Avg 74.29 / Max 104.43
  Proprietary: RTX A2000: Min 7.23 / Avg 48.44 / Max 62.04; RTX A4000: Min 14.26 / Avg 75.92 / Max 106.15
Blender 4.1 - GPU Temperature Monitor (Celsius, Fewer Is Better):
  Proprietary: RTX A4000: Min 53 / Avg 60.93 / Max 66; RTX A2000: Min 57 / Avg 66.06 / Max 70
  MIT GPL: RTX A4000: Min 54 / Avg 61.36 / Max 66; RTX A2000: Min 56 / Avg 65.32 / Max 69
LuxMark 3.1 - OpenCL Device: GPU - Scene: Luxball HDR (Score, More Is Better):
  Proprietary: RTX A4000: 53209 (SE +/- 303.83, N = 3); RTX A2000: 32886 (SE +/- 92.15, N = 3)
  MIT GPL: RTX A4000: 51531 (SE +/- 15.04, N = 3); RTX A2000: 32501 (SE +/- 15.82, N = 3)
LuxMark 3.1 - OpenCL Device: GPU - Scene: Luxball HDR (Score Per Watt, More Is Better):
  Proprietary: RTX A2000: 490.54; RTX A4000: 472.37
  MIT GPL: RTX A2000: 486.87; RTX A4000: 467.40
LuxMark 3.1 - GPU Power Consumption Monitor (Watts, Fewer Is Better):
  MIT GPL: RTX A2000: Min 7.11 / Avg 66.75 / Max 69.21; RTX A4000: Min 17.87 / Avg 110.25 / Max 114.63
  Proprietary: RTX A2000: Min 7.51 / Avg 67.04 / Max 69.51; RTX A4000: Min 14.35 / Avg 112.64 / Max 116.78
LuxMark 3.1 - GPU Temperature Monitor (Celsius, Fewer Is Better):
  MIT GPL: RTX A4000: Min 54 / Avg 72.14 / Max 75; RTX A2000: Min 56 / Avg 75.16 / Max 78
  Proprietary: RTX A4000: Min 53 / Avg 72.22 / Max 75; RTX A2000: Min 57 / Avg 75.63 / Max 78
LuxMark 3.1 - OpenCL Device: GPU - Scene: Microphone (Score, More Is Better):
  Proprietary: RTX A4000: 39074 (SE +/- 16.65, N = 3); RTX A2000: 25323 (SE +/- 9.61, N = 3)
  MIT GPL: RTX A4000: 38207 (SE +/- 6.51, N = 3); RTX A2000: 25078 (SE +/- 5.04, N = 3)
LuxMark 3.1 - OpenCL Device: GPU - Scene: Microphone (Score Per Watt, More Is Better):
  Proprietary: RTX A2000: 380.43; RTX A4000: 343.78
  MIT GPL: RTX A2000: 377.01; RTX A4000: 341.74
LuxMark 3.1 - GPU Power Consumption Monitor (Watts, Fewer Is Better):
  MIT GPL: RTX A2000: Min 7.24 / Avg 66.52 / Max 69.39; RTX A4000: Min 15.13 / Avg 111.61 / Max 116.72
  Proprietary: RTX A2000: Min 7.49 / Avg 66.56 / Max 69.8; RTX A4000: Min 14.75 / Avg 113.66 / Max 118.61
LuxMark 3.1 - GPU Temperature Monitor (Celsius, Fewer Is Better):
  Proprietary: RTX A4000: Min 57 / Avg 73.03 / Max 75; RTX A2000: Min 60 / Avg 75.75 / Max 78
  MIT GPL: RTX A4000: Min 57 / Avg 73.08 / Max 75; RTX A2000: Min 59 / Avg 75.55 / Max 78
LuxMark 3.1 - OpenCL Device: GPU - Scene: Hotel (Score, More Is Better):
  Proprietary: RTX A4000: 12065 (SE +/- 9.21, N = 3); RTX A2000: 7181 (SE +/- 1.45, N = 3)
  MIT GPL: RTX A4000: 11595 (SE +/- 8.88, N = 3); RTX A2000: 7183 (SE +/- 1.20, N = 3)
LuxMark 3.1 - OpenCL Device: GPU - Scene: Hotel (Score Per Watt, More Is Better):
  Proprietary: RTX A2000: 108.28; RTX A4000: 104.52
  MIT GPL: RTX A2000: 108.20; RTX A4000: 98.39
LuxMark 3.1 - GPU Power Consumption Monitor (Watts, Fewer Is Better):
  Proprietary: RTX A2000: Min 7.59 / Avg 66.32 / Max 69.54; RTX A4000: Min 14.64 / Avg 115.43 / Max 125.3
  MIT GPL: RTX A2000: Min 7.18 / Avg 66.39 / Max 69.71; RTX A4000: Min 15.1 / Avg 117.85 / Max 123.12
LuxMark 3.1 - GPU Temperature Monitor (Celsius, Fewer Is Better):
  Proprietary: RTX A4000: Min 56 / Avg 74.29 / Max 78; RTX A2000: Min 60 / Avg 76.59 / Max 79
  MIT GPL: RTX A4000: Min 57 / Avg 74.85 / Max 78; RTX A2000: Min 59 / Avg 76.26 / Max 79
FluidX3D 2.17 - Test: FP32-FP32 (MLUPs/s, More Is Better):
  Proprietary: RTX A4000: 2159 (SE +/- 0.58, N = 3); RTX A2000: 1371 (SE +/- 0.00, N = 3)
  MIT GPL: RTX A4000: 2049 (SE +/- 0.88, N = 3); RTX A2000: 1330 (SE +/- 0.00, N = 3)
FluidX3D 2.17 - Test: FP32-FP32 (MLUPs/s Per Watt, More Is Better):
  MIT GPL: RTX A2000: 26.47; RTX A4000: 24.36
  Proprietary: RTX A2000: 26.42; RTX A4000: 24.95
FluidX3D 2.17 - GPU Power Consumption Monitor (Watts, Fewer Is Better):
  MIT GPL: RTX A2000: Min 7 / Avg 50.24 / Max 52.08; RTX A4000: Min 14.95 / Avg 84.13 / Max 88.64
  Proprietary: RTX A2000: Min 7.46 / Avg 51.89 / Max 53.87; RTX A4000: Min 14 / Avg 86.53 / Max 91.45
FluidX3D 2.17 - GPU Temperature Monitor (Celsius, Fewer Is Better):
  Proprietary: RTX A4000: Min 51 / Avg 65.16 / Max 68; RTX A2000: Min 59 / Avg 68.95 / Max 70
  MIT GPL: RTX A4000: Min 57 / Avg 65.29 / Max 67; RTX A2000: Min 59 / Avg 67.99 / Max 69
FluidX3D 2.17 - Test: FP32-FP16S (MLUPs/s, More Is Better):
  Proprietary: RTX A4000: 4006 (SE +/- 1.15, N = 3); RTX A2000: 2488 (SE +/- 0.00, N = 3)
  MIT GPL: RTX A4000: 3814 (SE +/- 0.58, N = 3); RTX A2000: 2416 (SE +/- 0.00, N = 3)
FluidX3D 2.17 - Test: FP32-FP16S (MLUPs/s Per Watt, More Is Better):
  MIT GPL: RTX A2000: 43.19; RTX A4000: 41.68
  Proprietary: RTX A2000: 43.07; RTX A4000: 42.87
FluidX3D 2.17 - GPU Power Consumption Monitor (Watts, Fewer Is Better):
  MIT GPL: RTX A2000: Min 6.99 / Avg 55.93 / Max 59.72; RTX A4000: Min 14.82 / Avg 91.51 / Max 100.39
  Proprietary: RTX A2000: Min 7.4 / Avg 57.76 / Max 61.45; RTX A4000: Min 14.43 / Avg 93.46 / Max 101.83
FluidX3D 2.17 - GPU Temperature Monitor (Celsius, Fewer Is Better):
  MIT GPL: RTX A4000: Min 55 / Avg 67.25 / Max 70; RTX A2000: Min 57 / Avg 70.29 / Max 73
  Proprietary: RTX A4000: Min 55 / Avg 67.28 / Max 70; RTX A2000: Min 58 / Avg 71.34 / Max 74
FluidX3D 2.17 - Test: FP32-FP16C (MLUPs/s, More Is Better):
  Proprietary: RTX A4000: 4280 (SE +/- 0.88, N = 3); RTX A2000: 2556 (SE +/- 2.52, N = 3)
  MIT GPL: RTX A4000: 4065 (SE +/- 1.00, N = 3); RTX A2000: 2495 (SE +/- 2.00, N = 3)
FluidX3D 2.17 - Test: FP32-FP16C (MLUPs/s Per Watt, More Is Better):
  Proprietary: RTX A2000: 38.88; RTX A4000: 37.47
  MIT GPL: RTX A2000: 38.10; RTX A4000: 36.84
FluidX3D 2.17 - GPU Power Consumption Monitor (Watts, Fewer Is Better):
  MIT GPL: RTX A2000: Min 7.16 / Avg 65.49 / Max 69.99; RTX A4000: Min 14.98 / Avg 110.35 / Max 121.87
  Proprietary: RTX A2000: Min 7.46 / Avg 65.74 / Max 70.02; RTX A4000: Min 14.73 / Avg 114.21 / Max 126.15
FluidX3D 2.17 - GPU Temperature Monitor (Celsius, Fewer Is Better):
  MIT GPL: RTX A4000: Min 56 / Avg 72.28 / Max 76; RTX A2000: Min 58 / Avg 75.29 / Max 78
  Proprietary: RTX A4000: Min 56 / Avg 72.5 / Max 76; RTX A2000: Min 59 / Avg 75.34 / Max 78
SPECViewPerf 2020 3.0 - Resolution: 1920 x 1080 - Viewset: CATIA-06 (Composite Score, More Is Better):
  Proprietary: RTX A4000: 144.18 (SE +/- 0.45, N = 3); RTX A2000: 83.40 (SE +/- 0.14, N = 3)
  MIT GPL: RTX A4000: 143.45 (SE +/- 1.23, N = 3); RTX A2000: 83.67 (SE +/- 0.13, N = 3)
SPECViewPerf 2020 3.0 - Resolution: 1920 x 1080 - Viewset: CATIA-06 (Composite Score Per Watt, More Is Better):
  Proprietary: RTX A4000: 1.642; RTX A2000: 1.544
  MIT GPL: RTX A4000: 1.627; RTX A2000: 1.541
SPECViewPerf 2020 3.0 - GPU Power Consumption Monitor (Watts, Fewer Is Better):
  Proprietary: RTX A2000: Min 7.55 / Avg 54.03 / Max 69.39; RTX A4000: Min 14.53 / Avg 87.81 / Max 127.55
  MIT GPL: RTX A2000: Min 7.21 / Avg 54.3 / Max 69.36; RTX A4000: Min 14.83 / Avg 88.19 / Max 127.51
SPECViewPerf 2020 3.0 - GPU Temperature Monitor (Celsius, Fewer Is Better):
  Proprietary: RTX A4000: Min 55 / Avg 66.27 / Max 73; RTX A2000: Min 58 / Avg 70.65 / Max 76
  MIT GPL: RTX A4000: Min 56 / Avg 66.77 / Max 74; RTX A2000: Min 58 / Avg 70.52 / Max 76
SPECViewPerf 2020 3.0 - Resolution: 1920 x 1080 - Viewset: CREO-03 (Composite Score, more is better)
  RTX A4000: Proprietary 192.14 (SE ±0.15, N=3) | MIT GPL 192.12 (SE ±0.19, N=3)
  RTX A2000: Proprietary 136.64 (SE ±0.16, N=3) | MIT GPL 136.80 (SE ±0.18, N=3)
SPECViewPerf 2020 3.0 - Resolution: 1920 x 1080 - Viewset: CREO-03 (Composite Score Per Watt, more is better)
  RTX A4000: Proprietary 2.571 | MIT GPL 2.561
  RTX A2000: Proprietary 2.754 | MIT GPL 2.769
SPECViewPerf 2020 3.0 - GPU Power Consumption Monitor (Watts, fewer is better)
  RTX A4000: Proprietary avg 74.74 (min 14.34 / max 116.33) | MIT GPL avg 75.02 (min 14.54 / max 116.94)
  RTX A2000: Proprietary avg 49.62 (min 7.39 / max 69.36) | MIT GPL avg 49.4 (min 7.1 / max 69.57)
SPECViewPerf 2020 3.0 - GPU Temperature Monitor (Celsius, fewer is better)
  RTX A4000: Proprietary avg 62.65 (min 53 / max 71) | MIT GPL avg 63.13 (min 53 / max 71)
  RTX A2000: Proprietary avg 68.61 (min 56 / max 76) | MIT GPL avg 68.04 (min 56 / max 76)
SPECViewPerf 2020 3.0 - Resolution: 1920 x 1080 - Viewset: ENERGY-03 (Composite Score, more is better)
  RTX A4000: Proprietary 98.42 (SE ±0.10, N=3) | MIT GPL 98.09 (SE ±0.02, N=3)
  RTX A2000: Proprietary 51.31 (SE ±0.01, N=3) | MIT GPL 51.51 (SE ±0.05, N=3)
SPECViewPerf 2020 3.0 - Resolution: 1920 x 1080 - Viewset: ENERGY-03 (Composite Score Per Watt, more is better)
  RTX A4000: Proprietary 1.020 | MIT GPL 1.022
  RTX A2000: Proprietary 0.985 | MIT GPL 0.997
SPECViewPerf 2020 3.0 - GPU Power Consumption Monitor (Watts, fewer is better)
  RTX A4000: Proprietary avg 96.51 (min 14.18 / max 130.24) | MIT GPL avg 96.01 (min 14.37 / max 130.43)
  RTX A2000: Proprietary avg 52.11 (min 7.24 / max 69.97) | MIT GPL avg 51.68 (min 6.83 / max 69.92)
SPECViewPerf 2020 3.0 - GPU Temperature Monitor (Celsius, fewer is better)
  RTX A4000: Proprietary avg 69.03 (min 52 / max 79) | MIT GPL avg 69.21 (min 53 / max 79)
  RTX A2000: Proprietary avg 69.86 (min 55 / max 78) | MIT GPL avg 69.37 (min 55 / max 77)
SPECViewPerf 2020 3.0 - Resolution: 1920 x 1080 - Viewset: MAYA-06 (Composite Score, more is better)
  RTX A4000: Proprietary 540.09 (SE ±2.35, N=3) | MIT GPL 539.31 (SE ±2.04, N=3)
  RTX A2000: Proprietary 371.37 (SE ±2.79, N=3) | MIT GPL 375.02 (SE ±0.58, N=3)
SPECViewPerf 2020 3.0 - Resolution: 1920 x 1080 - Viewset: MAYA-06 (Composite Score Per Watt, more is better)
  RTX A4000: Proprietary 6.701 | MIT GPL 6.650
  RTX A2000: Proprietary 7.032 | MIT GPL 7.096
SPECViewPerf 2020 3.0 - GPU Power Consumption Monitor (Watts, fewer is better)
  RTX A4000: Proprietary avg 80.6 (min 14.38 / max 125) | MIT GPL avg 81.1 (min 14.56 / max 125.04)
  RTX A2000: Proprietary avg 52.81 (min 7.31 / max 69.78) | MIT GPL avg 52.85 (min 6.9 / max 69.82)
SPECViewPerf 2020 3.0 - GPU Temperature Monitor (Celsius, fewer is better)
  RTX A4000: Proprietary avg 64.34 (min 53 / max 73) | MIT GPL avg 64.62 (min 54 / max 73)
  RTX A2000: Proprietary avg 70.16 (min 56 / max 77) | MIT GPL avg 69.67 (min 56 / max 77)
SPECViewPerf 2020 3.0 - Resolution: 1920 x 1080 - Viewset: MEDICAL-03 (Composite Score, more is better)
  RTX A4000: Proprietary 142.54 (SE ±0.01, N=3) | MIT GPL 142.75 (SE ±0.04, N=3)
  RTX A2000: Proprietary 103.52 (SE ±0.02, N=3) | MIT GPL 103.80 (SE ±0.08, N=3)
SPECViewPerf 2020 3.0 - Resolution: 1920 x 1080 - Viewset: MEDICAL-03 (Composite Score Per Watt, more is better)
  RTX A4000: Proprietary 1.736 | MIT GPL 1.724
  RTX A2000: Proprietary 1.804 | MIT GPL 1.817
SPECViewPerf 2020 3.0 - GPU Power Consumption Monitor (Watts, fewer is better)
  RTX A4000: Proprietary avg 82.09 (min 14.14 / max 118.6) | MIT GPL avg 82.8 (min 14.5 / max 118.96)
  RTX A2000: Proprietary avg 57.4 (min 7.47 / max 69.72) | MIT GPL avg 57.12 (min 7.13 / max 69.75)
SPECViewPerf 2020 3.0 - GPU Temperature Monitor (Celsius, fewer is better)
  RTX A4000: Proprietary avg 65.76 (min 53 / max 75) | MIT GPL avg 66.01 (min 54 / max 75)
  RTX A2000: Proprietary avg 73.75 (min 57 / max 79) | MIT GPL avg 73.02 (min 57 / max 79)
SPECViewPerf 2020 3.0 - Resolution: 1920 x 1080 - Viewset: SNX-04 (Composite Score, more is better)
  RTX A4000: Proprietary 495.75 (SE ±0.37, N=3) | MIT GPL 496.61 (SE ±0.58, N=3)
  RTX A2000: Proprietary 334.05 (SE ±0.16, N=3) | MIT GPL 334.87 (SE ±0.61, N=3)
SPECViewPerf 2020 3.0 - Resolution: 1920 x 1080 - Viewset: SNX-04 (Composite Score Per Watt, more is better)
  RTX A4000: Proprietary 4.890 | MIT GPL 4.885
  RTX A2000: Proprietary 5.392 | MIT GPL 5.432
SPECViewPerf 2020 3.0 - GPU Power Consumption Monitor (Watts, fewer is better)
  RTX A4000: Proprietary avg 101.39 (min 13.97 / max 127.34) | MIT GPL avg 101.65 (min 14.19 / max 128.07)
  RTX A2000: Proprietary avg 61.96 (min 7.46 / max 70.06) | MIT GPL avg 61.65 (min 7.01 / max 69.87)
SPECViewPerf 2020 3.0 - GPU Temperature Monitor (Celsius, fewer is better)
  RTX A4000: Proprietary avg 70.3 (min 51 / max 76) | MIT GPL avg 70.35 (min 51 / max 76)
  RTX A2000: Proprietary avg 74.88 (min 57 / max 78) | MIT GPL avg 74.26 (min 57 / max 78)
SPECViewPerf 2020 3.0 - Resolution: 1920 x 1080 - Viewset: SOLIDWORKS-07 (Composite Score, more is better)
  RTX A4000: Proprietary 336.92 (SE ±0.57, N=3) | MIT GPL 340.81 (SE ±0.09, N=3)
  RTX A2000: Proprietary 200.59 (SE ±0.17, N=3) | MIT GPL 200.81 (SE ±0.09, N=3)
SPECViewPerf 2020 3.0 - Resolution: 1920 x 1080 - Viewset: SOLIDWORKS-07 (Composite Score Per Watt, more is better)
  RTX A4000: Proprietary 3.179 | MIT GPL 3.182
  RTX A2000: Proprietary 3.119 | MIT GPL 3.134
SPECViewPerf 2020 3.0 - GPU Power Consumption Monitor (Watts, fewer is better)
  RTX A4000: Proprietary avg 105.97 (min 14.35 / max 118.99) | MIT GPL avg 107.11 (min 14.57 / max 118.58)
  RTX A2000: Proprietary avg 64.3 (min 7.54 / max 69.84) | MIT GPL avg 64.08 (min 7 / max 69.67)
SPECViewPerf 2020 3.0 - GPU Temperature Monitor (Celsius, fewer is better)
  RTX A4000: Proprietary avg 71.86 (min 54 / max 78) | MIT GPL avg 72.35 (min 54 / max 77)
  RTX A2000: Proprietary avg 76.69 (min 58 / max 81) | MIT GPL avg 76.27 (min 57 / max 81)
SPECViewPerf 2020 3.0 - Resolution: 2560 x 1440 - Viewset: CATIA-06 (Composite Score, more is better)
  RTX A4000: Proprietary 144.40 (SE ±0.26, N=3) | MIT GPL 144.74 (SE ±0.26, N=3)
  RTX A2000: Proprietary 83.43 (SE ±0.09, N=3) | MIT GPL 83.56 (SE ±0.01, N=3)
SPECViewPerf 2020 3.0 - Resolution: 2560 x 1440 - Viewset: CATIA-06 (Composite Score Per Watt, more is better)
  RTX A4000: Proprietary 1.644 | MIT GPL 1.636
  RTX A2000: Proprietary 1.532 | MIT GPL 1.540
SPECViewPerf 2020 3.0 - GPU Power Consumption Monitor (Watts, fewer is better)
  RTX A4000: Proprietary avg 87.84 (min 14.4 / max 127.38) | MIT GPL avg 88.5 (min 14.69 / max 127.57)
  RTX A2000: Proprietary avg 54.45 (min 7.41 / max 69.52) | MIT GPL avg 54.27 (min 7.09 / max 69.15)
SPECViewPerf 2020 3.0 - GPU Temperature Monitor (Celsius, fewer is better)
  RTX A4000: Proprietary avg 66.8 (min 54 / max 74) | MIT GPL avg 67.14 (min 55 / max 74)
  RTX A2000: Proprietary avg 71.02 (min 58 / max 76) | MIT GPL avg 70.52 (min 57 / max 75)
SPECViewPerf 2020 3.0 - Resolution: 2560 x 1440 - Viewset: CREO-03 (Composite Score, more is better)
  RTX A4000: Proprietary 160.57 (SE ±0.16, N=3) | MIT GPL 160.53 (SE ±0.07, N=3)
  RTX A2000: Proprietary 111.02 (SE ±0.09, N=3) | MIT GPL 111.11 (SE ±0.09, N=3)
SPECViewPerf 2020 3.0 - Resolution: 2560 x 1440 - Viewset: CREO-03 (Composite Score Per Watt, more is better)
  RTX A4000: Proprietary 2.060 | MIT GPL 2.051
  RTX A2000: Proprietary 2.187 | MIT GPL 2.206
SPECViewPerf 2020 3.0 - GPU Power Consumption Monitor (Watts, fewer is better)
  RTX A4000: Proprietary avg 77.95 (min 14.22 / max 119.73) | MIT GPL avg 78.28 (min 14.65 / max 119.94)
  RTX A2000: Proprietary avg 50.76 (min 7.35 / max 69.64) | MIT GPL avg 50.36 (min 7.06 / max 69.6)
SPECViewPerf 2020 3.0 - GPU Temperature Monitor (Celsius, fewer is better)
  RTX A4000: Proprietary avg 63.88 (min 54 / max 74) | MIT GPL avg 64.14 (min 54 / max 73)
  RTX A2000: Proprietary avg 69.19 (min 57 / max 77) | MIT GPL avg 68.55 (min 56 / max 76)
SPECViewPerf 2020 3.0 - Resolution: 2560 x 1440 - Viewset: ENERGY-03 (Composite Score, more is better)
  RTX A4000: Proprietary 62.60 (SE ±0.06, N=3) | MIT GPL 62.47 (SE ±0.06, N=3)
  RTX A2000: Proprietary 31.75 (SE ±0.02, N=3) | MIT GPL 31.80 (SE ±0.02, N=3)
SPECViewPerf 2020 3.0 - Resolution: 2560 x 1440 - Viewset: ENERGY-03 (Composite Score Per Watt, more is better)
  RTX A4000: Proprietary 0.656 | MIT GPL 0.655
  RTX A2000: Proprietary 0.610 | MIT GPL 0.615
SPECViewPerf 2020 3.0 - GPU Power Consumption Monitor (Watts, fewer is better)
  RTX A4000: Proprietary avg 95.36 (min 14.11 / max 130.53) | MIT GPL avg 95.4 (min 14.29 / max 131.07)
  RTX A2000: Proprietary avg 52.06 (min 7.26 / max 69.95) | MIT GPL avg 51.69 (min 7.02 / max 69.86)
SPECViewPerf 2020 3.0 - GPU Temperature Monitor (Celsius, fewer is better)
  RTX A4000: Proprietary avg 69.68 (min 53 / max 79) | MIT GPL avg 69.8 (min 55 / max 78)
  RTX A2000: Proprietary avg 70.42 (min 55 / max 78) | MIT GPL avg 69.88 (min 53 / max 79)
SPECViewPerf 2020 3.0 - Resolution: 2560 x 1440 - Viewset: MAYA-06 (Composite Score, more is better)
  RTX A4000: Proprietary 416.35 (SE ±1.59, N=3) | MIT GPL 414.44 (SE ±1.14, N=3)
  RTX A2000: Proprietary 273.51 (SE ±0.58, N=3) | MIT GPL 273.74 (SE ±0.39, N=3)
SPECViewPerf 2020 3.0 - Resolution: 2560 x 1440 - Viewset: MAYA-06 (Composite Score Per Watt, more is better)
  RTX A4000: Proprietary 4.862 | MIT GPL 4.816
  RTX A2000: Proprietary 5.005 | MIT GPL 5.015
SPECViewPerf 2020 3.0 - GPU Power Consumption Monitor (Watts, fewer is better)
  RTX A4000: Proprietary avg 85.63 (min 14.28 / max 123.49) | MIT GPL avg 86.05 (min 14.52 / max 123.65)
  RTX A2000: Proprietary avg 54.65 (min 7.22 / max 69.72) | MIT GPL avg 54.58 (min 7.11 / max 69.93)
SPECViewPerf 2020 3.0 - GPU Temperature Monitor (Celsius, fewer is better)
  RTX A4000: Proprietary avg 66.03 (min 54 / max 73) | MIT GPL avg 66.38 (min 54 / max 74)
  RTX A2000: Proprietary avg 70.7 (min 56 / max 77) | MIT GPL avg 70.51 (min 56 / max 77)
SPECViewPerf 2020 3.0 - Resolution: 2560 x 1440 - Viewset: MEDICAL-03 (Composite Score, more is better)
  RTX A4000: Proprietary 97.00 (SE ±0.01, N=3) | MIT GPL 97.22 (SE ±0.01, N=3)
  RTX A2000: Proprietary 65.69 (SE ±0.02, N=3) | MIT GPL 65.84 (SE ±0.01, N=3)
SPECViewPerf 2020 3.0 - Resolution: 2560 x 1440 - Viewset: MEDICAL-03 (Composite Score Per Watt, more is better)
  RTX A4000: Proprietary 1.128 | MIT GPL 1.119
  RTX A2000: Proprietary 1.130 | MIT GPL 1.130
SPECViewPerf 2020 3.0 - GPU Power Consumption Monitor (Watts, fewer is better)
  RTX A4000: Proprietary avg 85.98 (min 14.34 / max 123.58) | MIT GPL avg 86.87 (min 14.67 / max 123.79)
  RTX A2000: Proprietary avg 58.15 (min 7.49 / max 69.77) | MIT GPL avg 58.24 (min 7.39 / max 69.81)
SPECViewPerf 2020 3.0 - GPU Temperature Monitor (Celsius, fewer is better)
  RTX A4000: Proprietary avg 67.54 (min 54 / max 76) | MIT GPL avg 67.95 (min 55 / max 77)
  RTX A2000: Proprietary avg 74.27 (min 58 / max 80) | MIT GPL avg 74.14 (min 57 / max 79)
SPECViewPerf 2020 3.0 - Resolution: 2560 x 1440 - Viewset: SNX-04 (Composite Score, more is better)
  RTX A4000: Proprietary 421.61 (SE ±1.18, N=3) | MIT GPL 423.50 (SE ±0.17, N=3)
  RTX A2000: Proprietary 286.27 (SE ±0.06, N=3) | MIT GPL 287.19 (SE ±0.36, N=3)
SPECViewPerf 2020 3.0 - Resolution: 2560 x 1440 - Viewset: SNX-04 (Composite Score Per Watt, more is better)
  RTX A4000: Proprietary 4.162 | MIT GPL 4.147
  RTX A2000: Proprietary 4.590 | MIT GPL 4.611
SPECViewPerf 2020 3.0 - GPU Power Consumption Monitor (Watts, fewer is better)
  RTX A4000: Proprietary avg 101.31 (min 14.05 / max 127.07) | MIT GPL avg 102.11 (min 14.26 / max 127.86)
  RTX A2000: Proprietary avg 62.37 (min 7.38 / max 69.77) | MIT GPL avg 62.29 (min 7.27 / max 69.81)
SPECViewPerf 2020 3.0 - GPU Temperature Monitor (Celsius, fewer is better)
  RTX A4000: Proprietary avg 70.53 (min 52 / max 76) | MIT GPL avg 70.88 (min 52 / max 76)
  RTX A2000: Proprietary avg 74.79 (min 57 / max 78) | MIT GPL avg 74.49 (min 57 / max 78)
SPECViewPerf 2020 3.0 - Resolution: 2560 x 1440 - Viewset: SOLIDWORKS-07 (Composite Score, more is better)
  RTX A4000: Proprietary 304.80 (SE ±0.20, N=3) | MIT GPL 308.53 (SE ±0.17, N=3)
  RTX A2000: Proprietary 183.88 (SE ±0.09, N=3) | MIT GPL 184.18 (SE ±0.14, N=3)
SPECViewPerf 2020 3.0 - Resolution: 2560 x 1440 - Viewset: SOLIDWORKS-07 (Composite Score Per Watt, more is better)
  RTX A4000: Proprietary 2.970 | MIT GPL 2.973
  RTX A2000: Proprietary 2.900 | MIT GPL 2.913
SPECViewPerf 2020 3.0 - GPU Power Consumption Monitor (Watts, fewer is better)
  RTX A4000: Proprietary avg 102.64 (min 14.44 / max 118.53) | MIT GPL avg 103.79 (min 14.6 / max 118.25)
  RTX A2000: Proprietary avg 63.41 (min 7.47 / max 69.83) | MIT GPL avg 63.22 (min 7.18 / max 69.85)
SPECViewPerf 2020 3.0 - GPU Temperature Monitor (Celsius, fewer is better)
  RTX A4000: Proprietary avg 71.16 (min 54 / max 78) | MIT GPL avg 71.83 (min 55 / max 77)
  RTX A2000: Proprietary avg 76.17 (min 58 / max 81) | MIT GPL avg 75.87 (min 58 / max 81)
Unigine Superposition 1.0 - Resolution: 1920 x 1080 - Mode: Fullscreen - Quality: Low - Renderer: OpenGL (Frames Per Second, more is better)
  RTX A4000: Proprietary 323.3 (SE ±0.46, N=3) | MIT GPL 324.6 (SE ±0.62, N=3)
  RTX A2000: Proprietary 208.3 (SE ±0.31, N=3) | MIT GPL 208.9 (SE ±0.21, N=3)
Unigine Superposition 1.0 - Resolution: 1920 x 1080 - Mode: Fullscreen - Quality: Low - Renderer: OpenGL (Frames Per Second Per Watt, more is better)
  RTX A4000: Proprietary 3.278 | MIT GPL 3.259
  RTX A2000: Proprietary 3.268 | MIT GPL 3.300
Unigine Superposition 1.0 - GPU Power Consumption Monitor (Watts, fewer is better)
  RTX A4000: Proprietary avg 98.63 (min 14.4 / max 114.11) | MIT GPL avg 99.6 (min 14.62 / max 114.79)
  RTX A2000: Proprietary avg 63.74 (min 7.1 / max 69.95) | MIT GPL avg 63.31 (min 6.93 / max 69.98)
Unigine Superposition 1.0 - GPU Temperature Monitor (Celsius, fewer is better)
  RTX A4000: Proprietary avg 70.8 (min 52 / max 75) | MIT GPL avg 71.38 (min 52 / max 76)
  RTX A2000: Proprietary avg 75.65 (min 55 / max 79) | MIT GPL avg 75.29 (min 55 / max 79)
Unigine Superposition 1.0 - Resolution: 1920 x 1080 - Mode: Fullscreen - Quality: Medium - Renderer: OpenGL (Frames Per Second, more is better)
  RTX A4000: Proprietary 178.9 (SE ±0.03, N=3) | MIT GPL 179.1 (SE ±0.09, N=3)
  RTX A2000: Proprietary 105.2 (SE ±0.12, N=3) | MIT GPL 105.5 (SE ±0.06, N=3)
Unigine Superposition 1.0 - Resolution: 1920 x 1080 - Mode: Fullscreen - Quality: Medium - Renderer: OpenGL (Frames Per Second Per Watt, more is better)
  RTX A4000: Proprietary 1.659 | MIT GPL 1.654
  RTX A2000: Proprietary 1.643 | MIT GPL 1.663
Unigine Superposition 1.0 - GPU Power Consumption Monitor (Watts, fewer is better)
  RTX A4000: Proprietary avg 107.85 (min 14.52 / max 118.76) | MIT GPL avg 108.27 (min 14.95 / max 119.23)
  RTX A2000: Proprietary avg 64.05 (min 7.23 / max 69.98) | MIT GPL avg 63.46 (min 7.23 / max 70.01)
Unigine Superposition 1.0 - GPU Temperature Monitor (Celsius, fewer is better)
  RTX A4000: Proprietary avg 73.72 (min 53 / max 77) | MIT GPL avg 74.11 (min 54 / max 78)
  RTX A2000: Proprietary avg 76.3 (min 57 / max 79) | MIT GPL avg 75.87 (min 57 / max 79)
Unigine Superposition 1.0 - Resolution: 1920 x 1080 - Mode: Fullscreen - Quality: High - Renderer: OpenGL (Frames Per Second, more is better)
  RTX A4000: Proprietary 128.5 (SE ±0.09, N=3) | MIT GPL 128.6 (SE ±0.00, N=3)
  RTX A2000: Proprietary 75.0 (SE ±0.00, N=3) | MIT GPL 75.2 (SE ±0.03, N=3)
Unigine Superposition 1.0 - Resolution: 1920 x 1080 - Mode: Fullscreen - Quality: High - Renderer: OpenGL (Frames Per Second Per Watt, more is better)
  RTX A4000: Proprietary 1.171 | MIT GPL 1.199
  RTX A2000: Proprietary 1.180 | MIT GPL 1.185
Unigine Superposition 1.0 - GPU Power Consumption Monitor (Watts, fewer is better)
  RTX A4000: Proprietary avg 109.76 (min 14.47 / max 120.33) | MIT GPL avg 107.3 (min 14.52 / max 120.82)
  RTX A2000: Proprietary avg 63.55 (min 7.34 / max 69.68) | MIT GPL avg 63.43 (min 7.09 / max 69.68)
Unigine Superposition 1.0 - GPU Temperature Monitor (Celsius, fewer is better)
  RTX A4000: Proprietary avg 74.17 (min 53 / max 78) | MIT GPL avg 73.58 (min 52 / max 78)
  RTX A2000: Proprietary avg 75.89 (min 57 / max 79) | MIT GPL avg 75.77 (min 57 / max 79)
Unigine Superposition 1.0 - Resolution: 1920 x 1080 - Mode: Fullscreen - Quality: Ultra - Renderer: OpenGL (Frames Per Second, more is better)
  RTX A4000: Proprietary 51.2 (SE ±0.00, N=3) | MIT GPL 51.1 (SE ±0.10, N=3)
  RTX A2000: Proprietary 30.9 (SE ±0.03, N=3) | MIT GPL 31.1 (SE ±0.00, N=3)
Unigine Superposition 1.0 - Resolution: 1920 x 1080 - Mode: Fullscreen - Quality: Ultra - Renderer: OpenGL (Frames Per Second Per Watt, more is better)
  RTX A4000: Proprietary 0.442 | MIT GPL 0.445
  RTX A2000: Proprietary 0.488 | MIT GPL 0.519
Unigine Superposition 1.0 - GPU Power Consumption Monitor (Watts, fewer is better)
  RTX A4000: Proprietary avg 115.71 (min 14.67 / max 126.92) | MIT GPL avg 114.83 (min 14.78 / max 127.02)
  RTX A2000: Proprietary avg 63.28 (min 7.29 / max 69.78) | MIT GPL avg 59.92 (min 6.99 / max 69.9)
Unigine Superposition 1.0 - GPU Temperature Monitor (Celsius, fewer is better)
  RTX A4000: Proprietary avg 76.22 (min 57 / max 80) | MIT GPL avg 76.05 (min 53 / max 80)
  RTX A2000: Proprietary avg 76.1 (min 53 / max 80) | MIT GPL avg 74.77 (min 55 / max 79)
Unigine Superposition 1.0 - Resolution: 2560 x 1440 - Mode: Fullscreen - Quality: Low - Renderer: OpenGL (Frames Per Second, more is better)
  RTX A4000: Proprietary 242.0 (SE ±0.09, N=3) | MIT GPL 242.0 (SE ±0.12, N=3)
  RTX A2000: Proprietary 142.8 (SE ±0.17, N=3) | MIT GPL 143.2 (SE ±0.20, N=3)
Unigine Superposition 1.0 - Resolution: 2560 x 1440 - Mode: Fullscreen - Quality: Low - Renderer: OpenGL (Frames Per Second Per Watt, more is better)
  RTX A4000: Proprietary 2.237 | MIT GPL 2.224
  RTX A2000: Proprietary 2.215 | MIT GPL 2.252
Unigine Superposition 1.0 - GPU Power Consumption Monitor (Watts, fewer is better)
  RTX A4000: Proprietary avg 108.17 (min 14.57 / max 119.48) | MIT GPL avg 108.83 (min 14.89 / max 119.71)
  RTX A2000: Proprietary avg 64.47 (min 7.46 / max 69.91) | MIT GPL avg 63.6 (min 6.69 / max 70.02)
Unigine Superposition 1.0 - GPU Temperature Monitor (Celsius, fewer is better)
  RTX A4000: Proprietary avg 73.54 (min 53 / max 77) | MIT GPL avg 73.93 (min 53 / max 78)
  RTX A2000: Proprietary avg 75.99 (min 57 / max 79) | MIT GPL avg 75.16 (min 54 / max 79)
Unigine Superposition 1.0 - Resolution: 2560 x 1440 - Mode: Fullscreen - Quality: Medium - Renderer: OpenGL (Frames Per Second, more is better)
  RTX A4000: Proprietary 114.6 (SE ±0.06, N=3) | MIT GPL 114.6 (SE ±0.06, N=3)
  RTX A2000: Proprietary 65.7 (SE ±0.03, N=3) | MIT GPL 66.0 (SE ±0.07, N=3)
Unigine Superposition 1.0 - Resolution: 2560 x 1440 - Mode: Fullscreen - Quality: Medium - Renderer: OpenGL (Frames Per Second Per Watt, more is better)
  RTX A4000: Proprietary 1.042 | MIT GPL 1.046
  RTX A2000: Proprietary 1.027 | MIT GPL 1.052
Unigine Superposition 1.0 - GPU Power Consumption Monitor (Watts, fewer is better)
  RTX A4000: Proprietary avg 109.97 (min 14.38 / max 120.46) | MIT GPL avg 109.61 (min 15.02 / max 120.91)
  RTX A2000: Proprietary avg 63.98 (min 7.5 / max 69.64) | MIT GPL avg 62.75 (min 6.97 / max 69.82)
Unigine Superposition 1.0 - GPU Temperature Monitor (Celsius, fewer is better)
  RTX A4000: Proprietary avg 74.13 (min 54 / max 77) | MIT GPL avg 74.37 (min 54 / max 78)
  RTX A2000: Proprietary avg 76.38 (min 57 / max 80) | MIT GPL avg 75.65 (min 56 / max 79)
Unigine Superposition 1.0 - Resolution: 2560 x 1440 - Mode: Fullscreen - Quality: High - Renderer: OpenGL (Frames Per Second, more is better)
  RTX A4000: Proprietary 77.4 (SE ±0.03, N=3) | MIT GPL 77.4 (SE ±0.03, N=3)
  RTX A2000: Proprietary 44.2 (SE ±0.03, N=3) | MIT GPL 44.4 (SE ±0.03, N=3)
Unigine Superposition 1.0 - Resolution: 2560 x 1440 - Mode: Fullscreen - Quality: High - Renderer: OpenGL (Frames Per Second Per Watt, more is better)
  RTX A4000: Proprietary 0.693 | MIT GPL 0.690
  RTX A2000: Proprietary 0.693 | MIT GPL 0.696
Unigine Superposition 1.0 - GPU Power Consumption Monitor (Watts, fewer is better)
  RTX A4000: Proprietary avg 111.75 (min 14.63 / max 122.85) | MIT GPL avg 112.11 (min 14.83 / max 122.87)
  RTX A2000: Proprietary avg 63.8 (min 7.43 / max 70.03) | MIT GPL avg 63.77 (min 7.2 / max 69.96)
Unigine Superposition 1.0 - GPU Temperature Monitor (Celsius, fewer is better)
  RTX A4000: Proprietary avg 74.19 (min 53 / max 78) | MIT GPL avg 74.87 (min 54 / max 78)
  RTX A2000: Proprietary avg 76.3 (min 57 / max 80) | MIT GPL avg 76.16 (min 57 / max 79)
Unigine Superposition 1.0 - Resolution: 2560 x 1440 - Mode: Fullscreen - Quality: Ultra - Renderer: OpenGL (Frames Per Second, more is better)
  RTX A4000: Proprietary 29.6 (SE ±0.03, N=3) | MIT GPL 29.6 (SE ±0.00, N=3)
  RTX A2000: Proprietary 17.9 (SE ±0.00, N=3) | MIT GPL 17.9 (SE ±0.03, N=3)
Unigine Superposition 1.0 - Resolution: 2560 x 1440 - Mode: Fullscreen - Quality: Ultra - Renderer: OpenGL (Frames Per Second Per Watt, more is better)
  RTX A4000: Proprietary 0.259 | MIT GPL 0.261
  RTX A2000: Proprietary 0.283 | MIT GPL 0.292
Unigine Superposition 1.0 - GPU Power Consumption Monitor (Watts, fewer is better)
  RTX A4000: Proprietary avg 114.36 (min 14.59 / max 126.44) | MIT GPL avg 113.51 (min 15 / max 126.47)
  RTX A2000: Proprietary avg 63.22 (min 7.47 / max 69.65) | MIT GPL avg 61.31 (min 7.06 / max 69.68)
Unigine Superposition 1.0 - GPU Temperature Monitor (Celsius, fewer is better)
  RTX A4000: Proprietary avg 75.19 (min 53 / max 79) | MIT GPL avg 75.61 (min 54 / max 80)
  RTX A2000: Proprietary avg 77.17 (min 57 / max 81) | MIT GPL avg 75.93 (min 55 / max 80)
GPUScore: Breaking Limit 1.0 - Resolution: 1920 x 1080 - Ray-Tracing: On (FPS, more is better)
  RTX A4000: Proprietary 61.98 (SE ±0.12, N=3) | MIT GPL 61.89 (SE ±0.06, N=3)
  RTX A2000: Proprietary 33.09 (SE ±0.05, N=3) | MIT GPL 33.23 (SE ±0.03, N=3)
GPUScore: Breaking Limit 1.0 - Resolution: 1920 x 1080 - Ray-Tracing: On (FPS Per Watt, more is better)
  RTX A4000: Proprietary 0.602 | MIT GPL 0.603
  RTX A2000: Proprietary 0.535 | MIT GPL 0.536
GPUScore: Breaking Limit 1.0 - GPU Power Consumption Monitor (Watts, fewer is better)
  RTX A4000: Proprietary avg 102.9 (min 14.44 / max 125.69) | MIT GPL avg 102.71 (min 14.84 / max 126.05)
  RTX A2000: Proprietary avg 61.86 (min 7.33 / max 69.51) | MIT GPL avg 61.96 (min 7.13 / max 69.51)
GPUScore: Breaking Limit 1.0 - GPU Temperature Monitor (Celsius, fewer is better)
  RTX A4000: Proprietary avg 71.12 (min 56 / max 78) | MIT GPL avg 71.72 (min 56 / max 78)
  RTX A2000: Proprietary avg 75.27 (min 59 / max 80) | MIT GPL avg 74.97 (min 59 / max 79)
GPUScore: Breaking Limit 1.0 - Resolution: 1920 x 1080 - Ray-Tracing: Off (FPS, more is better)
  RTX A4000: Proprietary 181.40 (SE ±0.40, N=4) | MIT GPL 181.16 (SE ±0.43, N=4)
  RTX A2000: Proprietary 102.86 (SE ±0.19, N=3) | MIT GPL 103.55 (SE ±0.24, N=3)
GPUScore: Breaking Limit 1.0 - Resolution: 1920 x 1080 - Ray-Tracing: Off (FPS Per Watt, more is better)
  RTX A4000: Proprietary 2.204 | MIT GPL 2.213
  RTX A2000: Proprietary 1.921 | MIT GPL 1.923
GPUScore: Breaking Limit 1.0 - GPU Power Consumption Monitor (Watts, fewer is better)
  RTX A4000: Proprietary avg 82.32 (min 14.55 / max 120.3) | MIT GPL avg 81.88 (min 14.69 / max 120.39)
  RTX A2000: Proprietary avg 53.55 (min 7.37 / max 69.86) | MIT GPL avg 53.84 (min 7.11 / max 69.67)
GPUScore: Breaking Limit 1.0 - GPU Temperature Monitor (Celsius, fewer is better)
  RTX A4000: Proprietary avg 65.14 (min 55 / max 72) | MIT GPL avg 65.61 (min 56 / max 72)
  RTX A2000: Proprietary avg 70.81 (min 59 / max 76) | MIT GPL avg 70.26 (min 58 / max 76)
GPUScore: Breaking Limit 1.0 - Resolution: 2560 x 1440 - Ray-Tracing: On (FPS, more is better)
  RTX A4000: Proprietary 39.19 (SE ±0.09, N=3) | MIT GPL 39.17 (SE ±0.08, N=3)
  RTX A2000: Proprietary 20.74 (SE ±0.07, N=3) | MIT GPL 20.85 (SE ±0.02, N=3)
GPUScore: Breaking Limit 1.0 - Resolution: 2560 x 1440 - Ray-Tracing: On (FPS Per Watt, more is better)
  RTX A4000: Proprietary 0.356 | MIT GPL 0.354
  RTX A2000: Proprietary 0.324 | MIT GPL 0.326
GPUScore: Breaking Limit 1.0 - GPU Power Consumption Monitor (Watts, fewer is better)
  RTX A4000: Proprietary avg 110.16 (min 14.19 / max 125.34) | MIT GPL avg 110.65 (min 14.59 / max 125.65)
  RTX A2000: Proprietary avg 63.92 (min 7.29 / max 69.25) | MIT GPL avg 63.86 (min 7.05 / max 69.02)
GPUScore: Breaking Limit 1.0 - GPU Temperature Monitor (Celsius, fewer is better)
  RTX A4000: Proprietary avg 72.93 (min 53 / max 78) | MIT GPL avg 73.65 (min 54 / max 79)
  RTX A2000: Proprietary avg 75.99 (min 57 / max 80) | MIT GPL avg 75.51 (min 57 / max 80)
GPUScore: Breaking Limit 1.0 - Resolution: 2560 x 1440 - Ray-Tracing: Off (FPS; more is better)
  RTX A4000: Proprietary 125.55 (SE +/- 0.43, N = 3), MIT GPL 125.39 (SE +/- 0.26, N = 3)
  RTX A2000: Proprietary 70.22 (SE +/- 0.27, N = 3), MIT GPL 70.63 (SE +/- 0.14, N = 3)

GPUScore: Breaking Limit 1.0 - Resolution: 2560 x 1440 - Ray-Tracing: Off (FPS Per Watt; more is better)
  RTX A4000: Proprietary 1.378, MIT GPL 1.390
  RTX A2000: Proprietary 1.221, MIT GPL 1.233

GPUScore: Breaking Limit 1.0 - GPU Power Consumption Monitor (Watts; fewer is better)
  RTX A2000: Proprietary Min 7.5 / Avg 57.49 / Max 69.6, MIT GPL Min 7 / Avg 57.3 / Max 69.86
  RTX A4000: Proprietary Min 14.64 / Avg 91.1 / Max 122.61, MIT GPL Min 14.92 / Avg 90.21 / Max 122.98

GPUScore: Breaking Limit 1.0 - GPU Temperature Monitor (Celsius; fewer is better)
  RTX A4000: Proprietary Min 55 / Avg 67.19 / Max 74, MIT GPL Min 56 / Avg 67.6 / Max 74
  RTX A2000: Proprietary Min 59 / Avg 72.6 / Max 78, MIT GPL Min 58 / Avg 72.15 / Max 77
ParaView 5.10.1 - Test: Many Spheres - Resolution: 1920 x 1080 (Frames / Sec; more is better)
  RTX A4000: Proprietary 101.92 (SE +/- 0.03, N = 3), MIT GPL 101.88 (SE +/- 0.02, N = 3)
  RTX A2000: Proprietary 49.86 (SE +/- 0.03, N = 3), MIT GPL 49.87 (SE +/- 0.01, N = 3)

ParaView 5.10.1 - Test: Many Spheres - Resolution: 1920 x 1080 (MiPolys / Sec; more is better)
  RTX A4000: Proprietary 10217.67 (SE +/- 2.93, N = 3), MIT GPL 10214.16 (SE +/- 2.02, N = 3)
  RTX A2000: Proprietary 4998.43 (SE +/- 2.90, N = 3), MIT GPL 4999.66 (SE +/- 0.83, N = 3)

ParaView 5.10.1 - Test: Many Spheres - Resolution: 1920 x 1080 (MiPolys / Sec Per Watt; more is better)
  RTX A4000: Proprietary 220.57, MIT GPL 219.73
  RTX A2000: Proprietary 150.60, MIT GPL 152.65

ParaView 5.10.1 - GPU Power Consumption Monitor (Watts; fewer is better)
  RTX A2000: Proprietary Min 7.21 / Avg 33.19 / Max 67.5, MIT GPL Min 6.89 / Avg 32.75 / Max 67.38
  RTX A4000: Proprietary Min 14.12 / Avg 46.32 / Max 124.16, MIT GPL Min 14.13 / Avg 46.49 / Max 124.43

ParaView 5.10.1 - GPU Temperature Monitor (Celsius; fewer is better)
  RTX A4000: Proprietary Min 51 / Avg 54.89 / Max 62, MIT GPL Min 52 / Avg 55.48 / Max 63
  RTX A2000: Proprietary Min 55 / Avg 60.78 / Max 69, MIT GPL Min 54 / Avg 60.13 / Max 68
ParaView 5.10.1 - Test: Many Spheres - Resolution: 2560 x 1440 (Frames / Sec; more is better)
  RTX A4000: Proprietary 98.36 (SE +/- 0.08, N = 3), MIT GPL 98.41 (SE +/- 0.03, N = 3)
  RTX A2000: Proprietary 48.77 (SE +/- 0.02, N = 3), MIT GPL 48.81 (SE +/- 0.01, N = 3)

ParaView 5.10.1 - Test: Many Spheres - Resolution: 2560 x 1440 (MiPolys / Sec; more is better)
  RTX A4000: Proprietary 9861.26 (SE +/- 8.11, N = 3), MIT GPL 9865.54 (SE +/- 2.84, N = 3)
  RTX A2000: Proprietary 4889.88 (SE +/- 2.13, N = 3), MIT GPL 4893.83 (SE +/- 1.06, N = 3)

ParaView 5.10.1 - Test: Many Spheres - Resolution: 2560 x 1440 (MiPolys / Sec Per Watt; more is better)
  RTX A4000: Proprietary 208.85, MIT GPL 205.12
  RTX A2000: Proprietary 144.75, MIT GPL 145.26

ParaView 5.10.1 - GPU Power Consumption Monitor (Watts; fewer is better)
  RTX A2000: Proprietary Min 7.01 / Avg 33.78 / Max 69, MIT GPL Min 6.79 / Avg 33.69 / Max 69.02
  RTX A4000: Proprietary Min 13.55 / Avg 47.22 / Max 126.07, MIT GPL Min 13.99 / Avg 48.1 / Max 126.69

ParaView 5.10.1 - GPU Temperature Monitor (Celsius; fewer is better)
  RTX A4000: Proprietary Min 48 / Avg 52.67 / Max 61, MIT GPL Min 49 / Avg 53.74 / Max 62
  RTX A2000: Proprietary Min 52 / Avg 58.8 / Max 68, MIT GPL Min 52 / Avg 58.42 / Max 67
ParaView 5.10.1 - Test: Wavelet Contour - Resolution: 1920 x 1080 (Frames / Sec; more is better)
  RTX A4000: Proprietary 448.72 (SE +/- 1.31, N = 6), MIT GPL 448.11 (SE +/- 0.58, N = 6)
  RTX A2000: Proprietary 253.24 (SE +/- 0.28, N = 5), MIT GPL 253.15 (SE +/- 0.33, N = 5)

ParaView 5.10.1 - Test: Wavelet Contour - Resolution: 1920 x 1080 (MiPolys / Sec; more is better)
  RTX A4000: Proprietary 4676.21 (SE +/- 13.61, N = 6), MIT GPL 4669.86 (SE +/- 6.02, N = 6)
  RTX A2000: Proprietary 2639.03 (SE +/- 2.93, N = 5), MIT GPL 2638.14 (SE +/- 3.45, N = 5)

ParaView 5.10.1 - Test: Wavelet Contour - Resolution: 1920 x 1080 (MiPolys / Sec Per Watt; more is better)
  RTX A4000: Proprietary 128.73, MIT GPL 126.45
  RTX A2000: Proprietary 112.33, MIT GPL 113.61

ParaView 5.10.1 - GPU Power Consumption Monitor (Watts; fewer is better)
  RTX A2000: Proprietary Min 7.17 / Avg 23.49 / Max 69.95, MIT GPL Min 6.71 / Avg 23.22 / Max 69.89
  RTX A4000: Proprietary Min 13.6 / Avg 36.32 / Max 125.2, MIT GPL Min 14.01 / Avg 36.93 / Max 124.95

ParaView 5.10.1 - GPU Temperature Monitor (Celsius; fewer is better)
  RTX A4000: Proprietary Min 47 / Avg 49.4 / Max 58, MIT GPL Min 48 / Avg 50.59 / Max 58
  RTX A2000: Proprietary Min 51 / Avg 54.81 / Max 64, MIT GPL Min 51 / Avg 54.62 / Max 64
ParaView 5.10.1 - Test: Wavelet Contour - Resolution: 2560 x 1440 (Frames / Sec; more is better)
  RTX A4000: Proprietary 370.36 (SE +/- 0.44, N = 5), MIT GPL 368.43 (SE +/- 0.22, N = 5)
  RTX A2000: Proprietary 203.83 (SE +/- 0.11, N = 5), MIT GPL 204.16 (SE +/- 0.10, N = 5)

ParaView 5.10.1 - Test: Wavelet Contour - Resolution: 2560 x 1440 (MiPolys / Sec; more is better)
  RTX A4000: Proprietary 3859.58 (SE +/- 4.63, N = 5), MIT GPL 3839.49 (SE +/- 2.33, N = 5)
  RTX A2000: Proprietary 2124.14 (SE +/- 1.11, N = 5), MIT GPL 2127.57 (SE +/- 1.04, N = 5)

ParaView 5.10.1 - Test: Wavelet Contour - Resolution: 2560 x 1440 (MiPolys / Sec Per Watt; more is better)
  RTX A4000: Proprietary 100.85, MIT GPL 101.19
  RTX A2000: Proprietary 83.85, MIT GPL 85.86

ParaView 5.10.1 - GPU Power Consumption Monitor (Watts; fewer is better)
  RTX A2000: Proprietary Min 6.87 / Avg 25.33 / Max 69.71, MIT GPL Min 6.52 / Avg 24.78 / Max 69.43
  RTX A4000: Proprietary Min 13.38 / Avg 38.27 / Max 124.72, MIT GPL Min 13.77 / Avg 37.94 / Max 123.93

ParaView 5.10.1 - GPU Temperature Monitor (Celsius; fewer is better)
  RTX A4000: Proprietary Min 45 / Avg 47.54 / Max 56, MIT GPL Min 46 / Avg 48.7 / Max 56
  RTX A2000: Proprietary Min 49 / Avg 53.72 / Max 63, MIT GPL Min 49 / Avg 53 / Max 62
ParaView 5.10.1 - Test: Wavelet Volume - Resolution: 1920 x 1080 (Frames / Sec; more is better)
  RTX A4000: Proprietary 701.98 (SE +/- 5.11, N = 7), MIT GPL 701.45 (SE +/- 3.18, N = 7)
  RTX A2000: Proprietary 448.44 (SE +/- 2.91, N = 7), MIT GPL 448.76 (SE +/- 2.55, N = 7)

ParaView 5.10.1 - Test: Wavelet Volume - Resolution: 1920 x 1080 (MiVoxels / Sec; more is better)
  RTX A4000: Proprietary 11231.59 (SE +/- 81.78, N = 7), MIT GPL 11223.16 (SE +/- 50.88, N = 7)
  RTX A2000: Proprietary 7175.02 (SE +/- 46.50, N = 7), MIT GPL 7180.14 (SE +/- 40.83, N = 7)

ParaView 5.10.1 - Test: Wavelet Volume - Resolution: 1920 x 1080 (MiVoxels / Sec Per Watt; more is better)
  RTX A2000: Proprietary 414.12, MIT GPL 423.87
  RTX A4000: Proprietary 404.77, MIT GPL 399.35

ParaView 5.10.1 - GPU Power Consumption Monitor (Watts; fewer is better)
  RTX A2000: Proprietary Min 6.74 / Avg 17.33 / Max 56.53, MIT GPL Min 6.53 / Avg 16.94 / Max 56.9
  RTX A4000: Proprietary Min 13.16 / Avg 27.75 / Max 73.82, MIT GPL Min 13.52 / Avg 28.1 / Max 74.36

ParaView 5.10.1 - GPU Temperature Monitor (Celsius; fewer is better)
  RTX A4000: Proprietary Min 43 / Avg 44.37 / Max 50, MIT GPL Min 44 / Avg 45.65 / Max 50
  RTX A2000: Proprietary Min 47 / Avg 50.02 / Max 58, MIT GPL Min 47 / Avg 49.47 / Max 57
ParaView 5.10.1 - Test: Wavelet Volume - Resolution: 2560 x 1440 (Frames / Sec; more is better)
  RTX A4000: Proprietary 513.10 (SE +/- 1.00, N = 7), MIT GPL 503.14 (SE +/- 3.22, N = 7)
  RTX A2000: Proprietary 301.50 (SE +/- 0.59, N = 6), MIT GPL 301.44 (SE +/- 0.52, N = 6)

ParaView 5.10.1 - Test: Wavelet Volume - Resolution: 2560 x 1440 (MiVoxels / Sec; more is better)
  RTX A4000: Proprietary 8209.65 (SE +/- 16.02, N = 7), MIT GPL 8050.29 (SE +/- 51.48, N = 7)
  RTX A2000: Proprietary 4823.93 (SE +/- 9.42, N = 6), MIT GPL 4823.03 (SE +/- 8.36, N = 6)

ParaView 5.10.1 - Test: Wavelet Volume - Resolution: 2560 x 1440 (MiVoxels / Sec Per Watt; more is better)
  RTX A4000: Proprietary 268.85, MIT GPL 263.73
  RTX A2000: Proprietary 244.64, MIT GPL 247.92

ParaView 5.10.1 - GPU Power Consumption Monitor (Watts; fewer is better)
  RTX A2000: Proprietary Min 6.8 / Avg 19.72 / Max 56.6, MIT GPL Min 6.59 / Avg 19.45 / Max 56.44
  RTX A4000: Proprietary Min 13.07 / Avg 30.54 / Max 93.96, MIT GPL Min 13.44 / Avg 30.52 / Max 92.07

ParaView 5.10.1 - GPU Temperature Monitor (Celsius; fewer is better)
  RTX A4000: Proprietary Min 41 / Avg 43.26 / Max 51, MIT GPL Min 42 / Avg 44.47 / Max 52
  RTX A2000: Proprietary Min 46 / Avg 48.97 / Max 57, MIT GPL Min 45 / Avg 48.21 / Max 56
GPU Power Consumption Monitor - Phoronix Test Suite System Monitoring (Watts)
  RTX A2000: Proprietary Min 4.55 / Avg 56.48 / Max 70.06, MIT GPL Min 4.29 / Avg 56.02 / Max 70.02
  RTX A4000: Proprietary Min 6.67 / Avg 92.95 / Max 130.53, MIT GPL Min 6.55 / Avg 92.65 / Max 131.07

GPU Temperature Monitor - Phoronix Test Suite System Monitoring (Celsius)
  RTX A4000: Proprietary Min 31 / Avg 68.43 / Max 80, MIT GPL Min 29 / Avg 68.65 / Max 80
  RTX A2000: Proprietary Min 32 / Avg 72.27 / Max 81, MIT GPL Min 43 / Avg 71.84 / Max 81
Phoronix Test Suite v10.8.5