RTX Ada Linux Kernel Driver Benchmarks: the NVIDIA RTX A2000 and RTX A4000 tested with the open-source MIT/GPL kernel modules versus the proprietary kernel modules of the NVIDIA R555 (555.58.02) Linux driver. Benchmarks by Michael Larabel for a future article.
HTML result view exported from: https://openbenchmarking.org/result/2407084-PTS-RTXADAKE17.
Test System Configuration (common to all four test runs unless noted)

Processor: AMD Ryzen 9 7950X 16-Core @ 5.88GHz (16 Cores / 32 Threads)
Motherboard: ASUS ROG STRIX X670E-E GAMING WIFI (2007 BIOS)
Chipset: AMD Device 14d8
Memory: 2 x 16GB DRAM-6000MT/s G Skill F5-6000J3038F16G
Disk: Western Digital WD_BLACK SN850X 2000GB + 64GB Flash Drive
Graphics: NVIDIA RTX 2000 Ada Generation 16GB (RTX A2000 runs); NVIDIA RTX 4000 Ada Generation 20GB (RTX A4000 runs)
Audio: NVIDIA Device 22be (RTX A2000 runs); NVIDIA Device 22bc (RTX A4000 runs)
Monitor: DELL U2723QE
Network: Intel I225-V + Intel Wi-Fi 6 AX210/AX211/AX411
OS: Ubuntu 22.04
Kernel: 6.5.0-41-generic (x86_64)
Desktop: GNOME Shell 42.9
Display Server: X Server 1.21.1.4
Display Driver: NVIDIA 555.58.02
OpenGL: 4.6.0
OpenCL: OpenCL 3.0 CUDA 12.5.85
Vulkan: 1.3.278
Compiler: GCC 12.3.0
File-System: ext4
Screen Resolution: 3840x2160

Kernel Details: nouveau.modeset=0 - Transparent Huge Pages: madvise

Compiler Details: --build=x86_64-linux-gnu --disable-vtable-verify --disable-werror --enable-cet --enable-checking=release --enable-clocale=gnu --enable-default-pie --enable-gnu-unique-object --enable-languages=c,ada,c++,go,d,fortran,objc,obj-c++,m2 --enable-libphobos-checking=release --enable-libstdcxx-debug --enable-libstdcxx-time=yes --enable-multiarch --enable-multilib --enable-nls --enable-objc-gc=auto --enable-offload-defaulted --enable-offload-targets=nvptx-none=/build/gcc-12-ALHxjy/gcc-12-12.3.0/debian/tmp-nvptx/usr,amdgcn-amdhsa=/build/gcc-12-ALHxjy/gcc-12-12.3.0/debian/tmp-gcn/usr --enable-plugin --enable-shared --enable-threads=posix --host=x86_64-linux-gnu --program-prefix=x86_64-linux-gnu- --target=x86_64-linux-gnu --with-abi=m64 --with-arch-32=i686 --with-default-libstdcxx-abi=new --with-gcc-major-version-only --with-multilib-list=m32,m64,mx32 --with-target-system-zlib=auto --with-tune=generic --without-cuda-driver -v

Processor Details: Scaling Governor: amd-pstate-epp powersave (EPP: balance_performance) - CPU Microcode: 0xa601206

Graphics Details:
- Proprietary RTX A2000: BAR1 / Visible vRAM Size: 16384 MiB - vBIOS Version: 95.07.47.00.05
- Proprietary RTX A4000: BAR1 / Visible vRAM Size: 32768 MiB - vBIOS Version: 95.04.5c.00.0d
- MIT GPL RTX A2000: BAR1 / Visible vRAM Size: 16384 MiB - vBIOS Version: 95.07.47.00.05
- MIT GPL RTX A4000: BAR1 / Visible vRAM Size: 32768 MiB - vBIOS Version: 95.04.5c.00.0d

OpenCL Details:
- Proprietary RTX A2000: GPU Compute Cores: 2816
- Proprietary RTX A4000: GPU Compute Cores: 6144
- MIT GPL RTX A2000: GPU Compute Cores: 2816
- MIT GPL RTX A4000: GPU Compute Cores: 6144

Python Details: Python 3.10.12

Security Details:
- gather_data_sampling: Not affected
- itlb_multihit: Not affected
- l1tf: Not affected
- mds: Not affected
- meltdown: Not affected
- mmio_stale_data: Not affected
- retbleed: Not affected
- spec_rstack_overflow: Mitigation of Safe RET
- spec_store_bypass: Mitigation of SSB disabled via prctl
- spectre_v1: Mitigation of usercopy/swapgs barriers and __user pointer sanitization
- spectre_v2: Mitigation of Enhanced / Automatic IBRS; IBPB: conditional; STIBP: always-on; RSB filling; PBRSB-eIBRS: Not affected; BHI: Not affected
- srbds: Not affected
- tsx_async_abort: Not affected
Result Summary. Blender results are render times in seconds (fewer is better); all other results are scores or rates (more is better). Each ParaView scene appears twice in the export because the test reports two metrics; the Frames / Sec versus MiPolys / Sec labels below are inferred from the magnitudes.

| Test | Proprietary RTX A2000 | Proprietary RTX A4000 | MIT GPL RTX A2000 | MIT GPL RTX A4000 |
|---|---|---|---|---|
| v-ray: NVIDIA CUDA GPU | 1653 | 2340 | 1642 | 2340 |
| v-ray: NVIDIA RTX GPU | 2037 | 2832 | 2020 | 2927 |
| octanebench: Total Score | 284.830693 | 495.030899 | 280.903124 | 482.879547 |
| indigobench: OpenCL GPU - Supercar | 24.802 | 38.125 | 24.396 | 36.981 |
| indigobench: OpenCL GPU - Bedroom | 8.333 | 13.322 | 8.181 | 12.882 |
| blender: BMW27 - NVIDIA CUDA | 22.57 | 13.30 | 22.76 | 13.50 |
| blender: BMW27 - NVIDIA OptiX | 10.41 | 7.13 | 10.57 | 7.39 |
| blender: Classroom - NVIDIA CUDA | 44.94 | 25.98 | 45.24 | 26.34 |
| blender: Classroom - NVIDIA OptiX | 27.79 | 17.22 | 27.86 | 17.61 |
| blender: Fishy Cat - NVIDIA CUDA | 47.06 | 26.12 | 47.28 | 26.30 |
| blender: Fishy Cat - NVIDIA OptiX | 22.78 | 12.80 | 22.64 | 12.96 |
| blender: Pabellon Barcelona - NVIDIA CUDA | 108.43 | 58.81 | 108.86 | 59.25 |
| blender: Pabellon Barcelona - NVIDIA OptiX | 30.39 | 18.86 | 30.66 | 19.33 |
| blender: Barbershop - NVIDIA CUDA | 186.23 | 105.87 | 187.86 | 107.41 |
| blender: Barbershop - NVIDIA OptiX | 111.31 | 69.88 | 112.84 | 71.75 |
| blender: Junkshop - NVIDIA CUDA | 37.77 | 22.54 | 38.22 | 23.08 |
| blender: Junkshop - NVIDIA OptiX | 21.59 | 14.04 | 22.02 | 14.50 |
| luxmark: GPU - Luxball HDR | 32886 | 53209 | 32501 | 51531 |
| luxmark: GPU - Microphone | 25323 | 39074 | 25078 | 38207 |
| luxmark: GPU - Hotel | 7181 | 12065 | 7183 | 11595 |
| fluidx3d: FP32-FP32 | 1371 | 2159 | 1330 | 2049 |
| fluidx3d: FP32-FP16S | 2488 | 4006 | 2416 | 3814 |
| fluidx3d: FP32-FP16C | 2556 | 4280 | 2495 | 4065 |
| specviewperf2020: 1920 x 1080 - CATIA-06 | 83.40 | 144.18 | 83.67 | 143.45 |
| specviewperf2020: 1920 x 1080 - CREO-03 | 136.64 | 192.14 | 136.80 | 192.12 |
| specviewperf2020: 1920 x 1080 - ENERGY-03 | 51.31 | 98.42 | 51.51 | 98.09 |
| specviewperf2020: 1920 x 1080 - MAYA-06 | 371.37 | 540.09 | 375.02 | 539.31 |
| specviewperf2020: 1920 x 1080 - MEDICAL-03 | 103.52 | 142.54 | 103.80 | 142.75 |
| specviewperf2020: 1920 x 1080 - SNX-04 | 334.05 | 495.75 | 334.87 | 496.61 |
| specviewperf2020: 1920 x 1080 - SOLIDWORKS-07 | 200.59 | 336.92 | 200.81 | 340.81 |
| specviewperf2020: 2560 x 1440 - CATIA-06 | 83.43 | 144.40 | 83.56 | 144.74 |
| specviewperf2020: 2560 x 1440 - CREO-03 | 111.02 | 160.57 | 111.11 | 160.53 |
| specviewperf2020: 2560 x 1440 - ENERGY-03 | 31.75 | 62.60 | 31.80 | 62.47 |
| specviewperf2020: 2560 x 1440 - MAYA-06 | 273.51 | 416.35 | 273.74 | 414.44 |
| specviewperf2020: 2560 x 1440 - MEDICAL-03 | 65.69 | 97.00 | 65.84 | 97.22 |
| specviewperf2020: 2560 x 1440 - SNX-04 | 286.27 | 421.61 | 287.19 | 423.50 |
| specviewperf2020: 2560 x 1440 - SOLIDWORKS-07 | 183.88 | 304.80 | 184.18 | 308.53 |
| unigine-super: 1920 x 1080 - Fullscreen - Low - OpenGL | 208.3 | 323.3 | 208.9 | 324.6 |
| unigine-super: 1920 x 1080 - Fullscreen - Medium - OpenGL | 105.2 | 178.9 | 105.5 | 179.1 |
| unigine-super: 1920 x 1080 - Fullscreen - High - OpenGL | 75 | 128.5 | 75.2 | 128.6 |
| unigine-super: 1920 x 1080 - Fullscreen - Ultra - OpenGL | 30.9 | 51.2 | 31.1 | 51.1 |
| unigine-super: 2560 x 1440 - Fullscreen - Low - OpenGL | 142.8 | 242.0 | 143.2 | 242.0 |
| unigine-super: 2560 x 1440 - Fullscreen - Medium - OpenGL | 65.7 | 114.6 | 66.0 | 114.6 |
| unigine-super: 2560 x 1440 - Fullscreen - High - OpenGL | 44.2 | 77.4 | 44.4 | 77.4 |
| unigine-super: 2560 x 1440 - Fullscreen - Ultra - OpenGL | 17.9 | 29.6 | 17.9 | 29.6 |
| breaking-limit: 1920 x 1080 - On | 33.09 | 61.98 | 33.23 | 61.89 |
| breaking-limit: 1920 x 1080 - Off | 102.86 | 181.40 | 103.55 | 181.16 |
| breaking-limit: 2560 x 1440 - On | 20.74 | 39.19 | 20.85 | 39.17 |
| breaking-limit: 2560 x 1440 - Off | 70.22 | 125.55 | 70.63 | 125.39 |
| paraview: Many Spheres - 1920 x 1080 (Frames / Sec) | 49.86 | 101.92 | 49.87 | 101.88 |
| paraview: Many Spheres - 1920 x 1080 (MiPolys / Sec) | 4998.433 | 10217.673 | 4999.660 | 10214.160 |
| paraview: Many Spheres - 2560 x 1440 (Frames / Sec) | 48.77 | 98.36 | 48.81 | 98.41 |
| paraview: Many Spheres - 2560 x 1440 (MiPolys / Sec) | 4889.876 | 9861.257 | 4893.831 | 9865.540 |
| paraview: Wavelet Contour - 1920 x 1080 (Frames / Sec) | 253.24 | 448.72 | 253.15 | 448.11 |
| paraview: Wavelet Contour - 1920 x 1080 (MiPolys / Sec) | 2639.025 | 4676.206 | 2638.139 | 4669.861 |
| paraview: Wavelet Contour - 2560 x 1440 (Frames / Sec) | 203.83 | 370.36 | 204.16 | 368.43 |
| paraview: Wavelet Contour - 2560 x 1440 (MiPolys / Sec) | 2124.142 | 3859.581 | 2127.565 | 3839.488 |
| paraview: Wavelet Volume - 1920 x 1080 (Frames / Sec) | 448.44 | 701.98 | 448.76 | 701.45 |
| paraview: Wavelet Volume - 1920 x 1080 (MiPolys / Sec) | 7175.023 | 11231.590 | 7180.140 | 11223.158 |
| paraview: Wavelet Volume - 2560 x 1440 (Frames / Sec) | 301.50 | 513.10 | 301.44 | 503.14 |
| paraview: Wavelet Volume - 2560 x 1440 (MiPolys / Sec) | 4823.934 | 8209.650 | 4823.032 | 8050.292 |
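A convenient way to read the summary above is as a relative delta between the two kernel driver flavors per test. The following illustrative Python sketch is not part of the original result file; the function name is my own, and the handful of hardcoded values are copied from the RTX A4000 columns above:

```python
# Relative difference of the MIT GPL (open) kernel modules versus the
# proprietary kernel modules. Positive means the open modules were faster.
# Values hardcoded from the RTX A4000 columns of the result summary.

# test name -> (proprietary result, MIT GPL result, higher is better?)
tests = {
    "V-RAY NVIDIA RTX GPU (vpaths)": (2832, 2927, True),
    "OctaneBench Total Score": (495.03, 482.88, True),
    "Blender BMW27 CUDA (seconds)": (13.30, 13.50, False),
    "LuxMark Luxball HDR (score)": (53209, 51531, True),
}

def open_vs_proprietary_pct(prop: float, open_: float,
                            higher_better: bool) -> float:
    """Percent advantage of the open modules; positive = open faster."""
    delta = (open_ - prop) / prop if higher_better else (prop - open_) / prop
    return round(delta * 100, 2)

for name, (prop, open_, hib) in tests.items():
    print(f"{name}: {open_vs_proprietary_pct(prop, open_, hib):+.2f}%")
```

Across these rows the two driver flavors land within a few percent of each other in either direction.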
Chaos Group V-RAY 6.0 - Mode: NVIDIA CUDA GPU

| Metric | Proprietary RTX A2000 | Proprietary RTX A4000 | MIT GPL RTX A2000 | MIT GPL RTX A4000 |
|---|---|---|---|---|
| vpaths (more is better) | 1653 (SE +/- 7.00, N = 3) | 2340 (SE +/- 0.00, N = 3) | 1642 (SE +/- 3.67, N = 3) | 2340 (SE +/- 0.00, N = 3) |
| vpaths per Watt (more is better) | 32.41 | 30.94 | 32.46 | 31.99 |
| GPU power, Watts (min / avg / max) | 4.55 / 51.01 / 68.75 | 6.67 / 75.64 / 109.6 | 4.29 / 50.58 / 67.64 | 6.55 / 73.14 / 108.42 |
| GPU temperature, Celsius (min / avg / max) | 32 / 59.69 / 74 | 31 / 57.08 / 72 | 43 / 64.23 / 75 | 29 / 55.71 / 71 |
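The "per Watt" figures in these results are the raw score divided by the average GPU power draw logged during the run. A minimal Python sketch of that derivation (the helper name is my own; values hardcoded from the V-RAY NVIDIA CUDA GPU results above):

```python
# Performance-per-watt as derived in these results: raw score divided by
# the average GPU power draw during the run.
# Values copied from the V-RAY "NVIDIA CUDA GPU" results above.

def per_watt(score: float, avg_watts: float) -> float:
    """Return score per Watt, rounded to two decimals."""
    return round(score / avg_watts, 2)

results = {
    # config: (vpaths score, average GPU power in Watts)
    "Proprietary RTX A2000": (1653, 51.01),
    "Proprietary RTX A4000": (2340, 75.64),
    "MIT GPL RTX A2000": (1642, 50.58),
    "MIT GPL RTX A4000": (2340, 73.14),
}

for config, (score, watts) in results.items():
    print(f"{config}: {per_watt(score, watts)} vpaths/Watt")
```

Running this reproduces the 32.41 / 30.94 / 32.46 / 31.99 vpaths-per-Watt figures shown for this test.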
Chaos Group V-RAY 6.0 - Mode: NVIDIA RTX GPU

| Metric | Proprietary RTX A2000 | Proprietary RTX A4000 | MIT GPL RTX A2000 | MIT GPL RTX A4000 |
|---|---|---|---|---|
| vpaths (more is better) | 2037 (SE +/- 3.33, N = 3) | 2832 (SE +/- 29.16, N = 3) | 2020 (SE +/- 7.00, N = 3) | 2927 (SE +/- 10.33, N = 3) |
| vpaths per Watt (more is better) | 44.30 | 37.13 | 43.53 | 37.94 |
| GPU power, Watts (min / avg / max) | 4.97 / 45.98 / 63.95 | 14.35 / 76.28 / 101.35 | 4.49 / 46.41 / 61.88 | 7.78 / 77.16 / 102.86 |
| GPU temperature, Celsius (min / avg / max) | 53 / 65.03 / 72 | 49 / 62.14 / 69 | 54 / 65.65 / 72 | 47 / 61.91 / 69 |
OctaneBench 2020.1 - Total Score

| Metric | Proprietary RTX A2000 | Proprietary RTX A4000 | MIT GPL RTX A2000 | MIT GPL RTX A4000 |
|---|---|---|---|---|
| Score (more is better) | 284.83 | 495.03 | 280.90 | 482.88 |
| Score per Watt (more is better) | 4.225 | 4.224 | 4.237 | 4.208 |
| GPU power, Watts (min / avg / max) | 4.84 / 67.42 / 69.96 | 17.94 / 117.19 / 128.9 | 4.76 / 66.29 / 69.87 | 8.63 / 114.76 / 127.42 |
| GPU temperature, Celsius (min / avg / max) | 56 / 75.38 / 79 | 54 / 72.97 / 78 | 56 / 74.94 / 79 | 53 / 72.57 / 79 |
IndigoBench 4.4 - Acceleration: OpenCL GPU - Scene: Supercar

| Metric | Proprietary RTX A2000 | Proprietary RTX A4000 | MIT GPL RTX A2000 | MIT GPL RTX A4000 |
|---|---|---|---|---|
| M samples/s (more is better) | 24.80 (SE +/- 0.00, N = 3) | 38.13 (SE +/- 0.01, N = 3) | 24.40 (SE +/- 0.03, N = 3) | 36.98 (SE +/- 0.00, N = 3) |
| M samples/s per Watt (more is better) | 0.423 | 0.404 | 0.425 | 0.400 |
| GPU power, Watts (min / avg / max) | 5 / 58.63 / 67.57 | 14.6 / 94.46 / 106.75 | 4.84 / 57.35 / 65.74 | 9.58 / 92.48 / 105.24 |
| GPU temperature, Celsius (min / avg / max) | 59 / 72.23 / 76 | 56 / 67.79 / 72 | 58 / 71.3 / 75 | 55 / 67.82 / 71 |
IndigoBench 4.4 - Acceleration: OpenCL GPU - Scene: Bedroom

| Metric | Proprietary RTX A2000 | Proprietary RTX A4000 | MIT GPL RTX A2000 | MIT GPL RTX A4000 |
|---|---|---|---|---|
| M samples/s (more is better) | 8.333 (SE +/- 0.001, N = 3) | 13.322 (SE +/- 0.002, N = 3) | 8.181 (SE +/- 0.003, N = 3) | 12.882 (SE +/- 0.004, N = 3) |
| M samples/s per Watt (more is better) | 0.136 | 0.132 | 0.136 | 0.130 |
| GPU power, Watts (min / avg / max) | 5.08 / 61.2 / 68.31 | 14.66 / 100.99 / 115.19 | 5.13 / 60.22 / 67.46 | 9.12 / 99.42 / 113.23 |
| GPU temperature, Celsius (min / avg / max) | 59 / 72.77 / 77 | 55 / 69.01 / 74 | 58 / 71.9 / 76 | 55 / 69.08 / 73 |
Blender 4.1 - Blend File: BMW27 - Compute: NVIDIA CUDA

| Metric | Proprietary RTX A2000 | Proprietary RTX A4000 | MIT GPL RTX A2000 | MIT GPL RTX A4000 |
|---|---|---|---|---|
| Seconds (fewer is better) | 22.57 (SE +/- 0.01, N = 3) | 13.30 (SE +/- 0.02, N = 4) | 22.76 (SE +/- 0.03, N = 3) | 13.50 (SE +/- 0.03, N = 4) |
| GPU power, Watts (min / avg / max) | 4.89 / 51.77 / 62.98 | 14.6 / 81.94 / 108.87 | 7.14 / 50.61 / 61.74 | 9.23 / 79.88 / 107.92 |
| GPU temperature, Celsius (min / avg / max) | 59 / 70.36 / 74 | 56 / 65.05 / 70 | 59 / 69.61 / 73 | 56 / 65.16 / 70 |
Blender 4.1 - Blend File: BMW27 - Compute: NVIDIA OptiX

| Metric | Proprietary RTX A2000 | Proprietary RTX A4000 | MIT GPL RTX A2000 | MIT GPL RTX A4000 |
|---|---|---|---|---|
| Seconds (fewer is better) | 10.41 (SE +/- 0.06, N = 15) | 7.13 (SE +/- 0.01, N = 6) | 10.57 (SE +/- 0.01, N = 5) | 7.39 (SE +/- 0.01, N = 6) |
| GPU power, Watts (min / avg / max) | 4.87 / 48.92 / 67.77 | 14.19 / 71.07 / 108.8 | 6.91 / 46.99 / 65.85 | 8.63 / 69.74 / 107.1 |
| GPU temperature, Celsius (min / avg / max) | 56 / 67.45 / 72 | 53 / 60.19 / 65 | 56 / 65.42 / 70 | 53 / 60.25 / 65 |
Blender 4.1 - Blend File: Classroom - Compute: NVIDIA CUDA

| Metric | Proprietary RTX A2000 | Proprietary RTX A4000 | MIT GPL RTX A2000 | MIT GPL RTX A4000 |
|---|---|---|---|---|
| Seconds (fewer is better) | 44.94 (SE +/- 0.01, N = 3) | 25.98 (SE +/- 0.03, N = 3) | 45.24 (SE +/- 0.02, N = 3) | 26.34 (SE +/- 0.00, N = 3) |
| GPU power, Watts (min / avg / max) | 4.87 / 59.74 / 66.5 | 14.18 / 97.57 / 114.29 | 6.94 / 58.43 / 65.19 | 8.42 / 95.45 / 113.53 |
| GPU temperature, Celsius (min / avg / max) | 56 / 72.64 / 77 | 52 / 67.27 / 72 | 55 / 71.61 / 76 | 52 / 67.27 / 74 |
Blender 4.1 - Blend File: Classroom - Compute: NVIDIA OptiX

| Metric | Proprietary RTX A2000 | Proprietary RTX A4000 | MIT GPL RTX A2000 | MIT GPL RTX A4000 |
|---|---|---|---|---|
| Seconds (fewer is better) | 27.79 (SE +/- 0.06, N = 3) | 17.22 (SE +/- 0.03, N = 3) | 27.86 (SE +/- 0.03, N = 3) | 17.61 (SE +/- 0.04, N = 3) |
| GPU power, Watts (min / avg / max) | 5.06 / 59.66 / 69.61 | 14.23 / 94.28 / 118.65 | 7.08 / 59.5 / 69.56 | 8.93 / 92.63 / 117.55 |
| GPU temperature, Celsius (min / avg / max) | 57 / 71.78 / 76 | 54 / 65.63 / 72 | 57 / 71.24 / 76 | 54 / 65.77 / 72 |
Blender 4.1 - Blend File: Fishy Cat - Compute: NVIDIA CUDA

| Metric | Proprietary RTX A2000 | Proprietary RTX A4000 | MIT GPL RTX A2000 | MIT GPL RTX A4000 |
|---|---|---|---|---|
| Seconds (fewer is better) | 47.06 (SE +/- 0.04, N = 3) | 26.12 (SE +/- 0.01, N = 3) | 47.28 (SE +/- 0.03, N = 3) | 26.30 (SE +/- 0.02, N = 3) |
| GPU power, Watts (min / avg / max) | 4.94 / 49.72 / 61.77 | 14.25 / 82.27 / 105.21 | 7.13 / 48.96 / 61.27 | 9.96 / 71.27 / 104.34 |
| GPU temperature, Celsius (min / avg / max) | 57 / 70.01 / 75 | 54 / 65.29 / 71 | 57 / 69.28 / 74 | 53 / 63.98 / 71 |
Blender 4.1 - Blend File: Fishy Cat - Compute: NVIDIA OptiX

| Metric | Proprietary RTX A2000 | Proprietary RTX A4000 | MIT GPL RTX A2000 | MIT GPL RTX A4000 |
|---|---|---|---|---|
| Seconds (fewer is better) | 22.78 (SE +/- 0.28, N = 3) | 12.80 (SE +/- 0.02, N = 4) | 22.64 (SE +/- 0.02, N = 3) | 12.96 (SE +/- 0.01, N = 4) |
| GPU power, Watts (min / avg / max) | 7.27 / 52.64 / 66.49 | 14.19 / 81.85 / 112.37 | 6.97 / 51.98 / 65.31 | 13.76 / 80.4 / 111.14 |
| GPU temperature, Celsius (min / avg / max) | 56 / 69.01 / 75 | 52 / 63.89 / 70 | 55 / 68.21 / 74 | 49 / 61.97 / 70 |
Blender 4.1 - Blend File: Pabellon Barcelona - Compute: NVIDIA CUDA

| Metric | Proprietary RTX A2000 | Proprietary RTX A4000 | MIT GPL RTX A2000 | MIT GPL RTX A4000 |
|---|---|---|---|---|
| Seconds (fewer is better) | 108.43 (SE +/- 0.04, N = 3) | 58.81 (SE +/- 0.02, N = 3) | 108.86 (SE +/- 0.02, N = 3) | 59.25 (SE +/- 0.03, N = 3) |
| GPU power, Watts (min / avg / max) | 7.38 / 58.16 / 62.68 | 14.15 / 99.6 / 111.35 | 7.04 / 57.16 / 61.32 | 14.19 / 97.07 / 110.36 |
| GPU temperature, Celsius (min / avg / max) | 56 / 73.36 / 76 | 52 / 69.64 / 74 | 55 / 72.48 / 76 | 52 / 69.68 / 74 |
Blender 4.1 - Blend File: Pabellon Barcelona - Compute: NVIDIA OptiX

| Metric | Proprietary RTX A2000 | Proprietary RTX A4000 | MIT GPL RTX A2000 | MIT GPL RTX A4000 |
|---|---|---|---|---|
| Seconds (fewer is better) | 30.39 (SE +/- 0.05, N = 3) | 18.86 (SE +/- 0.00, N = 3) | 30.66 (SE +/- 0.02, N = 3) | 19.33 (SE +/- 0.01, N = 3) |
| GPU power, Watts (min / avg / max) | 7.26 / 59.61 / 69.33 | 14.41 / 94.89 / 120.59 | 6.98 / 58.92 / 69.19 | 14.4 / 93.59 / 119.14 |
| GPU temperature, Celsius (min / avg / max) | 57 / 70.98 / 75 | 54 / 65.6 / 71 | 57 / 70.31 / 75 | 53 / 65.14 / 71 |
Blender 4.1 - Blend File: Barbershop - Compute: NVIDIA CUDA

| Metric | Proprietary RTX A2000 | Proprietary RTX A4000 | MIT GPL RTX A2000 | MIT GPL RTX A4000 |
|---|---|---|---|---|
| Seconds (fewer is better) | 186.23 (SE +/- 0.30, N = 3) | 105.87 (SE +/- 0.08, N = 3) | 187.86 (SE +/- 0.03, N = 3) | 107.41 (SE +/- 0.06, N = 3) |
| GPU power, Watts (min / avg / max) | 7.44 / 61.26 / 66.3 | 14.4 / 102.04 / 115.51 | 7.13 / 60.01 / 64.75 | 14.65 / 100.95 / 113.97 |
| GPU temperature, Celsius (min / avg / max) | 57 / 74.28 / 78 | 53 / 70.08 / 74 | 56 / 73.38 / 76 | 53 / 70.31 / 75 |
Blender 4.1 - Blend File: Barbershop - Compute: NVIDIA OptiX

| Metric | Proprietary RTX A2000 | Proprietary RTX A4000 | MIT GPL RTX A2000 | MIT GPL RTX A4000 |
|---|---|---|---|---|
| Seconds (fewer is better) | 111.31 (SE +/- 0.07, N = 3) | 69.88 (SE +/- 0.09, N = 3) | 112.84 (SE +/- 0.03, N = 3) | 71.75 (SE +/- 0.08, N = 3) |
| GPU power, Watts (min / avg / max) | 7.61 / 61.96 / 67.74 | 14.47 / 99.6 / 114.26 | 7.03 / 60.34 / 66.04 | 14.77 / 98.25 / 111.89 |
| GPU temperature, Celsius (min / avg / max) | 58 / 74.04 / 77 | 54 / 68.91 / 73 | 57 / 72.94 / 77 | 54 / 69.08 / 73 |
Blender 4.1 - Blend File: Junkshop - Compute: NVIDIA CUDA

| Metric | Proprietary RTX A2000 | Proprietary RTX A4000 | MIT GPL RTX A2000 | MIT GPL RTX A4000 |
|---|---|---|---|---|
| Seconds (fewer is better) | 37.77 (SE +/- 0.03, N = 3) | 22.54 (SE +/- 0.02, N = 3) | 38.22 (SE +/- 0.04, N = 3) | 23.08 (SE +/- 0.03, N = 3) |
| GPU power, Watts (min / avg / max) | 7.6 / 48.63 / 58.89 | 14.67 / 76.87 / 102.06 | 7.03 / 47.56 / 57.57 | 14.84 / 75.59 / 100.49 |
| GPU temperature, Celsius (min / avg / max) | 59 / 68.09 / 72 | 55 / 62.88 / 67 | 58 / 67.3 / 71 | 56 / 63.22 / 68 |
Blender 4.1 - Blend File: Junkshop - Compute: NVIDIA OptiX

| Metric | Proprietary RTX A2000 | Proprietary RTX A4000 | MIT GPL RTX A2000 | MIT GPL RTX A4000 |
|---|---|---|---|---|
| Seconds (fewer is better) | 21.59 (SE +/- 0.04, N = 3) | 14.04 (SE +/- 0.03, N = 4) | 22.02 (SE +/- 0.05, N = 3) | 14.50 (SE +/- 0.02, N = 4) |
| GPU power, Watts (min / avg / max) | 7.23 / 48.44 / 62.04 | 14.26 / 75.92 / 106.15 | 7.06 / 47.52 / 60.51 | 14.62 / 74.29 / 104.43 |
| GPU temperature, Celsius (min / avg / max) | 57 / 66.06 / 70 | 53 / 60.93 / 66 | 56 / 65.32 / 69 | 54 / 61.36 / 66 |
LuxMark 3.1 - OpenCL Device: GPU - Scene: Luxball HDR

| Metric | Proprietary RTX A2000 | Proprietary RTX A4000 | MIT GPL RTX A2000 | MIT GPL RTX A4000 |
|---|---|---|---|---|
| Score (more is better) | 32886 (SE +/- 92.15, N = 3) | 53209 (SE +/- 303.83, N = 3) | 32501 (SE +/- 15.82, N = 3) | 51531 (SE +/- 15.04, N = 3) |
| Score per Watt (more is better) | 490.54 | 472.37 | 486.87 | 467.40 |
| GPU power, Watts (min / avg / max) | 7.51 / 67.04 / 69.51 | 14.35 / 112.64 / 116.78 | 7.11 / 66.75 / 69.21 | 17.87 / 110.25 / 114.63 |
| GPU temperature, Celsius (min / avg / max) | 57 / 75.63 / 78 | 53 / 72.22 / 75 | 56 / 75.16 / 78 | 54 / 72.14 / 75 |
LuxMark 3.1 - OpenCL Device: GPU - Scene: Microphone

| Metric | Proprietary RTX A2000 | Proprietary RTX A4000 | MIT GPL RTX A2000 | MIT GPL RTX A4000 |
|---|---|---|---|---|
| Score (more is better) | 25323 (SE +/- 9.61, N = 3) | 39074 (SE +/- 16.65, N = 3) | 25078 (SE +/- 5.04, N = 3) | 38207 (SE +/- 6.51, N = 3) |
| Score per Watt (more is better) | 380.43 | 343.78 | 377.01 | 341.74 |
| GPU power, Watts (min / avg / max) | 7.49 / 66.56 / 69.8 | 14.75 / 113.66 / 118.61 | 7.24 / 66.52 / 69.39 | 15.13 / 111.61 / 116.72 |
| GPU temperature, Celsius (min / avg / max) | 60 / 75.75 / 78 | 57 / 73.03 / 75 | 59 / 75.55 / 78 | 57 / 73.08 / 75 |
LuxMark 3.1 - OpenCL Device: GPU - Scene: Hotel

| Metric | Proprietary RTX A2000 | Proprietary RTX A4000 | MIT GPL RTX A2000 | MIT GPL RTX A4000 |
|---|---|---|---|---|
| Score (more is better) | 7181 (SE +/- 1.45, N = 3) | 12065 (SE +/- 9.21, N = 3) | 7183 (SE +/- 1.20, N = 3) | 11595 (SE +/- 8.88, N = 3) |
| Score per Watt (more is better) | 108.28 | 104.52 | 108.20 | 98.39 |
| GPU power, Watts (min / avg / max) | 7.59 / 66.32 / 69.54 | 14.64 / 115.43 / 125.3 | 7.18 / 66.39 / 69.71 | 15.1 / 117.85 / 123.12 |
| GPU temperature, Celsius (min / avg / max) | 60 / 76.59 / 79 | 56 / 74.29 / 78 | 59 / 76.26 / 79 | 57 / 74.85 / 78 |
FluidX3D 2.17 - Test: FP32-FP32

| Metric | Proprietary RTX A2000 | Proprietary RTX A4000 | MIT GPL RTX A2000 | MIT GPL RTX A4000 |
|---|---|---|---|---|
| MLUPs/s (more is better) | 1371 (SE +/- 0.00, N = 3) | 2159 (SE +/- 0.58, N = 3) | 1330 (SE +/- 0.00, N = 3) | 2049 (SE +/- 0.88, N = 3) |
| MLUPs/s per Watt (more is better) | 26.42 | 24.95 | 26.47 | 24.36 |
| GPU power, Watts (min / avg / max) | 7.46 / 51.89 / 53.87 | 14 / 86.53 / 91.45 | 7 / 50.24 / 52.08 | 14.95 / 84.13 / 88.64 |
| GPU temperature, Celsius (min / avg / max) | 59 / 68.95 / 70 | 51 / 65.16 / 68 | 59 / 67.99 / 69 | 57 / 65.29 / 67 |
FluidX3D 2.17 - Test: FP32-FP16S

| Metric | Proprietary RTX A2000 | Proprietary RTX A4000 | MIT GPL RTX A2000 | MIT GPL RTX A4000 |
|---|---|---|---|---|
| MLUPs/s (more is better) | 2488 (SE +/- 0.00, N = 3) | 4006 (SE +/- 1.15, N = 3) | 2416 (SE +/- 0.00, N = 3) | 3814 (SE +/- 0.58, N = 3) |
| MLUPs/s per Watt (more is better) | 43.07 | 42.87 | 43.19 | 41.68 |
| GPU power, Watts (min / avg / max) | 7.4 / 57.76 / 61.45 | 14.43 / 93.46 / 101.83 | 6.99 / 55.93 / 59.72 | 14.82 / 91.51 / 100.39 |
| GPU temperature, Celsius (min / avg / max) | 58 / 71.34 / 74 | 55 / 67.28 / 70 | 57 / 70.29 / 73 | 55 / 67.25 / 70 |
FluidX3D 2.17 - Test: FP32-FP16C

| Metric | Proprietary RTX A2000 | Proprietary RTX A4000 | MIT GPL RTX A2000 | MIT GPL RTX A4000 |
|---|---|---|---|---|
| MLUPs/s (more is better) | 2556 (SE +/- 2.52, N = 3) | 4280 (SE +/- 0.88, N = 3) | 2495 (SE +/- 2.00, N = 3) | 4065 (SE +/- 1.00, N = 3) |
| MLUPs/s per Watt (more is better) | 38.88 | 37.47 | 38.10 | 36.84 |
| GPU power, Watts (min / avg / max) | 7.46 / 65.74 / 70.02 | 14.73 / 114.21 / 126.15 | 7.16 / 65.49 / 69.99 | 14.98 / 110.35 / 121.87 |
| GPU temperature, Celsius (min / avg / max) | 59 / 75.34 / 78 | 56 / 72.5 / 76 | 58 / 75.29 / 78 | 56 / 72.28 / 76 |
SPECViewPerf 2020 3.0 - Resolution: 1920 x 1080 - Viewset: CATIA-06

| Metric | Proprietary RTX A2000 | Proprietary RTX A4000 | MIT GPL RTX A2000 | MIT GPL RTX A4000 |
|---|---|---|---|---|
| Composite Score (more is better) | 83.40 (SE +/- 0.14, N = 3) | 144.18 (SE +/- 0.45, N = 3) | 83.67 (SE +/- 0.13, N = 3) | 143.45 (SE +/- 1.23, N = 3) |
| Composite Score per Watt (more is better) | 1.544 | 1.642 | 1.541 | 1.627 |
| GPU power, Watts (min / avg / max) | 7.55 / 54.03 / 69.39 | 14.53 / 87.81 / 127.55 | 7.21 / 54.3 / 69.36 | 14.83 / 88.19 / 127.51 |
| GPU temperature, Celsius (min / avg / max) | 58 / 70.65 / 76 | 55 / 66.27 / 73 | 58 / 70.52 / 76 | 56 / 66.77 / 74 |
SPECViewPerf 2020 3.0 - Resolution: 1920 x 1080 - Viewset: CREO-03 (Composite Score; more is better)
  RTX A2000: Proprietary 136.64 (SE +/- 0.16, N = 3) | MIT GPL 136.80 (SE +/- 0.18, N = 3)
  RTX A4000: Proprietary 192.14 (SE +/- 0.15, N = 3) | MIT GPL 192.12 (SE +/- 0.19, N = 3)

SPECViewPerf 2020 3.0 - Resolution: 1920 x 1080 - Viewset: CREO-03 (Composite Score per Watt; more is better)
  RTX A2000: Proprietary 2.754 | MIT GPL 2.769
  RTX A4000: Proprietary 2.571 | MIT GPL 2.561

SPECViewPerf 2020 3.0 - GPU Power Consumption Monitor (min/avg/max Watts; fewer is better)
  RTX A2000: Proprietary 7.39 / 49.62 / 69.36 | MIT GPL 7.1 / 49.4 / 69.57
  RTX A4000: Proprietary 14.34 / 74.74 / 116.33 | MIT GPL 14.54 / 75.02 / 116.94

SPECViewPerf 2020 3.0 - GPU Temperature Monitor (min/avg/max Celsius; fewer is better)
  RTX A2000: Proprietary 56 / 68.61 / 76 | MIT GPL 56 / 68.04 / 76
  RTX A4000: Proprietary 53 / 62.65 / 71 | MIT GPL 53 / 63.13 / 71
SPECViewPerf 2020 3.0 - Resolution: 1920 x 1080 - Viewset: ENERGY-03 (Composite Score; more is better)
  RTX A2000: Proprietary 51.31 (SE +/- 0.01, N = 3) | MIT GPL 51.51 (SE +/- 0.05, N = 3)
  RTX A4000: Proprietary 98.42 (SE +/- 0.10, N = 3) | MIT GPL 98.09 (SE +/- 0.02, N = 3)

SPECViewPerf 2020 3.0 - Resolution: 1920 x 1080 - Viewset: ENERGY-03 (Composite Score per Watt; more is better)
  RTX A2000: Proprietary 0.985 | MIT GPL 0.997
  RTX A4000: Proprietary 1.020 | MIT GPL 1.022

SPECViewPerf 2020 3.0 - GPU Power Consumption Monitor (min/avg/max Watts; fewer is better)
  RTX A2000: Proprietary 7.24 / 52.11 / 69.97 | MIT GPL 6.83 / 51.68 / 69.92
  RTX A4000: Proprietary 14.18 / 96.51 / 130.24 | MIT GPL 14.37 / 96.01 / 130.43

SPECViewPerf 2020 3.0 - GPU Temperature Monitor (min/avg/max Celsius; fewer is better)
  RTX A2000: Proprietary 55 / 69.86 / 78 | MIT GPL 55 / 69.37 / 77
  RTX A4000: Proprietary 52 / 69.03 / 79 | MIT GPL 53 / 69.21 / 79
SPECViewPerf 2020 3.0 - Resolution: 1920 x 1080 - Viewset: MAYA-06 (Composite Score; more is better)
  RTX A2000: Proprietary 371.37 (SE +/- 2.79, N = 3) | MIT GPL 375.02 (SE +/- 0.58, N = 3)
  RTX A4000: Proprietary 540.09 (SE +/- 2.35, N = 3) | MIT GPL 539.31 (SE +/- 2.04, N = 3)

SPECViewPerf 2020 3.0 - Resolution: 1920 x 1080 - Viewset: MAYA-06 (Composite Score per Watt; more is better)
  RTX A2000: Proprietary 7.032 | MIT GPL 7.096
  RTX A4000: Proprietary 6.701 | MIT GPL 6.650

SPECViewPerf 2020 3.0 - GPU Power Consumption Monitor (min/avg/max Watts; fewer is better)
  RTX A2000: Proprietary 7.31 / 52.81 / 69.78 | MIT GPL 6.9 / 52.85 / 69.82
  RTX A4000: Proprietary 14.38 / 80.6 / 125 | MIT GPL 14.56 / 81.1 / 125.04

SPECViewPerf 2020 3.0 - GPU Temperature Monitor (min/avg/max Celsius; fewer is better)
  RTX A2000: Proprietary 56 / 70.16 / 77 | MIT GPL 56 / 69.67 / 77
  RTX A4000: Proprietary 53 / 64.34 / 73 | MIT GPL 54 / 64.62 / 73
SPECViewPerf 2020 3.0 - Resolution: 1920 x 1080 - Viewset: MEDICAL-03 (Composite Score; more is better)
  RTX A2000: Proprietary 103.52 (SE +/- 0.02, N = 3) | MIT GPL 103.80 (SE +/- 0.08, N = 3)
  RTX A4000: Proprietary 142.54 (SE +/- 0.01, N = 3) | MIT GPL 142.75 (SE +/- 0.04, N = 3)

SPECViewPerf 2020 3.0 - Resolution: 1920 x 1080 - Viewset: MEDICAL-03 (Composite Score per Watt; more is better)
  RTX A2000: Proprietary 1.804 | MIT GPL 1.817
  RTX A4000: Proprietary 1.736 | MIT GPL 1.724

SPECViewPerf 2020 3.0 - GPU Power Consumption Monitor (min/avg/max Watts; fewer is better)
  RTX A2000: Proprietary 7.47 / 57.4 / 69.72 | MIT GPL 7.13 / 57.12 / 69.75
  RTX A4000: Proprietary 14.14 / 82.09 / 118.6 | MIT GPL 14.5 / 82.8 / 118.96

SPECViewPerf 2020 3.0 - GPU Temperature Monitor (min/avg/max Celsius; fewer is better)
  RTX A2000: Proprietary 57 / 73.75 / 79 | MIT GPL 57 / 73.02 / 79
  RTX A4000: Proprietary 53 / 65.76 / 75 | MIT GPL 54 / 66.01 / 75
SPECViewPerf 2020 3.0 - Resolution: 1920 x 1080 - Viewset: SNX-04 (Composite Score; more is better)
  RTX A2000: Proprietary 334.05 (SE +/- 0.16, N = 3) | MIT GPL 334.87 (SE +/- 0.61, N = 3)
  RTX A4000: Proprietary 495.75 (SE +/- 0.37, N = 3) | MIT GPL 496.61 (SE +/- 0.58, N = 3)

SPECViewPerf 2020 3.0 - Resolution: 1920 x 1080 - Viewset: SNX-04 (Composite Score per Watt; more is better)
  RTX A2000: Proprietary 5.392 | MIT GPL 5.432
  RTX A4000: Proprietary 4.890 | MIT GPL 4.885

SPECViewPerf 2020 3.0 - GPU Power Consumption Monitor (min/avg/max Watts; fewer is better)
  RTX A2000: Proprietary 7.46 / 61.96 / 70.06 | MIT GPL 7.01 / 61.65 / 69.87
  RTX A4000: Proprietary 13.97 / 101.39 / 127.34 | MIT GPL 14.19 / 101.65 / 128.07

SPECViewPerf 2020 3.0 - GPU Temperature Monitor (min/avg/max Celsius; fewer is better)
  RTX A2000: Proprietary 57 / 74.88 / 78 | MIT GPL 57 / 74.26 / 78
  RTX A4000: Proprietary 51 / 70.3 / 76 | MIT GPL 51 / 70.35 / 76
SPECViewPerf 2020 3.0 - Resolution: 1920 x 1080 - Viewset: SOLIDWORKS-07 (Composite Score; more is better)
  RTX A2000: Proprietary 200.59 (SE +/- 0.17, N = 3) | MIT GPL 200.81 (SE +/- 0.09, N = 3)
  RTX A4000: Proprietary 336.92 (SE +/- 0.57, N = 3) | MIT GPL 340.81 (SE +/- 0.09, N = 3)

SPECViewPerf 2020 3.0 - Resolution: 1920 x 1080 - Viewset: SOLIDWORKS-07 (Composite Score per Watt; more is better)
  RTX A2000: Proprietary 3.119 | MIT GPL 3.134
  RTX A4000: Proprietary 3.179 | MIT GPL 3.182

SPECViewPerf 2020 3.0 - GPU Power Consumption Monitor (min/avg/max Watts; fewer is better)
  RTX A2000: Proprietary 7.54 / 64.3 / 69.84 | MIT GPL 7 / 64.08 / 69.67
  RTX A4000: Proprietary 14.35 / 105.97 / 118.99 | MIT GPL 14.57 / 107.11 / 118.58

SPECViewPerf 2020 3.0 - GPU Temperature Monitor (min/avg/max Celsius; fewer is better)
  RTX A2000: Proprietary 58 / 76.69 / 81 | MIT GPL 57 / 76.27 / 81
  RTX A4000: Proprietary 54 / 71.86 / 78 | MIT GPL 54 / 72.35 / 77
SPECViewPerf 2020 3.0 - Resolution: 2560 x 1440 - Viewset: CATIA-06 (Composite Score; more is better)
  RTX A2000: Proprietary 83.43 (SE +/- 0.09, N = 3) | MIT GPL 83.56 (SE +/- 0.01, N = 3)
  RTX A4000: Proprietary 144.40 (SE +/- 0.26, N = 3) | MIT GPL 144.74 (SE +/- 0.26, N = 3)

SPECViewPerf 2020 3.0 - Resolution: 2560 x 1440 - Viewset: CATIA-06 (Composite Score per Watt; more is better)
  RTX A2000: Proprietary 1.532 | MIT GPL 1.540
  RTX A4000: Proprietary 1.644 | MIT GPL 1.636

SPECViewPerf 2020 3.0 - GPU Power Consumption Monitor (min/avg/max Watts; fewer is better)
  RTX A2000: Proprietary 7.41 / 54.45 / 69.52 | MIT GPL 7.09 / 54.27 / 69.15
  RTX A4000: Proprietary 14.4 / 87.84 / 127.38 | MIT GPL 14.69 / 88.5 / 127.57

SPECViewPerf 2020 3.0 - GPU Temperature Monitor (min/avg/max Celsius; fewer is better)
  RTX A2000: Proprietary 58 / 71.02 / 76 | MIT GPL 57 / 70.52 / 75
  RTX A4000: Proprietary 54 / 66.8 / 74 | MIT GPL 55 / 67.14 / 74
SPECViewPerf 2020 3.0 - Resolution: 2560 x 1440 - Viewset: CREO-03 (Composite Score; more is better)
  RTX A2000: Proprietary 111.02 (SE +/- 0.09, N = 3) | MIT GPL 111.11 (SE +/- 0.09, N = 3)
  RTX A4000: Proprietary 160.57 (SE +/- 0.16, N = 3) | MIT GPL 160.53 (SE +/- 0.07, N = 3)

SPECViewPerf 2020 3.0 - Resolution: 2560 x 1440 - Viewset: CREO-03 (Composite Score per Watt; more is better)
  RTX A2000: Proprietary 2.187 | MIT GPL 2.206
  RTX A4000: Proprietary 2.060 | MIT GPL 2.051

SPECViewPerf 2020 3.0 - GPU Power Consumption Monitor (min/avg/max Watts; fewer is better)
  RTX A2000: Proprietary 7.35 / 50.76 / 69.64 | MIT GPL 7.06 / 50.36 / 69.6
  RTX A4000: Proprietary 14.22 / 77.95 / 119.73 | MIT GPL 14.65 / 78.28 / 119.94

SPECViewPerf 2020 3.0 - GPU Temperature Monitor (min/avg/max Celsius; fewer is better)
  RTX A2000: Proprietary 57 / 69.19 / 77 | MIT GPL 56 / 68.55 / 76
  RTX A4000: Proprietary 54 / 63.88 / 74 | MIT GPL 54 / 64.14 / 73
SPECViewPerf 2020 3.0 - Resolution: 2560 x 1440 - Viewset: ENERGY-03 (Composite Score; more is better)
  RTX A2000: Proprietary 31.75 (SE +/- 0.02, N = 3) | MIT GPL 31.80 (SE +/- 0.02, N = 3)
  RTX A4000: Proprietary 62.60 (SE +/- 0.06, N = 3) | MIT GPL 62.47 (SE +/- 0.06, N = 3)

SPECViewPerf 2020 3.0 - Resolution: 2560 x 1440 - Viewset: ENERGY-03 (Composite Score per Watt; more is better)
  RTX A2000: Proprietary 0.610 | MIT GPL 0.615
  RTX A4000: Proprietary 0.656 | MIT GPL 0.655

SPECViewPerf 2020 3.0 - GPU Power Consumption Monitor (min/avg/max Watts; fewer is better)
  RTX A2000: Proprietary 7.26 / 52.06 / 69.95 | MIT GPL 7.02 / 51.69 / 69.86
  RTX A4000: Proprietary 14.11 / 95.36 / 130.53 | MIT GPL 14.29 / 95.4 / 131.07

SPECViewPerf 2020 3.0 - GPU Temperature Monitor (min/avg/max Celsius; fewer is better)
  RTX A2000: Proprietary 55 / 70.42 / 78 | MIT GPL 55 / 69.8 / 78
  RTX A4000: Proprietary 53 / 69.68 / 79 | MIT GPL 53 / 69.88 / 79
SPECViewPerf 2020 3.0 - Resolution: 2560 x 1440 - Viewset: MAYA-06 (Composite Score; more is better)
  RTX A2000: Proprietary 273.51 (SE +/- 0.58, N = 3) | MIT GPL 273.74 (SE +/- 0.39, N = 3)
  RTX A4000: Proprietary 416.35 (SE +/- 1.59, N = 3) | MIT GPL 414.44 (SE +/- 1.14, N = 3)

SPECViewPerf 2020 3.0 - Resolution: 2560 x 1440 - Viewset: MAYA-06 (Composite Score per Watt; more is better)
  RTX A2000: Proprietary 5.005 | MIT GPL 5.015
  RTX A4000: Proprietary 4.862 | MIT GPL 4.816

SPECViewPerf 2020 3.0 - GPU Power Consumption Monitor (min/avg/max Watts; fewer is better)
  RTX A2000: Proprietary 7.22 / 54.65 / 69.72 | MIT GPL 7.11 / 54.58 / 69.93
  RTX A4000: Proprietary 14.28 / 85.63 / 123.49 | MIT GPL 14.52 / 86.05 / 123.65

SPECViewPerf 2020 3.0 - GPU Temperature Monitor (min/avg/max Celsius; fewer is better)
  RTX A2000: Proprietary 56 / 70.7 / 77 | MIT GPL 56 / 70.51 / 77
  RTX A4000: Proprietary 54 / 66.03 / 73 | MIT GPL 54 / 66.38 / 74
SPECViewPerf 2020 3.0 - Resolution: 2560 x 1440 - Viewset: MEDICAL-03 (Composite Score; more is better)
  RTX A2000: Proprietary 65.69 (SE +/- 0.02, N = 3) | MIT GPL 65.84 (SE +/- 0.01, N = 3)
  RTX A4000: Proprietary 97.00 (SE +/- 0.01, N = 3) | MIT GPL 97.22 (SE +/- 0.01, N = 3)

SPECViewPerf 2020 3.0 - Resolution: 2560 x 1440 - Viewset: MEDICAL-03 (Composite Score per Watt; more is better)
  RTX A2000: Proprietary 1.130 | MIT GPL 1.130
  RTX A4000: Proprietary 1.128 | MIT GPL 1.119

SPECViewPerf 2020 3.0 - GPU Power Consumption Monitor (min/avg/max Watts; fewer is better)
  RTX A2000: Proprietary 7.49 / 58.15 / 69.77 | MIT GPL 7.39 / 58.24 / 69.81
  RTX A4000: Proprietary 14.34 / 85.98 / 123.58 | MIT GPL 14.67 / 86.87 / 123.79

SPECViewPerf 2020 3.0 - GPU Temperature Monitor (min/avg/max Celsius; fewer is better)
  RTX A2000: Proprietary 58 / 74.27 / 80 | MIT GPL 57 / 74.14 / 79
  RTX A4000: Proprietary 54 / 67.54 / 76 | MIT GPL 55 / 67.95 / 77
SPECViewPerf 2020 3.0 - Resolution: 2560 x 1440 - Viewset: SNX-04 (Composite Score; more is better)
  RTX A2000: Proprietary 286.27 (SE +/- 0.06, N = 3) | MIT GPL 287.19 (SE +/- 0.36, N = 3)
  RTX A4000: Proprietary 421.61 (SE +/- 1.18, N = 3) | MIT GPL 423.50 (SE +/- 0.17, N = 3)

SPECViewPerf 2020 3.0 - Resolution: 2560 x 1440 - Viewset: SNX-04 (Composite Score per Watt; more is better)
  RTX A2000: Proprietary 4.590 | MIT GPL 4.611
  RTX A4000: Proprietary 4.162 | MIT GPL 4.147

SPECViewPerf 2020 3.0 - GPU Power Consumption Monitor (min/avg/max Watts; fewer is better)
  RTX A2000: Proprietary 7.38 / 62.37 / 69.77 | MIT GPL 7.27 / 62.29 / 69.81
  RTX A4000: Proprietary 14.05 / 101.31 / 127.07 | MIT GPL 14.26 / 102.11 / 127.86

SPECViewPerf 2020 3.0 - GPU Temperature Monitor (min/avg/max Celsius; fewer is better)
  RTX A2000: Proprietary 57 / 74.79 / 78 | MIT GPL 57 / 74.49 / 78
  RTX A4000: Proprietary 52 / 70.53 / 76 | MIT GPL 52 / 70.88 / 76
SPECViewPerf 2020 3.0 - Resolution: 2560 x 1440 - Viewset: SOLIDWORKS-07 (Composite Score; more is better)
  RTX A2000: Proprietary 183.88 (SE +/- 0.09, N = 3) | MIT GPL 184.18 (SE +/- 0.14, N = 3)
  RTX A4000: Proprietary 304.80 (SE +/- 0.20, N = 3) | MIT GPL 308.53 (SE +/- 0.17, N = 3)

SPECViewPerf 2020 3.0 - Resolution: 2560 x 1440 - Viewset: SOLIDWORKS-07 (Composite Score per Watt; more is better)
  RTX A2000: Proprietary 2.900 | MIT GPL 2.913
  RTX A4000: Proprietary 2.970 | MIT GPL 2.973

SPECViewPerf 2020 3.0 - GPU Power Consumption Monitor (min/avg/max Watts; fewer is better)
  RTX A2000: Proprietary 7.47 / 63.41 / 69.83 | MIT GPL 7.18 / 63.22 / 69.85
  RTX A4000: Proprietary 14.44 / 102.64 / 118.53 | MIT GPL 14.6 / 103.79 / 118.25

SPECViewPerf 2020 3.0 - GPU Temperature Monitor (min/avg/max Celsius; fewer is better)
  RTX A2000: Proprietary 58 / 76.17 / 81 | MIT GPL 58 / 75.87 / 81
  RTX A4000: Proprietary 54 / 71.16 / 78 | MIT GPL 55 / 71.83 / 77
Unigine Superposition 1.0 - Resolution: 1920 x 1080 - Mode: Fullscreen - Quality: Low - Renderer: OpenGL (Frames Per Second; more is better)
  RTX A2000: Proprietary 208.3 (SE +/- 0.31, N = 3) | MIT GPL 208.9 (SE +/- 0.21, N = 3)
  RTX A4000: Proprietary 323.3 (SE +/- 0.46, N = 3) | MIT GPL 324.6 (SE +/- 0.62, N = 3)

Unigine Superposition 1.0 - Resolution: 1920 x 1080 - Mode: Fullscreen - Quality: Low - Renderer: OpenGL (Frames Per Second per Watt; more is better)
  RTX A2000: Proprietary 3.268 | MIT GPL 3.300
  RTX A4000: Proprietary 3.278 | MIT GPL 3.259

Unigine Superposition 1.0 - GPU Power Consumption Monitor (min/avg/max Watts; fewer is better)
  RTX A2000: Proprietary 7.1 / 63.74 / 69.95 | MIT GPL 6.93 / 63.31 / 69.98
  RTX A4000: Proprietary 14.4 / 98.63 / 114.11 | MIT GPL 14.62 / 99.6 / 114.79

Unigine Superposition 1.0 - GPU Temperature Monitor (min/avg/max Celsius; fewer is better)
  RTX A2000: Proprietary 55 / 75.65 / 79 | MIT GPL 55 / 75.29 / 79
  RTX A4000: Proprietary 52 / 70.8 / 75 | MIT GPL 52 / 71.38 / 76
Unigine Superposition 1.0 - Resolution: 1920 x 1080 - Mode: Fullscreen - Quality: Medium - Renderer: OpenGL (Frames Per Second; more is better)
  RTX A2000: Proprietary 105.2 (SE +/- 0.12, N = 3) | MIT GPL 105.5 (SE +/- 0.06, N = 3)
  RTX A4000: Proprietary 178.9 (SE +/- 0.03, N = 3) | MIT GPL 179.1 (SE +/- 0.09, N = 3)

Unigine Superposition 1.0 - Resolution: 1920 x 1080 - Mode: Fullscreen - Quality: Medium - Renderer: OpenGL (Frames Per Second per Watt; more is better)
  RTX A2000: Proprietary 1.643 | MIT GPL 1.663
  RTX A4000: Proprietary 1.659 | MIT GPL 1.654

Unigine Superposition 1.0 - GPU Power Consumption Monitor (min/avg/max Watts; fewer is better)
  RTX A2000: Proprietary 7.23 / 64.05 / 69.98 | MIT GPL 7.23 / 63.46 / 70.01
  RTX A4000: Proprietary 14.52 / 107.85 / 118.76 | MIT GPL 14.95 / 108.27 / 119.23

Unigine Superposition 1.0 - GPU Temperature Monitor (min/avg/max Celsius; fewer is better)
  RTX A2000: Proprietary 57 / 76.3 / 79 | MIT GPL 57 / 75.87 / 79
  RTX A4000: Proprietary 53 / 73.72 / 77 | MIT GPL 54 / 74.11 / 78
Unigine Superposition 1.0 - Resolution: 1920 x 1080 - Mode: Fullscreen - Quality: High - Renderer: OpenGL (Frames Per Second; more is better)
  RTX A2000: Proprietary 75.0 (SE +/- 0.00, N = 3) | MIT GPL 75.2 (SE +/- 0.03, N = 3)
  RTX A4000: Proprietary 128.5 (SE +/- 0.09, N = 3) | MIT GPL 128.6 (SE +/- 0.00, N = 3)

Unigine Superposition 1.0 - Resolution: 1920 x 1080 - Mode: Fullscreen - Quality: High - Renderer: OpenGL (Frames Per Second per Watt; more is better)
  RTX A2000: Proprietary 1.180 | MIT GPL 1.185
  RTX A4000: Proprietary 1.171 | MIT GPL 1.199

Unigine Superposition 1.0 - GPU Power Consumption Monitor (min/avg/max Watts; fewer is better)
  RTX A2000: Proprietary 7.34 / 63.55 / 69.68 | MIT GPL 7.09 / 63.43 / 69.68
  RTX A4000: Proprietary 14.47 / 109.76 / 120.33 | MIT GPL 14.52 / 107.3 / 120.82

Unigine Superposition 1.0 - GPU Temperature Monitor (min/avg/max Celsius; fewer is better)
  RTX A2000: Proprietary 57 / 75.89 / 79 | MIT GPL 57 / 75.77 / 79
  RTX A4000: Proprietary 53 / 74.17 / 78 | MIT GPL 52 / 73.58 / 78
Unigine Superposition 1.0 - Resolution: 1920 x 1080 - Mode: Fullscreen - Quality: Ultra - Renderer: OpenGL (Frames Per Second; more is better)
  RTX A2000: Proprietary 30.9 (SE +/- 0.03, N = 3) | MIT GPL 31.1 (SE +/- 0.00, N = 3)
  RTX A4000: Proprietary 51.2 (SE +/- 0.00, N = 3) | MIT GPL 51.1 (SE +/- 0.10, N = 3)

Unigine Superposition 1.0 - Resolution: 1920 x 1080 - Mode: Fullscreen - Quality: Ultra - Renderer: OpenGL (Frames Per Second per Watt; more is better)
  RTX A2000: Proprietary 0.488 | MIT GPL 0.519
  RTX A4000: Proprietary 0.442 | MIT GPL 0.445

Unigine Superposition 1.0 - GPU Power Consumption Monitor (min/avg/max Watts; fewer is better)
  RTX A2000: Proprietary 7.29 / 63.28 / 69.78 | MIT GPL 6.99 / 59.92 / 69.9
  RTX A4000: Proprietary 14.67 / 115.71 / 126.92 | MIT GPL 14.78 / 114.83 / 127.02

Unigine Superposition 1.0 - GPU Temperature Monitor (min/avg/max Celsius; fewer is better)
  RTX A2000: Proprietary 57 / 76.22 / 80 | MIT GPL 55 / 74.77 / 79
  RTX A4000: Proprietary 53 / 76.1 / 80 | MIT GPL 53 / 76.05 / 80
Unigine Superposition 1.0 - Resolution: 2560 x 1440 - Mode: Fullscreen - Quality: Low - Renderer: OpenGL (Frames Per Second; more is better)
  RTX A2000: Proprietary 142.8 (SE +/- 0.17, N = 3) | MIT GPL 143.2 (SE +/- 0.20, N = 3)
  RTX A4000: Proprietary 242.0 (SE +/- 0.09, N = 3) | MIT GPL 242.0 (SE +/- 0.12, N = 3)

Unigine Superposition 1.0 - Resolution: 2560 x 1440 - Mode: Fullscreen - Quality: Low - Renderer: OpenGL (Frames Per Second per Watt; more is better)
  RTX A2000: Proprietary 2.215 | MIT GPL 2.252
  RTX A4000: Proprietary 2.237 | MIT GPL 2.224

Unigine Superposition 1.0 - GPU Power Consumption Monitor (min/avg/max Watts; fewer is better)
  RTX A2000: Proprietary 7.46 / 64.47 / 69.91 | MIT GPL 6.69 / 63.6 / 70.02
  RTX A4000: Proprietary 14.57 / 108.17 / 119.48 | MIT GPL 14.89 / 108.83 / 119.71

Unigine Superposition 1.0 - GPU Temperature Monitor (min/avg/max Celsius; fewer is better)
  RTX A2000: Proprietary 57 / 75.99 / 79 | MIT GPL 54 / 75.16 / 79
  RTX A4000: Proprietary 53 / 73.54 / 77 | MIT GPL 53 / 73.93 / 78
Unigine Superposition 1.0 - Resolution: 2560 x 1440 - Mode: Fullscreen - Quality: Medium - Renderer: OpenGL (Frames Per Second; more is better)
  RTX A2000: Proprietary 65.7 (SE +/- 0.03, N = 3) | MIT GPL 66.0 (SE +/- 0.07, N = 3)
  RTX A4000: Proprietary 114.6 (SE +/- 0.06, N = 3) | MIT GPL 114.6 (SE +/- 0.06, N = 3)

Unigine Superposition 1.0 - Resolution: 2560 x 1440 - Mode: Fullscreen - Quality: Medium - Renderer: OpenGL (Frames Per Second per Watt; more is better)
  RTX A2000: Proprietary 1.027 | MIT GPL 1.052
  RTX A4000: Proprietary 1.042 | MIT GPL 1.046

Unigine Superposition 1.0 - GPU Power Consumption Monitor (min/avg/max Watts; fewer is better)
  RTX A2000: Proprietary 7.5 / 63.98 / 69.64 | MIT GPL 6.97 / 62.75 / 69.82
  RTX A4000: Proprietary 14.38 / 109.97 / 120.46 | MIT GPL 15.02 / 109.61 / 120.91

Unigine Superposition 1.0 - GPU Temperature Monitor (min/avg/max Celsius; fewer is better)
  RTX A2000: Proprietary 57 / 76.38 / 80 | MIT GPL 56 / 75.65 / 79
  RTX A4000: Proprietary 54 / 74.13 / 77 | MIT GPL 54 / 74.37 / 78
Unigine Superposition 1.0 - Resolution: 2560 x 1440 - Mode: Fullscreen - Quality: High - Renderer: OpenGL (Frames Per Second; more is better)
  RTX A2000: Proprietary 44.2 (SE +/- 0.03, N = 3) | MIT GPL 44.4 (SE +/- 0.03, N = 3)
  RTX A4000: Proprietary 77.4 (SE +/- 0.03, N = 3) | MIT GPL 77.4 (SE +/- 0.03, N = 3)

Unigine Superposition 1.0 - Resolution: 2560 x 1440 - Mode: Fullscreen - Quality: High - Renderer: OpenGL (Frames Per Second per Watt; more is better)
  RTX A2000: Proprietary 0.693 | MIT GPL 0.696
  RTX A4000: Proprietary 0.693 | MIT GPL 0.690

Unigine Superposition 1.0 - GPU Power Consumption Monitor (min/avg/max Watts; fewer is better)
  RTX A2000: Proprietary 7.43 / 63.8 / 70.03 | MIT GPL 7.2 / 63.77 / 69.96
  RTX A4000: Proprietary 14.63 / 111.75 / 122.85 | MIT GPL 14.83 / 112.11 / 122.87

Unigine Superposition 1.0 - GPU Temperature Monitor (min/avg/max Celsius; fewer is better)
  RTX A2000: Proprietary 57 / 76.3 / 80 | MIT GPL 57 / 76.16 / 79
  RTX A4000: Proprietary 53 / 74.19 / 78 | MIT GPL 54 / 74.87 / 78
Unigine Superposition 1.0 - Resolution: 2560 x 1440 - Mode: Fullscreen - Quality: Ultra - Renderer: OpenGL (Frames Per Second; more is better)
  RTX A2000: Proprietary 17.9 (SE +/- 0.00, N = 3) | MIT GPL 17.9 (SE +/- 0.03, N = 3)
  RTX A4000: Proprietary 29.6 (SE +/- 0.03, N = 3) | MIT GPL 29.6 (SE +/- 0.00, N = 3)

Unigine Superposition 1.0 - Resolution: 2560 x 1440 - Mode: Fullscreen - Quality: Ultra - Renderer: OpenGL (Frames Per Second per Watt; more is better)
  RTX A2000: Proprietary 0.283 | MIT GPL 0.292
  RTX A4000: Proprietary 0.259 | MIT GPL 0.261

Unigine Superposition 1.0 - GPU Power Consumption Monitor (min/avg/max Watts; fewer is better)
  RTX A2000: Proprietary 7.47 / 63.22 / 69.65 | MIT GPL 7.06 / 61.31 / 69.68
  RTX A4000: Proprietary 14.59 / 114.36 / 126.44 | MIT GPL 15 / 113.51 / 126.47

Unigine Superposition 1.0 - GPU Temperature Monitor (min/avg/max Celsius; fewer is better)
  RTX A2000: Proprietary 57 / 77.17 / 81 | MIT GPL 55 / 75.93 / 80
  RTX A4000: Proprietary 53 / 75.19 / 79 | MIT GPL 54 / 75.61 / 80
GPUScore: Breaking Limit 1.0 - Resolution: 1920 x 1080 - Ray-Tracing: On (FPS; more is better)
  RTX A2000: Proprietary 33.09 (SE +/- 0.05, N = 3) | MIT GPL 33.23 (SE +/- 0.03, N = 3)
  RTX A4000: Proprietary 61.98 (SE +/- 0.12, N = 3) | MIT GPL 61.89 (SE +/- 0.06, N = 3)

GPUScore: Breaking Limit 1.0 - Resolution: 1920 x 1080 - Ray-Tracing: On (FPS per Watt; more is better)
  RTX A2000: Proprietary 0.535 | MIT GPL 0.536
  RTX A4000: Proprietary 0.602 | MIT GPL 0.603

GPUScore: Breaking Limit 1.0 - GPU Power Consumption Monitor (min/avg/max Watts; fewer is better)
  RTX A2000: Proprietary 7.33 / 61.86 / 69.51 | MIT GPL 7.13 / 61.96 / 69.51
  RTX A4000: Proprietary 14.44 / 102.9 / 125.69 | MIT GPL 14.84 / 102.71 / 126.05

GPUScore: Breaking Limit 1.0 - GPU Temperature Monitor (min/avg/max Celsius; fewer is better)
  RTX A2000: Proprietary 59 / 75.27 / 80 | MIT GPL 59 / 74.97 / 79
  RTX A4000: Proprietary 56 / 71.12 / 78 | MIT GPL 56 / 71.72 / 78
GPUScore: Breaking Limit 1.0 - Resolution: 1920 x 1080 - Ray-Tracing: Off (FPS; more is better)
  RTX A2000: Proprietary 102.86 (SE +/- 0.19, N = 3) | MIT GPL 103.55 (SE +/- 0.24, N = 3)
  RTX A4000: Proprietary 181.40 (SE +/- 0.40, N = 4) | MIT GPL 181.16 (SE +/- 0.43, N = 4)

GPUScore: Breaking Limit 1.0 - Resolution: 1920 x 1080 - Ray-Tracing: Off (FPS per Watt; more is better)
  RTX A2000: Proprietary 1.921 | MIT GPL 1.923
  RTX A4000: Proprietary 2.204 | MIT GPL 2.213

GPUScore: Breaking Limit 1.0 - GPU Power Consumption Monitor (min/avg/max Watts; fewer is better)
  RTX A2000: Proprietary 7.37 / 53.55 / 69.86 | MIT GPL 7.11 / 53.84 / 69.67
  RTX A4000: Proprietary 14.55 / 82.32 / 120.3 | MIT GPL 14.69 / 81.88 / 120.39

GPUScore: Breaking Limit 1.0 - GPU Temperature Monitor (min/avg/max Celsius; fewer is better)
  RTX A2000: Proprietary 59 / 70.81 / 76 | MIT GPL 58 / 70.26 / 76
  RTX A4000: Proprietary 55 / 65.14 / 72 | MIT GPL 56 / 65.61 / 72
GPUScore: Breaking Limit 1.0 - Resolution: 2560 x 1440 - Ray-Tracing: On (FPS; more is better)
  RTX A2000: Proprietary 20.74 (SE +/- 0.07, N = 3) | MIT GPL 20.85 (SE +/- 0.02, N = 3)
  RTX A4000: Proprietary 39.19 (SE +/- 0.09, N = 3) | MIT GPL 39.17 (SE +/- 0.08, N = 3)

GPUScore: Breaking Limit 1.0 - Resolution: 2560 x 1440 - Ray-Tracing: On (FPS per Watt; more is better)
  RTX A2000: Proprietary 0.324 | MIT GPL 0.326
  RTX A4000: Proprietary 0.356 | MIT GPL 0.354

GPUScore: Breaking Limit 1.0 - GPU Power Consumption Monitor (min/avg/max Watts; fewer is better)
  RTX A2000: Proprietary 7.29 / 63.92 / 69.25 | MIT GPL 7.05 / 63.86 / 69.02
  RTX A4000: Proprietary 14.19 / 110.16 / 125.34 | MIT GPL 14.59 / 110.65 / 125.65

GPUScore: Breaking Limit 1.0 - GPU Temperature Monitor (min/avg/max Celsius; fewer is better)
  RTX A2000: Proprietary 57 / 75.99 / 80 | MIT GPL 57 / 75.51 / 80
  RTX A4000: Proprietary 53 / 72.93 / 78 | MIT GPL 54 / 73.65 / 79
GPUScore: Breaking Limit 1.0 - Resolution: 2560 x 1440 - Ray-Tracing: Off
FPS, More Is Better
             Proprietary                  MIT GPL
RTX A2000    70.22 (SE +/- 0.27, N = 3)   70.63 (SE +/- 0.14, N = 3)
RTX A4000    125.55 (SE +/- 0.43, N = 3)  125.39 (SE +/- 0.26, N = 3)

FPS Per Watt, More Is Better
             Proprietary  MIT GPL
RTX A2000    1.221        1.233
RTX A4000    1.378        1.390

GPU Power Consumption Monitor (Watts, Fewer Is Better; Min / Avg / Max)
             Proprietary              MIT GPL
RTX A2000    7.5 / 57.49 / 69.6       7 / 57.3 / 69.86
RTX A4000    14.64 / 91.1 / 122.61    14.92 / 90.21 / 122.98

GPU Temperature Monitor (Celsius, Fewer Is Better; Min / Avg / Max)
             Proprietary        MIT GPL
RTX A2000    59 / 72.6 / 78     58 / 72.15 / 77
RTX A4000    55 / 67.19 / 74    56 / 67.6 / 74
ParaView 5.10.1 - Test: Many Spheres - Resolution: 1920 x 1080
Frames / Sec, More Is Better
             Proprietary                  MIT GPL
RTX A2000    49.86 (SE +/- 0.03, N = 3)   49.87 (SE +/- 0.01, N = 3)
RTX A4000    101.92 (SE +/- 0.03, N = 3)  101.88 (SE +/- 0.02, N = 3)

MiPolys / Sec, More Is Better
             Proprietary                    MIT GPL
RTX A2000    4998.43 (SE +/- 2.90, N = 3)   4999.66 (SE +/- 0.83, N = 3)
RTX A4000    10217.67 (SE +/- 2.93, N = 3)  10214.16 (SE +/- 2.02, N = 3)

MiPolys / Sec Per Watt, More Is Better
             Proprietary  MIT GPL
RTX A2000    150.60       152.65
RTX A4000    220.57       219.73

GPU Power Consumption Monitor (Watts, Fewer Is Better; Min / Avg / Max)
             Proprietary              MIT GPL
RTX A2000    7.21 / 33.19 / 67.5      6.89 / 32.75 / 67.38
RTX A4000    14.12 / 46.32 / 124.16   14.13 / 46.49 / 124.43

GPU Temperature Monitor (Celsius, Fewer Is Better; Min / Avg / Max)
             Proprietary        MIT GPL
RTX A2000    55 / 60.78 / 69    54 / 60.13 / 68
RTX A4000    51 / 54.89 / 62    52 / 55.48 / 63
ParaView 5.10.1 - Test: Many Spheres - Resolution: 2560 x 1440
Frames / Sec, More Is Better
             Proprietary                 MIT GPL
RTX A2000    48.77 (SE +/- 0.02, N = 3)  48.81 (SE +/- 0.01, N = 3)
RTX A4000    98.36 (SE +/- 0.08, N = 3)  98.41 (SE +/- 0.03, N = 3)

MiPolys / Sec, More Is Better
             Proprietary                   MIT GPL
RTX A2000    4889.88 (SE +/- 2.13, N = 3)  4893.83 (SE +/- 1.06, N = 3)
RTX A4000    9861.26 (SE +/- 8.11, N = 3)  9865.54 (SE +/- 2.84, N = 3)

MiPolys / Sec Per Watt, More Is Better
             Proprietary  MIT GPL
RTX A2000    144.75       145.26
RTX A4000    208.85       205.12

GPU Power Consumption Monitor (Watts, Fewer Is Better; Min / Avg / Max)
             Proprietary              MIT GPL
RTX A2000    7.01 / 33.78 / 69       6.79 / 33.69 / 69.02
RTX A4000    13.55 / 47.22 / 126.07  13.99 / 48.1 / 126.69

GPU Temperature Monitor (Celsius, Fewer Is Better; Min / Avg / Max)
             Proprietary        MIT GPL
RTX A2000    52 / 58.8 / 68     52 / 58.42 / 67
RTX A4000    48 / 52.67 / 61    49 / 53.74 / 62
ParaView 5.10.1 - Test: Wavelet Contour - Resolution: 1920 x 1080
Frames / Sec, More Is Better
             Proprietary                  MIT GPL
RTX A2000    253.24 (SE +/- 0.28, N = 5)  253.15 (SE +/- 0.33, N = 5)
RTX A4000    448.72 (SE +/- 1.31, N = 6)  448.11 (SE +/- 0.58, N = 6)

MiPolys / Sec, More Is Better
             Proprietary                    MIT GPL
RTX A2000    2639.03 (SE +/- 2.93, N = 5)   2638.14 (SE +/- 3.45, N = 5)
RTX A4000    4676.21 (SE +/- 13.61, N = 6)  4669.86 (SE +/- 6.02, N = 6)

MiPolys / Sec Per Watt, More Is Better
             Proprietary  MIT GPL
RTX A2000    112.33       113.61
RTX A4000    128.73       126.45

GPU Power Consumption Monitor (Watts, Fewer Is Better; Min / Avg / Max)
             Proprietary             MIT GPL
RTX A2000    7.17 / 23.49 / 69.95    6.71 / 23.22 / 69.89
RTX A4000    13.6 / 36.32 / 125.2    14.01 / 36.93 / 124.95

GPU Temperature Monitor (Celsius, Fewer Is Better; Min / Avg / Max)
             Proprietary        MIT GPL
RTX A2000    51 / 54.81 / 64    51 / 54.62 / 64
RTX A4000    47 / 49.4 / 58     48 / 50.59 / 58
ParaView 5.10.1 - Test: Wavelet Contour - Resolution: 2560 x 1440
Frames / Sec, More Is Better
             Proprietary                  MIT GPL
RTX A2000    203.83 (SE +/- 0.11, N = 5)  204.16 (SE +/- 0.10, N = 5)
RTX A4000    370.36 (SE +/- 0.44, N = 5)  368.43 (SE +/- 0.22, N = 5)

MiPolys / Sec, More Is Better
             Proprietary                   MIT GPL
RTX A2000    2124.14 (SE +/- 1.11, N = 5)  2127.57 (SE +/- 1.04, N = 5)
RTX A4000    3859.58 (SE +/- 4.63, N = 5)  3839.49 (SE +/- 2.33, N = 5)

MiPolys / Sec Per Watt, More Is Better
             Proprietary  MIT GPL
RTX A2000    83.85        85.86
RTX A4000    100.85       101.19

GPU Power Consumption Monitor (Watts, Fewer Is Better; Min / Avg / Max)
             Proprietary              MIT GPL
RTX A2000    6.87 / 25.33 / 69.71     6.52 / 24.78 / 69.43
RTX A4000    13.38 / 38.27 / 124.72   13.77 / 37.94 / 123.93

GPU Temperature Monitor (Celsius, Fewer Is Better; Min / Avg / Max)
             Proprietary        MIT GPL
RTX A2000    49 / 53.72 / 63    49 / 53 / 62
RTX A4000    45 / 47.54 / 56    46 / 48.7 / 56
ParaView 5.10.1 - Test: Wavelet Volume - Resolution: 1920 x 1080
Frames / Sec, More Is Better
             Proprietary                  MIT GPL
RTX A2000    448.44 (SE +/- 2.91, N = 7)  448.76 (SE +/- 2.55, N = 7)
RTX A4000    701.98 (SE +/- 5.11, N = 7)  701.45 (SE +/- 3.18, N = 7)

MiVoxels / Sec, More Is Better
             Proprietary                     MIT GPL
RTX A2000    7175.02 (SE +/- 46.50, N = 7)   7180.14 (SE +/- 40.83, N = 7)
RTX A4000    11231.59 (SE +/- 81.78, N = 7)  11223.16 (SE +/- 50.88, N = 7)

MiVoxels / Sec Per Watt, More Is Better
             Proprietary  MIT GPL
RTX A2000    414.12       423.87
RTX A4000    404.77       399.35

GPU Power Consumption Monitor (Watts, Fewer Is Better; Min / Avg / Max)
             Proprietary             MIT GPL
RTX A2000    6.74 / 17.33 / 56.53    6.53 / 16.94 / 56.9
RTX A4000    13.16 / 27.75 / 73.82   13.52 / 28.1 / 74.36

GPU Temperature Monitor (Celsius, Fewer Is Better; Min / Avg / Max)
             Proprietary        MIT GPL
RTX A2000    47 / 50.02 / 58    47 / 49.47 / 57
RTX A4000    43 / 44.37 / 50    44 / 45.65 / 50
ParaView 5.10.1 - Test: Wavelet Volume - Resolution: 2560 x 1440
Frames / Sec, More Is Better
             Proprietary                  MIT GPL
RTX A2000    301.50 (SE +/- 0.59, N = 6)  301.44 (SE +/- 0.52, N = 6)
RTX A4000    513.10 (SE +/- 1.00, N = 7)  503.14 (SE +/- 3.22, N = 7)

MiVoxels / Sec, More Is Better
             Proprietary                    MIT GPL
RTX A2000    4823.93 (SE +/- 9.42, N = 6)   4823.03 (SE +/- 8.36, N = 6)
RTX A4000    8209.65 (SE +/- 16.02, N = 7)  8050.29 (SE +/- 51.48, N = 7)

MiVoxels / Sec Per Watt, More Is Better
             Proprietary  MIT GPL
RTX A2000    244.64       247.92
RTX A4000    268.85       263.73

GPU Power Consumption Monitor (Watts, Fewer Is Better; Min / Avg / Max)
             Proprietary             MIT GPL
RTX A2000    6.8 / 19.72 / 56.6      6.59 / 19.45 / 56.44
RTX A4000    13.07 / 30.54 / 93.96   13.44 / 30.52 / 92.07

GPU Temperature Monitor (Celsius, Fewer Is Better; Min / Avg / Max)
             Proprietary        MIT GPL
RTX A2000    46 / 48.97 / 57    45 / 48.21 / 56
RTX A4000    41 / 43.26 / 51    42 / 44.47 / 52
Phoronix Test Suite System Monitoring - GPU Power Consumption Monitor (Watts; Min / Avg / Max)
             Proprietary              MIT GPL
RTX A2000    4.55 / 56.48 / 70.06     4.29 / 56.02 / 70.02
RTX A4000    6.67 / 92.95 / 130.53    6.55 / 92.65 / 131.07

Phoronix Test Suite System Monitoring - GPU Temperature Monitor (Celsius; Min / Avg / Max)
             Proprietary        MIT GPL
RTX A2000    32 / 72.27 / 81    43 / 71.84 / 81
RTX A4000    31 / 68.43 / 80    29 / 68.65 / 80
Phoronix Test Suite v10.8.5