newa
Intel Core i7-1185G7 testing with a Dell XPS 13 9310 0DXP1F (3.7.0 BIOS) and Intel Xe TGL GT2 8GB on Ubuntu 24.04 via the Phoronix Test Suite.
HTML result view exported from: https://openbenchmarking.org/result/2408221-NE-NEWA6453334&sor.
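The result ID in the URL above can be fed back into the Phoronix Test Suite to re-run the same test selection locally and compare against these numbers. A minimal sketch in Python, assuming phoronix-test-suite is installed and on the PATH; the result ID is taken from the URL above:

    import subprocess

    # OpenBenchmarking.org result ID from the URL above.
    RESULT_ID = "2408221-NE-NEWA6453334"

    # "phoronix-test-suite benchmark <result-id>" re-runs the same test
    # selection locally and offers to merge the new run with this result
    # for a side-by-side comparison.
    subprocess.run(["phoronix-test-suite", "benchmark", RESULT_ID], check=True)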
Etcpak
Benchmark: Multi-Threaded - Configuration: ETC2
simdjson
Throughput Test: Kostya
simdjson
Throughput Test: TopTweet
simdjson
Throughput Test: LargeRandom
simdjson
Throughput Test: PartialTweets
simdjson
Throughput Test: DistinctUserID
SVT-AV1
Encoder Mode: Preset 3 - Input: Bosphorus 4K
SVT-AV1
Encoder Mode: Preset 5 - Input: Bosphorus 4K
SVT-AV1
Encoder Mode: Preset 8 - Input: Bosphorus 4K
SVT-AV1
Encoder Mode: Preset 13 - Input: Bosphorus 4K
SVT-AV1
Encoder Mode: Preset 3 - Input: Bosphorus 1080p
SVT-AV1
Encoder Mode: Preset 5 - Input: Bosphorus 1080p
SVT-AV1
Encoder Mode: Preset 8 - Input: Bosphorus 1080p
SVT-AV1
Encoder Mode: Preset 13 - Input: Bosphorus 1080p
SVT-AV1
Encoder Mode: Preset 3 - Input: Beauty 4K 10-bit
SVT-AV1
Encoder Mode: Preset 5 - Input: Beauty 4K 10-bit
SVT-AV1
Encoder Mode: Preset 8 - Input: Beauty 4K 10-bit
SVT-AV1
Encoder Mode: Preset 13 - Input: Beauty 4K 10-bit
Build2
Time To Compile
Y-Cruncher
Pi Digits To Calculate: 1B
Y-Cruncher
Pi Digits To Calculate: 500M
Mobile Neural Network
Model: nasnet
Mobile Neural Network
Model: mobilenetV3
Mobile Neural Network
Model: squeezenetv1.1
Mobile Neural Network
Model: resnet-v2-50
Mobile Neural Network
Model: SqueezeNetV1.0
Mobile Neural Network
Model: MobileNetV2_224
Mobile Neural Network
Model: mobilenet-v1-1.0
Mobile Neural Network
Model: inception-v3
XNNPACK
Model: FP32MobileNetV2
XNNPACK
Model: FP32MobileNetV3Large
XNNPACK
Model: FP32MobileNetV3Small
XNNPACK
Model: FP16MobileNetV2
XNNPACK
Model: FP16MobileNetV3Large
XNNPACK
Model: FP16MobileNetV3Small
XNNPACK
Model: QU8MobileNetV2
XNNPACK
Model: QU8MobileNetV3Large
XNNPACK
Model: QU8MobileNetV3Small
ONNX Runtime
Model: GPT-2 - Device: CPU - Executor: Parallel
ONNX Runtime
Model: GPT-2 - Device: CPU - Executor: Parallel
ONNX Runtime
Model: GPT-2 - Device: CPU - Executor: Standard
ONNX Runtime
Model: GPT-2 - Device: CPU - Executor: Standard
ONNX Runtime
Model: yolov4 - Device: CPU - Executor: Parallel
ONNX Runtime
Model: yolov4 - Device: CPU - Executor: Parallel
ONNX Runtime
Model: yolov4 - Device: CPU - Executor: Standard
ONNX Runtime
Model: yolov4 - Device: CPU - Executor: Standard
ONNX Runtime
Model: ZFNet-512 - Device: CPU - Executor: Parallel
ONNX Runtime
Model: ZFNet-512 - Device: CPU - Executor: Parallel
ONNX Runtime
Model: ZFNet-512 - Device: CPU - Executor: Standard
ONNX Runtime
Model: ZFNet-512 - Device: CPU - Executor: Standard
ONNX Runtime
Model: T5 Encoder - Device: CPU - Executor: Parallel
ONNX Runtime
Model: T5 Encoder - Device: CPU - Executor: Parallel
ONNX Runtime
Model: T5 Encoder - Device: CPU - Executor: Standard
ONNX Runtime
Model: T5 Encoder - Device: CPU - Executor: Standard
ONNX Runtime
Model: bertsquad-12 - Device: CPU - Executor: Parallel
ONNX Runtime
Model: bertsquad-12 - Device: CPU - Executor: Parallel
ONNX Runtime
Model: bertsquad-12 - Device: CPU - Executor: Standard
ONNX Runtime
Model: bertsquad-12 - Device: CPU - Executor: Standard
ONNX Runtime
Model: CaffeNet 12-int8 - Device: CPU - Executor: Parallel
ONNX Runtime
Model: CaffeNet 12-int8 - Device: CPU - Executor: Parallel
ONNX Runtime
Model: CaffeNet 12-int8 - Device: CPU - Executor: Standard
ONNX Runtime
Model: CaffeNet 12-int8 - Device: CPU - Executor: Standard
ONNX Runtime
Model: fcn-resnet101-11 - Device: CPU - Executor: Parallel
ONNX Runtime
Model: fcn-resnet101-11 - Device: CPU - Executor: Parallel
ONNX Runtime
Model: fcn-resnet101-11 - Device: CPU - Executor: Standard
ONNX Runtime
Model: fcn-resnet101-11 - Device: CPU - Executor: Standard
ONNX Runtime
Model: ArcFace ResNet-100 - Device: CPU - Executor: Parallel
ONNX Runtime
Model: ArcFace ResNet-100 - Device: CPU - Executor: Parallel
ONNX Runtime
Model: ArcFace ResNet-100 - Device: CPU - Executor: Standard
ONNX Runtime
Model: ArcFace ResNet-100 - Device: CPU - Executor: Standard
ONNX Runtime
Model: ResNet50 v1-12-int8 - Device: CPU - Executor: Parallel
ONNX Runtime
Model: ResNet50 v1-12-int8 - Device: CPU - Executor: Parallel
ONNX Runtime
Model: ResNet50 v1-12-int8 - Device: CPU - Executor: Standard
ONNX Runtime
Model: ResNet50 v1-12-int8 - Device: CPU - Executor: Standard
ONNX Runtime
Model: super-resolution-10 - Device: CPU - Executor: Parallel
ONNX Runtime
Model: super-resolution-10 - Device: CPU - Executor: Parallel
ONNX Runtime
Model: super-resolution-10 - Device: CPU - Executor: Standard
ONNX Runtime
Model: super-resolution-10 - Device: CPU - Executor: Standard
ONNX Runtime
Model: ResNet101_DUC_HDC-12 - Device: CPU - Executor: Parallel
ONNX Runtime
Model: ResNet101_DUC_HDC-12 - Device: CPU - Executor: Parallel
ONNX Runtime
Model: ResNet101_DUC_HDC-12 - Device: CPU - Executor: Standard
ONNX Runtime
Model: ResNet101_DUC_HDC-12 - Device: CPU - Executor: Standard
ONNX Runtime
Model: Faster R-CNN R-50-FPN-int8 - Device: CPU - Executor: Parallel
ONNX Runtime
Model: Faster R-CNN R-50-FPN-int8 - Device: CPU - Executor: Parallel
ONNX Runtime
Model: Faster R-CNN R-50-FPN-int8 - Device: CPU - Executor: Standard
ONNX Runtime
Model: Faster R-CNN R-50-FPN-int8 - Device: CPU - Executor: Standard
Phoronix Test Suite v10.8.5