ONNX Runtime 1.17.0
pts/onnx-1.17.0
- 02 February 2024 -
Update against ONNX Runtime 1.17, update download links.
downloads.xml
<?xml version="1.0"?>
<!--Phoronix Test Suite v10.8.4-->
<PhoronixTestSuite>
  <Downloads>
    <Package>
      <URL>https://github.com/onnx/models/raw/4c46cd00fbdb7cd30b6c1c17ab54f2e1f4f7b177/validated/vision/object_detection_segmentation/yolov4/model/yolov4.tar.gz</URL>
      <FileName>yolov4.tar.gz</FileName>
      <FileSize>248665735</FileSize>
      <Optional>TRUE</Optional>
    </Package>
    <Package>
      <URL>https://github.com/onnx/models/raw/4c46cd00fbdb7cd30b6c1c17ab54f2e1f4f7b177/validated/vision/object_detection_segmentation/fcn/model/fcn-resnet101-11.tar.gz</URL>
      <FileName>fcn-resnet101-11.tar.gz</FileName>
      <FileSize>294729465</FileSize>
      <Optional>TRUE</Optional>
    </Package>
    <Package>
      <URL>https://github.com/onnx/models/raw/4c46cd00fbdb7cd30b6c1c17ab54f2e1f4f7b177/validated/vision/super_resolution/sub_pixel_cnn_2016/model/super-resolution-10.tar.gz</URL>
      <FileName>super-resolution-10.tar.gz</FileName>
      <FileSize>1815286</FileSize>
      <Optional>TRUE</Optional>
    </Package>
    <Package>
      <URL>https://github.com/onnx/models/blob/4c46cd00fbdb7cd30b6c1c17ab54f2e1f4f7b177/validated/text/machine_comprehension/bert-squad/model/bertsquad-12.tar.gz</URL>
      <FileName>bertsquad-12.tar.gz</FileName>
      <FileSize>403082198</FileSize>
      <Optional>TRUE</Optional>
    </Package>
    <Package>
      <URL>https://github.com/onnx/models/blob/4c46cd00fbdb7cd30b6c1c17ab54f2e1f4f7b177/validated/text/machine_comprehension/gpt-2/model/gpt2-10.tar.gz</URL>
      <FileName>gpt2-10.tar.gz</FileName>
      <FileSize>463131530</FileSize>
      <Optional>TRUE</Optional>
    </Package>
    <Package>
      <URL>https://github.com/onnx/models/blob/4c46cd00fbdb7cd30b6c1c17ab54f2e1f4f7b177/validated/vision/body_analysis/arcface/model/arcfaceresnet100-8.tar.gz</URL>
      <FileName>arcfaceresnet100-8.tar.gz</FileName>
      <FileSize>237564370</FileSize>
      <Optional>TRUE</Optional>
    </Package>
    <Package>
      <URL>https://github.com/onnx/models/raw/4c46cd00fbdb7cd30b6c1c17ab54f2e1f4f7b177/validated/vision/classification/resnet/model/resnet50-v1-12-int8.tar.gz</URL>
      <FileName>resnet50-v1-12-int8.tar.gz</FileName>
      <FileSize>22355322</FileSize>
      <Optional>TRUE</Optional>
    </Package>
    <Package>
      <URL>https://github.com/onnx/models/raw/4c46cd00fbdb7cd30b6c1c17ab54f2e1f4f7b177/validated/vision/classification/caffenet/model/caffenet-12-int8.tar.gz</URL>
      <FileName>caffenet-12-int8.tar.gz</FileName>
      <FileSize>40718510</FileSize>
      <Optional>TRUE</Optional>
    </Package>
    <Package>
      <URL>https://github.com/onnx/models/blob/4c46cd00fbdb7cd30b6c1c17ab54f2e1f4f7b177/validated/vision/object_detection_segmentation/faster-rcnn/model/FasterRCNN-12-int8.tar.gz</URL>
      <FileName>FasterRCNN-12-int8.tar.gz</FileName>
      <FileSize>38019008</FileSize>
      <Optional>TRUE</Optional>
    </Package>
    <Package>
      <URL>https://github.com/onnx/models/raw/4c46cd00fbdb7cd30b6c1c17ab54f2e1f4f7b177/validated/text/machine_comprehension/t5/model/t5-encoder-12.tar.gz</URL>
      <FileName>t5-encoder-12.tar.gz</FileName>
      <FileSize>194535656</FileSize>
      <Optional>TRUE</Optional>
    </Package>
  </Downloads>
</PhoronixTestSuite>
install.sh
#!/bin/bash
rm -rf onnxruntime
git clone https://github.com/microsoft/onnxruntime
cd onnxruntime
git checkout v1.17.0
export CFLAGS="-O3 -march=native $CFLAGS -Wno-error=overloaded-virtual"
export CXXFLAGS="-O3 -march=native $CXXFLAGS -Wno-error=overloaded-virtual"
./build.sh --config Release --build_shared_lib --parallel --skip_tests --enable_lto --cmake_extra_defines onnxruntime_BUILD_FOR_NATIVE_MACHINE=ON
echo $? > ~/install-exit-status
cd ~
tar -xf yolov4.tar.gz
tar -xf fcn-resnet101-11.tar.gz
tar -xf super-resolution-10.tar.gz
tar -xf bertsquad-12.tar.gz
tar -xf gpt2-10.tar.gz
tar -xf arcfaceresnet100-8.tar.gz
tar -xf resnet50-v1-12-int8.tar.gz
tar -xf caffenet-12-int8.tar.gz
tar -xf FasterRCNN-12-int8.tar.gz
tar -xf t5-encoder-12.tar.gz
echo "#!/bin/bash
./onnxruntime/build/Linux/Release/onnxruntime_perf_test \$@ > \$LOG_FILE 2>&1
echo \$? > ~/test-exit-status" > onnx
chmod +x onnx
results-definition.xml
<?xml version="1.0"?>
<!--Phoronix Test Suite v10.8.4-->
<PhoronixTestSuite>
  <ResultsParser>
    <OutputTemplate>Number of inferences per second: #_RESULT_#</OutputTemplate>
  </ResultsParser>
  <ResultsParser>
    <OutputTemplate>Average inference time cost: #_RESULT_# ms</OutputTemplate>
    <ResultScale>Inference Time Cost (ms)</ResultScale>
    <ResultProportion>LIB</ResultProportion>
    <Importance>Secondary</Importance>
  </ResultsParser>
</PhoronixTestSuite>
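Each `OutputTemplate` above is a literal output line with `#_RESULT_#` marking where the number to extract appears. A minimal Python sketch of that template matching — the sample line is illustrative, not captured from a real run:

```python
import re

# Sample line in the style emitted by onnxruntime_perf_test (illustrative)
sample = "Average inference time cost: 12.345 ms"

# Translate the Phoronix OutputTemplate into a regex: escape the literal
# text, then turn the #_RESULT_# placeholder into a capture group.
template = "Average inference time cost: #_RESULT_# ms"
pattern = re.escape(template).replace(re.escape("#_RESULT_#"), r"([\d.]+)")

match = re.search(pattern, sample)
result = float(match.group(1)) if match else None
print(result)  # 12.345
```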
test-definition.xml
<?xml version="1.0"?>
<!--Phoronix Test Suite v10.8.4-->
<PhoronixTestSuite>
  <TestInformation>
    <Title>ONNX Runtime</Title>
    <AppVersion>1.17</AppVersion>
    <Description>ONNX Runtime is developed by Microsoft and partners as an open-source, cross-platform, high-performance machine learning inferencing and training accelerator. This test profile runs the ONNX Runtime with various models available from the ONNX Model Zoo.</Description>
    <ResultScale>Inferences Per Second</ResultScale>
    <Proportion>HIB</Proportion>
    <TimesToRun>3</TimesToRun>
  </TestInformation>
  <TestProfile>
    <Version>1.17.0</Version>
    <SupportedPlatforms>Linux</SupportedPlatforms>
    <SoftwareType>Utility</SoftwareType>
    <TestType>System</TestType>
    <License>Free</License>
    <Status>Verified</Status>
    <ExternalDependencies>python, git, build-utilities, cmake</ExternalDependencies>
    <InstallRequiresInternet>TRUE</InstallRequiresInternet>
    <EnvironmentSize>4900</EnvironmentSize>
    <ProjectURL>https://www.onnxruntime.ai/</ProjectURL>
    <RepositoryURL>https://github.com/microsoft/onnxruntime</RepositoryURL>
    <InternalTags>SMP</InternalTags>
    <Maintainer>Michael Larabel</Maintainer>
    <SystemDependencies>gmock/gmock.h</SystemDependencies>
  </TestProfile>
  <TestSettings>
    <Default>
      <PostArguments>-t 60 </PostArguments>
    </Default>
    <Option>
      <DisplayName>Model</DisplayName>
      <Identifier>model</Identifier>
      <Menu>
        <Entry>
          <Name>yolov4</Name>
          <Value>yolov4/yolov4.onnx</Value>
        </Entry>
        <Entry>
          <Name>fcn-resnet101-11</Name>
          <Value>fcn-resnet101-11/model.onnx</Value>
        </Entry>
        <Entry>
          <Name>super-resolution-10</Name>
          <Value>super_resolution/super_resolution.onnx</Value>
        </Entry>
        <Entry>
          <Name>bertsquad-12</Name>
          <Value>bertsquad-12/bertsquad-12.onnx</Value>
        </Entry>
        <Entry>
          <Name>GPT-2</Name>
          <Value>GPT2/model.onnx</Value>
        </Entry>
        <Entry>
          <Name>ArcFace ResNet-100</Name>
          <Value>resnet100/resnet100.onnx</Value>
        </Entry>
        <Entry>
          <Name>ResNet50 v1-12-int8</Name>
          <Value>resnet50-v1-12-int8/resnet50-v1-12-int8.onnx</Value>
        </Entry>
        <Entry>
          <Name>CaffeNet 12-int8</Name>
          <Value>caffenet-12-int8/caffenet-12-int8.onnx</Value>
        </Entry>
        <Entry>
          <Name>Faster R-CNN R-50-FPN-int8</Name>
          <Value>FasterRCNN-12-int8/FasterRCNN-12-int8.onnx</Value>
        </Entry>
        <Entry>
          <Name>T5 Encoder</Name>
          <Value>t5-encoder/t5-encoder.onnx</Value>
        </Entry>
      </Menu>
    </Option>
    <Option>
      <DisplayName>Device</DisplayName>
      <Identifier>device</Identifier>
      <ArgumentPrefix>-e </ArgumentPrefix>
      <Menu>
        <Entry>
          <Name>CPU</Name>
          <Value>cpu</Value>
        </Entry>
      </Menu>
    </Option>
    <Option>
      <DisplayName>Executor</DisplayName>
      <Identifier>executor</Identifier>
      <ArgumentPrefix> </ArgumentPrefix>
      <Menu>
        <Entry>
          <Name>Standard</Name>
        </Entry>
        <Entry>
          <Name>Parallel</Name>
          <Value>-P</Value>
        </Entry>
      </Menu>
    </Option>
  </TestSettings>
</PhoronixTestSuite>
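The option values above are concatenated into the argument string handed to `onnxruntime_perf_test` (through the `onnx` wrapper written by install.sh), together with the `-t 60` post-argument. A rough Python sketch of which flags each selection contributes — the exact concatenation order is a Phoronix Test Suite implementation detail, so treat the ordering here as illustrative:

```python
# Flags contributed by each TestSettings selection (per the definition above).
model = "yolov4/yolov4.onnx"    # Model option: bare value, no ArgumentPrefix
device = "-e cpu"               # Device option: ArgumentPrefix "-e " + value
executor = ""                   # "Standard" carries no value; "Parallel" would add "-P"
post_args = "-t 60"             # <PostArguments> applied to every run

# Join the non-empty pieces into one argument string (ordering illustrative).
cmd = " ".join(p for p in [model, device, executor, post_args] if p)
print(cmd)  # yolov4/yolov4.onnx -e cpu -t 60
```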