onnxruntime version #127
Dear Authors,

Thanks for the great work.

After installing "ryzen-ai-1.2.0-20240726.msi", I can run models on the NPU on the target platform. However, there are a few things I would like to verify.
(1) Which version of the Vitis AI onnxruntime.dll is provided in the MSI package?
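One way to check this from Python is to query the onnxruntime module that the Ryzen AI package installs; a minimal sketch, assuming that package's Python bindings are the ones on the path:

```python
import onnxruntime as ort

# Version string of whichever onnxruntime build Python resolved.
print(ort.__version__)

# Confirms which installation directory is actually loaded.
print(ort.__file__)

# Execution providers compiled into this build; the Ryzen AI package
# should list "VitisAIExecutionProvider" alongside the CPU provider.
print(ort.get_available_providers())
```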
(2) I downloaded the official onnxruntime release 1.8.0 and ran it with the CPUExecutionProvider. It is faster than the DLL from (1) in CPU mode (without launching the NPU). Is that expected?
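For a like-for-like CPU comparison, the same model, inputs, and warm-up should be used under each installation; a minimal timing sketch, where "model.onnx" is a placeholder and symbolic input dimensions are pinned to 1:

```python
import time
import numpy as np
import onnxruntime as ort

MODEL = "model.onnx"  # placeholder: use the same model file for both builds

sess = ort.InferenceSession(MODEL, providers=["CPUExecutionProvider"])
inp = sess.get_inputs()[0]
# Assumes a float32 input; replace symbolic dims with real sizes as needed.
x = np.random.rand(*[d if isinstance(d, int) else 1 for d in inp.shape]).astype(np.float32)

sess.run(None, {inp.name: x})  # warm-up run, excluded from timing
t0 = time.perf_counter()
for _ in range(100):
    sess.run(None, {inp.name: x})
print("mean latency:", (time.perf_counter() - t0) / 100, "s")
```

Running this same script once under each onnxruntime build keeps thread settings and measurement methodology identical across the two comparisons.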
(3) I tried a model containing an "Exp" operator. However, after quantizing it and running inference, that operator does not run in NPU mode. Is that expected?
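onnxruntime's built-in profiler records which execution provider ran each node, which can confirm whether an operator fell back to the CPU; a sketch, assuming the Vitis AI provider and a vaip_config.json from the Ryzen AI install (both paths here are placeholders):

```python
import json
import numpy as np
import onnxruntime as ort

so = ort.SessionOptions()
so.enable_profiling = True  # write a JSON trace of every executed node

sess = ort.InferenceSession(
    "quantized_model.onnx",  # placeholder path
    sess_options=so,
    providers=["VitisAIExecutionProvider", "CPUExecutionProvider"],
    provider_options=[{"config_file": "vaip_config.json"}, {}],
)

inp = sess.get_inputs()[0]
x = np.random.rand(*[d if isinstance(d, int) else 1 for d in inp.shape]).astype(np.float32)
sess.run(None, {inp.name: x})

trace = sess.end_profiling()  # returns the path of the JSON trace file
for ev in json.load(open(trace)):
    prov = ev.get("args", {}).get("provider")
    if prov:  # node-level events carry the provider that executed them
        print(ev["name"], prov)
```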
Thanks
Comments

Hi, the release you tried is too old. Also, I am not sure what you mean about trying one model with an "Exp" operator; what type of model is it? We cannot run a single operator on the NPU. Ryzen AI supports complete CNN models. Thanks

Thanks for replying!

With the current software, some operators can run on the CPU depending on the model. We expect more offloading to the NPU in future software versions.