Possibilities of support Pascal #66
Comments
https://github.com/sasha0552/pascal-pkgs-ci/releases
I tried to run this and this happened: sageattn_cogvideo.py
It got int8 but was expecting floats, is my guess. The P40 natively supports int8; the P100 doesn't at all. If you are loading a model quantized with FP8 in PyTorch, try a GGUF instead and it's more likely to work.
So maybe just load an FP16 model or a GGUF. A suggestion would be nice. PS: you are right, this is a P40.
Since Pascal, except the P100, supports FP32 and INT8 via DP4A, I was wondering if SageAttention is usable via DP4A alone.
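A quick way to see why the P100 is the odd one out: the DP4A instruction (the `__dp4a` int8 dot-product intrinsic) requires compute capability 6.1 or newer, and the P100 is sm_60 while the P40 and consumer Pascal cards are sm_61. A minimal sketch of that check (the `PASCAL_CC` table and `supports_dp4a` helper are hypothetical names for illustration, not part of any library):

```python
# Hypothetical lookup of compute capabilities for common Pascal GPUs.
# On a live system you could get the same tuple from
# torch.cuda.get_device_capability() instead.
PASCAL_CC = {
    "P100": (6, 0),       # sm_60: no DP4A
    "P40": (6, 1),        # sm_61: has DP4A
    "GTX 1080 Ti": (6, 1),
}

def supports_dp4a(capability):
    """DP4A (int8 dot product) is available on sm_61 and newer."""
    return capability >= (6, 1)

for name, cc in PASCAL_CC.items():
    print(f"{name}: DP4A={'yes' if supports_dp4a(cc) else 'no'}")
```

This matches the comment above: every Pascal card except the P100 can take an int8 DP4A path, which is why an int8 kernel path is plausible on the P40 but not on the P100.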