I trained a ThunderSVM model in Google Colab and tested it there, and it seemed to work quite well. However, I believe it does not use the GPU when running locally.
First, I cloned this repo with git, made a build directory, and ran the CMake command for Visual Studio 2017. The .sln file was generated; I opened it in VS 2017 and built it successfully. Then, before I can `import thundersvm` in Python, I have to call `os.add_dll_directory("C:\\Program Files\\NVIDIA GPU Computing Toolkit\\CUDA\\v11.1\\bin")`. This took me about a week's worth of work to figure out because this repo is fairly out of date (e.g. the README installation instructions still recommend Visual Studio 2015) and ThunderSVM seemingly needs out-of-date dependencies.
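For reference, the import workaround looks roughly like this (the CUDA path is the one on my machine; yours may differ):

```python
import os

# Workaround: make the CUDA runtime DLLs visible before importing thundersvm.
# Python 3.8+ on Windows no longer searches PATH for dependent DLLs, so the
# CUDA bin directory has to be added explicitly. Path is from my CUDA 11.1 install.
os.add_dll_directory("C:\\Program Files\\NVIDIA GPU Computing Toolkit\\CUDA\\v11.1\\bin")

import thundersvm  # fails with a DLL load error without the line above
```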
All in all, I'm able to import ThunderSVM and even load my ThunderSVM model. But when it comes to prediction, it does not use the GPU (which I verified with nvidia-smi). I have no idea why this would be, and I suspect it's an issue in the installation. Please help!
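For completeness, this is roughly how I load the model and run prediction. The model file name and test data below are placeholders; I'm using the scikit-style wrapper, which as far as I understand exposes `load_from_file` and a `gpu_id` parameter:

```python
from thundersvm import SVC
import numpy as np

# Placeholder test data; my real features come from elsewhere.
X_test = np.random.rand(1000, 20)

# gpu_id=0 should select the first GPU (the default), as I understand the wrapper.
model = SVC(gpu_id=0)
model.load_from_file("my_model.bin")  # placeholder file name

# While this runs, nvidia-smi shows no GPU utilization from the Python process.
predictions = model.predict(X_test)
```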