GPU Not Being Utilized When Running !svc infer on Kaggle
Hi team,
I am testing the so-vits-svc-fork package in a Kaggle Notebook environment, and I noticed that when I run the following command:
!svc infer input.wav -o output.wav -m G_300.pth -c config.json
the model does not appear to utilize the GPU. Here are some additional details:
The command executes without errors, but there is no indication that the GPU is being used in the output logs.
I have confirmed that the GPU is available in my Kaggle environment (e.g., nvidia-smi detects the GPU correctly).
This behavior suggests that either the package isn't properly configured to use the GPU in the Kaggle environment, or additional settings may be required.
Could you please advise on how to ensure that the GPU is used during inference? If you need further details (e.g., full logs or environment configurations), I would be happy to provide them.
Thank you for your help!
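For reference, here is the quick sanity check I ran in a separate notebook cell to confirm that PyTorch itself can see the GPU (so-vits-svc-fork runs on PyTorch; `pick_device` below is just my own helper mirroring the usual device-selection idiom, not part of the package):

```python
def pick_device(cuda_available: bool) -> str:
    """Mirror the common PyTorch idiom: prefer CUDA when it is available."""
    return "cuda" if cuda_available else "cpu"

try:
    import torch
    available = torch.cuda.is_available()
    print("torch sees CUDA:", available)
    print("selected device:", pick_device(available))
except ImportError:
    # torch should be pulled in by so-vits-svc-fork; if not, that is the problem
    print("torch is not installed in this environment")
```

In my Kaggle session this cell reports that CUDA is available, which is why I would expect inference to run on the GPU.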
To Reproduce
Environment Setup:
Use a Kaggle Notebook with GPU enabled.
Ensure that the environment has the so-vits-svc-fork package installed via:
%pip install so-vits-svc-fork
Place an audio file (e.g., input.wav) in the working directory.
Ensure that the model file (G_300.pth) and configuration file (config.json) are present in the specified paths.
Run the Inference Command:
Execute the following command in a notebook cell:
!svc infer input.wav -o output.wav -m G_300.pth -c config.json
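Before running the command, I verified that all the input files were actually present in the working directory. A minimal sketch of that check (`missing_files` is my own helper, and the filenames are the ones from this report):

```python
from pathlib import Path

# Filenames as used in this report; adjust to your own paths.
required = ["input.wav", "G_300.pth", "config.json"]

def missing_files(names):
    """Return the subset of names that do not exist in the working directory."""
    return [n for n in names if not Path(n).exists()]

print("missing:", missing_files(required) or "none")
```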
Expected Behavior:
The model should utilize the GPU for processing, which can be verified by monitoring the GPU usage (e.g., using nvidia-smi).
Observed Behavior:
The command executes without error, but there is no indication that the GPU is being used during inference.
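To watch GPU usage while the command runs, I polled nvidia-smi from a separate cell. A sketch of that polling (`parse_utilization` is my own helper; the query flags are from nvidia-smi's documented `--query-gpu` interface):

```python
import shutil
import subprocess

def parse_utilization(csv_line: str) -> int:
    """Parse one line of `nvidia-smi --query-gpu=utilization.gpu --format=csv,noheader,nounits`."""
    return int(csv_line.strip())

if shutil.which("nvidia-smi"):
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    for i, line in enumerate(out.strip().splitlines()):
        print(f"GPU {i} utilization: {parse_utilization(line)}%")
else:
    print("nvidia-smi not found on PATH")
```

Throughout inference the reported utilization stays at 0%, which matches the observed behavior above.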
Additional context
No response
Version
4.2.26
Platform
Kaggle
Code of Conduct
I agree to follow this project's Code of Conduct.
No Duplicate
I have checked existing issues to avoid duplicates.