
RuntimeError: GPU is required to run AWQ quantized model. You can use IPEX version AWQ if you have an Intel CPU #6662

Open

Dcode1999 opened this issue Jan 13, 2025 · 1 comment

Labels: bug (Something isn't working)

Dcode1999 commented Jan 13, 2025

Describe the bug

After installing AWQ, I get the runtime error below when loading an AWQ-quantized model.

Is there an existing issue for this?

  • I have searched the existing issues

Reproduction

  1. Load the model TheBloke_Wizard-Vicuna-7B-Uncensored-AWQ
  2. The load fails with the error in the logs below

Screenshot

(screenshot attached: Screenshot 2025-01-14 005231)

Logs

00:39:08-649538 ERROR    Failed to load the model.
Traceback (most recent call last):
  File "F:\AI\G\text-generation-webui-main\modules\ui_model_menu.py", line 214, in load_model_wrapper
    shared.model, shared.tokenizer = load_model(selected_model, loader)
                                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "F:\AI\G\text-generation-webui-main\modules\models.py", line 90, in load_model
    output = load_func_map[loader](model_name)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "F:\AI\G\text-generation-webui-main\modules\models.py", line 183, in huggingface_loader
    model = LoaderClass.from_pretrained(path_to_model, **params)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "F:\AI\G\text-generation-webui-main\installer_files\env\Lib\site-packages\transformers\models\auto\auto_factory.py", line 564, in from_pretrained
    return model_class.from_pretrained(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "F:\AI\G\text-generation-webui-main\installer_files\env\Lib\site-packages\transformers\modeling_utils.py", line 3669, in from_pretrained
    hf_quantizer.validate_environment(
  File "F:\AI\G\text-generation-webui-main\installer_files\env\Lib\site-packages\transformers\quantizers\quantizer_awq.py", line 71, in validate_environment
    raise RuntimeError(
RuntimeError: GPU is required to run AWQ quantized model. You can use IPEX version AWQ if you have an Intel CPU

System Info

OS - Windows 11 Home
RAM - 16 GB
GPU - NVIDIA RTX 2060 (6 GB VRAM)
CPU - Intel i7-9750H
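For context, this RuntimeError is raised by transformers' AWQ quantizer when PyTorch reports no CUDA device; on a machine with an RTX 2060 that usually means a CPU-only PyTorch build ended up in the webui's environment. A minimal diagnostic sketch of that check (the helper name cuda_available is hypothetical, not part of the webui or transformers):

```python
import importlib.util


def cuda_available() -> bool:
    """Return True only if torch is importable and sees a CUDA device.

    Mirrors the condition that transformers' AWQ quantizer effectively
    tests before loading an AWQ model: torch.cuda.is_available().
    """
    if importlib.util.find_spec("torch") is None:
        return False  # torch is not installed in this environment
    import torch
    return torch.cuda.is_available()


if __name__ == "__main__":
    # Run this inside installer_files\env to see what the webui sees.
    print("CUDA visible to torch:", cuda_available())
```

If this prints False despite an NVIDIA GPU being present, reinstalling the CUDA build of PyTorch in that environment is the usual fix.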
Dcode1999 added the bug label Jan 13, 2025

Dcode1999 (Author) commented:
Please @oobabooga help me!
