Traceback (most recent call last):
  File "/Users/taozeyu/codes/test/tiny_llm/main.py", line 13, in <module>
    main()
  File "/Users/taozeyu/codes/test/tiny_llm/main.py", line 4, in main
    tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/taozeyu/codes/test/tiny_llm/.venv/lib/python3.12/site-packages/transformers/models/auto/tokenization_auto.py", line 738, in from_pretrained
    return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/taozeyu/codes/test/tiny_llm/.venv/lib/python3.12/site-packages/transformers/tokenization_utils_base.py", line 2017, in from_pretrained
    return cls._from_pretrained(
           ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/taozeyu/codes/test/tiny_llm/.venv/lib/python3.12/site-packages/transformers/tokenization_utils_base.py", line 2249, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/taozeyu/.cache/huggingface/modules/transformers_modules/THUDM/chatglm-6b/bf0f5cfb575eebebf9b655c5861177acfee03f16/tokenization_chatglm.py", line 196, in __init__
    super().__init__(
  File "/Users/taozeyu/codes/test/tiny_llm/.venv/lib/python3.12/site-packages/transformers/tokenization_utils.py", line 367, in __init__
    self._add_tokens(
  File "/Users/taozeyu/codes/test/tiny_llm/.venv/lib/python3.12/site-packages/transformers/tokenization_utils.py", line 467, in _add_tokens
    current_vocab = self.get_vocab().copy()
                    ^^^^^^^^^^^^^^^^
  File "/Users/taozeyu/.cache/huggingface/modules/transformers_modules/THUDM/chatglm-6b/bf0f5cfb575eebebf9b655c5861177acfee03f16/tokenization_chatglm.py", line 248, in get_vocab
    vocab = {self._convert_id_to_token(i): i for i in range(self.vocab_size)}
                                                            ^^^^^^^^^^^^^^^
  File "/Users/taozeyu/.cache/huggingface/modules/transformers_modules/THUDM/chatglm-6b/bf0f5cfb575eebebf9b655c5861177acfee03f16/tokenization_chatglm.py", line 244, in vocab_size
    return self.sp_tokenizer.num_tokens
           ^^^^^^^^^^^^^^^^^
AttributeError: 'ChatGLMTokenizer' object has no attribute 'sp_tokenizer'. Did you mean: '_tokenize'?
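The traceback shows an initialization-order problem: the newer transformers base `__init__` calls `self._add_tokens`, which invokes the overridden `get_vocab` before the custom `ChatGLMTokenizer.__init__` has assigned `sp_tokenizer`. A minimal, library-free sketch of the same failure mode (class and attribute names here mirror the traceback but are otherwise illustrative, not actual transformers code):

```python
class BaseTokenizer:
    def __init__(self):
        # The newer base class calls an overridable hook during __init__ ...
        self.get_vocab()

    def get_vocab(self):
        return {}


class ChildTokenizer(BaseTokenizer):
    def __init__(self):
        # ... so this super().__init__() call dispatches to the override
        # below before sp_tokenizer has been set.
        super().__init__()
        self.sp_tokenizer = object()

    def get_vocab(self):
        # Relies on an attribute that only exists after super().__init__()
        # returns -- hence the AttributeError.
        return {"size": self.sp_tokenizer}


try:
    ChildTokenizer()
except AttributeError as e:
    print(e)  # → 'ChildTokenizer' object has no attribute 'sp_tokenizer'
```

The fix on the tokenizer side is to set the attribute before calling `super().__init__()`, which is what later revisions of remote tokenizer code generally do.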
Moskize91 changed the title on Feb 11, 2025 from "[BUG/Help] <title> Running the example code from the README with transformers==4.34.1 on macOS throws an error" to "Running the example code from the README with transformers==4.34.1 on macOS throws an error".
Is there an existing issue for this?
Current Behavior
Running the example code fails immediately with the error shown above.
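Until the remote tokenizer code is updated for newer transformers releases, a commonly reported workaround is pinning transformers to the version the ChatGLM-6B README originally targeted. A sketch of the `requirements.txt` pin (the exact version is an assumption based on the model card's historical recommendation, so verify it there):

```
transformers==4.27.1
```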
Expected Behavior
No response
Steps To Reproduce
Contents of requirements.txt
Contents of main.py
Environment
Anything else?
No response