AutoTokenizer in Convert_vima notebook #10
Hi @ManishGovind, thanks for your interest in our work. We are using `transformers==4.37.2` and `tokenizers==0.15.1`, as suggested by the original LLaVA project (see `LLaRA/train-llava/pyproject.toml`, line 17 at commit `f60f59d`).
Could you help me confirm this? Thanks, Xiang
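For reference, the two pinned versions can be installed together, assuming a pip-based environment (this command is my paraphrase, not quoted from the thread):

```shell
# Pin the versions suggested by the LLaVA pyproject.toml
# (assumption: a standard pip environment, no conda-specific steps)
pip install "transformers==4.37.2" "tokenizers==0.15.1"
```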
Yes, I'm also using the same. `Name: transformers`
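To confirm which versions are actually installed (the same information `pip show` prints), a small stdlib-only check works in any environment; the helper name here is my own:

```python
from importlib.metadata import PackageNotFoundError, version


def installed_version(pkg: str):
    """Return the installed version string for pkg, or None if it is not installed."""
    try:
        return version(pkg)
    except PackageNotFoundError:
        return None


# Example: compare against the versions pinned by the project
for name, expected in [("transformers", "4.37.2"), ("tokenizers", "0.15.1")]:
    print(name, installed_version(name), "expected:", expected)
```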
I see, let me initialize a new environment and test it again. Thanks for bringing this to my attention.
Sure, no problem. Looking forward to your reply.
Hi @ManishGovind, I've reproduced the issue and found a temporary fix: you can bypass the current transformers package version requirement and upgrade to the latest release.
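A typical way to do this with pip (my assumption; the exact command from the original reply is not shown in this thread) is:

```shell
# Upgrade transformers past the pinned 4.37.2
# (assumed command, not verbatim from the maintainer's reply)
pip install --upgrade transformers
```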
After the upgrade, the tokenizer should work as expected. However, I haven't tested compatibility with other parts of the code, so if any issues arise, you may need to revert to version 4.37.2. I'll implement this quick fix for now and work on a proper solution. Thanks for your understanding. Best, Xiang
I will also try to upgrade the version and see if it works. Best,
Hi @LostXine, I re-trained LLaRA with the D-inBC-Aux-D-80K instruction data and wanted to reproduce the results, but I ended up with the results shown in the attached screenshot. May I know what the issue could be? Thanks,
Hi @ManishGovind, I could not find … Thanks,
So the first two rows are nothing but D-inBC+Aux-D. I used `D-inBC-text-multi-train-80k-front.json` (i.e., D-inBC + auxiliary tasks) to train. Do you want me to share my inference JSON?
Hello @LostXine,
When I try to generate the instruction-tuning BC data (step 4), I run into an error.
May I know what the transformers version should be? I followed the instructions in README.md to set up LLaRA. Thanks for your wonderful work!