Make lazy mode autodetection more robust #921
base: habana_main
Conversation
is_lazy = htorch.utils.internal.is_lazy{}
if is_lazy:
    torch._dynamo.config.disable = True
    env_update_dict['PT_HPU_ENABLE_LAZY_COLLECTIVES'] = 'true'
Do we really need to clean this up? I'm assuming PT_HPU_ENABLE_LAZY_COLLECTIVES only affects lazy mode and is ignored in other cases. If we skip the cleanup, we could end up with something as simple as:
os.environ['PT_HPU_ENABLE_LAZY_COLLECTIVES'] = 'true'
import habana_frameworks.torch as htorch
if htorch.utils.internal.is_lazy():
    torch._dynamo.config.disable = True
vllm/plugins/__init__.py
Outdated
is_lazy = lazy_mode_env_var == '1'
if lazy_mode_env_var is None:
    import habana_frameworks.torch as htorch
    is_lazy = htorch.utils.internal.is_lazy{}
What is this syntax? Is it a typo?
Yeah, I should probably get a new pair of glasses.
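
For reference, a minimal sketch of what the fixed autodetection presumably looks like once the accidental '{}' becomes a call. The hunk doesn't show which environment variable lazy_mode_env_var is read from, so PT_HPU_LAZY_MODE below is an assumption for illustration only.

import os

# Assumed env var name; the diff hunk doesn't show the actual read.
lazy_mode_env_var = os.environ.get('PT_HPU_LAZY_MODE')

is_lazy = lazy_mode_env_var == '1'
if lazy_mode_env_var is None:
    # Env var unset: fall back to htorch's own lazy-mode detection.
    import habana_frameworks.torch as htorch
    is_lazy = htorch.utils.internal.is_lazy()  # call, not the accidental '{}'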
No description provided.