When using the 8-bit AdamW optimizer in q_galore_adamw8bit.py, the call to optimizer_update_8bit_blockwise() fails because absmax2 is required but never passed. It seems absmax2 is expected to be a tensor.
[rank0]: File "/root/Q-GaLore/q_galore_torch/q_galore_adamw8bit.py", line 200, in update_step
[rank0]: F.optimizer_update_8bit_blockwise(
[rank0]: TypeError: optimizer_update_8bit_blockwise() missing 1 required positional argument: 'absmax2'
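A likely cause is a bitsandbytes version mismatch: in newer releases, bitsandbytes.functional.optimizer_update_8bit_blockwise() takes separate absmax1/absmax2 blockwise quantization statistics, while the call site in update_step appears to have been written against an older signature. Below is a minimal sketch of how the call might look when both statistics are supplied; the local names (grad, p, state, group, lr) are assumptions about the surrounding Q-GaLore code, not its actual variables, and the exact argument list should be checked against the installed bitsandbytes version, since it differs between releases.

```python
import bitsandbytes.functional as F

# Sketch only (hypothetical locals): pass both blockwise absmax tensors that
# bitsandbytes' 8-bit optimizers keep in the per-parameter state dict.
F.optimizer_update_8bit_blockwise(
    "adam",
    grad,                 # gradient used for this update
    p,                    # parameter tensor
    state["state1"],      # quantized first moment
    state["state2"],      # quantized second moment
    group["betas"][0],
    group["betas"][1],
    group["eps"],
    state["step"],
    lr,
    state["qmap1"],       # quantization map for state1
    state["qmap2"],       # quantization map for state2
    state["absmax1"],     # blockwise absmax for state1
    state["absmax2"],     # blockwise absmax for state2 (the missing argument)
    group["weight_decay"],
    gnorm_scale=1.0,
)
```

Alternatively, pinning bitsandbytes to the version Q-GaLore was developed against may avoid the signature mismatch entirely.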