Fix llama_v2_7b_16h for torch.jit.trace #2121

Open · wants to merge 1 commit into base: main
Commits on Jan 18, 2024

  1. Fix llama_v2_7b_16h for torch.jit.trace

    Original error: Attention using SDPA can not be traced with torch.jit.trace
    when no attention_mask is provided. To solve this issue, please either load
    your model with the argument attn_implementation="eager" or
    pass an attention_mask input when tracing the model.
    Thiago Crepaldi committed Jan 18, 2024 · commit b871bde
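
A minimal sketch of the workaround the error message describes, assuming the benchmark loads the checkpoint through Hugging Face transformers (the model name and the exact loading path inside torchbenchmark are assumptions, not taken from the commit diff):

```python
# Sketch only: assumes the model is built via transformers' AutoModelForCausalLM;
# the actual llama_v2_7b_16h loading code in torchbenchmark may differ.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "meta-llama/Llama-2-7b-hf"  # assumed checkpoint name

# Option 1: load with the eager attention implementation so torch.jit.trace
# does not hit the SDPA "no attention_mask" tracing restriction.
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    attn_implementation="eager",
    torchscript=True,  # make the model return tuples, which trace can handle
    torch_dtype=torch.float16,
)
model.eval()

tokenizer = AutoTokenizer.from_pretrained(model_name)
inputs = tokenizer("Hello, world", return_tensors="pt")

# Option 2 (can be combined with SDPA instead of eager attention): pass an
# explicit attention_mask as a trace input so the mask path is exercised.
with torch.no_grad():
    traced = torch.jit.trace(
        model,
        (inputs["input_ids"], inputs["attention_mask"]),
    )
```

Either change alone is enough to satisfy the error message quoted in the commit: switch to `attn_implementation="eager"`, or keep SDPA and always provide an `attention_mask` when tracing.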