When using llama_cpp.Llama with seed=-1, the generated output remains identical across multiple runs, even though -1 is expected to pick a fresh random seed each time. Even after modifying the sampling parameters (temperature, top_k, top_p) and restarting the script, the model continues to produce the same structured content.
Steps to Reproduce:
Load a GGUF model using llama_cpp.Llama with seed=-1.
Use Outlines’ generate.json() with a structured schema.
Run the script multiple times and compare outputs.
Modify the sampling settings (e.g., temperature=1.2, top_k=80, top_p=0.7) and observe little to no change in the output content.
Even after restarting the script or system, the issue persists.
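The comparison in the steps above can be made objective by hashing each run's output and comparing the digests; a stdlib-only sketch (the sample strings below are placeholders standing in for two runs of generate.json()):

```python
import hashlib

def digest(text: str) -> str:
    """Stable fingerprint of a generation, so runs can be compared exactly."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

# Placeholder outputs standing in for two separate runs:
run_a = '{"question": "Nowadays, a big change is taking place..."}'
run_b = '{"question": "Nowadays, a big change is taking place..."}'

# Identical digests across runs indicate the sampler is effectively
# seeded the same way each time, which is the behavior reported here.
print(digest(run_a) == digest(run_b))  # True for identical outputs
```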
Expected Behavior:
Each run should produce unique exam content when using seed=-1, assuming it enables true randomness.
Observed Behavior:
The generated output remains unchanged across runs, with only minor formatting differences (e.g., whitespace variations).
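Since the only differences observed are whitespace variations, a whitespace-normalizing comparison confirms the runs are otherwise identical; a small sketch (the sample strings are placeholders):

```python
def normalized(text: str) -> str:
    """Collapse all runs of whitespace so formatting noise doesn't mask
    the fact that the underlying content is the same."""
    return " ".join(text.split())

# Two outputs that differ only in line breaks and spacing:
run_1 = "Nowadays, a big change is taking place\nin the way we write."
run_2 = "Nowadays,  a big change is taking place in the way we write."

print(normalized(run_1) == normalized(run_2))  # True: same content, different whitespace
```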
Possible Workarounds Attempted (Without Success):
Setting seed=random.randint(0, 2**32 - 1) explicitly.
Varying top_k, top_p, and temperature.
Here's the code:
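For context, a minimal sketch of the setup described above; the model path, schema, and the Outlines wrapper call are assumptions rather than the original code, and a 32-bit seed is drawn explicitly per run because seed=-1 did not vary the output in this report:

```python
import random

def fresh_seed() -> int:
    """Draw a new 32-bit seed for every run, since seed=-1 did not
    produce varying output across runs in the report above."""
    return random.randint(0, 2**32 - 1)

def build_generator(model_path: str, schema: str):
    """Hypothetical reconstruction of the reported setup -- the model
    path, schema, and wrapper call are placeholders, not the original code."""
    from llama_cpp import Llama
    from outlines import generate, models

    llm = Llama(model_path=model_path, seed=fresh_seed(), n_ctx=2048)
    model = models.LlamaCpp(llm)  # assumed Outlines wrapper for llama-cpp
    return generate.json(model, schema)
```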
The first output stream:
The second output stream:
Both outputs contain the phrase: “Nowadays, a big change is taking place in the way we write and consume stories...”