[Issue]: reflection_with_llm not working with ollama/llama3 #2474
Labels
0.2: Issues which are related to the pre 0.4 codebase
models: Pertains to using alternate, non-GPT, models (e.g., local models, llama, etc.)
needs-triage
Describe the issue
Using `summary_method="reflection_with_llm"` does not seem to work: the `result.summary` of the chat session is the last message instead of a summary. By enabling the openai lib's debug logging, I see that the summary request is sent as a message with `{"role": "system"}`.
If I copy the same request (which also contains all the previous messages) and run it manually, it indeed returns the last message. If I replace `{"role": "system"}` with `{"role": "user"}` for the summary request message, it works (which makes sense, to me at least). A sketch of that manual check is shown below.
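A minimal sketch of the manual check, assuming ollama's OpenAI-compatible endpoint at its default `http://localhost:11434/v1`; the placeholder history and summary prompt below are illustrative, not the exact payload from my session:

```python
from openai import OpenAI

# ollama does not validate API keys, but the client requires one.
client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

# Stand-in for the previous chat messages that the request also contains.
history = [
    {"role": "user", "content": "What is the capital of France?"},
    {"role": "assistant", "content": "The capital of France is Paris."},
]
summary_prompt = "Summarize the takeaway from the conversation."

# As sent by autogen: the summary prompt arrives as a system message.
# With llama3 this returns the last message again instead of a summary.
broken = client.chat.completions.create(
    model="llama3",
    messages=history + [{"role": "system", "content": summary_prompt}],
)

# Same request with the role swapped to user: llama3 returns a real summary.
working = client.chat.completions.create(
    model="llama3",
    messages=history + [{"role": "user", "content": summary_prompt}],
)

print(broken.choices[0].message.content)
print(working.choices[0].message.content)
```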
Steps to reproduce
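A minimal sketch of the kind of setup that triggers this, assuming a stock AutoGen 0.2 two-agent chat pointed at a local ollama server; the agent names, message, and `max_turns` value are illustrative:

```python
import autogen

# Point autogen's OpenAI client at ollama's OpenAI-compatible endpoint.
config_list = [{
    "model": "llama3",
    "base_url": "http://localhost:11434/v1",
    "api_key": "ollama",  # placeholder; ollama does not validate keys
}]

assistant = autogen.AssistantAgent(
    "assistant",
    llm_config={"config_list": config_list},
)
user_proxy = autogen.UserProxyAgent(
    "user_proxy",
    human_input_mode="NEVER",
    code_execution_config=False,
)

result = user_proxy.initiate_chat(
    assistant,
    message="What is the capital of France?",
    max_turns=2,
    summary_method="reflection_with_llm",
)

# Expected: an LLM-generated summary of the chat.
# Observed with ollama/llama3: the last message of the conversation.
print(result.summary)
```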
Screenshots and logs
Result
Additional Information