Replies: 2 comments
-
Thanks for the issue. It looks like you can set a default reply message in the user proxy via `default_auto_reply`.
-
If you would like the conversation to terminate when `default_auto_reply` is reached, you can set `default_auto_reply` to "TERMINATE", or set `is_termination_msg` on the recipient to check for empty messages. If you would like the agent to call the LLM, add an `llm_config` to the user proxy.
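The suggestion above can be sketched as a termination predicate. This is a minimal sketch, assuming AutoGen-style message dicts with a `"content"` key; the commented-out `UserProxyAgent` wiring is illustrative only and follows the parameter names mentioned in this thread:

```python
# Minimal sketch of a termination predicate in the style of AutoGen's
# is_termination_msg: treats empty content (or an explicit "TERMINATE")
# as a signal to stop the conversation. The message shape is an assumption.
def is_termination_msg(message: dict) -> bool:
    content = message.get("content")
    return content is None or content.strip() in ("", "TERMINATE")

# Hypothetical wiring (names follow the thread, not verified here):
# user_proxy = UserProxyAgent(
#     name="user_proxy",
#     human_input_mode="NEVER",
#     default_auto_reply="TERMINATE",
#     is_termination_msg=is_termination_msg,
# )

print(is_termination_msg({"content": ""}))           # empty message -> True
print(is_termination_msg({"content": "TERMINATE"}))  # explicit signal -> True
print(is_termination_msg({"content": "hi there"}))   # normal reply -> False
```

Passing such a predicate as `is_termination_msg` would make the recipient stop instead of looping on empty auto-replies.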
-
Describe the bug
When user_proxy uses the NEVER human input mode, in the second round of dialogue user_proxy sends an empty message to the assistant:
![image](https://private-user-images.githubusercontent.com/30918026/316782558-04c38d67-6c67-40ee-afde-540f27cd2d51.png?jwt=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJnaXRodWIuY29tIiwiYXVkIjoicmF3LmdpdGh1YnVzZXJjb250ZW50LmNvbSIsImtleSI6ImtleTUiLCJleHAiOjE3Mzg5MzMyOTksIm5iZiI6MTczODkzMjk5OSwicGF0aCI6Ii8zMDkxODAyNi8zMTY3ODI1NTgtMDRjMzhkNjctNmM2Ny00MGVlLWFmZGUtNTQwZjI3Y2QyZDUxLnBuZz9YLUFtei1BbGdvcml0aG09QVdTNC1ITUFDLVNIQTI1NiZYLUFtei1DcmVkZW50aWFsPUFLSUFWQ09EWUxTQTUzUFFLNFpBJTJGMjAyNTAyMDclMkZ1cy1lYXN0LTElMkZzMyUyRmF3czRfcmVxdWVzdCZYLUFtei1EYXRlPTIwMjUwMjA3VDEyNTYzOVomWC1BbXotRXhwaXJlcz0zMDAmWC1BbXotU2lnbmF0dXJlPWIyODFmODIwZWFlMWIyODg5Nzk2NTYyZTI4YzQ1ZDkxM2VjOTc0NWUwYTM3MmUzZWVhMDA5NWQzMTIzMTNlMmYmWC1BbXotU2lnbmVkSGVhZGVycz1ob3N0In0.QFCVyYzHD962Nzb1OLjJXL2RWsQbH9KdSvUCAqp34yk)
Steps to reproduce
The code is from agentchat_teaching.ipynb.
Model Used
GLM-4
Expected Behavior
I think that when the message is empty, the agent should stop, or return some prompt to the caller.
Screenshots and logs
The result is shown in the screenshot above.
Additional Information
The environment is Docker.