-
I'm unsure whether this is perhaps a bug?
-
After further research in the docs, I found an example of how the model information should be formatted. E.g.:

"model_client": {
    "model": "bartowski/DeepSeek-R1-Distill-Qwen-7B-GGUF",
    "model_type": "OpenAIChatCompletionClient",
    "base_url": "http://localhost:1234/v1",
    "api_version": "1.0",
    "component_type": "model",
    "model_capabilities": {
        "vision": false,
        "function_calling": true,
        "json_output": false
    }
},

@wpcool It would be helpful to see your full JSON, but my suspicion is that your "model" field either doesn't conform to the xxx/yyyy standard, or that the order of the items in the dictionary differs from the example above.
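Since the error message specifically mentions model_info (not model_capabilities), a sketch of what a config using that key might look like is below. This is an assumption based on the error text and newer AutoGen docs, where model_info carries the capability flags plus a family field; field names and placement may differ in your version, so check the docs for the release you are running:

```json
"model_client": {
    "model": "bartowski/DeepSeek-R1-Distill-Qwen-7B-GGUF",
    "model_type": "OpenAIChatCompletionClient",
    "base_url": "http://localhost:1234/v1",
    "component_type": "model",
    "model_info": {
        "vision": false,
        "function_calling": true,
        "json_output": false,
        "family": "unknown"
    }
}
```

If your installed version predates model_info, the model_capabilities form shown above should still apply.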
-
I use another LLM as the base model. When I run it in the playground, it errors with: "Team creation failed: Model creation failed: model_info is required when model name is not a valid OpenAI model". But I have written model_info in the "model_client" section.