
default sambanova model not working in local usage #4352

Open

jhpiedrahitao opened this issue Feb 25, 2025 · 0 comments
Labels: area:configuration (Relates to configuration options) · ide:vscode (Relates specifically to VS Code extension) · kind:bug (Indicates an unexpected problem or unintended behavior) · needs-triage


Before submitting your bug report

Relevant environment info

- OS: macOS Sequoia 15.1
- Continue version: 0.8.68
- IDE version: VS Code
- Model: SambaNova Llama 3.1 8B
- config:
  
{
  "models": [
    {
      "apiKey": "***********************",
      "title": "Llama3.1 Chat",
      "model": "llama3.1-8b",
      "provider": "sambanova"
    }
  ],
  "tabAutocompleteModel": {
    "title": "Codestral",
    "provider": "mistral",
    "model": "codestral-latest",
    "apiKey": ""
  },
  "contextProviders": [
    {
      "name": "code",
      "params": {}
    },
    {
      "name": "docs",
      "params": {}
    },
    {
      "name": "diff",
      "params": {}
    },
    {
      "name": "terminal",
      "params": {}
    },
    {
      "name": "problems",
      "params": {}
    },
    {
      "name": "folder",
      "params": {}
    },
    {
      "name": "codebase",
      "params": {}
    }
  ],
  "slashCommands": [
    {
      "name": "share",
      "description": "Export the current chat session to markdown"
    },
    {
      "name": "cmd",
      "description": "Generate a shell command"
    },
    {
      "name": "commit",
      "description": "Generate a git commit message"
    }
  ]
}

Description

When adding SambaNova, the default model is Llama 3.1 8B, but the entry written to the config file uses the wrong model name. The other models offered by SambaNova also don't appear in the selector.
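
As a manual workaround, editing the generated entry to use what I believe is the full SambaNova Cloud model ID (assumed here to be "Meta-Llama-3.1-8B-Instruct" rather than the "llama3.1-8b" that Continue writes) avoids the 404 for me. A minimal sketch of the edited entry, under that assumption:

```json
{
  "models": [
    {
      "apiKey": "***********************",
      "title": "Llama3.1 Chat",
      "provider": "sambanova",
      "model": "Meta-Llama-3.1-8B-Instruct"
    }
  ]
}
```

The other SambaNova models presumably need their full IDs as well, which is why they don't show up with the short names.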

To reproduce

1. Fresh install of Continue in VS Code.
2. Add a new model in the selector UI (see screenshot).
3. Select SambaNova.
4. Add Llama 3.1.
5. Use the model (see screenshot).

The wrong model name is set in the config.

Log output

404 Error

404 "Model not found"

Likely causes:
- Invalid apiBase: https://api.sambanova.ai/v1/
- Model/deployment not found for: llama3.1-8b
dosubot added the area:configuration, ide:vscode, and kind:bug labels on Feb 25, 2025