Hi,
I just got my B580 and it works fine with the IPEX-LLM Ollama portable zip on Win10. I wonder whether IPEX-LLM can be used with a VS Code extension such as Continue.dev, as described here:
https://dev.to/lunaticprogrammer/using-deepseek-r1-in-visual-studio-code-for-free-2279
Any advice would be greatly appreciated.
All the best.
[SOLVED]
The local models are shown in Continue.dev. Great job, Intel! No Docker required!
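For anyone else wiring this up: the portable zip runs a regular Ollama server, so Continue.dev only needs its Ollama provider pointed at that server, and the locally pulled models then appear in the model list. Below is a minimal sketch to confirm the server is reachable and can generate, assuming the default address http://localhost:11434 and no authentication; adjust the URL if you changed the port.

```python
import json
import urllib.request

# Base URL of the local Ollama server started from the IPEX-LLM portable zip.
# Assumption: it listens on the standard Ollama default port 11434.
OLLAMA_URL = "http://localhost:11434"

def list_local_models():
    """Return the names of models available on the local Ollama server."""
    with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags") as resp:
        data = json.load(resp)
    return [m["name"] for m in data.get("models", [])]

def quick_generate(model, prompt):
    """Send a one-off, non-streaming generation request as a sanity check."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]

if __name__ == "__main__":
    models = list_local_models()
    print("Local models:", models)
    if models:
        print(quick_generate(models[0], "Say hello in one sentence."))
```

If this script lists the models you pulled, Continue.dev should show the same models once its Ollama provider is configured against the same address.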