"llm.tokenizer": {"path": "C:\Users\XXX\tokenizer.json"} (which is get from model folder)
Phenomenon
I then set the endpoint to http://localhost:8000/generate and the plugin works, but the completions contain special tokens and other stray symbols, so code completion no longer works well.
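For reference, the same endpoint can be exercised outside the editor. A minimal sketch, assuming the request/response schema of vLLM's demo api_server (a `prompt`/`max_tokens` body in, a `text` list out) — adjust the field names if your server differs:

```python
import json
import urllib.request

def make_generate_payload(prompt: str, max_tokens: int = 64) -> dict:
    # Request body for a /generate-style endpoint (field names assumed
    # from vLLM's demo api_server; verify against your deployment).
    return {"prompt": prompt, "max_tokens": max_tokens, "temperature": 0.0}

def query_generate(url: str, prompt: str) -> str:
    # POST the prompt and return the raw generated text. With the demo
    # server this may still contain the prompt and special tokens,
    # which is the symptom described above.
    data = json.dumps(make_generate_payload(prompt)).encode("utf-8")
    req = urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["text"][0]

# Example (requires the server to be running):
#   query_generate("http://localhost:8000/generate", "def add(a, b):")
```

Inspecting the raw response this way makes it easy to see exactly which extra symbols the server is emitting.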
Question
What could be causing this error?
Is there a tutorial that walks developers through deploying a custom model so that it works fully with llm-vscode?
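In the meantime, a possible workaround is to strip special tokens from each completion before the editor inserts it. A minimal sketch — the token list is an illustrative assumption and depends on the model's tokenizer (see the tokenizer files in the model folder):

```python
import re

# Illustrative list only: the real set of special tokens depends on
# your model's tokenizer configuration.
SPECIAL_TOKENS = ["<s>", "</s>", "<unk>", "<pad>", "<|endoftext|>"]

def strip_special_tokens(text: str, tokens=SPECIAL_TOKENS) -> str:
    # Remove every occurrence of each special token from the completion.
    pattern = "|".join(re.escape(t) for t in tokens)
    return re.sub(pattern, "", text)
```

Ideally the server would skip special tokens at decode time instead of the client patching the output, but this kind of filter can unblock completions while the configuration issue is investigated.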
Thanks for reading and thinking!