Can't Use Ollama Models with LATS Example Notebook #463
Replies: 6 comments
-
Yeah, Ollama would need to support tool calling for the notebook to work without modification.
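For models without native tool calling, a common workaround is to prompt the model to emit a JSON object describing the tool call and parse it yourself. This is a minimal sketch of that idea, not code from the notebook; `extract_tool_call` is a hypothetical helper:

```python
import json
import re

def extract_tool_call(text: str):
    """Pull the first JSON object out of a model response and
    interpret it as a tool call: {"tool": ..., "args": {...}}.
    Returns None if no parsable tool call is present."""
    match = re.search(r"\{.*\}", text, re.DOTALL)
    if match is None:
        return None
    try:
        payload = json.loads(match.group(0))
    except json.JSONDecodeError:
        return None
    if "tool" not in payload:
        return None
    return payload["tool"], payload.get("args", {})

# Example: a raw completion from a model prompted to answer in JSON
raw = 'Calling the tool now: {"tool": "search", "args": {"query": "LATS"}}'
print(extract_tool_call(raw))  # -> ('search', {'query': 'LATS'})
```

This is brittle compared to native tool calling (the model can produce malformed JSON or skip the tool entirely), which is why the notebook assumes a provider with `bind_tools` support.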
-
Would be happy to review a PR from anyone who puts together a reliable example of LATS with Ollama, Transformers, or another OSS model provider!
-
@hinthornw I would gladly take a shot at completing this. Is there a recommended starting place for which OSS models / APIs are easiest to experiment with or add tool-calling support to? I have struggled to find a documented list of OSS models / APIs recommended for LangChain tool compatibility / function calling.
-
What size model are you able to run locally? There still isn't excellent multi-turn tool calling out of the box for most OSS models, though Mixtral seems to be the best (maybe Gemma is good too; I haven't checked).
-
I have a dev server with two 4090s (24 GB VRAM each), but I can also use A100 / A6000 cards if larger models would improve performance. I have been interested in some of the more recent coding models that beat GPT-3.5 Turbo on coding benchmarks like EvalPlus, such as the OpenCoder-DS 30B, DeepSeek Coder, and StarCoder2 models. I am not sure which benchmark or dataset would be best for measuring multi-turn performance, tool-usage performance, or function-calling performance. I had thought that some of these coding models, like OpenCodeInterpreter-DS, had been tuned for multi-turn coding, but I don't know whether they handle multi-turn AND function calling AND tool use all together. If there are no other alternatives, I may be interested in building a dataset for fine-tuning one of the OSS models for function calling and tool support.
-
I have the same issue: tool_choices has no items. I don't know whether it's a misconfiguration in the graph or actually a missing condition in this function. BTW, I'm using Llama-2 on-prem, and it happens when I run the following cell / line.
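One way to avoid crashing when a model returns no tool calls is to handle the empty case explicitly instead of indexing unconditionally. This is a minimal sketch; `select_tool_call` is a hypothetical helper, not a function from the notebook:

```python
def select_tool_call(tool_choices: list):
    """Return the first tool call if any were produced, otherwise None.

    Many OSS models (e.g. Llama-2) answer in prose instead of emitting
    a tool call, so downstream code needs to handle an empty list
    rather than evaluating tool_choices[0] unconditionally."""
    if not tool_choices:
        return None
    return tool_choices[0]

# With no tool calls we fall back to None instead of raising IndexError
print(select_tool_call([]))                             # -> None
print(select_tool_call([{"name": "search", "args": {}}]))  # -> first call
```

The caller can then treat a `None` result as a plain text answer and skip the tool-execution node entirely.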
-
Checked other resources
Example Code
Taking the LATS example notebook from the repo and switching the OpenAI models out for Ollama models
gives tool-related bugs: either bind_tools is not found, or a tool is not returned correctly. The following four approaches didn't work, including the OpenAI-compatible Llama API.
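A quick sanity check before wiring a swapped-in provider into the graph is to verify the model object actually exposes `bind_tools`, since its absence otherwise surfaces later as an AttributeError deep inside the graph. This is a sketch under the assumption that your chat model follows the LangChain chat-model interface; the stand-in classes are hypothetical:

```python
def supports_tool_binding(llm) -> bool:
    """Return True if the model exposes a callable bind_tools method.

    Older Ollama wrappers did not implement bind_tools at all, which
    is one of the failure modes described above."""
    return callable(getattr(llm, "bind_tools", None))

class _NoTools:          # stand-in for a wrapper without tool support
    pass

class _WithTools:        # stand-in for a tool-calling chat model
    def bind_tools(self, tools):
        return self

print(supports_tool_binding(_NoTools()))    # -> False
print(supports_tool_binding(_WithTools()))  # -> True
```

Checking this up front makes the "bind_tools not found" case fail fast with a clear message instead of partway through a LATS rollout.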