Documentation around using Mirascope with local models (e.g. vLLM, Ollama, etc.) #716
Labels
documentation
Improvements or additions to documentation
good first issue
Good for newcomers
mirascope
Description
Many local model providers, such as vLLM and Ollama, expose OpenAI-compatible APIs, which means you can use Mirascope with local models by pointing the client at the local server (see the sketch below).
This is not made explicit in the docs, and there are likely other limitations (e.g. around tools) that are worth covering in more detail.
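As a rough illustration of the kind of example such a page could include, here is a minimal sketch of calling a local Ollama server through Mirascope's OpenAI integration. It assumes Ollama is running on its default OpenAI-compatible endpoint (http://localhost:11434/v1), that a `llama3` model has been pulled, and that the `openai.call` decorator accepts a custom `client`; names and parameters here are illustrative, not confirmed docs content.

```python
# Sketch: using Mirascope with a local, OpenAI-compatible server (Ollama assumed).
from mirascope.core import openai
from openai import OpenAI

# Point the OpenAI client at the local server; the API key is a placeholder,
# since local servers typically ignore it.
local_client = OpenAI(
    base_url="http://localhost:11434/v1",
    api_key="ollama",
)


@openai.call("llama3", client=local_client)
def recommend_book(genre: str) -> str:
    return f"Recommend a {genre} book"


print(recommend_book("fantasy"))
```

The same pattern should apply to vLLM's OpenAI-compatible server by swapping in its base URL and served model name.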
We should add a "Learn" section page dedicated to local models (e.g. "Local Models") that makes usage with Mirascope explicitly clear.