
Documentation around using Mirascope with local models (e.g. vLLM, Ollama, etc.) #716

Open · willbakst opened this issue on Nov 25, 2024 · 1 comment
Labels: documentation, good first issue, mirascope

@willbakst (Contributor)
Description

Many local model providers, such as vLLM and Ollama, expose OpenAI-compatible APIs, which means you can use Mirascope with local models simply by setting the client.

This is not made explicitly clear in the docs, and there are likely other limitations (e.g. tool support) that are worth covering in more detail.

We should include a "Learn" section page specifically about using local models (i.e. "Local Models") that makes this usage with Mirascope explicitly clear.
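
For context, a minimal sketch of what this looks like today against Ollama's OpenAI-compatible endpoint (the llama3.2 model name and the localhost URL are illustrative assumptions, not prescriptions):

```python
from mirascope.core import openai
from openai import OpenAI

# Point the OpenAI client at the local server; Ollama ignores the API key,
# but the client requires a non-empty value.
local_client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")


@openai.call("llama3.2", client=local_client)
def recommend_book(genre: str) -> str:
    return f"Recommend a {genre} book"


response = recommend_book("fantasy")
print(response.content)
```

vLLM's OpenAI-compatible server should work the same way, just with its own base_url and model name.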

@willbakst added the documentation (Improvements or additions to documentation) and good first issue (Good for newcomers) labels on Nov 25, 2024
@petri commented Nov 27, 2024

Ollama just released functions-as-tools support: https://ollama.com/blog/functions-as-tools
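
A sketch of what tool usage against that endpoint might look like with Mirascope (the llama3.2 model, localhost endpoint, and get_weather helper are illustrative assumptions, not from the thread):

```python
from mirascope.core import openai
from openai import OpenAI

local_client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")


def get_weather(city: str) -> str:
    """Returns the current weather for `city`."""
    return f"The weather in {city} is sunny."


@openai.call("llama3.2", tools=[get_weather], client=local_client)
def forecast(city: str) -> str:
    return f"What is the weather in {city}?"


response = forecast("Helsinki")
# If the model chose to call the tool, invoke it; otherwise print the text reply.
if tool := response.tool:
    print(tool.call())
else:
    print(response.content)
```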
