Feat add support for openai compatible providers #12

Open
nkzk wants to merge 4 commits into main

Conversation

@nkzk (Contributor) commented on Oct 13, 2024

Adds support for OpenAI-compatible LLM providers. Resolves #11

Changes

  • cmd/login.go: the login flow now supports specifying a provider and presents different options depending on the provider selected.
  • pkg/chat/chat.go: adjusted to fetch and use the selected LLM model from the configuration.
    • Note: I'm not sure the way I fetch the config here is optimal. Please review.
  • pkg/config/config.go: added provider, model, and API version as configurable parameters.
  • pkg/config/provider.go: implemented provider-specific logic for setting up client configurations.
  • pkg/config/validator.go: updated DefaultValidator to support the new provider field; a default model deployment and provider are set when they are absent from the configuration, to avoid breaking changes.
  • docs: updated README.md with an alternative login example and an updated Azure login example (adds the provider attribute).
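The provider-specific client setup described above might look roughly like the sketch below. The type name, field names, provider identifiers, and base URLs are assumptions for illustration only, not the PR's actual code:

```go
package main

import (
	"fmt"
	"strings"
)

// ClientConfig is a hypothetical shape for the provider-specific settings;
// field and provider names are assumptions, not the PR's identifiers.
type ClientConfig struct {
	Provider   string
	BaseURL    string
	Model      string
	APIVersion string
}

// NewClientConfig selects settings per provider. An empty or unknown
// provider falls back to Azure OpenAI so pre-existing configs keep working.
func NewClientConfig(provider, model, apiVersion string) ClientConfig {
	switch strings.ToLower(provider) {
	case "openai":
		return ClientConfig{Provider: "openai", BaseURL: "https://api.openai.com/v1", Model: model}
	case "ollama":
		// Ollama exposes an OpenAI-compatible endpoint locally by default.
		return ClientConfig{Provider: "ollama", BaseURL: "http://localhost:11434/v1", Model: model}
	default:
		// Default to Azure so configs written before this change still validate.
		return ClientConfig{Provider: "azure", Model: model, APIVersion: apiVersion}
	}
}

func main() {
	cfg := NewClientConfig("ollama", "llama3", "")
	fmt.Println(cfg.Provider, cfg.BaseURL)
}
```

Keeping the fallback branch in one place is what lets the rest of the code stay provider-agnostic.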

Testing:

Tested with OpenAI API and Ollama.

Please verify that existing Azure OpenAI deployment configs are not broken by these changes.
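The non-breaking behavior to verify is essentially the validator filling in defaults for configs written before this change. A minimal sketch, assuming hypothetical field names and a hypothetical default deployment (neither taken from the PR):

```go
package main

import "fmt"

// Config mirrors the newly configurable fields; names are illustrative.
type Config struct {
	Provider string
	Model    string
}

// ApplyDefaults fills in provider and model when unset, so an Azure-only
// config from before this change continues to validate unchanged.
func ApplyDefaults(c *Config) {
	if c.Provider == "" {
		c.Provider = "azure"
	}
	if c.Model == "" {
		c.Model = "gpt-4o" // assumed default model deployment
	}
}

func main() {
	c := Config{} // a pre-change config with neither field set
	ApplyDefaults(&c)
	fmt.Println(c.Provider, c.Model)
}
```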


Successfully merging this pull request may close these issues:

  • Introduce Support for OpenAI-Compatible LLM Providers