Feature: Advanced OpenAI configuration #258
base: main
Conversation
See #166
Yeah, as far as I can see, your PR and your changes aim to add config options for Ollama and not to change anything OpenAI related.
No worries, I was just linking them as they are related and there may be some merge conflicts for the other if/when one is merged.
Alright, but I think we can skip the use of fetch and just use my change to also support Ollama. As far as I can understand, just changing the URL and the model will do the trick for Ollama.
Yes, I can confirm that you can use my changes to run local Ollama models, for example: `OPENAI_API_URL="http://127.0.0.1:11434/v1"`
Updated the PR description to include the Ollama instructions. |
In this PR I've added some OpenAI configuration parameters, updated the `.env.example` file, and updated the Next.js, OpenAI, and Prisma packages.
OpenAI can now be configured via ENV vars. The following variables were added:

- `OPENAI_API_KEY`
- `OPENAI_API_URL`
- `OPENAI_CATEGORY_EXPENSE_MODEL`
- `OPENAI_CATEGORY_RECEIPT_EXTRACT`
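A minimal `.env` sketch of how these variables fit together (the key and model values shown here are placeholders, not the project's shipped defaults):

```env
# Required by the OpenAI SDK; any non-empty value works when targeting Ollama
OPENAI_API_KEY="sk-..."

# Base URL of the OpenAI-compatible endpoint
OPENAI_API_URL="https://api.openai.com/v1"

# Models used for expense categorization and receipt extraction
OPENAI_CATEGORY_EXPENSE_MODEL="gpt-4o-mini"
OPENAI_CATEGORY_RECEIPT_EXTRACT="gpt-4o-mini"
```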
Using the Cloudflare AI Gateway helps decrease the number of requests sent to OpenAI and therefore the cost they generate. Identical prompts, for example the same expense titles, will be cached by Cloudflare and reused for subsequent requests.
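As a sketch, routing through the gateway is just a matter of pointing `OPENAI_API_URL` at your gateway's OpenAI endpoint (the account ID and gateway name below are placeholders; use the URL shown in your Cloudflare dashboard):

```env
OPENAI_API_URL="https://gateway.ai.cloudflare.com/v1/<ACCOUNT_ID>/<GATEWAY_NAME>/openai"
```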
It is also possible to use Ollama as your AI endpoint. Ollama exposes an OpenAI-compatible API, so the OpenAI SDKs can be used with it (Docs).
Just set your Ollama server URL as your `OPENAI_API_URL` (e.g. `OPENAI_API_URL="http://localhost:11434/v1"`) and define your preferred local model in `OPENAI_CATEGORY_EXPENSE_MODEL` and `OPENAI_CATEGORY_RECEIPT_EXTRACT`. `OPENAI_API_KEY` cannot be null and must be set to some random value; for Ollama it won't be used.
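For illustration, a minimal sketch of how these variables could be wired up with the `openai` Node SDK; the client construction and the `gpt-4o-mini` fallback below are assumptions for the example, not the exact code in this PR:

```ts
import OpenAI from "openai";

// baseURL falls back to the official OpenAI API when OPENAI_API_URL is unset
const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,  // any non-empty value when using Ollama
  baseURL: process.env.OPENAI_API_URL, // e.g. http://localhost:11434/v1
});

// Use the configured model for expense categorization
const completion = await openai.chat.completions.create({
  model: process.env.OPENAI_CATEGORY_EXPENSE_MODEL ?? "gpt-4o-mini",
  messages: [{ role: "user", content: "Categorize this expense: 'Coffee at Starbucks'" }],
});

console.log(completion.choices[0].message.content);
```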