
Expose model parameters in UI (temperature, max_tokens, etc) #31

Open
scosman opened this issue Nov 12, 2024 · 2 comments
Labels
enhancement New feature or request

Comments

scosman (Collaborator) commented Nov 12, 2024

Add UI to expose setting (and persisting) model parameters.
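To make the feature request concrete, here is a minimal sketch of what persisting these parameters might look like. All names (`RunParams`, the specific fields) are hypothetical illustrations, not Kiln's actual code:

```python
import json
from dataclasses import dataclass, asdict
from typing import Optional

@dataclass
class RunParams:
    """Hypothetical container for user-set model parameters.

    None means "use the provider's default" — only explicitly-set
    values are persisted, so defaults can change upstream safely.
    """
    temperature: Optional[float] = None
    max_tokens: Optional[int] = None
    top_p: Optional[float] = None

    def to_json(self) -> str:
        # Drop unset (None) fields before saving.
        return json.dumps({k: v for k, v in asdict(self).items() if v is not None})

    @classmethod
    def from_json(cls, raw: str) -> "RunParams":
        return cls(**json.loads(raw))

params = RunParams(temperature=0.7, max_tokens=512)
saved = params.to_json()
restored = RunParams.from_json(saved)
```

Keeping unset values as `None` (rather than baking in concrete defaults) lets the UI distinguish "user chose 0.7" from "user never touched the slider".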

@scosman scosman added the enhancement New feature or request label Nov 12, 2024
scosman (Collaborator, Author) commented Nov 28, 2024

I did this for fine-tuning in a really nice way. We should reuse that style/code when doing this.

sammcj commented Dec 15, 2024

num_ctx for Ollama would probably make sense to have as an optional parameter.

For context (pun intended): by default, Ollama models from the public hub ship with a tiny 2k context window, which is only usable for very small tasks. Conversely, you'll sometimes create your models with a much larger default context size than is efficient for dataset-generation tasks, and you might want to reduce it to something reasonable like 4k-8k to keep performance up.
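For reference, Ollama's generate API accepts per-request overrides in an `options` object, including `num_ctx`. A minimal sketch of building such a request (the helper name is hypothetical; the endpoint and option names follow Ollama's API docs):

```python
import json

def build_ollama_payload(model: str, prompt: str, num_ctx: int = 8192) -> dict:
    """Build a request body for Ollama's /api/generate endpoint.

    The "options" object overrides the model's defaults for this request;
    num_ctx sets the context window, overriding the hub default (often 2k).
    """
    return {
        "model": model,
        "prompt": prompt,
        "stream": False,
        "options": {"num_ctx": num_ctx},
    }

payload = build_ollama_payload("llama3.1", "Summarize this dataset row.", num_ctx=8192)
# POST json.dumps(payload) to http://localhost:11434/api/generate
body = json.dumps(payload)
```

Exposing `num_ctx` this way would let the UI cap the window per task without editing the model's Modelfile.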
