Set num_ctx config #27
Conversation
@CodiumAI-Agent /review

PR Reviewer Guide 🔍
@CodiumAI-Agent /ask
Does this ensure the values are valid? Does mypy ensure that a range check is not needed here?
Ask❓
Answer: The concerns raised in the question are valid: the values should be checked for presence and type before use.
Here is an improved version of the code with these considerations:

```python
if self.config.model.startswith('ollama'):
    # Read both token limits defensively; they may be absent from the config.
    max_input_tokens = getattr(self.config, 'max_input_tokens', None)
    max_output_tokens = getattr(self.config, 'max_output_tokens', None)
    if max_input_tokens is not None and max_output_tokens is not None:
        if isinstance(max_input_tokens, int) and isinstance(max_output_tokens, int):
            # The Ollama context window must hold the prompt plus the completion.
            litellm.OllamaConfig.num_ctx = max_input_tokens + max_output_tokens
        else:
            raise ValueError("max_input_tokens and max_output_tokens must be integers")
    else:
        raise AttributeError("max_input_tokens and max_output_tokens must be defined in the config")
```

Explanation: This approach validates that both token limits are present and are integers before using them, so a misconfiguration raises a clear error immediately instead of causing a failure later at runtime.
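
To make the mypy point from the question concrete, here is a minimal sketch under stated assumptions: `LLMConfig`, its field names, and `set_ollama_num_ctx` are hypothetical illustrations, not the project's actual classes. With annotated `int` fields, mypy flags non-integer assignments at type-check time, so the runtime `isinstance` checks above become redundant for statically-checked code paths; values parsed from an external config file at runtime would still need validation.

```python
from dataclasses import dataclass

import litellm


@dataclass
class LLMConfig:
    # Hypothetical typed config; field names mirror the snippet above,
    # not the project's real config class.
    model: str
    max_input_tokens: int   # mypy rejects non-int values assigned in checked code
    max_output_tokens: int


def set_ollama_num_ctx(config: LLMConfig) -> None:
    """Size Ollama's context window to fit the prompt plus the completion."""
    if config.model.startswith('ollama'):
        # Same class-attribute pattern as the snippet above.
        litellm.OllamaConfig.num_ctx = config.max_input_tokens + config.max_output_tokens


# Usage: mypy would reject max_input_tokens='4096' here before the code ever runs.
set_ollama_num_ctx(
    LLMConfig(model='ollama/llama3', max_input_tokens=4096, max_output_tokens=1024)
)
```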
All-Hands-AI#2927
fixes All-Hands-AI#2924
fixes All-Hands-AI#3287