
fix: count tokens from tool definitions when adjusting for context window #942

Draft · wants to merge 3 commits into main
Conversation

@g-linville (Member) commented Jan 30, 2025

The tokens used for tool definitions count against the context window, but we were failing to include them here when trying to stay beneath the limit. This PR fixes that so they are accounted for.

Our token estimation still isn't great. For example, a message consisting of `1\n` repeated 30000 times is counted as 20000 tokens by our method (since we just divide the character count by 3), but OpenAI's tokenizer counts it as 60000 tokens, meaning we grossly underestimate. I think we should consider calling out to Python to use the tiktoken library to count tokens for us, as that would be far more reliable. In practice, however, this rarely seems to cause us any trouble, so maybe we are fine sticking with our current approach of dividing by 3.
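The divide-by-3 heuristic described above can be sketched as follows (a minimal illustration, not the PR's actual code; the function name `estimate_tokens` is hypothetical). It reproduces the underestimate from the example: 30000 repetitions of `1\n` is 60000 characters, so the heuristic reports 20000 tokens, while OpenAI's tokenizer emits roughly one token per `1\n` pair, i.e. about 60000.

```python
def estimate_tokens(text: str) -> int:
    # Naive heuristic: assume roughly 3 characters per token.
    # This is cheap but can badly underestimate for token-dense
    # text where the real tokenizer emits ~1 token per 2 characters.
    return len(text) // 3

# Worst-case example from the discussion above:
msg = "1\n" * 30000          # 60000 characters
print(estimate_tokens(msg))  # 20000 (real tokenizer: ~60000)
```

A proper tokenizer such as tiktoken counts actual BPE tokens for the specific model's vocabulary, which is why the comment suggests it as the more reliable option.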

@g-linville g-linville changed the title fix: count tokens from tool definitions when adjusting for context wi… fix: count tokens from tool definitions when adjusting for context window Jan 30, 2025
Signed-off-by: Grant Linville <[email protected]>
@g-linville g-linville marked this pull request as draft January 30, 2025 16:16
@g-linville

Moved to draft. I'm going to use a proper tokenizer library so that we get accurate token counts.
