Chat history is silently truncated when it exceeds `max_token_limit`, because we use the LangChain class `ConversationTokenBufferMemory` for the `chat_history` object. The default `max_token_limit` in `agent.yml` is currently set to 4k, which is easily too little for current LLMs. Propose to:

- Increase the limit to, say, 20k, and change the default `llm` to `gpt-4o`
- Log a warning when the limit is exceeded (maybe LangChain supports this now?)
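For illustration, the logging behavior could look roughly like the sketch below. This is a minimal stand-in class, not the real `ConversationTokenBufferMemory` (whose pruning happens internally), and the whitespace-split token counter is a placeholder for a real tokenizer — the point is only to show truncation emitting a warning instead of happening silently:

```python
import logging

logger = logging.getLogger(__name__)


class TokenBufferMemory:
    """Hypothetical sketch of a token-limited chat buffer that warns,
    rather than staying silent, when old messages are dropped."""

    def __init__(self, max_token_limit=4000, count_tokens=None):
        self.max_token_limit = max_token_limit
        # Assumption: whitespace tokens stand in for a real tokenizer.
        self.count_tokens = count_tokens or (lambda msg: len(msg.split()))
        self.messages = []

    def _total_tokens(self):
        return sum(self.count_tokens(m) for m in self.messages)

    def add_message(self, message):
        self.messages.append(message)
        pruned = 0
        # Drop the oldest messages until we fit under the limit again.
        while self._total_tokens() > self.max_token_limit and len(self.messages) > 1:
            self.messages.pop(0)
            pruned += 1
        if pruned:
            # The change proposed above: make truncation visible.
            logger.warning(
                "chat_history exceeded max_token_limit=%d; dropped %d oldest message(s)",
                self.max_token_limit,
                pruned,
            )
        return pruned
```

In the real fix this would presumably hook into wherever `ConversationTokenBufferMemory` prunes the buffer, if LangChain does not already expose a callback for it.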