Hello everyone,
While using instructor for my projects, I came across a case where a tool call might need some (fixed) arguments which are not present in the actual chat, but are related to the more general context in which the chat takes place. For example: a bot used to register the user for some service might not know the user ID, and neither does the user. This information is available in the backend, though. In this case, a simple workaround would be to prepend the user's prompt with their ID, but this is not a general solution (+ we might not want to copy-paste everything into the chat for privacy reasons).
I worked out a solution myself by allowing an `extra_args` argument in `from_response` and a corresponding `exclude` list of parameters to leave out of the `openai_schema`, since we don't want the LLM to come up with these "fixed" arguments itself. The actual implementation is very simple and requires minimal changes, but I don't know whether this is a problem more people have encountered and whether it is worth solving. What do you think? I'd be happy to contribute in case!
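To make the idea concrete, here is a minimal, self-contained sketch of the two pieces described above. This is *not* instructor's actual API: the function names (`openai_schema`, `from_response`), the `register_user` tool, and the parameter shapes are illustrative assumptions that mirror the proposal, using only the standard library.

```python
# Hypothetical sketch of the proposed `exclude` / `extra_args` mechanism.
# None of these names come from instructor's real implementation.
import json


def openai_schema(parameters: dict, exclude: set) -> dict:
    """Build a tool schema, dropping fixed backend-only parameters
    so the LLM never sees (or invents) them."""
    props = {k: v for k, v in parameters.items() if k not in exclude}
    return {
        "name": "register_user",  # illustrative tool name
        "parameters": {
            "type": "object",
            "properties": props,
            "required": list(props),
        },
    }


def from_response(tool_call_args: str, extra_args: dict) -> dict:
    """Parse the model's JSON arguments and merge in the
    backend-supplied fixed values."""
    args = json.loads(tool_call_args)
    args.update(extra_args)  # fixed backend values win over LLM output
    return args
```

Usage would look like: the schema sent to the model excludes `user_id`, the model fills in only the chat-visible fields, and the backend injects `user_id` when reconstructing the call, e.g. `from_response('{"plan": "pro"}', {"user_id": "u-123"})`.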