
"max_tokens" not working for chat completion #16

Closed
someguy9 opened this issue Apr 5, 2023 · 3 comments

Comments


someguy9 commented Apr 5, 2023

I believe "max_tokens" is missing for CreateChatCompletionRequest in https://github.com/gptlabs/openai-streams/blob/0892cb4e86419b55463d5fb36dbd002ce0e32832/src/lib/pinned.ts#L25

I get an error that doesn't allow me to set max_tokens when using a chat completion (great work on this library, by the way!).

Author

someguy9 commented Apr 5, 2023

Well, you are amazing. Thank you.

Collaborator

ctjlewis commented Apr 5, 2023

`max_tokens` was not supported for chat in OpenAI's original release, so the pinned types were a little behind. This has been updated.
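For reference, a minimal sketch of a chat completion request body that includes `max_tokens`. Field names follow the public OpenAI API; the interface below is an illustration of the shape the types should allow, not a quote of the library's actual definitions:

```typescript
// Sketch of a chat completion request including max_tokens.
// ChatCompletionRequest is a hypothetical name for illustration only.
interface ChatCompletionRequest {
  model: string;
  messages: { role: "system" | "user" | "assistant"; content: string }[];
  max_tokens?: number; // the field that was missing from the pinned types
  stream?: boolean;
}

const request: ChatCompletionRequest = {
  model: "gpt-3.5-turbo",
  messages: [{ role: "user", content: "Summarize this issue." }],
  max_tokens: 256,
  stream: true,
};
```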

Ping: paradigmxyz/flux#12 (comment)

Collaborator

ctjlewis commented Apr 5, 2023

No worries! Working with the chat streams is still kind of crazy; we need to merge the hacks we've used in practice into the library soon. Until then, you'll probably find it a pain to use in production, where chunks may contain several objects or parts of objects.
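To make the chunking problem concrete, here's a minimal sketch (not the library's actual internals) of buffering server-sent-event chunks: a single network chunk can carry several `data:` events, or end mid-event, so you split on the event delimiter, parse the complete events, and carry the remainder into the next chunk:

```typescript
// Sketch: accumulate SSE text chunks and extract complete `data:` events.
// The delimiter and `data: ` prefix follow the SSE wire format; this is an
// illustrative parser, not openai-streams' implementation.
type Parsed = { events: string[]; buffer: string };

function feed(buffer: string, chunk: string): Parsed {
  const combined = buffer + chunk;
  const parts = combined.split("\n\n");
  // The last part may be an incomplete event; keep it for the next chunk.
  const remainder = parts.pop() ?? "";
  const events = parts
    .map((p) => p.replace(/^data: /, "").trim())
    .filter((p) => p.length > 0 && p !== "[DONE]");
  return { events, buffer: remainder };
}
```

A chunk like `data: {"x":1}\n\ndata: {"y"` yields one complete event and leaves the partial `data: {"y"` in the buffer until the rest arrives.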
