Added additional parameters to send() function #12

Open · wants to merge 3 commits into main
80 changes: 80 additions & 0 deletions chat.go
@@ -83,6 +83,67 @@ type ChatCompletionRequest struct {
// (Optional)
// A unique identifier representing your end-user, which can help OpenAI to monitor and detect abuse
User string `json:"user,omitempty"`

// Additional fields in fork -- Ben Meeker

// (Optional - default: false)
// Return log probabilities of output tokens.
LogProbs bool `json:"logprobs,omitempty"`

// (Optional - default: null)
// An integer between 0 and 20 specifying the number of most likely tokens to return at each token position.
// LogProbs MUST be set to true to use this parameter.
Top_LogProbs int `json:"top_logprobs,omitempty"`

// (Optional - default: text)
// Specify the format the model returns. Compatible with GPT-4o, GPT-4o mini, GPT-4 Turbo, and all GPT-3.5 Turbo models newer than gpt-3.5-turbo-1106.
// Options
// Type: "json_object" to enable JSON mode.
// Type: "text" to enable plain text mode.
Response_Format *ResponseFormat `json:"response_format,omitempty"`


nit: Just use camelcase (here and elsewhere)? ResponseFormat

Author:

I added the underscore naming convention to match the pattern set by Top_P on line 62 of the chat.go file. I figured it was best to follow what was already in place. That way, it matches the API parameters OpenAI accepts more clearly as well.

I do see your nit side though!! I'm a big camelcase fan 😄


// (Optional - default: null)
// The system will try to sample deterministically based on the seed provided. The same seed and parameters should return the same result.
// Determinism is not guaranteed; refer to the system_fingerprint response parameter.
Seed int `json:"seed,omitempty"`

// (Optional - default: auto)
// Specifies the latency tier to use for the request.
// 'auto' - the system will use scale tier credits until they are exhausted.
// 'default' - the request is processed using the default service tier with a lower uptime SLA and no latency guarantee.
Service_Tier string `json:"service_tier,omitempty"`

// (Optional - default: false)
// If set, partial message deltas will be sent. Tokens will be sent as data-only server-sent events as they become available.
// The stream is terminated by a data: [DONE] message.
Stream bool `json:"stream,omitempty"`

// (Optional - default: null)
// Only set this when Stream is true.
// Sets an additional chunk to stream before the data: [DONE] message.
Stream_Options *StreamOptions `json:"stream_options,omitempty"`


same here: want to avoid pointers to avoid possible nil dereference issues haha (but should still be able to check if empty since go zeroes out the struct if we don't add anything to it)

Suggested change
Stream_Options *StreamOptions `json:"stream_options,omitempty"`
Stream_Options StreamOptions `json:"stream_options,omitempty"`


// (Optional - default: null)
// A list of tools the model may call.
// Provide a list of functions the model may generate JSON inputs for. A maximum of 128 functions is supported.
Tools *[]Tool `json:"tools,omitempty"`


want to avoid pointers since i keep running into nil dereference issues haha :(

Suggested change
Tools *[]Tool `json:"tools,omitempty"`
Tools []Tool `json:"tools,omitempty"`

Author:

The use of pointers here is necessary, since you can set the value of a pointer to nil, but not the value of a custom struct. That means that if it isn't a pointer, then the json tag 'omitempty' won't function, and it will send that parameter in each call, resulting in errors!

This is the best way I know of to resolve this issue. But I'm very open to other solutions that resolve the omitempty issue!
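To make the omitempty point above concrete, here is a minimal sketch (the wrapper types `withValue` and `withPointer` are hypothetical, not from the PR) showing that a zero-value struct is still serialized while a nil pointer is omitted:

```go
package main

import (
	"encoding/json"
	"fmt"
)

type StreamOptions struct {
	IncludeUsage bool `json:"include_usage"`
}

// withValue holds StreamOptions by value: the zero struct never counts as "empty".
type withValue struct {
	StreamOptions StreamOptions `json:"stream_options,omitempty"`
}

// withPointer holds a pointer: a nil pointer IS omitted by omitempty.
type withPointer struct {
	StreamOptions *StreamOptions `json:"stream_options,omitempty"`
}

// MarshalValue and MarshalPointer return the JSON for the zero value of each wrapper.
func MarshalValue() string {
	b, _ := json.Marshal(withValue{})
	return string(b)
}

func MarshalPointer() string {
	b, _ := json.Marshal(withPointer{})
	return string(b)
}

func main() {
	fmt.Println(MarshalValue())   // {"stream_options":{"include_usage":false}}
	fmt.Println(MarshalPointer()) // {}
}
```

This is why the PR keeps `*StreamOptions` rather than the suggested value type: with a value field, every request would carry `"stream_options":{"include_usage":false}` on the wire.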


// (Optional - default: none)
// Do NOT use this parameter in conjunction with Tool_Choice.
// Options
// None: No tool will be called and a message will be generated.
// Auto: Any number of tools can be used and/or message generation will take place.
// Required: The model must call one or more tools.
Tool_Choice_Type string `json:"tool_choice,omitempty"`

// (Optional - default: none)
// Do NOT use this parameter in conjunction with Tool_Choice_Type
// Provide a tool object to be called. This forces the model to use that tool.
Tool_Choice *Tool `json:"tool_choice,omitempty"`

// (Optional - default: true)
// Whether to enable parallel function calling during tool use
Parallel_Tool_Calls bool `json:"parallel_tool_calls,omitempty"`
}
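A reviewer-style aside, not part of the PR: `Tool_Choice_Type` and `Tool_Choice` share the json tag `"tool_choice"`, and Go's encoding/json treats two fields with the same effective name at the same depth as a conflict and silently drops both, so neither value may ever reach the wire. A minimal sketch of the pitfall, using hypothetical stand-in types:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// fakeTool is a hypothetical stand-in for the PR's Tool struct.
type fakeTool struct {
	Type string `json:"type"`
}

// toolChoiceDemo mirrors the diff above: two fields tagged "tool_choice".
type toolChoiceDemo struct {
	ToolChoiceType string    `json:"tool_choice,omitempty"`
	ToolChoice     *fakeTool `json:"tool_choice,omitempty"`
}

// MarshalDemo marshals a request that sets only the string form.
func MarshalDemo() string {
	b, _ := json.Marshal(toolChoiceDemo{ToolChoiceType: "auto"})
	return string(b)
}

func main() {
	// encoding/json sees two fields with the same name at the same
	// depth, cannot pick a dominant one, and ignores both — no error.
	fmt.Println(MarshalDemo())
}
```

If this behavior is confirmed, one way out is a single `Tool_Choice interface{}` field that accepts either the string or the object form, since the API itself overloads the parameter.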

type ChatMessage struct {
@@ -110,6 +171,25 @@ type ChatResponseUsage struct {
Total_Tokens int `json:"total_tokens"`
}

type ResponseFormat struct {
Type string `json:"type"`
}

type StreamOptions struct {
Include_Usage bool `json:"include_usage"`
}

type Tool struct {
Type string `json:"type"`
Function FunctionFormat `json:"function"`
}

type FunctionFormat struct {
Description string `json:"description"`
Name string `json:"name"`
Parameters interface{} `json:"parameters"`
}
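As a usage sketch (not part of the diff), here is how a caller might construct one of the new `Tool` values; the `get_weather` name and its JSON-Schema-style parameters map are hypothetical:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Mirrors the Tool and FunctionFormat types added in this PR.
type Tool struct {
	Type     string         `json:"type"`
	Function FunctionFormat `json:"function"`
}

type FunctionFormat struct {
	Description string      `json:"description"`
	Name        string      `json:"name"`
	Parameters  interface{} `json:"parameters"`
}

// BuildWeatherTool shows how a caller might describe a callable function
// using a plain map for the Parameters schema.
func BuildWeatherTool() Tool {
	return Tool{
		Type: "function",
		Function: FunctionFormat{
			Name:        "get_weather",
			Description: "Get the current weather for a city",
			Parameters: map[string]interface{}{
				"type": "object",
				"properties": map[string]interface{}{
					"city": map[string]interface{}{"type": "string"},
				},
				"required": []string{"city"},
			},
		},
	}
}

// MarshalWeatherTool returns the JSON the API would receive.
func MarshalWeatherTool() string {
	b, _ := json.Marshal(BuildWeatherTool())
	return string(b)
}

func main() {
	fmt.Println(MarshalWeatherTool())
}
```

Using `interface{}` for `Parameters` keeps the schema flexible, at the cost of compile-time checking of the function's argument description.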

func (c *Client) SimpleSend(ctx context.Context, message string) (*ChatResponse, error) {
req := &ChatCompletionRequest{
Model: GPT35Turbo,